Abstract: The chemical degradation of dieldrin in aqueous suspensions of ferric sulfide and of iron powder was investigated in laboratory batch experiments. To identify the reaction mechanism, reduced copper was used as a reductant. More than 90% of the dieldrin was degraded in both reaction systems after 29 days. The initial degradation rate of the pesticide with ferric sulfide was higher than that with iron powder. The reaction schemes were completely dissimilar, even though the ferric ion plays an important role in both systems. In the case of metallic iron powder, dieldrin undergoes partial dechlorination. This reaction proceeds by reductive hydrodechlorination with the generation of H+, which arises from the oxidation of the iron. The reductive reaction was accelerated by the reductant, but mono-dechlorinated intermediates accumulated. On the other hand, oxidative degradation was observed in the reaction with ferric sulfide, and the stable chemical structure of dieldrin was decomposed into water-soluble intermediates. These reaction intermediates do not retain the drin-class chemical structure. This dehalogenation reaction is assumed to occur via adsorbed hydroxyl radicals generated on the surface of the ferric sulfide.
Abstract: In this paper, we present a preconditioned AOR-type iterative method for solving linear systems Ax = b, where A is a Z-matrix. We also give comparison theorems showing that the preconditioned AOR-type iterative method converges faster than the original AOR-type iterative method.
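As background, a minimal sketch of the (unpreconditioned) AOR iteration on a small system; the splitting A = D - L - U and the parameters r (acceleration) and w (relaxation) follow the standard AOR formulation, and the Z-matrix below is hypothetical:

```python
import numpy as np

def aor_solve(A, b, r=0.5, w=1.0, tol=1e-10, max_iter=500):
    """AOR-type iteration: with A = D - L - U (D diagonal, L/U strictly
    lower/upper), iterate
    x_{k+1} = (D - rL)^{-1} [((1-w)D + (w-r)L + wU) x_k + w b]."""
    D = np.diag(np.diag(A))
    L = -np.tril(A, -1)   # strictly lower part under the A = D - L - U convention
    U = -np.triu(A, 1)    # strictly upper part
    M = D - r * L
    N = (1 - w) * D + (w - r) * L + w * U
    x = np.zeros_like(b, dtype=float)
    for k in range(max_iter):
        x_new = np.linalg.solve(M, N @ x + w * b)
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

# Hypothetical Z-matrix (nonpositive off-diagonal entries)
A = np.array([[4.0, -1.0, -1.0],
              [-1.0, 4.0, -1.0],
              [-1.0, -1.0, 4.0]])
b = np.array([2.0, 2.0, 2.0])
x, iters = aor_solve(A, b)
```

For w = r the scheme reduces to SOR, and for w = r = 1 to Gauss–Seidel; the paper's preconditioned variant applies the same iteration to a transformed system.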
Abstract: Traditionally, terror groups have been formed by ideologically aligned actors who perceive a lack of options for achieving political or social change. However, terrorist attacks have been increasingly carried out by small groups of actors or lone individuals who may be only ideologically affiliated with larger, formal terrorist organizations. The formation of these groups represents the inverse of traditional organizational growth, whereby structural de-evolution within issue-based organizations leads to the formation of small, independent terror cells. Ideological franchising – the bypassing of formal affiliation to the “parent” organization – represents the de-evolution of traditional concepts of organizational structure in favor of an organic, independent, and focused unit. Traditional definitions of issue-based dark networks include focus on an identified goal, commitment to achieving this goal through unrestrained actions, and selection of symbolic targets. The next step in the de-evolution of small dark networks is the mini-organization, consisting of only a handful of actors working toward a common, violent goal. Information-sharing through social media platforms, coupled with the civil liberties of democratic nations, provides the communication systems, access to information, and freedom of movement necessary for small dark networks to flourish without the aid of a parent organization. As attacks such as the 7/7 bombings demonstrate the effectiveness of small dark networks, terrorist actors will feel increasingly comfortable aligning with an ideology only, without formally organizing. The natural result of this de-evolving organization is the single-actor event, where an individual subscribes to a larger organization's violent ideology with little or no formal ties.
Abstract: Software reliability prediction offers a way to measure the software failure rate at any point during system test. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. In this article we focus on a software reliability model, assuming that there is time redundancy, the value of which (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be treated as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence consisting of a random number of basic blocks. We consider the system software unreliable, with the time between adjacent failures following an exponential distribution.
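As an illustration of the kind of distribution function involved, a Monte Carlo sketch under simplified, hypothetical assumptions (a geometric number of basic blocks and exponential per-block transmission times; the paper's actual model additionally accounts for failures and time redundancy):

```python
import random

def transmission_time(mean_blocks=5.0, block_rate=2.0, rng=random):
    """Simulate the transmission time of an instruction sequence made of a
    random (geometric) number of basic blocks, each taking an
    exponentially distributed time (hypothetical parameters)."""
    n = 1
    while rng.random() > 1.0 / mean_blocks:   # geometric block count, mean 5
        n += 1
    return sum(rng.expovariate(block_rate) for _ in range(n))

random.seed(0)
samples = [transmission_time() for _ in range(100_000)]

def edf(t):
    """Empirical distribution function of the simulated transmission time."""
    return sum(1 for s in samples if s <= t) / len(samples)
```

Under these assumptions the compound distribution is itself exponential with rate p·λ = 0.2 × 2 = 0.4, so the empirical DF can be checked against 1 − e^(−0.4t).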
Abstract: The world economic crises and budget constraints
have caused authorities, especially those in developing countries, to
rationalize water quality monitoring activities. Rationalization
consists of reducing the number of monitoring sites, the number of
samples, and/or the number of water quality variables measured. The
reduction in water quality variables is usually based on correlation. If
two variables exhibit high correlation, it is an indication that some of
the information produced may be redundant. Consequently, one
variable can be discontinued, and the other continues to be measured.
Later, the ordinary least squares (OLS) regression technique is employed to reconstitute information about the discontinued variable by using the continuously measured one as an explanatory variable. In
this paper, two record extension techniques are employed to
reconstitute information about discontinued water quality variables,
the OLS and the Line of Organic Correlation (LOC). An empirical
experiment is conducted using water quality records from the Nile
Delta water quality monitoring network in Egypt. The record
extension techniques are compared for their ability to predict
different statistical parameters of the discontinued variables. Results
show that the OLS is better at estimating individual water quality
records. However, results indicate an underestimation of the variance
in the extended records. The LOC technique is superior in preserving
characteristics of the entire distribution and avoids underestimation
of the variance. It is concluded from this study that the OLS can be
used for the substitution of missing values, while LOC is preferable
for inferring statements about the probability distribution.
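The contrast between the two techniques can be sketched numerically with synthetic data: the LOC slope is the ratio of standard deviations (with the sign of r), while the OLS slope shrinks that ratio by r, which is what underestimates the variance of the extended record:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical records of two correlated water quality variables
x = rng.normal(10.0, 2.0, 500)
y = 0.8 * x + rng.normal(0.0, 1.0, 500)

r = np.corrcoef(x, y)[0, 1]
sx, sy = x.std(), y.std()

# OLS: slope is shrunk by the correlation coefficient
b_ols = r * sy / sx
y_ols = y.mean() + b_ols * (x - x.mean())

# LOC (line of organic correlation): slope is the ratio of standard deviations
b_loc = np.sign(r) * sy / sx
y_loc = y.mean() + b_loc * (x - x.mean())
# OLS predictions have standard deviation |r|·sy < sy,
# while LOC predictions reproduce sy exactly
```

This matches the abstract's conclusion: OLS minimizes individual prediction error, while LOC preserves the distributional characteristics of the discontinued variable.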
Abstract: Complex networks have been intensively studied across
many fields, especially in Internet technology, biological engineering,
and nonlinear science. Software is built up out of many interacting
components at various levels of granularity, such as functions, classes,
and packages, representing another important class of complex networks.
It can also be studied using complex network theory. Over the
last decade, many papers on the interdisciplinary research between
software engineering and complex networks have been published.
It provides a different dimension to our understanding of software
and also is very useful for the design and development of software
systems. This paper will explore how to use the complex network
theory to analyze software structure, and briefly review the main
advances in corresponding aspects.
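At its simplest, a software system can be mapped to a network by taking components as nodes and dependencies as edges; a toy sketch with a hypothetical call graph (function names are made up for illustration):

```python
from collections import Counter

# Hypothetical call graph: (caller, callee) pairs between functions
calls = [("main", "parse"), ("main", "run"), ("run", "step"),
         ("run", "log"), ("step", "log"), ("parse", "log")]

in_degree = Counter(callee for _, callee in calls)
out_degree = Counter(caller for caller, _ in calls)

# Heavily reused utilities show up as high in-degree nodes
hub = in_degree.most_common(1)[0]   # ("log", 3)
```

Degree distributions computed this way at the function, class, or package level are the starting point for the scale-free and small-world analyses common in this literature.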
Abstract: The major problem that wireless communication
systems undergo is multipath fading caused by scattering of the
transmitted signal. However, we can treat multipath propagation as
multiple channels between the transmitter and receiver to improve
the signal-to-scattering-noise ratio. While using Single Input
Multiple Output (SIMO) systems, the diversity receivers extract
multiple signal branches or copies of the same signal received from
different channels and apply gain combining schemes such as Root
Mean Square Gain Combining (RMSGC). RMSGC asymptotically
yields an identical performance to that of the theoretically optimal
Maximum Ratio Combining (MRC) for values of mean Signal-to-
Noise-Ratio (SNR) above a certain threshold value without the need
for SNR estimation. This paper introduces two improvements to RMSGC. We found that post-detection combining and de-noising of the received signals improve the performance of RMSGC and lower the threshold SNR.
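RMSGC itself is the authors' scheme, but its benchmark, Maximum Ratio Combining, can be sketched in a few lines (a hypothetical 4-branch SIMO link with a synthetic Rayleigh channel; MRC weights each branch by the conjugate of its channel gain, which is why it requires channel/SNR knowledge):

```python
import numpy as np

rng = np.random.default_rng(1)
s = 1.0 + 0.0j                    # transmitted symbol
# Synthetic Rayleigh channel gains on 4 diversity branches
h = (rng.normal(size=4) + 1j * rng.normal(size=4)) / np.sqrt(2)
noise = 0.1 * (rng.normal(size=4) + 1j * rng.normal(size=4))
y = h * s + noise                 # branch observations

# MRC: weight each branch by the conjugate channel gain, then normalize
s_hat = np.sum(np.conj(h) * y) / np.sum(np.abs(h) ** 2)
```

The combined SNR under MRC is the sum of the branch SNRs, the optimum that RMSGC approaches asymptotically without needing the channel estimates used here.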
Abstract: This document details the process of developing a
wireless device that captures the basic movements of the foot (plantar flexion, dorsal flexion, abduction, and adduction) and of the knee (flexion). It implements a motion capture system using hardware based on optical fiber sensors, chosen for their advantages in
terms of scope, noise immunity and speed of data transmission and
reception. The operating principle used by this system is the detection
and transmission of joint movement by mechanical elements and
their respective measurement by optical ones (in this case infrared).
Likewise, Visual Basic software is used for reception, analysis and
signal processing of data acquired by the device, generating a 3D
graphical representation in real time of each movement. The result is
a boot in charge of capturing the movement, a transmission module (implementing XBee technology), and a receiver module that receives the information and sends it to the PC for processing.
The main aim of this device is to contribute to fields such as bioengineering and medicine, by helping to improve quality of life and movement analysis.
Abstract: It is known that the heart interacts with and adapts to
its venous and arterial loading conditions. Various experimental
studies and modeling approaches have been developed to investigate
the underlying mechanisms. This paper presents a model of the left
ventricle derived based on nonlinear stress-length myocardial
characteristics integrated over a truncated ellipsoidal geometry, and a second-order dynamic mechanism for the excitation-contraction
coupling system. The results of the model presented here describe the
effects of the viscoelastic damping element of the electromechanical
coupling system on the hemodynamic response. Different heart rates
are considered to study the pacing effects on the performance of the
left-ventricle against constant preload and afterload conditions under
various damping conditions. The results indicate that the pacing
process of the left ventricle has to take into account, among other
things, the viscoelastic damping conditions of the myofilament
excitation-contraction process.
Abstract: Building intelligent traffic guide systems has been an
interesting subject recently. A good system should be able to observe
all important visual information to be able to analyze the context of
the scene. To do so, signs in general, and traffic signs in particular, are usually taken into account, as they provide rich information to these systems. Therefore, many researchers have put effort into the sign recognition field. Sign localization, or sign detection, is the most important step in the sign recognition process. This step filters out non-informative areas in the scene and locates candidates for later steps. In this paper, we apply a new approach to detecting sign
locations using a new color invariant model. Experiments are carried
out with different datasets introduced in other works where authors
claimed the difficulty in detecting signs under unfavorable imaging
conditions. Our method is simple, fast and most importantly it gives
a high detection rate in locating signs.
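The abstract does not specify the color invariant model, but a classic example of an illumination-invariant transform used in sign detection is normalized rgb chromaticity, sketched here purely as an assumption for illustration:

```python
# Normalized rgb chromaticity: a classic illumination-invariant color
# transform (an assumption for illustration; the paper's own invariant
# model is not described in the abstract).
def normalized_rgb(r, g, b):
    s = (r + g + b) or 1   # guard against a black pixel
    return r / s, g / s, b / s

# A pixel and the same pixel under doubled illumination map to the same
# chromaticity coordinates, so color thresholds stay valid under the
# unfavorable lighting conditions the abstract mentions.
sign_red = normalized_rgb(120, 60, 20)
brighter = normalized_rgb(240, 120, 40)
```

Thresholding such invariant coordinates is what lets a detector stay simple and fast while remaining robust to lighting changes.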
Abstract: Electrical Discharge Machine (EDM) is especially
used for the manufacturing of 3-D complex geometry and hard
material parts that are extremely difficult-to-machine by conventional
machining processes. In this paper, the authors review the research work
carried out in the development of die-sinking EDM within the past
decades for the improvement of machining characteristics such as
Material Removal Rate, Surface Roughness and Tool Wear Ratio. In
this review various techniques reported by EDM researchers for
improving the machining characteristics have been categorized as
process parameter optimization, multi-spark techniques, powder-mixed EDM, servo control systems, and pulse discrimination. At the
end, flexible machine controller is suggested for Die Sinking EDM to
enhance the machining characteristics and to achieve high-level
automation. Thus, die-sinking EDM can be integrated into a Computer Integrated Manufacturing environment to meet the needs of agile manufacturing systems.
Abstract: In this paper, in order to investigate the effects of
photovoltaic system introduction to detached houses in Japan, two
kinds of analyses were performed. Firstly, the hourly generation amount of a 4.2 kW photovoltaic system was simulated in 46 cities to investigate
the potential of the system in different regions in Japan using a
simulation model of photovoltaic system. Secondly, based on the
simulated electricity generation amount, the energy saving, the
environmental and the economic effect of the photovoltaic system
were examined from hourly to annual timescales, based upon
calculations of typical electricity, heating, cooling and hot water
supply load profiles for Japanese dwellings. The above analysis was
carried out using a standard year's hourly weather data for each city, provided by the Expanded AMeDAS Weather Data
issued by AIJ (Architectural Institute of Japan).
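A highly simplified sketch of the kind of hourly-to-daily calculation involved (the 4.2 kW rating is from the abstract; the derate factor and irradiance profile are hypothetical, and the paper's simulation model is far more detailed):

```python
RATED_KW = 4.2   # system size from the abstract
DERATE = 0.77    # hypothetical overall system efficiency factor

def hourly_output_kwh(irradiance_kw_m2):
    # Output scales with irradiance relative to standard test
    # conditions (1 kW/m^2), times the derate factor
    return RATED_KW * irradiance_kw_m2 * DERATE

# Hypothetical clear-day irradiance profile (kW/m^2), one value per hour
day_irradiance = [0.0, 0.1, 0.35, 0.6, 0.8, 0.75, 0.5, 0.2, 0.0]
daily_kwh = sum(hourly_output_kwh(g) for g in day_irradiance)
```

Summing such hourly outputs over a standard weather year, city by city, is what allows the regional, seasonal, and annual comparisons the abstract describes.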
Abstract: This paper applies fuzzy set theory to evaluate the
service quality of online auction. Service quality is a composition of
various criteria. Among them many intangible attributes are difficult
to measure. This characteristic creates obstacles for respondents replying to the survey. To overcome this problem, we introduce fuzzy set theory into the measurement of performance. By using AHP to obtain the criteria and TOPSIS for ranking, we found
the most concerned dimension of service quality is Transaction
Safety Mechanism and the least is Charge Item. The attributes of greatest concern are information security and information accuracy.
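The TOPSIS ranking step can be sketched on a hypothetical decision matrix (three auction sites scored on three service-quality criteria, with made-up AHP-derived weights; all criteria are treated as benefit criteria here):

```python
import numpy as np

# Rows = alternatives (auction sites), columns = service-quality criteria
X = np.array([[7.0, 9.0, 6.0],
              [8.0, 7.0, 8.0],
              [6.0, 8.0, 9.0]])
w = np.array([0.5, 0.3, 0.2])   # hypothetical AHP-derived weights

V = w * X / np.linalg.norm(X, axis=0)       # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)  # ideal and anti-ideal solutions
d_pos = np.linalg.norm(V - ideal, axis=1)   # distance to the ideal
d_neg = np.linalg.norm(V - anti, axis=1)    # distance to the anti-ideal
closeness = d_neg / (d_pos + d_neg)         # relative closeness in (0, 1)
ranking = np.argsort(-closeness)            # best alternative first
```

Alternatives are ranked by relative closeness to the ideal solution; in a fuzzy variant, the crisp scores in X are replaced by defuzzified fuzzy ratings.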
Abstract: This paper presents reliability evaluation techniques
which are applied in distribution system planning studies and
operation. Reliability of distribution systems is an important issue in
power engineering for both utilities and customers. Reliability is a
key issue in the design and operation of electric power distribution
systems and their loads. Reliability evaluation of distribution systems has
been the subject of many recent papers and the modeling and
evaluation techniques have improved considerably.
Abstract: To investigate the possible correlation between peer aggression and peer victimization, 148 sixth-graders were asked to respond to the Reduced Aggression and Victimization Scales (RAVS). The RAVS measures the frequency of reported aggressive behaviors or of being victimized during the week prior to the survey. The scales are composed of six items each, and each point represents one instance of aggression or victimization. Specifically, the Pearson Product-Moment Correlation Coefficient (PMCC) was used to determine the correlations between the scores of the sixth-graders on the two scales, both for individual items and for total scores. Positive correlations were established, and the correlations were significant at the 0.01 level.
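The PMCC computation itself is straightforward; a sketch with hypothetical per-pupil scale scores (the study's actual data are not reproduced here):

```python
def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical weekly RAVS scale totals for eight pupils
aggression    = [2, 0, 5, 3, 1, 4, 2, 6]
victimization = [1, 0, 4, 3, 2, 5, 1, 6]
r = pearson(aggression, victimization)
```

Values of r near +1 indicate that pupils who report more aggression also tend to report more victimization, the pattern the study found.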
Abstract: Data mining is the process of sifting through large
volumes of data, analyzing data from different perspectives and
summarizing it into useful information. One of the widely used
desktop applications for data mining is the Weka tool which is
nothing but a collection of machine learning algorithms implemented
in Java and open-sourced under the GNU General Public License (GPL). A
web service is a software system designed to support interoperable
machine to machine interaction over a network using SOAP
messages. Unlike a desktop application, a web service is easy to upgrade, deliver, and access, and does not occupy memory on the client system. Keeping in mind the advantages of a web service over a desktop application, in this paper we demonstrate how this Java-based desktop data mining application can be implemented as a web service to support data mining across the Internet.
Abstract: In 3D-wavelet video coding framework temporal
filtering is done along the trajectory of motion using Motion
Compensated Temporal Filtering (MCTF). Hence, a computationally efficient motion estimation technique is needed for MCTF. In this
paper a predictive technique is proposed in order to reduce the
computational complexity of the MCTF framework, by exploiting
the high correlation among the frames in a Group of Pictures (GOP).
The proposed technique applies coarse and fine searches of any fast
block based motion estimation, only to the first pair of frames in a
GOP. The generated motion vectors are supplied to the next
consecutive frames, even to subsequent temporal levels and only fine
search is carried out around those predicted motion vectors. Hence
coarse search is skipped for all the motion estimation in a GOP
except for the first pair of frames. The technique has been tested for
different fast block based motion estimation algorithms over different
standard test sequences using MC-EZBC, a state-of-the-art scalable
video coder. The simulation results reveal a substantial reduction (20.75% to 38.24%) in the number of search points during motion estimation, without compromising the quality of the reconstructed video compared to non-predictive techniques. Since the motion vectors of all the pairs of frames in a GOP except the first lie within ±1 of the motion vectors of the previous pair, the number of bits required for motion vectors is also reduced by 50%.
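The prediction-plus-fine-search idea can be sketched as follows (a hypothetical SAD-based refinement over a ±1 window around the predicted motion vector; the paper applies this inside MC-EZBC on top of real fast block-matching algorithms):

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two blocks."""
    return np.abs(a.astype(int) - b.astype(int)).sum()

def fine_search(cur, ref, y, x, pred_mv, bs=8, radius=1):
    """Refine a predicted motion vector by searching only the small
    (2*radius+1)^2 window around it; the coarse search stage is skipped."""
    block = cur[y:y + bs, x:x + bs]
    best_cost, best_mv = None, pred_mv
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            ry, rx = y + pred_mv[0] + dy, x + pred_mv[1] + dx
            if 0 <= ry <= ref.shape[0] - bs and 0 <= rx <= ref.shape[1] - bs:
                cost = sad(block, ref[ry:ry + bs, rx:rx + bs])
                if best_cost is None or cost < best_cost:
                    best_cost = cost
                    best_mv = (pred_mv[0] + dy, pred_mv[1] + dx)
    return best_mv

# Synthetic frames: the block at (8, 9) in `cur` appears at (10, 10) in `ref`
ref = np.zeros((32, 32)); cur = np.zeros((32, 32))
pattern = np.arange(64).reshape(8, 8)
ref[10:18, 10:18] = pattern
cur[8:16, 9:17] = pattern
mv = fine_search(cur, ref, 8, 9, pred_mv=(2, 0))  # refined to (2, 1)
```

Because the true vector sits within ±1 of the predicted one for all pairs after the first, this small window suffices and the coarse search is skipped entirely.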
Abstract: This paper aims to develop an algorithm of finite
capacity material requirement planning (FCMRP) system for a multistage
assembly flow shop. The developed FCMRP system has two
main stages. The first stage is to allocate operations to the first and
second priority work centers and also determine the sequence of the
operations on each work center. The second stage is to determine the
optimal start time of each operation by using a linear programming
model. Real data from a factory is used to analyze and evaluate the
effectiveness of the proposed FCMRP system and also to guarantee a
practical solution to the user. There are five performance measures,
namely, the total tardiness, the number of tardy orders, the total
earliness, the number of early orders, and the average flow-time. The
proposed FCMRP system offers an adjustable solution that is a compromise among the conflicting performance measures.
The user can adjust the weight of each performance measure to
obtain the desired performance. The results show that the combination of FCMRP NP3 and EDD outperforms other combinations in terms of the overall performance index. The calculation time for the proposed FCMRP system is about 10 minutes, which is practical for the planners of the factory.
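One of the dispatching rules mentioned, EDD (earliest due date), is easy to sketch; the orders below are hypothetical, and the paper's full system additionally solves a linear program for the optimal start times:

```python
# Hypothetical orders: (name, processing_time, due_date)
orders = [("A", 4, 3), ("B", 2, 4), ("C", 3, 5)]

# EDD: sequence orders by earliest due date, then accumulate completion
# times and measure tardiness against each due date
schedule = sorted(orders, key=lambda o: o[2])
t, tardiness = 0, {}
for name, p, due in schedule:
    t += p
    tardiness[name] = max(0, t - due)

total_tardiness = sum(tardiness.values())          # one of the five measures
num_tardy = sum(1 for v in tardiness.values() if v > 0)
```

Total tardiness and the number of tardy orders computed this way are two of the five performance measures the FCMRP system trades off via its adjustable weights.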
Abstract: One important objective in Precision Agriculture is to minimize the volume of herbicides applied to fields through the use of site-specific weed management systems. In order to reach this goal, two major factors need to be considered: 1) the similar spectral signature, shape, and texture of weeds and crops; 2) the irregular distribution of the weeds within the crop's field. This paper outlines an automatic computer vision system for the detection and differential spraying of Avena sterilis, a noxious weed growing in cereal crops. The proposed system involves two processes: image segmentation and decision making. Image segmentation combines basic suitable image processing techniques in order to extract cells from the image as the low-level units. Each cell is described by two area-based attributes measuring the relations between the crops and the weeds. From these attributes, a hybrid decision making approach determines whether or not a cell must be sprayed. The hybrid approach uses the Support Vector Machines and Fuzzy k-Means methods, combined through fuzzy aggregation theory. This constitutes the main contribution of this paper. The method's performance is compared against other available strategies.
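A minimal sketch of fuzzy aggregation of two classifier outputs (an assumption for illustration; the paper's exact aggregation operator is not given in the abstract): an SVM-style score and a fuzzy k-means membership for the "spray" class, combined with a product t-norm and an algebraic-sum t-conorm.

```python
def t_norm(a, b):
    """Product t-norm: conjunctive (conservative) combination."""
    return a * b

def t_conorm(a, b):
    """Algebraic-sum t-conorm: disjunctive (permissive) combination."""
    return a + b - a * b

# Hypothetical memberships for one cell in the "spray" class
svm_score, fkm_membership = 0.8, 0.6
conservative = t_norm(svm_score, fkm_membership)      # spray only if both agree
permissive   = t_conorm(svm_score, fkm_membership)    # spray if either suggests it
spray = permissive >= 0.5
```

The choice of aggregation operator controls how cautiously the cell-level spray decision is made from the two classifiers' memberships.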
Abstract: This study investigated the performance of
hybrid solvents blended between primary, secondary, or tertiary
amines and piperazine (PZ) for CO2 removal from flue gas in terms
of CO2 absorption capacity and regeneration efficiency at 90 °C.
Alkanolamines used in this work were monoethanolamine (MEA),
diethanolamine (DEA), and triethanolamine (TEA). The CO2
absorption was experimentally examined under atmospheric pressure
and room temperature. The results show that the MEA blend with PZ
provided the maximum CO2 absorption capacity of 0.50 mol
CO2/mol amine while TEA provided the minimum CO2 absorption
capacity of 0.30 mol CO2/mol amine. TEA was easier to regenerate in both the first and second cycles, with less loss of absorption capacity. The regeneration efficiency of TEA was 95.09% and 92.89% for the first and second regeneration cycles, respectively.