Abstract: Coagulation of water involves the use of coagulating
agents to bring the suspended matter in raw water together for
the settling and filtration stages. The present study examines
the effects of aluminum sulfate as coagulant, in conjunction
with Moringa oleifera coagulant protein as coagulant aid, on
turbidity, hardness, and bacteria in turbid water. A
conventional jar test apparatus was employed for the tests. The
best removal was observed at a pH of 7 to 7.5 for all
turbidities. Turbidity removal efficiencies of 80% to 99% were
achieved with Moringa oleifera coagulant protein as coagulant
aid. The dosages of coagulant and coagulant aid decreased with
increasing turbidity. In addition, Moringa oleifera coagulant
protein significantly reduced the required dosage of primary
coagulant. Residual Al3+ in the treated water was less than
0.2 mg/L, meeting the Environmental Protection Agency
guidelines. The results showed that a turbidity reduction of
85.9-98%, paralleled by a primary Escherichia coli reduction of
1-3 log units (99.2-99.97%), was obtained within the first 1 to
2 h of treatment. In conclusion, Moringa oleifera coagulant
protein as coagulant aid can be used for drinking water
treatment without the risk of organic or nutrient release. We
demonstrated that the optimal design method is an efficient
approach for optimization of the coagulation-flocculation
process and is appropriate for raw water treatment.
Abstract: In this work we develop an object extraction method
and propose efficient algorithms for object motion
characterization. The set of proposed tools serves as a basis
for the development of object-based functionalities for the
manipulation of video content. Estimators based on the
different algorithms are compared in terms of quality and
performance and tested on real video sequences. The proposed
method will be useful for the latest standards for encoding and
description of multimedia content, MPEG-4 and MPEG-7.
Abstract: Coal fly ash (CFA) generated by coal-based thermal
power plants is mainly composed of quartz, mullite, and
unburned carbon. In this study, the effect of unburned carbon
on the adsorption capacity of CFA was investigated. CFA samples
with various carbon contents were obtained by refluxing the ash
with sulfuric acid of various concentrations at various
temperatures and reflux times, by heating at 400-800°C, and by
sieving to a 100-mesh particle size. To evaluate the effect of
unburned carbon on the adsorption capacity of CFA, adsorption
of methyl violet solution onto the treated CFA was carried out.
The results show that unburned carbon decreases the adsorption
capacity. The highest adsorption capacity of the treated CFA
was found to be 5.73 × 10^-4 mol·g^-1.
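The reported capacity follows from the standard batch mass-balance formula q_e = (C0 - Ce)·V/m. The sketch below applies it; the concentrations, solution volume, and adsorbent mass are hypothetical values chosen only to illustrate the calculation, not the study's measurements.

```python
# Standard batch adsorption mass balance: q_e = (C0 - Ce) * V / m.
# All numbers below are hypothetical illustrations, not the study's data.
def adsorption_capacity(c0, ce, volume_l, mass_g):
    """Equilibrium adsorption capacity in mol per gram of adsorbent."""
    return (c0 - ce) * volume_l / mass_g

# Hypothetical run: 50 mL of 1.0e-3 mol/L methyl violet on 0.05 g treated CFA,
# with an assumed equilibrium concentration of 4.27e-4 mol/L.
q = adsorption_capacity(1.0e-3, 4.27e-4, 0.050, 0.05)
print(f"q_e = {q:.2e} mol/g")
```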
Abstract: Process-oriented software development is a new
software development paradigm in which software design is modeled
by a business process which is in turn translated into a process
execution language for execution. The building blocks of this
paradigm are software units that are composed together to work
according to the flow of the business process. This new
paradigm still exhibits the characteristics of applications
built with traditional software component technology. This
paper discusses an
approach to apply a traditional technique for software component
fabrication to the design of process-oriented software units, called
process components. These process components result from
decomposing a business process of a particular application domain
into subprocesses, and these process components can be reused to
design the business processes of other application domains. The
decomposition considers five managerial goals, namely cost
effectiveness, ease of assembly, customization, reusability, and
maintainability. The paper presents how to design or decompose
process components from a business process model and measure
some technical features of the design that would affect the
managerial goals. A comparison between the measurement values
from different designs can tell which process component design is
more appropriate for the managerial goals that have been set. The
proposed approach can be applied in a Web Services environment,
which accommodates process-oriented software development.
Abstract: The development and extension of large cities has
induced a need for shallow tunnels in the soft ground of
built-up areas. Estimation of the ground settlement caused by
tunnel excavation is an important engineering concern. In this
paper, the prediction of surface subsidence caused by tunneling
in one section of Line 7 of the Tehran subway is considered. On
the basis of the studied geotechnical conditions of the region,
a tunnel with a length of 26.9 km has been excavated by a
mechanized method using an EPB-TBM with a diameter of 9.14 m.
In this regard, settlement is estimated utilizing both
analytical and numerical (finite element) methods. The
numerical method shows that the settlement in this section is
5 cm, while the analytical results (Bobet and
Loganathan-Poulos) are 5.29 and 12.36 cm, respectively.
According to the results of this study, owing to the saturation
of this section, there is good agreement between the Bobet and
numerical methods. Therefore, the tunneling process in this
section needs special consolidation measures and a support
system before the passage of the tunnel boring machine.
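Analytical surface settlement estimates of this kind are often based on the classic empirical Gaussian trough (Peck 1969; O'Reilly and New 1982) rather than the Bobet or Loganathan-Poulos solutions used in the paper. The sketch below computes that simpler estimate; only the 9.14 m diameter is taken from the abstract, while the axis depth, volume loss, and trough-width parameter are hypothetical assumptions.

```python
import math

# Classic empirical Gaussian settlement trough (Peck; O'Reilly & New).
# NOT the Bobet or Loganathan-Poulos solution used in the paper; a simpler
# analytical estimate. All parameter values except D are hypothetical.
def surface_settlement(x, D, z0, V_L, K=0.5):
    """Settlement (m) at horizontal offset x (m) from the tunnel axis.
    D: tunnel diameter (m), z0: axis depth (m),
    V_L: volume loss ratio (-), K: trough width parameter (-)."""
    i = K * z0                       # trough width parameter i (m)
    V_s = V_L * math.pi * D**2 / 4   # settlement volume per metre of advance
    s_max = V_s / (math.sqrt(2 * math.pi) * i)
    return s_max * math.exp(-x**2 / (2 * i**2))

# Hypothetical EPB drive: D = 9.14 m (from the abstract), assumed axis depth
# 15 m and 1% volume loss.
s0 = surface_settlement(0.0, D=9.14, z0=15.0, V_L=0.01)
print(f"max settlement ~ {s0 * 100:.1f} cm")
```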
Abstract: Speed estimation is one of the important and practical tasks in machine vision, robotics, and mechatronics. The availability of high-quality and inexpensive video cameras, and the increasing need for automated video analysis, has generated a great deal of interest in machine vision algorithms. Numerous approaches to speed estimation have been proposed, so a classification and survey of these methods can be very useful. The goal of this paper is first to review and verify these methods. We then propose a novel algorithm to estimate the speed of a moving object using fuzzy concepts. There is a direct relation between motion blur parameters and object speed. In our new approach, we use the Radon transform to find the direction of the blurred image, and fuzzy sets to estimate the motion blur length. The main benefit of this algorithm is its robustness and precision in noisy images. Our method was tested on many images over a wide range of SNR values and gave satisfactory results.
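The stated relation between motion blur parameters and speed can be made concrete in a simplified, crisp 1-D sketch (the paper itself uses the Radon transform for direction and fuzzy sets for length): a uniform blur of length L places zeros in the spectrum at multiples of N/L, so the first spectral zero reveals L. The signal and blur length below are hypothetical.

```python
import numpy as np

# Simplified, crisp 1-D analogue of blur-length estimation. The paper's fuzzy
# estimator and Radon-based direction search are not reproduced here; this
# only illustrates the spectral relation between blur length and speed.
rng = np.random.default_rng(0)
N, L = 128, 8                        # signal length, true blur length (L | N)
signal = rng.standard_normal(N)
kernel = np.zeros(N)
kernel[:L] = 1.0 / L                 # uniform (box) motion-blur PSF
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(kernel)))

# The box PSF's spectrum has exact zeros at k = m * N / L; the first
# near-zero bin of the blurred spectrum therefore reveals L.
mag = np.abs(np.fft.fft(blurred))[1 : N // 2]
k0 = 1 + int(np.argmax(mag < 1e-9 * mag.max()))   # first near-zero index
L_est = round(N / k0)
print("estimated blur length:", L_est)
```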
Abstract: This paper investigates the issue of building decision
trees from data with imprecise class values where imprecision is
encoded in the form of possibility distributions. The Information
Affinity similarity measure is introduced into the well-known gain
ratio criterion in order to assess the homogeneity of a set of
possibility distributions representing the classes of the
instances belonging to a given training partition. For the
experimental study, we propose an information-affinity-based
performance criterion, which we use to show the performance of
the approach on well-known benchmarks.
Abstract: In this paper, we propose a low-cost optimized solution for the movement of a three-arm manipulator using a Genetic Algorithm (GA) and the Analytic Hierarchy Process (AHP). A scheme is given for optimizing the movement of the robotic arm with the help of a Genetic Algorithm so that the minimum-energy-consumption criterion can be achieved. Unlike direct kinematics, inverse kinematics yields two solutions, of which the best-fit solution is selected with the help of the Genetic Algorithm and kept in the search space for future use. Tasks such as inverse kinematics, fitness-value evaluation, and binary encoding are simulated and tested. Three factors, namely movement, friction, and least settling time (or minimum vibration), are used for finding the fitness function and fitness values, although further factors could also be considered.
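A minimal sketch of the binary-encoded GA idea, assuming a hypothetical 3-link planar arm: each chromosome encodes three joint angles in 8 bits, and the fitness rewards reaching a target with small joint angles as a crude proxy for low energy consumption. The link lengths, target, and GA settings are illustrative assumptions, not the paper's.

```python
import math, random

# Minimal binary-encoded GA for a hypothetical 3-link planar arm. This is a
# sketch of the general idea, not the paper's implementation; link lengths,
# target, penalty weight, and GA settings are all assumptions.
random.seed(1)
LINKS = [1.0, 1.0, 1.0]
TARGET = (1.5, 1.5)
BITS = 8                                    # bits per joint angle

def decode(chrom):
    """Map each 8-bit gene to a joint angle in [-pi, pi]."""
    angles = []
    for j in range(3):
        g = int("".join(map(str, chrom[j * BITS:(j + 1) * BITS])), 2)
        angles.append(-math.pi + 2 * math.pi * g / (2**BITS - 1))
    return angles

def distance(chrom):
    """Forward kinematics: end-effector distance to the target."""
    a = decode(chrom)
    x = y = th = 0.0
    for L, t in zip(LINKS, a):
        th += t
        x += L * math.cos(th)
        y += L * math.sin(th)
    return math.hypot(x - TARGET[0], y - TARGET[1])

def fitness(chrom):
    # Small-angle penalty as a crude proxy for low movement / energy.
    return -(distance(chrom) + 0.01 * sum(abs(t) for t in decode(chrom)))

pop = [[random.randint(0, 1) for _ in range(3 * BITS)] for _ in range(40)]
for _ in range(150):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                         # elitism keeps best-fit solutions
    children = []
    while len(children) < 30:
        p1, p2 = random.sample(elite, 2)
        cut = random.randrange(1, 3 * BITS)  # one-point crossover
        child = p1[:cut] + p2[cut:]
        for i in range(len(child)):          # bit-flip mutation
            if random.random() < 1 / (3 * BITS):
                child[i] ^= 1
        children.append(child)
    pop = elite + children
best = max(pop, key=fitness)
print(f"distance to target: {distance(best):.3f}")
```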
Abstract: This paper introduces and studies new indexing techniques for content-based queries in image databases. Indexing is the key to providing sophisticated, accurate and fast searches for queries over image data. This research describes a new indexing approach that depends on linear modeling of signals, using bases for modeling. A basis is a set of chosen images, and modeling an image is a least-squares approximation of the image as a linear combination of the basis images. The coefficients of the basis images are taken together to serve as the index for that image. The paper describes the implementation of the indexing scheme and presents the findings of our extensive evaluation, which was conducted to optimize (1) the choice of the basis matrix (B), and (2) the size (N) of the index A. Furthermore, we compare the performance of our indexing scheme with other schemes. Our results show that our scheme has significantly higher performance.
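The least-squares modeling step can be sketched in a few lines, assuming hypothetical basis and image sizes: the coefficient vector returned by the solver is exactly the kind of index described above.

```python
import numpy as np

# Sketch of least-squares indexing: model an image as a linear combination of
# N chosen basis images and use the coefficient vector as its index. The
# sizes and random "images" are hypothetical stand-ins.
rng = np.random.default_rng(0)
pixels, n_basis = 256, 8
B = rng.standard_normal((pixels, n_basis))  # one column per basis image
a_true = rng.standard_normal(n_basis)
image = B @ a_true                          # image lying in the span of B

# Least-squares fit; the coefficients serve as the image's index.
index, *_ = np.linalg.lstsq(B, image, rcond=None)
print("index:", np.round(index, 3))
```

For an image exactly in the span of the basis, the index reproduces the true coefficients; for a general image it gives the best least-squares approximation.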
Abstract: This paper introduces an approach to constructing a set of criteria for evaluating alternative options. Content analysis was used to collect criterion elements. The elements were then classified and organized into a hierarchic structure. The reliability of the constructed criteria was evaluated in an experiment. Finally, the criteria were used to evaluate alternative options in decision-making.
Abstract: Society has grown to rely on Internet services, and the
number of Internet users increases every day. As more and more
users become connected to the network, the window of opportunity
for malicious users to do their damage becomes very great and
lucrative. The objective of this paper is to incorporate
different techniques into a classifier system to detect and
classify intrusions among normal network packets. Among several
techniques, a Steady-State Genetic-based Machine Learning
Algorithm (SSGBML) is used to detect intrusions; the
Steady-State Genetic Algorithm (SSGA), the Simple Genetic
Algorithm (SGA), a Modified Genetic Algorithm, and the Zeroth
Level Classifier System are investigated in this research. SSGA
is used as the discovery mechanism instead of SGA, because SGA
replaces all old rules with newly produced rules, preventing
old good rules from participating in the next rule generation.
The Zeroth Level Classifier System plays the role of detector
by matching incoming environment messages against classifiers
to determine whether the current message is normal or an
intrusion, and by receiving feedback from the environment.
Finally, in order to attain the best results, the Modified SSGA
enhances the discovery engine by using fuzzy logic to optimize
the crossover and mutation probabilities. The
experiments and evaluations of the proposed method were performed
with the KDD 99 intrusion detection dataset.
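The steady-state replacement scheme contrasted with SGA above can be sketched on a toy bit-counting fitness (a hypothetical stand-in for rule fitness, not the paper's detector): only the worst individual is replaced at each step, so good "rules" survive into later generations instead of being discarded wholesale.

```python
import random

# Minimal steady-state GA on a toy OneMax fitness (count of 1-bits), used
# here only as a stand-in for rule fitness. Unlike a simple GA, which
# replaces the whole population each generation, the steady-state GA
# replaces only the worst individual, preserving good rules.
random.seed(0)
N_BITS, POP = 30, 20
fitness = sum                                   # OneMax: number of 1-bits
pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]

def tournament(k=3):
    return max(random.sample(pop, k), key=fitness)

for _ in range(500):
    p1, p2 = tournament(), tournament()
    cut = random.randrange(1, N_BITS)
    child = p1[:cut] + p2[cut:]                  # one-point crossover
    child = [b ^ (random.random() < 1 / N_BITS) for b in child]  # mutation
    worst = min(range(POP), key=lambda i: fitness(pop[i]))
    if fitness(child) >= fitness(pop[worst]):    # steady-state replacement
        pop[worst] = child
best = max(pop, key=fitness)
print("best fitness:", fitness(best), "/", N_BITS)
```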
Abstract: This study analyzed environmental health risks and
people's perceptions of risks related to waste management in poor
settlements of Abidjan, to develop integrated solutions for health and
well-being improvement. The trans-disciplinary approach used relied
on remote sensing, a geographic information system (GIS),
qualitative and quantitative methods such as interviews and a
household survey (n=1800). Mitigating strategies were then
developed using an integrated participatory stakeholder workshop.
Waste management deficiencies resulting in lack of drainage and
uncontrolled solid and liquid waste disposal in the poor settlements
lead to severe environmental health risks. Health problems were
caused by direct handling of waste, as well as through broader
exposure of the population. People in poor settlements had little
awareness of health risks related to waste management in their
community and a general lack of knowledge pertaining to sanitation
systems. This unfortunate combination was the key determinant
affecting health and vulnerability. For example, an increased
prevalence of malaria (47.1%) and diarrhoea (19.2%) was
observed in the rainy season compared to the dry season (32.3%
and 14.3%, respectively). Concerted and adapted solutions that
suited all the
stakeholders concerned were developed in a participatory workshop
to allow for improvement of health and well-being.
Abstract: In this study we applied the thermal lens (TL)
technique to study the effect of particle size on the thermal
diffusivity of a cadmium sulphide (CdS) nanofluid prepared by
the γ-radiation method and containing particles of different
sizes. In the TL experimental set-up, a diode laser of
wavelength 514 nm and an intensity-stabilized He-Ne laser were
used as the excitation source and the probe beam, respectively.
The experimental results showed that the thermal diffusivity
value of the CdS nanofluid increases as the particle size
increases.
Abstract: The entropy of intuitionistic fuzzy sets is used to indicate the degree of fuzziness of an interval-valued intuitionistic fuzzy set (IvIFS). In this paper, we deal with the entropies of IvIFSs. Firstly, we propose a family of entropies on IvIFSs with a parameter λ ∈ [0, 1], which generalizes two entropy measures for IvIFSs defined independently by Zhang and by Wei, and we then prove that the
new entropy is an increasing function with respect to the parameter λ. Furthermore, a new multiple attribute decision making (MADM) method using entropy-based attribute weights is proposed to deal with decision-making situations where the alternatives on attributes are expressed by IvIFSs and the attribute weight information is unknown. Finally, a numerical example is given to illustrate the applications of the proposed method.
Abstract: Traffic Management and Information Systems, which rely on a system of sensors, aim to describe traffic in urban areas in real time using a set of parameters and to estimate them. Although the state of the art focuses on data analysis, little has been done in the sense of prediction. In this paper, we describe a machine learning system for traffic flow management and control, addressing the traffic flow prediction problem. The new algorithm is obtained by using the Random Forests algorithm as the weak learner within the AdaBoost algorithm. We show that our algorithm performs relatively well on real data and makes it possible, according to the Traffic Flow Evaluation model, to estimate and predict whether or not there is congestion at a given time at road intersections.
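As a dependency-free illustration of the boosting step the system builds on, the sketch below runs discrete AdaBoost with decision stumps as the weak learner; the stumps stand in for the Random Forests used in the paper, and the toy one-dimensional "congestion" data are hypothetical.

```python
import numpy as np

# Discrete AdaBoost with decision stumps as the weak learner. The paper plugs
# Random Forests into AdaBoost; stumps are used here only to keep the sketch
# dependency-free. The toy 1-D "congestion" data below are hypothetical.
X = np.arange(10, dtype=float)             # e.g. a single traffic feature
y = np.where((X >= 3) & (X <= 5), 1, -1)   # "congested" in the middle band

w = np.full(len(X), 1.0 / len(X))          # sample weights
stumps = []                                 # (threshold, polarity, alpha)
for _ in range(30):
    best = None
    for thr in (X[:-1] + X[1:]) / 2:       # candidate split points
        for pol in (1, -1):
            pred = np.where(X > thr, pol, -pol)
            err = w[pred != y].sum()
            if best is None or err < best[0]:
                best = (err, thr, pol, pred)
    err, thr, pol, pred = best
    err = min(max(err, 1e-10), 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)  # weak learner's vote weight
    stumps.append((thr, pol, alpha))
    w *= np.exp(-alpha * y * pred)          # mistakes gain weight
    w /= w.sum()

def predict(x):
    score = sum(a * (pol if x > thr else -pol) for thr, pol, a in stumps)
    return 1 if score > 0 else -1

acc = np.mean([predict(x) == yi for x, yi in zip(X, y)])
print("training accuracy:", acc)
```

A single stump cannot separate the middle band, but the weighted vote of boosted stumps can, which is the mechanism boosting contributes regardless of the weak learner chosen.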
Abstract: To improve the classification rate of face
recognition, a feature combination and a novel non-linear
kernel are proposed. The feature vector concatenates local
binary patterns at three different radii with Gabor wavelet
features. The Gabor features are
the mean, standard deviation and the skew of each scaling and
orientation parameter. The aim of the new kernel is to incorporate
the power of the kernel methods with the optimal balance between
the features. To verify the effectiveness of the proposed
method, numerous methods are tested on four datasets, which
consist of various emotions, orientations, configurations,
expressions and lighting conditions. Empirical results show the
superiority of the proposed technique when compared to other
methods.
Abstract: Ren et al. presented an efficient carrier frequency offset
(CFO) estimation method for orthogonal frequency division multiplexing
(OFDM), which has an estimation range as large as the
bandwidth of the OFDM signal and achieves high accuracy without
any constraint on the structure of the training sequence. However,
its detection probability of the integer frequency offset (IFO) rapidly
varies according to the fractional frequency offset (FFO) change. In
this paper, we first analyze Ren's method and define two criteria
suitable for detection of IFO. Then, we propose a novel method for
the IFO estimation based on the maximum-likelihood (ML) principle
and the detection criteria defined in this paper. The simulation results
demonstrate that the proposed method outperforms Ren's method
in terms of the IFO detection probability, irrespective of the
value of the FFO.
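A generic correlation-based ML detector for the integer offset can be sketched as follows (a simplified illustration, not Ren's estimator or the detection criteria defined in the paper): an integer CFO circularly shifts the subcarriers, so correlating the received subcarriers against shifted copies of a known training sequence and taking the argmax recovers the offset. All signal parameters below are hypothetical.

```python
import numpy as np

# Simplified ML-style integer-CFO detector for a known training symbol.
# This is a generic textbook correlation detector, not Ren's method or the
# criteria proposed in the paper; N, the QPSK sequence, and the noise level
# are hypothetical.
rng = np.random.default_rng(3)
N = 64
train = (2 * rng.integers(0, 2, N) - 1) + 1j * (2 * rng.integers(0, 2, N) - 1)

true_ifo = 5
rx = np.roll(train, true_ifo) * np.exp(1j * 0.7)  # IFO shift + phase rotation
rx += 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))  # noise

# Correlate against every candidate shift; the ML estimate is the argmax.
candidates = range(-8, 9)
metric = [np.abs(np.vdot(np.roll(train, d), rx)) for d in candidates]
ifo_hat = list(candidates)[int(np.argmax(metric))]
print("estimated IFO:", ifo_hat)
```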
Abstract: Due to the increasing and varying risks that economic units face, derivative instruments have gained substantial importance, and the trading volumes of derivatives have reached very significant levels. In parallel with these high trading volumes, researchers have developed many different models, some parametric, some nonparametric. In this study, the aim is to analyse the success of artificial neural networks (ANN) in pricing options, using S&P 100 index options data. Previous studies generally cover data on European-type call options; this study includes not only European call options but also American call and put options and European put options. Three data sets are used to build three different ANN models. One includes only data directly observed from the economic environment, i.e., strike price, spot price, interest rate, maturity, and type of contract. The others include an extra input that is not observable data but a parameter, i.e., volatility. With these detailed data, the performance of the ANN along the put/call, American/European, and moneyness dimensions is analyzed, and whether the use of volatility in the neural network analysis improves prediction performance is examined. The most striking results revealed by the study are that the ANN performs better when pricing call options than put options, and that the use of the volatility parameter as an input does not improve performance.
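For reference, the classic parametric benchmark that ANN pricing models are typically compared against is the Black-Scholes formula, whose inputs mirror the observable features listed above plus the volatility parameter. The sketch below uses textbook numbers, not the study's S&P 100 data.

```python
import math

# Black-Scholes European call price: the classic parametric benchmark whose
# inputs (spot, strike, rate, maturity, volatility) mirror the ANN features
# described in the study. The numbers below are a textbook example.
def norm_cdf(x):
    """Standard normal CDF via the error function (stdlib only)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(spot, strike, r, sigma, T):
    d1 = (math.log(spot / strike) + (r + 0.5 * sigma**2) * T) \
         / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return spot * norm_cdf(d1) - strike * math.exp(-r * T) * norm_cdf(d2)

price = bs_call(spot=100.0, strike=100.0, r=0.05, sigma=0.2, T=1.0)
print(f"Black-Scholes call price: {price:.4f}")
```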
Abstract: Modeling of a heterogeneous industrial fixed-bed
reactor for the selective dehydrogenation of heavy paraffins
over a Pt-Sn-Al2O3 catalyst has been the subject of the current
study. By applying mass and momentum balances to an appropriate
element of the reactor and using pressure drop, rate, and
deactivation equations, a detailed model of the reactor has
been obtained. Mass balance equations have been written for
five different components. In order to estimate the reactor
production over time, the reactor model, which is a set of
partial differential equations, ordinary differential
equations, and algebraic equations, has been solved
numerically. The mole percentages of paraffins, olefins,
dienes, aromatics, and hydrogen as functions of time and
reactor radius have been found by numerical solution of the
model. Results of the model have been compared with industrial
reactor data at different operation times. The comparison
successfully confirms the validity of the proposed model.
Abstract: The paper reviews the relationship between spatial
and transportation planning in the Southern African Development
Community (SADC) region of Sub-Saharan Africa. It argues that
most urbanisation in the region has occurred subsequent to
the 1950s and, accordingly, urban development has been
profoundly and negatively affected by the (misguided) spatial and
institutional tenets of modernism. It demonstrates how a
considerable amount of the poor performance of these settlements
can be directly attributed to this. Two factors in particular about the
planning systems are emphasized: the way in which programmatic
land-use planning lies at the heart of both spatial and transportation
planning; and the way in which transportation and spatial planning
have been separated into independent processes. In the final
section, the paper identifies ways of improving the planning
system. Firstly, it identifies the performance qualities which
Southern African settlements should be seeking to achieve.
Secondly, it focuses on two necessary arenas of change: the
need to replace programmatic land-use planning practices with
structural-spatial approaches; and it makes a case for making
urban corridors a spatial focus of integrated planning, as a
way of beginning the restructuring and intensification of
settlements which are currently characterised by sprawl,
fragmentation and separation.