Abstract: A human verification system is presented in this
paper. The system consists of several steps: background subtraction,
thresholding, line connection, region growing, morphology, star
skeletonization, feature extraction, feature matching, and decision
making. The proposed system combines the advantages of star
skeletonization and simple statistical features. Correlation matching
and probability voting are used for verification, followed by a
logical operation in the decision-making stage. The proposed system
uses a small number of features, and its reliability is convincing.
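The first two stages of the pipeline can be sketched as follows (a minimal numpy illustration, not the paper's implementation; the threshold value and the toy frame are our own):

```python
import numpy as np

# Sketch of background subtraction followed by thresholding: the
# absolute difference between the current frame and a reference
# background is binarized into a foreground mask.
def subtract_and_threshold(frame, background, thresh=30):
    """Return a binary foreground mask for an 8-bit grayscale frame."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return (diff > thresh).astype(np.uint8)  # 1 = foreground pixel

# Toy example: a flat background with a bright 4x4 square "person".
background = np.full((8, 8), 50, dtype=np.uint8)
frame = background.copy()
frame[2:6, 2:6] = 200
mask = subtract_and_threshold(frame, background)
assert mask.sum() == 16          # exactly the 4x4 foreground square
assert mask[0, 0] == 0           # background pixels stay zero
```

The later stages (line connection, region growing, morphology) would then operate on this binary mask.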
Abstract: Lanthanum oxide is to be recovered from monazite,
which contains about 13.44% lanthanum oxide. The principal
objective of this study is to be able to extract lanthanum oxide from
monazite from the Moemeik Myitsone area. The treatment of monazite in
this study involves three main steps: extraction of lanthanum
hydroxide from monazite using caustic soda; digestion with nitric
acid and precipitation with ammonium hydroxide; and calcination of
lanthanum oxalate to lanthanum oxide.
Abstract: Detection of incipient abnormal events is important to
improve safety and reliability of machine operations and reduce losses
caused by failures. Improper set-up or alignment of parts often leads to
severe problems in many machines. The construction of prediction
models for predicting faulty conditions is quite essential in making
decisions on when to perform machine maintenance. This paper
presents a multivariate calibration monitoring approach based on the
statistical analysis of machine measurement data. The calibration
model is used to predict two faulty conditions from historical reference
data. This approach utilizes genetic algorithm (GA)-based variable
selection, and we evaluate the predictive performance of several
prediction methods on real data. The results show that the
calibration model based on supervised probabilistic principal
component analysis (SPPCA) yielded the best performance in this work.
By adopting a proper variable selection scheme in calibration models,
prediction performance can be improved by excluding
non-informative variables from the model building step.
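The GA-based variable selection can be sketched as follows (a minimal illustration on synthetic data, not the paper's machine measurements; fitness is the validation MSE of a least-squares fit on the selected variables, and all parameter values are our own choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: y depends on variables 0, 3 and 7 only.
X = rng.standard_normal((200, 10))
y = 3 * X[:, 0] - 2 * X[:, 3] + X[:, 7] + 0.1 * rng.standard_normal(200)
X_tr, X_va, y_tr, y_va = X[:150], X[150:], y[:150], y[150:]

def fitness(mask):
    """Validation MSE of an OLS fit on the selected variables
    (lower is better); empty masks are heavily penalized."""
    if not mask.any():
        return 1e9
    coef, *_ = np.linalg.lstsq(X_tr[:, mask], y_tr, rcond=None)
    resid = y_va - X_va[:, mask] @ coef
    return float(np.mean(resid ** 2))

# Plain GA: tournament selection, uniform crossover, bit-flip
# mutation, with elitism so the best mask never gets worse.
pop = rng.random((30, 10)) < 0.5
scores = np.array([fitness(m) for m in pop])
initial_best = scores.min()
for _ in range(40):
    elite = pop[scores.argmin()].copy()
    def pick():
        i, j = rng.integers(0, len(pop), 2)
        return pop[i] if scores[i] < scores[j] else pop[j]
    children = []
    for _ in range(len(pop) - 1):
        a, b = pick(), pick()
        child = np.where(rng.random(10) < 0.5, a, b)  # uniform crossover
        child ^= rng.random(10) < 0.05                # bit-flip mutation
        children.append(child)
    pop = np.vstack([elite] + children)
    scores = np.array([fitness(m) for m in pop])

best_mask, best_score = pop[scores.argmin()], scores.min()
assert best_score <= initial_best   # elitism: never worse
assert best_mask.any()
```

Excluding non-informative columns from the fit is exactly what lowers the validation error here, mirroring the abstract's point about the model building step.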
Abstract: This article proposes an Ant Colony Optimization
(ACO) metaheuristic to minimize total makespan for scheduling a set
of jobs and assigning workers on uniformly related parallel machines.
An algorithm based on ACO has been developed and coded in
Matlab® to solve this problem. The paper
explains the steps for applying the Ant Colony approach to the problem
of minimizing makespan for the worker assignment and job
scheduling problem in a parallel machine model, and is aimed at
evaluating the strength of ACO as compared to other conventional
approaches. One data set containing 100 problems (12 jobs, 3
machines and 10 workers), which is available on the internet, has been
taken and solved with this ACO algorithm. Our
ACO-based algorithm produced drastically improved results,
especially in terms of the negligible CPU effort required
to reach the optimal solution. In our case, the time taken to solve all 100
problems is even less than the average time taken to solve one
problem in the data set by other conventional approaches such as a GA
and the SPT-A/LMC heuristics.
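The flavor of the ACO construction can be sketched as follows (a toy 12-job, 3-machine instance of our own, not the paper's data set; the worker-assignment dimension is omitted and the pheromone/heuristic weighting is simplified):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy instance: 12 jobs on 3 uniformly related machines (different
# speeds). Makespan = largest machine completion time.
proc = rng.integers(2, 10, size=12).astype(float)   # processing needs
speed = np.array([1.0, 2.0, 3.0])                   # machine speeds

def makespan(assign):
    loads = np.zeros(3)
    for j, m in enumerate(assign):
        loads[m] += proc[j] / speed[m]
    return loads.max()

# Seed the global best with a greedy earliest-finish schedule.
greedy, loads = [], np.zeros(3)
for j in range(12):
    m = int(np.argmin(loads + proc[j] / speed))
    greedy.append(m)
    loads[m] += proc[j] / speed[m]
best_assign, best_mk = greedy, makespan(greedy)

tau = np.ones((12, 3))              # pheromone trails
for _ in range(50):                 # ACO iterations
    for _ant in range(20):
        loads, assign = np.zeros(3), []
        for j in range(12):
            eta = 1.0 / (loads + proc[j] / speed + 1e-9)  # light-load bias
            p = tau[j] * eta
            m = rng.choice(3, p=p / p.sum())
            assign.append(m)
            loads[m] += proc[j] / speed[m]
        mk = makespan(assign)
        if mk < best_mk:
            best_assign, best_mk = assign, mk
    tau *= 0.9                                   # evaporation
    for j, m in enumerate(best_assign):
        tau[j, m] += 1.0 / best_mk               # reinforce best schedule

lower_bound = proc.sum() / speed.sum()           # perfect-balance bound
assert best_mk <= makespan(greedy)   # never worse than the greedy seed
assert best_mk >= lower_bound - 1e-9
```

The pheromone update biases later ants toward machine choices that appeared in good schedules, which is the mechanism the abstract credits for the low CPU effort.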
Abstract: ICA, which is generally used for the blind source separation
problem, has been tested for feature extraction in a speech recognition
system to replace the phoneme-based approach of MFCC. Applying
the generated cepstral coefficients to ICA as preprocessing has
yielded a new signal processing approach. This gives much better
results than MFCC and ICA separately, both for word and speaker
recognition. The mixing matrix A is different before and after MFCC,
as expected, since Mel is a nonlinear scale. However, cepstral coefficients
generated from Linear Predictive Coefficients, being independent,
prove to be the right candidate for ICA. Matlab is the tool used for
all comparisons. The database used consists of samples from ISOLET.
Abstract: In this paper, a wavelet-based neural network (WNN) classifier for recognizing EEG signals is implemented and tested on three sets of EEG signals (healthy subjects, patients with epilepsy, and patients with epileptic syndrome during seizure). First, the Discrete Wavelet Transform (DWT) with Multi-Resolution Analysis (MRA) is applied to decompose the EEG signal at resolution levels corresponding to its components (δ, θ, α, β and γ), and Parseval's theorem is employed to extract the percentage distribution of energy features of the EEG signal at the different resolution levels. Second, a neural network (NN) classifies these extracted features to identify the EEG type according to the percentage distribution of energy features. The performance of the proposed algorithm has been evaluated using 300 EEG signals in total. The results show that the proposed classifier is able to recognize and classify EEG signals efficiently.
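The energy-feature extraction can be sketched as follows (simplified: a plain orthonormal Haar DWT stands in for the wavelet used in the paper, and a synthetic signal stands in for an EEG):

```python
import numpy as np

# Multilevel Haar decomposition; returns the raw energy at each
# resolution level: [detail_1, ..., detail_L, approx_L].
def haar_energies(x, levels):
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # orthonormal Haar
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)
        energies.append(float(np.sum(detail ** 2)))
        x = approx
    energies.append(float(np.sum(x ** 2)))          # final approximation
    return energies

t = np.arange(256) / 256.0
signal = np.sin(2 * np.pi * 4 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
e = haar_energies(signal, levels=4)
# Parseval's theorem: the orthonormal transform preserves total
# energy, so the level energies sum to the signal energy.
assert abs(sum(e) - np.sum(signal ** 2)) < 1e-9
percent = [100 * v / sum(e) for v in e]   # percentage-energy features
assert abs(sum(percent) - 100) < 1e-9
```

The `percent` vector is the kind of fixed-length feature vector that would then be fed to the NN classifier.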
Abstract: Optimal reactive power flow is an optimization problem
with one or more objectives, such as minimizing the active power losses for
a fixed generation schedule. The control variables are generator bus
voltages, transformer tap settings and the reactive power output of the
compensating devices placed on different bus bars. The Biogeography-
Based Optimization (BBO) technique has been applied to solve
different kinds of optimal reactive power flow problems subject
to operational constraints such as the power balance constraint and
limits on line flows and bus voltages. BBO searches for the global optimum
mainly through two steps: Migration and Mutation. In the present
work, BBO has been applied to solve the optimal reactive power
flow problems on IEEE 30-bus and standard IEEE 57-bus power
systems for minimization of active power loss. The superiority of the
proposed method has been demonstrated. Considering the quality of
the solution obtained, the proposed method seems to be a promising
one for solving these problems.
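The migration and mutation steps can be sketched as follows (a minimal illustration on a toy quadratic loss standing in for active power loss, not a power-flow model; all rates and counts are our own choices):

```python
import numpy as np

rng = np.random.default_rng(3)

def loss(x):                        # stand-in for active power loss
    return float(np.sum(x ** 2))

n_hab, n_var = 20, 5
pop = rng.uniform(-5, 5, (n_hab, n_var))   # habitats = candidate solutions
costs = np.array([loss(h) for h in pop])
initial_best = costs.min()

for _ in range(100):
    order = np.argsort(costs)       # rank habitats, best first
    pop, costs = pop[order], costs[order]
    rank = np.arange(n_hab)
    lam = rank / (n_hab - 1)        # immigration rate: high for bad habitats
    mu = 1 - lam                    # emigration rate: high for good habitats
    new_pop = pop.copy()
    for i in range(1, n_hab):       # keep habitat 0 as the elite
        for v in range(n_var):
            if rng.random() < lam[i]:            # migration step
                src = rng.choice(n_hab, p=mu / mu.sum())
                new_pop[i, v] = pop[src, v]
            if rng.random() < 0.02:              # mutation step
                new_pop[i, v] = rng.uniform(-5, 5)
    pop = new_pop
    costs = np.array([loss(h) for h in pop])

assert costs.min() <= initial_best   # elitism: best never worsens
```

Migration shares variable values from low-cost habitats (high emigration rate), while mutation maintains diversity, matching the two steps named in the abstract.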
Abstract: A 7-step method (with 25 sub-steps) to assess risk of
air pollutants is introduced. These steps are: pre-considerations,
sampling, statistical analysis, exposure matrix and likelihood, dose-response
matrix and likelihood, total risk evaluation, and discussion
of findings. All the terms and expressions used are well understood;
however, almost all steps have been modified, improved,
and coupled in such a way that a comprehensive method has been
obtained. Accordingly, SADRA (Statistical Analysis-Driven Risk
Assessment) emphasizes extensive and ongoing application of
analytical statistics in traditional risk assessment models. A sulfur
dioxide case study validates the claim and provides a good
illustration of the method.
Abstract: A one-step conservative level set method, combined with a global mass correction method, is developed in this study to simulate incompressible two-phase flows. The present framework does not need to solve the conservative level set scheme in two separate steps, and the global mass can be exactly conserved; the present method is therefore more efficient than the two-step conservative level set scheme. Dispersion-relation-preserving schemes are utilized for the advection terms. The pressure Poisson equation solver is ported to GPU computation using the pCDR library developed by the National Center for High-Performance Computing, Taiwan. SMP parallelization is used to accelerate the rest of the calculations. Three benchmark problems were solved for the performance evaluation. Good agreement with the reference solutions is demonstrated for all the investigated problems.
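The global mass correction step can be sketched as follows (a simplified illustration, assuming a conservative level set field φ ∈ [0, 1] whose grid sum represents the conserved mass; the uniform-shift correction and tolerances are our own, not the paper's scheme):

```python
import numpy as np

def correct_mass(phi, target_mass, tol=1e-10):
    """Uniformly shift phi (clipped to [0, 1]) so its sum matches
    the target mass; bisection works because the clipped sum is
    monotone in the shift."""
    lo, hi = -1.0, 1.0
    for _ in range(100):
        c = 0.5 * (lo + hi)
        m = np.clip(phi + c, 0.0, 1.0).sum()
        if abs(m - target_mass) < tol:
            break
        if m < target_mass:
            lo = c
        else:
            hi = c
    return np.clip(phi + c, 0.0, 1.0)

# A smooth circular blob whose mass has drifted during advection.
x, y = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
phi = 1.0 / (1.0 + np.exp((np.hypot(x - 0.5, y - 0.5) - 0.25) / 0.02))
target = phi.sum() * 1.01            # pretend 1% of the mass drifted
phi_corr = correct_mass(phi, target)
assert abs(phi_corr.sum() - target) < 1e-6
assert phi_corr.min() >= 0.0 and phi_corr.max() <= 1.0
```

Applying such a correction once per step is what lets a one-step scheme keep the global mass exactly conserved.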
Abstract: There are various approaches to implement quality
improvements. Organizations aim for a management standard which
is capable of providing customers with quality assurance on their
product/service via continuous process improvement. Carefully
planned steps are necessary to ensure the right quality improvement
methodology (QIM) and business operations are consistent, reliable
and truly meet the customers' needs. This paper traces the evolution
of QIM in Malaysia's Information Technology (IT) industry in the
past, present and future; highlights some of the thoughts of
researchers who contributed to the science and practice of quality;
and identifies leading methodologies in use today. Some of the
misconceptions and mistakes leading to quality system failures will
also be examined and discussed. This paper aims to provide a general
overview of different types of QIMs available for IT businesses in
maximizing business advantages, enhancing product quality,
improving process routines and increasing performance earnings.
Abstract: In areas where high-quality water is not
available, unconventional water sources are used for irrigation.
Household leachate is one such source, used in dry and
semi-dry areas to water bare trees and plants. It meets
the plants' needs and also has some effects on the soil, but at the same
time it might cause some problems as well. This study
evaluates the effect of using compost leachate on the density of soil
iron, using a statistical design called "split plot" with two
main treatments, one subsidiary treatment and three repetitions of the
design over a three-month period. The main N treatments comprised
irrigation using well water as a blank treatment, and the main I
treatments comprised irrigation using leachate and well water
concurrently. The subsidiary treatments were DI (drop irrigation)
and SDI (sub-drop irrigation). In the established plots, 36
two-year-old pine and cypress shrubs were randomly grown, and two
months later the treatment began. The results revealed a
significant variation between the main treatment and the control
regarding pH decline in the soil, which was related to the amount of
leachate injected: over time, with the use of leachate, the
pH level fell by as much as 0.46, and it also rose again with large
amounts of leachate. Underneath drop irrigation gave better
results than sub-drop irrigation, since it keeps the soil texture fixed.
Abstract: In this paper, an improved edge detection algorithm
based on fuzzy combination of mathematical morphology and
wavelet transform is proposed. The combined method is proposed to
overcome the limitation of wavelet based edge detection and
mathematical morphology based edge detection in noisy images.
Experimental results show the superiority of the proposed method
compared to the traditional Prewitt, wavelet-based and morphology-based
edge detection methods. The proposed method is an effective
edge detection method for noisy images and keeps edges clear and
continuous.
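The morphological ingredient of such a combination can be sketched as follows (only the morphological gradient over a 3×3 window is shown; the paper's fuzzy fusion with wavelet responses is omitted):

```python
import numpy as np

# Morphological gradient: dilation (local max) minus erosion
# (local min) over a 3x3 neighborhood responds at intensity edges.
def morph_gradient(img):
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    windows = np.stack([padded[i:i + h, j:j + w]
                        for i in range(3) for j in range(3)])
    dilation = windows.max(axis=0)
    erosion = windows.min(axis=0)
    return dilation - erosion

# Vertical step edge: the gradient fires only near the boundary.
img = np.zeros((8, 8)); img[:, 4:] = 1.0
g = morph_gradient(img)
assert np.all(g[:, 3:5] == 1.0)   # columns straddling the edge respond
assert np.all(g[:, 0:2] == 0.0)   # flat regions stay silent
assert np.all(g[:, 6:] == 0.0)
```

In a fused detector, this response map would be combined with wavelet-domain edge evidence through fuzzy rules to suppress noise-induced responses.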
Abstract: Kepsut-Dursunbey volcanic field (KDVF) is located
in NW Turkey and contains various products of the post-collisional
Neogene magmatic activity. Two distinct volcanic suites have been
recognized; the Kepsut volcanic suite (KVS) and the Dursunbey
volcanic suite (DVS). The KVS includes basaltic trachyandesite–basaltic
andesite–andesite lavas and associated pyroclastic rocks. The
DVS consists of dacite-rhyodacite lavas and extensive pumice-ash
fall and flow deposits. Petrographical features (i.e. existence of
xenocrysts, glomerocrysts, and mixing-compatible textures) and
mineral chemistry of phenocryst assemblages of both suites provide
evidence for magma mixing/AFC. Calculated crystallization
pressures and temperatures give values of 5.7–7.0 kbar and 927–982
°C for the KVS and 3.7–5.3 kbar and 783–787 °C for the DVS,
indicating separate magma reservoirs and crystallization in magma
chambers at deep and mid crustal levels, respectively. These
observations support the establishment and evolution of the KDVF
magma system being promoted by episodic basaltic inputs, which may
generate and mix with crustal melts.
Abstract: Chaos and fractals are novel fields of physics and mathematics that offer a new way of viewing the universe and have generated many ideas for solving current problems. In this paper, a novel algorithm based on a chaotic sequence generator, with a high ability to adapt and reach the global optimum, is proposed. The adaptive ability of the proposed algorithm operates in two steps: the first is a breadth-first search and the second is a depth-first search. The proposed algorithm is evaluated on two benchmark functions, the Camel function and the Schaffer function. Furthermore, the proposed algorithm is applied to optimizing the training of multilayer neural networks.
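The two-phase idea can be sketched as follows (a logistic-map chaos generator with a simple sphere objective standing in for the Camel and Schaffer functions; all constants are our own choices):

```python
import numpy as np

def logistic(x):                  # chaotic sequence generator, r = 4
    return 4.0 * x * (1.0 - x)

def f(p):                         # toy objective: global minimum 0 at origin
    return float(np.sum(p ** 2))

z = 0.7                           # chaos state (any non-periodic seed)
def next_point(lo, hi, dim=2):    # map chaos values into a search box
    global z
    vals = []
    for _ in range(dim):
        z = logistic(z)
        vals.append(lo + (hi - lo) * z)
    return np.array(vals)

# Phase 1: breadth-first search over the whole domain [-5, 5]^2.
best, best_val = None, float("inf")
for _ in range(2000):
    p = next_point(-5.0, 5.0)
    if f(p) < best_val:
        best, best_val = p, f(p)

# Phase 2: depth-first search, chaotic steps around the incumbent
# with a slowly shrinking radius.
radius = 1.0
for _ in range(1500):
    cand = best + radius * next_point(-1.0, 1.0)
    if f(cand) < best_val:
        best, best_val = cand, f(cand)
    radius *= 0.995

assert best_val < 1e-3
```

The breadth phase exploits the space-filling behavior of the chaotic sequence; the depth phase reuses the same generator as a local perturbation source.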
Abstract: Human identification at a distance has recently gained
growing interest from computer vision researchers. Gait recognition
aims essentially to address this problem by identifying people based
on the way they walk [1]. Gait recognition has 3 steps. The first step
is preprocessing, the second step is feature extraction and the third
one is classification. This paper focuses on the classification step that
is essential to increase the CCR (Correct Classification Rate).
A Multilayer Perceptron (MLP) is used in this work. Neural networks
imitate the human brain to perform intelligent tasks [3]. They can
represent complicated relationships between input and output and
acquire knowledge about these relationships directly from the data
[2]. In this paper we apply an MLP NN to the 11 views in our database and
compare the CCR values for these views. Experiments are performed
on the NLPR database, and the effectiveness of the proposed
method for gait recognition is demonstrated.
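The classification step can be sketched as follows (a minimal one-hidden-layer MLP in numpy, trained on toy XOR data that stands in for the real gait-feature vectors):

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])       # toy binary labels

W1 = rng.standard_normal((2, 8)) * 0.5; b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)) * 0.5; b2 = np.zeros(1)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def forward(X):
    h = np.tanh(X @ W1 + b1)                 # hidden layer
    return h, sigmoid(h @ W2 + b2)           # output layer

_, out0 = forward(X)
loss0 = float(np.mean((out0 - y) ** 2))

lr = 0.5
for _ in range(2000):                        # plain batch gradient descent
    h, out = forward(X)
    d_out = (out - y) * out * (1 - out)      # dMSE/dlogit (up to a constant)
    d_h = (d_out @ W2.T) * (1 - h ** 2)      # backprop through tanh
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

_, out = forward(X)
loss = float(np.mean((out - y) ** 2))
assert loss < loss0       # training reduced the error
```

For the real task, the inputs would be extracted gait features per view and the outputs one unit per subject class, with CCR computed on held-out sequences.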
Abstract: High quality requirements analysis is one of the most
crucial activities to ensure the success of a software project, so that
requirements verification for software systems becomes more and more
important in Requirements Engineering (RE); it is one of the most
helpful strategies for improving the quality of software systems.
Related works show that requirement elicitation and analysis can be
facilitated by ontological approaches and semantic web technologies.
In this paper, we propose a hybrid method which aims to verify
requirements with structural and formal semantics to detect
interactions. The proposed method is twofold: one part models
requirements with the semantic web language OWL to construct a
semantic context; the other is a set of interaction detection rules which
are derived from scenario-based analysis and represented in the
Semantic Web Rule Language (SWRL). The SWRL-based rules work
with rule engines such as Jess to reason over the semantic context of the
requirements and thus detect interactions. The benefits of the proposed
method lie in three aspects: the method (i) provides systematic steps
for modeling requirements with an ontological approach, (ii) offers
synergy between requirements elicitation and domain engineering for
knowledge sharing, and (iii) the proposed rules can systematically assist
in requirements interaction detection.
Abstract: The hypercube Q_n is one of the most well-known
and popular interconnection networks, and the k-ary n-cube Q_n^k is
an enlarged family from Q_n that keeps many pleasing properties
of hypercubes. In this article, we study the panpositionable
hamiltonicity of Q_n^k for k ≥ 3 and n ≥ 2. Let x, y ∈ V(Q_n^k)
be two arbitrary vertices and C be a hamiltonian cycle of Q_n^k.
We use d_C(x, y) to denote the distance between x and y on the
hamiltonian cycle C. Define l as an integer satisfying
d(x, y) ≤ l ≤ (1/2)|V(Q_n^k)|. We prove the following:
• When k = 3 and n ≥ 2, there exists a hamiltonian cycle C
of Q_n^k such that d_C(x, y) = l.
• When k ≥ 5 is odd and n ≥ 2, we require that l ∉ S,
where S is a set of specific integers. Then there exists a
hamiltonian cycle C of Q_n^k such that d_C(x, y) = l.
• When k ≥ 4 is even and n ≥ 2, we require l − d(x, y) to be
even. Then there exists a hamiltonian cycle C of Q_n^k such
that d_C(x, y) = l.
The result is optimal since the restrictions on l are due to the
structure of Q_n^k by definition.
Abstract: The purpose of this work is fast design optimization of
the seal chamber. The study includes the mass transfer between the lower
and upper chambers of the seal chamber for hot-water application pumps.
The use of the Fluent 12.1 commercial code made it possible to capture
complex flow with heat and mass transfer, radiation, Taylor instability,
and buoyancy effects. The realizable k-epsilon model was used for
turbulence modeling. Radiation heat losses were taken into account.
The temperature distribution at seal region is predicted with respect
to heat addition.
Results show the possibility of simplifying the model by
excluding the water domain in the lower chamber from the calculations. The CFD
simulations make it possible to improve the seal chamber design to meet a target
water temperature around the seal. This study can be used for the
analysis of different seal chamber configurations.
Abstract: Phorbol-12-myristate-13-acetate (TPA) is a synthetic analogue of phorbol ester (PE), a natural toxic compound of plants of the family Euphorbiaceae. The oil extracted from plants of this family is primarily a useful source of biofuel. However, this oil might also be used as a foodstuff due to its significant nutritional content. The limitations on utilizing the oil as a foodstuff are mainly due to the toxicity of PE. Currently, the majority of PE detoxification processes are expensive, as they include a multi-step alcohol extraction sequence.
Ozone is considered a strong oxidative agent. It reacts with PE by attacking the carbon–carbon double bond of PE; this modification of the PE molecular structure yields a non-toxic ester with high lipid content.
This report presents data on the development of a simple and cheap PE detoxification process using water as a buffer and ozone as the reactive component. The core of this new technique is the application of a new microscale plasma unit for ozone production; the technology permits ozone injection into the water–TPA mixture in the form of microbubbles.
The efficacy of a heterogeneous process depends on the diffusion coefficient, which can be controlled through contact time and interfacial area. The low rise velocity of microbubbles and their high surface-to-volume ratio allow efficient mass transfer to be achieved during the process. Direct injection of ozone is the most efficient way to process such a highly reactive and short-lived chemical.
Data on the plasma unit behavior are presented, and the influence of gas oscillation technology on the microbubble production mechanism is discussed. Data on the overall process efficacy for TPA degradation are shown.
Abstract: An optimal solution for a large number of constraint
satisfaction problems can be found using the technique of
substitution and elimination of variables analogous to the technique
that is used to solve systems of equations. A decision function
f(A) = max(A²) is used to determine which variables to eliminate. The
algorithm can be expressed in six lines and is remarkable in both its
simplicity and its ability to find an optimal solution. However, it is
inefficient in that it needs to square the updated A matrix after each
variable elimination. To overcome this inefficiency, the algorithm is
analyzed and it is shown that the A matrix only needs to be squared
once, at the first step of the algorithm, and then incrementally updated
in subsequent steps, resulting in a significant improvement and an
algorithm complexity of O(n³).
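The incremental-update idea can be illustrated as follows (a hedged sketch, not the paper's six-line algorithm: we assume for illustration that an elimination step changes a single row of A, and the helper name is ours). If the change is D = e_k vᵀ, then (A + D)² = A² + AD + DA + D², and each correction term costs only O(n²):

```python
import numpy as np

def square_after_row_update(A_sq, A, k, new_row):
    """Given A_sq = A @ A, return (A')^2 where A' equals A with row k
    replaced by new_row, using O(n^2) work instead of a full re-square."""
    v = new_row - A[k]                        # D = e_k v^T is the change
    AD = np.outer(A[:, k], v)                 # A @ D: outer product, O(n^2)
    DA = np.zeros_like(A); DA[k] = v @ A      # D @ A: one row, O(n^2)
    DD = np.zeros_like(A); DD[k] = v[k] * v   # D @ D: one scaled row, O(n)
    return A_sq + AD + DA + DD

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
A_sq = A @ A
new_row = rng.standard_normal(6)
A_sq_inc = square_after_row_update(A_sq, A, 2, new_row)
A_new = A.copy(); A_new[2] = new_row
assert np.allclose(A_sq_inc, A_new @ A_new)   # matches the direct square
```

Applied once per elimination step across n steps, such O(n²) updates in place of repeated O(n³) squarings yield the overall O(n³) complexity stated above.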