Abstract: Failure modes and effects analysis (FMEA) is an effective technique for preventing potential problems and identifying the actions needed to remove their causes. Oil-producing companies play a critical role in the oil industry of Iran as a developing country, and among them Sepahan Oil Co. makes a considerable contribution. The aim of this research is to show how FMEA can be applied to improve product quality at Sepahan Oil Co. For this purpose, the company's four-liter production line was selected for investigation. The findings show that the application of FMEA reduced scrap from 50,000 ppm to 5,000 ppm and decreased oil waste by 0.92 percent.
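FMEA analyses commonly rank failure modes by a Risk Priority Number, RPN = severity × occurrence × detection. The abstract does not describe the scoring used at Sepahan Oil Co., so the failure modes and 1-10 ratings below are hypothetical illustrations of the ranking step only.

```python
# Minimal FMEA ranking sketch: RPN = severity * occurrence * detection.
# The failure modes and 1-10 ratings below are hypothetical, not data
# from the Sepahan Oil Co. study.
failure_modes = [
    # (name, severity, occurrence, detection)
    ("cap leakage",     7, 5, 4),
    ("underfilled can", 5, 6, 3),
    ("label misprint",  3, 4, 2),
]

ranked = sorted(
    ((name, s * o * d) for name, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for name, rpn in ranked:
    print(f"{name}: RPN={rpn}")
```

Corrective actions are then targeted at the highest-RPN modes first, and the ratings are re-scored after the action is taken.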
Abstract: Cognitive models make it possible to predict some aspects of the utility and usability of human-machine interfaces (HMI) and to simulate the interaction with these interfaces. Prediction is based on a task analysis, which investigates what a user is required to do, in terms of actions and cognitive processes, to achieve a task. Task analysis facilitates the understanding of the system's functionalities. Cognitive models belong to the analytical approaches, which do not involve users during the development process of the interface. This article presents a study of the evaluation of human-machine interaction with a contextual assistant's interface using the ACT-R and GOMS cognitive models. The present work shows how these techniques may be applied in HMI evaluation, design, and research, emphasizing first the task analysis and second the task execution time. In order to validate and support our results, an experimental study of user performance was conducted at the DOMUS laboratory during interaction with the contextual assistant's interface. The results show that the GOMS and ACT-R models give good and excellent predictions, respectively, of user performance at both the task level and the object level. The simulated results are thus very close to those obtained in the experimental study.
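A GOMS-family execution-time prediction can be sketched with the Keystroke-Level Model, which sums standard operator times over the action sequence of a task. The operator values below are the commonly cited KLM averages; the task sequence is a hypothetical example, not the contextual assistant task evaluated in the study.

```python
# Keystroke-Level Model (KLM) sketch: predicted execution time is the
# sum of standard operator times (commonly cited average values).
KLM_TIMES = {
    "K": 0.28,  # keystroke (average typist)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # mouse button press or release
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(operators):
    """Predicted task execution time in seconds for an operator sequence."""
    return sum(KLM_TIMES[op] for op in operators)

# Hypothetical task: think, point at a button, click it (press + release).
print(predict_time(["M", "P", "B", "B"]))
```

Comparing such predicted times against measured user times is the validation step the abstract describes.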
Abstract: Vector quantization is a powerful tool for speech coding applications. This paper deals with LPC coding of speech signals using a new technique called multi switched split vector quantization (MSSVQ), a hybrid of the multistage, switched, and split vector quantization techniques. The spectral distortion performance, computational complexity, and memory requirements of MSSVQ are compared to those of split vector quantization (SVQ), multistage vector quantization (MSVQ), and switched split vector quantization (SSVQ). The results show that MSSVQ has better spectral distortion performance, lower computational complexity, and lower memory requirements than all of the above product-code vector quantization techniques. Computational complexity is measured in floating point operations (flops), and memory requirements are measured in floats.
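The product-code idea underlying these schemes can be illustrated with plain split vector quantization: the input vector is split into sub-vectors, and each sub-vector is quantized against its own codebook by nearest-neighbor search. This is a generic SVQ sketch with toy two-entry codebooks, not the MSSVQ scheme of the paper.

```python
# Split vector quantization sketch: split the vector into parts and
# quantize each part independently with its own codebook.
def nearest(codebook, sub):
    # Nearest-neighbor search by squared Euclidean distance.
    return min(range(len(codebook)),
               key=lambda i: sum((c - x) ** 2 for c, x in zip(codebook[i], sub)))

def svq_encode(vector, codebooks):
    """Return one codebook index per sub-vector."""
    k = len(vector) // len(codebooks)
    parts = [vector[i * k:(i + 1) * k] for i in range(len(codebooks))]
    return [nearest(cb, part) for cb, part in zip(codebooks, parts)]

def svq_decode(indices, codebooks):
    out = []
    for idx, cb in zip(indices, codebooks):
        out.extend(cb[idx])
    return out

# Toy example: a 4-dimensional vector split into two 2-dimensional parts.
codebooks = [
    [(0.0, 0.0), (1.0, 1.0)],   # codebook for the first sub-vector
    [(0.0, 1.0), (1.0, 0.0)],   # codebook for the second sub-vector
]
idx = svq_encode([0.9, 1.1, 0.1, 0.8], codebooks)
print(idx, svq_decode(idx, codebooks))
```

Splitting reduces memory because each sub-vector needs only a small codebook instead of one exponentially large full-vector codebook; the switched and multistage components refine this further.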
Abstract: This paper develops a NOx emission model of an acid gas incinerator using Nelder-Mead least squares support vector regression (LS-SVR). The Malaysian DOE is actively enforcing the Clean Air Regulation, which mandates the installation of analytical instrumentation known as a Continuous Emission Monitoring System (CEMS) to report emission levels online to the DOE. As a hardware-based analyzer, a CEMS is expensive, maintenance intensive, and often unreliable. A software-based predictive technique is therefore often preferred and considered a feasible alternative to a CEMS for regulatory compliance. The LS-SVR model is built from the emissions of an acid gas incinerator operating in an LNG complex. Simulated annealing (SA) is first used to determine the initial hyperparameters, which are then further optimized with the Nelder-Mead simplex algorithm based on the performance of the model. The LS-SVR model is shown to outperform a benchmark model based on backpropagation neural networks (BPNN) on both training and testing data.
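LS-SVR replaces the inequality constraints of standard SVR with equalities, so training reduces to solving one linear system. A minimal RBF-kernel fit on toy data is sketched below; the SA and Nelder-Mead hyperparameter search from the paper is omitted, and the gamma/sigma values are arbitrary assumptions.

```python
import numpy as np

def rbf(a, b, sigma):
    # RBF kernel matrix between row sets a and b.
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvr_fit(X, y, gamma, sigma):
    """Solve the LS-SVR dual linear system; returns (alpha, bias)."""
    n = len(X)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma   # 1/gamma regularizes
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]

def lssvr_predict(Xtr, alpha, b, Xnew, sigma):
    return rbf(Xnew, Xtr, sigma) @ alpha + b

# Toy 1-D regression: fit a smooth curve (hyperparameters are arbitrary).
X = np.linspace(0.0, 3.0, 20).reshape(-1, 1)
y = np.sin(X).ravel()
alpha, b = lssvr_fit(X, y, gamma=100.0, sigma=0.5)
pred = lssvr_predict(X, alpha, b, X, sigma=0.5)
print(float(np.abs(pred - y).max()))
```

In the paper's setup, gamma and sigma would not be fixed by hand: SA proposes a starting point and the Nelder-Mead simplex then minimizes a validation error over them.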
Abstract: Clustering is a well-known technique in data mining, and the k-means algorithm is one of the most widely used clustering techniques. Solutions obtained with it depend on the initialization of the cluster centers. In this article we propose a new algorithm to initialize the clusters, based on a set of medians extracted from the dimension with maximum variance. The algorithm has been applied to different data sets and good results were obtained.
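The initialization idea can be sketched as follows: pick the dimension with maximum variance, sort the points along it, and take the median point of each of k slices as an initial center. The slicing rule (k equal-frequency groups) is my assumption about the detail the abstract leaves open.

```python
# Sketch of a variance/median-based k-means initialization:
#  1) pick the dimension with maximum variance,
#  2) sort points along it and cut them into k equal-size groups,
#  3) use each group's median point as an initial center.
# The exact grouping rule is an assumption, not the paper's specification.
def init_centers(points, k):
    dims = len(points[0])
    def variance(d):
        vals = [p[d] for p in points]
        mean = sum(vals) / len(vals)
        return sum((v - mean) ** 2 for v in vals)
    d_max = max(range(dims), key=variance)          # max-variance dimension
    ordered = sorted(points, key=lambda p: p[d_max])
    size = len(ordered) // k
    centers = []
    for i in range(k):
        group = ordered[i * size:(i + 1) * size] if i < k - 1 else ordered[i * size:]
        centers.append(group[len(group) // 2])      # median point of the slice
    return centers

pts = [(0, 5), (1, 5), (2, 5), (10, 5), (11, 5), (12, 5)]
print(init_centers(pts, 2))
```

Because the centers are real data points spread along the most variable direction, they tend to land in different clusters, which is what makes k-means less sensitive to the random-start problem.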
Abstract: With the fast evolution of digital data exchange, information security is becoming much more important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. In this paper, we analyze the Advanced Encryption Standard (AES) and add a key stream generator (A5/1, W7) to AES to improve encryption performance, mainly for images characterized by reduced entropy. Both techniques were implemented for experimental purposes, and detailed results in terms of security analysis and implementation are given. A comparative study with traditional encryption algorithms shows the superiority of the modified algorithm.
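The role of the keystream generator can be illustrated in outline: a stream cipher such as A5/1 or W7 produces a keystream that is XORed with the image bytes, masking the repetitive structure of low-entropy images around the AES stage. The toy single-LFSR generator below stands in for A5/1 (which actually combines three irregularly clocked LFSRs); it illustrates the XOR-whitening step only and is neither a secure cipher nor the authors' exact construction.

```python
# Toy keystream whitening sketch: a single 16-bit LFSR stands in for an
# A5/1-style generator, and its output is XORed with the data bytes.
# Illustration only; the AES stage of the paper's scheme is not shown.
def lfsr_bytes(state, taps, n):
    """Generate n keystream bytes from a Fibonacci LFSR."""
    out = []
    for _ in range(n):
        byte = 0
        for _ in range(8):
            bit = 0
            for t in taps:
                bit ^= (state >> t) & 1
            state = ((state << 1) | bit) & 0xFFFF
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

def whiten(data, seed=0xACE1):
    ks = lfsr_bytes(seed, taps=(15, 13, 12, 10), n=len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

plain = b"low-entropy image rows look alike"
cipher = whiten(plain)
print(whiten(cipher) == plain)   # XOR whitening is its own inverse
```

Because XOR with the same keystream is an involution, the same routine serves as both the masking and the unmasking step.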
Abstract: Fast depth estimation from binocular vision is often desired for autonomous vehicles, but most algorithms cannot easily be put into practice because of their high computational cost. We present an image-processing technique that can quickly estimate a depth image from binocular vision images. The depth is estimated by finding the lines that represent the best-matched areas in the disparity space image; an edge-emphasizing filter is used when detecting these lines, and the final depth estimate is produced after a smoothing filter. Our method is a compromise between local methods and global optimization.
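The underlying disparity search can be sketched with generic scanline block matching: for each pixel, the horizontal disparity minimizing a sum-of-absolute-differences cost is selected. This illustrates the disparity-space cost volume only, not the paper's line-detection method or its filters; the image rows are toy data.

```python
# Generic scanline block-matching sketch: for each pixel, choose the
# horizontal disparity that minimizes sum-of-absolute-differences (SAD).
def disparity_row(left, right, max_disp, win=1):
    """1-D disparity estimates for one image row (lists of intensities)."""
    n = len(left)
    disp = [0] * n
    for x in range(n):
        best, best_d = float("inf"), 0
        for d in range(min(max_disp, x) + 1):
            cost = 0
            for w in range(-win, win + 1):
                xl, xr = x + w, x - d + w
                if 0 <= xl < n and 0 <= xr < n:
                    cost += abs(left[xl] - right[xr])
                else:
                    cost += 255  # penalize windows that fall off the image
            if cost < best:
                best, best_d = cost, d
        disp[x] = best_d
    return disp

# The right row is the left row shifted by 2 pixels, so the bright
# feature should come out with disparity 2.
left = [0, 0, 0, 0, 50, 200, 50, 0, 0, 0]
right = [0, 0, 50, 200, 50, 0, 0, 0, 0, 0]
print(disparity_row(left, right, max_disp=3)[5])
```

Depth then follows from disparity via the usual relation depth = focal_length × baseline / disparity.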
Abstract: In this paper, an improved technique for contingency ranking using an artificial neural network (ANN) is presented. The proposed approach applies multi-layer perceptrons trained by backpropagation to contingency analysis. Severity indices for dynamic stability assessment are presented; these indices are based on the concept of coherency and on three dot products of the system variables. It is well known that some indices work better than others for a particular power system. This paper, along with test results on several different systems, demonstrates that combining indices with an ANN provides better ranking than any single index. The presented results were obtained using the power system simulator PSS/E and MATLAB 6.5.
Abstract: In this paper, a technique for increasing the convergence rate of a fractionally spaced channel equalizer is proposed. Instead of symbol-spaced updating of the equalizer filter, a mechanism has been devised to update the filter at a higher rate. This makes the equalizer filter converge faster and is therefore less time-consuming. The proposed technique has been simulated and tested for two-ray modeled channels with various delay spreads, including minimum-phase and nonminimum-phase channels. Simulation results suggest that the proposed technique outperforms the conventional technique of symbol-spaced updating of the equalizer filter.
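The update being accelerated is the standard LMS tap adaptation; the paper's contribution amounts to running it at the fractional sample rate rather than once per symbol. Below, only the core LMS update is illustrated, at symbol rate, on a toy two-ray channel; the channel coefficients, step size, and training length are arbitrary assumptions.

```python
# LMS tap-update sketch for an adaptive equalizer on a toy two-ray
# channel. The paper's proposal runs this update at the fractional
# (sub-symbol) rate; here the core update rule itself is shown.
import random

random.seed(0)
channel = [1.0, 0.4]            # assumed two-ray channel impulse response
mu = 0.05                       # LMS step size
taps = [0.0] * 4                # equalizer filter taps

symbols = [random.choice((-1.0, 1.0)) for _ in range(2000)]
received = [
    sum(channel[j] * symbols[i - j] for j in range(len(channel)) if i - j >= 0)
    for i in range(len(symbols))
]

errs = []
for i in range(len(taps), len(symbols)):
    window = received[i - len(taps) + 1:i + 1][::-1]   # newest sample first
    y = sum(t * x for t, x in zip(taps, window))       # equalizer output
    e = symbols[i] - y                                 # error vs. training symbol
    taps = [t + mu * e * x for t, x in zip(taps, window)]
    errs.append(e * e)

print(sum(errs[-200:]) / 200)   # late mean-squared error after convergence
```

Updating once per received sample instead of once per symbol multiplies the number of such correction steps per unit time, which is the source of the faster convergence the abstract reports.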
Abstract: The zinc and iron environments in different growth stages have been studied with EXAFS and XANES at the Brookhaven Synchrotron Light Source. Tissue samples included meat, organ, vegetable, leaf, and yeast, and the project studied their EXAFS and XANES spectra at the Zn and Fe K-edges. Duck embryo samples show that brain and intestine contain a shorter EXAFS-determined Zn-N/O bond, as do fresh yeast versus reconstituted live yeast and green leaf versus yellow leaf. The XANES Fourier-transform characteristic length could serve as a functionality index for selected types of tissue samples in various physical states. The extension to the development of functional synchrotron imaging for tissue engineering applications based on this spectroscopic technique is discussed.
Abstract: Consumer electronics are pervasive: it is impossible to imagine a household or office without DVD players, digital cameras, printers, mobile phones, shavers, electric toothbrushes, etc. In the absence of universal standards, these devices operate at different voltage levels ranging from 1.8 to 20 VDC, while the available supply is usually 120/230 VAC at 50/60 Hz. This situation makes an individual electrical energy conversion system necessary for each device. Such converters usually involve several conversion stages and often operate with excessive losses and poor reliability. The aim of the project presented in this paper is to design and implement a multi-channel DC/DC converter system that customizes the output voltage and current ratings to the requirements of the load. Distributed, multi-agent techniques will be applied for the control of the DC/DC converters.
Abstract: An array antenna system with innovative signal processing can improve the resolution of source direction-of-arrival (DoA) estimation. High-resolution techniques take advantage of array antenna structures to better process the incoming waves, and they can also identify the directions of multiple targets. This paper investigates the performance of two DoA estimation algorithms, Capon and MUSIC, on a uniform linear array (ULA). The simulation results show that for both algorithms the resolution improves as the number of snapshots, the number of array elements, the signal-to-noise ratio, and the separation angle θ between the two sources increase.
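A minimal MUSIC run on a ULA can be sketched as follows: form the sample covariance of the snapshots, take the eigenvectors of its noise subspace, and scan a steering-vector grid for peaks of the pseudospectrum 1/||Eₙᴴa(θ)||². The array size, source angles, SNR, and snapshot count below are illustrative assumptions, not the paper's simulation setup.

```python
import numpy as np

# MUSIC pseudospectrum sketch on a uniform linear array (toy setup).
rng = np.random.default_rng(0)
M, d = 8, 0.5                        # 8-element ULA, half-wavelength spacing
true_deg = [10.0, 40.0]              # assumed source directions
T = 200                              # number of snapshots

def steering(theta):
    """ULA steering vector for direction theta (radians)."""
    return np.exp(2j * np.pi * d * np.arange(M) * np.sin(theta))

A = np.stack([steering(np.deg2rad(a)) for a in true_deg], axis=1)
S = rng.standard_normal((2, T)) + 1j * rng.standard_normal((2, T))
noise = 0.1 * (rng.standard_normal((M, T)) + 1j * rng.standard_normal((M, T)))
X = A @ S + noise                    # received snapshots
R = X @ X.conj().T / T               # sample covariance matrix

w, V = np.linalg.eigh(R)             # eigenvalues in ascending order
En = V[:, : M - 2]                   # noise subspace (M minus 2 sources)

grid = np.deg2rad(np.linspace(-90.0, 90.0, 721))
P = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(t)) ** 2 for t in grid])

# The two largest local maxima of the pseudospectrum are the DoA estimates.
locs = [i for i in range(1, len(grid) - 1) if P[i] > P[i - 1] and P[i] > P[i + 1]]
top2 = sorted(sorted(locs, key=lambda i: P[i])[-2:])
est = np.rad2deg(grid[top2])
print(est)
```

The Capon estimator differs only in the scanned quantity, 1/(a(θ)ᴴR⁻¹a(θ)), which is why the two methods respond similarly to snapshots, array size, SNR, and source separation.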
Abstract: Fluids are used for heat transfer in many kinds of engineering equipment; water, ethylene glycol, and propylene glycol are some of the common heat transfer fluids. Over the years, in attempts to reduce the size of the equipment and/or improve the efficiency of the process, various techniques have been employed to increase the heat transfer rate of these fluids. Surface modification, the use of inserts, and increased fluid velocity are some examples of heat transfer enhancement techniques. The addition of milli- or micro-sized particles to the heat transfer fluid is another way of improving the heat transfer rate. Though this looks simple, the method has practical problems such as high pressure loss, clogging, and erosion of the material of construction. These problems can be overcome by using nanofluids, which are dispersions of nanosized particles in a base fluid. Nanoparticles increase the thermal conductivity of the base fluid manifold, which in turn increases the heat transfer rate. In this work, heat transfer enhancement using an aluminium oxide nanofluid has been studied by computational fluid dynamic modeling of the nanofluid flow, adopting the single-phase approach.
Abstract: Tea is consumed by a large part of the world's population and has enormous importance in Turkish culture: it is brewed nearly every morning and evening in almost all households, and it is often consumed with a lemon wedge. Habitual drinking of tea infusions may contribute significantly to the daily dietary requirements of trace elements. Different instrumental techniques are used for the determination of these elements, with atomic and mass spectrometric methods the most preferred. In this study, the chromium, iron, and selenium contents of hot-water infusions of black and green tea were determined by inductively coupled plasma optical emission spectroscopy (ICP-OES). Furthermore, the effect of lemon addition on the chromium, iron, and selenium concentrations of the tea infusions was investigated. The results showed that the concentrations of chromium, iron, and selenium in black tea increased with lemon addition, whereas in green tea only selenium increased. Iron was not detected in green tea alone, but its concentration was determined as 1.420 ppm after lemon addition.
Abstract: How to coordinate the behaviors of agents through learning is a challenging problem in multi-agent domains. Because of its complexity, recent work has focused on how coordinated strategies can be learned. Here we are interested in using reinforcement learning techniques to learn the coordinated actions of a group of agents without requiring explicit communication among them. However, traditional reinforcement learning methods are based on the assumption that the environment can be modeled as a Markov decision process, an assumption that usually does not hold when multiple agents coexist in the same environment. Moreover, to effectively coordinate each agent's behavior so as to achieve the goal, it is necessary to augment the state of each agent with information about the other agents. As the number of agents in a multi-agent environment increases, however, the state space of each agent grows exponentially, causing a combinatorial explosion. Profit sharing is a reinforcement learning method that allows agents to learn effective behaviors from their experiences even in non-Markovian environments. In this paper, to remedy the drawback of the original profit sharing approach, which needs much memory to store each state-action pair during the learning process, we first present an on-line rational profit sharing algorithm. We then integrate the advantages of a modular learning architecture with the on-line rational profit sharing algorithm and propose a new modular reinforcement learning model. The effectiveness of the technique is demonstrated on the pursuit problem.
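Profit sharing's credit assignment can be sketched as: when a reward is received, every (state, action) pair on the episode trace is reinforced with a geometrically discounted share of it, newest pair first. The toy trace and decay rate below are assumptions, and the paper's memory-saving on-line variant is not shown; a decay below 1/L (L = number of actions) satisfies the usual rationality condition for profit sharing.

```python
# Profit sharing sketch: on reaching a reward, each (state, action) on
# the episode trace receives a geometrically discounted share of it.
def reinforce_trace(weights, trace, reward, decay=0.25):
    share = reward
    for state, action in reversed(trace):            # newest pair first
        weights[(state, action)] = weights.get((state, action), 0.0) + share
        share *= decay
    return weights

w = {}
episode = [("s0", "right"), ("s1", "right"), ("s2", "up")]   # toy trace
reinforce_trace(w, episode, reward=1.0)
print(w)
```

Because credit flows only along actually experienced traces, no Markov assumption about the environment is needed, which is why profit sharing remains usable in the multi-agent, non-Markovian setting the abstract describes.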
Abstract: Gene expression profiling is rapidly evolving into a powerful technique for investigating tumor malignancies. Researchers are overwhelmed with microarray-based platforms and methods that give them the freedom to conduct large-scale gene expression profiling measurements. Simultaneously, investigations into cross-platform integration methods have started gaining momentum due to their potential to help in comprehending a myriad of broad biological issues in tumor diagnosis, prognosis, and therapy. However, comparing results from different platforms remains a challenging task, as various inherent technical differences exist between microarray platforms. In this paper, we describe a simple ratio-transformation method that can provide common ground for the cDNA and Affymetrix platforms toward cross-platform integration. The method is based on the characteristic data attributes of the Affymetrix and cDNA platforms. We considered seven childhood leukemia patients and their gene expression levels measured on each platform. With a dataset of 822 differentially expressed genes from both platforms, we applied a specific ratio treatment to the Affymetrix data, which subsequently improved its relationship with the cDNA data.
Abstract: Herein, we report the different types of surface morphology arising from the interaction between the pure protein insulin (INS) and a catanionic surfactant mixture of sodium dodecyl sulfate (SDS) and cetyl trimethyl ammonium bromide (CTAB) at the air/water interface, obtained by the Langmuir-Blodgett (LB) technique. We characterized the aggregates in LB films by scanning electron microscopy (SEM), atomic force microscopy (AFM), and Fourier-transform infrared spectroscopy (FTIR). We found that INS adsorption increased in the presence of the catanionic surfactant at the air/water interface. A small amount of surfactant induces two-stage growth kinetics, due to pure protein adsorption followed by protein-catanionic surface micelle interaction, and the protein remains in its native state; the small amount of surfactant mixture with INS produces surface-micelle-type structures, which may be of interest for drug delivery systems. In contrast, INS becomes unfolded and fibrillated in the presence of a larger amount of the surfactant mixture. In both cases, the protein was successfully immobilized on a glass substrate by the LB technique. These results may find applications in the fundamental physical chemistry of surfactant systems, as well as in the preparation of drug delivery systems.
Abstract: Efficient preprocessing is essential for the automatic recognition of handwritten documents. In this paper, techniques for segmenting words in handwritten Arabic text are presented. First, connected components (CCs) are extracted and the distances among the different components are analyzed. The statistical distribution of these distances is then used to determine an optimal threshold for word segmentation. Meanwhile, an improved projection-based method is employed for baseline detection. The proposed method has been successfully tested on the IFN/ENIT database, which consists of 26,459 Arabic words handwritten by 411 different writers; the results were promising and very encouraging for more accurate detection of the baseline and segmentation of words for further recognition.
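The thresholding step can be sketched with a 1-D Otsu split over the inter-component gaps: small gaps lie within a word, large gaps separate words. Otsu is my stand-in for the paper's statistical-distribution analysis, and the gap values are toy data.

```python
# Word-gap thresholding sketch: choose the threshold over inter-component
# distances that maximizes between-class variance (1-D Otsu), then treat
# gaps above it as word boundaries.
def otsu_threshold(values):
    best_t, best_var = values[0], -1.0
    for t in sorted(set(values))[:-1]:
        low = [v for v in values if v <= t]
        high = [v for v in values if v > t]
        w0, w1 = len(low) / len(values), len(high) / len(values)
        m0, m1 = sum(low) / len(low), sum(high) / len(high)
        var = w0 * w1 * (m0 - m1) ** 2       # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Gaps (pixels) between consecutive connected components, toy data:
gaps = [2, 3, 2, 14, 3, 2, 16, 2]
t = otsu_threshold(gaps)
boundaries = [i for i, g in enumerate(gaps) if g > t]
print(t, boundaries)
```

Each gap above the learned threshold marks the boundary between two words; the remaining components are grouped into words between those boundaries.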
Abstract: This paper describes a method for modeling shadow play puppets using sophisticated computer graphics techniques available in OpenGL, in order to allow interactive play in a real-time environment as well as to produce realistic animation. A novel real-time method is proposed for modeling the puppet and its shadow image that allows interactive virtual shadow play using texture mapping and blending techniques. Special effects such as lighting and blurring for the virtual shadow play environment are also developed. Moreover, the use of geometric transformations and hierarchical modeling facilitates interaction among the different parts of the puppet during animation. Based on the experiments and the survey that were carried out, the respondents involved were very satisfied with the outcomes of these techniques.
Abstract: This paper deals with the modeling and evaluation of the influence of multiplicative phase noise on the bit error rate in a general space communication system. Our research focuses on systems with multi-state phase shift keying modulation, and it turns out that the phase noise significantly affects the bit error rate, especially at higher signal-to-noise ratios. These results come from a system model created in the Matlab environment and are presented as constellation diagrams and bit error rate dependencies. Changes in the user data bit rate are also considered and included in the simulation results. The obtained outcomes confirm the theoretical presumptions.
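The effect can be reproduced in outline with a Monte-Carlo simulation of M-PSK in which a Gaussian phase-noise term perturbs each transmitted symbol's phase on top of AWGN. The modulation order, SNR, noise levels, and sample count below are illustrative assumptions, and the symbol error rate is used as a simple proxy for the bit error rate.

```python
# Monte-Carlo sketch: symbol error rate of M-PSK with a Gaussian
# multiplicative phase-noise term added to each symbol's phase.
import cmath, math, random

random.seed(1)

def psk_ser(M, snr_db, phase_noise_std, n=20000):
    """Estimate M-PSK symbol error rate with phase noise and AWGN."""
    snr = 10 ** (snr_db / 10)
    sigma = math.sqrt(1 / (2 * snr))       # per-component AWGN std (Es = 1)
    errors = 0
    for _ in range(n):
        k = random.randrange(M)
        phase = 2 * math.pi * k / M + random.gauss(0, phase_noise_std)
        s = cmath.exp(1j * phase)
        r = s + complex(random.gauss(0, sigma), random.gauss(0, sigma))
        k_hat = round(cmath.phase(r) / (2 * math.pi / M)) % M
        errors += (k_hat != k)
    return errors / n

clean = psk_ser(8, snr_db=20, phase_noise_std=0.0)
noisy = psk_ser(8, snr_db=20, phase_noise_std=0.15)
print(clean, noisy)
```

At high SNR the AWGN-only error rate is nearly zero, so the phase-noise term dominates the error floor, which matches the abstract's observation that phase noise matters most at higher signal-to-noise ratios.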