Abstract: With the need for increased processing capacity at lower energy consumption, power-aware multiprocessor systems have gained growing attention in recent years. One additional challenge to be solved in a multiprocessor system, compared to a uniprocessor system, is job allocation. This paper presents a novel task-dependent job allocation algorithm, Energy-centric Allocation (Ec-A), combined with Rate Monotonic (RM) scheduling, to minimize energy consumption in a multiprocessor system. A simulation analysis is carried out to verify the performance increase along with the reduction in energy consumption and in the required number of processors in the system.
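The RM component of the scheme can be illustrated with the classical Liu and Layland utilization bound; the sketch below is a generic schedulability test, not the paper's Ec-A algorithm, and the task sets are hypothetical.

```python
# Hedged sketch: Rate Monotonic schedulability check via the
# Liu & Layland utilization bound U <= n * (2^(1/n) - 1).
# Task sets are illustrative; this is not the paper's Ec-A allocator.

def rm_schedulable(tasks):
    """tasks: list of (execution_time, period) tuples on one processor."""
    n = len(tasks)
    utilization = sum(c / p for c, p in tasks)
    return utilization <= n * (2 ** (1 / n) - 1)

# Two tasks with total utilization 0.75 <= 0.828 -> schedulable
print(rm_schedulable([(1, 4), (2, 4)]))  # -> True
```

An allocator in this spirit would assign each incoming task to the processor whose task set remains under the bound while minimizing an energy cost.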
Abstract: In the age of global communications, heterogeneous networks are seen to be the best strategy to ensure continuous and uninterruptible services. They allow a mobile terminal to stay connected even while migrating into a different segment's coverage through the handoff process. With increasing teletraffic demands in mobile cellular systems, hierarchical cellular systems have been adopted extensively for more efficient channel utilization and better Quality of Service (QoS). This paper presents a bidirectional call overflow scheme between two layers of microcells and macrocells, where handoffs are decided by the velocity of the mobile making the call. To ensure that handoff calls are given higher priority, it is assumed that guard channels are assigned in both macrocells and microcells. A hysteresis value introduced on the mobile velocity allows the mobile to keep roaming in the same cell if its velocity changes back to within the set threshold values. This reduces the number of handoffs, thereby reducing the processing overhead and enhancing the quality of service to the end user.
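The velocity-hysteresis handoff decision described above can be sketched as follows; the threshold and hysteresis values are hypothetical placeholders, not the paper's parameters.

```python
# Illustrative sketch of a velocity-based handoff decision with hysteresis.
# Threshold and band values are hypothetical, not taken from the paper.
V_THRESHOLD = 15.0   # m/s: faster mobiles -> macrocell, slower -> microcell
HYSTERESIS = 3.0     # m/s band around the threshold

def next_layer(velocity, current_layer):
    """Return 'macro' or 'micro'; stay in the current layer inside the band."""
    if velocity > V_THRESHOLD + HYSTERESIS:
        return "macro"
    if velocity < V_THRESHOLD - HYSTERESIS:
        return "micro"
    return current_layer  # within hysteresis band: no handoff triggered

print(next_layer(20.0, "micro"))  # fast mobile -> "macro"
print(next_layer(16.0, "macro"))  # inside band -> stays "macro"
```

Because velocities inside the band never trigger a layer change, a mobile oscillating around the threshold does not generate repeated handoffs.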
Abstract: A cart-ball system is a challenging system from the control engineering point of view, due to the nonlinear, multivariable, and non-minimum-phase behavior present in the system. This paper is concerned with the problem of modeling and control of such a system. The objective of the control strategy is to place the cart at a desired position while balancing the ball on top of the arc-shaped track fixed on the cart. A State-Feedback Controller (SFC) designed with a pole-placement method is used to control the system. First, the mathematical model of the cart-ball system is developed in state-space form. Then, the model is linearized in order to design the SFC. An integral control strategy is employed to control the cart position. Simulation work is then performed using MATLAB/SIMULINK software in order to study the performance of the SFC when applied to the system.
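The pole-placement step can be illustrated in a few lines; the state matrices below are a generic double integrator and the chosen poles are arbitrary, not the cart-ball model or design targets of the paper.

```python
import numpy as np
from scipy.signal import place_poles

# Hedged sketch of state-feedback design by pole placement. The A, B
# matrices form a generic double integrator, NOT the cart-ball model.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
desired = [-2.0, -3.0]               # illustrative closed-loop poles

K = place_poles(A, B, desired).gain_matrix
closed_poles = np.linalg.eigvals(A - B @ K)
print(np.sort(closed_poles.real))    # poles moved to -3, -2
```

With the gain K, the control law u = -K x places the closed-loop eigenvalues of A - BK at the desired locations; an integral state would be appended to A for the integral control strategy the abstract mentions.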
Abstract: A predictive clustering hybrid regression (pCHR) approach was developed and evaluated using a dataset from an H2-producing, sucrose-based bioreactor operated for 15 months. The aim was to model and predict the H2-production rate using information available about the envirome and metabolome of the bioprocess. Self-organizing maps (SOM) and the Sammon map were used to visualize the dataset and to identify the main metabolic patterns and clusters in the bioprocess data. Three metabolic clusters were detected: acetate coupled with other metabolites, butyrate only, and transition phases. The developed pCHR model combines principles of k-means clustering, kNN classification, and regression techniques. The model performed well in modeling and predicting the H2-production rate, with mean square error values of 0.0014 and 0.0032, respectively.
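The general pCHR idea of cluster-then-regress can be sketched with NumPy alone: partition the training data with a small k-means, fit one least-squares regression per cluster, and route new samples to the nearest centroid (kNN with k = 1). All data, cluster counts, and model choices here are synthetic placeholders, not the paper's bioprocess data or exact pipeline.

```python
import numpy as np

# Hedged NumPy-only sketch of a cluster-then-regress hybrid. Synthetic
# data; not the paper's bioreactor dataset or its exact pCHR pipeline.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=200)

def kmeans(X, k=3, iters=20):
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids) ** 2).sum(-1), axis=1)
        centroids = np.array([X[labels == c].mean(0) for c in range(k)])
    return centroids, labels

centroids, labels = kmeans(X)

# one linear model (with intercept) per cluster
design = lambda A: np.hstack([A, np.ones((len(A), 1))])
coefs = {c: np.linalg.lstsq(design(X[labels == c]), y[labels == c],
                            rcond=None)[0]
         for c in range(3)}

x_new = rng.normal(size=(1, 3))
c = int(np.argmin(((x_new - centroids) ** 2).sum(-1)))  # nearest-centroid routing
print(float(design(x_new) @ coefs[c]))                  # per-cluster prediction
```

In the paper's setting, the per-cluster regressors would be trained on the metabolic clusters identified by the SOM analysis rather than on synthetic partitions.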
Abstract: This paper investigates the problems associated with enhancing the recovery of light crude oil-water emulsions in an oil field of the Algerian Sahara. Measurements were taken through experiments using a RheoStress RS600 rheometer. The effects of shear rate, temperature, and light oil concentration on the viscosity behavior were considered. Experimental measurements were performed in terms of shear stress-shear rate behavior, yield stress, and flow index for mixtures of light crude oil and water. The rheological behavior of the emulsions was non-Newtonian shear-thinning (Herschel-Bulkley). The laboratory experiments showed that some water-in-light-crude-oil emulsions remain stable during the oil recovery process. Breaking the emulsion using additives may involve higher cost and could be very expensive; therefore, further research should be directed toward finding solutions to these problems.
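The Herschel-Bulkley model named above relates shear stress to shear rate as tau = tau_y + K * gamma_dot**n, with flow index n < 1 for shear-thinning fluids; the parameter values in this sketch are illustrative, not the paper's measured ones.

```python
# Hedged sketch of the Herschel-Bulkley constitutive model mentioned
# above: tau = tau_y + K * gamma_dot**n. Parameter values (yield stress
# tau_y, consistency K, flow index n) are illustrative placeholders.

def herschel_bulkley(gamma_dot, tau_y=5.0, K=2.0, n=0.6):
    """Shear stress (Pa) for shear rate gamma_dot (1/s)."""
    return tau_y + K * gamma_dot ** n

def apparent_viscosity(gamma_dot, **params):
    """Apparent viscosity tau / gamma_dot (Pa.s)."""
    return herschel_bulkley(gamma_dot, **params) / gamma_dot

# Shear thinning: apparent viscosity drops as the shear rate rises
print(apparent_viscosity(1.0), apparent_viscosity(100.0))
```

Fitting tau_y, K, and n to measured shear stress-shear rate data is what yields the reported yield stress and flow index.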
Abstract: The cables in a nuclear power plant are designed to be used for about 40 years in a safe operating environment. However, the heat and radiation in the plant cause rapid performance deterioration of the cables in nuclear vessels and heat exchangers, which makes cable lifetime estimation necessary. The most accurate method of estimating cable lifetime is to evaluate the cables in a laboratory; however, removing cables while the plant is operating is not allowed because of safety and cost concerns. In this paper, a robot system to estimate cable lifetime in nuclear power plants is developed and tested. The developed robot system can calculate a modulus value to estimate the cable lifetime even while the nuclear power plant is in operation.
Abstract: Network security attacks are violations of information security policy that have received much attention from the computational intelligence community in recent decades. Data mining has become a very useful technique for detecting network intrusions by extracting useful knowledge from large volumes of network data or logs. The naïve Bayesian classifier is one of the most popular data mining algorithms for classification, providing an optimal way to predict the class of an unknown example. It has been shown that a single set of probabilities derived from the data is not good enough to achieve a high classification rate. In this paper, we propose a new learning algorithm for mining network logs to detect network intrusions with a naïve Bayesian classifier: it first clusters the network logs into several groups based on the similarity of the logs, and then calculates the prior and conditional probabilities for each group of logs. To classify a new log, the algorithm checks which cluster the log belongs to and then uses that cluster's probability set to classify it. We tested the performance of the proposed algorithm on the KDD99 benchmark network intrusion detection dataset, and the experimental results show that it improves detection rates as well as reduces false positives for different types of network intrusions.
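The core idea of keeping a separate probability set per cluster can be sketched as follows; the feature names and labels are toy values, not KDD99 fields, and the clustering step is assumed to have already assigned each log to a group.

```python
import math
from collections import defaultdict

# Hedged sketch: naive Bayes with one probability set per log cluster.
# Toy categorical features/labels; not the paper's KDD99 preprocessing.
def train(groups):
    """groups: {group_id: [(feature_tuple, label), ...]} -> per-group stats."""
    models = {}
    for g, rows in groups.items():
        prior, cond = defaultdict(int), defaultdict(int)
        for feats, label in rows:
            prior[label] += 1
            for i, f in enumerate(feats):
                cond[(label, i, f)] += 1
        models[g] = (prior, cond, len(rows))
    return models

def classify(models, group, feats):
    """Score labels with that group's priors/conditionals (Laplace smoothed)."""
    prior, cond, n = models[group]
    best, best_lp = None, -math.inf
    for label, count in prior.items():
        lp = math.log(count / n)
        for i, f in enumerate(feats):
            lp += math.log((cond[(label, i, f)] + 1) / (count + 2))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

groups = {0: [(("tcp", "http"), "normal"),
              (("tcp", "http"), "normal"),
              (("icmp", "ecr_i"), "attack")]}
models = train(groups)
print(classify(models, 0, ("tcp", "http")))  # -> normal
```

In the full algorithm, the `group` argument would come from the similarity-based clustering of the logs rather than being given directly.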
Abstract: In this article, various models of the surface tension force (CSF, CSS, and PCIL) for interfacial flows are applied to a dynamic case and the results are compared. We study the Kelvin-Helmholtz instabilities, which are produced by shear at the interface between two fluids with different physical properties. The inlet velocity is defined as a sinusoidal perturbation. When gravity and surface tension are taken into account, we observe the development of the instability at a critical value of the velocity difference between the two fluids. The VOF model makes it possible to simulate the Kelvin-Helmholtz instability in the dynamic case.
Abstract: The study of fire and explosion is very important, mainly in the oil and gas industries, due to the several accidents that have been reported in the past and present. In this work, we investigated the flammability of bio-oil vapour mixtures, which may contribute to fire during storage and transportation. A bio-oil sample derived from palm kernel shell was analysed using Gas Chromatography-Mass Spectrometry (GC-MS) to examine its composition. Mole fractions of 12 selected components in the liquid phase were obtained from the GC-FID data and used to calculate the mole fractions of the components in the gas phase via the modified Raoult's law. Lower Flammability Limits (LFLs) and Upper Flammability Limits (UFLs) for individual components were obtained from the published literature; for components whose flammability limit values are not available in the literature, the stoichiometric concentration method was used to calculate them. The LFL and UFL values for the mixture were calculated using the Le Chatelier equation. The LFLmix and UFLmix values were used to construct a flammability diagram and subsequently to determine the flammability of the mixture. The findings of this study can be used to propose suitable inherently safer methods to prevent the flammable mixture from forming and to minimize the loss of property, business, and life due to fire accidents in bio-oil production.
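The Le Chatelier mixing rule used above is LFL_mix = 1 / sum(y_i / LFL_i), where y_i are the gas-phase mole fractions of the combustible components (renormalized to sum to one) and LFL_i their individual limits; the same form applies to UFL_mix. The component values in this sketch are a generic methane/ethane example, not the paper's bio-oil data.

```python
# Hedged sketch of the Le Chatelier equation named above:
# LFL_mix = 1 / sum(y_i / LFL_i). Component data below are a generic
# methane/ethane example, not the paper's 12 bio-oil components.

def le_chatelier(fractions, limits):
    """fractions: combustible mole fractions summing to 1;
    limits: matching LFL (or UFL) values in vol%."""
    return 1.0 / sum(y / l for y, l in zip(fractions, limits))

# 60/40 methane/ethane with LFLs of 5.0 and 3.0 vol%
lfl_mix = le_chatelier([0.6, 0.4], [5.0, 3.0])
print(round(lfl_mix, 2))  # -> 3.95
```

The resulting mixture limit lies between the pure-component limits, weighted toward the more abundant component.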
Abstract: Semisolid metal processing uses solid-liquid slurries containing fine, globular solid particles uniformly distributed in a liquid matrix, which can be handled as a solid and flow like a liquid. In recent years, many methods have been introduced for the production of semisolid slurries, since the process is scientifically sound and industrially viable with the preferred microstructures, called thixotropic microstructures, as feedstock materials. One such process that requires very low equipment investment and running costs is the cooling slope. In this research, using a mechanical-stirrer slurry maker constructed by the authors, the effects of mechanical stirring parameters such as stirring time, stirring temperature, and stirring speed on the microstructure and mechanical properties of A360 aluminum alloy in semisolid forming are investigated. It is determined that the mold temperature and the holding time of the part at a temperature of 580 ºC have a great effect on the microstructure and mechanical properties (at a stirring temperature of 585 ºC, a stirring time of 20 minutes, and a stirring speed of 425 RPM). By optimizing the forming parameters, the dendritic microstructure changes to globular and the mechanical properties improve. This is because of the breaking up and globularization of the primary α-Al dendrites.
Abstract: Modeling and simulation of biochemical reactions is of great interest in the context of systems biology. The central dogma of this re-emerging area states that it is the system dynamics and organizing principles of complex biological phenomena that give rise to the functioning and function of cells. Cell functions, such as growth, division, differentiation and apoptosis, are temporal processes that can be understood if they are treated as dynamic systems. Systems biology focuses on an understanding of functional activity from a system-wide perspective and, consequently, it is defined by two key questions: (i) how do the components within a cell interact, so as to bring about its structure and functioning? (ii) how do cells interact, so as to develop and maintain higher levels of organization and function? In recent years, wet-lab biologists have embraced mathematical modeling and simulation as two essential means toward answering these questions. The credo of dynamic systems theory is that the behavior of a biological system is given by the temporal evolution of its state. Our understanding of the time behavior of a biological system can be measured by the extent to which a simulation mimics the real behavior of that system. Deviations of a simulation indicate either limitations or errors in our knowledge. The aim of this paper is to summarize and review the main conceptual frameworks in which models of biochemical networks can be developed. In particular, we review the stochastic molecular modelling approaches, by reporting the principal conceptualizations suggested by A. A. Markov, P. Langevin, A. Fokker, M. Planck, D. T. Gillespie, N. G. van Kampen, and recently by D. Wilkinson, O. Wolkenhauer, P. S. Jöberg and by the author.
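Gillespie's stochastic simulation algorithm, one of the conceptualizations reviewed above, can be sketched for the simplest possible network, a single first-order degradation reaction X -> 0; the rate constant and initial count are illustrative.

```python
import math
import random

# Minimal sketch of Gillespie's stochastic simulation algorithm (SSA)
# for a single degradation reaction X -> 0 with rate constant c.
# Parameters are illustrative, not from any model in the paper.
def gillespie_decay(x0=100, c=0.5, t_end=10.0, seed=1):
    random.seed(seed)
    t, x = 0.0, x0
    trace = [(0.0, x0)]
    while x > 0 and t < t_end:
        a = c * x                              # propensity of X -> 0
        t += -math.log(random.random()) / a    # exponential waiting time
        x -= 1                                 # fire the reaction
        trace.append((t, x))
    return trace

trace = gillespie_decay()
print(trace[0], trace[-1])  # starts at (0.0, 100) and decays stochastically
```

With several reactions, the waiting time is drawn from the total propensity and the reaction to fire is chosen proportionally to each reaction's share of it.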
Abstract: Classifying biomedical literature is a difficult and challenging task, especially when a large number of biomedical articles must be organized into a hierarchical structure. In this paper, we present an approach for classifying a collection of biomedical text abstracts downloaded from the Medline database with the help of ontology alignment. To accomplish our goal, we construct two types of hierarchies, the OHSUMED disease hierarchy and the Medline abstract disease hierarchies, from the OHSUMED dataset and the Medline abstracts, respectively. Then, we enrich the OHSUMED disease hierarchy before adapting it to the ontology alignment process for finding probable concepts or categories. Subsequently, we compute the cosine similarity between the vectors of the probable concepts (in the "enriched" OHSUMED disease hierarchy) and the vectors in the Medline abstract disease hierarchies. Finally, we assign a category to each new Medline abstract based on the similarity score. The results obtained from the experiments show that the performance of our proposed approach for hierarchical classification is slightly better than that of multi-class flat classification.
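The cosine-similarity matching step described above can be sketched as follows; the concept names and term-weight vectors are toy values, not the OHSUMED hierarchy.

```python
import math

# Hedged sketch of cosine-similarity category assignment. The concept
# names and term-weight vectors are toy placeholders, not OHSUMED data.
def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

concepts = {"cardiovascular": [0.9, 0.1, 0.0],
            "neoplasms": [0.0, 0.2, 0.9]}
abstract_vec = [0.8, 0.3, 0.1]

# assign the category whose concept vector scores the highest similarity
best = max(concepts, key=lambda c: cosine(abstract_vec, concepts[c]))
print(best)  # -> cardiovascular
```

In the full approach, the candidate concepts compared against each abstract are restricted to those found probable by the ontology alignment step.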
Abstract: This paper describes the process of recognizing and classifying brain images as normal or abnormal based on PSO-SVM. Image classification is becoming increasingly important for the medical diagnosis process. In the medical field, especially for diagnosis, the abnormality of the patient is classified, which plays a great role in helping doctors diagnose the patient according to the severity of the disease. In the case of DICOM images, optimal recognition and early detection of diseases are very difficult. Our work focuses on the recognition and classification of DICOM images based on a collective approach of digital image processing. For optimal recognition and classification, Particle Swarm Optimization (PSO), Genetic Algorithm (GA), and Support Vector Machine (SVM) are used. The collective PSO-SVM approach gives high approximation capability and much faster convergence.
Abstract: Currently, many local area industrial networks can guarantee bandwidth to synchronous traffic, in particular by providing CBR (Constant Bit Rate) channels, which allow improved bandwidth management. Some of these networks operate over Ethernet, delivering channels with enough capacity, especially with compressors, to integrate multimedia traffic in industrial monitoring and image-processing applications with many sources. In these industrial environments, where low latency is an essential requirement, JPEG is an adequate compression technique, but it generates VBR (Variable Bit Rate) traffic. Transmitting VBR traffic over CBR channels is inefficient, and current solutions to this problem significantly increase the latency or further degrade the quality. In this paper, an R(q) model is used which allows on-line calculation of the JPEG quantization factor. We obtained increased quality and a lower requirement on the CBR channel, with a reduced number of discarded frames along with better use of the channel bandwidth.
Abstract: This paper deals with the efficient computation of probability coefficients, which offer computational simplicity compared to spectral coefficients. It eliminates the need for inner product evaluations in determining the signature of a combinational circuit realizing a given Boolean function. Methods for computing the probability coefficients using a transform matrix, a fast transform method, and BDDs are given. Theoretical relations for the achievable computational advantage, in terms of the additions required to compute all 2^n probability coefficients of an n-variable function, have been developed. It is shown that for n ≥ 5, only 50% of the additions are needed to compute all probability coefficients compared to spectral coefficients. Fault detection techniques based on spectral signatures can also be used with probability signatures to gain this computational advantage.
Abstract: The article presents test results on the changes occurring in sewage sludge during storage. Tests were conducted on mechanically dehydrated sewage sludge derived from large municipal sewage treatment plants equipped with biological treatment systems. The testing presented in the paper focused on the basic fuel properties of sewage sludge: moisture content, heat of combustion, and carbon share. The first part of the article presents an overview of the issues concerning sewage sludge management and explains the genesis of the tests. Further in the paper, selected results of the conducted tests are discussed. Changes in the tested parameters were determined over a 10-month sludge storage period.
Abstract: In an era of knowledge explosion, data volumes grow rapidly day by day. Since data storage is a limited resource, reducing the data space used in processing becomes a challenging issue. Data compression provides a good solution, as it can lower the required space. Data mining has found many useful applications in recent years because it can help users discover interesting knowledge in large databases. However, existing compression algorithms are not appropriate for data mining. In [1, 2], two different approaches were proposed to compress databases and then perform the data mining process; however, they both lack the ability to decompress the data to their original state and to improve data mining performance. In this research, a new approach called Mining Merged Transactions with the Quantification Table (M2TQT) is proposed to solve these problems. M2TQT uses the relationships among transactions to merge related transactions, and builds a quantification table to prune the candidate itemsets that cannot become frequent, in order to improve the performance of mining association rules. The experiments show that M2TQT performs better than existing approaches.
Abstract: We introduce a new interactive 3D simulation system of ocular motion and expressions suitable for: (1) character animation applications in game design, film production, HCI (Human-Computer Interface), conversational animated agents, and virtual reality; (2) medical applications (ophthalmic, neurological, and muscular pathologies: research and education); and (3) real-time simulation of unconscious cognitive and emotional responses (for use, e.g., in psychological research). The system comprises: (1) a physiologically accurate parameterized 3D model of the eyes, eyelids, and eyebrow regions; and (2) a prototype device for real-time control of eye motions and expressions, including unconsciously produced expressions, for the applications in (1), (2), and (3) above. The 3D eye simulation system, created using state-of-the-art computer animation technology and 'optimized' for use with an interactive and web-deliverable platform, is, to our knowledge, the most advanced and realistic available so far for applications in character animation and medical pedagogy.
Abstract: Today, building automation is advancing from simple monitoring and control of lighting and heating towards more and more complex applications that require dynamic perception and interpretation of the different scenes occurring in a building. Current approaches cannot handle these newly emerging demands. In this article, a bionically inspired approach for multimodal, dynamic scene perception and interpretation is presented, which is based on neuroscientific and neuro-psychological research findings about the perceptual system of the human brain. The approach processes data from diverse sensory modalities in a so-called neuro-symbolic network. With its parallel structure, and with its basic elements acting as information processing and storage units at the same time, it provides a very efficient method for scene perception that overcomes the problems and bottlenecks of classical dynamic scene interpretation systems.
Abstract: One major difficulty facing developers of concurrent and distributed software is the analysis of concurrency-based faults such as deadlocks. Petri nets are used extensively in verifying the correctness of concurrent programs. ECATNets [2] are a category of algebraic Petri nets based on a sound combination of algebraic abstract types and high-level Petri nets. ECATNets have 'sound' and 'complete' semantics because of their integration into rewriting logic [12] and its programming language Maude [13]. Rewriting logic is considered one of the most powerful logics in terms of the description, verification, and programming of concurrent systems. We proposed in [4] a method for translating Ada-95 tasking programs into the ECATNets formalism (Ada-ECATNet). In this paper, we show that the ECATNets formalism provides a more compact translation for Ada programs compared to other approaches based on simple Petri nets or Colored Petri nets (CPNs). Such a translation not only reduces the size of the program, but also reduces the number of program states. We also show how this compact Ada-ECATNet can be reduced further by applying reduction rules to it. This double reduction of the Ada-ECATNet permits a considerable minimization of the memory space and run time of the corresponding Maude program.