Abstract: Adaptive genetic algorithms extend standard GAs by
applying evolutionary operators such as crossover, mutation and
selection through dynamic procedures. In this paper, we propose a
new adaptive genetic algorithm that uses statistical information
about the population as a guideline for tuning its crossover,
selection and mutation operators. This algorithm, called the
Statistical Genetic Algorithm, is compared with a traditional GA
on several benchmark problems.
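As a hedged illustration of the idea (not the paper's actual operator schedule), one statistics-driven rule is to raise the mutation rate when the population's fitness diversity collapses; the constants and the diversity measure below are assumptions:

```python
import statistics

def adaptive_mutation_rate(fitnesses, base=0.01, scale=0.1):
    """Raise the mutation rate as the population's relative fitness
    spread (std/mean) shrinks, to restore exploration. The base rate,
    scale, and diversity measure are illustrative assumptions."""
    mean = statistics.mean(fitnesses)
    spread = statistics.pstdev(fitnesses) / mean if mean else 0.0
    return min(1.0, base + scale * max(0.0, 1.0 - spread))
```

A converged population (all fitnesses equal) gets the full mutation boost, while a diverse one stays near the base rate.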
Abstract: In this paper the development of a heat exchanger as a
pilot plant for educational purposes is discussed, and the use of a
neural network for controlling the process is presented. The aim of
the study is to highlight the need for a specific Pseudo Random
Binary Sequence (PRBS) to excite a process under control. As the
neural network is a data-driven technique, the method of data
generation plays an important role. In light of this, designing a
careful experimental procedure for data generation was a crucial
task. Heat exchange is a complex process, which has a capacity and
a time lag as process elements. The proposed system is a typical
pipe-in-pipe heat exchanger. The complexity of the system demands
careful selection, proper installation and commissioning. The
temperature, flow, and pressure sensors play a vital role in the
control performance. The final control element used is a
pneumatically operated control valve. While carrying out the
experimentation on the heat exchanger, a well-drafted procedure was
followed, giving utmost attention to the safety of the system. The
results obtained are encouraging and reveal that if the process
details are completely known as far as process parameters are
concerned and the utilities are well stabilized, then feedback
systems are suitable, whereas the neural network control paradigm
is useful for processes with nonlinearity and limited process
knowledge. The implementation of NN control reinforces the concepts
of process control and the NN control paradigm. The results also
underline the importance of an excitation signal tailored to the
process. Data acquisition, processing, and presentation in a
suitable format are the most important aspects when validating the
results.
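For context, a PRBS excitation signal is typically generated with a linear feedback shift register; the sketch below uses a standard 16-bit maximal-length LFSR (feedback taps at bits 16, 14, 13 and 11) with an arbitrary seed, both assumptions here — the paper's point is that the PRBS must be matched to the process dynamics:

```python
def prbs(length, seed=0xACE1):
    """Pseudo Random Binary Sequence from a 16-bit Fibonacci LFSR
    with feedback taps at bits 16, 14, 13 and 11 (a maximal-length
    configuration). Register width and seed are illustrative."""
    state = seed
    bits = []
    for _ in range(length):
        bits.append(state & 1)  # output the least significant bit
        fb = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        state = (state >> 1) | (fb << 15)  # shift in the feedback bit
    return bits
```

The resulting binary signal would then be mapped to the valve's two excitation levels, with the bit period chosen relative to the process time constant.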
Abstract: This paper describes the various stages of design and prototyping of a modular robot for use in various industrial applications. The major goal of the current research has been to design and build different robotic joints at low cost, capable of being assembled together in any given order to achieve various robot configurations. Five different types of joints were designed and manufactured, with extensive research carried out on the design of each joint in order to achieve optimal strength, size, modularity, and price. This paper presents the stages of research and development undertaken to engineer these joints, including material selection, manufacturing, and strength analysis. The outcome of this research marks the emergence of a new generation of modular industrial robots with a wider range of applications and greater efficiency.
Abstract: This paper presents a computational methodology
based on matrix operations for a computer-based solution to the
problem of performance analysis of software reliability models
(SRMs). A set of seven comparison criteria has been formulated to
rank the various non-homogeneous Poisson process software
reliability models proposed during the past 30 years to estimate
software reliability measures such as the number of remaining
faults, the software failure rate, and software reliability.
Selection of the optimal SRM for use in a particular case has been
an area of interest for researchers in the field of software
reliability. Tools and techniques for software reliability model
selection found in the literature cannot be used with a high level
of confidence, as they employ a limited number of model selection
criteria. A real data set from a medium-sized software project,
taken from published papers, has been used to demonstrate the
matrix method. The result of this study is a ranking of SRMs based
on the permanent of the criteria matrix formed for each model from
the comparison criteria. The software reliability model with the
highest permanent is ranked first, and so on.
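To make the ranking step concrete, here is a minimal sketch of the matrix permanent computed by the naive permutation expansion (adequate for the small criteria matrices described here); the example matrices are invented for illustration:

```python
from itertools import permutations

def permanent(a):
    """Permanent of a square matrix: like the determinant but with
    every permutation term added (no alternating sign). The naive
    expansion is fine for small criteria matrices."""
    n = len(a)
    total = 0
    for p in permutations(range(n)):
        prod = 1
        for i, j in enumerate(p):
            prod *= a[i][j]
        total += prod
    return total

# Illustrative criteria matrices for two hypothetical models; the
# model with the larger permanent would be ranked first.
m1 = [[2, 1], [1, 3]]  # permanent = 2*3 + 1*1 = 7
m2 = [[1, 2], [2, 1]]  # permanent = 1*1 + 2*2 = 5
```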
Abstract: In recent years multimedia traffic, and VoIP services
in particular, has been growing dramatically. We present a new
algorithm to control resource utilization and to optimize voice
codec selection during SIP call setup based on the traffic
conditions estimated on the network path.
The most suitable methodologies and tools for real-time
evaluation of the available bandwidth on a network path have been
integrated with our proposed algorithm, which selects the best
codec for a VoIP call as a function of the instantaneously
available bandwidth on the path. The algorithm does not require any
explicit feedback from the network, which makes it easily
deployable over the Internet. We have also performed intensive
tests on real network scenarios with a software prototype,
verifying the algorithm's efficiency with different network
topologies and traffic patterns between two SIP PBXs.
The promising results obtained during the experimental validation
of the algorithm are now the basis for extending it towards a
larger set of multimedia services and integrating our methodology
with existing PBX appliances.
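A minimal sketch of the codec-selection idea, assuming an illustrative codec table (per-call bandwidth demands in kbit/s, including typical packet overhead) and a simple headroom rule; the paper's actual algorithm and bandwidth-measurement method are not reproduced here:

```python
# Codecs ordered from highest to lowest quality; the bandwidth
# figures (payload plus typical RTP/UDP/IP overhead) are
# illustrative assumptions.
CODECS = [
    ("G.711", 87.2),
    ("G.726", 55.2),
    ("G.729", 31.2),
]

def select_codec(available_kbps, headroom=0.8):
    """Pick the best codec whose demand fits within a safety fraction
    of the estimated available bandwidth; fall back to the most
    economical codec if none fits."""
    budget = available_kbps * headroom
    for name, demand in CODECS:
        if demand <= budget:
            return name
    return CODECS[-1][0]
```

The headroom factor leaves a margin for estimation error, which matters when no explicit feedback from the network is available.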
Abstract: The evolution of technology and construction techniques has enabled the upgrading of transport networks. In particular, high-speed rail networks allow trains to peak at above 300 km/h. These structures, however, often significantly impact the surrounding environment. Among the most important effects are those provoked by the sound waves generated by train transit. The wave propagation affects the quality of life in areas surrounding the tracks, often for several hundred metres, and there are substantial damages to properties (buildings and land) in terms of market depreciation. The present study, integrating expertise in the acoustics, computing and valuation fields, outlines a model useful for selecting project paths so as to minimize the noise impact and reduce the causes of possible litigation. It also facilitates the rational selection of initiatives to contain the environmental damage along already existing railway tracks. The research is developed with reference to the Italian regulatory framework (usually more stringent than European and international standards) and refers to a case study concerning the high-speed network in Italy.
Abstract: Text categorization is the problem of classifying text documents into a set of predefined classes. After a preprocessing step the documents are typically represented as large sparse vectors. When training classifiers on large collections of documents, both the time and the memory requirements can be quite prohibitive. This justifies the application of feature selection methods to reduce the dimensionality of the document representation vector. Four feature selection methods are evaluated: Random Selection, Information Gain (IG), Support Vector Machine-based selection (SVM_FS) and Genetic Algorithm with SVM (GA_FS). We show that the best results were obtained with the SVM_FS and GA_FS methods for a relatively small feature-vector dimension, compared with the IG method, which requires longer vectors for quite similar classification accuracies. We also present a novel method to better correlate the SVM kernel parameters (Polynomial or Gaussian kernel).
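Of the four methods, Information Gain is the simplest to state; a minimal sketch, assuming binary term-presence features and toy labels (SVM_FS and GA_FS would replace this scoring step):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label list, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(has_term, labels):
    """IG of a binary term-presence feature: class entropy minus the
    entropy remaining after splitting documents on the term."""
    ig = entropy(labels)
    for v in (True, False):
        subset = [l for t, l in zip(has_term, labels) if t == v]
        if subset:
            ig -= len(subset) / len(labels) * entropy(subset)
    return ig
```

Features are ranked by this score and only the top-scoring terms are kept in the document vector.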
Abstract: The purpose of this research is to develop a security model for protection against voice eavesdropping over digital networks. The proposed model provides an encryption scheme and a personal secret key exchange between communicating parties, a so-called voice data transformation system, resulting in a truly private conversation. The operation of this system comprises two main steps. The first is the personal secret key exchange, so that the keys can be used for data encryption during conversation. The key owner can freely make his or her choice in key selection, so it is recommended to exchange a different key for each conversational party and to record the key for each case in the memory provided in the client device. The next step is to set and record another personal encryption option, taking either all frames or just partial frames, the so-called 1:M figure. Using different personal secret keys and different 1:M settings for different parties, without the intervention of the service operator, poses quite a large problem for any eavesdropper who attempts to discover the key used during the conversation, especially within a short period of time. Thus, the scheme is quite safe and effective for protection against voice eavesdropping. The results of the implementation indicate that the system performs its function accurately as designed. In this regard, the proposed system is suitable for effective use in voice eavesdropping protection over digital networks, without any requirement to change existing network systems such as mobile phone networks and VoIP.
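A hedged sketch of the 1:M partial-frame option: encrypt one frame out of every M and pass the rest through. The XOR keystream below stands in for the unspecified cipher, and the byte-frame format is invented for illustration:

```python
def transform_frames(frames, key, m):
    """Encrypt every m-th voice frame with the personal key (XOR is
    used here purely as a stand-in cipher); the remaining frames pass
    through unchanged, giving the '1:M' figure described above."""
    out = []
    for i, frame in enumerate(frames):
        if i % m == 0:
            out.append(bytes(b ^ k for b, k in zip(frame, key)))
        else:
            out.append(frame)
    return out
```

Because XOR is self-inverse, applying the same transformation with the same key and M recovers the original frames at the receiving party.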
Abstract: This paper describes a text mining technique for automatically extracting association rules from collections of textual documents. The technique, called Extracting Association Rules from Text (EART), relies on keyword features to discover association rules among the keywords labeling the documents. In this work, the EART system ignores the order in which the words occur, focusing instead on the words and their statistical distributions in documents. The main contributions of the technique are that it integrates XML technology with an Information Retrieval scheme (TF-IDF) for keyword/feature selection, which automatically selects the most discriminative keywords for use in association rule generation, and uses a data mining technique for association rule discovery. It consists of three phases: a text preprocessing phase (transformation, filtration, stemming and indexing of the documents), an Association Rule Mining (ARM) phase (applying our algorithm for Generating Association Rules based on a Weighting scheme, GARW) and a visualization phase (visualization of results). Experiments were performed on web-page news documents related to the outbreak of the bird flu disease. The extracted association rules contain important features and describe the informative news included in the document collection. The performance of the EART system was compared with that of a system using the Apriori algorithm in terms of execution time and the quality of the extracted association rules.
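The TF-IDF keyword-selection step can be sketched as follows, on pre-tokenized toy documents (the GARW weighting scheme itself is not reproduced):

```python
import math
from collections import Counter

def tfidf(docs):
    """Per-document TF-IDF scores for pre-tokenized documents: term
    frequency times the log of inverse document frequency. The
    highest-scoring keywords would feed the rule-generation phase."""
    n = len(docs)
    df = Counter(t for doc in docs for t in set(doc))
    return [{t: (c / len(doc)) * math.log(n / df[t])
             for t, c in Counter(doc).items()} for doc in docs]
```

Terms appearing in every document score zero and are discarded, which is exactly the non-discriminative case the selection step is meant to filter out.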
Abstract: Multiprocessor task scheduling is an NP-hard problem, and the Genetic Algorithm (GA) has proved an excellent technique for finding an optimal solution. In the past, several GA-based methods have been proposed for this problem, but all of them consider a single criterion. In the present work, minimization of a bi-criteria multiprocessor task scheduling objective, the weighted sum of makespan and total completion time, has been considered. The efficiency and effectiveness of a genetic algorithm can be improved by optimizing its parameters, such as the crossover and mutation operators, crossover probability, and selection function. The effects of the GA parameters on minimization of the bi-criteria fitness function, and the subsequent setting of the parameters, have been established using the central composite design (CCD) approach of response surface methodology (RSM) from Design of Experiments. The experiments were performed with different levels of the GA parameters, and analysis of variance was performed to identify the parameters significant for minimizing makespan and total completion time simultaneously.
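The bi-criteria objective itself is simple to state; a sketch with illustrative equal weights (the actual weights and schedule encoding are part of the paper's experimental design):

```python
def bi_criteria_fitness(completion_times, w1=0.5, w2=0.5):
    """Weighted sum of makespan (latest task completion time) and
    total completion time over all tasks; w1 and w2 are illustrative."""
    return w1 * max(completion_times) + w2 * sum(completion_times)
```

The GA would minimize this value over candidate task-to-processor assignments.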
Abstract: In this paper, a cloud resource broker using goal-based
requests in a medical application is proposed. To handle the recent
huge production of digital images and data in medical informatics
applications, the cloud resource broker can be used by medical
practitioners to properly discover and select the correct
information and applications. This paper summarizes several
reviewed articles relating medical informatics applications to
current broker technology and presents research on applying
goal-based requests in a cloud resource broker to optimize the use
of resources in a cloud environment. The objective of proposing a
new kind of resource broker is to enhance the current resource
scheduling, discovery, and selection procedures. We believe it
could help to maximize resource allocation in medical informatics
applications.
Abstract: Detection of incipient abnormal events is important for
improving the safety and reliability of machine operations and
reducing losses caused by failures. Improper set-up or alignment of
parts often leads to severe problems in many machines. The
construction of prediction models for faulty conditions is
essential for deciding when to perform machine maintenance. This
paper presents a multivariate calibration monitoring approach based
on statistical analysis of machine measurement data. The
calibration model is used to predict two faulty conditions from
historical reference data. This approach utilizes genetic algorithm
(GA) based variable selection, and we evaluate the predictive
performance of several prediction methods using real data. The
results show that the calibration model based on supervised
probabilistic principal component analysis (SPPCA) yielded the best
performance in this work. By adopting a proper variable selection
scheme in the calibration models, the prediction performance can be
improved by excluding non-informative variables from the model
building steps.
Abstract: Zero-inflated models are commonly used for modeling
count data with excess zeros, where the excess zeros may be
structural zeros or zeros that occur by chance. These types of data
are found in various disciplines such as finance, insurance,
biomedicine, econometrics, ecology, and the health sciences,
including sexual health and dental epidemiology. The most popular
zero-inflated models used by researchers are the zero-inflated
Poisson and zero-inflated negative binomial models. In addition,
zero-inflated generalized Poisson and zero-inflated double Poisson
models are also discussed in some of the literature. Recently, the
zero-inflated inverse trinomial and zero-inflated strict arcsine
models have been advocated and shown to serve as alternative models
for overdispersed count data caused by excess zeros and unobserved
heterogeneity. The purpose of this paper is to review the related
literature and provide a variety of examples from different
disciplines of the application of zero-inflated models. Different
model selection methods used in model comparison are also
discussed.
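For reference, the zero-inflated Poisson model mentioned above mixes a point mass at zero with an ordinary Poisson count; a minimal sketch of its probability mass function, with illustrative parameter values:

```python
import math

def zip_pmf(k, lam, pi):
    """P(Y = k) under a zero-inflated Poisson: with probability pi
    the observation is a structural zero, otherwise it is drawn from
    a Poisson distribution with mean lam."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi + (1 - pi) * poisson if k == 0 else (1 - pi) * poisson
```

The other zero-inflated models listed above replace the Poisson component with a negative binomial, generalized Poisson, inverse trinomial, or strict arcsine count distribution.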
Abstract: This study applies the sequential panel selection
method (SPSM) procedure proposed by Chortareas and Kapetanios
(2009) to investigate the time-series properties of energy
consumption in 50 US states from 1963 to 2009. SPSM classifies the
entire panel into a group of stationary series and a group of
non-stationary series, identifying how many and which series in the
panel are stationary processes. Empirical results obtained through
SPSM with the panel KSS unit root test developed by Ucar and Omay
(2009), combined with a Fourier function, indicate that energy
consumption in all 50 US states is stationary. The results of this
study have important policy implications for the 50 US states.
Abstract: This paper addresses the problems encountered by conventional distance relays when protecting double-circuit transmission lines. The problems arise principally as a result of the mutual coupling between the two circuits under different fault conditions; this mutual coupling is highly nonlinear in nature. An adaptive protection scheme based on an artificial neural network (ANN) is proposed for such lines. An ANN has the ability to capture the nonlinear relationship between measured signals by identifying different patterns of the associated signals. A key point of the present work is that only current signals measured at the local end are used to detect and classify faults in the double-circuit transmission line with double-end infeed. The adaptive protection scheme is tested for a specific fault type while varying the fault location, fault resistance, fault inception angle, and remote-end infeed. Once the neural network is trained adequately, it performs precisely when faced with different system parameters and conditions. The test results clearly show that faults are detected and classified within a quarter cycle; thus the proposed adaptive protection technique is well suited for double-circuit transmission line fault detection and classification. Performance studies show that the proposed neural network based module can improve the performance of conventional fault selection algorithms.
Abstract: Surface metrology with image processing is a challenging task with wide applications in industry. Surface roughness can be evaluated using a texture classification approach. An important aspect here is the appropriate selection of features that characterize the surface. We propose an effective combination of features for multi-scale and multi-directional analysis of engineering surfaces. The features include the standard deviation, kurtosis, and the Canny edge detector. We apply the method by analyzing the surfaces with the Discrete Wavelet Transform (DWT) and the Dual-Tree Complex Wavelet Transform (DT-CWT), and we use the Canberra distance metric for similarity comparison between the surface classes. Our database includes surface textures manufactured by three machining processes, namely milling, casting and shaping. The comparative study shows that DT-CWT outperforms DWT, giving a correct classification performance of 91.27% with the Canberra distance metric.
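For completeness, the Canberra distance used for the similarity comparison is shown below (the feature vectors are illustrative):

```python
def canberra(x, y):
    """Canberra distance: sum of |a - b| / (|a| + |b|) over the
    coordinates of two feature vectors, with 0/0 terms taken as zero.
    Its sensitivity to small values suits normalized descriptors."""
    return sum(abs(a - b) / (abs(a) + abs(b))
               for a, b in zip(x, y) if a != 0 or b != 0)
```

A test surface would be assigned to the machining-process class whose reference feature vector has the smallest Canberra distance.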
Abstract: A data warehouse is a repository of information
integrated from source data. Information in a data warehouse is
stored in the form of materialized views in order to provide better
performance for answering queries. Deciding which views are
appropriate to materialize is an important problem. To meet this
requirement, constructing a search space close to optimal is a
necessary task, as it yields effective results when selecting views
to be materialized. In this paper we propose an approach to
re-optimize the Multiple View Processing Plan (MVPP) by using
global common subexpressions. Merged queries whose query processing
cost is not close to optimal are rewritten. The experiments show
that our approach helps to improve the total query processing cost
of the MVPP, and that the sum of the query processing cost and the
materialized view maintenance cost is also reduced after views are
selected for materialization.
Abstract: This article addresses feature selection for breast
cancer diagnosis. The proposed process is a wrapper approach based
on a Genetic Algorithm (GA) and case-based reasoning (CBR). The GA
is used to search the problem space for possible subsets of
features, and CBR is employed to estimate the evaluation score of
each subset. Experimental results show that the proposed model is
comparable to other models on the Wisconsin Diagnostic Breast
Cancer (WDBC) dataset.
Abstract: This paper describes an approach to predicting
incoming and outgoing data rates in a network system using
association rule discovery, one of the data mining techniques. The
incoming and outgoing data volumes over time and the network
bandwidth are the network performance parameters that need to be
considered in traffic problems, since congestion and data loss are
important network issues. The results of this technique can be used
to predict future network traffic. In addition, this research is
useful for network routing selection and network performance
improvement.
Abstract: Chemical industry project management involves
complex decision-making situations that require discerning
abilities and methods to make sound decisions. Project managers are
faced with complex decision environments and problems in their
projects. In this work, the case study is Research and Development
(R&D) project selection. R&D is an ongoing process for
forward-thinking, technology-based chemical industries, and R&D
project selection is an important task for organizations engaged in
R&D project management. It is a multi-criteria problem that
includes both tangible and intangible factors, and the ability to
make sound decisions is very important to the success of R&D
projects. Multiple-criteria decision making (MCDM) approaches are a
major part of decision theory and analysis. This paper presents
MCDM approaches for use in R&D project selection. It is hoped that
this work will provide a ready reference on MCDM and encourage its
application by chemical engineering management.