Abstract: A robot simulator was developed to measure and
investigate the performance of a robot navigation system based on
the relative position of the robot with respect to random obstacles in
any two-dimensional environment. The presented simulator focuses
on investigating the ability of a fuzzy-neural system to perform
obstacle avoidance. A navigation algorithm is proposed that allows
the robot to navigate randomly among obstacles and to react when it
faces an obstacle in the environment. The main features of this simulator
can be used for evaluating the performance of any system that can
provide the position of the robot with respect to obstacles in the
environment. This allows a robot developer to investigate and
analyze the performance of a robot without implementing the
physical robot.
Abstract: The Maximum Weighted Independent Set (MWIS)
problem is a classic NP-hard graph optimization problem. Given an
undirected graph G = (V, E) and a weight function defined on the
vertex set, the MWIS problem is to find a vertex set S ⊆ V of maximum
total weight subject to the constraint that no two vertices in S are adjacent. This
paper presents a novel approach to approximate the MWIS of a graph
using minimum weighted vertex cover of the graph. Computational
experiments are designed and conducted to study the performance
of our proposed algorithm. Extensive simulation results show that
the proposed algorithm can yield better solutions than other existing
algorithms found in the literature for solving the MWIS.
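The complement relationship the approach relies on can be illustrated with a minimal sketch (this is an illustrative greedy heuristic, not the paper's algorithm): compute a weighted vertex cover greedily, then return its complement, which is necessarily an independent set.

```python
# Illustrative sketch: approximate the MWIS of a small graph via the
# complement of a greedy minimum-weighted vertex cover.
# (Hypothetical helper names; not the paper's exact algorithm.)

def greedy_weighted_vertex_cover(vertices, edges, weight):
    """Repeatedly pick the vertex with the smallest weight/degree ratio."""
    cover = set()
    uncovered = set(edges)
    while uncovered:
        # degree of each vertex among still-uncovered edges
        deg = {v: 0 for v in vertices}
        for u, w in uncovered:
            deg[u] += 1
            deg[w] += 1
        # choose a vertex covering many edges at low weight
        best = min((v for v in vertices if deg[v] > 0),
                   key=lambda v: weight[v] / deg[v])
        cover.add(best)
        uncovered = {e for e in uncovered if best not in e}
    return cover

def approx_mwis(vertices, edges, weight):
    cover = greedy_weighted_vertex_cover(vertices, edges, weight)
    # the complement of any vertex cover is an independent set
    return set(vertices) - cover

# Toy example: path a-b-c with a light middle vertex
V = ["a", "b", "c"]
E = [("a", "b"), ("b", "c")]
w = {"a": 2.0, "b": 1.0, "c": 3.0}
S = approx_mwis(V, E, w)
```

Removing a low-weight cover leaves a high-weight independent set; on the toy path above the greedy cover is {b}, so S = {a, c} with total weight 5.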
Abstract: Physical methods for RNA secondary structure prediction are time-consuming and expensive, so computational prediction methods are a suitable alternative. Various algorithms have been used for RNA structure prediction, including dynamic programming and metaheuristic algorithms. Harmony search, inspired by musicians' behavior, is a recently developed metaheuristic algorithm which has been successful in a wide variety of complex optimization problems. This paper proposes a harmony search algorithm (HSRNAFold) to find an RNA secondary structure with minimum free energy that is similar to the native structure. HSRNAFold is compared with the dynamic programming benchmark mfold and with metaheuristic algorithms (RnaPredict, SetPSO and HelixPSO). The results show that HSRNAFold is comparable to mfold and better than the metaheuristics in finding the minimum free energies and the number of correct base pairs.
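The harmony search metaheuristic itself can be sketched as follows on a toy continuous minimization problem; this shows only the generic memory-consideration / pitch-adjustment / random-selection loop, not HSRNAFold's RNA encoding or energy model, and all parameter values here are illustrative.

```python
import random

# Generic harmony search sketch (not HSRNAFold): minimize a toy function.
def harmony_search(f, dim, bounds, hms=10, hmcr=0.9, par=0.3,
                   bw=0.05, iters=2000, seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    # harmony memory: hms random candidate solutions
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:            # memory consideration
                x = rng.choice(memory)[d]
                if rng.random() < par:          # pitch adjustment
                    x += rng.uniform(-bw, bw)
                    x = min(max(x, lo), hi)
            else:                               # random selection
                x = rng.uniform(lo, hi)
            new.append(x)
        # replace the worst harmony if the new one improves on it
        worst = max(range(hms), key=lambda i: f(memory[i]))
        if f(new) < f(memory[worst]):
            memory[worst] = new
    return min(memory, key=f)

sphere = lambda x: sum(v * v for v in x)
best = harmony_search(sphere, dim=3, bounds=(-5.0, 5.0))
```

For a discrete problem such as base-pair selection, the memory consideration step would draw symbols rather than reals, but the loop structure is the same.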
Abstract: This paper presents work on signal discrimination,
specifically for the electrocardiogram (ECG) waveform. An ECG signal
comprises P, QRS, and T waves in each normal heartbeat, describing
a pattern of heart rhythms that corresponds to a specific individual.
Further medical diagnosis can be performed to determine any
heart-related disease using the ECG information. The emphasis on QRS
complex classification is discussed further to illustrate its
importance. The Pan-Tompkins algorithm, a widely known
technique, has been adapted to realize the QRS complex
classification process. There are eight steps involved, namely
sampling, normalization, low-pass filtering, high-pass filtering
(together forming a band-pass filter), differentiation, squaring,
averaging and, lastly, QRS detection. The simulation results obtained are
presented in a graphical user interface (GUI) developed using MATLAB.
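The later stages of the pipeline (differentiation, squaring, averaging, detection) can be sketched as below on a synthetic signal; the filter here is a plain moving average and the threshold a fixed fraction of the maximum, which are simplifications, not the paper's adapted Pan-Tompkins coefficients.

```python
# Simplified Pan-Tompkins-style QRS detection sketch on a synthetic
# ECG-like signal (illustrative filter and threshold choices).

def moving_average(x, n):
    out = []
    for i in range(len(x)):
        window = x[max(0, i - n + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

def detect_qrs(signal, fs):
    # derivative: emphasizes the steep QRS slopes
    deriv = [signal[i + 1] - signal[i] for i in range(len(signal) - 1)]
    squared = [d * d for d in deriv]                      # squaring
    integrated = moving_average(squared, int(0.15 * fs))  # ~150 ms window
    thr = 0.5 * max(integrated)                           # simple threshold
    peaks, refractory = [], int(0.2 * fs)                 # 200 ms refractory
    for i, v in enumerate(integrated):
        if v > thr and (not peaks or i - peaks[-1] > refractory):
            peaks.append(i)
    return peaks

# synthetic signal: two sharp "QRS" spikes on a flat baseline
fs = 250
sig = [0.0] * (2 * fs)
for c in (100, 350):
    for k in range(-2, 3):
        sig[c + k] = 1.0 - 0.4 * abs(k)
beats = detect_qrs(sig, fs)
```

A real detector would add the band-pass stage and adaptive thresholds; the squaring and integration steps shown are what make the sharp QRS slopes stand out from the P and T waves.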
Abstract: Bioinformatics and cheminformatics are computational disciplines that provide tools for the acquisition, storage, processing, analysis and integration of biological and chemical data, and for the development of potential applications of such data. A chemical database is a database designed exclusively to store chemical information. NMRShiftDB is one of the main databases used to represent chemical structures in 2D or 3D. The SMILES format is one of many ways to write a chemical structure in a linear format. In this study we extracted antimicrobial structures in SMILES format from NMRShiftDB and stored them, with their corresponding information, in our local data warehouse. Additionally, we developed a search tool that responds to a user's query using the JME Editor tool, which allows the user to draw or edit molecules and converts the drawn structure into SMILES format. We applied the Quick Search algorithm to search for antimicrobial structures in our local data warehouse.
Abstract: This paper presents a new approach for probability density function estimation using the Support Vector Machines (SVM) and the Expectation Maximization (EM) algorithms. In the proposed approach, an advanced algorithm for SVM density estimation which incorporates the Mean Field theory in the learning process is used. Instead of using ad-hoc values for the parameters of the kernel function used by the SVM algorithm, the proposed approach uses the EM algorithm for an automatic optimization of the kernel. Experimental evaluation using a simulated data set shows encouraging results.
Abstract: In this paper, we consider the control of time-delay systems
by a Proportional-Integral (PI) controller. Using the Hermite-
Biehler theorem, which is applicable to quasi-polynomials, we seek
the stability region of the controller for first-order delay systems. The
essence of this work resides in the extension of this approach to
second-order delay systems, in the determination of their stability region
and in the computation of the optimum PI parameters. We have used
genetic algorithms to handle the complexity of the optimization
problem.
Abstract: Caching has been suggested as a solution for reducing bandwidth utilization and minimizing query latency in mobile environments. Over the years, different caching approaches have been proposed, some relying on the server to periodically broadcast reports informing of updated data, while others allow the clients to request the data whenever needed. Recently, a hybrid cache consistency scheme, the Scalable Asynchronous Cache Consistency Scheme (SACCS), was proposed, which combines the benefits of the two approaches and has proved to be more efficient and scalable. Nevertheless, caching has its limitations too, due to the limited cache size and the limited bandwidth, which makes the implementation of a cache replacement strategy an important aspect of improving cache consistency algorithms. In this work, we propose a new cache replacement strategy, the Least Unified Value (LUV) strategy, to replace the Least Recently Used (LRU) strategy on which SACCS was based. This paper studies the advantages and drawbacks of the newly proposed strategy, comparing it with different categories of cache replacement strategies.
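The LRU baseline that LUV replaces can be sketched in a few lines; this is the standard textbook LRU policy, shown only to make the eviction behavior concrete, and the LUV value function itself is defined in the paper and not reproduced here.

```python
from collections import OrderedDict

# Standard LRU cache sketch (the baseline SACCS used, which LUV replaces).
class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()   # oldest entry first

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)  # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)   # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")        # "a" becomes most recently used
cache.put("c", 3)     # capacity exceeded: evicts "b"
```

A value-based strategy such as LUV would replace the recency-only ordering with a score combining factors such as access frequency and fetch cost.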
Abstract: The proper design of RF pulses in magnetic resonance imaging (MRI) has a direct impact on the quality of acquired images and is needed for many applications. Several techniques have been proposed to obtain the RF pulse envelope given the desired slice profile. Unfortunately, these techniques do not take into account the limitations of practical implementation, such as limited amplitude resolution. Moreover, implementing constraints for special RF pulses is not possible in most techniques. In this work, we develop an approach for designing optimal RF pulses under, in theory, arbitrary constraints. The new technique poses the RF pulse design problem as a combinatorial optimization problem and uses efficient techniques from this area, such as genetic algorithms (GA), to solve it. In particular, an objective function is proposed as the norm of the difference between the desired profile and the one obtained from solving the Bloch equations for the current RF pulse design values. The proposed approach is verified using RF simulations based on analytical solutions and compared to previous methods such as the Shinnar-Le Roux (SLR) method. The options and parameters that control the GA, which can significantly affect its performance, are analyzed, selected and tested to obtain the best results, and the outcome is compared to previous works in this field. The results show a significant improvement over conventional design techniques, identify the best GA options and parameters, and suggest the practicality of using the new technique for important applications such as slice selection at large flip angles, unconventional spatial encoding, and other clinical uses.
Abstract: The shortest-path routing problem is a multiobjective nonlinear optimization problem with constraints. This problem has been addressed by considering the quality-of-service parameters, delay and cost, as separate objectives or as a weighted sum of both. Multiobjective evolutionary algorithms can find multiple Pareto-optimal solutions in a single run, and this ability makes them attractive for solving problems with multiple and conflicting objectives. This paper uses an elitist multiobjective evolutionary algorithm based on the Non-dominated Sorting Genetic Algorithm (NSGA) for solving the dynamic shortest-path routing problem in computer networks. A priority-based encoding scheme is proposed for population initialization. Elitism ensures that the best solution does not deteriorate in subsequent generations. Results for a sample test network are presented to demonstrate the capability of the proposed approach to generate well-distributed Pareto-optimal solutions of the dynamic routing problem in a single run. The results obtained by NSGA are compared with the single-objective weighting-factor method, for which a Genetic Algorithm (GA) was applied.
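The Pareto machinery underlying NSGA-style algorithms can be sketched as follows on toy (delay, cost) route objectives, both minimized; the paper's priority-based encoding and genetic operators are not shown.

```python
# Sketch of the Pareto-dominance test and the first non-dominated front
# used by NSGA-style algorithms, on toy (delay, cost) pairs (minimized).

def dominates(a, b):
    """a dominates b if a is no worse in every objective, better in one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def first_front(points):
    # a point is in the first front if no other point dominates it
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# toy routes as (delay, cost) pairs
routes = [(2, 9), (3, 5), (5, 4), (4, 8), (6, 6)]
front = first_front(routes)
```

Here (4, 8) and (6, 6) are dominated by (3, 5), so the first front contains the three trade-off routes (2, 9), (3, 5) and (5, 4); full non-dominated sorting repeats this after removing each front.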
Abstract: MANEMO is the integration of Network Mobility
(NEMO) and Mobile Ad Hoc Network (MANET). A MANEMO
node has an interface to both a MANET and a NEMO network, and
should therefore choose the optimal interface for packet delivery;
however, such a handover between interfaces will introduce packet
loss. We define the steps necessary for a MANEMO handover,
using Mobile IP and NEMO to signal the new binding to the
relevant Home Agent(s). The handover steps aim to minimize the
packet loss by avoiding waiting for Duplicate Address Detection
and Neighbour Unreachability Detection. We present expressions for
handover delay and packet loss, and then use numerical examples to
evaluate a MANEMO handover. The analysis shows how the packet
loss depends on level of nesting within NEMO, the delay between
Home Agents and the load on the MANET, and hence can be used
to develop optimal MANEMO handover algorithms.
Abstract: The Far From Most Strings Problem (FFMSP) is to obtain a string which is far from as many as possible of a given set of strings. All the input and output strings are of the same length, and two strings are said to be far if their Hamming distance is greater than or equal to a given positive integer. FFMSP belongs to the class of sequence consensus problems, which have applications in molecular biology. The problem is NP-hard; it does not admit a constant-ratio approximation either, unless P = NP. Therefore, in addition to exact and approximation algorithms, (meta)heuristic algorithms have been proposed for the problem in recent years. Hybrid algorithms have also recently been proposed and successfully used for many hard problems in a variety of domains. In this paper, a new metaheuristic algorithm, called Constructive Beam and Local Search (CBLS), is investigated for the problem; it is a hybridization of constructive beam search and local search algorithms. More specifically, the proposed algorithm consists of two phases: the first phase obtains several candidate solutions via constructive beam search, and the second phase applies local search to the candidate solutions obtained in the first phase. The best solution found is returned as the final solution to the problem. The proposed algorithm is also similar to memetic algorithms in the sense that both use local search to further improve individual solutions. The CBLS algorithm is compared with the most recently published algorithm for the problem, GRASP, with significantly positive results; the improvement is by orders of magnitude in most cases.
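The FFMSP objective defined above can be written down directly; this sketch shows only the fitness evaluation (how many input strings a candidate is far from), not the CBLS search itself.

```python
# FFMSP objective sketch: count the input strings a candidate is "far"
# from, i.e. at Hamming distance >= t.

def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    return sum(x != y for x, y in zip(a, b))

def far_count(candidate, strings, t):
    """FFMSP fitness: how many strings the candidate is far from."""
    return sum(hamming(candidate, s) >= t for s in strings)

strings = ["ACGT", "ACGA", "TTTT"]
# candidate "GGGG" has Hamming distance 3, 3 and 4 to the inputs,
# so with threshold t = 3 it is far from all three
score = far_count("GGGG", strings, t=3)
```

A search algorithm such as CBLS would maximize far_count over candidate strings, using this function to rank the solutions kept in the beam and improved by local search.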
Abstract: As a popular rank-reduced vector space approach,
Latent Semantic Indexing (LSI) has been used in information
retrieval and other applications. In this paper, an LSI-based content
vector model for text classification is presented, which constructs
multiple augmented category LSI spaces and classifies texts by their
content. The model integrates the class discriminative information
from the training data and is equipped with several pertinent feature
selection and text classification algorithms. The proposed classifier
has been applied to email classification and its experiments on a
benchmark spam testing corpus (PU1) have shown that the approach
represents a competitive alternative to other email classifiers based
on the well-known SVM and naïve Bayes algorithms.
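The rank-reduced vector space idea behind LSI can be sketched with a small term-document matrix; this shows only the generic truncated-SVD projection and cosine comparison, not the paper's augmented category spaces or feature selection, and the matrix values are invented for illustration.

```python
import numpy as np

# Generic LSI sketch: project documents onto the top-k singular vectors
# of a term-document matrix and compare them by cosine similarity.

def lsi_doc_vectors(td_matrix, k):
    # SVD: td = U S Vt; reduced document vectors are rows of (S_k Vt_k).T
    U, S, Vt = np.linalg.svd(td_matrix, full_matrices=False)
    return (np.diag(S[:k]) @ Vt[:k]).T

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# term x document counts: docs 0 and 1 share vocabulary, doc 2 does not
td = np.array([[2.0, 1.0, 0.0],
               [1.0, 2.0, 0.0],
               [0.0, 0.0, 3.0]])
docs = lsi_doc_vectors(td, k=2)
```

In the reduced space the two vocabulary-sharing documents become nearly identical while the unrelated one stays orthogonal, which is the property a classifier built on LSI spaces exploits.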
Abstract: An inverse problem of doubly centered matrices is discussed. By translating the constrained problem into an unconstrained problem, two iterative methods are proposed. A numerical example illustrates our algorithms.
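For context, a doubly centered matrix is one whose row and column sums are all zero; the standard double-centering transformation that produces one is sketched below (the paper's iterative inverse-problem solvers are not reproduced).

```python
# Double-centering sketch: subtract row means and column means and add
# back the grand mean; the result has all row and column sums zero.

def double_center(A):
    n, m = len(A), len(A[0])
    row = [sum(r) / m for r in A]                              # row means
    col = [sum(A[i][j] for i in range(n)) / n for j in range(m)]  # col means
    grand = sum(row) / n                                       # grand mean
    return [[A[i][j] - row[i] - col[j] + grand for j in range(m)]
            for i in range(n)]

B = double_center([[1.0, 2.0], [3.0, 5.0]])
```

Equivalently, B = H A H with the centering matrix H = I - (1/n)J, where J is the all-ones matrix.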
Abstract: This paper presents four unsupervised clustering algorithms, namely sIB, RandomFlatClustering, FarthestFirst, and FilteredClusterer, that previous works have not used for network traffic classification. The methodology, the results, the products of the clustering, and an evaluation of the accuracy of each algorithm are shown. The efficiency of these algorithms is also considered in terms of the time they take to generate the clusters quickly and correctly. Our work studies and tests for the best algorithm by classifying traffic anomalies in network traffic using attributes that have not been used before. We analyze the algorithm with the best efficiency or the best learning and compare it to the previously used K-Means. Our research will be used to develop a more efficient anomaly detection system in the future.
Abstract: Tracking makes it possible to detect the tumor affections of cervical cancer; it is particularly complex and time-consuming, because it consists of seeking some abnormal cells among a cluster of normal cells. In this paper, we present our proposed computer system for helping doctors in tracking cervical cancer. Knowing that the diagnosis of malignancy is based on the set of atypical morphological details of all cells, we present herein an unsupervised genetic algorithm for the separation of cell components, since the diagnosis is done by analysis of the nucleus and the cytoplasm. We also give the various algorithms used for computing the morphological characteristics of cells (nucleus/cytoplasm ratio, cellular deformity, ...) necessary for the recognition of the illness.
Abstract: In this paper, algorithms for the automatic localisation
of two anatomical soft tissue landmarks of the head, the medial
canthus (inner corner of the eye) and the tragus (a small, pointed,
cartilaginous flap of the ear), in CT images are described. These
landmarks are to be used as a basis for an automated image-to-patient
registration system we are developing. The landmarks are localised
on a surface model extracted from CT images, based on surface
curvature and a rule based system that incorporates prior knowledge
of the landmark characteristics. The approach was tested on a dataset
of near isotropic CT images of 95 patients. The position of the
automatically localised landmarks was compared to the position of
the manually localised landmarks. The average difference was 1.5
mm and 0.8 mm for the medial canthus and tragus, with a maximum
difference of 4.5 mm and 2.6 mm, respectively. The medial canthus
and tragus can thus be automatically localised in CT images, with
performance comparable to manual localisation.
Abstract: In this paper, a simple heuristic genetic algorithm is
used for Multistage Multiuser detection in fast fading environments.
Multipath channels, multiple access interference (MAI) and the near-far
effect cause the performance of the conventional detector to degrade.
Heuristic genetic algorithms, from a rapidly growing area of artificial
intelligence, use evolutionary programming for the initial search, which
not only helps the solution converge towards near-optimal
performance efficiently but also does so at a very low complexity
compared with the optimal detector. This holds true for Additive White
Gaussian Noise (AWGN) and multipath fading channels.
Experimental results are presented to show the superior performance
of the proposed technique over existing methods.
Abstract: This paper describes WiPoD (Wireless Position
Detector), a purely software-based location determination and
tracking (positioning) system. It uses empirical signal strength measurements from different wireless access points for mobile user
positioning. It is designed to determine the location of users having
802.11 enabled mobile devices in an 802.11 WLAN infrastructure
and track them in real time. WiPoD is the first main module in our
LBS (Location Based Services) framework. We tested K-Nearest
Neighbor and Triangulation algorithms to estimate the position of a
mobile user. We also give the analysis results of these algorithms for
real time operations. In this paper, we propose a supportable, i.e.
understandable, maintainable, scalable and portable wireless
positioning system architecture for an LBS framework. The WiPoD
software has a multithreaded structure and was designed and implemented with attention to supportability features and real-time constraints, using object-oriented design principles. We also describe the real-time software design issues of a wireless positioning system which will be part of an LBS framework.
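The K-Nearest Neighbor positioning step can be sketched as follows; the radio map values and the simple averaging of the k closest calibration points are illustrative assumptions, not WiPoD's actual fingerprint database or its triangulation algorithm.

```python
import math

# KNN fingerprint-positioning sketch: compare a live received-signal-
# strength (RSS) vector against calibration fingerprints and average
# the coordinates of the k closest ones.

def knn_position(fingerprints, sample, k=2):
    """fingerprints: list of ((x, y), [RSS per access point])."""
    ranked = sorted(fingerprints,
                    key=lambda fp: math.dist(fp[1], sample))
    nearest = ranked[:k]
    x = sum(p[0][0] for p in nearest) / k
    y = sum(p[0][1] for p in nearest) / k
    return (x, y)

# toy radio map: RSS (dBm) from three access points at four spots
radio_map = [
    ((0.0, 0.0), [-40, -70, -70]),
    ((0.0, 5.0), [-70, -40, -70]),
    ((5.0, 0.0), [-70, -70, -40]),
    ((5.0, 5.0), [-60, -60, -60]),
]
pos = knn_position(radio_map, [-45, -68, -72], k=2)
```

The estimate interpolates between the calibration points whose signal-strength signatures best match the live sample, which is why a dense, well-surveyed radio map matters more than the choice of k.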
Abstract: In this paper, a comparative study of application of
supervised and unsupervised learning algorithms on illumination
invariant face recognition has been carried out. The supervised
learning has been carried out using a bi-layered
artificial neural network having one input, two hidden and one output
layer. The gradient descent with momentum and adaptive learning
rate back-propagation learning algorithm has been used to implement
the supervised learning, in such a way that both the inputs and
corresponding outputs are provided at the time of training the
network; thus there is an inherent clustering and optimized learning of
weights, which provides efficient results. The unsupervised
learning has been implemented with the help of a modified
Counterpropagation network. The Counterpropagation network
involves a clustering process followed by application of the Outstar
rule to obtain the recognized face. The face recognition system has
been developed for recognizing faces which have varying
illumination intensities, where the database images vary in lighting
with respect to angle of illumination with horizontal and vertical
planes. The supervised and unsupervised learning algorithms have
been implemented and have been tested exhaustively, with and
without application of histogram equalization to get efficient results.
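The histogram-equalization preprocessing step mentioned above can be sketched on a tiny 8-level "image"; the remapping below is the standard CDF-based formula, shown on invented pixel values.

```python
# Histogram equalization sketch: remap grey levels so the cumulative
# distribution becomes approximately uniform, boosting contrast.

def histogram_equalize(pixels, levels=256):
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # cumulative distribution function
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    # standard remapping: stretch the CDF over the full grey range
    return [round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))
            for p in pixels]

# a dark, low-contrast image: values clustered at the bottom of the range
dark = [0, 0, 1, 1, 1, 2, 2, 3]
eq = histogram_equalize(dark, levels=8)
```

The clustered levels 0-3 are spread across the full 0-7 range, which is why equalization helps a recognizer cope with under- or over-lit faces before training or matching.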