Abstract: The median problem is widely applied to derive the most plausible rearrangement phylogenetic tree for a set of species. More specifically, the problem is concerned with finding a permutation that minimizes the sum of distances between itself and each of a set of three signed permutations. Genomes with an equal number of genes but a different gene order can be represented as permutations.
In this paper, we propose BeamGA median, an algorithm that combines a heuristic search approach (local beam) as an initialization step to generate a number of candidate solutions with a Genetic Algorithm (GA) that refines those solutions, aiming to achieve a better median with the smallest possible reversal distance from the three original permutations. In this approach, any genome rearrangement distance can be applied; in this paper,
we use the reversal distance. To the best of our knowledge, the proposed approach has not been applied to the median problem before. Our approach considers a true biological evolution scenario by applying the concept of common intervals during the GA optimization process. This allows us to imitate true biological behavior and to speed up the convergence of the genetic approach. We were able to handle permutations with a large number of genes within acceptable time and with the same or better accuracy compared to existing algorithms.
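The median objective above can be made concrete with a small sketch. The breakpoint distance used here is a simpler stand-in for the reversal distance the paper actually uses (the two are closely related measures of rearrangement); `breakpoints` and `median_score` are hypothetical names, not the paper's code.

```python
def breakpoints(perm, target):
    """Count breakpoints of signed permutation `perm` relative to `target`.

    A breakpoint is a pair of elements adjacent in `perm` whose adjacency
    (with signs) does not occur in `target`.  Both permutations are framed
    with 0 and n+1 so that end adjacencies are counted too.
    """
    n = len(perm)
    ext_p = [0] + list(perm) + [n + 1]
    ext_t = [0] + list(target) + [n + 1]
    # Adjacencies of the target, in both reading directions (signs flip).
    adj = set()
    for a, b in zip(ext_t, ext_t[1:]):
        adj.add((a, b))
        adj.add((-b, -a))
    return sum((a, b) not in adj for a, b in zip(ext_p, ext_p[1:]))

def median_score(candidate, genomes):
    """Sum of breakpoint distances from `candidate` to each input genome."""
    return sum(breakpoints(candidate, g) for g in genomes)
```

A candidate median is better the smaller its `median_score` against the three input permutations; a GA fitness function can use this score directly.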
Abstract: In this paper, we study a distributed control algorithm
for the problem of unknown area coverage by a network of robots.
The coverage objective is to locate a set of targets in the area and
to minimize the robots’ energy consumption. The robots have no prior knowledge of either the location or the number of targets in the area. One efficient way to compensate for this lack of knowledge is to incorporate an auxiliary learning algorithm into the control scheme. The learning algorithm allows the robots to explore and study the unknown environment and eventually overcome their lack of knowledge. The control algorithm itself is modeled using game theory: the network of robots uses its collective information to play a non-cooperative potential game. The algorithm is tested via simulations to verify its
performance and adaptability.
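As a minimal illustration of the game-theoretic control idea, the sketch below runs best-response dynamics in a tiny congestion game, a standard example of a potential game in which such dynamics provably reach a pure Nash equilibrium. The robots-and-targets setting is reduced to players choosing resources; all names and parameters are illustrative assumptions, not the paper's model.

```python
def load(assign):
    """Number of players on each resource."""
    counts = {}
    for r in assign:
        counts[r] = counts.get(r, 0) + 1
    return counts

def best_response_dynamics(n_players, resources, max_rounds=100):
    """Repeatedly let each player switch to its cheapest resource, where a
    resource's cost equals the number of players using it.  Congestion games
    admit an exact potential, so this converges to a pure Nash equilibrium."""
    assign = [resources[0]] * n_players          # everyone starts on resource 0
    for _ in range(max_rounds):
        changed = False
        for i in range(n_players):
            counts = load(assign)
            def cost(r):
                # Load on r if player i moved there (it leaves its own resource).
                return counts.get(r, 0) + (0 if assign[i] == r else 1)
            best = min(resources, key=cost)
            if cost(best) < cost(assign[i]):
                assign[i] = best
                changed = True
        if not changed:      # no player wants to deviate: Nash equilibrium
            return assign
    return assign
```

With four players and two resources the dynamics settle on a balanced 2–2 split, the analogue of robots spreading over targets to save energy.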
Abstract: Cloud computing is one of the most significant developments in computing technology. It provides users with flexibility, cost effectiveness, location independence, easy maintenance, multitenancy, drastic performance improvements, and increased productivity. On the other hand, there are also major issues, security chief among them: since the cloud is a shared server, it is important to protect users’ private data, especially in e-commerce and social networks. In this paper, encryption algorithms such as the Advanced Encryption Standard (AES), their vulnerabilities, risk of attacks, optimal time and complexity management, and a comparison with other algorithms based on software implementation are presented. Encryption techniques to improve the performance of AES and to reduce risk are given. Secure Hash Algorithms, their vulnerabilities, software implementations, risk of attacks, and a comparison with other hashing algorithms, as well as the advantages and disadvantages of hashing versus encryption, are also given.
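As a small illustration of the hashing side of this survey, the snippet below computes SHA-256 digests with Python's standard `hashlib` and counts how many bits flip when a single input character changes (the avalanche effect). This is a generic example, not the paper's implementation.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Hex digest of SHA-256, one of the Secure Hash Algorithms."""
    return hashlib.sha256(data).hexdigest()

d1 = sha256_hex(b"cloud security")
d2 = sha256_hex(b"cloud securitY")   # one character changed
# The digests differ in roughly half their bits (avalanche effect),
# and neither reveals anything about the input.
diff_bits = bin(int(d1, 16) ^ int(d2, 16)).count("1")
```

Unlike encryption, hashing is one-way: there is no key and no decryption, which is the trade-off the abstract's hashing-versus-encryption comparison concerns.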
Abstract: The biological function of an RNA molecule depends
on its structure. The objective of alignment is to find the homology between two or more RNA secondary structures. Knowing
the common functionalities between two RNA structures allows
a better understanding and a discovery of other relationships
between them. Moreover, identifying non-coding RNAs (RNAs that are not translated into a protein) is a popular application in which RNA structural alignment is the first step. A few methods for RNA
structure-to-structure alignment have been developed. Most of these
methods are partial structure-to-structure, sequence-to-structure, or
structure-to-sequence alignment. Less attention has been given in the literature to the use of efficient RNA structure representations, and structure-to-structure alignment methods are lacking. In this paper,
we introduce an O(N²) Component-based Pairwise RNA Structure
Alignment (CompPSA) algorithm, where structures are given as
a component-based representation and where N is the maximum
number of components in the two structures. The proposed algorithm
compares the two RNA secondary structures based on their weighted
component features rather than on their base-pair details. Extensive
experiments are conducted illustrating the efficiency of the CompPSA
algorithm when compared to other approaches and on different real
and simulated datasets. The CompPSA algorithm shows an accurate
similarity measure between components. The algorithm gives the
flexibility for the user to align the two RNA structures based on
their weighted features (position, full length, and/or stem length).
Moreover, the algorithm demonstrates scalability and efficiency in time and memory performance.
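A minimal sketch of component-based pairwise alignment, assuming each component is summarized by a (position, full length, stem length) feature tuple as the abstract suggests. The dynamic program is a standard Needleman-Wunsch-style global alignment over components, which matches the stated O(N²) bound; the weights, gap cost, and function names are hypothetical, not CompPSA's actual parameters.

```python
def component_cost(c1, c2):
    """Mismatch cost between two components described by weighted feature
    tuples (position, full_length, stem_length).  Hypothetical unit weights;
    the real CompPSA weighting may differ."""
    weights = (1.0, 1.0, 1.0)
    return sum(w * abs(a - b) for w, a, b in zip(weights, c1, c2))

def align_components(s1, s2, gap=2.0):
    """Global alignment (Needleman-Wunsch-style DP) over component lists;
    O(N^2) time and memory for N = max(len(s1), len(s2))."""
    n, m = len(s1), len(s2)
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = i * gap
    for j in range(1, m + 1):
        D[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = min(
                D[i - 1][j - 1] + component_cost(s1[i - 1], s2[j - 1]),
                D[i - 1][j] + gap,     # component of s1 left unmatched
                D[i][j - 1] + gap,     # component of s2 left unmatched
            )
    return D[n][m]
```

A score of 0 means the two component lists match feature-for-feature; larger scores measure structural dissimilarity without touching base-pair details.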
Abstract: This paper describes the use of the Internet as a feature to enhance the security of software that is to be distributed or sold to users potentially all over the world. By placing some of the features of the protected software in a secure server, we increase the security of that software. The communication between the protected software and the secure server is handled by a double-lock algorithm. This paper also includes an analysis of intruders and describes possible responses to detected threats.
Abstract: This study aimed at developing an inverse heat transfer approach for predicting the time-varying freezing front and the temperature distribution of tumors during cryosurgery. Using a temperature probe pressed against the tumor layer, the inverse approach is able to predict simultaneously the metabolic heat generation and the blood perfusion rate of the tumor. Once these parameters are predicted, the temperature field and time-varying freezing fronts are determined with the direct model. The direct model rests on the one-dimensional Pennes bioheat equation. The phase change problem is handled with the enthalpy method. The Levenberg-Marquardt Method (LMM) combined with the Broyden Method (BM) is used to solve the inverse model. The effects (a) of the thermal properties of the diseased tissues; (b) of the initial guesses for the unknown thermal properties; (c) of the data capture frequency; and (d) of the noise on the recorded temperatures are examined. It is shown that the proposed inverse approach remains accurate for all the cases investigated.
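For orientation, here is a minimal explicit finite-difference step of the 1-D Pennes bioheat equation that the direct model rests on. The property values, grid, and fixed-temperature boundaries are illustrative assumptions only; the paper's direct model additionally handles phase change via the enthalpy method, which this sketch omits.

```python
def pennes_step(T, dt, dx, k=0.5, rho_c=3.6e6, w_rcb=4000.0,
                T_a=37.0, q_m=4000.0):
    """One explicit finite-difference step of the 1-D Pennes equation
        rho*c dT/dt = k d2T/dx2 + w_b*rho_b*c_b (T_a - T) + q_m.
    Property values are illustrative, not the paper's.  Boundary nodes
    are held fixed (Dirichlet)."""
    new = T[:]
    for i in range(1, len(T) - 1):
        lap = (T[i - 1] - 2 * T[i] + T[i + 1]) / dx**2
        new[i] = T[i] + dt / rho_c * (k * lap + w_rcb * (T_a - T[i]) + q_m)
    return new

# Cryo-probe held at -40 C at x = 0, body core at 37 C at the far end.
T = [-40.0] + [37.0] * 19
dx, dt = 1e-3, 0.05   # 1 mm grid, 0.05 s step (k*dt/(rho_c*dx^2) << 0.5, stable)
for _ in range(200):
    T = pennes_step(T, dt, dx)
```

In the inverse problem, `w_rcb` and `q_m` are the unknowns: LMM/BM iterates them until the simulated probe temperatures match the recorded ones.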
Abstract: This paper addresses the creation of virtual 3D environments for the study and development of image-based navigation algorithms and techniques for mobile robots, which need to operate robustly and efficiently. These algorithms can be tested physically, by conducting experiments on a prototype, or by numerical simulation. Current simulation platforms for robotic applications do not have flexible and up-to-date models for image rendering, and are unable to reproduce complex light effects and materials. Thus, it is necessary to create a test platform that integrates sophisticated simulated representations of real navigation environments with data and image processing. This work proposes the development of a high-level platform for building 3D modeled environments and testing image-based navigation algorithms for mobile robots. Texture and lighting effects were applied in order to make the rendered images accurately represent their real-world counterparts. The application integrates image processing scripts, trajectory control, dynamic modeling, and simulation techniques for physics representation and picture rendering with the open-source 3D creation suite Blender.
Abstract: A total of 20 bottom sediment samples were collected from the Lekki Lagoon during the wet and dry seasons. The study was carried out to determine the textural characteristics, sediment distribution pattern, and energy of transportation within the lagoon system. The sediment grain sizes and depth profiles were analyzed using the dry sieving method, with a MATLAB algorithm for processing. The granulometric analysis reveals fine-grained sand for both the wet and dry seasons, with average mean values of 2.03 ϕ and 2.88 ϕ, respectively. Sediments were moderately sorted, with average inclusive standard deviations of 0.77 ϕ and 0.82 ϕ. Skewness varied from strongly coarse to near symmetrical, at 0.34 ϕ and 0.09 ϕ. The average kurtosis values were 0.87 ϕ and 1.4 ϕ (platykurtic and leptokurtic). Overall, the bathymetry shows an average depth of 4.0 m; the deepest and shallowest areas have depths of 11.2 m and 0.5 m, respectively. A high concentration of fine sand was observed in deep areas compared to shallow areas during both the wet and dry seasons. The statistical parameters show that the sediments are overall well sorted and were deposited under low-energy conditions over a long distance. The sediment distribution and transport patterns of the Lekki Lagoon are controlled by a low-energy current, and the down-slope configuration of the bathymetry enhances the sorting and deposition rates in the lagoon.
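The graphic mean, sorting, skewness, and kurtosis reported above are conventionally computed with the Folk & Ward (1957) formulas from phi percentiles of the grain-size distribution; a sketch assuming those standard formulas (not the authors' MATLAB code):

```python
def folk_ward(phi):
    """Folk & Ward (1957) graphic statistics from phi percentiles.
    `phi` maps percentile -> grain size in phi units; requires the
    5, 16, 25, 50, 75, 84 and 95 percentiles."""
    p = phi
    mean = (p[16] + p[50] + p[84]) / 3
    sorting = (p[84] - p[16]) / 4 + (p[95] - p[5]) / 6.6      # inclusive std. dev.
    skewness = ((p[16] + p[84] - 2 * p[50]) / (2 * (p[84] - p[16]))
                + (p[5] + p[95] - 2 * p[50]) / (2 * (p[95] - p[5])))
    kurtosis = (p[95] - p[5]) / (2.44 * (p[75] - p[25]))
    return mean, sorting, skewness, kurtosis
```

A sorting value between 0.71 ϕ and 1.0 ϕ corresponds to the "moderately sorted" class reported in the abstract.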
Abstract: Existing point generalization algorithms focus merely on the overall information of point groups, without taking into account the attribute richness of POI (point of interest) data or a spatial distribution constrained by roads. To address this, a POI generalization method considering both attribute information and spatial distribution is proposed. The hierarchical characteristics of urban POI information expression are first analyzed to identify the measurement features of each hierarchy. On this basis, an urban POI generalization strategy is put forward: POIs within the urban road network are divided into three distribution patterns, and corresponding generalization methods are proposed according to the characteristics of the POI data in each pattern. Experimental results show that the method, by taking into account both the attribute information and the spatial distribution characteristics of POIs, better implements urban POI generalization in map presentation.
Abstract: In recent years, combined economic and emission power dispatch has become one of the main problems in electrical power systems. It aims to schedule the power generation of generators so as to minimize the production cost and the emission of harmful gases, such as CO, CO2, NOx, and SO2, caused by fossil-fueled thermal units. To solve this complicated multi-objective problem, an improved version of the particle swarm optimization technique that includes the non-dominated sorting concept has been proposed. Valve-point loading effects and system losses have been considered. The three-unit and ten-unit benchmark systems have been used to show the effectiveness of the suggested optimization technique for solving this kind of nonconvex problem. The simulation results have been compared with those obtained using a genetic-algorithm-based method. The comparison shows that the proposed approach can provide a higher-quality solution with better performance.
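The non-dominated sorting concept added to the particle swarm here can be sketched in a few lines: objective vectors such as (cost, emission) are split into successive Pareto fronts, front 0 being the set of trade-off solutions no other solution improves on in both objectives. This is a straightforward O(n²)-per-front version for illustration, not the paper's implementation.

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Partition objective vectors, e.g. (cost, emission) pairs, into
    Pareto fronts; front 0 is the non-dominated set."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i]) for j in remaining)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts
```

In a multi-objective PSO, particles in earlier fronts are preferred when selecting leaders and truncating the archive.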
Abstract: Clustering is a well-known data mining technique used in pattern recognition and information retrieval. The initial dataset to be clustered can contain either categorical or numeric data, and each type of data has its own specific clustering algorithms. In this context, two algorithms are commonly used: k-means for clustering numeric datasets and k-modes for categorical datasets. A common problem in data mining applications is clustering the categorical data that is so prevalent in real datasets. One way to achieve clustering on categorical values is to transform the categorical attributes into numeric measures and directly apply the k-means algorithm instead of k-modes. In this paper, we propose and experiment with an approach based on this idea, transforming categorical values into numeric ones using the relative frequency of each modality in the attributes. The proposed approach is compared with a previous method based on transforming the categorical datasets into binary values. The scalability and accuracy of the two methods are evaluated experimentally. The obtained results show that our proposed method outperforms the binary method in all cases.
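The core transformation can be sketched directly: each categorical value is replaced by the relative frequency of its modality in the attribute, after which k-means applies. Function and field names are illustrative, not the paper's code.

```python
def relative_frequency_encode(rows, col):
    """Replace each categorical value in column `col` with the relative
    frequency of its modality in that column, so that k-means can
    operate on a numeric measure."""
    values = [r[col] for r in rows]
    n = len(values)
    freq = {v: values.count(v) / n for v in set(values)}
    return [dict(r, **{col: freq[r[col]]}) for r in rows]

# Example: "red" appears in 3 of 4 rows, "blue" in 1 of 4.
rows = [{"color": "red"}, {"color": "red"}, {"color": "blue"}, {"color": "red"}]
encoded = relative_frequency_encode(rows, "color")
```

Unlike binary (one-hot) encoding, this keeps one numeric column per attribute, which is what gives the approach its scalability advantage.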
Abstract: The increasing amount of collected data has limited the performance of current analysis algorithms. Thus, developing new cost-effective algorithms in terms of complexity, scalability, and accuracy has raised significant interest. In this paper, a modified, effective k-means-based algorithm is developed and experimented with. The new algorithm aims to reduce the computational load without significantly affecting the quality of the clustering. The algorithm uses the City Block distance and a new stop criterion to guarantee convergence. Experiments conducted on a real dataset show its high performance when compared with the original k-means version.
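A minimal sketch of a City Block (L1) k-means variant, assuming centers are updated with the coordinate-wise median (the minimizer of total L1 distance) and that the stop criterion is a simple center-movement tolerance; the paper's actual stop criterion may differ.

```python
def cityblock(a, b):
    """City Block (Manhattan, L1) distance."""
    return sum(abs(x - y) for x, y in zip(a, b))

def kmeans_l1(points, centers, max_iter=100, tol=1e-6):
    """k-means variant with the City Block distance; centers are updated
    with the coordinate-wise median, which minimizes total L1 distance.
    Stops when every center moves less than `tol`."""
    def median(xs):
        s = sorted(xs)
        m = len(s) // 2
        return s[m] if len(s) % 2 else (s[m - 1] + s[m]) / 2
    clusters = [[] for _ in centers]
    for _ in range(max_iter):
        clusters = [[] for _ in centers]
        for p in points:                         # assignment step
            i = min(range(len(centers)), key=lambda i: cityblock(p, centers[i]))
            clusters[i].append(p)
        new = [tuple(median([p[d] for p in c]) for d in range(len(centers[0])))
               if c else centers[i] for i, c in enumerate(clusters)]
        if max(cityblock(a, b) for a, b in zip(centers, new)) < tol:
            return new, clusters                 # converged
        centers = new
    return centers, clusters
```

The L1 distance avoids the multiplications of squared Euclidean distance, which is one plausible source of the reduced computational load.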
Abstract: Mobile localization has attracted a great deal of attention recently due to the introduction of wireless networks. Although several localization algorithms and systems have been implemented and discussed in the literature, very few researchers have exploited the gap that exists between indoor localization, tracking, external storage of location information, and outdoor localization for the purpose of digital forensics during and after a disaster. The contribution of this paper lies in the implementation of a robust system that is capable of locating and tracking mobile device users and of storing location information for both indoor and partially outdoor environments in the cloud. The system can be used during a disaster to track and locate mobile phone users. The developed system is a mobile application built with Android, Hypertext Preprocessor (PHP), Cascading Style Sheets (CSS), JavaScript, and MATLAB for Android mobile users. Using the Waterfall model of software development, we have implemented a three-level system that is able to track, locate, and store mobile device information in a secure cloud database on an almost real-time basis. The outcome of the study showed that the developed system is efficient at tracking and locating mobile devices. The system is also flexible, i.e., it can be used in any building with few adjustments. Finally, the system is accurate in locating and tracking mobile devices both indoors and outdoors.
Abstract: Automated irrigation systems conveniently protect landscape investments. While conventional irrigation systems are known to be inefficient, automated ones have the potential to optimize water usage. In fact, there is a new generation of irrigation systems that are smart in the sense that they monitor the weather, soil conditions, evaporation, and plant water use, and automatically adjust the irrigation schedule. In this paper, we present an agent-based smart irrigation system. The agents are built using a mix of commercial off-the-shelf software, including MATLAB, Microsoft Excel, and the KEPServerEX 5 OPC server, and custom-written code. The Irrigation Scheduler Agent uses fuzzy logic to integrate the information that affects the irrigation schedule. In addition, the multi-agent system uses Open Platform Connectivity (OPC) technology to share data. OPC technology enables the Irrigation Scheduler Agent to communicate over the Internet, making the system scalable to a municipal or regional agent-based water monitoring, management, and optimization system. Finally, this paper presents simulation and pilot installation test results that show the operational effectiveness of our system.
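To illustrate how fuzzy logic can integrate soil and weather information into a watering schedule, here is a toy Mamdani-style sketch. The membership functions, rules, and output durations are invented for illustration and are not the paper's rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def irrigation_minutes(soil_moisture_pct, temperature_c):
    """Toy rule base (hypothetical):
         dry soil   -> long watering (30 min)
         moist soil -> short watering (10 min)
         hot day    -> long watering (30 min)
    Output is the firing-strength-weighted average of the rule outputs."""
    dry = tri(soil_moisture_pct, -1, 0, 50)
    moist = tri(soil_moisture_pct, 30, 70, 101)
    hot = tri(temperature_c, 25, 40, 55)
    rules = [(dry, 30.0), (moist, 10.0), (hot, 30.0)]
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0
```

Fuzzy rules of this shape degrade gracefully between conditions: partially dry soil on a warm day yields an intermediate duration rather than an on/off decision.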
Abstract: The objective of this study is to examine the performance of three well-known multiobjective evolutionary algorithms for solving optimization problems: the Non-dominated Sorting Genetic Algorithm II (NSGA-II), the Strength Pareto Evolutionary Algorithm 2 (SPEA-2), and the Multiobjective Evolutionary Algorithm based on Decomposition (MOEA/D). The examined algorithms are analyzed and tested on the ZDT set of test functions using three performance metrics. The results indicate that NSGA-II performs better than the other two algorithms on all three metrics.
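For reference, the first function of the ZDT test set used in the comparison; ZDT1's Pareto-optimal front is obtained at x2 = … = xn = 0, where f2 = 1 − √f1.

```python
import math

def zdt1(x):
    """ZDT1 benchmark: x in [0,1]^n, two objectives to minimize."""
    f1 = x[0]
    g = 1 + 9 * sum(x[1:]) / (len(x) - 1)
    f2 = g * (1 - math.sqrt(f1 / g))
    return f1, f2
```

Performance metrics such as hypervolume or generational distance then measure how closely each algorithm's final population approximates this known front.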
Abstract: Coastal regions are among the areas most heavily used by a growing population, and their natural balance is under pressure. In coastal engineering, the most valuable data concern wave behavior, and the amount of such data becomes very large because observations run over periods of hours, days, and months. In this study, statistical methods such as wave spectrum analysis and standard statistical methods have been used. The goal of this study is to discover profiles of different coastal areas using these statistical methods, and thus to obtain an instance-based dataset from the big data for analysis with data mining algorithms. In the experimental studies, six sample datasets of wave behavior, obtained from 20-minute observations in Mersin Bay, Turkey, were converted to an instance-based form, and different data mining clustering techniques were used to discover similar coastal places. Moreover, this study discusses how this summarization approach can be used in other fields that collect big data, such as medicine.
Abstract: Several computationally challenging issues are
encountered while classifying complex natural scenes. In this
paper, we address the problems that are encountered in rotation
invariance with multi-intensity analysis for multi-scene overlapping.
In the literature, various algorithms propose techniques for multi-intensity analysis, but these algorithms face several restrictions when deployed for multi-scene overlapping classification. In order to resolve the problem of multi-scene
overlapping classifications, we present a framework that is based
on macro and micro basis functions. This algorithm minimizes classification false alarms when categorizing multi-scene overlap. Furthermore, a quadrangle multi-intensity decay is invoked. Several parameters are utilized to analyze invariance for multi-scene classification, such as rotation, classification, correlation, contrast, homogeneity, and energy. Benchmark datasets
were collected for complex natural scenes and used to evaluate the framework. The results show that the framework achieves a significant improvement on gray-level co-occurrence matrix (GLCM) features for overlap at diverse degrees of orientation when categorizing multi-scene overlap.
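Texture parameters such as contrast, homogeneity, energy, and correlation are conventionally derived from a gray-level co-occurrence matrix; a minimal GLCM sketch with two of these features (generic, not the paper's framework):

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Gray-level co-occurrence matrix for pixel offset (dx, dy),
    normalized to co-occurrence probabilities."""
    P = [[0.0] * levels for _ in range(levels)]
    h, w = len(image), len(image[0])
    total = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                P[image[y][x]][image[y2][x2]] += 1
                total += 1
    return [[v / total for v in row] for row in P]

def contrast(P):
    """Weighted squared gray-level difference of co-occurring pixels."""
    return sum(P[i][j] * (i - j) ** 2 for i in range(len(P)) for j in range(len(P)))

def energy(P):
    """Sum of squared probabilities; high for uniform textures."""
    return sum(v * v for row in P for v in row)
```

Varying the offset (dx, dy) over several orientations is what makes such features usable for the rotation-invariance analysis the abstract describes.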
Abstract: With the increase in popularity of mobile devices,
new and varied forms of malware have emerged. Consequently,
the organizations for cyberdefense have echoed the need to deploy
more effective defensive schemes adapted to the challenges posed
by these recent monitoring environments. In order to contribute to
their development, this paper presents a malware detection strategy
for mobile devices based on sequence alignment algorithms. Unlike previous proposals, only the system calls performed during the startup of applications are studied. In this way, it is possible to efficiently study in depth the sequences of system calls executed by applications just downloaded from app stores, launching them in a secure and isolated environment. As demonstrated in the
performed experimentation, most of the analyzed malicious activities
were successfully identified in their boot processes.
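As a sketch of the sequence-alignment core, the function below computes a Needleman-Wunsch global alignment score between two system-call traces. The scoring values and the idea of comparing a sample's startup trace against a known malicious trace are illustrative assumptions, not the paper's exact scheme.

```python
def nw_score(seq_a, seq_b, match=1, mismatch=-1, gap=-1):
    """Needleman-Wunsch global alignment score between two sequences of
    system-call names.  A high score relative to a known malicious startup
    trace would flag the sample for closer inspection."""
    n, m = len(seq_a), len(seq_b)
    D = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = i * gap
    for j in range(1, m + 1):
        D[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if seq_a[i - 1] == seq_b[j - 1] else mismatch
            D[i][j] = max(D[i - 1][j - 1] + s,    # align the two calls
                          D[i - 1][j] + gap,      # gap in seq_b
                          D[i][j - 1] + gap)      # gap in seq_a
    return D[n][m]
```

Restricting the comparison to startup traces keeps the sequences short, which matters because this dynamic program is quadratic in their lengths.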
Abstract: The benchmarking of tools for dynamic analysis of
vulnerabilities in web applications is something that is done
periodically, because these tools from time to time update their
knowledge base and search algorithms, in order to improve their
accuracy. Unfortunately, the vast majority of these evaluations are made by software enthusiasts who publish their results on blogs or non-academic websites, and always with the same evaluation methodology. Similarly, most academics who have carried out this type of analysis from a scientific approach use the same methodology as the empirical authors. This paper is motivated by the interest in finding answers to questions that users of this type of tool have been asking for years, such as whether a tool truly tests and evaluates every vulnerability it claims to, or whether it really delivers a complete report of all the vulnerabilities tested and exploited. These questions have also motivated previous work, but without real answers. The aim of this paper is to present results that truly answer, at least for the tested tools, all those unanswered questions. All the results have been obtained by changing the common benchmarking model used in all those previous works.
Abstract: An increasing number of mobile devices with integrated
cameras has meant that most digital video comes from these devices.
These digital videos can be made anytime, anywhere and for different
purposes. They can also be shared on the Internet in a short period
of time and may sometimes contain recordings of illegal acts. The
need to reliably trace the origin becomes evident when these videos
are used for forensic purposes. This work proposes an algorithm to identify the brand and model of the mobile device that generated a video. Its procedure is as follows: after obtaining the relevant
video information, a classification algorithm based on sensor noise
and Wavelet Transform performs the aforementioned identification
process. We also present experimental results that support the validity
of the techniques used and show promising results.