Abstract: The lifetime of a wireless sensor network can be
effectively increased by using scheduling operations. Once the
sensors are randomly deployed, the task at hand is to find the largest
number of disjoint sets of sensors such that every sensor set provides
complete coverage of the target area. At any instant, only one of these
disjoint sets is switched on, while all others are switched off. This
paper proposes a heuristic search method to find the maximum
number of disjoint sets that completely cover the region. A
population of randomly initialized members is made to explore the
solution space. A set of heuristics has been applied to guide the
members to a possible solution in their neighborhood. The heuristics
accelerate the convergence of the algorithm. The best solution explored
by the population is recorded and is continuously updated. The
proposed algorithm has been tested for applications which require
sensing of multiple target points, referred to as point coverage
applications. Results show that the proposed algorithm outperforms the
existing algorithms: it always finds the optimum solution, and does so
while making fewer fitness function evaluations than the existing
approaches.
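The combinatorial core of the scheduling task, partitioning sensors into disjoint sets that each cover every target, can be illustrated with a simple greedy baseline. This is a hypothetical sketch for intuition only, not the population-based heuristic the paper proposes; `coverage` maps each sensor to the set of target points it senses:

```python
def disjoint_covers(coverage, n_targets):
    """Greedy baseline: repeatedly build a new sensor set that covers
    all targets, using each sensor in at most one set."""
    targets = set(range(n_targets))
    unused = set(coverage)            # sensor id -> set of covered targets
    covers = []
    while True:
        chosen, covered = [], set()
        pool = sorted(unused)         # deterministic tie-breaking
        for _ in range(len(pool)):
            if covered == targets:
                break
            # pick the unused sensor adding the most uncovered targets
            s = max(pool, key=lambda t: len(coverage[t] - covered))
            pool.remove(s)
            if coverage[s] - covered:
                chosen.append(s)
                covered |= coverage[s]
        if covered == targets:
            covers.append(chosen)
            unused -= set(chosen)
        else:
            return covers
```

Each pass either completes one more disjoint cover or terminates; the paper's heuristic search explores this same solution space more thoroughly than the greedy pass does.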
Abstract: This paper introduces symbiotic organism search (SOS)
for solving the capacitated vehicle routing problem (CVRP). SOS is a
new metaheuristic approach that has not previously been used to solve
discrete problems. A sophisticated decoding method is applied within
the basic SOS framework to deal with the discrete problem setting of
the CVRP. The performance of the algorithm was evaluated on a set of
benchmark instances, and the results were compared with the best
known solutions. The computational results of this preliminary
testing show that the proposed algorithm can produce good solutions.
These results indicate that the proposed SOS can be applied as an
alternative method for solving the capacitated vehicle routing problem.
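The abstract does not detail the decoding method, but a common way to let a continuous metaheuristic such as SOS operate on the discrete CVRP is random-key decoding: rank customers by their continuous key, then split the resulting giant tour by vehicle capacity. The sketch below illustrates that generic idea only, under those assumptions:

```python
def decode_to_routes(keys, demands, capacity):
    """Random-key style decoding sketch: sort customers by their
    continuous key, then cut the giant tour whenever the next
    customer would exceed the vehicle capacity."""
    order = sorted(range(len(keys)), key=lambda i: keys[i])
    routes, cur, load = [], [], 0.0
    for c in order:
        if load + demands[c] > capacity:   # start a new vehicle route
            routes.append(cur)
            cur, load = [], 0.0
        cur.append(c)
        load += demands[c]
    if cur:
        routes.append(cur)
    return routes
```

With such a decoder, the continuous SOS update rules never need to know about routes at all; only the fitness evaluation does.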
Abstract: Quality of Service (QoS) attributes, as part of the
service description, are an important factor in service selection. It
is not easy to quantify the weight of each QoS condition exactly,
since human judgments based on individual preferences introduce
vagueness. As web service selection requires optimization,
evolutionary computing based on heuristics is adopted to select an
optimal solution. In this work, the evolutionary computing technique
Particle Swarm Optimization (PSO) is used to select suitable web
services based on the user's weighting of each QoS value, optimizing
the QoS weight vector and thereby finding the best weight vectors for
the services being selected. Finally, the results are compared
and analyzed using static inertia weight and deterministic inertia
weight of PSO.
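For context, the two inertia-weight schemes being compared can be sketched as follows. The linearly decreasing schedule and the coefficients `c1 = c2 = 2.0` are common textbook choices assumed here for illustration, not necessarily the paper's exact settings:

```python
import random

def pso_step(positions, velocities, pbest, gbest, w, c1=2.0, c2=2.0):
    """One PSO update over scalar (1-D) particles; w is the inertia
    weight applied to the previous velocity."""
    new_pos, new_vel = [], []
    for x, v, p in zip(positions, velocities, pbest):
        r1, r2 = random.random(), random.random()
        nv = w * v + c1 * r1 * (p - x) + c2 * r2 * (gbest - x)
        new_vel.append(nv)
        new_pos.append(x + nv)
    return new_pos, new_vel

def deterministic_inertia(t, t_max, w_max=0.9, w_min=0.4):
    """Deterministic schedule: inertia decreases linearly with the
    iteration counter t; a static scheme would just pass a constant w."""
    return w_max - (w_max - w_min) * t / t_max
```

A static inertia weight simply calls `pso_step` with the same `w` every iteration, while the deterministic variant feeds in `deterministic_inertia(t, t_max)`.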
Abstract: In this paper, the performance of the shuffled frog
leaping algorithm was investigated on the stealth laser dicing
process. The effect of the problem on the performance of the
algorithm was assessed against the tolerance of the meandering data,
which the customer specification requires to be less than five
microns, with a target of zero microns. Currently, the meandering
levels are unsatisfactory when compared to the customer
specification. Firstly, a two-level factorial design was applied to
preliminarily study the statistically significant effects of five
process variables; in this study, one influential process variable is
integer-valued. From the experimental results, the new operating condition
from the algorithm was superior when compared to the current
manufacturing condition.
Abstract: In this paper, we consider the vehicle routing problem
with a mixed fleet of conventional and heterogeneous electric vehicles
and time dependent charging costs, denoted VRP-HFCC, in which
a set of geographically scattered customers have to be served by a
mixed fleet of vehicles composed of a heterogeneous fleet of Electric
Vehicles (EVs), having different battery capacities and operating
costs, and Conventional Vehicles (CVs). We include the possibility
of charging EVs in the available charging stations during the routes
in order to serve all customers. Each charging station offers charging
service with a known technology of chargers and time dependent
charging costs. Charging stations are also subject to operating
time-window constraints. EVs are not necessarily compatible with all
available charging technologies and a partial charging is allowed.
Intermittent charging at the depot is also allowed provided that
constraints related to the electricity grid are satisfied.
The objective is to minimize the number of employed vehicles and
then minimize the total travel and charging costs.
In this study, we present a Mixed Integer Programming Model and
develop a Charging Routing Heuristic and a Local Search Heuristic
based on the Inject-Eject routine with different insertion methods. All
heuristics are tested on real data instances.
Abstract: In this paper, we propose a new packing strategy to
find a free resource for run-time mapping of application tasks to
NoC-based Heterogeneous MPSoC. The proposed strategy minimizes
the task mapping time in addition to placing the communicating tasks
close to each other. To evaluate our approach, a comparative study is
carried out for a platform containing PEs that each support a single
task. Experiments show that our strategy provides better results when
compared to the latest dynamic mapping strategies reported in the
literature.
Abstract: This paper is concerned with minimization of mean
tardiness and flow time in a real single machine production
scheduling problem. Two variants of a genetic algorithm metaheuristic,
combined with a hyper-heuristic approach, are proposed to solve this
problem. These methods are used to solve instances
generated with real world data from a company. Encouraging results
are reported.
Abstract: The integrated problem of production and distribution scheduling is relevant in many industrial applications. Thus, many heuristics to solve this integrated problem have been developed in the last decade. Most of these heuristics use a sequential working principle or a single decomposition and integration approach to separate and solve subproblems. A heuristic using a multi-step decomposition and integration approach is presented in this paper and evaluated in a case study. The results show significant improvement compared with sequential scheduling heuristics.
Abstract: The website developer and designer should follow usability guidelines to provide a user-friendly interface. Many guidelines and heuristics have been developed by previous studies to help both the developer and the designer in this task, but e-government websites are special cases that require specialized guidelines. This paper introduces a set of 18 guidelines for evaluating the usability of e-government websites in general and Arabic e-government websites specifically, along with a checklist of how to apply them. The validity and effectiveness of these guidelines were evaluated against a variety of user characteristics. The results indicated that the proposed set of guidelines can be used to identify qualitative similarities and differences with user testing, and that the new set is best suited for evaluating general and e-governmental usability.
Abstract: We model the process of a data center as a multi-objective problem of mapping independent tasks onto a set of data center machines that simultaneously minimizes the energy consumption and response time (makespan) subject to the constraints of deadlines and architectural requirements. A simple technique based on multi-objective goal programming is proposed that guarantees a Pareto optimal solution with excellent convergence behavior. The proposed technique is also compared with another traditional approach. The simulation results show that the proposed technique achieves superior performance compared to the min-min heuristics, and competitive performance relative to the optimal solution implemented in UNDO for small-scale problems.
Abstract: The travelling salesman problem (TSP) is a combinatorial optimization problem in which the goal is to find the shortest path through the different cities that the salesman visits. In other words, the problem deals with finding a route covering all cities so that total distance and execution time are minimized. This paper adopts the nearest neighbor and minimum spanning tree algorithms to solve the well-known travelling salesman problem. The algorithms were implemented using the Java programming language. The approach is tested on three graphs forming TSP tour instances of 5, 10, and 229 cities. The computational results validate the performance of the proposed algorithm.
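The nearest-neighbor construction can be sketched in a few lines (shown here in Python rather than the paper's Java implementation): always travel to the closest unvisited city, then return to the start:

```python
import math

def nearest_neighbor_tour(cities, start=0):
    """Greedy nearest-neighbor TSP heuristic over (x, y) coordinates:
    visit the closest unvisited city next, then close the tour."""
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: math.dist(cities[last], cities[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    tour.append(start)  # return to the starting city
    return tour

def tour_length(cities, tour):
    """Total Euclidean length of a closed tour."""
    return sum(math.dist(cities[a], cities[b]) for a, b in zip(tour, tour[1:]))
```

The heuristic runs in O(n^2) time and gives a quick upper bound on tour length; the MST-based algorithm mentioned in the abstract provides an alternative constructive bound.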
Abstract: Discrete search path planning in a time-constrained, uncertain environment relying upon imperfect sensors is known to be hard, and the problem-solving techniques proposed so far to compute near real-time efficient path plans are mainly limited to providing solutions of a few moves. A new information-theoretic, open-loop decision model explicitly incorporating false alarm sensor readings, to solve a single-agent military logistics search-and-delivery path planning problem with anticipated feedback, is presented. The decision model consists of minimizing expected entropy, considering anticipated possible observation outcomes over a given time horizon. The model captures the uncertainty associated with observation events for all possible scenarios. Entropy represents a measure of uncertainty about the searched target location. Feedback information resulting from possible sensor observation outcomes along the projected path plan is exploited to update anticipated unit target occupancy beliefs. For the first time, a compact belief update formulation is generalized to explicitly include false positive observation events that may occur during plan execution. A novel genetic algorithm is then proposed to efficiently solve search path planning, providing near-optimal solutions for practical, realistic problem instances. Given the run-time performance of the algorithm, a natural extension to a closed-loop environment, progressively integrating real visit outcomes on a rolling time horizon, can easily be envisioned. Computational results show the value of the approach in comparison to alternate heuristics.
Abstract: Problems of high complexity have long been a challenge in combinatorial optimization. Due to their non-deterministic polynomial (NP) characteristics, these problems usually demand an unreasonable search budget, and combinatorial optimization has therefore attracted numerous researchers to develop better algorithms. Recent academic research has mostly focused on enhancing conventional evolutionary algorithms and facilitating local heuristics such as VNS, 2-opt, and 3-opt. Although the introduction of such local strategies yields significant performance gains, these improvements do not carry over across different problems. Therefore, this research proposes a meta-heuristic evolutionary algorithm, BBEA, which can be applied to solve several types of problems. The performance validates that BBEA is able to solve these problems even without the design of local strategies.
Abstract: In the present study the efficiency of Big Bang-Big
Crunch (BB-BC) algorithm is investigated in discrete structural
design optimization. It is shown that a standard version of the BB-BC
algorithm is sometimes unable to produce reasonable solutions to
problems from discrete structural design optimization. Two
reformulations of the algorithm, which are referred to as modified
BB-BC (MBB-BC) and exponential BB-BC (EBB-BC), are
introduced to enhance the capability of the standard algorithm in
locating good solutions for steel truss and frame type structures,
respectively. The performance of the proposed algorithms is
evaluated and compared to that of the standard version, as well as
some other algorithms, over several practical design examples. In these
examples, steel structures are sized for minimum weight subject to
stress, stability and displacement limitations according to the
provisions of AISC-ASD.
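For reference, the standard BB-BC cycle alternates a "crunch" (contract the population to a fitness-weighted center of mass) with a "bang" (scatter new candidates around that center with a shrinking radius). The sketch below illustrates the standard scheme only, not the MBB-BC or EBB-BC reformulations; the Gaussian perturbation and the 1/(k+1) shrink factor are common choices assumed here:

```python
import random

def big_crunch(population, fitnesses):
    """Crunch phase: weighted 'center of mass' of the population,
    with weights 1/fitness (minimization; assumes fitness > 0)."""
    weights = [1.0 / f for f in fitnesses]
    total = sum(weights)
    dim = len(population[0])
    return [sum(w * ind[d] for w, ind in zip(weights, population)) / total
            for d in range(dim)]

def big_bang(center, n, k, bounds, alpha=1.0):
    """Bang phase: scatter n new candidates around the center with a
    radius shrinking as 1/(k+1), k being the iteration counter."""
    lo, hi = bounds
    pop = []
    for _ in range(n):
        ind = [min(hi, max(lo, c + random.gauss(0, 1) * alpha * (hi - lo) / (k + 1)))
               for c in center]
        pop.append(ind)
    return pop
```

Discrete structural sizing would additionally snap each coordinate to the nearest available section from the design catalogue, which is where the modified variants differ from this continuous sketch.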
Abstract: This paper presents an algorithm of particle swarm
optimization with reduction for global optimization problems.
Particle swarm optimization is an algorithm inspired by collective
motion such as that of bird flocks or fish schools, and a multi-point
search algorithm which finds the best solution using multiple
particles. Particle swarm optimization is so flexible that it can
adapt to a number of optimization problems. When an objective
function has many intricately distributed local minima, the particles
may fall into a local minimum. To avoid local minima, a large number
of particles are initially prepared and their positions are updated
by particle swarm optimization. The particles are then sequentially
reduced to a predetermined number based on their evaluation values,
and particle swarm optimization continues until the termination
condition is met. In order
to show the effectiveness of the proposed algorithm, we examine the
minima found on test functions in comparison to existing algorithms.
Furthermore, the influence of the initial number of particles on the
best value found by our algorithm is discussed.
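The reduction mechanism described above can be sketched as follows; the linear reduction schedule is an illustrative assumption, not necessarily the paper's rule:

```python
def reduction_schedule(n_init, n_min, n_stages):
    """Particle counts per stage, shrinking linearly from n_init
    down to the predetermined minimum n_min."""
    step = (n_init - n_min) / max(1, n_stages - 1)
    return [max(n_min, round(n_init - step * k)) for k in range(n_stages)]

def reduce_particles(swarm, keep):
    """Keep the `keep` particles with the best (lowest) evaluation
    value; swarm is a list of (position, velocity, pbest, score)
    tuples, assuming minimization."""
    return sorted(swarm, key=lambda p: p[3])[:keep]
```

Between stages, the surviving particles continue the ordinary PSO update, so early exploration uses many particles while later refinement spends evaluations on only the most promising ones.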
Abstract: A procedure commonly used in the Job Shop Scheduling Problem (JSSP) to evaluate the neighborhood functions employed by non-deterministic algorithms is the calculation of the critical path in a digraph. This paper presents an experimental study of the computational cost incurred when calculating the critical path in solutions of large JSSP instances. The results indicate that if the critical path is used to generate neighborhoods in the meta-heuristics applied to the JSSP, an elevated computational cost exists, in spite of the fact that calculating the critical path in any digraph is of polynomial complexity.
Abstract: One main drawback of intrusion detection systems is their
inability to detect new attacks which do not have known signatures.
In this paper we discuss an intrusion detection method that proposes
independent component analysis (ICA) based feature selection
heuristics and uses rough fuzzy methods for clustering data. ICA is
used to separate the independent components (ICs) from the monitored
variables. Rough set theory serves to decrease the amount of data and
eliminate redundancy, while fuzzy methods allow objects to belong to
several clusters simultaneously, with different degrees of
membership. Our approach allows us not only to recognize known
attacks but also to detect activity that may be the result of a new,
unknown attack. Experimental results are reported on the Knowledge
Discovery and Data Mining (KDD Cup 1999) dataset.
Abstract: The hybridisation of genetic algorithms with heuristics has been shown to be an effective way to improve their performance. In this work, a genetic algorithm hybridised with four heuristics, including a new heuristic called neighbourhood improvement, was investigated through the classical travelling salesman problem. The experimental results showed that the proposed heuristic outperformed the other heuristics both in terms of the quality of the results obtained and the computational time.
Abstract: The information on the Web increases tremendously.
A number of search engines have been developed for searching Web
information and retrieving relevant documents that satisfy the
inquirers' needs. Search engines present inquirers with irrelevant
documents among search results, since the search is text-based rather
than semantic-based. Information retrieval research area has
presented a number of approaches and methodologies such as
profiling, feedback, query modification, human-computer interaction,
etc., for improving search results. Moreover, information retrieval has
employed artificial intelligence techniques and strategies such as
machine learning heuristics, tuning mechanisms, user and system
vocabularies, logical theory, etc., for capturing users' preferences and
using them for guiding the search based on the semantic analysis
rather than syntactic analysis. Although a valuable improvement in
search results has been recorded, the survey has shown that search
engine users are still not really satisfied with their search results.
Using ontologies for semantic-based searching is likely the key
solution. Adopting a profiling approach and using ontology-based
characteristics, this work proposes a strategy for finding the exact
meaning of the query terms in order to retrieve relevant information
according to user needs. The evaluation of the conducted experiments
has shown the effectiveness of the suggested methodology, and
conclusions are presented.
Abstract: Feature selection has recently been the subject of intensive research in data mining, especially for datasets with a large number of attributes. Recent work has shown that feature selection can have a positive effect on the performance of machine learning algorithms. The success of many learning algorithms, in their attempts to construct models of data, hinges on the reliable identification of a small set of highly predictive attributes. The inclusion of irrelevant, redundant and noisy attributes in the model-building phase can result in poor predictive performance and increased computation. In this paper, a novel feature search procedure that utilizes Ant Colony Optimization (ACO) is presented. ACO is a metaheuristic inspired by the behavior of real ants in their search for the shortest paths to food sources. It looks for optimal solutions by considering both local heuristics and previous knowledge. When applied to two different classification problems, the proposed algorithm achieved very promising results.
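A toy version of an ACO-style feature search can convey the idea: pheromone levels bias which features each ant includes, evaporate each iteration, and are reinforced along the best subset found. All parameter values and the inclusion-probability rule below are illustrative assumptions, not the paper's algorithm:

```python
import random

def aco_feature_selection(n_features, evaluate, n_ants=10, n_iters=20,
                          rho=0.1, seed=0):
    """Toy ACO feature search: each ant includes feature i with a
    probability driven by its pheromone level tau[i]; pheromone
    evaporates each iteration and is reinforced along the best
    subset found so far (higher evaluate() score = better)."""
    random.seed(seed)
    tau = [1.0] * n_features                      # pheromone per feature
    best_subset, best_score = [], float("-inf")
    for _ in range(n_iters):
        for _ant in range(n_ants):
            subset = [i for i in range(n_features)
                      if random.random() < tau[i] / (tau[i] + 1.0)]
            score = evaluate(subset)
            if score > best_score:
                best_subset, best_score = subset, score
        tau = [(1 - rho) * t for t in tau]        # evaporation
        for i in best_subset:
            tau[i] += rho * (1.0 + best_score)    # reinforcement
    return best_subset, best_score
```

In a real application, `evaluate` would wrap a classifier's cross-validated accuracy, penalized by subset size; here it is left as a user-supplied callback.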