Abstract: The Next Generation Wireless Network (NGWN) is
expected to be a heterogeneous network that integrates different
Radio Access Technologies (RATs) through a common platform. A
major challenge is how to allocate users to the most suitable RAT.
An optimized solution can maximize the efficient use of radio
resources, achieve better performance for service providers, and
provide Quality of Service (QoS) to users at low cost. Currently,
Radio Resource Management (RRM) is implemented efficiently for
the RAT for which it was developed, but it is not suitable for a
heterogeneous network. Common RRM (CRRM) was proposed to
manage radio resource utilization in the heterogeneous network.
This paper presents a user-level Markov model for a network of
three co-located RATs. Load-balancing-based and service-based
CRRM algorithms have been studied using the presented Markov
model, and their performance is compared in terms of traffic
distribution, new call blocking probability, vertical handover
(VHO) call dropping probability, and throughput.
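The abstract does not give the details of the Markov model; as a hedged illustration of the kind of quantity such models yield, the new-call blocking probability of a single RAT modeled as an M/M/c/c loss system follows the Erlang B formula (the M/M/c/c assumption and the function name below are ours, not the paper's):

```python
def erlang_b(offered_load, channels):
    """Erlang B blocking probability via the stable recurrence
    B(0) = 1,  B(n) = a*B(n-1) / (n + a*B(n-1)),
    where a is the offered load in Erlangs and n the channel count."""
    b = 1.0
    for n in range(1, channels + 1):
        b = offered_load * b / (n + offered_load * b)
    return b
```

For example, 1 Erlang offered to 2 channels gives a blocking probability of 0.2.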
Abstract: Clustering methods developed in data mining theory can
be successfully applied to the investigation of different kinds of
dependencies between environmental conditions and human
activities. It is known that environmental parameters such as
temperature, relative humidity, atmospheric pressure, and
illumination have significant effects on human mental performance.
To investigate the effect of these parameters, a data mining
clustering technique is used, based on entropy and the Information
Gain Ratio (IGR) K(Y/X) = (H(Y) - H(Y/X))/H(Y), where
H(Y) = -ΣPi ln(Pi). This technique allows adjusting the boundaries
of clusters. It is shown that the IGR grows monotonically with the
degree of connectivity between two variables. This approach has
some advantages over, for example, correlation analysis, due to its
relatively lower sensitivity to the shape of functional dependencies.
A variant of an algorithm implementing the proposed method, with
some analysis of the above problem of environmental effects, is also
presented. It is shown that the proposed method converges in a finite
number of steps.
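A minimal sketch of entropy and the information gain ratio in its standard form K(Y/X) = (H(Y) - H(Y|X))/H(Y), assuming the cluster boundaries have already been applied so inputs are discrete labels (function names are ours):

```python
from collections import Counter
from math import log

def entropy(values):
    """Shannon entropy H(Y) = -sum(p_i * ln(p_i)) of the label distribution."""
    n = len(values)
    return -sum((c / n) * log(c / n) for c in Counter(values).values())

def conditional_entropy(y, x):
    """H(Y|X): entropy of y within each cluster of x, weighted by cluster size."""
    n = len(x)
    h = 0.0
    for xv in set(x):
        ys = [yi for xi, yi in zip(x, y) if xi == xv]
        h += (len(ys) / n) * entropy(ys)
    return h

def information_gain_ratio(y, x):
    """K(Y/X) = (H(Y) - H(Y|X)) / H(Y): fraction of Y's uncertainty removed by X."""
    hy = entropy(y)
    return (hy - conditional_entropy(y, x)) / hy
```

When x fully determines y the ratio is 1; when they are independent it is 0, matching the monotone-growth property noted in the abstract.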
Abstract: The set of all abelian subalgebras is computationally
obtained for any given finite-dimensional Lie algebra, starting from the nonzero brackets in its law. More concretely, an algorithm
is described and implemented to compute a basis for each nontrivial abelian subalgebra with the help of the symbolic computation package MAPLE. Finally, a brief computational study of this implementation is also presented, considering both computing time and memory usage.
Abstract: A highly optimized implementation of binary mixture
diffusion with no initial bulk velocity on graphics processors is
presented. The lattice Boltzmann model is employed for simulating
the binary diffusion of oxygen and nitrogen into each other with
different initial concentration distributions. Simulations have been
performed using the latest proposed lattice Boltzmann model that
satisfies both the indifferentiability principle and the H-theorem for
multi-component gas mixtures. Contemporary numerical
optimization techniques such as memory alignment and increasing
the multiprocessor occupancy are exploited along with some novel
optimization strategies to enhance the computational performance on
graphics processors using the C for CUDA programming language.
Speedup of more than two orders of magnitude over single-core
processors is achieved on a variety of Graphical Processing Unit
(GPU) devices ranging from conventional graphics cards to
advanced, high-end GPUs, while the numerical results are in
excellent agreement with the available analytical and numerical data
in the literature.
Abstract: This paper presents a new method that applies the
artificial bee colony (ABC) algorithm to capacitor placement in
distribution systems, with the objective of improving the voltage
profile and reducing power loss. The ABC algorithm is a new
population-based metaheuristic approach inspired by the intelligent
foraging behavior of honeybee swarms. One advantage of the ABC
algorithm is that it does not require control parameters such as the
crossover and mutation rates used in genetic algorithms and
differential evolution, which are hard to determine in advance. The
other advantage is that global search ability is implemented by
introducing a neighborhood source production mechanism, which is
similar to a mutation process. To demonstrate the validity of the
proposed algorithm, computer simulations are carried out on a
69-bus system and the results are compared with other approaches
available in the literature. The proposed method outperforms the
other methods in terms of solution quality and computational
efficiency.
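The neighborhood source production mechanism mentioned above can be sketched as the standard ABC update v_j = x_j + φ·(x_j - x_k,j) with φ drawn uniformly from (-1, 1), applied to one randomly chosen dimension (a hedged illustration; variable names are ours):

```python
import random

def abc_neighbor(source, other):
    """ABC neighborhood source production: pick one dimension j and set
    v_j = x_j + phi * (x_j - x_k_j)  with phi ~ Uniform(-1, 1),
    leaving the other dimensions of the food source unchanged."""
    v = list(source)
    j = random.randrange(len(v))
    phi = random.uniform(-1.0, 1.0)
    v[j] = source[j] + phi * (source[j] - other[j])
    return v
```

Because only one coordinate moves, and only toward or away from a randomly chosen other food source, the perturbation plays the role that mutation plays in genetic algorithms without any tunable rate.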
Abstract: In this paper, we propose a novel receiver algorithm
for coherent underwater acoustic communications. The proposed
receiver is composed of three parts: (1) Doppler tracking and
correction, (2) time reversal channel estimation and combining, and
(3) joint iterative equalization and decoding (JIED). To reduce
computational complexity and optimize the equalization algorithm,
time reversal (TR) channel estimation and combining is adopted to
simplify the multi-channel adaptive decision feedback equalizer
(ADFE) into a single-channel ADFE without reducing system
performance. Simultaneously, turbo theory is adopted to form a
joint iterative ADFE and convolutional decoder (JIED). In the JIED
scheme, the ADFE and decoder exchange soft information in an
iterative manner, which can enhance equalizer performance using
the decoding gain. Simulation results show that the proposed
algorithm reduces computational complexity and improves
equalizer performance. Therefore, the performance of coherent
underwater acoustic communications can be improved greatly.
Abstract: The equivalence class subset algorithm is a powerful
tool for solving a wide variety of constraint satisfaction problems and
is based on the use of a decision function which has a very high but
not perfect accuracy. Perfect accuracy is not required in the decision
function as even a suboptimal solution contains valuable information
that can be used to help find an optimal solution. In the hardest
problems, the decision function can break down, leading to a
suboptimal solution with more equivalence classes than necessary,
which can be viewed as a mixture of good and bad decisions. By
choosing a subset of the decisions made in reaching a suboptimal
solution, an iterative technique can lead to an optimal solution
through a series of steadily improved suboptimal solutions. The
goal is to reach an optimal solution as quickly as
possible. Various techniques for choosing the decision subset are
evaluated.
Abstract: The lack of any centralized infrastructure in mobile ad
hoc networks (MANET) is one of the greatest security concerns in
the deployment of wireless networks. Thus, communication in a
MANET functions properly only if the participating nodes cooperate
in routing without any malicious intention. However, some nodes
may behave maliciously by indulging in flooding attacks on their
neighbors, while others may launch active security attacks such as
denial of service. This paper reviews related work on trust
evaluation and establishment in ad hoc networks, as well as on
flooding attack prevention. A new trust approach based on the extent
of friendship between nodes is proposed, which makes the nodes
cooperate and prevents flooding attacks in an ad hoc environment.
The performance of the trust algorithm is tested in an ad hoc network
implementing the Ad hoc On-demand Distance Vector (AODV)
protocol.
Abstract: In this paper, multi-processor job shop scheduling problems are solved by a heuristic algorithm based on a hybrid of priority dispatching rules within an ant colony optimization algorithm. The objective function is to minimize the makespan, i.e. the total completion time, where the simultaneous presence of various kinds of pheromones is allowed. By using a suitable hybrid of priority dispatching rules, the process of finding the best solution is improved. The ant colony optimization algorithm not only promotes the ability of the proposed algorithm but also decreases the total working time, by decreasing setup times and modifying the working production line; thus, similar work shares the same production lines. Another advantage of this algorithm is that similar (but not identical) machines can be considered, so these machines are able to process a job with different processing and setup times. To evaluate this capability and the algorithm itself, a number of test problems are solved and the associated results are analyzed. The results show a significant decrease in throughput time, and also show that this algorithm is able to recognize the bottleneck machine and to schedule jobs in an efficient way.
Abstract: A spatial classification technique incorporating a state-of-the-art feature extraction algorithm is proposed in this paper for classifying the heterogeneous classes present in hyperspectral images. Classification accuracy can be improved only if both the feature extraction and the classifier selection are proper. As the classes in hyperspectral images are assumed to have different textures, textural classification is entertained. Run length feature extraction is employed along with Principal Components and Independent Components. A hyperspectral image of the Indiana site taken by AVIRIS is used for the experiment. Among the original 220 bands, a subset of 120 bands is selected. The Gray Level Run Length Matrix (GLRLM) is calculated for the first forty bands, and from the GLRLMs the run length features for individual pixels are calculated. Principal Components are calculated for the next forty bands, and Independent Components for the remaining forty bands. As Principal and Independent Components have the ability to represent the textural content of pixels, they are treated as features. The run length features, Principal Components, and Independent Components together form the combined features, which are used for classification. An SVM with a binary hierarchical tree is used to classify the hyperspectral image. Results are validated against ground truth and accuracies are calculated.
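A minimal sketch of a horizontal-direction GLRLM and one derived run-length feature, assuming the band has already been quantized to integer gray levels (function names and the choice of Short Run Emphasis are ours, not necessarily the paper's feature set):

```python
import numpy as np

def glrlm_horizontal(img, levels):
    """Gray Level Run Length Matrix along the horizontal direction:
    glrlm[g, r-1] counts runs of length r of gray level g."""
    rows, cols = img.shape
    glrlm = np.zeros((levels, cols), dtype=int)
    for row in img:
        run_val, run_len = row[0], 1
        for v in row[1:]:
            if v == run_val:
                run_len += 1
            else:
                glrlm[run_val, run_len - 1] += 1
                run_val, run_len = v, 1
        glrlm[run_val, run_len - 1] += 1  # close the run at the row end
    return glrlm

def short_run_emphasis(glrlm):
    """SRE: one classic GLRLM texture feature, emphasizing short runs."""
    r = np.arange(1, glrlm.shape[1] + 1)
    return (glrlm / r**2).sum() / glrlm.sum()
```

Other standard run-length features (long run emphasis, gray-level nonuniformity, etc.) are computed from the same matrix in the same way.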
Abstract: Economic Load Dispatch (ELD) is a method of determining
the most efficient, low-cost and reliable operation of a power
system by dispatching available electricity generation resources to
supply load on the system. The primary objective of economic
dispatch is to minimize total cost of generation while honoring
operational constraints of available generation resources. In this paper
an intelligent water drop (IWD) algorithm has been proposed to
solve ELD problem with an objective of minimizing the total cost of
generation. The intelligent water drop algorithm is a swarm-based,
nature-inspired optimization algorithm modeled on natural rivers. A
natural river often finds good paths among many possible paths on
its way from source to destination, and finally finds a nearly optimal
path. These ideas are embedded into the proposed algorithm for
solving the economic load dispatch problem. The main advantages
of the proposed technique are that it is easy to implement and that it
can find a feasible near-global-optimal solution with little
computational effort. In order to illustrate the effectiveness of the
proposed method, it has been tested on 6-unit and 20-unit test
systems with incremental fuel cost functions taking into account the
valve-point loading effects. Numerical results show that the
proposed method has good convergence properties and better
solution quality than other algorithms reported in the recent
literature.
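The valve-point loading effect mentioned above adds a rectified-sine ripple to the quadratic fuel cost curve; a hedged sketch of the standard cost function F(P) = a + b·P + c·P² + |e·sin(f·(Pmin - P))| (parameter names follow common ELD usage and are not necessarily the paper's):

```python
import math

def fuel_cost(p, a, b, c, e=0.0, f=0.0, p_min=0.0):
    """Generator fuel cost with valve-point loading:
    F(P) = a + b*P + c*P^2 + |e * sin(f * (P_min - P))|.
    With e = 0 this reduces to the plain quadratic cost curve."""
    return a + b * p + c * p * p + abs(e * math.sin(f * (p_min - p)))
```

The total dispatch cost is the sum of this function over all units, minimized subject to the power-balance and generator-limit constraints.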
Abstract: This research elaborates decision models for product
innovation in the early phases, focusing on one of the most widely
implemented methods in marketing research: conjoint analysis and
the related conjoint-based models, with special focus on heuristic
programming techniques for the development of optimal product
innovation. The concept, potential, requirements, and limitations of
conjoint analysis and its conjoint-based heuristic successors are
analysed, and the development of a conceptual framework for the
Genetic Algorithm (GA), one of the most widely implemented
heuristic methods for developing product innovations, is discussed.
Abstract: In this paper an ant colony optimization algorithm is
developed to solve the permutation flow shop scheduling problem. In
the permutation flow shop scheduling problem, which has been
extensively studied in the literature, there are a set of m machines
and a set of n jobs. All the jobs are processed on all the machines
and the sequence
of jobs being processed is the same on all the machines. Here this
problem is optimized considering two criteria, makespan and total
flow time. Then the results are compared with the ones obtained by
previously developed algorithms. Finally, the results show that our
proposed approach performs best among all the algorithms in the
literature.
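The two criteria above can be computed for a given permutation with the standard flow shop recurrence; a minimal sketch (the paper's ACO itself is not reproduced, and the function names are ours):

```python
def makespan(proc, seq):
    """Completion time of the last job on the last machine for a permutation
    flow shop; proc[j][m] is the processing time of job j on machine m."""
    n_machines = len(proc[0])
    finish = [0.0] * n_machines  # finish time of the most recent job per machine
    for j in seq:
        for m in range(n_machines):
            # a job starts on machine m when both the machine is free and
            # the job has finished on machine m-1
            start = max(finish[m], finish[m - 1] if m > 0 else 0.0)
            finish[m] = start + proc[j][m]
    return finish[-1]

def total_flow_time(proc, seq):
    """Sum of job completion times on the last machine (the second criterion)."""
    n_machines = len(proc[0])
    finish = [0.0] * n_machines
    tft = 0.0
    for j in seq:
        for m in range(n_machines):
            finish[m] = max(finish[m], finish[m - 1] if m > 0 else 0.0) + proc[j][m]
        tft += finish[-1]
    return tft
```

For example, with two jobs and two machines, proc = [[2, 1], [1, 2]], the sequence [1, 0] gives a makespan of 4 while [0, 1] gives 5.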
Abstract: We present our research on geometric moments for
detecting mineral deficiency in the frail groundnut plant. This plant
is prone to many deficiencies as a result of variation in soil
nutrients. By analyzing the leaves of the plant, we detect the visual
symptoms that are not recognizable to the naked eye. We have
collected about 160 samples of leaves from nearby fields. The
images have been taken by placing every leaf in a black box to
avoid external interference. For the first time, it has been possible
to provide the farmer with the stages of deficiencies. We have
applied the algorithms successfully to many other plants such as
lady's finger, green bean, lablab bean, chilli, and tomato, but we
present the results predominantly for groundnut. The accuracy of
our algorithm and method is almost 93%. This could again pioneer
a kind of green revolution in the field of agriculture and will be a
boon to that field.
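A hedged sketch of the raw geometric moments presumably underlying the leaf analysis, m_pq = Σ_x Σ_y x^p y^q I(x, y), together with the intensity centroid derived from them (the paper's exact feature set is not specified; function names are ours):

```python
import numpy as np

def geometric_moment(img, p, q):
    """Raw geometric moment m_pq = sum over pixels of x^p * y^q * I(x, y)."""
    ys, xs = np.indices(img.shape)
    return float((xs**p * ys**q * img).sum())

def centroid(img):
    """Intensity centroid (x_bar, y_bar) = (m10/m00, m01/m00)."""
    m00 = geometric_moment(img, 0, 0)
    return geometric_moment(img, 1, 0) / m00, geometric_moment(img, 0, 1) / m00
```

Higher-order central moments, built the same way around the centroid, give translation-invariant shape descriptors.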
Abstract: This work deals with aspects of support vector machine learning for large-scale data mining tasks. Based on a decomposition algorithm for support vector machine training that can be run in serial as well as shared-memory parallel mode, we introduce a transformation of the training data that allows for the usage of an expensive generalized kernel without additional costs. We present experiments for the Gaussian kernel, but other kernel functions can be used as well. In order to further speed up the decomposition algorithm, we analyze the critical problem of working set selection for large training data sets. In addition, we analyze the influence of the working set sizes on the scalability of the parallel decomposition scheme. Our tests and conclusions led to several modifications of the algorithm and to improved overall support vector machine learning performance. Our method allows for using extensive parameter search methods to optimize classification accuracy.
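For reference, the Gaussian kernel used in the experiments can be computed as a full kernel matrix as follows (a generic sketch, not the paper's decomposition code; σ is a width parameter we assume):

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma=1.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / (2*sigma^2)),
    using the expansion ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b."""
    sq = (X**2).sum(axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))
```

In decomposition methods one typically evaluates only the kernel columns for the current working set rather than the full matrix, which is exactly why working set selection matters for performance.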
Abstract: This paper presents a systematic approach for designing Unified Power Flow Controller (UPFC) based supplementary damping controllers for damping low-frequency oscillations in a single-machine infinite-bus power system. Detailed investigations have been carried out considering four alternative UPFC-based damping controllers, namely the modulating index of the series inverter (mB), the modulating index of the shunt inverter (mE), the phase angle of the series inverter (δB), and the phase angle of the shunt inverter (δE). The design problem of the proposed controllers is formulated as an optimization problem, and a Real-Coded Genetic Algorithm (RCGA) is employed to optimize the damping controller parameters. Simulation results are presented and compared with a conventional method of tuning the damping controller parameters to show the effectiveness and robustness of the proposed design approach.
Abstract: Stock portfolio selection is a classic problem in finance:
it involves deciding how to allocate an institution's or an individual's
wealth among a number of stocks, with certain investment objectives
(return and risk). In this paper, we adopt the classical Markowitz
mean-variance model and consider an additional common realistic
constraint, namely, the cardinality constraint. Thus, stock portfolio
optimization becomes a mixed-integer quadratic programming
problem that is difficult to solve by exact optimization algorithms.
Chemical Reaction Optimization (CRO), which mimics the molecular
interactions in a chemical reaction process, is a population-based
metaheuristic method. Two different types of CRO, named canonical
CRO and Super Molecule-based CRO (S-CRO), are proposed to solve
the stock portfolio selection problem. We test both canonical CRO
and S-CRO on a benchmark and compare their performance under
two criteria: Markowitz efficient frontier (Pareto frontier) and Sharpe
ratio. Computational experiments suggest that S-CRO is promising
in handling the stock portfolio optimization problem.
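The comparison criteria can be evaluated for a candidate weight vector as follows (a hedged sketch of the standard Markowitz quantities and the cardinality check; the function names and the risk-free-rate default are ours):

```python
import numpy as np

def portfolio_stats(weights, mean_returns, cov, rf=0.0):
    """Expected return w'mu, variance w'Cw, and Sharpe ratio (ret - rf)/std
    of a portfolio under the Markowitz mean-variance model."""
    w = np.asarray(weights, dtype=float)
    ret = float(w @ mean_returns)
    var = float(w @ cov @ w)
    sharpe = (ret - rf) / np.sqrt(var)
    return ret, var, sharpe

def cardinality_ok(weights, k, eps=1e-9):
    """Cardinality constraint: at most k assets may carry nonzero weight."""
    return int((np.abs(np.asarray(weights)) > eps).sum()) <= k
```

The efficient frontier is then traced by varying the return target (or risk aversion) and keeping only weight vectors that satisfy the cardinality check, which is the part that makes the problem mixed-integer.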
Abstract: This paper aims to extend Jon Kleinberg's research. He introduced a small-world structure in a grid and showed that a greedy algorithm using only local information is able to find a route between source and target in delivery time O(log^2 n). His fundamental model for distributed systems uses a two-dimensional grid with long-range random links added between any two nodes u and v with probability proportional to the distance d(u,v)^(-2). We propose that, with additional information about nearby long links, a shorter path can be found. We apply the ant colony system as a messenger distributing its pheromone, the long-link details, in the surrounding area. Subsequent forwarding decisions then have more options: select among local neighbors, or send to a node whose long link is closer to the target. Our experimental results support our approach: the average routing time of the Color Pheromone method is faster than that of the greedy method.
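Kleinberg's long-range link distribution can be sketched directly from the d(u,v)^(-2) rule (Manhattan distance on the grid, normalized over all other nodes; function names are ours):

```python
def kleinberg_link_prob(u, v, nodes):
    """Probability that node u's long-range link points at v, proportional
    to d(u, v)^-2 with d the Manhattan (lattice) distance on the 2-D grid."""
    def d(a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1])
    weight = d(u, v) ** -2
    z = sum(d(u, x) ** -2 for x in nodes if x != u)  # normalizing constant
    return weight / z
```

The exponent 2 (matching the grid dimension) is exactly the regime in which greedy routing achieves the O(log^2 n) delivery time cited above; other exponents make routing polynomially slow.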
Abstract: In this paper, we propose a new method for
simultaneously generating multiple quantiles corresponding to given
probability levels from data streams and massive data sets. This
method provides a basis for development of single-pass low-storage
quantile estimation algorithms, which differ in complexity, storage
requirement and accuracy. We demonstrate that such algorithms may
perform well even for heavy-tailed data.
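The abstract does not detail the family of algorithms; as one hedged illustration of a single-pass, bounded-storage quantile sketch in the same spirit (stride-doubling systematic sampling, which is our construction and not the paper's method):

```python
import bisect

def streaming_quantiles(stream, probs, max_store=1000):
    """One-pass quantile sketch: keep every k-th item in a sorted buffer,
    halving the buffer and doubling the stride k whenever it fills.
    Returns one estimate per requested probability level."""
    buf, k = [], 1
    for i, s in enumerate(stream):
        if i % k == 0:
            bisect.insort(buf, s)
            if len(buf) > max_store:
                buf = buf[::2]   # drop every other order statistic
                k *= 2           # sample the rest of the stream more sparsely
    return [buf[min(int(q * len(buf)), len(buf) - 1)] for q in probs]
```

All requested quantiles are read off the same buffer at the end, so the storage cost is shared across probability levels; accuracy degrades gracefully as `max_store` shrinks.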
Abstract: The development of Artificial Neural Networks
(ANNs) is usually a slow process in which the human expert has to
test several architectures until finding the one that achieves the best
results for a certain problem. This work presents a new technique
that uses Genetic Programming (GP) for automatically generating
ANNs. To do this, the GP algorithm had to be changed to work with
graph structures, so that ANNs can be developed. This technique
also allows obtaining simplified networks that solve the problem
with a small group of neurons. In order to measure the performance
of the system and to compare the results with other ANN
development methods based on Evolutionary Computation (EC)
techniques, several tests were performed on problems based on
some of the most widely used test databases. The comparisons show
that the system achieves good results, comparable with those of
existing techniques and, in most cases, better than them.