Abstract: Like other external sorting algorithms, the presented
algorithm is a two-step algorithm comprising an internal and an
external phase. The first part of the algorithm resembles other
similar algorithms, but the second part incorporates a new, easily
implemented method that markedly reduces the large number of
input-output operations. Since decreasing processor time has little
effect on the overall speed of the algorithm, any improvement must
come from decreasing the number of input-output operations. This
paper proposes a simple algorithm for choosing the correct record
location in the final list. This decreases the time complexity and
makes the algorithm faster.
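The two-phase structure described above can be sketched as follows. This is a generic external merge sort (sorted runs, then a k-way merge streamed via `heapq.merge`), not the paper's record-placement method, which the abstract does not detail; the function and parameter names are illustrative.

```python
import heapq
import os
import tempfile

def external_sort(records, run_size):
    """Two-phase external sort sketch: sorted runs, then a k-way merge."""
    # Phase 1 (internal): split the input into sorted runs, each written
    # to its own temporary file as if it did not fit in memory.
    run_files = []
    for i in range(0, len(records), run_size):
        run = sorted(records[i:i + run_size])
        f = tempfile.NamedTemporaryFile(mode="w+", delete=False)
        f.write("\n".join(map(str, run)))
        f.seek(0)
        run_files.append(f)
    # Phase 2 (external): k-way merge of the runs. heapq.merge streams
    # them, holding only one record per run in memory at a time.
    iters = [(int(line) for line in f) for f in run_files]
    merged = list(heapq.merge(*iters))
    for f in run_files:
        f.close()
        os.unlink(f.name)
    return merged
```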
Abstract: Measurements of capacitance C and dissipation
factor tan δ of the stator insulation system provide useful
information about internal defects within the insulation. The index
k is defined as the proportionality constant between the changes at
high voltage of capacitance (ΔC) and of the dissipation factor
(Δtan δ). ΔC and Δtan δ values were highly correlated when small
flat defects were present within the insulation, and that
correlation was lost in the presence of large narrow defects such
as electrical treeing. The discrimination between small and large
defects is made by resorting to partial discharge (PD) phase angle
analysis. For validation of the results, C and tan δ measurements
were carried out on a 15 MVA, 4160 V steam turbine turbogenerator
installed in a sugar mill. In addition, laboratory test results
obtained by other authors were analyzed jointly. In those
laboratory tests, model coil bars subjected to thermal cycling
became highly degraded, and the ΔC and Δtan δ values were not
correlated. Thus, the index k could not be calculated.
Abstract: This paper presents a very simple and efficient
algorithm for codebook search, which greatly reduces computation
compared to full codebook search. The algorithm is based on sorting
and a centroid technique for the search. The results table shows
the effectiveness of the proposed algorithm in terms of
computational complexity. In this paper we also introduce a new
performance parameter, named the average fractional change in pixel
value, as we feel that it gives a better understanding of the
closeness of the image, since it is related to perception. This new
performance parameter takes into consideration the average
fractional change in each pixel value.
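The proposed metric, as the abstract describes it, can be sketched as below. The exact normalization the authors use is not given, so this per-pixel absolute fractional change averaged over the image is an assumption, as is the handling of zero-valued pixels.

```python
import numpy as np

def avg_fractional_change(original, reconstructed):
    """Sketch of the 'average fractional change in pixel value' metric.

    Averages |orig - recon| / orig over the pixels; zero-valued pixels
    are excluded to avoid division by zero (an assumption, since the
    abstract does not specify this case).
    """
    original = np.asarray(original, dtype=float)
    reconstructed = np.asarray(reconstructed, dtype=float)
    mask = original != 0
    return np.mean(np.abs(original[mask] - reconstructed[mask]) / original[mask])
```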
Abstract: This paper presents a new technique for the optimum
placement of processors to minimize the total effective
communication load under multi-processor communication
dominated environment. This is achieved by placing heavily loaded
processors near each other and lightly loaded ones far away from
one another in the physical grid locations. The results are
mathematically proved for the algorithms described.
Abstract: The shortest path routing problem is a multiobjective
nonlinear optimization problem with constraints. This problem has
been addressed by considering the Quality of Service parameters
delay and cost as separate objectives or as a weighted sum of both
objectives. Multiobjective evolutionary algorithms can find multiple
Pareto-optimal solutions in a single run, and this ability makes them
attractive for solving problems with multiple and conflicting
objectives. This paper uses an elitist multiobjective evolutionary
algorithm based on the Non-dominated Sorting Genetic Algorithm
(NSGA), for solving the dynamic shortest path routing problem in
computer networks. A priority-based encoding scheme is proposed
for population initialization. Elitism ensures that the best solution
does not deteriorate in the next generations. Results for a sample test
network have been presented to demonstrate the capabilities of the
proposed approach to generate well-distributed Pareto-optimal
solutions of the dynamic routing problem in a single run. The results
obtained by NSGA are compared with the single-objective
weighting-factor method, for which a Genetic Algorithm (GA) was applied.
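The non-dominated sorting step at the core of NSGA-style algorithms can be sketched as follows. This is a minimal O(n²m) version of the generic ranking procedure for minimization objectives, not the paper's routing-specific encoding or elitism mechanism.

```python
def non_dominated_sort(points):
    """Partition objective vectors (to be minimized) into Pareto fronts."""
    def dominates(a, b):
        # a dominates b: no worse in every objective, strictly better in one.
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    n = len(points)
    dominated_by = [0] * n                 # how many points dominate point i
    dominates_set = [[] for _ in range(n)]  # points that i dominates
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dominates(points[i], points[j]):
                dominates_set[i].append(j)
            elif dominates(points[j], points[i]):
                dominated_by[i] += 1
        if dominated_by[i] == 0:
            fronts[0].append(i)            # front 0: non-dominated points
    # Peel off successive fronts: removing a front may free its successors.
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominates_set[i]:
                dominated_by[j] -= 1
                if dominated_by[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]
```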
Abstract: A method is presented for the construction of arbitrary
even-input sorting networks exhibiting better properties than the
networks created using a conventional technique of the same type.
The method was discovered by means of a genetic algorithm combined
with an application-specific development. Similarly to human
inventions in the area of theoretical computer science, the evolved
invention was analyzed: its generality was proven and area and time
complexities were determined.
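A comparator network of the kind discussed above can be represented as a list of index pairs, and its correctness verified with the zero-one principle: a network sorts every input if and only if it sorts every 0/1 input. The 4-input network below is the standard 5-comparator example, not the evolved construction from the paper, which the abstract does not reproduce.

```python
from itertools import product

def apply_network(network, values):
    """Apply a comparator network (list of (i, j) index pairs, i < j)."""
    v = list(values)
    for i, j in network:
        if v[i] > v[j]:
            v[i], v[j] = v[j], v[i]
    return v

def is_sorting_network(network, n):
    """Check all 2^n zero-one inputs (zero-one principle)."""
    return all(apply_network(network, bits) == sorted(bits)
               for bits in product((0, 1), repeat=n))

# A standard 5-comparator sorting network for 4 inputs.
net4 = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]
```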
Abstract: Because of the importance of energy, optimization of
power generation systems is necessary. Gas turbine cycles are well
suited to fast power generation, but their efficiency is relatively
low. To achieve higher efficiencies, several measures are commonly
proposed, such as recovery of heat from exhaust gases in a
regenerator, use of an intercooler in a multistage compressor, and
steam injection into the combustion chamber. However, thermodynamic
optimization of the gas turbine cycle, even with the above
components, is still necessary. In this article, multi-objective
genetic algorithms are employed for Pareto-approach optimization of
the Regenerative-Intercooling Gas Turbine (RIGT) cycle. In
multi-objective optimization, a number of conflicting objective
functions are to be optimized simultaneously. The important
objective functions considered for optimization are the entropy
generation of the RIGT cycle (Ns), derived using exergy analysis
and the Gouy-Stodola theorem, the thermal efficiency, and the net
output power of the RIGT cycle. These objectives usually conflict
with each other. The design variables consist of thermodynamic
parameters such as compressor pressure ratio (Rp), excess air in
combustion (EA), turbine inlet temperature (TIT) and inlet air
temperature (T0). In the first stage, single-objective optimization
was investigated, and the Non-dominated Sorting Genetic Algorithm
(NSGA-II) was then used for multi-objective optimization.
Optimization procedures are performed for two and three objective
functions, and the results are compared for the RIGT cycle. To
investigate the optimal thermodynamic behavior of two objectives,
different sets, each including two of the objective functions, are
considered individually. For each set, the Pareto front is
depicted. The sets of selected decision variables based on this
Pareto front yield the best possible combination of the
corresponding objective functions. No point on the Pareto front is
superior to another, but all are superior to any other point. In
the case of three-objective optimization, the results are given in
tables.
Abstract: Sorting has received the most attention among all computational tasks over the past years because sorted data is at the heart of many computations. Sorting is of additional importance to parallel computing because of its close relation to the task of routing data among processes, which is an essential part of many parallel algorithms. Many parallel sorting algorithms have been investigated for a variety of parallel computer architectures. In this paper, three parallel sorting algorithms have been implemented and compared in terms of their overall execution time. The algorithms implemented are odd-even transposition sort, parallel merge sort, and parallel rank sort. A cluster of workstations running Windows Compute Cluster has been used to compare the implemented algorithms. The C# programming language is used to develop the sorting algorithms, and the MPI (Message Passing Interface) library has been selected to establish communication and synchronization between processors. The time complexity of each parallel sorting algorithm is also stated and analyzed.
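Of the three algorithms named above, odd-even transposition sort has the simplest structure, sketched sequentially below (in Python rather than the paper's C#/MPI setting). In the parallel version, each of n processors holds one element and exchanges with a neighbor; here the n phases are simulated in a single process.

```python
def odd_even_transposition_sort(a):
    """Sequential sketch of odd-even transposition sort.

    n phases alternate between comparing even-indexed pairs
    (0,1), (2,3), ... and odd-indexed pairs (1,2), (3,4), ...;
    after n phases any input of length n is sorted.
    """
    a = list(a)
    n = len(a)
    for phase in range(n):
        start = phase % 2  # 0 for even phases, 1 for odd phases
        for i in range(start, n - 1, 2):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a
```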
Abstract: Many algorithms are available for sorting unordered elements. The most important among them are Bubble sort, Heap sort, Insertion sort and Shell sort. These algorithms have their own pros and cons. Shell sort, an enhanced version of insertion sort, reduces the number of swaps of the elements being sorted, minimizing complexity and time compared to insertion sort. Shell sort improves the efficiency of insertion sort by quickly shifting values toward their destination. Average sort time is O(n^1.25), while worst-case time is O(n^1.5). It performs a number of iterations; in each iteration it swaps some elements of the array in such a way that in the last iteration, when the value of h is one, the number of swaps is reduced. Donald L. Shell invented a formula to calculate the value of 'h'. This work focuses on identifying improvements to the conventional Shell sort algorithm. The ''Enhanced Shell Sort algorithm'' is an improvement in how the value of 'h' is calculated. It has been observed that by applying this algorithm, the number of swaps can be reduced by up to 60 percent compared to the existing algorithm. In some other cases this enhancement was found to be faster than the existing algorithms.
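For reference, the conventional Shell sort the abstract builds on can be sketched as follows, using Shell's original gap sequence h = n/2, n/4, ..., 1. The 'Enhanced Shell Sort' changes how h is computed; that enhancement is not reproduced here.

```python
def shell_sort(a):
    """Classic Shell sort with Shell's original gap sequence."""
    a = list(a)
    n = len(a)
    h = n // 2
    while h > 0:
        # Gapped insertion sort: every h-th subsequence is
        # insertion-sorted, shifting values quickly toward their
        # destination before the final h = 1 pass.
        for i in range(h, n):
            key = a[i]
            j = i
            while j >= h and a[j - h] > key:
                a[j] = a[j - h]
                j -= h
            a[j] = key
        h //= 2
    return a
```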
Abstract: Nowadays there are lots of applications of power and
free conveyors in logistics. They are the most frequently used
conveyor systems worldwide. Overhead conveyor technologies like
power and free systems are used in the most intra-logistics
applications in trade and industry. The automotive, food, beverage
and textile industry as well as aeronautic catering or engineering are
among the applications. Power and free systems bridge different
manufacturing intervals and also serve in production as temporary
storage and buffer. Depending on the application area, power
and free conveyors are equipped with target controls enabling
complex distribution and sorting tasks. This article introduces a new
power and free conveyor design in intra-logistics and explains its
components. Following the description of the components, a model is
created from their technical characteristics and visualized in CAD
software. After that, a static analysis is performed; this analysis
supports calculation of the required state of the structures under
applied forces. This model
helps companies achieve lower development costs as well as quicker
market maturity.
Abstract: The most reliable and accurate description of the actual behavior of a software system is its source code. However, not all questions about the system can be answered directly by resorting to this repository of information. What the reverse engineering methodology aims at is the extraction of abstract, goal-oriented "views" of the system, able to summarize relevant properties of the computation performed by the program. While concentrating on reverse engineering, we modeled the C++ files by designing a translator.
Abstract: The Block Sorting problem is to sort a given
permutation by moving blocks. A block is defined as a substring
of the given permutation which is also a substring of the
identity permutation. Block Sorting has been proved to be
NP-hard. Until now, two different 2-approximation algorithms
have been presented for block sorting; these are the best known
algorithms for Block Sorting to date. In this work we present
a different characterization of Block Sorting in terms of a
transposition cycle graph. Then we suggest a heuristic,
which we show to exhibit a 2-approximation performance
guarantee for most permutations.
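The notion of a block used above can be sketched as follows: a maximal run of consecutive values is a substring of both the given permutation and the identity. This shows only the basic decomposition, not the 2-approximation algorithms or the cycle-graph characterization.

```python
def blocks(perm):
    """Decompose a permutation of 1..n into maximal blocks.

    A block is a substring of the permutation that is also a
    substring of the identity permutation, i.e. a maximal run of
    consecutive increasing values.
    """
    out, cur = [], [perm[0]]
    for x in perm[1:]:
        if x == cur[-1] + 1:
            cur.append(x)      # extends the current run of consecutive values
        else:
            out.append(cur)    # run broken: close the block
            cur = [x]
    out.append(cur)
    return out
```

For example, [3, 4, 5, 1, 2] decomposes into the blocks [3, 4, 5] and [1, 2], and a single block move sorts it.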
Abstract: The project was undertaken to determine the effects of modified tissue culture protocols, e.g., age of culture and hormone levels (2,4-D), in generating somaclonal variation. Moreover, the utility of molecular markers (SSR and MSAP) in sorting off-types/somaclones was investigated.
Results show that somaclonal variation is indeed due to prolonged subculture and high 2,4-D concentration. The resultant variation was observed to be due to a high level of methylation events, specifically cytosine methylation at either the internal or external cytosine, and was identified by methylation-sensitive amplification polymorphism (MSAP). Simple sequence repeats (SSR), on the other hand, were able to associate a marker with a trait of interest.
These results therefore show that molecular markers can be an important tool for sorting out variation/mutants at an early stage.
Abstract: In this paper, a method based on Non-Dominated
Sorting Genetic Algorithm (NSGA) has been presented for the Volt /
Var control in power distribution systems with dispersed generation
(DG). The genetic algorithm approach is used due to its broad
applicability, ease of use and high accuracy. The proposed method is
well suited to volt/var control problems. A multi-objective
optimization problem has been formulated for the volt/var control of
the distribution system. The non-dominated sorting genetic algorithm
based method proposed in this paper, alleviates the problem of tuning
the weighting factors required in solving the multi-objective volt/var
control optimization problems. Based on the simulation studies
carried out on the distribution system, the proposed scheme has been
found to be simple, accurate and easy to apply to solve the
multi-objective volt/var control optimization problem of the
distribution system with dispersed generation.
Abstract: The literature reports a large number of approaches for
measuring the similarity between protein sequences. Most of these
approaches estimate this similarity using alignment-based techniques
that do not necessarily yield biologically plausible results, for two
reasons.
First, for the case of non-alignable (i.e., not yet definitively aligned
and biologically approved) sequences such as multi-domain, circular
permutation and tandem repeat protein sequences, alignment-based
approaches do not succeed in producing biologically plausible results.
This is due to the nature of the alignment, which is based on the
matching of subsequences in equivalent positions, while non-alignable
proteins often have similar and conserved domains in non-equivalent
positions.
Second, the alignment-based approaches lead to similarity measures
that depend heavily on the parameters set by the user for the alignment
(e.g., gap penalties and substitution matrices). For easily alignable
protein sequences, it's possible to supply a suitable combination of
input parameters that allows such an approach to yield biologically
plausible results. However, for difficult-to-align protein sequences,
supplying different combinations of input parameters yields different
results. Such variable results create ambiguities and complicate the
similarity measurement task.
To overcome these drawbacks, this paper describes a novel and
effective approach for measuring the similarity between protein
sequences, called SAF for Substitution and Alignment Free. Without
resorting either to the alignment of protein sequences or to substitution
relations between amino acids, SAF is able to efficiently detect the
significant subsequences that best represent the intrinsic properties of
protein sequences, those underlying the chronological dependencies of
structural features and biochemical activities of protein sequences.
Moreover, by using a new efficient subsequence matching scheme,
SAF more efficiently handles protein sequences that contain similar
structural features with significant meaning in chronologically
non-equivalent positions. To show the effectiveness of SAF, extensive
experiments were performed on protein datasets from different
databases, and the results were compared with those obtained by
several mainstream algorithms.
Abstract: This study introduces a new method for detecting,
sorting, and localizing spikes from multiunit EEG recordings. The
method combines the wavelet transform, which localizes distinctive
spike features, with the Super-Paramagnetic Clustering (SPC) algorithm,
which allows automatic classification of the data without assumptions
such as low variance or Gaussian distributions. Moreover, the method
is capable of setting amplitude thresholds for spike detection. The
method was applied to several real EEG data sets, in which the
spikes were detected, clustered, and their occurrence times identified.
Abstract: This paper solves the environmental/ economic dispatch
power system problem using the Non-dominated Sorting Genetic
Algorithm-II (NSGA-II) and its hybrid with a Convergence Accelerator
Operator (CAO), called the NSGA-II/CAO. These multiobjective
evolutionary algorithms were applied to the standard IEEE 30-bus
six-generator test system. Several optimization runs were carried out
on different cases of problem complexity. Different quality measures
that compare the performance of the two solution techniques were
considered. The results demonstrated that the inclusion of the CAO
in the original NSGA-II improves its convergence while preserving
the diversity properties of the solution set.
Abstract: Bioinformatics and computational biology involve
the use of techniques including applied mathematics,
informatics, statistics, computer science, artificial intelligence,
chemistry, and biochemistry to solve biological problems
usually on the molecular level. Research in computational
biology often overlaps with systems biology. Major research
efforts in the field include sequence alignment, gene finding,
genome assembly, protein structure alignment, protein structure
prediction, prediction of gene expression and protein-protein
interactions, and the modeling of evolution. Various
global rearrangements of permutations, such as reversals and
transpositions, have recently become of interest because of their
applications in computational molecular biology. A reversal is
an operation that reverses the order of a substring of a permutation.
A transposition is an operation that swaps two adjacent
substrings of a permutation. The problem of determining the
smallest number of reversals required to transform a given
permutation into the identity permutation is called sorting by
reversals. Similar problems can be defined for transpositions
and other global rearrangements. In this work we perform a
study about some genome rearrangement primitives. We show
how a genome is modelled by a permutation, introduce some
of the existing primitives and the lower and upper bounds
on them. We then provide a comparison of the introduced
primitives.
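The reversal primitive described above can be made concrete with a small sketch. Since sorting by unsigned reversals is NP-hard, this brute-force breadth-first search over the permutation group is only feasible for very small permutations; it is an illustration of the distance notion, not one of the surveyed algorithms.

```python
from collections import deque
from itertools import combinations

def reversal_distance(perm):
    """Minimum number of reversals needed to sort a small permutation.

    BFS from the given permutation toward the identity; each edge
    reverses one substring. Exponential in n, so for illustration only.
    """
    start, goal = tuple(perm), tuple(sorted(perm))
    dist = {start: 0}
    queue = deque([start])
    while queue:
        p = queue.popleft()
        if p == goal:
            return dist[p]
        for i, j in combinations(range(len(p) + 1), 2):
            q = p[:i] + p[i:j][::-1] + p[j:]  # reverse substring p[i:j]
            if q not in dist:
                dist[q] = dist[p] + 1
                queue.append(q)
    return None
```

For example, (3, 1, 2) needs two reversals: (3, 1, 2) → (1, 3, 2) → (1, 2, 3).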
Abstract: A data cutting and sorting method (DCSM) is proposed
to optimize the performance of data mining. DCSM reduces the
calculation time by getting rid of redundant data during the data
mining process. In addition, DCSM minimizes the computational units
by splitting the database and by sorting data with support counts. In the
process of searching for the relationship between metabolic syndrome
and lifestyles with the health examination database of an electronics
manufacturing company, DCSM demonstrates higher search
efficiency than the traditional Apriori algorithm in tests with different
support counts.
Abstract: School and university orientation concerns a broad and
often poorly informed public. Technically, it is an important
multicriterion decision problem, which requires combining extensive
academic, professional and/or legal knowledge; this in turn
justifies software resorting to the techniques of Artificial
Intelligence. CORUS is an expert system for "Conseil et ORientation
Universitaire et Scolaire", based on a knowledge representation
language (KRL) with rules and objects, known as Ibn Rochd.
CORUS was developed thanks to DéGSE, a cognitive engineering
workshop that supports this KRL. CORUS works out many
acceptable solutions for the case considered and retains the most
satisfactory among them. Several versions of CORUS have gradually
extended its services.