Abstract: Cryptographic algorithms play a crucial role in the
information society by protecting sensitive data from unauthorized
access. Information technology will become increasingly pervasive;
hence we can expect the emergence of ubiquitous or pervasive
computing and ambient intelligence. These new environments and
applications will present new security challenges, and there is no
doubt that cryptographic algorithms and protocols will form part of
the solution. The efficiency of a public key cryptosystem is mainly
measured in computational overhead, key size and bandwidth. In
particular, the RSA algorithm is used in many applications for
providing security. Although the security of RSA is beyond doubt,
the evolution in computing power has caused a growth in the
necessary key length. The fact that most chips on smart cards cannot
process keys longer than 1024 bits shows that there is a need for an
alternative. NTRU is such an alternative: a collection of
mathematical algorithms based on manipulating lists of very small
integers and polynomials. This allows NTRU to achieve high speeds
with minimal computing power. NTRU (Nth degree Truncated
Polynomial Ring Unit) is the first secure public key cryptosystem
not based on the factorization or discrete logarithm problem,
meaning that even an adversary with substantial computational
resources and time should not be able to break the key. Multi-party
communication and the requirement of optimal resource utilization
drive the present-day demand for applications that need security
enforcement techniques and can be enhanced with high-end
computing. This has prompted us to develop high-performance NTRU
schemes using approaches such as high-end computing hardware.
Peer-to-peer (P2P) and enterprise grids are proven approaches for
developing high-end computing systems, and by utilizing them one
can improve the performance of NTRU through parallel execution. In
this paper we propose and develop an application for NTRU using the
enterprise grid middleware Alchemi. An analysis and comparison of
its performance for various text files is presented.
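As a rough illustration of the polynomial arithmetic NTRU builds on, the following sketch multiplies two small polynomials in the truncated ring Z_q[x]/(x^N - 1). The parameters and polynomials are toy values chosen for readability, not a secure NTRU parameter set, and no key generation or encryption is shown.

```python
# Toy sketch of NTRU-style ring arithmetic: cyclic convolution in
# Z_q[x]/(x^N - 1). N and q are illustrative only; real NTRU parameter
# sets use far larger values (N in the hundreds or more).

N, q = 7, 32

def ring_mul(a, b):
    """Multiply two length-N coefficient lists mod x^N - 1, coefficients mod q."""
    c = [0] * N
    for i in range(N):
        for j in range(N):
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c

f = [1, -1, 0, 1, 0, 0, -1]   # small-coefficient polynomial, as NTRU keys use
g = [0, 1, 1, 0, -1, 0, 1]
print(ring_mul(f, g))
```

Because every operation is a short integer multiply-add over small lists, this arithmetic maps naturally onto low-power hardware, which is the efficiency argument the abstract makes.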
Abstract: In spite of all advancement in software testing,
debugging remains a labor-intensive, manual, time-consuming, and
error-prone process. A candidate solution to enhance the debugging
process is to fuse it with the testing process. To achieve this
integration, a possible solution is to categorize common software
tests and errors, followed by an effort to fix the errors through
general solutions for each test/error pair. Our approach to this
issue is based on Christopher Alexander's pattern and pattern
language concepts. The patterns in this language are grouped into
three major sections and connect the three concepts of test, error,
and debug. These patterns and their hierarchical relationships shape
a pattern language that introduces a solution for solving software
errors in a known testing context.
Finally, we introduce our framework ADE as a sample implementation
supporting one pattern of the proposed language, which aims to
automate the whole process of evolving software design via
evolutionary methods.
Abstract: A three-dimensional simulation of harmonic generation in
a free electron laser amplifier operating with a cold, relativistic
electron beam is presented in the steady-state regime, where the
slippage of the electromagnetic wave with respect to the electron
beam is ignored. By using the slowly varying envelope approximation
and applying the source-dependent expansion to the wave equations,
the electromagnetic fields are represented in terms of
Hermite-Gaussian modes, which are well suited to the planar wiggler
configuration. The electron dynamics is described by the fully
three-dimensional Lorentz force equation in the presence of the
realistic planar magnetostatic wiggler and the electromagnetic
fields. A set of coupled nonlinear first-order differential
equations is derived and solved numerically. The fundamental and
third harmonic radiation of the beam is considered. In addition to a
uniform beam, a prebunched electron beam has also been studied; the
effect of a sinusoidal distribution of electron entry times on the
evolution of the radiation is compared with that of a uniform
distribution. It is shown that prebunching reduces the saturation
length substantially. For efficiency enhancement, the wiggler
amplitude is set to decrease linearly once the third harmonic
radiation saturates. The optimum starting point of tapering and the
slope of the decrease in wiggler amplitude are found by successive
runs of the code.
Abstract: This study considers the priorities of primary goals for
increasing the policy efficiency of Green ICT. Recently, several
studies have been published that address how IT is linked to climate
change. However, most previous studies are limited to the status of
the Green ICT industry and policy directions. This paper presents
Green ICT policy-making processes systematically. As a result of the
analysis of Korean Green ICT policy, the following emerged as
important goals for Green ICT policy: eco-friendliness, technology
evolution, economic efficiency, energy efficiency, and stable supply
of energy. This is an initial study analyzing Green ICT policy, and
it provides an academic framework that can be used as a guideline to
establish Green ICT policy.
Abstract: Adaptive genetic algorithms extend standard GAs with
dynamic procedures for applying evolutionary operators such as
crossover, mutation and selection. In this paper, we propose a new
adaptive genetic algorithm that uses statistical information about
the population as a guideline to tune its crossover, selection and
mutation operators. This algorithm is called the Statistical Genetic
Algorithm and is compared with a traditional GA on several benchmark
problems.
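The abstract does not specify which population statistics the Statistical Genetic Algorithm uses, so the rule in the sketch below (raising the mutation rate when the spread of fitness values collapses, to fight premature convergence) is an assumption for illustration only, shown on the standard OneMax benchmark.

```python
import random
import statistics

# Hedged sketch of a statistics-driven adaptive GA. The adaptation rule
# (more mutation when fitness diversity is low) is an illustrative
# assumption, not the paper's exact mechanism.

def fitness(bits):
    return sum(bits)  # OneMax: maximize the number of ones

def adaptive_ga(n_bits=20, pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(ind) for ind in pop]
        spread = statistics.pstdev(fits)          # population statistic
        mut_rate = 0.02 if spread > 1.0 else 0.10  # adapt mutation strength

        def pick():  # binary tournament selection
            a, b = rng.sample(range(pop_size), 2)
            return pop[a] if fits[a] >= fits[b] else pop[b]

        new_pop = []
        for _ in range(pop_size):
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            new_pop.append([bit ^ (rng.random() < mut_rate) for bit in child])
        pop = new_pop
    return max(fitness(ind) for ind in pop)

print(adaptive_ga())
```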
Abstract: This paper discusses applications of a revolutionary
information technology, Geographic Information Systems (GIS), in
the field of the history of cartography by examples, including
assessing accuracy of early maps, establishing a database of places
and historical administrative units in history, integrating early maps
in GIS or digital images, and analyzing social, political, and
economic information related to production of early maps. GIS
provides a new means to evaluate the accuracy of early maps. Four
basic steps using GIS for this type of study are discussed. In addition,
several historical geographical information systems are introduced.
These include China Historical Geographic Information Systems
(CHGIS), the United States National Historical Geographic
Information System (NHGIS), and the Great Britain Historical
Geographical Information System. GIS also provides digital means to
display and analyze the spatial information on the early maps or to
layer them with modern spatial data. How GIS relational data
structure may be used to analyze social, political, and economic
information related to production of early maps is also discussed in
this paper. Through discussion of these examples, this paper reveals
the value of GIS applications in this field.
Abstract: Segmentation techniques based on active contour models
have benefited strongly from the use of prior information during
their evolution. Shape prior information is captured from a training
set and introduced into the optimization procedure to restrict the
evolution to allowable shapes. In this way, the evolution converges
onto regions even with weak boundaries. Although significant effort
has been devoted to different ways of capturing and analyzing prior
information, very little thought has been given to how image
information is combined with prior information. This paper focuses
on a more natural way of incorporating prior information into the
level set framework. As a proof of concept, the method is applied to
hippocampus segmentation in T1-MR images. Hippocampus segmentation
is a very challenging task, due to the multivariate surrounding
region and the missing boundary with the neighboring amygdala, whose
intensities are identical. The proposed method mimics the human way
of segmenting and thus shows improvements in segmentation accuracy.
Abstract: Whole genome duplication (WGD) increased the number of
chromosomes in the yeast Saccharomyces cerevisiae from 8 to 16.
Although the number of chromosomes in the genome of this organism
has been retained since the WGD, chromosomal rearrangement events
have created an evolutionary distance between the current genome and
its ancestor. Evolutionary studies of eukaryotic genomes have shown
that the rearrangement distance is an approximable problem. In the
case of S. cerevisiae, we show that the rearrangement distance is
accessible using a dedoubled adjacency graph drawn for 55 large
paired chromosomal regions originating from the WGD. We then provide
a program, extracted from a C program database, to draw a dedoubled
genome adjacency graph for S. cerevisiae. From a bioinformatic
perspective, using the duplicated blocks of the current S. cerevisiae
genome, we infer that the genomic organization of eukaryotes has the
potential to provide valuable detailed information about their
ancestral genome.
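The dedoubled adjacency graph itself is not reproduced in the abstract, but the kind of quantity such graphs support, a rearrangement-distance bound obtained from broken adjacencies, can be sketched on a toy signed-block genome. The block labels below are illustrative, not the 55 S. cerevisiae regions from the paper.

```python
# Hedged sketch: counting breakpoints between two genomes given as signed
# block orders, a standard lower-bound ingredient in rearrangement-distance
# computations built on adjacency graphs. Toy data, not yeast regions.

def breakpoints(genome_a, genome_b):
    """Number of adjacencies of genome_a that are absent from genome_b."""
    adj_b = set()
    for x, y in zip(genome_b, genome_b[1:]):
        adj_b.add((x, y))
        adj_b.add((-y, -x))   # a signed adjacency reads the same when reversed
    return sum((x, y) not in adj_b for x, y in zip(genome_a, genome_a[1:]))

ancestor = [1, 2, 3, 4, 5]
current = [1, -3, -2, 4, 5]   # one inversion of the segment (2, 3)
print(breakpoints(current, ancestor))  # → 2
```

A single inversion breaks exactly two adjacencies, which is why breakpoint counts bound the number of rearrangement events separating two genomes.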
Abstract: This paper presents a new approach using a Combined Artificial Neural Network (CANN) module for daily peak load forecasting. Five different computational techniques - the constrained method, the unconstrained method, Evolutionary Programming (EP), Particle Swarm Optimization (PSO), and the Genetic Algorithm (GA) - have been used to identify the CANN module for peak load forecasting. In this paper, a set of neural networks has been trained with different architectures and training parameters. The networks are trained and tested on actual load data from the city of Chennai (India). A set of well-trained conventional ANNs is selected to develop the CANN module using the different algorithms, instead of using the single best conventional ANN. The results obtained with the CANN module confirm its validity.
Abstract: The evolution of technology and construction techniques has enabled the upgrading of transport networks. In particular, high-speed rail networks allow trains to reach speeds above 300 km/h. These structures, however, often significantly impact the surrounding environment. Among the most important effects are those provoked by the sound waves connected to train transit. The wave propagation affects the quality of life in areas surrounding the tracks, often for several hundred metres, and there are substantial damages to properties (buildings and land) in terms of market depreciation. The present study, integrating expertise in acoustics, computing and evaluation, outlines a useful model for selecting project paths so as to minimize the noise impact and reduce the causes of possible litigation. It also facilitates the rational selection of initiatives to contain the environmental damage along already existing railway tracks. The research is developed with reference to the Italian regulatory framework (usually more stringent than European and international standards) and refers to a case study concerning the high-speed network in Italy.
Abstract: The dynamic or complex modulus test is considered
to be a mechanistically based laboratory test to reliably characterize
the strength and load-resistance of Hot-Mix Asphalt (HMA) mixes
used in the construction of roads. The most common observation is
that the data collected from these tests are often noisy and somewhat
non-sinusoidal. This hampers accurate analysis of the data to obtain
engineering insight. The goal of the work presented in this paper is to
develop and compare automated evolutionary computational
techniques to filter test noise in the collection of data for the HMA
complex modulus test. The results showed that the Covariance
Matrix Adaptation Evolution Strategy (CMA-ES) approach is
computationally efficient for filtering data obtained from the HMA
complex modulus test.
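The flavor of evolutionary filtering described here can be sketched with a simplified (1+λ) evolution strategy that fits a sinusoid model to a noisy test signal. This is a stand-in for full CMA-ES (no covariance adaptation is performed), and the signal and parameters are synthetic, not real HMA test data.

```python
import math
import random

# Hedged sketch: denoise a near-sinusoidal signal by fitting a sinusoid
# model (amplitude, phase, offset) with an elitist (1+lambda) evolution
# strategy. Simplified relative to CMA-ES; synthetic data throughout.

def model(t, p):
    amp, phase, offset = p
    return amp * math.sin(2 * math.pi * t + phase) + offset

def fit_sine(ts, ys, iters=300, lam=10, seed=0):
    rng = random.Random(seed)
    def err(p):
        return sum((model(t, p) - y) ** 2 for t, y in zip(ts, ys))
    best, step = [1.0, 0.0, 0.0], 0.5
    best_err = err(best)
    for _ in range(iters):
        cands = [[x + rng.gauss(0, step) for x in best] for _ in range(lam)]
        c = min(cands, key=err)
        if err(c) < best_err:
            best, best_err = c, err(c)
        step *= 0.99            # slow step-size decay
    return best

rng = random.Random(42)
ts = [i / 50 for i in range(100)]   # two periods of a 1 Hz signal
ys = [2.0 * math.sin(2 * math.pi * t + 0.3) + 0.5 + rng.gauss(0, 0.1)
      for t in ts]
amp, phase, offset = fit_sine(ts, ys)
print(round(amp, 2), round(phase, 2), round(offset, 2))
```

The fitted model can then be evaluated on the original time grid to produce a noise-free version of the measured curve.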
Abstract: Mining sequential patterns in large databases has become
an important data mining task with broad applications; it describes
potential sequential relationships among items in a database. Many
different algorithms have been introduced for this task.
Conventional algorithms can find the exact optimal sequential
pattern rule, but they take a long time, particularly when applied
to large databases. Recently, evolutionary algorithms such as
Particle Swarm Optimization and the Genetic Algorithm have been
proposed and applied to this problem. This paper introduces a new
hybrid evolutionary algorithm that combines the Genetic Algorithm
(GA) with Particle Swarm Optimization (PSO) to mine sequential
patterns, in order to improve the convergence speed of evolutionary
algorithms. This algorithm is referred to as SP-GAPSO.
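The GA + PSO hybridization pattern can be sketched generically: each generation runs a PSO velocity update on the swarm, then a GA-style crossover/mutation step recombines good particles. The sketch below uses a toy continuous objective (the sphere function); the paper's sequential-pattern encoding is not detailed in the abstract and is not reproduced here.

```python
import random

# Hedged sketch of a GA/PSO hybrid on a toy objective. Illustrates the
# hybridization pattern only, not SP-GAPSO's actual encoding or operators.

def sphere(x):
    return sum(v * v for v in x)

def ga_pso(dim=5, swarm=20, iters=100, seed=3):
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    V = [[0.0] * dim for _ in range(swarm)]
    pbest = [row[:] for row in X]
    gbest = min(X, key=sphere)[:]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):       # PSO velocity and position update
                V[i][d] = (0.7 * V[i][d]
                           + 1.5 * rng.random() * (pbest[i][d] - X[i][d])
                           + 1.5 * rng.random() * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            if sphere(X[i]) < sphere(pbest[i]):
                pbest[i] = X[i][:]
        # GA step: cross two personal bests, mutate, replace the worst particle
        a, b = rng.sample(pbest, 2)
        cut = rng.randrange(1, dim)
        child = [v + rng.gauss(0, 0.1) for v in a[:cut] + b[cut:]]
        worst = max(range(swarm), key=lambda i: sphere(X[i]))
        X[worst] = child
        gbest = min(pbest + [gbest], key=sphere)[:]
    return sphere(gbest)

print(ga_pso())
```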
Abstract: A new method for color image segmentation using fuzzy logic is proposed in this paper. Our aim is to automatically produce a fuzzy system for color classification and image segmentation with the least number of rules and minimum error rate. Particle swarm optimization is a subclass of evolutionary algorithms inspired by the social behavior of fish, bees, birds, and other animals that live together in colonies. We use the comprehensive learning particle swarm optimization (CLPSO) technique to find optimal fuzzy rules and membership functions because it discourages premature convergence. Each particle of the swarm encodes a set of fuzzy rules. During evolution, a population member tries to maximize a fitness criterion, which here is a high classification rate with a small number of rules. Finally, the particle with the highest fitness value is selected as the best set of fuzzy rules for image segmentation. Our results using this method for soccer field image segmentation in RoboCup contests show 89% performance. Less computational load is needed with this method than with other methods such as ANFIS, because it generates a smaller number of fuzzy rules. The large and varied training dataset makes the proposed method invariant to illumination noise.
Abstract: In this paper a systematic method via H∞ control design
is proposed to select a sensor set that satisfies a number of input
criteria for a MAGLEV suspension system. The proposed method
recovers a number of optimised controllers, one for each possible
sensor set that satisfies the performance and constraint criteria,
using evolutionary algorithms.
Abstract: The effect of rolling temperature on the mechanical properties and microstructural evolution of an Al-Mg-Si alloy was studied. The material was rolled up to a true strain of ~0.7 at three different temperatures, viz. room temperature, liquid propanol temperature and liquid nitrogen temperature. The liquid-nitrogen-rolled sample exhibited superior properties, with a yield and tensile strength of 332 MPa and 364 MPa, respectively, and a reasonably good ductility of ~9%. The liquid-nitrogen-rolled sample showed an increase of around 54 MPa in tensile strength, without much reduction in ductility, compared to the as-received T6 condition alloy. The microstructural details revealed equiaxed grains in the annealed and solutionized samples and elongated grains in the rolled samples. In addition, the cryorolled samples exhibited a finer grain structure than the room-temperature-rolled samples.
Abstract: Reentry trajectory optimization is a multi-constraint
optimal control problem that is hard to solve. To tackle it, we
propose a new algorithm named CDEN (Constrained Differential
Evolution Newton-Raphson Algorithm), based on Differential Evolution
(DE) and the Newton-Raphson method. We transform the
infinite-dimensional optimal control problem into a
finite-dimensional parameter optimization by discretizing the
control parameter. To simplify the problem, we determine the scope
of the control parameter from the process constraints, and to handle
constraints we propose a parameterless constraint-handling process.
Through comprehensive analysis of the problem, we use a new
algorithm that integrates DE and Newton-Raphson to solve it. The
approach is validated on the X-33 reentry vehicle; simulation
results indicate that the algorithm is effective and robust.
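The Differential Evolution core that such a hybrid builds on (the classic DE/rand/1/bin scheme) can be sketched on a toy objective. The reentry dynamics, the discretized control parameterization, and the Newton-Raphson refinement step are omitted; a comment marks where the refinement would go, and the Rosenbrock function stands in for the trajectory cost.

```python
import random

# Hedged sketch of the DE/rand/1/bin core only; CDEN's constraint handling
# and Newton-Raphson polishing are not reproduced here.

def rosenbrock(x):
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (1 - x[i]) ** 2
               for i in range(len(x) - 1))

def de(obj, dim=3, np_=30, gens=200, F=0.6, CR=0.9, seed=7):
    rng = random.Random(seed)
    pop = [[rng.uniform(-2, 2) for _ in range(dim)] for _ in range(np_)]
    for _ in range(gens):
        for i in range(np_):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            jrand = rng.randrange(dim)       # ensure at least one mutated gene
            trial = [a[d] + F * (b[d] - c[d])
                     if (rng.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
            if obj(trial) <= obj(pop[i]):    # greedy one-to-one selection
                pop[i] = trial
        # (CDEN would additionally polish candidates with Newton-Raphson here)
    return min(pop, key=obj)

best = de(rosenbrock)
print([round(v, 2) for v in best])
```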
Abstract: The evolution of project management was triggered by
changes in management philosophy and practices aimed at maintaining
competitive advantage and continuous success in the field. The
purpose of this paper is to highlight the practicality of cognitive
style and the unlearning approach in influencing the achievement of
project success by project managers. It introduces the concepts of
planning, knowing and creating styles from the cognitive style field
in the light of achieving time, cost, quality and stakeholder
appreciation in the context of project success. It then discusses
the unlearning approach as a moderator that enhances the
relationship between cognitive style and project success. The paper
is based on a literature review from established disciplines such as
psychology, sociology and philosophy regarding cognitive style,
unlearning and project success in general. Through the analysis and
synthesis of literature in the subject area, this conceptual paper
serves as the basis of future research to form a comprehensive
framework for project managers in enhancing project management
competency.
Abstract: The objective of this work was to examine the changes in
non-destructive properties caused by carbonation of CEM II mortar.
Samples of CEM II mortar were prepared and subjected to accelerated
carbonation at 20°C, 65% relative humidity and 20% CO2
concentration. We examined the evolution of the gas permeability,
the thermal conductivity, the thermal diffusivity, the volume of the
solid phase measured by helium pycnometry, and the longitudinal and
transverse ultrasonic velocities. The principal contribution of this
work is that, apart from the gas permeability, changes in these
non-destructive properties have never been studied during the
carbonation of cement materials. These properties are important in
predicting and measuring the durability of reinforced concrete in a
CO2 environment. The carbonation depth and the porosity accessible
to water were also reported in order to explain comprehensively the
changes in the non-destructive parameters.
Abstract: We describe work on an evolutionary computing algorithm
for non-photorealistic rendering of a target image. The renderings
are produced by genetic programming. We have used two different
types of strokes, "empty triangle" and "filled triangle", in color
level. We compare empty and filled triangular strokes to find which
one generates more aesthetically pleasing images. We found that
filled triangular strokes have better fitness and generate more
aesthetic images than empty triangular strokes.
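The stroke-based evolutionary rendering idea can be sketched with a simple (1+1) hill climber that mutates a list of filled grey triangles so the rendered canvas approaches a small synthetic target. This simplifies the paper's genetic programming and aesthetic-fitness measures down to pixel error; every parameter below is illustrative.

```python
import random

# Hedged sketch: evolve 6 filled triangle strokes toward a tiny synthetic
# target image using per-pixel squared error as fitness. Not the paper's GP.

SIZE = 24

def in_tri(px, py, t):
    """Point-in-triangle test via edge-sign (half-plane) checks."""
    (x1, y1), (x2, y2), (x3, y3) = t
    d1 = (px - x2) * (y1 - y2) - (x1 - x2) * (py - y2)
    d2 = (px - x3) * (y2 - y3) - (x2 - x3) * (py - y3)
    d3 = (px - x1) * (y3 - y1) - (x3 - x1) * (py - y1)
    neg = (d1 < 0) or (d2 < 0) or (d3 < 0)
    pos = (d1 > 0) or (d2 > 0) or (d3 > 0)
    return not (neg and pos)

def render(tris):
    img = [[0.0] * SIZE for _ in range(SIZE)]
    for tri, shade in tris:
        for y in range(SIZE):
            for x in range(SIZE):
                if in_tri(x, y, tri):
                    img[y][x] = shade          # filled stroke overwrites
    return img

def error(img, target):
    return sum((img[y][x] - target[y][x]) ** 2
               for y in range(SIZE) for x in range(SIZE))

rng = random.Random(5)
target = [[1.0 if x < SIZE // 2 else 0.0 for x in range(SIZE)]
          for y in range(SIZE)]               # synthetic half-bright target

def rand_stroke():
    tri = tuple((rng.randrange(SIZE), rng.randrange(SIZE)) for _ in range(3))
    return (tri, rng.random())                # triangle plus grey shade

genome = [rand_stroke() for _ in range(6)]
best_err = error(render(genome), target)
for _ in range(150):                          # (1+1) evolutionary loop
    cand = genome[:]
    cand[rng.randrange(len(cand))] = rand_stroke()   # mutate one stroke
    e = error(render(cand), target)
    if e <= best_err:
        genome, best_err = cand, e
print(round(best_err, 1))
```

Swapping `rand_stroke` for an outline-only rasterizer would give the "empty triangle" variant the paper compares against.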
Abstract: This paper presents a tested research concept that
implements a complex evolutionary algorithm, the genetic algorithm
(GA), in a multi-microcontroller environment. A Parallel Distributed
Genetic Algorithm (PDGA) is employed in an adaptive beamforming
technique to reduce the power usage of an adaptive antenna at a
WCDMA base station. An adaptive antenna has a dynamic beam that
requires an advanced beamforming algorithm such as a genetic
algorithm, which demands heavy computation and memory space.
Microcontrollers are low-resource platforms that are normally not
associated with GAs, which are typically resource-intensive. The aim
of this project was to design a cooperative multiprocessor system by
expanding the role of small-scale PIC microcontrollers to optimize
WCDMA base station transmitter power. Implementation results show
that the PDGA multi-microcontroller system returned optimal
transmitted power compared to a conventional GA.
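The parallel-distributed GA pattern can be sketched as an island model: two subpopulations evolve independently (as they would on separate microcontrollers) and periodically exchange their best individuals. The sketch below simulates the islands sequentially, and a toy OneMax objective stands in for the WCDMA transmit-power cost; all parameters are illustrative.

```python
import random

# Hedged sketch of an island-model (parallel distributed) GA, simulated
# sequentially. Toy OneMax objective, not the paper's beamforming cost.

def fitness(bits):
    return sum(bits)

def evolve(island, rng, mut=0.05):
    """One generation of tournament selection plus bit-flip mutation."""
    fits = [fitness(ind) for ind in island]
    out = []
    for _ in range(len(island)):
        a, b = rng.sample(range(len(island)), 2)
        parent = island[a] if fits[a] >= fits[b] else island[b]
        out.append([bit ^ (rng.random() < mut) for bit in parent])
    return out

def pdga(n_bits=16, island_size=15, gens=40, migrate_every=5, seed=2):
    rng = random.Random(seed)
    islands = [[[rng.randint(0, 1) for _ in range(n_bits)]
                for _ in range(island_size)] for _ in range(2)]
    for g in range(1, gens + 1):
        islands = [evolve(isl, rng) for isl in islands]
        if g % migrate_every == 0:     # exchange best individuals
            best0 = max(islands[0], key=fitness)
            best1 = max(islands[1], key=fitness)
            islands[0][0], islands[1][0] = best1[:], best0[:]
    return max(fitness(ind) for isl in islands for ind in isl)

print(pdga())
```

On real hardware, the migration step would be a small message between microcontrollers, which is why the scheme suits low-resource cooperative multiprocessor designs.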