Abstract: The Flexible Job Shop Problem (FJSP) is an extension of
the classical Job Shop Problem (JSP). The FJSP adds routing
flexibility to the JSP, i.e., each operation must also be assigned
to a machine, which makes it more difficult than the JSP. In this
study, a Cooperative Coevolutionary Genetic Algorithm (CCGA) is
presented to solve the FJSP. Makespan (the time needed to complete
all jobs) is used as the performance measure for the CCGA. To test
the performance and efficiency of our CCGA, benchmark problems are
solved. Computational results show that the proposed CCGA is
comparable with other approaches.
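The makespan objective used above can be illustrated with a minimal sketch: given a dispatch-ordered schedule that assigns each operation to a machine (the schedule format and data here are hypothetical, not the paper's encoding), the makespan is the completion time of the last operation.

```python
# Illustrative makespan computation for a small flexible job shop
# schedule. Each entry is (job, assigned machine, processing time),
# listed in dispatch order; operations of a job run in sequence and
# each machine processes one operation at a time.

def makespan(schedule):
    job_ready = {}      # completion time of each job's latest operation
    machine_ready = {}  # time at which each machine becomes free
    for job, machine, proc in schedule:
        start = max(job_ready.get(job, 0), machine_ready.get(machine, 0))
        finish = start + proc
        job_ready[job] = finish
        machine_ready[machine] = finish
    return max(job_ready.values())

# Two jobs on two machines: J1 -> M1(3), M2(2); J2 -> M2(2), M1(4)
sched = [("J1", "M1", 3), ("J2", "M2", 2), ("J1", "M2", 2), ("J2", "M1", 4)]
print(makespan(sched))
```

A CCGA would evolve both the machine assignment and the operation order, evaluating each candidate with exactly this kind of objective.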
Abstract: This paper presents an application of level sets to the segmentation of abdominal and thoracic aortic aneurysms in CTA
datasets. An important challenge in reliably detecting aortic
aneurysms is the need to overcome problems associated with intensity
inhomogeneities. Level sets belong to an important class of methods
that utilize partial differential equations (PDEs) and have been extensively applied in image segmentation. A kernel function in the
level set formulation aids the suppression of noise in the extracted
regions of interest and then guides the motion of the evolving contour
for the detection of weak boundaries. The speed of curve evolution
has been significantly improved, with a resulting decrease in segmentation time compared with previous implementations of level
sets, and the method is shown to be more effective than other approaches in
coping with intensity inhomogeneities. We have applied the Courant-Friedrichs-Lewy (CFL) condition as the stability criterion for our algorithm.
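The CFL condition mentioned above bounds the time step of an explicit level-set update so that the front advances at most one grid cell per step. A minimal sketch (the safety factor and speed values are illustrative):

```python
# CFL stability bound for an explicit level-set evolution:
# dt <= cfl * dx / max|F|, where F is the speed field driving the
# contour and cfl < 1 is a safety factor.

def stable_timestep(speed_field, dx, cfl=0.5):
    fmax = max(abs(f) for f in speed_field)
    if fmax == 0:
        raise ValueError("zero speed field: contour does not move")
    return cfl * dx / fmax

# fastest point moves at |F| = 1.0 on a unit grid -> dt = 0.5
print(stable_timestep([0.2, -1.0, 0.5], dx=1.0))
```

Recomputing this bound after each iteration keeps the evolution stable even as the speed field changes.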
Abstract: Partitioning is a critical area of VLSI CAD. In order to build complex digital logic circuits, it is often essential to subdivide a multi-million-transistor design into manageable pieces. This paper surveys the various partitioning techniques of VLSI CAD, targeted at various applications. We propose an evolutionary time-series model and a statistical glitch prediction system using a neural network, with global feature selection based on clustering, for partitioning a circuit. For the evolutionary time-series model, we use genetic, memetic and neuro-memetic techniques. Our work focuses on the clustering methods K-means and EM. A comparative study is provided for all techniques applied to the circuit partitioning problem in VLSI design. The performance of all approaches is compared using benchmark data provided by the MCNC standard cell placement benchmark netlists. Analysis of the experimental results shows that the neuro-memetic model outperforms the other models in recognizing sub-circuits with a minimum number of interconnections between them.
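The clustering step named above can be sketched with a plain k-means on cell placement coordinates. This is a hypothetical stand-in for the paper's method: a real partitioner would also weight the interconnections between cells, which are ignored here.

```python
import random

# Illustrative k-means clustering of cell coordinates into k
# sub-circuits (coordinates and k are made up for the example).

def kmeans(points, k, iters=100, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each cell to its nearest cluster center
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        # recompute centers as cluster centroids
        new_centers = [
            (sum(x for x, _ in cl) / len(cl), sum(y for _, y in cl) / len(cl))
            if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
        if new_centers == centers:
            break
        centers = new_centers
    return centers, clusters

# two well-separated groups of cells -> two balanced sub-circuits
cells = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centers, clusters = kmeans(cells, k=2)
print(sorted(len(c) for c in clusters))
```

EM clustering would replace the hard nearest-center assignment with soft, probability-weighted memberships.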
Abstract: Visualizing sound and noise often helps us to determine
an appropriate control over source localization. Near-field acoustic
holography (NAH) is a powerful tool for this ill-posed problem.
However, in practice, due to the small finite aperture size, discrete
Fourier transform (FFT) based NAH cannot predict the active
region of interest (AROI) over the edges of the plane. A few
theoretical approaches have been proposed for solving the finite
aperture problem. However, most of these methods are not well suited
to practical implementation, especially near the edges of the source.
In this paper, a zip-stuffing extrapolation approach with a 2D Kaiser
window is suggested. It operates in the complex wavenumber space
to localize the predicted sources. We numerically construct a practical
environment with touch impact databases to test the localization of
the sound source. It is observed that zip-stuffing aperture extrapolation
and the 2D window with evanescent components provide better accuracy,
especially for small apertures.
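A separable 2D Kaiser window of the kind referred to above can be built as the outer product of two 1D Kaiser windows. A minimal sketch (the window size and shape parameter beta are illustrative, not the paper's values):

```python
import numpy as np

# Separable 2D Kaiser window, as might be applied to the measured
# hologram aperture before wavenumber-domain extrapolation.

def kaiser2d(rows, cols, beta=6.0):
    return np.outer(np.kaiser(rows, beta), np.kaiser(cols, beta))

w = kaiser2d(65, 65)
print(w.shape)
```

The window tapers the aperture edges smoothly toward zero, suppressing the spectral leakage that would otherwise corrupt the extrapolated wavenumber spectrum.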
Abstract: The paper provides a discussion of the most relevant
aspects of yield curve modeling. Two classes of models are
considered: stochastic and parsimonious function based, through the
approaches developed by Vasicek (1977) and Nelson and Siegel
(1987). Yield curve estimates for Croatia are presented and their
dynamics analyzed; finally, a comparative analysis of the models is
conducted.
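The parsimonious Nelson-Siegel curve has a closed form that is easy to sketch. The parameter values below are illustrative, not the paper's Croatian estimates:

```python
import math

# Nelson-Siegel (1987) yield curve:
#   y(t) = b0 + b1*L(t) + b2*(L(t) - exp(-t/lam)),
#   L(t) = (1 - exp(-t/lam)) / (t/lam)
# b0 is the long-run level, b0 + b1 the instantaneous short rate,
# b2 the curvature (hump) and lam its location.

def nelson_siegel(t, b0, b1, b2, lam):
    x = t / lam
    loading = (1.0 - math.exp(-x)) / x
    return b0 + b1 * loading + b2 * (loading - math.exp(-x))

print(nelson_siegel(10.0, 0.05, -0.02, 0.01, 2.0))
```

Fitting the curve to observed yields reduces to estimating the four parameters, which is what makes the model parsimonious compared with stochastic alternatives such as Vasicek's.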
Abstract: This paper describes a study of cryptographic hash functions, one of the most important classes of primitives used in modern cryptography. The main aim is the development of recent cryptanalysis techniques applicable to hash functions, mainly from the SHA family. We present different approaches to defining security properties more formally and describe basic attacks on hash functions. We recall the Merkle-Damgard security properties of iterated hash functions. Recently proposed attacks on MD5 and SHA motivate a new hash function design, intended not only to offer higher security but also to be faster than SHA-256. The performance of the new hash function is at least 30% better than that of SHA-256 in software, and it is secure against all known cryptographic attacks on hash functions.
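The iterated (Merkle-Damgard) construction recalled above can be sketched in a few lines. This is a toy illustration: the compression function here is a stand-in built from SHA-256, not the paper's new design.

```python
import hashlib

# Toy Merkle-Damgard construction: pad the message with a 0x80 marker
# and its length (Merkle-Damgard strengthening), then iterate a
# compression function over fixed-size blocks.

BLOCK = 64  # block size in bytes

def compress(chaining, block):
    # illustrative compression function (not a real primitive design)
    return hashlib.sha256(chaining + block).digest()

def md_hash(msg, iv=b"\x00" * 32):
    padded = msg + b"\x80"
    padded += b"\x00" * ((-len(padded) - 8) % BLOCK)
    padded += len(msg).to_bytes(8, "big")  # length strengthening
    h = iv
    for i in range(0, len(padded), BLOCK):
        h = compress(h, padded[i:i + BLOCK])
    return h

print(md_hash(b"abc").hex())
```

The security reductions recalled in the paper concern exactly this structure: collision resistance of the compression function carries over to the iterated hash.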
Abstract: Cryptographic algorithms play a crucial role in the
information society by providing protection from unauthorized
access to sensitive data. It is clear that information technology will
become increasingly pervasive; hence we can expect the emergence
of ubiquitous or pervasive computing and ambient intelligence. These
new environments and applications will present new security
challenges, and there is no doubt that cryptographic algorithms and
protocols will form a part of the solution. The efficiency of a public
key cryptosystem is mainly measured in computational overheads,
key size and bandwidth. In particular, the RSA algorithm is used in
many applications for providing security. Although the security
of RSA is beyond doubt, the evolution of computing power has
caused a growth in the necessary key length. The fact that most chips
on smart cards cannot process keys exceeding 1024 bits shows that there
is a need for an alternative. NTRU is such an alternative: a
collection of mathematical algorithms based on manipulating lists of
very small integers and polynomials. This allows NTRU to achieve high
speeds with minimal computing power. NTRU (Nth degree
Truncated Polynomial Ring Unit) is the first secure public key
cryptosystem not based on factorization or discrete logarithm
problem. This means that, even given substantial computational
resources and time, an adversary should not be able to break the key.
Multi-party communication and the requirement of optimal resource
utilization have created a present-day demand for applications that
need security enforcement and can benefit from high-end computing.
This has prompted us to develop high-performance NTRU schemes using
approaches such as high-end computing hardware. Peer-to-peer (P2P)
and enterprise grids have proven to be effective approaches for
building high-end computing systems. By utilizing them, one can
improve the performance of NTRU through parallel execution. In this
paper we
propose and develop an application for NTRU using enterprise grid
middleware called Alchemi. An analysis and comparison of its
performance for various text files is presented.
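NTRU's core arithmetic, alluded to above, is multiplication of polynomials with small coefficients in the truncated ring Z_q[x]/(x^N - 1), i.e. a cyclic convolution with coefficients reduced mod q. A minimal sketch (N, q and the coefficient lists are illustrative toy parameters, far smaller than real NTRU parameter sets):

```python
# Cyclic convolution in the truncated polynomial ring
# Z_q[x]/(x^N - 1): exponents wrap around modulo N, coefficients
# are reduced modulo q.

def ring_mult(a, b, N, q):
    c = [0] * N
    for i in range(N):
        for j in range(N):
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c

N, q = 7, 41
a = [1, 0, -1, 1, 0, 0, 1]   # small coefficients, as in NTRU
b = [0, 1, 1, 0, -1, 0, 1]
print(ring_mult(a, b, N, q))
```

Because the operands are lists of very small integers, this convolution is cheap, which is the source of NTRU's speed and also what makes it easy to distribute across grid nodes for parallel execution.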
Abstract: In this research, we propose a weighted class based
queuing (WCBQ) mechanism to provide class differentiation and to
reduce the load for the IMS (IP Multimedia Subsystem) presence
server (PS). The tasks of the admission controller for the PS are
demonstrated. Analysis and simulation models are developed to
quantify the performance of the WCBQ scheme. An optimized dropping
time frame has been developed, based on which some of the pre-existing
messages are dropped from the PS buffer. Cost functions are
developed and a simulation comparison has been performed with the FCFS
(First Come First Served) scheme. The results show that the PS
benefits significantly from the proposed queuing and dropping
algorithm (WCBQ) during heavy traffic.
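The class-differentiation idea can be sketched as a weighted round-robin over per-class queues. This is a minimal illustration of weighted class-based scheduling only; the paper's WCBQ additionally drops aged messages from the PS buffer, which is omitted here, and the class names and weights are made up.

```python
import collections

# Minimal weighted class-based queue: each traffic class gets a
# per-round credit equal to its weight; the backlogged class with
# the most remaining credit is served next.

class WCBQ:
    def __init__(self, weights):
        self.weights = weights                          # class -> weight
        self.queues = {c: collections.deque() for c in weights}
        self.credits = dict(weights)

    def enqueue(self, cls, msg):
        self.queues[cls].append(msg)

    def dequeue(self):
        backlogged = [c for c in self.queues if self.queues[c]]
        if not backlogged:
            return None
        cls = max(backlogged, key=lambda c: self.credits[c])
        self.credits[cls] -= 1
        if all(self.credits[c] <= 0 for c in backlogged):
            self.credits.update(self.weights)           # start a new round
        return self.queues[cls].popleft()

q = WCBQ({"voice": 3, "presence": 1})
for i in range(4):
    q.enqueue("voice", f"v{i}")
    q.enqueue("presence", f"p{i}")
served = [q.dequeue() for _ in range(8)]
print(served)
```

Under sustained load the higher-weight class is served roughly three times as often, which is the differentiation the scheme provides.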
Abstract: This paper offers suggestions for educators at all levels about how to better prepare our students for the future by building on the past. The discussion begins with a summary of changes in the World Wide Web, especially as the term Web 3.0 is being heard. The bulk of the discussion is retrospective and concerned with an overview of traditional teaching and research approaches as they evolved during the 20th century, beginning with those grounded in the Cartesian reality of I. A. Richards' (1929) Practical Criticism. The paper concludes with a proposal of five strategies which incorporate timeless elements from the past as well as cutting-edge elements from today, in order to better prepare our students for the future.
Abstract: Insufficient Quality of Service (QoS) of Voice over
Internet Protocol (VoIP) is a growing concern that has led to the need
for research and study. In this paper we investigate the performance
of VoIP and the impact of resource limitations on the performance of
Access Networks. The impact of VoIP performance in Access
Networks is particularly important in regions where Internet
resources are limited and the cost of improving these resources is
prohibitive. It is clear that perceived VoIP performance, as measured
by mean opinion score [2] in experiments, where subjects are asked
to rate communication quality, is determined by end-to-end delay on
the communication path, delay variation, packet loss, echo, the
coding algorithm in use and noise. These performance indicators can
be measured, and their effect in the Access Network can be estimated.
This paper investigates the contribution of congestion in the Access
Network to the overall performance of VoIP services in the presence
of other substantial uses of the Internet, and ways in which Access Networks can
be designed to improve VoIP performance. Methods for analyzing
the impact of the Access Network on VoIP performance will be
surveyed and reviewed. This paper also considers some approaches
for improving performance of VoIP by carrying out experiments
using Network Simulator version 2 (NS2) software with a view to
gaining a better understanding of the design of Access Networks.
Abstract: The concentrations of As, Hg, Co, Cr and Cd were
tested for each soil sample, and their spatial patterns were analyzed
by the semivariogram approach of geostatistics and geographical
information system technology. Multivariate statistical approaches
(principal component analysis and cluster analysis) were used to
identify heavy metal sources and their spatial pattern. Principal
component analysis coupled with correlation between heavy metals
showed that the primary inputs of As, Hg and Cd were
anthropogenic, while Co and Cr were associated with pedogenic
factors. Ordinary kriging was carried out to map the spatial patterns of
the heavy metals. The high pollution levels were found to be related to
the use of urban and industrial wastewater. The results of this study
are helpful for the risk assessment of environmental pollution and for
decision making on industrial adjustment and the remediation of soil pollution.
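The semivariogram analysis referred to above starts from the empirical semivariogram: for sample pairs whose separation falls in a lag bin, half the mean squared difference of the measured values. A minimal sketch (the sample sites, concentrations and lag are illustrative, not the study's data):

```python
import itertools
import math

# Empirical semivariogram for one lag bin:
# gamma(h) = (1 / 2N(h)) * sum over pairs at distance ~h of
# (z_i - z_j)^2, where z is the measured concentration.

def semivariogram(points, values, lag, tol):
    sq, n = 0.0, 0
    for i, j in itertools.combinations(range(len(points)), 2):
        d = math.dist(points[i], points[j])
        if abs(d - lag) <= tol:
            sq += (values[i] - values[j]) ** 2
            n += 1
    return sq / (2 * n) if n else float("nan")

pts = [(0, 0), (1, 0), (2, 0), (3, 0)]       # sample sites
cd = [0.1, 0.3, 0.2, 0.6]                    # e.g. Cd concentrations
print(semivariogram(pts, cd, lag=1.0, tol=0.1))
```

Fitting a model to the semivariogram over many lags supplies the spatial weights that ordinary kriging uses to interpolate the pollution maps.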
Abstract: Reliable information about tool temperature
distribution is of central importance in metal cutting. In this study,
tool-chip interface temperature was determined in cutting of ST37
steel workpiece by applying HSS as the cutting tool in dry turning.
Two different approaches were implemented for temperature
measurement: a thermocouple (RTD) embedded in the cutting tool
and an infrared (IR) camera. Comparisons are made between
experimental data and results of MSC.SuperForm and FLUENT
software.
An investigation of heat generation in the cutting tool was performed
by varying the cutting parameters at a fixed cutting tool geometry,
and the results were recorded; diagrams of tool temperature vs. the
various cutting parameters were then obtained. The experimental
results reveal that the main factors increasing the cutting
temperature are cutting speed (V), feed rate (S) and depth of cut (h),
in that order. It was also determined that a simultaneous change in
cutting speed and feed rate has the maximum effect on increasing the
cutting temperature.
Abstract: The aim of this paper is to present a comparative
study of two different methods for evaluating the equilibrium
point of a ship, a core issue in designing an On Board Stability System
(OBSS) module that, starting from the geometry of a ship
hull, described by a discrete model in a standard format, and the
distribution of all weights onboard, calculates the ship's floating
conditions (draught, heel and trim).
Abstract: This paper reviews various approaches that have been
used for the modeling and simulation of large-scale engineering
systems and determines their appropriateness in the development of a
RICS modeling and simulation tool. Bond graphs, linear graphs,
block diagrams, differential and difference equations, modeling
languages, cellular automata and agents are reviewed. This tool
should be based on linear graph representation and support symbolic
programming, functional programming, the development of non-causal
models and the incorporation of decentralized approaches.
Abstract: This paper describes studies carried out to investigate
the viability of using wireless cameras as a tool in monitoring
changes in air quality. A camera is used to monitor the change in
colour of a chemically responsive polymer within view of the camera
as it is exposed to varying chemical species concentration levels. The
camera captures this image and the colour change is analyzed by
averaging the RGB values present. This novel chemical sensing
approach is compared with an established chemical sensing method
using the same chemically responsive polymer coated onto LEDs. In
this way, the concentration levels of acetic acid in the air can be
tracked using both approaches. These approaches to chemical plume
tracking have many applications for air quality monitoring.
Abstract: Five decades of extensive research in the field of speech
processing for compression and recognition have resulted in intense
competition among the various methods and paradigms introduced. In
this paper we survey the different representations of speech in the
time-frequency and time-scale domains for the purposes of compression
and recognition, and examine these representations across a variety
of related work.
In particular, we emphasize methods related to Fourier analysis
paradigms and wavelet based ones along with the advantages and
disadvantages of both approaches.
Abstract: Industrial radiography is a well-known technique for the identification and evaluation of discontinuities, or defects, such as cracks, porosity and foreign inclusions found in welded joints. Although this technique has been well developed, improving both the inspection process and operating time, it still suffers from several drawbacks. The poor quality of radiographic images is due to the physical nature of radiography as well as the small size of the defects and their poor orientation relative to the size and thickness of the evaluated parts. Digital image processing techniques allow the interpretation of the image to be automated, avoiding the need for human operators and making the inspection system more reliable, reproducible and faster. This paper describes our attempt to develop and implement digital image processing algorithms for the automatic detection of defects in radiographic images. Because of the complex nature of the images considered, and in order that the detected defect region represents the real defect as accurately as possible, the choice of global and local preprocessing and segmentation methods must be appropriate.
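As one example of the global segmentation methods mentioned above, Otsu's threshold separates a grayscale radiograph into background and candidate-defect regions by maximizing the between-class variance of the histogram. This is an illustrative choice, not necessarily the paper's exact method, and the image below is a toy array:

```python
import numpy as np

# Otsu's global threshold: choose t maximizing the between-class
# variance w0 * w1 * (mu0 - mu1)^2 over the intensity histogram.

def otsu_threshold(image):
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0.0
    for t in range(256):
        w0 += hist[t]
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        mu0 = sum0 / w0                      # mean of class below t
        mu1 = (sum_all - sum0) / (total - w0)  # mean of class above t
        var = w0 * (total - w0) * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# toy radiograph: dark background, bright region
img = np.array([[10, 12, 11, 200],
                [12, 10, 199, 201],
                [11, 198, 200, 202]], dtype=np.uint8)
t = otsu_threshold(img)
print(t)
```

In practice such a global step would be followed by local refinement, since defect contrast in real radiographs varies across the part.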
Abstract: Corporate credit rating prediction using statistical and
artificial intelligence (AI) techniques has been one of the attractive
research topics in the literature. In recent years, multiclass
classification models such as the artificial neural network (ANN) and
the multiclass support vector machine (MSVM) have become very
appealing machine learning approaches due to their good
performance. However, most of them focus only on classifying
samples into nominal categories; thus the unique characteristic of
credit ratings, their ordinality, has seldom been considered in these
approaches. This study proposes new types of ANN and MSVM
classifiers, which are named OMANN and OMSVM respectively.
OMANN and OMSVM are designed to extend binary ANN or SVM
classifiers by applying ordinal pairwise partitioning (OPP) strategy.
These models can handle ordinal multiple classes efficiently and
effectively. To validate the usefulness of these two models, we applied
them to a real-world bond rating case. We compared the results of
our models to those of conventional approaches. The experimental
results showed that our proposed models improve classification
accuracy over typical multiclass classification techniques while
requiring fewer computational resources.
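The ordinal pairwise partitioning idea can be sketched as decomposing a K-class ordinal problem into K-1 binary "is the rating above grade k?" problems, then recombining predictions by counting positive answers. The classifiers here are omitted (we only show the label decomposition and recombination); the rating scale is a made-up three-grade example:

```python
# Ordinal decomposition sketch: K ordered classes become K-1 binary
# targets; a prediction is recovered by counting "above this split"
# votes. Real OPP would train an ANN or SVM per binary problem.

RATINGS = ["C", "B", "A"]  # ordered worst -> best

def decompose(labels):
    """One binary target vector per ordered split."""
    idx = [RATINGS.index(y) for y in labels]
    return [[int(i > split) for i in idx]
            for split in range(len(RATINGS) - 1)]

def recombine(binary_preds):
    """Predicted class index = number of positive split votes."""
    return [RATINGS[sum(votes)] for votes in zip(*binary_preds)]

labels = ["C", "B", "A", "B"]
targets = decompose(labels)
print(targets)
print(recombine(targets))
```

Because each binary problem respects the class ordering, consistent binary answers always recombine to a valid ordinal prediction.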
Abstract: The Facility Layout Problem (FLP) is one of the essential
problems in many types of manufacturing and service sectors. It is
an optimization problem whose main objective is to obtain
efficient locations, arrangement and ordering of the facilities. In the
literature, numerous facility layout studies have been presented
that use meta-heuristic approaches to achieve an optimal facility
layout design. This paper presents a genetic algorithm to solve the
facility layout problem by minimizing a total cost function. The
performance of the proposed approach was verified and compared
using problems from the literature.
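A genetic algorithm for layout can be sketched on a tiny single-row instance: a chromosome is a permutation of facilities, and the cost sums flow times distance over facility pairs. Everything here is illustrative (made-up flow matrix, mutation-only evolution for brevity, no crossover), not the paper's encoding or parameters:

```python
import random

# Mutation-only GA sketch for a 4-facility single-row layout.
# cost(perm) = sum of FLOW[i][j] * |position(i) - position(j)|.

FLOW = [
    [0, 5, 2, 1],
    [5, 0, 3, 0],
    [2, 3, 0, 4],
    [1, 0, 4, 0],
]

def cost(perm):
    pos = {f: p for p, f in enumerate(perm)}
    return sum(FLOW[i][j] * abs(pos[i] - pos[j])
               for i in range(4) for j in range(i + 1, 4))

def ga(pop_size=20, gens=50, seed=1):
    rng = random.Random(seed)
    pop = [rng.sample(range(4), 4) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]          # elitist selection
        children = []
        for parent in survivors:
            child = parent[:]
            a, b = rng.sample(range(4), 2)       # swap mutation
            child[a], child[b] = child[b], child[a]
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = ga()
print(best, cost(best))
```

A full FLP genetic algorithm would add crossover operators that preserve permutation validity (e.g. order crossover) and handle unequal facility areas.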
Abstract: Whole genome duplication (WGD) increased the
number of yeast Saccharomyces cerevisiae chromosomes from 8 to
16. Although the number of chromosomes in the genome of this
organism has been retained since the WGD, chromosomal rearrangement
events have created an evolutionary distance between the current
genome and its ancestor. Studies using evolutionary-based approaches
on eukaryotic genomes have shown that the rearrangement distance is an
approximable problem. In the case of S. cerevisiae, we show that the
rearrangement distance is accessible by using a dedoubled adjacency
graph drawn for 55 large paired chromosomal regions originating
from the WGD. We then provide a program, extracted from a C program
database, to draw a dedoubled genome adjacency graph for S.
cerevisiae. From a bioinformatics perspective, using the duplicated
blocks of the current S. cerevisiae genome, we infer that the genomic
organization of eukaryotes has the potential to provide valuable
detailed information about their ancestral genome.
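One elementary ingredient of rearrangement-distance computations is counting breakpoints: adjacent block pairs present in one genome order but not in the other mark where rearrangements must have cut the chromosome. A toy sketch (the block names are illustrative, not actual S. cerevisiae regions, and real analyses use signed blocks and the full adjacency graph):

```python
# Breakpoint counting between two genome orders: an adjacency is an
# unordered pair of neighbouring blocks; a breakpoint is an adjacency
# of one genome that is absent from the other.

def adjacencies(order):
    return {frozenset(pair) for pair in zip(order, order[1:])}

def breakpoints(genome_a, genome_b):
    return len(adjacencies(genome_a) - adjacencies(genome_b))

ancestor = ["g1", "g2", "g3", "g4", "g5"]
current  = ["g1", "g3", "g2", "g4", "g5"]   # one segment inverted
print(breakpoints(current, ancestor))
```

Lower bounds derived from such counts (and their refinement via adjacency-graph cycles) are what make the rearrangement distance approximable.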