Abstract: The commercial finite element program LS-DYNA was employed to evaluate the response and energy absorbing capacity of cylindrical metal tubes that are externally wrapped with composite. The effects of composite wall thickness, loading conditions and fiber ply orientation were examined. The results demonstrate that a wrapped composite can be utilized effectively to enhance the crushing characteristics and energy absorbing capacity of the tubes. Increasing the thickness of the composite increases the mean force and the specific energy absorption under both static and dynamic crushing. The ply pattern affects the energy absorption capacity and the failure mode of the metal tube, and the composite material properties are also significant in determining energy absorption efficiency.
Abstract: Parallel prefix addition is a technique for improving the speed of binary addition. Due to continually increasing integration density and the growing needs of portable devices, low-power and high-performance designs are of prime importance. The classical parallel prefix adder structures presented in the literature over the years optimize for logic depth, area, fan-out and interconnect count of logic circuits. In this paper, a new architecture for performing 8-bit, 16-bit and 32-bit parallel prefix addition is proposed. The proposed prefix adder structures are compared with several classical adders of the same bit width in terms of power, delay and number of computational nodes. The results reveal that the proposed structures have the lowest power-delay product compared with their peer existing prefix adder structures. The Tanner EDA tool was used for simulating the adder designs in the TSMC 180 nm and TSMC 130 nm technologies.
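Parallel prefix adders compute all carries via generate/propagate recurrences in O(log n) combining levels. As a generic behavioural illustration of the technique (a classic Kogge-Stone sketch, not the paper's proposed architecture):

```python
def kogge_stone_add(a: int, b: int, width: int = 8) -> int:
    """Add two unsigned integers modulo 2**width using the Kogge-Stone
    parallel-prefix generate/propagate recurrences (log2(width) levels)."""
    mask = (1 << width) - 1
    g = a & b          # generate: both bits set, a carry is produced
    p = a ^ b          # propagate: exactly one bit set, a carry passes through
    d = 1
    while d < width:
        # combine each prefix span with the span d positions lower
        g = g | (p & (g << d))
        p = p & (p << d)
        d <<= 1
    carries = (g << 1) & mask   # carry into bit i = prefix generate of bits [0..i-1]
    return ((a ^ b) ^ carries) & mask
```

The `while` loop is the prefix tree: after level k each bit holds the group generate/propagate over a span of 2^k positions, which is why the delay grows only logarithmically with the bit width.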
Abstract: In this paper a combined feature selection method is proposed which takes advantage of sample-domain filtering, resampling and feature-subset evaluation methods to reduce the dimensions of huge datasets and select reliable features. This method utilizes both the feature space and the sample domain to improve the process of feature selection and uses a combination of Chi-squared and Consistency attribute evaluation methods to seek reliable features.
This method consists of two phases. The first phase filters and
resamples the sample domain and the second phase adopts a hybrid
procedure to find the optimal feature space by applying Chi squared,
Consistency subset evaluation methods and genetic search.
Experiments on datasets of various sizes from the UCI Repository of
Machine Learning databases show that the performance of five
classifiers (Naïve Bayes, Logistic, Multilayer Perceptron, Best First
Decision Tree and JRIP) improves simultaneously and the
classification error for these classifiers decreases considerably. The
experiments also show that this method outperforms other feature
selection methods.
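The Chi-squared attribute evaluation used in the second phase scores each feature by the dependence between its values and the class labels. A minimal generic sketch (not the authors' implementation):

```python
from collections import Counter

def chi2_score(feature, labels):
    """Chi-squared statistic between one discrete feature and class labels.
    Higher scores indicate stronger feature/class dependence, so features
    can be ranked by this value during selection."""
    n = len(labels)
    f_counts = Counter(feature)
    c_counts = Counter(labels)
    joint = Counter(zip(feature, labels))
    score = 0.0
    for fv, fc in f_counts.items():
        for cv, cc in c_counts.items():
            expected = fc * cc / n              # count expected under independence
            observed = joint.get((fv, cv), 0)   # count actually observed
            score += (observed - expected) ** 2 / expected
    return score
```

A feature perfectly aligned with the class gets the maximal score n, while a feature independent of the class scores 0.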
Abstract: In this paper, a new hybrid of genetic algorithm (GA)
and simulated annealing (SA), referred to as GSA, is presented. In
this algorithm, SA is incorporated into GA to escape from local
optima. The concept of hierarchical parallel GA is employed to
parallelize GSA for the optimization of multimodal functions. In
addition, multi-niche crowding is used to maintain the diversity in
the population of the parallel GSA (PGSA). The performance of the
proposed algorithms is evaluated against a standard set of multimodal
benchmark functions. The multi-niche crowding PGSA and the normal PGSA show remarkable improvements in comparison with the conventional parallel genetic algorithm and the breeder genetic algorithm (BGA).
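The core of incorporating SA into a GA is the annealing acceptance test applied to offspring, which occasionally keeps a worse solution and thereby lets the search escape local optima. A minimal sketch, assuming a minimization objective (not the paper's exact GSA formulation):

```python
import math
import random

def sa_accept(parent_fit, child_fit, temperature, rng=random):
    """Simulated-annealing acceptance rule used inside a GA loop:
    always keep an improving child; accept a worse one with
    probability exp(-delta / T), where T is the current temperature."""
    if child_fit <= parent_fit:        # minimization: better or equal child
        return True
    delta = child_fit - parent_fit
    return rng.random() < math.exp(-delta / temperature)
```

As the temperature is lowered over generations, the rule degenerates into plain elitist replacement, so diversity early on does not prevent convergence later.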
Abstract: In this paper we propose a novel dynamic least-cost multicast routing protocol using a hybrid genetic algorithm for IP networks. Our protocol finds the multicast tree with minimum cost subject to delay, degree, and bandwidth constraints. The proposed protocol has the following features: i. A heuristic local search function has been devised and embedded within the normal genetic operations to increase the speed and to obtain the optimized tree; ii. It efficiently handles the dynamic situations that arise due to either a change in the multicast group membership or a node/link failure; iii. Two different crossover and mutation probabilities are used for maintaining the diversity of solutions and quick convergence. The simulation results show that our proposed protocol generates dynamic multicast trees with lower cost. Results also show that the proposed algorithm has a better convergence rate, a better dynamic request success rate and less execution time than other existing algorithms. The effects of the degree and delay constraints on the multicast tree have also been analyzed in terms of search success rate.
Abstract: One of the main environmental problems affecting extensive areas of the world is soil salinity. Traditional data collection methods are neither sufficient for assessing this important environmental problem nor accurate enough for soil studies. Remote sensing data could overcome most of these problems. Although satellite images are commonly used for such studies, there is still a need to find the best calibration between the data and the real situation in each specific area. The Neyshaboor area, in north-east Iran, was selected as the field study of this research. Landsat satellite images of this area were used in order to prepare suitable learning samples for processing and classifying the images. 300 locations were selected randomly in the area to collect soil samples, and finally 273 locations were reselected for further laboratory work and image processing analysis. The electrical conductivity of all samples was measured. Six reflective bands of ETM+ satellite images taken of the study area in 2002 were used for soil salinity classification. The classification was carried out using common algorithms based on the best composition bands. The results showed that the reflective bands 7, 3, 4 and 1 form the best band composition for preparing the color composite images. We also found that hybrid classification is a suitable method for identifying and delineating the different salinity classes in the area.
Abstract: An advanced composite flywheel rotor consisting of
intra and inter hybrid rims was designed to optimally increase the energy capacity, and was manufactured using filament winding with
in-situ curing. The flywheel has recently attracted considerable attention from many investigators since it possesses great potential in
many energy storage applications, including electric utilities, hybrid or
electric automobiles, and space vehicles. In this investigation, a comprehensive study was conducted with the intent to implement
composites in high-performance flywheel applications. The inner two
intra-hybrid rims (rims 1 and 2) were manufactured as a whole part
through continuous filament winding under in-situ curing conditions,
and so were the outer two rims (rims 3 and 4). The outer surface of rim
2 and the inner surface of rim 3 were CNC-tapered for press-fitting. Machined rims were finally press-fitted using a hydraulic press with a
maximum compressive force of approximately 1,000 tons.
Abstract: The increasing growth of the volume of information on the Internet creates an increasing need to develop new (semi-)automatic methods for retrieving documents and ranking them according to their relevance to the user query. In this paper, after a brief review of ranking models, a new ontology-based approach for ranking HTML documents is proposed and evaluated in various circumstances. Our approach is a combination of conceptual, statistical and linguistic methods. This combination preserves the precision of ranking without losing speed. Our approach
exploits natural language processing techniques to extract phrases from documents and the query and to stem words. Then an ontology-based conceptual method is used to annotate documents and expand the query. To expand a query, the spread activation algorithm is improved so that the expansion can be done flexibly and in various aspects. The annotated documents and the
expanded query will be processed to compute the relevance degree
exploiting statistical methods. The outstanding features of our
approach are (1) combining conceptual, statistical and linguistic
features of documents, (2) expanding the query with its related
concepts before comparing to documents, (3) extracting and using
both words and phrases to compute relevance degree, (4) improving
the spread activation algorithm to do the expansion based on
weighted combination of different conceptual relationships and (5)
allowing variable document vector dimensions. A ranking system
called ORank is developed to implement and test the proposed
model. The test results are presented at the end of the paper.
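Spread activation expands a query by pushing weight from seed concepts to their neighbours in the ontology, with each conceptual relationship weighted differently. A generic sketch of this idea (the relation names and weights below are hypothetical illustrations, not ORank's actual ontology):

```python
def spread_activation(graph, seeds, rel_weights, decay=0.5, iterations=2):
    """One flavour of weighted spread activation over a concept graph.
    graph: {node: [(neighbor, relation), ...]}; rel_weights assigns a
    weight to each relation type, so the expansion treats different
    conceptual relationships differently."""
    activation = dict(seeds)
    for _ in range(iterations):
        nxt = dict(activation)
        for node, a in activation.items():
            for neighbor, rel in graph.get(node, []):
                # pass decayed, relation-weighted activation to the neighbour
                nxt[neighbor] = nxt.get(neighbor, 0.0) + a * decay * rel_weights.get(rel, 0.0)
        activation = nxt
    return activation
```

Nodes whose activation exceeds a threshold after a few iterations would then join the expanded query vector.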
Abstract: This paper explores the university course timetabling problem. There are several characteristics that make scheduling and
timetabling problems particularly difficult to solve: they have huge
search spaces, they are often highly constrained, they require
sophisticated solution representation schemes, and they usually
require very time-consuming fitness evaluation routines. Thus standard evolutionary algorithms lack the efficiency to deal with them. In this paper we propose a memetic algorithm that incorporates problem-specific knowledge such that most of the chromosomes generated are decoded into feasible solutions. Generating a vast number of feasible chromosomes allows the search process to progress in a time-efficient manner. Experimental results exhibit the advantages of the developed Hybrid Genetic Algorithm over the standard Genetic Algorithm.
Abstract: In this paper, we introduce a new method for elliptical
object identification. The proposed method adopts a hybrid scheme which consists of the Eigenvalues of covariance matrices, the Circular Hough Transform and Bresenham's raster scan algorithm. In this approach we use the fact that the large and small Eigenvalues of the covariance matrix are associated with the major and minor axial lengths of the ellipse. The centre location of the ellipse can be identified using the Circular Hough Transform (CHT). A sparse matrix technique is used to perform the CHT. Since sparse matrices squeeze out zero elements and contain only a small number of nonzero elements, they provide an advantage in matrix storage space and computational time.
Neighborhood suppression scheme is used to find the valid Hough
peaks. The accurate position of circumference pixels is identified
using raster scan algorithm which uses the geometrical symmetry
property. This method does not require the evaluation of tangents or
curvature of edge contours, which are generally very sensitive to
noise working conditions. The proposed method has the advantages of
small storage, high speed and accuracy in identifying the feature. The
new method has been tested on both synthetic and real images.
Several experiments have been conducted on various images with
considerable background noise to reveal the efficacy and robustness.
Experimental results about the accuracy of the proposed method,
comparisons with Hough transform and its variants and other
tangential based methods are reported.
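The link between covariance eigenvalues and ellipse axes can be shown directly: for edge pixels spread uniformly over an ellipse boundary, the two eigenvalues of the 2x2 pixel covariance matrix are proportional to the squared major and minor semi-axes. A self-contained sketch of that estimation step (an illustration of the principle, not the paper's full pipeline):

```python
import math

def ellipse_axes(points):
    """Return the large and small eigenvalues of the 2x2 covariance matrix
    of a set of (x, y) edge pixels. For points sampled uniformly on an
    ellipse with semi-axes a >= b, they converge to a**2/2 and b**2/2."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / n
    syy = sum((y - my) ** 2 for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points) / n
    # eigenvalues of [[sxx, sxy], [sxy, syy]] via the characteristic equation
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    return tr / 2 + disc, tr / 2 - disc
```

Taking square roots of twice these eigenvalues recovers estimates of the axial lengths without evaluating tangents or curvature.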
Abstract: Data stream analysis is the process of computing
various summaries and derived values from large amounts of data
which are continuously generated at a rapid rate. The nature of a
stream does not allow a revisit on each data element. Furthermore,
data processing must be fast to produce timely analysis results. These
requirements impose constraints on the design of the algorithms to
balance correctness against timely responses. Several techniques
have been proposed over the past few years to address these
challenges. These techniques can be categorized as either data-oriented or task-oriented. The data-oriented approach analyzes a subset of the data or a smaller transformed representation, whereas the task-oriented scheme solves the problem directly via approximation techniques. We propose a hybrid approach to tackle the data stream analysis problem. The data stream is both statistically transformed to a smaller size and its characteristics are computationally approximated. We adopt a Monte Carlo method in the approximation
step. The data reduction has been performed horizontally and
vertically through our EMR sampling method. The proposed method
is analyzed by a series of experiments. We apply our algorithm on
clustering and classification tasks to evaluate the utility of our
approach.
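EMR sampling is the authors' own reduction method; as a generic illustration of the underlying constraint (single-pass sampling from a stream whose elements cannot be revisited), classic reservoir sampling can be sketched:

```python
import random

def reservoir_sample(stream, k, rng=None):
    """Single-pass reservoir sampling: keeps a uniform random sample of
    k items from a stream, touching each element exactly once."""
    rng = rng or random.Random()
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)          # fill the reservoir first
        else:
            j = rng.randint(0, i)        # inclusive; item kept with prob k/(i+1)
            if j < k:
                sample[j] = item
    return sample
```

Summaries such as cluster centroids or class frequencies can then be estimated on the sample instead of the full stream, trading exactness for timely responses.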
Abstract: In this paper we present a new method for coin
identification. The proposed method adopts a hybrid scheme using
Eigenvalues of the covariance matrix, the Circular Hough Transform (CHT) and Bresenham's circle algorithm. The statistical and geometrical
properties of the small and large Eigenvalues of the covariance
matrix of a set of edge pixels over a connected region of support are
explored for the purpose of circular object detection. Sparse matrix
technique is used to perform the CHT. Since sparse matrices squeeze out zero elements and contain only a small number of non-zero elements, they provide an advantage in matrix storage space and computational
time. Neighborhood suppression scheme is used to find the valid
Hough peaks. The accurate position of the circumference pixels is
identified using Raster scan algorithm which uses geometrical
symmetry property. After finding circular objects, the proposed method uses the texture on the surface of the coins, called textons; textons, which are unique properties of coins, refer to the fundamental micro-structures in generic natural images. This method has been tested on
several real world images including coin and non-coin images. The
performance is also evaluated based on the noise withstanding
capability.
Abstract: Localization is one of the critical issues in the field of
robot navigation. With an accurate estimate of the robot pose, robots will be capable of navigating in the environment autonomously and efficiently. In this paper, a hybrid Distributed Vision System (DVS)
for robot localization is presented. The presented approach integrates odometry data from the robot and images captured by overhead cameras installed in the environment to help reduce the possibility of failed localization due to the effects of illumination, accumulated encoder errors, and low-quality range data. An odometry-based motion model is applied to predict robot poses, and robot images captured by the overhead cameras are then used to update the pose estimates with an HSV histogram-based measurement model. Experimental results show the presented approach can localize robots in a global world coordinate system with localization errors within 100 mm.
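An HSV histogram measurement model compares the colour distribution of an observed robot image against a reference model. A minimal sketch of that comparison (the bin count and the intersection metric are illustrative assumptions, not the paper's exact model):

```python
def hsv_histogram(pixels, bins=8):
    """Quantize HSV pixels (h, s, v each in [0, 1)) into a joint histogram."""
    hist = [0] * (bins ** 3)
    for h, s, v in pixels:
        idx = (int(h * bins) * bins + int(s * bins)) * bins + int(v * bins)
        hist[idx] += 1
    return hist

def histogram_intersection(h1, h2):
    """Normalized histogram intersection in [0, 1]; 1 means identical
    colour distributions. Usable as a measurement likelihood proxy."""
    total = sum(h1) or 1
    return sum(min(a, b) for a, b in zip(h1, h2)) / total
```

In a localization update, candidate poses whose predicted appearance yields a high intersection score would be weighted more strongly.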
Abstract: In molecular biology, microarray technology is widely and successfully utilized to measure gene activity efficiently. When working with less-studied organisms, methods to design custom-made microarray probes are available. One design criterion is to select probes with minimal melting-temperature variance, thus ensuring similar hybridization properties. If the microarray application focuses on the investigation of metabolic pathways, it is not necessary to cover the whole genome; it is more efficient to cover each metabolic pathway with a limited number of genes. Firstly, an approach is presented which minimizes the overall melting-temperature variance of the selected probes for all genes of interest. Secondly, the approach is extended to include the additional constraint of covering all pathways with a limited number of genes while minimizing the overall variance. The new optimization problem is solved by a bottom-up programming approach which reduces the complexity to make it computationally feasible. As an example, the new method is applied to the selection of microarray probes covering all fungal secondary metabolite gene clusters of Aspergillus terreus.
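The first objective (one probe per gene, minimal melting-temperature variance) admits a simple heuristic sketch: for any fixed target temperature t, the best per-gene choice is the candidate Tm closest to t, so scanning t over a grid approximates the minimum-variance selection. This is an illustrative heuristic, not the paper's bottom-up algorithm:

```python
def select_probes(candidates, grid_step=0.1):
    """candidates: one list of candidate melting temperatures per gene.
    Returns (picks, variance): one Tm per gene with small overall variance,
    found by scanning a grid of target temperatures."""
    lo = min(min(c) for c in candidates)
    hi = max(max(c) for c in candidates)
    best, best_var = None, float("inf")
    t = lo
    while t <= hi + 1e-9:
        # for a fixed target t, the closest candidate per gene is optimal
        picks = [min(c, key=lambda x: abs(x - t)) for c in candidates]
        mean = sum(picks) / len(picks)
        var = sum((x - mean) ** 2 for x in picks) / len(picks)
        if var < best_var:
            best, best_var = picks, var
        t += grid_step
    return best, best_var
```

The pathway-coverage constraint of the second objective would restrict which genes must contribute a pick, which is where the paper's complexity-reducing formulation comes in.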
Abstract: Permanent magnet synchronous machines are known to be good candidates for hybrid electric vehicles due to their unique merits. However, they have two major drawbacks, i.e., high cost and a narrow speed range. In this paper an optimal design of a permanent magnet machine is presented. A reduction of permanent magnet material for a constant torque and an extension of the speed and torque ranges are chosen as the optimization aims. For this purpose the
analytical model of the permanent magnet synchronous machine is
derived and the appropriate design algorithm is devised. The genetic
algorithm is then employed to optimize some machine specifications.
Finally the finite element method is used to validate the designed
machine.
Abstract: The purpose of grid computing is to utilize the computational power of idle resources which are distributed in different areas. Given the dynamism of the grid and its decentralized resources, there is a need for an efficient scheduler for scheduling applications. Since task scheduling is an NP-hard problem, various studies have focused on heuristic algorithms, especially genetic ones. However, since the genetic algorithm searches the problem space globally and does not have the efficiency required for local searching, its combination with local search algorithms can compensate for this shortcoming. The aim of this paper is to combine the genetic algorithm and GELS (GAGELS) as a method for solving the scheduling problem that simultaneously pays attention to two factors: time and the number of missed tasks. Results show that the proposed algorithm can decrease makespan while minimizing the number of missed tasks compared with the traditional methods.
Abstract: Tumour suppressors are key participants in the
prevention of cancer. Regulation of their expression through
miRNAs is important for comprehensive translation inhibition of
tumour suppressors and elucidation of carcinogenesis mechanisms.
We studied the ability of 1,521 miRNAs to bind to 873 mRNAs of human tumour suppressors using the RNAHybrid 2.1 and ERNAhybrid programmes. Only 978 miRNAs were found to be translational regulators of 812 mRNAs, and 61 mRNAs did not have any miRNA binding sites. Additionally, 45.9% of all miRNA binding sites were located in coding sequences (CDSs), 33.8% were located in the 3' untranslated region (UTR), and 20.3% were located in the 5'UTR. MiRNAs binding to more than 50 target mRNAs and mRNAs binding several miRNAs were selected. Hsa-miR-5096 had 15 perfectly complementary binding sites in the mRNAs of 14 tumour suppressors. These newly identified miRNA binding sites
can be used in the development of medicines (anti-sense therapies)
for cancer treatment.
Abstract: A novel concept for balancing and trading off between make-to-stock (MTS) and make-to-order (MTO) is the hybrid MTS/MTO production context. One of the most important decisions involved in the hybrid MTS/MTO environment is determining whether a product is manufactured to stock, to order, or via a hybrid MTS/MTO strategy. In this paper, a model based on the analytic network process is developed to tackle this decision. Since the regarded decision deals with the uncertainty and ambiguity of data as well as experts' and managers' linguistic judgments, the proposed model is equipped with fuzzy set theory. An important attribute of the model is its generality, owing to the diverse decision factors which are elicited from the literature and developed by the authors. Finally, the model is validated by applying it to a real case study to reveal how the proposed model can actually be implemented.
Abstract: In aerospace applications, interactions of airflow with
aircraft structures can result in undesirable structural deformations.
This structural deformation in turn, can be predicted if the natural
modes of the structure are known. This can be achieved through
conventional modal testing that requires a known excitation force in
order to extract these dynamic properties. This technique can be experimentally complex because of the need for artificial excitation, and it also does not represent actual operational conditions. The current work presents part of a research effort that addresses the practical implementation of operational modal analysis (OMA) applied to a
cantilevered hybrid composite plate employing a single contactless sensing system via a laser vibrometer. The OMA technique extracts the
modal parameters based only on the measurements of the dynamic
response. The OMA results were verified with impact hammer modal
testing and good agreement was obtained.
Abstract: This work presents a new algorithm based on a combination of fuzzy logic (FUZ), Dynamic Programming (DP), and a Genetic Algorithm (GA) for capacitor allocation in distribution feeders. The problem formulation considers two distinct objectives related to the total cost of power loss and the total cost of capacitors, including the purchase and installation costs. The novel formulation is a multi-objective and non-differentiable optimization problem. The proposed method uses fuzzy reasoning for the siting of capacitors in radial distribution feeders, DP for sizing, and finally GA for finding the optimum shape of the membership functions used in the fuzzy reasoning stage. The proposed method has been implemented in a software package and its effectiveness has been verified through a 9-bus radial distribution feeder to support the conclusions. A comparison between the proposed method and similar methods in other research works shows the effectiveness of the proposed method for solving the optimum capacitor planning problem.