Abstract: Multicarrier code-division multiple access (MC-CDMA) is an
effective technique for providing multiple-access capability, robustness
against fading, and mitigation of intersymbol interference (ISI). In this
paper, we propose an improved multicarrier CDMA system with adaptive
subchannel allocation. We analyze the performance of the proposed system
in a frequency-selective fading environment with narrowband interference
present, and compare it with that of parallel transmission over many
subchannels (i.e., the conventional MC-CDMA scheme) and of a DS-CDMA
system. Simulation results show that when the adaptive subchannel
allocation scheme is used in a conventional multicarrier CDMA system,
performance is greatly improved.
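As a rough illustration of the allocation idea (not the authors' algorithm), the sketch below picks, for one user, the subcarriers with the best instantaneous SNR under a toy fading-plus-jamming model; all constants are hypothetical:

```python
import random

random.seed(1)

N_SUB = 32      # total subcarriers
N_USED = 8      # subchannels assigned per user

# Simplified per-subchannel SNR model: Rayleigh-like fading power plus
# strong narrowband interference on a few subcarriers (hypothetical values).
gain = [random.expovariate(1.0) for _ in range(N_SUB)]          # |H_k|^2
interference = [0.0] * N_SUB
for k in random.sample(range(N_SUB), 4):                        # 4 jammed bins
    interference[k] = 10.0
snr = [g / (1.0 + i) for g, i in zip(gain, interference)]

# Conventional MC-CDMA: chips spread over a fixed block of subchannels.
fixed = list(range(N_USED))

# Adaptive allocation: pick the N_USED subchannels with the highest SNR.
adaptive = sorted(range(N_SUB), key=lambda k: snr[k], reverse=True)[:N_USED]

avg_fixed = sum(snr[k] for k in fixed) / N_USED
avg_adaptive = sum(snr[k] for k in adaptive) / N_USED
print(avg_fixed, avg_adaptive)
```

By construction the adaptive set can never average worse than a fixed block, which is the intuition behind the reported gain.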
Abstract: This paper presents development techniques
for a complete autonomous design model of an advanced train
control system and gives a new approach to the
implementation of a multi-agent-based system. This research
proposes a novel control system to enhance the
efficiency of vehicles under the constraints of various
conditions, and contributes to stability and controllability,
considering relevant safety and operational
requirements together with command, control, and communication
functions and various sensors to avoid accidents. An approach to speed
scheduling, management, and control in local and distributed
environments is given to meet the pressing needs of modern practice
and to advance automated vehicle control systems. These
techniques build on state-of-the-art microelectronic
technology, with accuracy and stability as forefront goals.
Abstract: Rapid prototyping (RP) techniques are a group of
advanced manufacturing processes that can produce custom-made
objects directly from computer data such as Computer Aided Design
(CAD), Computed Tomography (CT) and Magnetic Resonance
Imaging (MRI) data. Using RP fabrication techniques, constructs
with controllable and complex internal architecture with appropriate
mechanical properties can be achieved. One of the most attractive and
promising applications of RP techniques is tissue engineering
(TE) scaffold fabrication. A tissue engineering scaffold is a 3D
construct that acts as a template for tissue regeneration. Although
several conventional techniques, such as solvent casting and gas
foaming, are used in scaffold fabrication, these processes yield
scaffolds with poor interconnectivity and uncontrollable porosity.
RP techniques have therefore become the leading alternative
methods for fabricating TE scaffolds. This paper reviews the current
state of the art in tissue engineering scaffold fabrication using
advanced RP processes, as well as the current limitations and future
trends in scaffold fabrication RP techniques.
Abstract: In this article, an evolutionary technique is used
for the solution of nonlinear Riccati differential equations of fractional
order. In this method, a genetic algorithm serves as a competent global
search tool, hybridized with an active-set algorithm for efficient local
search. The proposed method has been successfully applied to solve
different forms of Riccati differential equations. The strength of the
proposed method lies in its equal applicability to the integer-order case
as well as the fractional-order case. The method has been compared with
standard numerical techniques as well as analytic solutions. It is found
that the designed method can provide the solution to the equation
with better accuracy than its deterministic counterparts.
Another advantage of the given approach is that it provides results over
the entire finite continuous domain, unlike numerical methods
that provide solutions only on a discrete grid of points.
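The hybrid global/local idea can be sketched on the simplest integer-order case, y' = 1 - y², y(0) = 0 (exact solution tanh t). The GA settings below are illustrative, and a plain coordinate-descent polish stands in for the paper's active-set step:

```python
import random
random.seed(0)

# Collocation points on (0, 1]; the trial solution y(t) = sum_j c_j t^(j+1)
# automatically satisfies y(0) = 0.
T = [i / 20 for i in range(1, 21)]
DIM = 4

def residual_mse(c):
    # Mean squared residual of the Riccati equation y' = 1 - y^2.
    s = 0.0
    for t in T:
        y  = sum(cj * t ** (j + 1) for j, cj in enumerate(c))
        dy = sum((j + 1) * cj * t ** j for j, cj in enumerate(c))
        s += (dy - 1.0 + y * y) ** 2
    return s / len(T)

# --- Genetic algorithm: global search over the coefficients ---
pop = [[random.uniform(-2, 2) for _ in range(DIM)] for _ in range(40)]
for _ in range(150):
    pop.sort(key=residual_mse)
    nxt = pop[:4]                                    # elitism
    while len(nxt) < len(pop):
        a, b = random.sample(pop[:20], 2)            # truncation selection
        child = [(x + y) / 2 for x, y in zip(a, b)]  # arithmetic crossover
        if random.random() < 0.5:                    # Gaussian mutation
            k = random.randrange(DIM)
            child[k] += random.gauss(0, 0.2)
        nxt.append(child)
    pop = nxt
best = min(pop, key=residual_mse)

# --- Local refinement (coordinate descent, standing in for active-set) ---
step = 0.1
while step > 1e-5:
    improved = False
    for k in range(DIM):
        for d in (step, -step):
            trial = best[:]
            trial[k] += d
            if residual_mse(trial) < residual_mse(best):
                best, improved = trial, True
    if not improved:
        step /= 2

y1 = sum(best)              # y(1); the exact value is tanh(1) = 0.7616...
print(residual_mse(best), y1)
```

The same residual-minimization fitness extends to the fractional case once a fractional derivative of the trial polynomial is substituted for dy.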
Abstract: Electromagnetic interference (EMI) is one of the
serious problems in most electrical and electronic appliances
including fluorescent lamps. The electronic ballast used to regulate
the power flow through the lamp is the major cause for EMI. The
interference is caused by the high-frequency switching operation of
the ballast. Previously, some EMI mitigation techniques were in
practice, but they were unsatisfactory because of the hardware
complexity of the circuit design, increased parasitic components,
increased power consumption, and so on. Most researchers have
focused only on EMI mitigation without considering other
constraints such as cost, effective operation of the equipment, etc. In
this paper, we propose a technique for EMI mitigation in fluorescent
lamps by integrating Frequency Modulation and Evolutionary
Programming. With the Frequency Modulation technique, the
switching at a single central frequency is extended to a range of
frequencies, so that the power is distributed throughout the range of
frequencies, leading to EMI mitigation. However, to meet the
operating frequency of the ballast and the operating power of the
fluorescent lamps, an optimal modulation index is necessary for
Frequency Modulation. The optimal modulation index is determined
using Evolutionary Programming. Thereby, the proposed technique
mitigates the EMI to a satisfactory level without disturbing the
operation of the fluorescent lamp.
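The spectral effect of spreading the switching frequency can be illustrated numerically. In the sketch below (all frequencies are hypothetical, and a linear sweep stands in for the optimized modulation), the peak of the DFT magnitude drops once the same power is spread over a band:

```python
import cmath, math

N = 512            # samples
FS = 256_000.0     # sample rate (Hz), chosen so 50 kHz falls on a DFT bin
F0 = 50_000.0      # nominal ballast switching frequency (hypothetical)
DEV = 10_000.0     # peak frequency deviation for the modulated case

def spectrum_peak(signal):
    # Peak magnitude of the DFT (naive O(N^2), adequate for this sketch).
    n = len(signal)
    peak = 0.0
    for k in range(n // 2):
        s = sum(x * cmath.exp(-2j * math.pi * k * i / n)
                for i, x in enumerate(signal))
        peak = max(peak, abs(s))
    return peak

ts = [i / FS for i in range(N)]

# Fixed-frequency switching: all power concentrated at F0.
fixed = [math.sin(2 * math.pi * F0 * t) for t in ts]

# Frequency-modulated switching: linear sweep F0-DEV .. F0+DEV over the
# window, spreading the same power across a band of frequencies.
T = N / FS
fm = [math.sin(2 * math.pi * ((F0 - DEV) * t + DEV * t * t / T)) for t in ts]

peak_fixed = spectrum_peak(fixed)
peak_fm = spectrum_peak(fm)
print(peak_fixed, peak_fm)
```

The modulation index (here, DEV) controls how widely the power is spread, which is exactly the parameter the abstract proposes to optimize with Evolutionary Programming.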
Abstract: The analysis of the Acoustic Emission (AE) signal
generated by metal cutting processes has often been approached
statistically. This is due to the stochastic nature of the emission
signal, a result of the factors affecting the signal from its generation
through transmission and sensing. Different techniques are applied in
this manner, each of which is suitable for certain processes. In metal
cutting, where the emission generated by the deformation process is
rather continuous, an appropriate method for analysing the AE signal
based on its root mean square (RMS) is often used and
is suitable for use with conventional signal processing systems.
The aim of this paper is to set out a strategy for tool failure detection in
turning processes via statistical analysis of the AE generated at
the cutting zone. The strategy is based on investigating the
distribution moments of the AE signal at predetermined sampling intervals.
The skewness and kurtosis of these distributions are the key elements in
the detection. A normal (Gaussian) distribution was first
suggested but was then rejected as insufficient. The so-called
Beta distribution was then considered; used with
an assumed β density function, it has given promising results with
regard to chipping and tool breakage detection.
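The moment-based detection idea can be sketched as follows: skewness and kurtosis are computed over windows of simulated AE-RMS samples, and a window containing chipping-like bursts shows a sharp rise in both. The Beta shape parameters and burst amplitude below are hypothetical:

```python
import random, math
random.seed(42)

def moments(x):
    # Sample mean, variance, skewness, and excess kurtosis of a window.
    n = len(x)
    m = sum(x) / n
    v = sum((xi - m) ** 2 for xi in x) / n
    sd = math.sqrt(v)
    skew = sum((xi - m) ** 3 for xi in x) / (n * sd ** 3)
    kurt = sum((xi - m) ** 4 for xi in x) / (n * v ** 2) - 3.0
    return m, v, skew, kurt

# Steady cutting: AE-RMS samples drawn from a Beta distribution
# (hypothetical shape parameters).
steady = [random.betavariate(4, 6) for _ in range(4000)]

# Chipping event: same background plus a few large transient bursts.
chipping = steady[:3990] + [5.0] * 10

_, _, sk_s, ku_s = moments(steady)
_, _, sk_c, ku_c = moments(chipping)
print(ku_s, ku_c)
```

Because the fourth moment amplifies rare large deviations, kurtosis reacts strongly to short transients such as chipping, which is what makes it a useful detection feature.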
Abstract: Biometric methods include recognition techniques
based on the fingerprint, iris, hand geometry, voice, face, ears, and gait.
The gait recognition approach has some advantages: for example, it
does not require the prior cooperation of the observed subject, and it can
record many biometric features for deeper analysis, but
most research proposals incur a high computational cost. This
paper presents a gait recognition system based on feature extraction over a
bounding rectangle drawn around the observed person. Statistical results
on a database of 500 videos are shown.
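A minimal sketch of the bounding-rectangle feature idea follows; the binary silhouette grid is a toy stand-in for a segmented video frame, and the aspect ratio is one example of a cheap per-frame feature:

```python
# Silhouette as a binary grid (1 = foreground pixel after background
# subtraction); a real system would obtain this from video frames.
frame = [
    [0, 0, 0, 0, 0, 0],
    [0, 0, 1, 1, 0, 0],
    [0, 1, 1, 1, 1, 0],
    [0, 0, 1, 1, 0, 0],
    [0, 1, 0, 0, 1, 0],
    [0, 0, 0, 0, 0, 0],
]

def bounding_rectangle(grid):
    # Smallest axis-aligned rectangle enclosing all foreground pixels.
    rows = [r for r, line in enumerate(grid) if any(line)]
    cols = [c for c in range(len(grid[0])) if any(line[c] for line in grid)]
    return rows[0], cols[0], rows[-1], cols[-1]   # top, left, bottom, right

top, left, bottom, right = bounding_rectangle(frame)
height = bottom - top + 1
width = right - left + 1
aspect = width / height      # one simple per-frame gait feature
print((top, left, bottom, right), aspect)
```

Tracking such features over a walking cycle yields the low-cost time series that the proposed system analyses statistically.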
Abstract: In this paper, a procedure for the split-pipe design of looped water distribution networks based on simulated annealing is proposed. Simulated annealing is a heuristic search algorithm motivated by an analogy with physical annealing in solids, and it is capable of solving combinatorial optimization problems. In contrast to split-pipe designs derived from a continuous-diameter design, as implemented in conventional optimization techniques, the split-pipe design proposed in this paper is derived from a discrete-diameter design in which a set of pipe diameters is chosen directly from a specified set of commercial pipes. The optimality and feasibility of the solutions are guaranteed by the proposed method. The performance of the proposed procedure is demonstrated by solving three well-known water distribution network problems taken from the literature. Simulated annealing provides very promising solutions, and the lowest-cost solutions are found for all of these test problems. The results obtained from these applications show that simulated annealing is able to handle the combinatorial optimization problem of least-cost water distribution network design. The technique can be considered an alternative tool for similar areas of research. Further applications and improvements of the technique are expected as well.
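The discrete-diameter search can be sketched on a toy series network with a Hazen-Williams head-loss budget enforced by a penalty. The pipe sizes, costs, and hydraulic constants below are hypothetical, and this is far simpler than the paper's looped-network model:

```python
import math, random
random.seed(7)

# Toy series network: 5 pipes of 100 m each carrying a single flow.
# Commercial diameters (m) and unit costs ($/m) are hypothetical.
DIAMS = [0.10, 0.15, 0.20, 0.25, 0.30]
COSTS = [8.0, 15.0, 25.0, 40.0, 60.0]
L, Q, C_HW = 100.0, 0.02, 130.0     # length, flow (m^3/s), Hazen-Williams C
H_MAX = 30.0                        # allowable total head loss (m)

def head_loss(design):
    # Hazen-Williams head loss summed over the series pipes.
    return sum(10.67 * L * Q ** 1.852 / (C_HW ** 1.852 * DIAMS[i] ** 4.87)
               for i in design)

def cost(design):
    return sum(COSTS[i] * L for i in design)

def penalized(design):
    # Infeasible designs pay a large penalty proportional to the violation.
    excess = max(0.0, head_loss(design) - H_MAX)
    return cost(design) + 1e6 * excess

# Start from the all-largest design (feasible but expensive).
current = [len(DIAMS) - 1] * 5
best = current[:]
temp = 5000.0
while temp > 1.0:
    trial = current[:]
    trial[random.randrange(5)] = random.randrange(len(DIAMS))  # resize one pipe
    d = penalized(trial) - penalized(current)
    if d < 0 or random.random() < math.exp(-d / temp):         # Metropolis rule
        current = trial
        if penalized(current) < penalized(best):
            best = current[:]
    temp *= 0.995
print(cost(best), head_loss(best))
```

The Metropolis acceptance of occasional uphill moves at high temperature is what lets the search escape local minima of the discrete design space before the geometric cooling freezes it.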
Abstract: The effect of chemical treatment in CdCl2 and thermal
annealing at 400 °C on the defect structures of potentially useful
ZnS/CdS solar cell thin films, deposited onto quartz substrates and
prepared by the vacuum deposition method, was studied using
thermoluminescence (TL) techniques. A series of electron and hole
traps were found in the various deposited samples studied. After
annealing, however, it was observed that the intensity and activation
energy of the TL signal increase, with loss of the low-temperature
electron traps.
Abstract: Intelligent systems based on machine learning
techniques, such as classification and clustering, are gaining widespread
popularity in real-world applications. This paper presents work on
developing a software system for predicting crop yield, for example
oil-palm yield, from climate and plantation data. At the core of our
system is a method for unsupervised partitioning of data for finding
spatio-temporal patterns in climate data using kernel methods which
offer the strength to deal with complex data. This work draws inspiration
from the notion that a non-linear data transformation into some high
dimensional feature space increases the possibility of linear
separability of the patterns in the transformed space. Therefore, it
simplifies exploration of the associated structure in the data. Kernel
methods implicitly perform a non-linear mapping of the input data
into a high dimensional feature space by replacing the inner products
with an appropriate positive definite function. In this paper we
present a robust weighted kernel k-means algorithm incorporating
spatial constraints for clustering the data. The proposed algorithm
can effectively handle noise, outliers and auto-correlation in the
spatial data, for effective and efficient data analysis by exploring
patterns and structures in the data, and thus can be used for
predicting oil-palm yield by analyzing various factors affecting the
yield.
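The clustering core can be sketched as follows: weighted kernel k-means with an RBF kernel on toy 2-D data. The spatial-constraint and outlier-handling terms of the proposed algorithm are omitted, and the data, weights, and kernel width are hypothetical:

```python
import math, random
random.seed(3)

# Two well-separated 2-D blobs (hypothetical stand-ins for climate records).
data = ([(random.gauss(0, 0.5), random.gauss(0, 0.5)) for _ in range(30)] +
        [(random.gauss(8, 0.5), random.gauss(8, 0.5)) for _ in range(30)])
weights = [1.0] * len(data)        # per-point weights (all equal here)
SIGMA = 2.0

def rbf(a, b):
    d2 = (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return math.exp(-d2 / (2 * SIGMA ** 2))

K = [[rbf(a, b) for b in data] for a in data]
n = len(data)

# Seed labels from two mutually distant points, then iterate the weighted
# kernel k-means assignment rule.
labels = [0 if K[i][0] >= K[i][n - 1] else 1 for i in range(n)]
for _ in range(10):
    stats = []
    for c in (0, 1):
        idx = [j for j in range(n) if labels[j] == c]
        w = sum(weights[j] for j in idx)
        # Weighted self-similarity term of the cluster centroid in
        # feature space: sum_{j,l} w_j w_l K_jl / w^2.
        self_term = sum(weights[j] * weights[l] * K[j][l]
                        for j in idx for l in idx) / (w * w)
        stats.append((idx, w, self_term))
    new = []
    for i in range(n):
        dists = []
        for idx, w, self_term in stats:
            # ||phi(x_i) - m_c||^2 up to the constant K_ii term.
            cross = sum(weights[j] * K[i][j] for j in idx) / w
            dists.append(-2 * cross + self_term)
        new.append(0 if dists[0] <= dists[1] else 1)
    if new == labels:
        break
    labels = new
print(labels)
```

All distances are computed through kernel entries only, so the non-linear mapping into feature space is never formed explicitly; the per-point weights are where noise and auto-correlation handling would enter.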
Abstract: This article is set in the context of rural communities
in Brazil, where, as in many other emerging countries,
rapidly growing markets and overcrowded cities are
leaving behind informal settlements based on obsolete agricultural
economies and techniques. The pilot project for the community of
Goiabeira reflects the attempt to imagine a development model that
privileges the actual improvement of living conditions, education
and training, and the social inclusion and participation of the dwellers of
rural communities. Through the inclusion of operative public space,
the aim is for these communities to become self-sustaining, encouraging
the use of local resources for appropriate architectural, ecological, and
energy technologies and devices that are efficient and affordable, and
that foster community participation while respecting the surrounding
environment.
Abstract: With the rapid growth in business size, today's businesses are orienting towards electronic technologies. Amazon.com and eBay.com are some of the major stakeholders in this regard. Unfortunately, the enormous amount of hugely unstructured data on the web, even for a single commodity, has become a cause of ambiguity for consumers. Extracting valuable information from such ever-increasing data is an extremely tedious task and is fast becoming critical to the success of businesses. Web content mining can play a major role in solving these issues. It involves using efficient algorithmic techniques to search and retrieve the desired information from the seemingly unsearchable, unstructured data on the Internet. Applications of web content mining can be very encouraging in the areas of customer relations modeling, billing records, logistics investigations, product cataloguing, and quality management. In this paper we present a review of some very interesting, efficient, yet implementable techniques from the field of web content mining and study their impact in areas specific to business user needs, focusing on the customer as well as the producer. The techniques reviewed include mining by developing a knowledge-base repository of the domain, iterative refinement of user queries for personalized search, a graph-based approach to the development of a web crawler, and filtering information for personalized search using website captions. These techniques have been analyzed and compared on the basis of their execution time and the relevance of the results they produce for a particular search.
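One of the reviewed ideas, graph-based crawling combined with content filtering, can be sketched as a breadth-first walk over a toy link graph; the pages, links, and text below are hypothetical:

```python
from collections import deque

# Hypothetical link graph and page text standing in for live web pages.
links = {
    "home": ["catalog", "about"],
    "catalog": ["item1", "item2"],
    "about": [],
    "item1": ["item2"],
    "item2": [],
}
text = {
    "home": "welcome to the store",
    "catalog": "product listing camera phone",
    "about": "company history",
    "item1": "camera 12 megapixel best price",
    "item2": "phone dual sim best price",
}

def focused_crawl(start, keyword):
    # Breadth-first traversal of the link graph, keeping only pages whose
    # content matches the query term (the content-filtering idea).
    seen, hits = {start}, []
    queue = deque([start])
    while queue:
        page = queue.popleft()
        if keyword in text[page]:
            hits.append(page)
        for nxt in links[page]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return hits

print(focused_crawl("home", "price"))
```

A production crawler would fetch and parse real pages and score relevance more carefully, but the graph traversal plus filter is the structural core the review discusses.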
Abstract: Initial values of reference vectors have a significant influence on recognition accuracy in LVQ. There are several existing techniques, such as SOM and k-means, for setting initial values of reference vectors, each of which has produced some positive results. However, those results are not sufficient for the improvement of recognition accuracy. This study proposes an ACO-based method for initializing reference vectors, with the aim of achieving recognition accuracy higher than that obtained through conventional methods. Moreover, we demonstrate the effectiveness of the proposed method by applying it to the wine data and English vowel data and comparing its results with those of conventional methods.
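For context, the LVQ1 rule whose accuracy depends on initialization can be sketched as follows; here the prototypes are simply placed at the class means (standing in for the proposed ACO initialization), and the 1-D data are hypothetical:

```python
import random
random.seed(5)

# Two labelled 1-D classes (hypothetical data).
data = ([(random.gauss(0.0, 0.3), 0) for _ in range(50)] +
        [(random.gauss(4.0, 0.3), 1) for _ in range(50)])
random.shuffle(data)

# Reference vectors: one prototype per class, initialized at the class
# means (the role the paper assigns to ACO).
protos = [(0.0, 0), (4.0, 1)]

# LVQ1 update: move the nearest prototype toward the sample if the labels
# match, away from it otherwise.
alpha = 0.1
for epoch in range(20):
    for x, y in data:
        i = min(range(len(protos)), key=lambda k: abs(protos[k][0] - x))
        p, lab = protos[i]
        if lab == y:
            p += alpha * (x - p)
        else:
            p -= alpha * (x - p)
        protos[i] = (p, lab)
    alpha *= 0.9

correct = sum(1 for x, y in data
              if min(protos, key=lambda pr: abs(pr[0] - x))[1] == y)
accuracy = correct / len(data)
print(protos, accuracy)
```

Because the update only refines prototypes locally, a poor starting placement can trap them on the wrong side of the decision boundary, which is why the initialization step matters.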
Abstract: This paper describes an approach to predicting
incoming and outgoing data rates in a network system by using
association rule discovery, one of the data mining
techniques. Information on incoming and outgoing data at each
time, together with network bandwidth, constitutes the network
performance parameters needed to address traffic problems, since
congestion and data loss are important network problems. The results
of this technique can predict future network traffic. In addition,
this research is useful for network routing selection and network
performance improvement.
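The association-rule machinery behind such an approach can be sketched with support and confidence computed over a toy log of per-slot traffic conditions; the item names and slots below are hypothetical:

```python
# Hypothetical log: each entry is the set of traffic conditions observed
# in one time slot.
slots = [
    {"high_in", "high_out", "congestion"},
    {"high_in", "high_out", "congestion"},
    {"high_in", "low_out"},
    {"low_in", "low_out"},
    {"high_in", "high_out", "congestion"},
    {"low_in", "high_out"},
]

def support(itemset):
    # Fraction of slots containing every item in the set.
    return sum(1 for s in slots if itemset <= s) / len(slots)

def confidence(lhs, rhs):
    # P(rhs | lhs): the usual association-rule confidence measure.
    return support(lhs | rhs) / support(lhs)

sup = support({"high_in", "congestion"})
conf = confidence({"high_in", "high_out"}, {"congestion"})
print(sup, conf)
```

A rule such as {high_in, high_out} → {congestion} with high confidence is what lets past traffic observations predict future load for routing decisions.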
Abstract: Within the healthcare system, training and continued professional development, although essential, can be affected by cost and logistical restraints due to the nature of healthcare provision, e.g. employee shift patterns, access to expertise, cost factors in releasing staff to attend training, etc. The use of multimedia technology for the development of e-learning applications is also a major cost consideration for healthcare management staff, and this type of media, whether optical or online, requires careful planning in order to remain inclusive of all staff with potentially varied access to multimedia computing. This paper discusses a project in which DVD authoring technology has been successfully implemented to meet the needs of distance learning and user considerations, based on film production techniques and reduced product turnaround deadlines.
Abstract: An application framework provides a reusable design
and implementation for a family of software systems. If the
framework contains defects, the defects will be passed on to the
applications developed from the framework. Framework defects are
hard to discover at the time the framework is instantiated. Therefore,
it is important to remove all defects before instantiating the
framework. In this paper, two measures for the adequacy of an
object-oriented system-based testing technique are introduced. The
measures assess the usefulness and uniqueness of the testing
technique. The two measures are applied to experimentally compare
the adequacy of two testing techniques introduced to test object-oriented
frameworks at the system level. The two considered testing
techniques are the New Framework Test Approach and Testing
Frameworks Through Hooks (TFTH). The techniques are also
compared analytically in terms of their coverage of object-oriented
aspects. The comparison study results show that the TFTH
technique is better than the New Framework Test Approach in terms
of usefulness degree, uniqueness degree, and coverage power.
Abstract: The literature reports a large number of approaches for
measuring the similarity between protein sequences. Most of these
approaches estimate this similarity using alignment-based techniques
that do not necessarily yield biologically plausible results, for two
reasons.
First, for the case of non-alignable (i.e., not yet definitively aligned
and biologically approved) sequences such as multi-domain, circular
permutation and tandem repeat protein sequences, alignment-based
approaches do not succeed in producing biologically plausible results.
This is due to the nature of the alignment, which is based on the
matching of subsequences in equivalent positions, while non-alignable
proteins often have similar and conserved domains in non-equivalent
positions.
Second, the alignment-based approaches lead to similarity measures
that depend heavily on the parameters set by the user for the alignment
(e.g., gap penalties and substitution matrices). For easily alignable
protein sequences, it is possible to supply a suitable combination of
input parameters that allows such an approach to yield biologically
plausible results. However, for difficult-to-align protein sequences,
supplying different combinations of input parameters yields different
results. Such variable results create ambiguities and complicate the
similarity measurement task.
To overcome these drawbacks, this paper describes a novel and
effective approach for measuring the similarity between protein
sequences, called SAF for Substitution and Alignment Free. Without
resorting either to the alignment of protein sequences or to substitution
relations between amino acids, SAF is able to efficiently detect the
significant subsequences that best represent the intrinsic properties of
protein sequences, those underlying the chronological dependencies of
structural features and biochemical activities of protein sequences.
Moreover, by using a new efficient subsequence matching scheme,
SAF more efficiently handles protein sequences that contain similar
structural features with significant meaning in chronologically
non-equivalent positions. To show the effectiveness of SAF, extensive
experiments were performed on protein datasets from different
databases, and the results were compared with those obtained by
several mainstream algorithms.
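For orientation only, a generic alignment-free similarity (k-mer profile cosine, not the SAF scheme itself) already shows why such measures handle domains in non-equivalent positions: swapping two "domains" barely changes the k-mer profile, while an unrelated sequence scores zero. The sequences below are hypothetical:

```python
import math
from collections import Counter

def kmer_profile(seq, k=3):
    # Counts of all overlapping length-k subsequences.
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine_similarity(a, b, k=3):
    # Cosine of the angle between the two k-mer count vectors; no
    # alignment and no substitution matrix is involved.
    pa, pb = kmer_profile(a, k), kmer_profile(b, k)
    dot = sum(pa[w] * pb[w] for w in pa)
    na = math.sqrt(sum(v * v for v in pa.values()))
    nb = math.sqrt(sum(v * v for v in pb.values()))
    return dot / (na * nb)

# Two sequences sharing the same "domains" in swapped (non-equivalent)
# positions, plus one unrelated sequence.
s1 = "MKTAYIAKQRQISFVK" + "GGHHGGHHGG"
s2 = "GGHHGGHHGG" + "MKTAYIAKQRQISFVK"
s3 = "PLPLPLPLPLPLPLPLPLPLPLPLPL"
print(cosine_similarity(s1, s2), cosine_similarity(s1, s3))
```

A positional alignment of s1 against s2 would score poorly despite the shared content; a profile-based measure does not, which is the behaviour SAF refines with its significant-subsequence detection and matching scheme.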
Abstract: The MATCH project [1] entails the development of an
automatic diagnosis system that aims to support the treatment of colon
cancer by discovering mutations that occur in tumour
suppressor genes (TSGs) and contribute to the development of
cancerous tumours. The system is based on (a)
colon cancer clinical data and (b) biological information that will be
derived by data mining techniques from genomic and proteomic
sources. The core mining module will consist of popular, well-tested
hybrid feature extraction methods and new combined
algorithms designed especially for the project. Elements of rough
sets, evolutionary computing, cluster analysis, self-organizing maps,
and association rules will be used to discover the associations
between genes and their influence on tumours [2]-[11].
The methods used to process the data have to address their high
complexity, potential inconsistency and problems of dealing with the
missing values. They must integrate all the useful information
necessary to answer the expert's question. For this purpose, the system
has to learn from data, or allow a domain specialist to interactively
specify, the part of the knowledge structure it needs to answer a
given query. The program should also take into account the
importance/rank of the particular parts of the data it analyses, and
adjust the algorithms used accordingly.
Abstract: In this paper, based on linear matrix inequalities (LMIs) and Lyapunov functional theory, an exponential stability criterion is obtained for a class of uncertain Takagi-Sugeno fuzzy Hopfield neural networks (TSFHNNs) with time delays. We choose a generalized Lyapunov functional and introduce a parameterized model transformation with free weighting matrices; these techniques lead to a generalized, less conservative stability condition that guarantees a wider stability region. Finally, an example is given to illustrate our results using the MATLAB LMI toolbox.
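For orientation, the delay-free textbook case of the LMI machinery that such delay-dependent criteria generalize can be stated as follows (this is not the paper's TSFHNN criterion):

```latex
\[
  V(x) = x^{\top} P x , \qquad P = P^{\top} \succ 0 ,
\]
\[
  \dot{V}(x) = x^{\top}\!\left( A^{\top} P + P A \right) x ,
\]
so asymptotic stability of $\dot{x}(t) = A x(t)$ holds if and only if the LMI
\[
  A^{\top} P + P A \prec 0 , \qquad P \succ 0 ,
\]
is feasible, a condition that semidefinite programming (e.g., the MATLAB
LMI toolbox) checks numerically. Delay-dependent criteria of the kind in
the abstract replace $V$ by a Lyapunov--Krasovskii functional containing
integral terms over the delay interval, with free weighting matrices
introduced to reduce conservatism.
```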
Abstract: Data mining uses a variety of techniques, each of which is useful for some particular task. It is important to have a deep understanding of each technique and to be able to perform sophisticated analysis. In this article we describe a tool built to simulate a variation of the Kohonen network to perform unsupervised clustering and to support the entire data mining process up to results visualization. A graphical representation helps the user find a strategy to optimize classification by adding, moving, or deleting a neuron in order to change the number of classes. The tool is also able to automatically suggest a strategy for optimizing the number of classes. The tool is used to classify macroeconomic data reporting the most developed countries' imports and exports. It is possible to classify the countries based on their economic behaviour and to use an ad hoc tool to characterize the commercial behaviour of a country in a selected class, based on the analysis of the positive and negative features that contribute to class formation.
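The Kohonen-style update at the heart of such a tool can be sketched with a two-neuron map on toy 2-D data; this is a winner-take-all simplification with the neighbourhood function shrunk to zero, and the data and learning rates are hypothetical:

```python
import random
random.seed(11)

# 2-D observations from two groups (stand-ins for countries' import/export
# indicators); the values are hypothetical.
data = ([(random.gauss(1, 0.2), random.gauss(1, 0.2)) for _ in range(25)] +
        [(random.gauss(5, 0.2), random.gauss(5, 0.2)) for _ in range(25)])

# A tiny map with two neurons (i.e., two classes).
neurons = [(0.0, 0.0), (6.0, 6.0)]

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

rate = 0.3
for epoch in range(30):
    for x in data:
        # Find the best-matching unit and move it toward the sample.
        w = min(range(len(neurons)), key=lambda i: dist2(neurons[i], x))
        nx, ny = neurons[w]
        neurons[w] = (nx + rate * (x[0] - nx), ny + rate * (x[1] - ny))
    rate *= 0.9          # decaying learning rate

# Each neuron should settle near one group's centre.
print(neurons)
```

Adding, moving, or deleting a neuron in the tool corresponds to changing this prototype list, which directly changes the number of classes the map can represent.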