Abstract: We propose a method for the discrimination and classification of ovarian tissue as benign, malignant, or normal using independent component analysis and neural networks. The method was tested on a proteomic pattern set from a database, using probabilistic and radial basis function neural networks. The best performance was obtained with probabilistic neural networks, resulting in a 99% success rate, with 98% specificity and 100% sensitivity.
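The probabilistic neural network that performed best can be sketched as a Parzen-window classifier: each class score is a sum of Gaussian kernels centered on that class's training patterns. The following is a minimal sketch on synthetic two-dimensional data; the kernel spread `sigma` and the data are illustrative assumptions, not values from the paper.

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Probabilistic neural network: Parzen-window density estimate per class."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]                      # patterns of class c
        # squared distances from every test point to every class-c pattern
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
        scores.append(np.exp(-d2 / (2 * sigma ** 2)).mean(1))
    return classes[np.argmax(np.stack(scores, axis=1), axis=1)]

# illustrative two-class data standing in for the proteomic features
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0.0, 0.3, (20, 2)),
                     rng.normal(2.0, 0.3, (20, 2))])
y_train = np.array([0] * 20 + [1] * 20)
pred = pnn_predict(X_train, y_train, np.array([[0.1, 0.0], [2.1, 1.9]]))
```

Unlike an MLP, the PNN has no iterative training phase: the training patterns themselves are the "weights", which is why it is a common baseline for small biomedical datasets.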
Abstract: This paper aims to improve the fine lapping process of hard disk drive (HDD) lapping machines by removing material from each slider while keeping the strip height (SH) variation to a minimum. The standard deviation is the key parameter for evaluating strip height variation, and hence it is minimized. In this paper, a design of experiments (DOE) with factorial analysis by two-way analysis of variance (ANOVA) is adopted to obtain statistical information. The statistical results reveal that initial strip height patterns affect the final SH variation. Therefore, initial SH classification using a radial basis function neural network is implemented to achieve proportional gain prediction.
Abstract: In order to study seed yield and seed yield components in bean under reduced-irrigation conditions and to assess the drought tolerance of genotypes, 15 lines of white bean were evaluated in two separate RCB designs with three replications under stress and non-stress conditions. Analysis of variance showed significant differences among varieties for the traits under study, indicating the existence of genetic variation among varieties. The results indicate that drought stress reduced seed yield, number of seeds per plant, biological yield, and number of pods in white bean. Under non-stress conditions, yield was highly correlated with biological yield, whereas under stress conditions it was highly correlated with harvest index. Results of stepwise regression showed that selection can be done based on biological yield, harvest index, number of seeds per pod, seed length, and 100-seed weight. Results of path analysis showed that the highest direct effect, being positive, was related to biological yield under non-stress conditions and to harvest index under stress conditions. Factor analysis was carried out under stress and non-stress conditions; four factors explained more than 76 percent of the total variation. We used several selection indices, such as the Stress Susceptibility Index (SSI), Geometric Mean Productivity (GMP), Mean Productivity (MP), Stress Tolerance Index (STI), and Tolerance Index (TOL), to study the drought tolerance of genotypes. We found that the best stress indices for selecting tolerant genotypes were STI, GMP, and MP, which showed the greatest correlations with seed yield under both stress and non-stress conditions. In classifying the genotypes based on phenotypic characteristics using cluster analysis (UPGMA), the genotypes were classified into five separate groups under stress and non-stress conditions.
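The selection indices named above have standard definitions in the drought-tolerance literature; a minimal sketch of their computation from each genotype's non-stress yield Yp and stress yield Ys follows. The yield values are invented for illustration and are not the study's data.

```python
import numpy as np

def drought_indices(Yp, Ys):
    """Standard drought-tolerance selection indices from non-stress (Yp)
    and stress (Ys) yields of each genotype."""
    Yp, Ys = np.asarray(Yp, float), np.asarray(Ys, float)
    SI = 1.0 - Ys.mean() / Yp.mean()          # stress intensity of the trial
    return {
        "SSI": (1.0 - Ys / Yp) / SI,          # Stress Susceptibility Index
        "GMP": np.sqrt(Ys * Yp),              # Geometric Mean Productivity
        "MP":  (Ys + Yp) / 2.0,               # Mean Productivity
        "STI": (Ys * Yp) / Yp.mean() ** 2,    # Stress Tolerance Index
        "TOL": Yp - Ys,                       # Tolerance Index
    }

# illustrative yields (t/ha) for three hypothetical genotypes
idx = drought_indices(Yp=[3.0, 2.5, 2.0], Ys=[2.0, 1.5, 1.8])
```

Genotypes with high STI, GMP, and MP but low SSI and TOL combine good yield potential with stability under stress, which is why the abstract singles out the first three as the best selection criteria.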
Abstract: In this article, we propose a methodology for the characterization of suspended matter along the Bay of Algiers. An approach using a multilayer perceptron (MLP) trained by backpropagation of the gradient, optimized by the Levenberg-Marquardt (LM) algorithm, is used. Emphasis was placed on the choice of the components of the training base, for which a comparative study was made of four methods: random selection and three variants of classification by K-means. The samples are taken from a suspended matter image obtained by an analytical model based on polynomial regression that takes in situ measurements into account. The mask that selects the zone of interest (water, in our case) was produced using a multispectral classification by the ISODATA algorithm. To improve the classification result, this mask was cleaned using the tools of mathematical morphology. The results of this study, presented in the form of curves, tables, and images, demonstrate the soundness of our methodology.
Abstract: The ability to detect and classify the type of fault plays a major role in the protection of power systems. This procedure must be accurate and fast. In this paper, fault-type detection has been implemented using wavelet analysis together with the wavelet entropy principle. The power system is simulated using PSCAD/EMTDC. Different types of faults were studied, yielding various current waveforms. These current waveforms were decomposed by wavelet analysis into approximation and detail coefficients. The wavelet entropy of these decompositions is analyzed, leading to a successful methodology for fault classification. The suggested approach is tested on different fault types and successfully identifies the type of fault.
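The wavelet-entropy feature can be sketched with a hand-rolled Haar decomposition: the signal is split into detail sub-bands, and the Shannon entropy of the relative band energies is computed. The paper does not specify its mother wavelet or decomposition depth, so both are assumptions here, as are the test waveforms.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar wavelet transform: approximation and detail."""
    x = np.asarray(x, float)
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def wavelet_entropy(x, levels=4):
    """Shannon entropy of the relative energies of the wavelet sub-bands."""
    approx = np.asarray(x, float)
    energies = []
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        energies.append((detail ** 2).sum())
    energies.append((approx ** 2).sum())
    p = np.array(energies) / np.sum(energies)
    p = p[p > 0]                              # drop empty bands
    return float(-(p * np.log2(p)).sum())

# a flat waveform concentrates all energy in one band -> zero entropy;
# a noisy transient spreads energy across bands -> high entropy
h_flat = wavelet_entropy(np.ones(1024))
h_noisy = wavelet_entropy(np.random.default_rng(1).normal(size=1024))
```

This is the intuition behind using entropy for fault classification: different fault types redistribute current energy across frequency bands in characteristically different ways.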
Abstract: For effective collaboration, asynchronous tools, and particularly discussion forums, are the most widely used thanks to their flexibility in time. To convey only the messages that belong to a theme of interest to the tutor, and thereby support his tutoring work, a tool for classifying these messages is indispensable. For this purpose, we previously proposed a semantic classification tool for discussion forum messages based on LSA (Latent Semantic Analysis), which includes a thesaurus to organize the vocabulary. The benefits offered by a formal ontology can overcome the shortcomings that a thesaurus exhibits in use, which encourages us to employ one in our semantic classifier. In this work, we propose the use of some of the functionalities that an OWL ontology offers. We then explain how functionalities such as the "ObjectProperty", "SubClassOf", and "Datatype" properties make our classification more intelligent by integrating new terms. The new terms are generated from the initial terms introduced by the tutor and the semantic relations described by the OWL formalism.
Abstract: In this paper, we propose an approach for the classification of fingerprint databases. It is based on the fact that a fingerprint image is composed of regular texture regions that can be successfully represented by co-occurrence matrices. We therefore first extract features based on certain characteristics of the co-occurrence matrix and then use these features to train a neural network for classifying fingerprints into four common classes. The obtained results, compared with existing approaches, demonstrate the superior performance of our proposed approach.
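A gray-level co-occurrence matrix, and a few of the classic texture features derived from it, can be sketched directly in NumPy. The pixel offset, the number of gray levels, the feature selection, and the toy image are illustrative assumptions; the paper's actual characteristics may differ.

```python
import numpy as np

def cooccurrence(img, dx=1, dy=0, levels=4):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    img = np.asarray(img)
    h, w = img.shape
    glcm = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[img[y, x], img[y + dy, x + dx]] += 1
    return glcm / glcm.sum()

def texture_features(glcm):
    """Classic features summarizing a co-occurrence matrix."""
    i, j = np.indices(glcm.shape)
    return {
        "energy":      (glcm ** 2).sum(),              # angular second moment
        "contrast":    (glcm * (i - j) ** 2).sum(),
        "homogeneity": (glcm / (1.0 + np.abs(i - j))).sum(),
    }

# tiny 4-gray-level image; vertical stripes give high horizontal contrast
img = np.array([[0, 3, 0, 3],
                [0, 3, 0, 3],
                [0, 3, 0, 3],
                [0, 3, 0, 3]])
feats = texture_features(cooccurrence(img, dx=1, dy=0))
```

Feature vectors built this way, over several offsets and directions, are what would feed the neural network described in the abstract.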
Abstract: Naive Bayes Nearest Neighbor (NBNN) and its variants, i.e., local NBNN and the NBNN kernels, are local feature-based classifiers that have achieved impressive performance in image classification. By exploiting instance-to-class (I2C) distances (where an instance is an image or video in image or video classification), they avoid the quantization errors of local image descriptors in the bag-of-words (BoW) model. However, the performance of NBNN, local NBNN, and the NBNN kernels has not been validated on video analysis. In this paper, we introduce these three classifiers into human action recognition and conduct comprehensive experiments on the benchmark KTH and the realistic HMDB datasets. The results show that these I2C-based classifiers consistently outperform the SVM classifier with the BoW model.
Abstract: Traffic incidents adversely affect all parts of society, so monitoring road networks with sufficient traffic devices can help to decrease the number of accidents; using the best method for the optimum siting of these devices therefore helps to implement a good monitoring system. This paper considers the important criteria for the optimum siting of traffic cameras based on aggregation methods such as Bagging and Dempster-Shafer concepts. In the first step, important criteria, such as annual traffic flow and distance from critical places such as parks that need closer traffic control, were identified for selecting important road links for traffic camera installation. Classification methods such as artificial neural network and decision tree algorithms were then employed to classify road links according to their importance for camera installation. Finally, to improve the classifiers' results, aggregation methods such as Bagging and Dempster-Shafer theory were used.
Abstract: This paper studies an intelligent glass bottle inspector based on machine vision that replaces manual inspection. The system structure is illustrated in detail. The paper presents a method based on the watershed transform to segment possibly defective regions and to extract features of the bottle wall by rules. The wavelet transform is then used to extract features of the bottle finish from images. After feature extraction, a fuzzy support vector machine ensemble is put forward as the classifier. To ensure that the fuzzy support vector machines have good classification ability, a GA-based ensemble method is used to combine the several fuzzy support vector machines. The experiments demonstrate that, using this inspector to inspect glass bottles, the accuracy rate can reach above 97.5%.
Abstract: In this paper, a strategy for long-span bridge disaster response was developed, divided into risk analysis, business impact analysis, and an emergency response plan. At the risk analysis stage, the critical risk was estimated; the critical risk was "car accident". The critical process, by critical-risk classification, was assessed at the business impact analysis stage; the critical process was the task related to road conditions and traffic safety. Based on the results of the preceding analyses, an emergency response plan was established. By making the order of the standard operating procedures clear, an effective plan for dealing with disaster was formulated. Finally, prototype software was developed based on the research findings. This study laid the foundation of an information-technology-based disaster response guideline and is significant in that it computerized the disaster response plan to improve the plan's accessibility.
Abstract: Recently, much research has been conducted to find pertinent parameters and adequate models for automatic music genre classification. In this paper, two measures based upon information theory concepts are investigated for mapping the feature space to the decision space. A Gaussian Mixture Model (GMM) is used as a baseline and reference system. Various strategies are proposed for training and testing sessions with matched or mismatched conditions: long training with long testing, and long training with short testing. For all experiments, the file sections used for testing are never used during training. With matched conditions, all examined measures yield the best and similar scores (almost 100%). With mismatched conditions, the proposed measures yield better scores than the GMM baseline system, especially for the short testing case. It is also observed that the average discrimination information measure is most appropriate for music category classification, whereas the divergence measure is more suitable for music subcategory classification.
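The two information-theoretic measures referenced, Kullback's discrimination information (directed divergence) and the symmetric divergence, have closed forms for Gaussian models. A sketch for the univariate case follows; the multivariate case used with GMMs is analogous, and the toy parameters below are purely illustrative.

```python
import numpy as np

def kl_gauss(m1, s1, m2, s2):
    """Discrimination information I(1:2) between two univariate Gaussians
    N(m1, s1^2) and N(m2, s2^2)."""
    return np.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2 * s2 ** 2) - 0.5

def j_divergence(m1, s1, m2, s2):
    """Symmetric divergence J = I(1:2) + I(2:1)."""
    return kl_gauss(m1, s1, m2, s2) + kl_gauss(m2, s2, m1, s1)

# identical models are indistinguishable (J = 0);
# separating the means increases the divergence
d_same = j_divergence(0.0, 1.0, 0.0, 1.0)
d_far = j_divergence(0.0, 1.0, 3.0, 1.0)
```

Larger values of either measure between class-conditional feature models indicate classes that are easier to tell apart, which is how such measures map the feature space to the decision space.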
Abstract: In this study, a fuzzy similarity approach for Arabic web page classification is presented. The approach uses a fuzzy term-category relation by manipulating the membership degree for the training data and the degree value for a test web page. Six measures are used and compared in this study: the Einstein, Algebraic, Hamacher, MinMax, Special case fuzzy, and Bounded Difference approaches. These measures are applied and compared using 50 different Arabic web pages. The Einstein measure gave the best performance among the measures. An analysis of these measures and concluding remarks are drawn in this study.
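Several of the named measures correspond to standard fuzzy conjunction (t-norm) definitions; a sketch of those textbook formulas applied to a pair of membership degrees follows. The mapping of the abstract's names onto these exact formulas is an assumption, since the paper's own definitions are not given here.

```python
import numpy as np

def fuzzy_measures(a, b):
    """Classic t-norm-style combinations of two membership degrees in [0, 1]."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return {
        "algebraic": a * b,                                  # algebraic product
        "einstein":  (a * b) / (2.0 - (a + b - a * b)),      # Einstein product
        "hamacher":  np.where(a + b == 0, 0.0,
                              (a * b) / (a + b - a * b)),    # Hamacher product
        "minmax":    np.minimum(a, b),                       # Zadeh min (paired with max)
        "bounded":   np.maximum(0.0, a + b - 1.0),           # bounded difference
    }

# e.g. a term's membership in a category vs. its degree in a test page
m = fuzzy_measures(0.6, 0.8)
```

All five agree at the extremes (0 and 1) but order the intermediate degrees differently, which is exactly what a comparison like the paper's probes.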
Abstract: In this paper, a one-dimensional Self-Organizing Map (SOM) algorithm to perform feature selection is presented. The
algorithm is based on a first classification of the input dataset on a
similarity space. From this classification for each class a set of
positive and negative features is computed. This set of features is
selected as result of the procedure. The procedure is evaluated on an
in-house dataset from a Knowledge Discovery from Text (KDT)
application and on a set of publicly available datasets used in
international feature selection competitions. These datasets come
from KDT applications, drug discovery as well as other applications.
The knowledge of the correct classification available for the training
and validation datasets is used to optimize the parameters for positive
and negative feature extraction. The process becomes feasible for large and sparse datasets, such as those obtained in KDT applications, by using both compression techniques to store the similarity matrix and speed-up techniques for the Kohonen algorithm that take advantage of the sparsity of the input matrix. These improvements, together with the use of the grid, make it feasible to apply the methodology to massive datasets.
Abstract: This paper presents work on signal discrimination, specifically for the Electrocardiogram (ECG) waveform. An ECG signal comprises P, QRS, and T waves in each normal heartbeat, describing a heart-rhythm pattern specific to an individual. Further medical diagnosis can then be made from the ECG information to determine any heart-related disease. Emphasis is placed on QRS complex classification, which is discussed further to illustrate its importance. The Pan-Tompkins algorithm, a widely known technique, has been adapted to realize the QRS complex classification process. Eight steps are involved, namely sampling, normalization, low-pass filtering, high-pass filtering (together forming a band-pass filter), differentiation, squaring, averaging, and finally QRS detection. The simulation results obtained are presented in a Graphical User Interface (GUI) developed in MATLAB.
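The derivative-square-average core of the steps listed above can be sketched in NumPy. The filter window, threshold fraction, and synthetic signal are illustrative assumptions; a faithful Pan-Tompkins implementation would add the band-pass stage and adaptive, refractory-aware thresholds.

```python
import numpy as np

def qrs_detect(ecg, fs=200, window=0.15, thresh=0.5):
    """Derivative -> squaring -> moving-window integration -> thresholding."""
    ecg = np.asarray(ecg, float)
    deriv = np.gradient(ecg)                 # differentiation step
    squared = deriv ** 2                     # squaring emphasizes steep slopes
    n = int(window * fs)                     # moving-average (integration) window
    mwi = np.convolve(squared, np.ones(n) / n, mode="same")
    mask = mwi > thresh * mwi.max()          # fixed fraction of peak as threshold
    # one detection per contiguous above-threshold region (its rising edge)
    edges = np.flatnonzero(np.diff(mask.astype(int)) == 1) + 1
    if mask[0]:
        edges = np.r_[0, edges]
    return edges

# synthetic ECG: flat baseline with three sharp "QRS" spikes
ecg = np.zeros(600)
for k in (100, 300, 500):
    ecg[k - 2:k + 3] = [0.2, 0.8, 1.5, 0.8, 0.2]
beats = qrs_detect(ecg)
```

Because the QRS complex has the steepest slopes in the beat, the derivative-and-square stages make it stand far above the P and T waves before thresholding.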
Abstract: The neuro-fuzzy hybridization scheme has attracted research interest in pattern classification over the past decade. The present paper proposes a novel Modified Adaptive Fuzzy Inference Engine (MAFIE) for pattern classification. A modified Apriori algorithm is utilized to derive a minimal set of decision rules from input-output data sets. A TSK-type fuzzy inference system is constructed by the automatic generation of membership functions and rules, by fuzzy c-means clustering and the Apriori algorithm, respectively. The generated adaptive fuzzy inference engine is tuned by a least-squares fit and a conjugate gradient descent algorithm towards better performance with a minimal set of rules. The proposed MAFIE is able to reduce the number of rules, which otherwise increases exponentially as more input variables are involved. The performance of the proposed MAFIE is compared with other existing pattern classification schemes using Fisher's Iris and the Wisconsin breast cancer data sets, and is shown to be very competitive.
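The fuzzy c-means step that generates the membership functions can be sketched as follows. The data, cluster count, and fuzzifier m are illustrative; the Apriori rule mining and TSK tuning stages of the paper are not shown.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=50, seed=0):
    """Fuzzy c-means: alternate membership and center updates."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)        # rows are fuzzy memberships
    for _ in range(iters):
        W = U ** m                           # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = 1.0 / d ** (2 / (m - 1))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# two well-separated blobs; FCM should place one center near each
X = np.vstack([np.random.default_rng(1).normal(0, 0.2, (25, 2)),
               np.random.default_rng(2).normal(3, 0.2, (25, 2))])
centers, U = fuzzy_c_means(X)
```

Each column of U becomes one Gaussian-like membership function per cluster, which is what a TSK system then attaches rule consequents to.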
Abstract: This paper presents a new approach for the protection of a Thyristor-Controlled Series Compensator (TCSC) line using a Support Vector Machine (SVM). One SVM is trained for fault classification and another for section identification. This method uses three-phase current measurements, which results in better speed and accuracy than other SVM-based methods that use single-phase current measurements. This makes it suitable for real-time protection. The method was tested on 10,000 data instances with a very wide variation in system conditions, such as compensation level, source impedance, fault location, fault inception angle, load angle at the source bus, and fault resistance. The proposed method requires only local current measurements.
Abstract: Conceptualization strengthens intelligent systems in generalization skill, effective knowledge representation, real-time inference, and managing uncertain and indefinite situations, in addition to facilitating knowledge communication for learning agents situated in the real world. Concept learning introduces a way of abstraction by which the continuous state space is formed into entities called concepts, which are connected to the action space and thus, in a sense, represent the complex action space. Among computational concept learning approaches, action-based conceptualization is favored because of its simplicity and its mirror neuron foundations in neuroscience. In this paper, a new biologically inspired concept learning approach based on a probabilistic framework is proposed. This approach exploits and extends the mirror neuron's role in conceptualization for a reinforcement learning agent in nondeterministic environments. In the proposed method, instead of building a huge numerical knowledge base, the concepts are learnt gradually from rewards through interaction with the environment. Moreover, the probabilistic formation of the concepts is employed to deal with the uncertain and dynamic nature of real problems, in addition to providing the ability to generalize. These characteristics as a whole distinguish the proposed learning algorithm from both a pure classification algorithm and typical reinforcement learning. Simulation results show the advantages of the proposed framework in terms of convergence speed as well as generalization and asymptotic behavior, because both successful and failed attempts are utilized through the received rewards. Experimental results, on the other hand, show the applicability and effectiveness of the proposed method in continuous and noisy environments for a real robotic task, such as maze navigation, as well as the benefits of implementing an incremental learning scenario in artificial agents.
Abstract: As a popular rank-reduced vector space approach,
Latent Semantic Indexing (LSI) has been used in information
retrieval and other applications. In this paper, an LSI-based content
vector model for text classification is presented, which constructs
multiple augmented category LSI spaces and classifies text by their
content. The model integrates the class discriminative information
from the training data and is equipped with several pertinent feature
selection and text classification algorithms. The proposed classifier
has been applied to email classification and its experiments on a
benchmark spam testing corpus (PU1) have shown that the approach
represents a competitive alternative to other email classifiers based
on the well-known SVM and naïve Bayes algorithms.
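The rank-reduced LSI representation at the heart of the model can be sketched with a plain truncated SVD of a term-document matrix. The tiny matrix and rank below are illustrative; the paper's multiple augmented category spaces add class-label information on top of this basic construction.

```python
import numpy as np

def lsi_project(td, k):
    """Project documents (columns of a term-document matrix) into a
    rank-k latent semantic space via truncated SVD."""
    U, s, Vt = np.linalg.svd(td, full_matrices=False)
    return (np.diag(s[:k]) @ Vt[:k]).T       # one k-dimensional row per document

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# terms x documents: docs 0-1 share spam-like vocabulary, doc 2 does not
td = np.array([[2, 1, 0],     # "offer"
               [1, 2, 0],     # "free"
               [0, 0, 3],     # "meeting"
               [0, 1, 2]])    # "schedule"
docs = lsi_project(td, k=2)
```

Classification then reduces to comparing a new message's projection against category representatives with cosine similarity, which is the content-vector idea the abstract describes.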
Abstract: A new dynamic clustering approach (DCPSO), based
on Particle Swarm Optimization, is proposed. This approach is
applied to unsupervised image classification. The proposed approach
automatically determines the "optimum" number of clusters and
simultaneously clusters the data set with minimal user interference.
The algorithm starts by partitioning the data set into a relatively large
number of clusters to reduce the effects of initial conditions. Using
binary particle swarm optimization the "best" number of clusters is
selected. The centers of the chosen clusters are then refined via the K-means clustering algorithm. The experiments conducted show that
the proposed approach generally found the "optimum" number of
clusters on the tested images.
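The final K-means refinement of the swarm-selected centers can be sketched as plain Lloyd iterations seeded with those centers. The data and initial centers below are illustrative stand-ins for the image pixel features and the binary-PSO output.

```python
import numpy as np

def kmeans_refine(X, centers, iters=20):
    """Lloyd's algorithm: refine the given centers instead of a random init."""
    centers = np.asarray(centers, float).copy()
    for _ in range(iters):
        # assign each point to its nearest center
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for k in range(len(centers)):
            pts = X[labels == k]
            if len(pts):                     # leave an empty cluster in place
                centers[k] = pts.mean(axis=0)
    return centers, labels

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(4, 0.3, (30, 2))])
# roughly placed centers, standing in for those the swarm retained
centers, labels = kmeans_refine(X, [[1.0, 1.0], [3.0, 3.0]])
```

Because the PSO stage already settled the number and rough placement of clusters, the refinement only needs a few iterations to snap the centers onto the actual cluster means.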