Abstract: Eye location is one of the most important problems to solve in a driver fatigue monitoring system. This paper presents an efficient method for fast and accurate eye location in grey-level images obtained under real-world driving conditions. The structure of the eye region is used as a robust cue to find possible eye pairs. Eye-pair candidates at different scales are selected by finding regions that roughly match a binary eye-pair template. To identify the true pair, all eye-pair candidates are then verified using support vector machines. Finally, the eyes are precisely located within the eye-pair images using binary vertical projection and an eye classifier. The proposed method is robust to illumination changes, moderate rotations, the wearing of glasses, and different eye states. Experimental results demonstrate its effectiveness.
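As an illustrative sketch only (not the authors' implementation), the binary vertical projection step might look roughly as below; the threshold value and the assumption that the strongest dark-pixel column on each side of the midline marks an eye centre are my own simplifications.

```python
import numpy as np

def locate_eyes_by_vertical_projection(eye_pair_gray, threshold=128):
    """Split a grey-level eye-pair region into rough left/right eye positions.

    The region is binarised (dark pixels = 1), summed column-wise to give the
    vertical projection, and the strongest column on each side of the middle
    is taken as the horizontal position of an eye.
    """
    binary = (eye_pair_gray < threshold).astype(np.uint8)
    projection = binary.sum(axis=0)            # one dark-pixel count per column
    mid = projection.shape[0] // 2
    left_x = int(np.argmax(projection[:mid]))
    right_x = mid + int(np.argmax(projection[mid:]))
    return left_x, right_x

# usage on a synthetic 24x64 "eye pair" with two dark blobs
img = np.full((24, 64), 255, dtype=np.uint8)
img[8:16, 12:20] = 30   # left eye blob
img[8:16, 44:52] = 30   # right eye blob
print(locate_eyes_by_vertical_projection(img))
```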
Abstract: This experiment was conducted in an attempt to improve the hydrodynamic efficiency of the propulsion mechanism by attaching a spring to the wing, so that the opening angle of the wing during one stroke can change automatically, in contrast to the existing Weis-Fogh type ship propulsion mechanism with a fixed maximum opening angle. For the prototype, the average thrust coefficient was almost constant over all velocity ratios, whereas for the spring type the thrust coefficient increased sharply as the velocity ratio increased. The average propulsive efficiency of the prototype was larger for a bigger opening angle, while for the spring type it was larger for the smaller spring coefficient. In the range of velocity ratios above 1.0, where large thrust can be generated, the propulsive efficiency of the spring type was more than double that of the prototype.
Abstract: In this paper, Principal Component Analysis (PCA) and Kernel Principal Component Analysis (KPCA) are used together with Elman neural network and Support Vector Machine (SVM) classifiers to categorize face images from the ORL database. The Elman network, a recurrent neural network, is proposed for modeling storage systems, and it is also used to study the effect of the number of PCA components on the categorization accuracy of the system and on the time needed to categorize the database images. The categorization stages are carried out with various numbers of components, and the results obtained with the Elman neural network and the support vector machine are compared. In the best case, a recognition accuracy of 97.41% is obtained.
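Purely as an illustration of the PCA/KPCA plus SVM part of such a pipeline (the Elman network and the 97.41% result are not reproduced here), a minimal scikit-learn sketch on the Olivetti faces, scikit-learn's copy of the ORL database, might look like this; the 50-component and kernel choices are my own assumptions.

```python
from sklearn.datasets import fetch_olivetti_faces     # ORL / Olivetti face images
from sklearn.decomposition import PCA, KernelPCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

faces = fetch_olivetti_faces()                         # 400 images of 40 subjects
X_train, X_test, y_train, y_test = train_test_split(
    faces.data, faces.target, test_size=0.25, stratify=faces.target, random_state=0)

for reducer in (PCA(n_components=50), KernelPCA(n_components=50, kernel="rbf")):
    model = make_pipeline(reducer, SVC(kernel="linear"))
    model.fit(X_train, y_train)
    print(type(reducer).__name__, "test accuracy:", model.score(X_test, y_test))
```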
Abstract: The Support Vector Machine (SVM) is a statistical learning tool initially developed by Vapnik in 1979 and later extended into the broader framework of structural risk minimization (SRM). SVMs are playing an increasing role in detection problems across various engineering fields, notably statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, an SVM was applied to the detection of medical ultrasound images in the presence of partially developed speckle noise. Simulations were carried out for single-look and multi-look speckle models to give a complete overview of, and insight into, the proposed SVM-based detector. The structure of the SVM was derived and applied to clinical ultrasound images, and its performance was evaluated in terms of the mean square error (MSE) metric. We show that the SVM-detected ultrasound images have a very low MSE and are of good quality. The quality of the processed speckled images improved under the multi-look model. Furthermore, the contrast of the SVM-detected images was higher than that of the original non-noisy images, indicating that the SVM approach increased the distance between the pixel reflectivity levels (the detection hypotheses) in the original images.
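The following is only a toy sketch of the speckle models and the MSE metric mentioned above (the SVM detector itself is not reproduced, and the image is a synthetic stand-in): single-look speckle is modelled as unit-mean gamma noise, and averaging over more looks reduces its variance.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_speckle(image, looks=1):
    """Multiply the image by unit-mean gamma noise; looks=1 is single-look speckle,
    larger `looks` corresponds to multi-look (averaged) speckle."""
    noise = rng.gamma(shape=looks, scale=1.0 / looks, size=image.shape)
    return image * noise

def mse(reference, estimate):
    return float(np.mean((reference - estimate) ** 2))

clean = np.tile(np.linspace(0.2, 1.0, 64), (64, 1))    # stand-in reflectivity image
print("MSE, single-look:", mse(clean, add_speckle(clean, looks=1)))
print("MSE, multi-look :", mse(clean, add_speckle(clean, looks=4)))
```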
Abstract: A filter is used to remove undesirable frequency information from a dynamic signal. This paper shows that the Z-notch filtering technique can be applied to remove noise from a machining signal. In machining, the noise components were identified from the sound produced by the operation of the machine components themselves, such as the hydraulic system, the motor, and the machine environment. By correlating the noise components with the measured machining signal, the components of interest in the measured machining signal, which are less affected by the noise, can be extracted. The filtered signal is thus more reliable to analyse in terms of noise content than the unfiltered signal. Significantly, the I-kaz method, which comprises a three-dimensional graphical representation and the I-kaz coefficient Z∞, could differentiate between the filtered and the unfiltered signal. A larger scattering space and a higher value of Z∞ indicated that the signal was heavily contaminated by noise. This method can be utilised as a proactive tool for evaluating the noise content of a signal. The evaluation of noise content is as important as its elimination, especially for machining fault diagnosis. The Z-notch filtering technique extracted the noise components from the measured machining signal reliably and with high efficiency. Even though the measured signal was exposed to strong noise disruption, the signal generated by the interaction between the cutting tool and the workpiece could still be recovered. Therefore, noise interference that could change the original signal features and consequently degrade the useful sensory information can be eliminated.
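As a generic illustration of notch filtering only (the paper's Z-notch filter and the I-kaz coefficient Z∞ are specific to that work and are not reproduced here), the sketch below removes an assumed 50 Hz machine-noise component from a signal sampled at 1 kHz using a standard IIR notch design.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 1000.0                                   # sampling frequency, Hz
t = np.arange(0, 1.0, 1.0 / fs)
cutting_signal = np.sin(2 * np.pi * 120 * t)  # stand-in tool-workpiece component
noise = 0.8 * np.sin(2 * np.pi * 50 * t)      # stand-in machine/hydraulic noise
measured = cutting_signal + noise

b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)       # notch centred on the noise frequency
filtered = filtfilt(b, a, measured)           # zero-phase filtering

print("noise power before:", np.mean((measured - cutting_signal) ** 2))
print("noise power after :", np.mean((filtered - cutting_signal) ** 2))
```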
Abstract: For a given specific problem, the design of an efficient algorithm has usually been the matter of study. However, an alternative approach orthogonal to this one exists, called reduction. In general, for a given specific problem, the reduction approach studies how to convert the original problem into subproblems. This paper proposes a formal modeling language to support this reduction approach. We show three examples from the wide area of learning problems. The benefit is fast prototyping of algorithms for a given new problem.
Abstract: Detection of incipient abnormal events is important for improving the safety and reliability of machine operations and for reducing losses caused by failures. Improper set-up or alignment of parts often leads to severe problems in many machines. The construction of prediction models for predicting faulty conditions is essential for deciding when to perform machine maintenance. This paper presents a multivariate calibration monitoring approach based on statistical analysis of machine measurement data. The calibration model is used to predict two faulty conditions from historical reference data. The approach uses genetic algorithm (GA) based variable selection, and we evaluate the predictive performance of several prediction methods on real data. The results show that the calibration model based on supervised probabilistic principal component analysis (SPPCA) yielded the best performance in this work. By adopting a proper variable selection scheme in the calibration models, the prediction performance can be improved by excluding non-informative variables from the model-building step.
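As a rough sketch of GA-based variable selection under my own assumptions (synthetic data, plain linear regression standing in for the SPPCA calibration model, and arbitrary GA settings), the wrapper might look like this:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=200, n_features=30, n_informative=8,
                       noise=5.0, random_state=0)

def fitness(mask):
    """Cross-validated R^2 of the calibration model on the selected variables."""
    if mask.sum() == 0:
        return -np.inf
    return cross_val_score(LinearRegression(), X[:, mask.astype(bool)], y, cv=5).mean()

pop = rng.integers(0, 2, size=(30, X.shape[1]))        # random variable-selection masks
for generation in range(25):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]             # keep the 10 best masks
    children = []
    while len(children) < len(pop):
        a, b = parents[rng.integers(0, len(parents), size=2)]
        child = np.where(rng.random(X.shape[1]) < 0.5, a, b)   # uniform crossover
        flip = rng.random(X.shape[1]) < 0.02                   # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.array(children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected variables:", np.flatnonzero(best))
```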
Abstract: This article proposes an Ant Colony Optimization (ACO) metaheuristic to minimize the total makespan when scheduling a set of jobs and assigning workers on uniformly related parallel machines. An algorithm based on ACO has been developed and coded in Matlab® to solve this problem. The paper explains the steps required to apply the ant colony approach to the problem of minimizing makespan for the worker assignment and job scheduling problem in a parallel machine model, and is aimed at evaluating the strength of ACO compared with other conventional approaches. A data set containing 100 problems (12 jobs, 3 machines, and 10 workers), which is available on the internet, has been solved with this ACO algorithm. Our ACO-based algorithm showed drastically improved results, especially in terms of the negligible CPU effort needed to reach the optimal solution. In our case, the time taken to solve all 100 problems is less than the average time taken by other conventional approaches, such as a GA and the SPT-A/LMC heuristics, to solve a single problem in the data set.
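A much-simplified sketch of the idea (not the paper's Matlab implementation: worker assignment is omitted, and the processing times, machine speeds, and ACO parameters are invented) is shown below; each ant assigns the 12 jobs to 3 uniformly related machines, and the best-so-far assignment reinforces the pheromone trail.

```python
import numpy as np

rng = np.random.default_rng(1)
processing = rng.uniform(1.0, 10.0, size=12)      # base processing times of 12 jobs
speed = np.array([1.0, 1.5, 2.0])                 # speeds of 3 related machines
n_jobs, n_machines = len(processing), len(speed)

pheromone = np.ones((n_jobs, n_machines))
best_makespan, best_assign = np.inf, None

def makespan(assign):
    loads = np.zeros(n_machines)
    for j, m in enumerate(assign):
        loads[m] += processing[j] / speed[m]       # faster machine => shorter time
    return loads.max()

for iteration in range(100):
    for ant in range(20):
        assign = np.empty(n_jobs, dtype=int)
        for j in range(n_jobs):
            heuristic = speed / processing[j]                    # prefer fast machines
            weights = pheromone[j] * heuristic ** 2.0
            assign[j] = rng.choice(n_machines, p=weights / weights.sum())
        cmax = makespan(assign)
        if cmax < best_makespan:
            best_makespan, best_assign = cmax, assign
    pheromone *= 0.9                                             # evaporation
    for j, m in enumerate(best_assign):                          # reinforce best-so-far
        pheromone[j, m] += 1.0 / best_makespan

print("best makespan:", round(best_makespan, 3), "assignment:", best_assign)
```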
Abstract: Color categorization is shared among the members of a society. This allows colors to be communicated, especially when using a natural language such as English. Hence a sociable robot, in order to coexist with humans in human society, must also share this color categorization. To achieve this, much previous work has relied on modeling human color perception and on considerable mathematical complexity. In contrast, in this work the computer acting as the brain of the robot learns color categorization through interaction with humans, without much mathematical complexity.
Abstract: Named Entity Recognition (NER) aims to classify each word of a document into predefined target named entity classes and is nowadays considered fundamental for many Natural Language Processing (NLP) tasks such as information retrieval, machine translation, information extraction, question answering systems, and others. This paper reports on the development of a NER system for Bengali and Hindi using Support Vector Machines (SVM). Though this state-of-the-art machine learning technique has been widely applied to NER in several well-studied languages, its use for Indian languages (ILs) is very new. The system makes use of the different contextual information of the words along with a variety of features that are helpful in predicting the four different named entity (NE) classes: Person name, Location name, Organization name, and Miscellaneous name. We have used annotated corpora of 122,467 tokens of Bengali and 502,974 tokens of Hindi tagged with the twelve NE classes defined as part of the IJCNLP-08 NER Shared Task for South and South East Asian Languages (SSEAL). In addition, we have manually annotated 150K wordforms of the Bengali news corpus, developed from the web archive of a leading Bengali newspaper. We have also developed an unsupervised algorithm to generate lexical context patterns from a part of the unlabeled Bengali news corpus. These lexical patterns have been used as features of the SVM in order to improve system performance. The NER system has been tested with gold standard test sets of 35K and 60K tokens for Bengali and Hindi, respectively. Evaluation results demonstrate recall, precision, and f-score values of 88.61%, 80.12%, and 84.15%, respectively, for Bengali and 80.23%, 74.34%, and 77.17%, respectively, for Hindi. The results show an improvement in f-score of 5.13% with the use of context patterns. A statistical analysis (ANOVA) is also performed to compare the performance of the proposed NER system with that of an existing HMM-based system for both languages.
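Only as a toy illustration of SVM-based NER with contextual features (the paper's Bengali/Hindi data, feature set, and context patterns are not reproduced), each token below is described by itself, its neighbours, and two surface features, and a linear SVM predicts the NE tag:

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

sentence = ["Rabindranath", "Tagore", "was", "born", "in", "Kolkata", "."]
labels   = ["B-PER",        "I-PER",  "O",   "O",    "O",  "B-LOC",    "O"]

def token_features(tokens, i):
    return {
        "word": tokens[i].lower(),
        "prev": tokens[i - 1].lower() if i > 0 else "<s>",
        "next": tokens[i + 1].lower() if i < len(tokens) - 1 else "</s>",
        "is_title": tokens[i][0].isupper(),
        "suffix2": tokens[i][-2:].lower(),
    }

X = [token_features(sentence, i) for i in range(len(sentence))]
model = make_pipeline(DictVectorizer(), LinearSVC())
model.fit(X, labels)
print(model.predict([token_features(sentence, 5)]))   # expected: a location tag
```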
Abstract: In recent years, rapid advances in software and hardware in the field of information technology, along with a digital imaging revolution in the medical domain, have facilitated the generation and storage of large collections of images by hospitals and clinics. Searching these large image collections effectively and efficiently poses significant technical challenges and raises the need for intelligent retrieval systems. Content-based Image Retrieval (CBIR) consists of retrieving the images most visually similar to a given query image from a database of images [5]. Medical CBIR applications pose unique challenges but at the same time offer many new opportunities. While one can easily understand news or sports videos, a medical image is often completely incomprehensible to untrained eyes.
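As a bare-bones sketch of the retrieval idea only (made-up images and a simple grey-level histogram feature, not a medical CBIR system), a query image can be matched to the database images with the most similar histograms:

```python
import numpy as np

rng = np.random.default_rng(0)
database = rng.integers(0, 256, size=(50, 64, 64))      # 50 stand-in images

def histogram_feature(image, bins=32):
    hist, _ = np.histogram(image, bins=bins, range=(0, 256))
    return hist / hist.sum()

features = np.array([histogram_feature(img) for img in database])

def retrieve(query_image, top_k=5):
    q = histogram_feature(query_image)
    distances = np.linalg.norm(features - q, axis=1)    # distance to every database image
    return np.argsort(distances)[:top_k]                # indices of the most similar images

print(retrieve(database[7]))   # image 7 should rank itself first
```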
Abstract: Intelligent systems based on machine learning techniques, such as classification and clustering, are gaining widespread popularity in real-world applications. This paper presents work on developing a software system for predicting crop yield, for example oil-palm yield, from climate and plantation data. At the core of our system is a method for unsupervised partitioning of data to find spatio-temporal patterns in climate data using kernel methods, which are well suited to dealing with complex data. This work is inspired by the notion that a non-linear transformation of the data into some high-dimensional feature space increases the possibility of linear separability of the patterns in the transformed space, and therefore simplifies the exploration of the associated structure in the data. Kernel methods implicitly perform a non-linear mapping of the input data into a high-dimensional feature space by replacing the inner products with an appropriate positive definite function. In this paper we present a robust weighted kernel k-means algorithm incorporating spatial constraints for clustering the data. The proposed algorithm can effectively handle noise, outliers, and auto-correlation in the spatial data, enabling effective and efficient data analysis by exploring patterns and structures in the data, and can thus be used for predicting oil-palm yield by analyzing the various factors affecting the yield.
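The sketch below shows plain kernel k-means on synthetic 2-D data (the weights and spatial constraints of the proposed algorithm are specific to the paper and are omitted); distances to each cluster centre are computed implicitly in the RBF feature space from the kernel matrix.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def kernel_kmeans(X, n_clusters=3, gamma=1.0, n_iter=20, seed=0):
    K = rbf_kernel(X, gamma=gamma)                    # implicit inner products
    labels = np.random.default_rng(seed).integers(0, n_clusters, size=len(X))
    for _ in range(n_iter):
        dist = np.zeros((len(X), n_clusters))
        for c in range(n_clusters):
            mask = labels == c
            if not mask.any():
                dist[:, c] = np.inf
                continue
            # ||phi(x) - mu_c||^2 = K(x,x) - 2*mean_j K(x,x_j) + mean_{j,l} K(x_j,x_l)
            dist[:, c] = (np.diag(K)
                          - 2 * K[:, mask].mean(axis=1)
                          + K[np.ix_(mask, mask)].mean())
        labels = dist.argmin(axis=1)
    return labels

data_rng = np.random.default_rng(1)
X = np.vstack([data_rng.normal(loc=m, scale=0.3, size=(30, 2)) for m in (0.0, 2.0, 4.0)])
print(kernel_kmeans(X, n_clusters=3))
```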
Abstract: Commercial hydroxyapatite (HA) was reinforced by adding 2, 5, and 10 wt % of a 28.5%CaO-28.5%P2O5-38%Na2O-5%CaF2 based glass and then sintered. Although HA shows good biocompatibility with the human body, its applications are limited to non-load-bearing areas and coatings because of its poor mechanical properties. These mechanical properties can be improved substantially by sintering with glass-ceramic additions. In this study, the effects of sintering hydroxyapatite with the above-specified phosphate glass additions are quantified. Each composition was sintered over a range of temperatures. Scanning electron microscopy and X-ray diffraction were used to characterize the microstructure and phases of the composites. The density, microhardness, and compressive strength were measured using the Archimedes principle, a Vickers microhardness tester (at 0.98 N), and an Instron universal testing machine (crosshead speed of 0.5 mm/min), respectively. These results were used to indicate which composition provides a suitable material for use in hard tissue replacement. Composites containing 10 wt % glass additions formed dense HA/TCP (tricalcium phosphate) composite materials possessing better compressive strength and hardness than HA. In-vitro bioactivity was assessed by evaluating changes in the pH and Ca2+ ion concentration of simulated body fluid (SBF) upon immersion of these composites in it for two weeks.
Abstract: In the recent past, Learning Classifier Systems have been used successfully for data mining. A Learning Classifier System (LCS) is a machine learning technique which combines evolutionary computing, reinforcement learning, supervised or unsupervised learning, and heuristics to produce adaptive systems. An LCS learns by interacting with an environment from which it receives feedback in the form of numerical reward, and learning is achieved by trying to maximize the amount of reward received. All LCS models, more or less, comprise four main components: a finite population of condition-action rules, called classifiers; the performance component, which governs the interaction with the environment; the credit assignment component, which distributes the reward received from the environment among the classifiers accountable for it; and the discovery component, which is responsible for discovering better rules and improving existing ones through a genetic algorithm. When the concatenation of the production rules in the LCS forms the genotype, the GA operates on a population of classifier systems; this approach is known as 'Pittsburgh' classifier systems. Other LCSs, which apply their GA at the rule level within a single population, are known as 'Michigan' classifier systems. The predominant representation of the discovered knowledge is standard production rules (PRs) of the form IF P THEN D. PRs, however, are unable to handle exceptions and do not exhibit variable precision. Censored Production Rules (CPRs), an extension of PRs proposed by Michalski and Winston, exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form IF P THEN D UNLESS C, where the censor C is an exception to the rule. Such rules are employed in situations in which the conditional statement IF P THEN D holds frequently and the assertion C holds rarely. Using a rule of this type, we are free to ignore the exception condition when the resources needed to establish its presence are tight, or when there is simply no information available as to whether it holds or not. Thus, the IF P THEN D part of a CPR expresses the important information, while the UNLESS C part acts only as a switch that changes the polarity of D to ~D. In this paper, a Pittsburgh-style LCS approach is used for the automated discovery of CPRs. An appropriate encoding scheme is suggested to represent a chromosome consisting of a fixed-size set of CPRs. Suitable genetic operators are designed for the set of CPRs and for individual CPRs, and an appropriate fitness function is proposed that incorporates the basic constraints on a CPR. Experimental results are presented to demonstrate the performance of the proposed learning classifier system.
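A hypothetical sketch of how a single CPR of the form IF P THEN D UNLESS C might be represented and evaluated is given below (my own simplification; the paper's chromosome encoding, genetic operators, and fitness function are not reproduced, and the bird/penguin example is invented):

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional

@dataclass
class CensoredProductionRule:
    premise: Callable[[Dict], bool]    # P
    decision: str                      # D
    censor: Callable[[Dict], bool]     # C, the exception

    def apply(self, record: Dict, check_censor: bool = True) -> Optional[str]:
        """Return D when P holds; flip to ~D when the censor is known to hold.
        With check_censor=False the (rare) exception is simply ignored."""
        if not self.premise(record):
            return None
        if check_censor and self.censor(record):
            return "~" + self.decision
        return self.decision

rule = CensoredProductionRule(
    premise=lambda r: r["bird"],                # IF it is a bird
    decision="flies",                           # THEN it flies
    censor=lambda r: r.get("penguin", False))   # UNLESS it is a penguin

print(rule.apply({"bird": True}))                                        # 'flies'
print(rule.apply({"bird": True, "penguin": True}))                       # '~flies'
print(rule.apply({"bird": True, "penguin": True}, check_censor=False))   # 'flies'
```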
Abstract: A comparison and evaluation of different condition monitoring (CM) techniques, e.g. dynamic cylinder pressure and crankshaft Instantaneous Angular Speed (IAS), was carried out experimentally on a reciprocating compressor (RC) for the detection and diagnosis of valve faults in a two-stage reciprocating compressor, as part of a condition monitoring programme that can successfully detect and diagnose a fault in the machine. Leakage in the valve plate was introduced experimentally into a two-stage reciprocating compressor. The effect of the faults on compressor performance was monitored, and the differences from the normal, healthy performance were noted as fault signatures to be used for the detection and diagnosis of the faults.
The paper concludes with what is considered to be a unique approach to condition monitoring. First, each of the two most useful techniques is used to produce a Truth Table which details the circumstances in which each method can be used to detect and diagnose a fault. The two Truth Tables are then combined into a single Decision Table to provide a unique and reliable method of detecting and diagnosing each of the individual faults introduced into the compressor. This gives accurate diagnosis of compressor faults.
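The sketch below illustrates only the combination idea, with invented fault conditions and table entries rather than the paper's actual fault signatures: each technique contributes a truth table of which conditions it flags, and the combined decision table maps the pair of observations to a diagnosis.

```python
# Truth tables: does each CM technique flag each (hypothetical) condition?
pressure_truth = {"healthy": False, "suction_valve_leak": True, "discharge_valve_leak": True}
ias_truth      = {"healthy": False, "suction_valve_leak": True, "discharge_valve_leak": False}

# Combined decision table: (pressure flags fault, IAS flags fault) -> diagnosis
decision_table = {
    (False, False): "healthy",
    (True,  True):  "suction_valve_leak",
    (True,  False): "discharge_valve_leak",
}

def diagnose(condition):
    key = (pressure_truth[condition], ias_truth[condition])
    return decision_table.get(key, "unclassified")

for condition in pressure_truth:
    print(condition, "->", diagnose(condition))
```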
Abstract: In the past few years there has been a change in the view of high-performance applications and parallel computing. Initially, such applications were targeted at dedicated parallel machines. Recently, the trend has been shifting towards building meta-applications composed of several modules that exploit heterogeneous platforms and employ hybrid forms of parallelism. The aim of this paper is to propose a model of virtual parallel computing. The virtual parallel computing system provides a flexible object-oriented software framework that makes it easy for programmers to write various parallel applications.
Abstract: A simple mobile engine-driven pneumatic paddy collector, made of locally available materials using local manufacturing technology, was designed, fabricated, and tested for collecting and bagging paddy dried on concrete pavement. The pneumatic paddy collector had the following major components: a radial flat-bladed centrifugal fan, a power transmission system, a bagging area, the frame, and the conveyance system. Results showed significant differences in collecting capacity, noise level, and fuel consumption when the rotational speed of the air mover shaft was varied. Other parameters, such as collecting efficiency, air velocity, augmented cracked grain percentage, and germination rate, were not significantly affected by varying the rotational speed of the air mover shaft. The pneumatic paddy collector had a collecting efficiency of 99.33% with a collecting capacity of 2685.00 kg/h at a maximum centrifugal fan shaft speed of about 4200 rpm. The machine entailed an investment cost of P 62,829.25. The break-even weight of paddy was 510,606.75 kg/yr at a collecting cost of 0.11 P/kg of paddy. Utilizing the machine for 400 hours per year generated an income of P 23,887.73. The projected time needed to recover the cost of the machine, based on the 2685 kg/h collecting capacity, was 2.63 years.
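A quick arithmetic check of the payback figure, on the assumption (mine, not stated in the abstract) that the recovery time is simply the investment cost divided by the annual income:

```python
investment_cost = 62_829.25     # P
annual_income = 23_887.73       # P per year (400 operating hours)
print(round(investment_cost / annual_income, 2))   # -> 2.63 years
```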
Abstract: This paper discusses the design of knowledge integration for clinical information extracted from distributed medical ontologies in order to improve a machine learning-based multi-label coding assignment system. The proposed approach is implemented using a decision tree machine learning technique on university hospital data for patients with Coronary Heart Disease (CHD). The preliminary results show the satisfactory finding that the use of medical ontologies improves overall system performance.
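As an illustrative sketch only (the ontology-derived features and hospital data are not available here, so the binary features and diagnosis codes below are made up), a decision tree can assign multiple codes at once when trained on indicator label vectors:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# columns: [chest_pain, high_cholesterol, smoker, diabetic]   (hypothetical features)
X = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 0, 1],
              [1, 1, 1, 0]])
# columns: [code_CHD, code_hypertension, code_diabetes]       (hypothetical labels)
Y = np.array([[1, 1, 0],
              [1, 0, 0],
              [1, 1, 1],
              [0, 0, 1],
              [1, 1, 0]])

clf = DecisionTreeClassifier(random_state=0).fit(X, Y)   # multi-label decision tree
print(clf.predict([[1, 1, 0, 1]]))                       # predicted indicator vector of codes
```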
Abstract: Information sharing and exchange, rather than information processing, is what characterizes information technology in the 21st century. Ontologies, as a shared common understanding, are gaining increasing attention, since they appear to be the most promising solution for enabling information sharing both at a semantic level and in a machine-processable way. Domain-ontology-based modeling has been exploited to provide shareability and information exchange among diversified, heterogeneous enterprise applications.
Contextual ontologies are "an explicit specification of contextual conceptualization". That is, the ontology is characterized by concepts that have multiple representations and that may exist in several contexts. Hence, contextual ontologies are a set of concepts and relationships that are seen from different perspectives. Contextualization allows ontologies to be partitioned according to their contexts.
The need for contextual ontologies in enterprise modeling has become crucial due to the nature of today's competitive market. Information resources in an enterprise are distributed and diversified and need to be shared and communicated locally through the intranet and globally through the internet.
This paper discusses the roles that ontologies play in enterprise modeling and how ontologies assist in building a conceptual model in order to provide communicative and interoperable information systems. The issue of enterprise modeling based on a contextual domain ontology is also investigated, and a framework is proposed for an enterprise model that consists of various applications.
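A toy illustration of the "multiple representations per context" idea (my own sketch, not the paper's framework; the Customer concept and its contexts are invented):

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class ContextualConcept:
    name: str
    representations: Dict[str, dict] = field(default_factory=dict)

    def add_context(self, context: str, representation: dict) -> None:
        self.representations[context] = representation

    def view(self, context: str) -> dict:
        """Return the representation of the concept as seen from one context."""
        return self.representations[context]

customer = ContextualConcept("Customer")
customer.add_context("sales",    {"attributes": ["name", "credit_limit"]})
customer.add_context("shipping", {"attributes": ["name", "delivery_address"]})
print(customer.view("sales"))   # the Customer concept from the sales perspective
```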
Abstract: This paper presents an experiment to estimate the influence of cutting conditions on the microstructural changes produced when machining austenitic 304 stainless steel, especially with worn inserts. The worn inserts were prefabricated with a wear width of 0.5 mm. The forces, temperature distribution, residual stress (RS), and microstructural changes were measured with a force dynamometer, an infrared thermal camera, X-ray diffraction (XRD), and SEM, respectively. The results showed that the different combinations of machining conditions have a significant influence on the microstructural changes of the machined surface. In addition, ANOVA and AOM were used to distinguish the respective influences of cutting speed, feed rate, and insert wear.
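As a minimal illustration of the ANOVA step only (the response values below are invented, not the paper's measurements), a one-way F-test can check whether cutting speed has a significant effect on some measured response:

```python
from scipy.stats import f_oneway

# hypothetical response values (e.g. a surface-integrity measure) at three cutting speeds
speed_low    = [4.1, 4.3, 3.9, 4.2]
speed_medium = [5.0, 5.2, 4.8, 5.1]
speed_high   = [6.3, 6.1, 6.4, 6.0]

f_stat, p_value = f_oneway(speed_low, speed_medium, speed_high)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")   # small p => speed has a significant effect
```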