Abstract: Recently, machine condition monitoring and fault
diagnosis, as part of maintenance systems, have become a global
concern due to the potential advantages to be gained from reduced
maintenance costs, improved productivity and increased machine
availability. The aim of this work is to investigate the effectiveness
of a new fault diagnosis method based on power spectral density
(PSD) of vibration signals in combination with decision trees and
fuzzy inference system (FIS). To this end, a series of studies was
conducted on an external gear hydraulic pump. After a test under
normal condition, a number of different machine defect conditions
were introduced for three working levels of pump speed (1000, 1500,
and 2000 rpm), corresponding to (i) Journal-bearing with inner face
wear (BIFW), (ii) Gear with tooth face wear (GTFW), and (iii)
Journal-bearing with inner face wear plus Gear with tooth face wear
(B&GW). Features of the PSD values of the vibration signals were
extracted using descriptive statistical parameters. The J48 algorithm
was used as a feature selection procedure to select pertinent features
from the data set. The output of the J48 algorithm was employed to
produce crisp if-then rules and membership function sets. The structure
of the FIS classifier was then defined based on the crisp sets. In order to
evaluate the proposed PSD-J48-FIS model, the data sets obtained
from vibration signals of the pump were used. Results showed that
the total classification accuracies for the 1000, 1500, and 2000 rpm
conditions were 96.42%, 100%, and 96.42%, respectively. The results
indicate that the combined PSD-J48-FIS model has the potential for
fault diagnosis of hydraulic pumps.
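As a rough illustration of the feature-extraction step described above, the sketch below estimates the PSD of a vibration signal with a plain windowed periodogram and summarizes it with descriptive statistics. This is a minimal sketch, not the authors' implementation: the Hann window, the particular statistic set, and the synthetic 50 Hz test signal are all illustrative assumptions.

```python
import numpy as np

def psd_features(signal, fs):
    """Periodogram PSD of a vibration signal, summarized by descriptive stats."""
    n = len(signal)
    spectrum = np.fft.rfft(signal * np.hanning(n))   # Hann-windowed FFT
    psd = (np.abs(spectrum) ** 2) / (fs * n)         # one-sided PSD estimate
    m, s = psd.mean(), psd.std()
    return {
        "mean": m,
        "std": s,
        "skewness": ((psd - m) ** 3).mean() / s ** 3,   # third central moment
        "kurtosis": ((psd - m) ** 4).mean() / s ** 4,   # fourth central moment
        "max": psd.max(),
    }

# Example: a 50 Hz tone in noise, standing in for a 1 kHz vibration record
rng = np.random.default_rng(1)
t = np.arange(0, 1.0, 1e-3)
x = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)
features = psd_features(x, fs=1000.0)
```

In the paper's pipeline, such a feature vector per record would then be passed to J48 for feature selection.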
Abstract: This paper discusses the characteristics of the Urdu
script and Urdu Nastaleeq, and presents a simple but novel and robust
technique for recognizing printed Urdu script without a lexicon. Urdu,
which belongs to the Arabic script family, is cursive and complex in
nature; the main complexity of Urdu compound/connected text lies not
in its connections but in the forms/shapes the characters take when
placed at the initial, middle or final position of a word. The character
recognition technique presented here exploits this inherent complexity
of the Urdu script to solve the problem. A word is scanned and
analyzed for its level of complexity; the point where the level of
complexity changes is marked as a character boundary, and the
segment is fed to a neural network. A prototype of the system has been
tested on Urdu text and currently achieves an average accuracy of
93.4%.
Abstract: This paper presents a system for tracking the movement of laparoscopic instruments based on an orthogonal arrangement of webcams and video image processing. The movements are captured with two webcams placed orthogonally inside the physical trainer. In the images, the instruments are detected using color markers placed on the distal tip of each instrument. The 3D position of the instrument tip within the workspace is obtained by the linear triangulation method. Preliminary results showed linearity and repeatability in the motion tracking, with a resolution of 0.616 mm in each axis; the accuracy of the system showed a 3D instrument positioning error of 1.009 ± 0.101 mm. This tool is a portable and low-cost alternative to traditional tracking devices and a reliable method for the objective evaluation of a surgeon's surgical skills.
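The linear triangulation step can be sketched as the standard DLT homogeneous least-squares solve. The toy projection matrices below are illustrative assumptions (the paper's actual camera calibration is not given); only the solver itself is the standard method the abstract names.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation: the 3D point X projecting to pixel uv1
    under camera matrix P1 and uv2 under P2, via homogeneous least squares."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.array([
        u1 * P1[2] - P1[0],      # each view contributes two linear
        v1 * P1[2] - P1[1],      # constraints on the homogeneous X
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)  # solution: last right singular vector of A
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Pinhole projection of a 3D point to pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Toy orthogonal rig: camera 1 at the origin, camera 2 at x=5 yawed 90 degrees
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
R = np.array([[0., 0., 1.], [0., 1., 0.], [-1., 0., 0.]])
P2 = np.hstack([R, -R @ np.array([[5.], [0.], [0.]])])
X_true = np.array([1.0, 0.5, 4.0])
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

With noise-free synthetic projections, `X_est` recovers `X_true` up to numerical precision; with real marker detections the same solve gives the least-squares 3D position.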
Abstract: Air conditioning is mainly used as a means of human
comfort cooling. It is used widely in high-temperature countries such
as Malaysia. Proper estimation of the cooling load is needed to achieve
the ideal temperature; without proper estimation, the load may be
over- or underestimated. The aim of this study is to develop a program
that calculates the ideal cooling load demand, matched to the heat
gain. Through this program, cooling load estimation becomes easy.
The objective of this study is to develop a user-friendly and easily
accessible cooling load program, so that the cooling load can be
estimated by any individual rather than by rule-of-thumb. The software
was developed using the Matlab GUI. The development is valid only
for common buildings in Malaysia. An office building was selected as
a case study to verify the applicability and accuracy of the developed
software. In conclusion, the main objective has been achieved: the
developed software is user friendly and easily estimates the cooling
load demand.
Abstract: This paper describes new computer vision algorithms
that have been developed to track moving objects as part of a
long-term study into the design of (semi-)autonomous vehicles. We
present the results of a study to exploit variable kernels for tracking in
video sequences. The basis of our work is the mean shift
object-tracking algorithm; for a moving target, it is usual to define a
rectangular target window in an initial frame, and then process the data
within that window to separate the tracked object from the background
by the mean shift segmentation algorithm. Rather than use the
standard Epanechnikov kernel, we have used a kernel weighted by the
Chamfer distance transform to improve the accuracy of target
representation and localization, minimising the distance between the
two distributions in RGB color space using the Bhattacharyya
coefficient. Experimental results show the improved tracking
capability and versatility of the algorithm in comparison with results
using the standard kernel. These algorithms are incorporated as part of
a robot test-bed architecture which has been used to demonstrate their
effectiveness.
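For reference, the Bhattacharyya coefficient that such a tracker maximizes can be sketched as below for kernel-weighted RGB histograms. The 8-bins-per-channel quantization, the uniform weights standing in for the spatial kernel, and the synthetic windows are illustrative assumptions rather than the paper's settings.

```python
import numpy as np

def color_histogram(pixels, weights, bins=8):
    """Kernel-weighted RGB histogram (bins^3 cells), normalized to sum to 1."""
    idx = (pixels // (256 // bins)).astype(int)              # per-channel bin
    flat = idx[:, 0] * bins * bins + idx[:, 1] * bins + idx[:, 2]
    hist = np.bincount(flat, weights=weights, minlength=bins ** 3)
    return hist / hist.sum()

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two discrete color distributions;
    1 means identical, 0 means disjoint support."""
    return np.sum(np.sqrt(p * q))

# Toy example: a target window and a slightly color-shifted candidate window
rng = np.random.default_rng(0)
target = rng.integers(0, 256, size=(400, 3))                 # RGB pixels
candidate = np.clip(target + rng.integers(-10, 10, size=(400, 3)), 0, 255)
w = np.ones(400)              # uniform weights stand in for the spatial kernel
q = color_histogram(target, w)
p = color_histogram(candidate, w)
rho = bhattacharyya(p, q)     # high for similar windows
```

Mean shift moves the candidate window to increase this coefficient; the Chamfer-weighted kernel described above would replace the uniform weights `w`.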
Abstract: Support vector machines (SVMs) have shown
superior performance compared to other machine learning techniques,
especially in classification problems. Yet one limitation of SVMs is
the lack of an explanation capability which is crucial in some
applications, e.g. in the medical and security domains. In this paper, a
novel approach for eclectic rule-extraction from support vector
machines is presented. This approach utilizes the knowledge acquired
by the SVM and represented in its support vectors as well as the
parameters associated with them. The approach includes three stages:
training, propositional rule-extraction and rule quality evaluation.
Results from four different experiments have demonstrated the value
of the approach for extracting comprehensible rules of high accuracy
and fidelity.
Abstract: In this paper we examine the use of global texture analysis based approaches for the purpose of Persian font recognition in machine-printed document images. Most existing methods for font recognition make use of local typographical features and connected component analysis. However, derivation of such features is not an easy task. Gabor filters are appropriate tools for texture analysis and are motivated by the human visual system. Here we consider document images as textures and use Gabor filter responses for identifying the fonts. The method is content independent and involves no local feature analysis. Two different classifiers, weighted Euclidean distance (WED) and SVM, are used for the purpose of classification. Experiments on seven different typefaces and four font styles show average accuracies of 85% with the WED and 82% with the SVM classifier over all typefaces.
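A minimal sketch of the Gabor-response texture features and the WED classifier follows. The filter parameters (15×15 kernels, four orientations, wavelength 6) and the synthetic grating image are illustrative assumptions, not the paper's actual filter bank.

```python
import numpy as np

def gabor_kernel(ksize, sigma, theta, wavelength):
    """Real Gabor kernel: a cosine grating at orientation theta under a Gaussian."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # rotate coords to orientation
    yr = -x * np.sin(theta) + y * np.cos(theta)
    gauss = np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))
    return gauss * np.cos(2 * np.pi * xr / wavelength)

def texture_features(image, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Mean absolute filter response per orientation: a tiny global texture vector.
    FFT-based circular convolution keeps the sketch dependency-free."""
    feats = []
    for theta in thetas:
        k = gabor_kernel(15, sigma=3.0, theta=theta, wavelength=6.0)
        pad = np.zeros_like(image, dtype=float)
        pad[:k.shape[0], :k.shape[1]] = k
        resp = np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(pad)).real
        feats.append(np.abs(resp).mean())
    return np.array(feats)

def wed(x, class_mean, class_var):
    """Weighted Euclidean distance of a feature vector to one font class."""
    return np.sum((x - class_mean) ** 2 / class_var)

# A horizontal grating responds most strongly to the matching orientation
img = np.tile(np.sin(2 * np.pi * np.arange(64) / 6.0)[:, None], (1, 64))
feats = texture_features(img)
```

In a full system, `wed` would be evaluated against per-font mean and variance vectors learned from training pages, and the nearest font chosen.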
Abstract: Automatic detection of bleeding is of practical
importance since capsule endoscopy produces an extremely large
number of images. Algorithm development of bleeding detection in
the digestive tract is difficult due to different contrasts among the
images, food dregs, secretion and other factors. In this study,
weighting factors were derived from independent features of the
contrast and brightness between bleeding and normal regions. Spectral
analysis based on these weighting factors was fast and accurate. Results
were a sensitivity of 87% and a specificity of 90% when the accuracy
was determined for each pixel out of 42 endoscope images.
Abstract: Motion estimation is a key problem in video
processing and computer vision. Optical flow motion estimation can
achieve high estimation accuracy when the motion vector is small.
The three-step search algorithm can handle large motion vectors but
is not very accurate. A joint algorithm is proposed in this paper to
achieve high estimation accuracy regardless of whether the motion
vector is small or large, while keeping the computation cost much
lower than that of a full search.
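A minimal sketch of the three-step search component, under the usual SAD block-matching formulation; the 8×8 block size and the toy frames are illustrative assumptions:

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two equally sized blocks."""
    return np.abs(a.astype(int) - b.astype(int)).sum()

def three_step_search(ref, cur, top, left, size=8):
    """Find the displacement (dy, dx) of the block cur[top:top+size, left:left+size]
    in the reference frame: probe 8 neighbours at step 4, then 2, then 1,
    re-centring the search on the best candidate after each step."""
    block = cur[top:top + size, left:left + size]
    best = (0, 0)
    best_cost = sad(block, ref[top:top + size, left:left + size])
    step = 4
    while step >= 1:
        cy, cx = best
        for dy in (-step, 0, step):
            for dx in (-step, 0, step):
                y, x = top + cy + dy, left + cx + dx
                if y < 0 or x < 0 or y + size > ref.shape[0] or x + size > ref.shape[1]:
                    continue  # candidate window falls outside the frame
                cost = sad(block, ref[y:y + size, x:x + size])
                if cost < best_cost:
                    best_cost, best = cost, (cy + dy, cx + dx)
        step //= 2
    return best

# Toy frames: an 8x8 textured patch moves by (dy, dx) = (2, 3) between frames
patch = np.arange(64).reshape(8, 8)
ref = np.zeros((32, 32)); ref[10:18, 10:18] = patch   # reference frame
cur = np.zeros((32, 32)); cur[8:16, 7:15] = patch     # current frame
mv = three_step_search(ref, cur, top=8, left=7)       # recovers (2, 3)
```

A joint scheme of the kind the abstract describes would use this coarse vector to initialize an optical flow refinement, which handles the sub-pixel residual.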
Abstract: One of the most important causes of accidents is
driver fatigue. To reduce the accident rate, the driver needs a
quick nap when feeling sleepy. Hence, searching for the minimum
time period of nap is a very challenging problem. The purpose of
this paper is twofold, i.e. to investigate the possible fastest time
period for nap and its relationship with stage 2 sleep, and to
develop an automatic stage 2 sleep detection and alarm device. The
experiment for this investigation is designed with 21 subjects. It
yields the result that waking up the subjects after getting into stage
2 sleep for 3-5 minutes can efficiently reduce the sleepiness.
Furthermore, the automatic stage 2 sleep detection and alarm
device yields a real-time detection accuracy of approximately
85%, which is comparable with that of a commercial sleep lab system.
Abstract: This paper discusses a method for improving accuracy
of fuzzy-rule-based classifiers using particle swarm optimization
(PSO). Two different fuzzy classifiers are considered and optimized.
The first classifier is based on Mamdani fuzzy inference system
(M_PSO fuzzy classifier). The second classifier is based on Takagi-
Sugeno fuzzy inference system (TS_PSO fuzzy classifier). The
parameters of the proposed fuzzy classifiers including premise
(antecedent) parameters, consequent parameters and structure of
fuzzy rules are optimized using PSO. Experimental results show that
higher classification accuracy can be obtained with a lower number
of fuzzy rules by using the proposed PSO fuzzy classifiers. The
performances of M_PSO and TS_PSO fuzzy classifiers are compared
to those of other fuzzy-based classifiers.
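As background, the PSO update that drives this kind of parameter tuning can be sketched generically. The sphere objective below merely stands in for a fuzzy classifier's error rate, and the inertia/acceleration constants are common textbook values, not the paper's settings.

```python
import random

def pso(objective, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0)):
    """Minimal particle swarm optimizer: each particle is pulled toward its
    personal best and the swarm's global best (inertia w, accelerations c1, c2)."""
    lo, hi = bounds
    w, c1, c2 = 0.7, 1.5, 1.5
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(0)
# The sphere function stands in for a classifier's error surface
best, best_val = pso(lambda p: sum(x * x for x in p), dim=2)
```

In the paper's setting, the particle position would encode the membership function and rule parameters, and the objective would be the classifier's misclassification rate.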
Abstract: Iris localization is a very important approach in
biometric identification systems. The identification process is usually
implemented in three stages: iris localization, feature extraction, and
finally pattern matching. The accuracy of iris localization, as the first
step, affects all subsequent stages, which shows the importance of iris
localization in an iris-based biometric system. In this paper, we
consider Daugman iris localization method as a standard method,
propose a new method in this field and then analyze and compare the
results of them on a standard set of iris images. The proposed method
is based on the detection of circular edge of iris, and improved by
fuzzy circles and surface energy difference contexts. The method is
easy to implement and, compared to other methods, offers rather high
accuracy and speed. Test results show that the accuracy of our
proposed method is about that of the Daugman method, while its
computation speed is 10 times faster.
Abstract: This paper presents the elastic buckling of
homogeneous beams with a pair of piezoelectric layers surface-bonded
on both sides of the beams. The displacement field of the beam is
assumed based on the Engesser-Timoshenko beam theory.
Applying the Hamilton's principle, the equilibrium equation is
established. The influences of applied voltage, dimensionless
geometrical parameter and piezoelectric thickness on the critical
buckling load of the beam are presented. To verify the accuracy of
the present analysis, a comparison study is carried out against
known data.
Abstract: Artificial Intelligence (AI) methods are increasingly being used for problem solving. This paper concerns using AI-type learning machines for the power quality problem, a problem of general interest to power systems, which must provide quality power to all appliances. Electrical power of good quality is essential for the proper operation of electronic equipment such as computers and PLCs. Malfunction of such equipment may lead to loss of production or disruption of critical services, resulting in huge financial and other losses. It is therefore necessary that critical loads be supplied with electricity of acceptable quality. Recognizing the presence of a disturbance and classifying it into a particular type is the first step in combating the problem. In this work two classes of AI methods for power quality data mining are studied: Artificial Neural Networks (ANNs) and Support Vector Machines (SVMs). We show that SVMs are superior to ANNs in two critical respects: SVMs train and run an order of magnitude faster, and SVMs give higher classification accuracy.
Abstract: Snow cover is an important phenomenon in
hydrology, hence modeling the snow accumulation and melting is an
important issue in places where snowmelt significantly contributes to
runoff and has significant effect on water balance. The physics-based
models are invariably distributed, with the basin disaggregated into
zones or grid cells. Satellite images provide valuable data to verify
the accuracy of spatially distributed model outputs. In this study a
spatially distributed physically based model (WetSpa) was applied to
predict snow cover and melting in the Latyan dam watershed in Iran.
Snowmelt is simulated based on an energy balance approach. The
model is applied and calibrated with one year of observed daily
precipitation, air temperature, wind speed, and daily potential
evaporation. The predicted snow-covered area is compared with
remotely sensed images (MODIS). The results show that the simulated
snow-covered area (SCA) is in good agreement with the SCA derived
from the MODIS images. The model performance
is also tested by statistical and graphical comparison of simulated and
measured discharges entering the Latyan dam reservoir.
Abstract: Nowadays, the rapid development of multimedia
and the internet allows for wide distribution of digital media data.
It has become much easier to edit, modify and duplicate digital
information. Digital documents are also easy to copy and
distribute, and are therefore exposed to many threats. This is a
serious security and privacy issue: with the large flood of
information and the development of digital formats, it has become
necessary to find appropriate protection because of the significance,
accuracy and sensitivity of the information. Protection systems can be
classified more specifically into information hiding, information
encryption, and combinations of hiding and encryption that increase
information security. The strength of the science of information
hiding lies in the absence of standard algorithms for hiding secret
messages. Hiding methods are also diverse, for example combining
several media (covers) with different methods to pass a secret
message. In addition, there are no formal methods for discovering
hidden data. For these reasons, the task of this research is difficult.
In this paper, a new information hiding system is presented. The
proposed system aims to hide information (a data file) in an
executable file (EXE) and to detect the hidden file; we present the
implementation of a steganography system which embeds information
in executable (EXE) files, which have been investigated. The system
tries to find a solution to the size of the cover file while making it
undetectable by anti-virus software. The system includes two main
functions: the first is hiding the information in a Portable Executable
file (EXE) through four processes (specify the cover file, specify the
information file, encrypt the information, and hide the information);
the second is extracting the hidden information through three
processes (specify the stego file, extract the information, and decrypt
the information). The system has achieved its main goals: the size of
the cover file is independent of the size of the hidden information,
and the resulting file does not cause any conflict with anti-virus
software.
Abstract: This study analyzes the effect of discretization on
classification of datasets that include continuous-valued features. Six
datasets from the UCI repository containing continuous-valued
features are discretized with an entropy-based discretization method.
The performance of the datasets with the original features and of the
datasets with discretized features is compared using the k-nearest
neighbors, Naive Bayes, C4.5 and CN2 data mining classification
algorithms. As a result, the classification accuracies on the six
datasets improve by 1.71% to 12.31% on average.
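One step of the entropy-based splitting used by such discretizers can be sketched as follows. This is a single binary cut in the style of Fayyad and Irani's method; the full method applies it recursively with an MDL stopping criterion, which is omitted here, and the toy data are illustrative.

```python
import math

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def best_split(values, labels):
    """One entropy-based binary cut on a continuous feature: choose the
    boundary that minimizes the weighted class entropy of the two bins."""
    pairs = sorted(zip(values, labels))
    xs = [v for v, _ in pairs]
    ys = [y for _, y in pairs]
    n = len(pairs)
    best_cut, best_e = None, float("inf")
    for i in range(1, n):
        if xs[i] == xs[i - 1]:
            continue                      # no boundary between equal values
        e = (i / n) * entropy(ys[:i]) + ((n - i) / n) * entropy(ys[i:])
        if e < best_e:
            best_e, best_cut = e, (xs[i - 1] + xs[i]) / 2
    return best_cut

values = [1.0, 1.2, 1.4, 5.0, 5.2, 5.5]
labels = ["a", "a", "a", "b", "b", "b"]
cut = best_split(values, labels)          # cut halfway between 1.4 and 5.0
```

Each continuous feature is replaced by interval labels induced by such cuts before the classifiers are trained.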
Abstract: In this study, direct numerical simulation of bubble condensation in subcooled boiling flow was performed. The main goal was to develop CFD modeling for bubble condensation and to evaluate the accuracy of the VOF model with the developed modeling. CFD modeling for bubble condensation was developed by modeling the source terms in the governing equations of the VOF model using UDFs. In the modeling, the amount of condensation was determined at every time step using the interfacial heat transfer coefficient obtained from the bubble velocity, liquid temperature and bubble diameter. To evaluate the VOF model with the CFD modeling for bubble condensation, the CFD simulation results were compared with SNU experimental results such as bubble volume and shape, interfacial area, bubble diameter and bubble velocity. The simulation results predicted the behavior of the actual condensing bubble well. Therefore, it can be concluded that the VOF model with the developed CFD modeling for bubble condensation will be a useful computational fluid dynamics tool for analyzing the behavior of condensing bubbles over a wide range of subcooled boiling flows.
Abstract: This paper introduces the application of the seismic wave method to earthquake prediction and early estimation. The advantages of the seismic wave method over traditional earthquake prediction methods are demonstrated. An example is presented in this study to show the accuracy and efficiency of using the seismic wave method to predict a medium-sized earthquake swarm that occurred in Wencheng, Zhejiang, China. By applying this method, correct predictions were made on the day after the earthquake swarm started and on the day the maximum earthquake occurred, which provided a scientific basis for governmental decision-making.
Abstract: Nowadays, predicting the political risk level of a country
has become a critical issue for investors who intend to obtain
accurate information concerning the stability of business
environments. Since investors are most often laymen and not
professional IT personnel, this paper proposes a framework named
GECR to help non-expert persons discover political risk stability
across time based on political news and events.
To achieve this goal, the Bayesian network approach was applied
to a sample dataset of 186 political news items from Pakistan.
Bayesian networks, an artificial intelligence approach, were
employed in the presented framework, since they are a powerful
technique for modeling uncertain domains. The results showed
that our framework, with Bayesian networks as the decision
support tool, predicted the political risk level with a high degree of
accuracy.