Abstract: This paper describes a text mining technique for automatically extracting association rules from collections of textual documents. The technique, called Extracting Association Rules from Text (EART), relies on keyword features to discover association rules among the keywords labeling the documents. In this work, the EART system ignores the order in which the words occur, focusing instead on the words and their statistical distributions in documents. The main contributions of the technique are that it integrates XML technology with an Information Retrieval scheme (TFIDF) for keyword/feature selection, which automatically selects the most discriminative keywords for use in association rule generation, and that it uses a data mining technique for association rule discovery. It consists of three phases: a Text Preprocessing phase (transformation, filtration, stemming and indexing of the documents), an Association Rule Mining (ARM) phase (applying our designed algorithm for Generating Association Rules based on a Weighting scheme, GARW) and a Visualization phase (visualization of results). Experiments were applied to web-page news documents related to the outbreak of the bird flu disease. The extracted association rules contain important features and describe the informative news included in the document collection. The performance of the EART system was compared with that of another system that uses the Apriori algorithm, in terms of execution time and the quality of the extracted association rules.
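The TFIDF weighting that drives the keyword selection can be sketched in a few lines; the toy documents and keywords below are illustrative assumptions, not data from the EART system.

```python
import math

# Toy corpus (hypothetical, for illustration only).
docs = {
    "d1": ["bird", "flu", "outbreak", "flu"],
    "d2": ["flu", "vaccine"],
    "d3": ["weather", "report"],
}

def tfidf(term, doc_id, docs):
    """TF-IDF weight of a keyword in one document."""
    words = docs[doc_id]
    tf = words.count(term) / len(words)                 # term frequency
    df = sum(1 for ws in docs.values() if term in ws)   # document frequency
    return tf * math.log(len(docs) / df)                # tf * inverse document frequency

# A term confined to one document outweighs one spread across the corpus,
# so the most discriminative keywords survive for rule generation.
```

Keywords whose weight exceeds a chosen threshold would then feed an association-rule generation step such as the GARW phase described above.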
Abstract: An important step in studying the statistics of
fingerprint minutia features is to reliably extract minutia features from
the fingerprint images. A new, reliable computational method for
minutiae feature extraction from fingerprint images is presented. A
fingerprint image is treated as a textured image. An orientation flow
field of the ridges is computed for the fingerprint image. To
accurately locate ridges, a new ridge-orientation-based computation
method is proposed. After ridge segmentation, a new computational
method is proposed for smoothing the ridges. The ridge skeleton
image is obtained and then smoothed using morphological operators
to detect the features. A post-processing stage eliminates a large
number of false features from the detected set of minutiae features.
The detected features are observed to be reliable and accurate.
Abstract: We present a structural study of an aqueous electrolyte for
which experimental results are available: a solution of LiCl-6H2O type
in the glassy state (120 K), contrasted with pure water at room temperature
by means of Partial Distribution Functions (PDF) obtained from the neutron
scattering technique. Based on these partial functions, the Reverse
Monte Carlo method (RMC) computes radial and angular correlation
functions which allow exploring a number of structural features of
the system. The obtained curves include some artifacts. To remedy
this, we propose to introduce a screened potential as an additional
constraint. The obtained results show good agreement between
experimental and computed functions and a significant improvement
in the PDF curves with the potential constraint, suggesting an efficient
fit of the pair distribution function curves.
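The kind of radial correlation function produced from such configurations can be sketched as a g(r) histogram over particle positions; the random (ideal-gas-like) box below is an assumption for illustration, and no RMC moves are performed.

```python
import numpy as np

rng = np.random.default_rng(2)
L = 10.0                                   # cubic box edge (arbitrary units)
pos = rng.uniform(0, L, size=(300, 3))     # uncorrelated toy configuration

def g_of_r(pos, L, bins=20, r_max=3.0):
    """Radial distribution function g(r) with periodic boundaries."""
    n = len(pos)
    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)                       # minimum-image convention
    r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(n, 1)]
    hist, edges = np.histogram(r, bins=bins, range=(0.0, r_max))
    rho = n / L ** 3                               # number density
    shells = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    return hist / (0.5 * n * rho * shells)         # normalize by ideal-gas counts

g = g_of_r(pos, L)
# For an uncorrelated configuration g(r) fluctuates around 1; in a real
# liquid, structure appears as peaks, and an RMC fit moves particles until
# the computed g(r) matches the experimental one.
```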
Abstract: Overcurrent (OC) relays are the major protection
devices in a distribution system. The operating times of the OC relays
must be coordinated properly to avoid mal-operation of the
backup relays. The OC relay time coordination in ring fed
distribution networks is a highly constrained optimization problem
which can be stated as a linear programming problem (LPP). The
purpose is to find an optimum relay setting to minimize the time of
operation of relays and at the same time, to keep the relays properly
coordinated to avoid the mal-operation of relays.
This paper presents a two-phase simplex method for optimum time
coordination of OC relays. The method is based on the simplex
algorithm, which is used to find the optimum solution of the LPP. The
method introduces artificial variables to get an initial basic feasible
solution (IBFS). The artificial variables are removed using the iterative
process of the first phase, which minimizes the auxiliary objective
function. The second phase minimizes the original objective function
and gives the optimum time coordination of OC relays.
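As a hedged sketch of the LPP formulation (not the authors' implementation), a minimal two-relay problem can be written down and handed to an off-the-shelf solver; the CTI value and minimum operating times are assumed for the example, and the solver's internal feasibility phase stands in for the artificial-variable first phase described above.

```python
from scipy.optimize import linprog

# Minimize t1 + t2 (primary and backup relay operating times) subject to the
# coordination constraint t2 >= t1 + CTI and assumed minimum operating times.
CTI = 0.3                         # coordination time interval, seconds (assumed)
c = [1.0, 1.0]                    # objective: total operating time
A_ub = [[1.0, -1.0]]              # t1 - t2 <= -CTI  <=>  t2 >= t1 + CTI
b_ub = [-CTI]
bounds = [(0.1, None), (0.1, None)]   # minimum operating times (assumed)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
# Optimum: t1 = 0.1 s, t2 = 0.4 s — the backup waits exactly CTI after the primary.
```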
Abstract: Independent component analysis (ICA) in the
frequency domain is used for solving the problem of blind source
separation (BSS). However, this method has some problems. For
example, a general ICA algorithm cannot determine the permutation
of signals, which is important in frequency-domain ICA. In this
paper, we propose an approach to solving the permutation
problem. The idea is to effectively combine two conventional
approaches. This approach improves the signal separation
performance by exploiting features of the conventional approaches.
We show the simulation results using artificial data.
Abstract: A filter is used to remove undesirable frequency information from a dynamic signal. This paper shows that the Z-notch filtering technique can be applied to remove noise from a machining signal. In machining, the noise components were identified from the sound produced by the operation of the machine components themselves, such as the hydraulic system, the motor and the machine environment. By correlating the noise components with the measured machining signal, the components of interest of the measured machining signal, which were less interfered with by the noise, can be extracted. Thus, the filtered signal is more reliable for analysis in terms of noise content than the unfiltered signal. Significantly, the I-kaz method, which comprises a three-dimensional graphical representation and the I-kaz coefficient Z∞, could differentiate between the filtered and the unfiltered signal. A larger scattering space and a higher value of Z∞ indicated that the signal was highly corrupted by noise. This method can be utilised as a proactive tool for evaluating the noise content in a signal. The evaluation of noise content is as important as its elimination, especially for machining operation fault diagnosis. The Z-notch filtering technique was reliable in extracting noise components from the measured machining signal with high efficiency. Even though the measured signal was exposed to high noise disruption, the signal generated by the interaction between the cutting tool and the workpiece could still be acquired. Therefore, the noise that could change the original signal features and consequently deteriorate the useful sensory information can be eliminated.
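The Z-notch filter itself is not specified here, so as a hedged stand-in the same idea can be shown with a standard IIR notch filter: an identified machine-noise frequency is removed from the measured signal. The 50 Hz noise component, the 120 Hz "machining" component and the sampling rate are all assumptions for the example.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 1000.0                                  # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
signal = np.sin(2 * np.pi * 120 * t)         # component of interest (assumed)
noise = 0.8 * np.sin(2 * np.pi * 50 * t)     # machine-borne noise (assumed)
measured = signal + noise

b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)      # notch centred on the noise frequency
filtered = filtfilt(b, a, measured)          # zero-phase filtering

mse_before = np.mean((measured - signal) ** 2)
mse_after = np.mean((filtered - signal) ** 2)
# The filtered signal is far closer to the noise-free component, which is
# what makes a noise-content evaluation (e.g. with I-kaz) meaningful.
```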
Abstract: A human verification system is presented in this
paper. The system consists of several steps: background subtraction,
thresholding, line connection, region growing, morphology, star
skeletonization, feature extraction, feature matching, and decision
making. The proposed system combines the advantages of star
skeletonization and simple statistical features. Correlation matching
and probability voting have been used for verification, followed by a
logical operation in a decision making stage. The proposed system
uses a small number of features, and the system's reliability is
convincing.
Abstract: The first generation of Mobile Agent based Intrusion
Detection Systems had only two components, namely data collection
and a single centralized analyzer. The disadvantage of this type of
intrusion detection is that if the connection to the analyzer fails, the entire
system becomes useless. In this work, we propose a novel hybrid
model for Mobile Agent based Distributed Intrusion Detection
System to overcome the current problem. The proposed model has
new features such as robustness, capability of detecting intrusion
against the IDS itself and the capability of updating itself to detect new
patterns of intrusion. In addition, our proposed model is also capable
of tackling some of the weaknesses of centralized Intrusion Detection
System models.
Abstract: This paper is a survey of current component-based
software technologies and a description of promotion and
inhibition factors in CBSE. The features that software components
inherit are also discussed, and Quality Assurance issues in component-based
software are addressed. The research on the quality
model of component-based systems starts with the study of what
components are, CBSE, its development life cycle and the pros and
cons of CBSE. Various attributes are studied and compared, keeping
in view existing models for general systems and
CBS. When illustrating the quality of a software component, an apt
set of quality attributes for describing the system (or its
components) should be selected. Finally, the research issues that can
be extended are tabulated.
Abstract: Bluetooth is a personal wireless communication
technology and is being applied in many scenarios. It is an emerging
standard for short range, low cost, low power wireless access
technology. Existing MAC (Medium Access Control)
scheduling schemes only provide best-effort service for all master-slave
connections. It is very challenging to provide QoS (Quality of
Service) support for different connections due to the Master-Driven
TDD (Time Division Duplex) feature. Moreover, no solution is
available that supports both the delay and bandwidth guarantees
required by real time applications. This paper addresses the issue of
how to enhance QoS support in a Bluetooth piconet. The Bluetooth
specification proposes a Round Robin scheduler as a possible solution
for scheduling the transmissions in a Bluetooth piconet. We propose
an algorithm which reduces bandwidth waste and enhances the
efficiency of the network. We define token counters to estimate the traffic of
real-time slaves. To increase bandwidth utilization, a back-off
mechanism is then presented for best-effort slaves to decrease the
frequency of polling idle slaves. Simulation results demonstrate that
our scheme achieves better performance over the Round Robin
scheduling.
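The back-off half of the proposed idea can be caricatured in a few lines; the class, the exponential back-off cap and the cycle structure are illustrative assumptions, not the paper's algorithm, and the token-counter traffic estimation for real-time slaves is omitted.

```python
class Slave:
    """Illustrative piconet slave (hypothetical model)."""
    def __init__(self, name, real_time):
        self.name = name
        self.real_time = real_time
        self.backoff = 1   # polling interval for idle best-effort slaves
        self.skip = 0      # remaining cycles to skip

def poll_cycle(slaves):
    """Return the slaves the master polls in one cycle."""
    polled = []
    for s in slaves:
        if s.real_time:
            polled.append(s)                    # real-time slaves: polled every cycle
        elif s.skip > 0:
            s.skip -= 1                         # backing off: slots are saved
        else:
            polled.append(s)
            s.backoff = min(s.backoff * 2, 8)   # idle -> exponential back-off (capped)
            s.skip = s.backoff - 1
    return polled

slaves = [Slave("rt1", True), Slave("be1", False)]
cycles = [[s.name for s in poll_cycle(slaves)] for _ in range(4)]
# rt1 appears in every cycle; the idle be1 is polled less and less often,
# freeing slots and reducing bandwidth waste.
```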
Abstract: In this paper we study a system composed of a carbon
nanotube (CNT) and a bundle of carbon nanotubes (BuCNT) interacting
with a specific fatty acid as a molecular probe. The full system is
represented by an open nanotube (or nanotubes) and linoleic acid
(LA) relaxing due to the interaction with the CNT and BuCNT. LA has
an asymmetric shape with a COOH termination, promoting
a close BuCNT interaction mainly through the van der Waals force field. The
simulations were performed by classical molecular dynamics with
standard parameterizations.
Our results show that the BuCNT and CNT are dynamically
stable and exhibit a preferential interaction position with LA,
resulting in three features: (i) when LA is interacting with the CNT
and BuCNT (with either termination, CH2 or COOH), the LA is
repelled; (ii) when the LA terminated with CH2 is close to the open
extremity of the BuCNT, the LA is also repelled by the interaction
between them; and (iii) when the LA terminated with COOH is
close to the open extremity of the BuCNT, the LA is encapsulated by the
BuCNT. These simulations are part of a more extensive work on
the search for efficient selective molecular devices and could be useful
in reaching this goal.
Abstract: Random Forests are a powerful classification technique, consisting of a collection of decision trees. One useful feature of Random Forests is the ability to determine the importance of each variable in predicting the outcome. This is done by permuting each variable and computing the change in prediction accuracy before and after the permutation. This variable importance calculation is similar to a one-factor-at-a-time experiment and is therefore inefficient. In this paper, we use a regular fractional factorial design to determine which variables to permute. Based on the results of the trials in the experiment, we calculate the individual importance of the variables with improved precision over the standard method. The method is illustrated with a study of student attrition at Monash University.
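The baseline one-variable-at-a-time permutation importance that the paper improves on can be sketched as follows; the toy predictor below stands in for a trained Random Forest, the data are synthetic, and the fractional factorial variant is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] > 0).astype(int)        # only variable 0 carries signal

def model(X):
    """Stand-in for a trained Random Forest (assumed perfect on variable 0)."""
    return (X[:, 0] > 0).astype(int)

def permutation_importance(X, y, var):
    """Drop in accuracy after permuting one variable = its importance."""
    base = np.mean(model(X) == y)
    Xp = X.copy()
    Xp[:, var] = rng.permutation(Xp[:, var])
    return base - np.mean(model(Xp) == y)

importances = [permutation_importance(X, y, j) for j in range(3)]
# Permuting variable 0 destroys the prediction; variables 1 and 2 are inert,
# so their measured importance is exactly zero here.
```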
Abstract: This paper presents image compression with a wavelet-based method. The wavelet transformation divides an image into low-pass and high-pass filtered parts. The traditional JPEG compression technique requires lower computation power with feasible losses when only compression is needed. However, there is an obvious need for wavelet-based methods in certain circumstances. These methods are intended for applications in which image analysis is done in parallel with compression. Furthermore, the high-frequency bands can be used to detect changes or edges. Wavelets enable hierarchical analysis of low-pass filtered sub-images: the first analysis can be done on a small image, and only if anything of interest is found is the whole image processed or reconstructed.
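The low-/high-pass split can be sketched with a one-level 1-D Haar transform (a 2-D image transform applies the same split along rows and then columns); the Haar family is an assumption chosen for brevity, since the abstract does not fix a wavelet.

```python
import numpy as np

def haar_1d(x):
    """One-level Haar analysis: low-pass (approximation) and high-pass (detail)."""
    x = np.asarray(x, dtype=float)
    low = (x[0::2] + x[1::2]) / np.sqrt(2)
    high = (x[0::2] - x[1::2]) / np.sqrt(2)
    return low, high

def haar_1d_inverse(low, high):
    """Perfect reconstruction from the two bands."""
    x = np.empty(2 * len(low))
    x[0::2] = (low + high) / np.sqrt(2)
    x[1::2] = (low - high) / np.sqrt(2)
    return x

x = np.array([4.0, 4.0, 8.0, 8.0])
low, high = haar_1d(x)
# A locally constant signal puts all its energy in the low band (high = 0),
# so analysis can proceed on the small low-pass sub-image while the detail
# band flags changes or edges.
```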
Abstract: The social force model, which belongs to the
microscopic pedestrian studies, has been considered preeminent by
many researchers due to its main feature of reproducing the
self-organized phenomena resulting from pedestrian dynamics. The
preferred force, which is a measure of a pedestrian's motivation to
adapt his actual velocity to his desired velocity, is an essential term on
which the model was set up. This force has gone through stages of
development: first of all, Helbing and Molnar (1995) have modeled
the original force for the normal situation. Second, Helbing and his
co-workers (2000) have incorporated the panic situation into this
force by adding a panic parameter to account for panic
situations. Third, Lakoba and Kaup (2005) have provided the
pedestrians some kind of intelligence by incorporating aspects of the
decision-making capability. In this paper, the authors analyze the
most important incorporations into the model regarding the preferred
force. They make comparisons between the different factors of these
incorporations. Furthermore, to enhance the decision-making ability
of the pedestrians, they introduce additional features, such as a
familiarity factor, into the preferred force to make it more
representative of what actually happens in reality.
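The original 1995 driving ("preferred") force discussed above has the well-known form f = m (v0 e − v) / τ; a direct transcription with illustrative parameter values:

```python
import numpy as np

def preferred_force(v, v0, e, tau, mass=1.0):
    """Helbing-Molnar (1995) driving force: relax velocity v toward v0 * e."""
    return mass * (v0 * np.asarray(e) - np.asarray(v)) / tau

v = np.array([0.5, 0.0])   # actual velocity, m/s (illustrative)
e = np.array([1.0, 0.0])   # desired walking direction (unit vector)
f = preferred_force(v, v0=1.34, e=e, tau=0.5)
# f = (1.34 - 0.5) / 0.5 = 1.68 along x: the pedestrian accelerates toward
# the desired velocity over the relaxation time tau.
```

The later panic and familiarity extensions modify the desired speed v0 and direction e; the relaxation structure of the term stays the same.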
Abstract: Workload and resource management are two essential functions provided at the service level of the grid software infrastructure. To improve the global throughput of these software environments, workloads have to be evenly scheduled among the available resources. To realize this goal, several load balancing strategies and algorithms have been proposed. Most strategies were developed with a homogeneous set of sites, linked by homogeneous and fast networks, in mind. However, for computational grids we must address new issues, namely heterogeneity, scalability and adaptability. In this paper, we propose a layered algorithm which achieves dynamic load balancing in grid computing. Based on a tree model, our algorithm presents the following main features: (i) it is layered; (ii) it supports heterogeneity and scalability; and (iii) it is totally independent of any physical architecture of a grid.
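One intra-level balancing step in such a tree model might look as follows; this is an illustrative simplification (integer work units, a single parent and its children), not the paper's layered algorithm.

```python
def balance(loads):
    """Even out the child sites' workloads around the mean."""
    total, n = sum(loads), len(loads)
    base, extra = divmod(total, n)
    # hand the remainder out one unit at a time so loads stay integral
    return [base + (1 if i < extra else 0) for i in range(n)]

loads = [12, 3, 7, 2]       # heterogeneous child sites reporting their load
balanced = balance(loads)   # -> [6, 6, 6, 6]
# A layered scheme repeats this at each tree level, so no step depends on
# the grid's physical architecture.
```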
Abstract: It is sometimes difficult to differentiate between
innocent murmurs and pathological murmurs during auscultation. In
these difficult cases, an intelligent stethoscope with decision support
abilities would be of great value. In this study, using a dog model,
phonocardiographic recordings were obtained from 27 boxer dogs
with various degrees of aortic stenosis (AS) severity. As a reference
for severity assessment, continuous wave Doppler was used. The data
were analyzed with recurrence quantification analysis (RQA), with
the aim of finding features able to distinguish innocent murmurs from
murmurs caused by AS. Four out of eight investigated RQA features
showed significant differences between innocent murmurs and
pathological murmurs. Using a plain linear discriminant analysis
classifier, the best pair of features (recurrence rate and entropy)
resulted in a sensitivity of 90% and a specificity of 88%. In
conclusion, RQA provides valid features which can be used for
differentiation between innocent murmurs and murmurs caused by
AS.
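One of the RQA features above, the recurrence rate, is the density of recurrence points; for a 1-D series (no embedding, illustrative threshold and signals) it can be computed as:

```python
import numpy as np

def recurrence_rate(x, eps):
    """Fraction of state pairs closer than eps (density of the recurrence matrix)."""
    x = np.asarray(x, dtype=float)
    R = np.abs(x[:, None] - x[None, :]) <= eps
    return R.mean()

periodic = np.sin(np.linspace(0, 8 * np.pi, 200))   # periodic toy signal
noisy = np.random.default_rng(1).normal(size=200)   # noise-like toy signal
rr_periodic = recurrence_rate(periodic, eps=0.1)
rr_noisy = recurrence_rate(noisy, eps=0.1)
# A periodic signal revisits its states far more often, so its recurrence
# rate is higher at the same threshold — the kind of separation a murmur
# classifier can exploit.
```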
Abstract: Compensating physiological motion in the context
of minimally invasive cardiac surgery has become an attractive
issue since it outperforms traditional cardiac procedures offering
remarkable benefits. Owing to space restrictions, computer vision
techniques have proven to be the most practical and suitable solution.
However, the lack of robustness and efficiency of existing methods
make physiological motion compensation an open and challenging
problem. This work focuses on increasing robustness and efficiency
via exploration of the classes of ℓ1- and ℓ2-regularized optimization,
emphasizing the use of explicit regularization. Both approaches are
based on natural features of the heart using intensity information.
Results pointed to the ℓ1-regularized optimization class as the best,
since it offered the lowest computational cost and the smallest average
error, and it proved to work even under complex deformations.
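The practical contrast between the two classes can be seen in the simplest (orthonormal-design) penalized least-squares setting, where both solutions have closed forms; this is a generic illustration, not the paper's motion-compensation formulation.

```python
import numpy as np

def l2_solution(b, lam):
    """Ridge: argmin 0.5*(x-b)^2 + 0.5*lam*x^2  ->  uniform shrinkage."""
    return b / (1.0 + lam)

def l1_solution(b, lam):
    """Lasso: argmin 0.5*(x-b)^2 + lam*|x|  ->  soft thresholding."""
    return np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)

b = np.array([3.0, 0.2, -1.5])
# With lam = 0.5, the l1 solution zeroes the small coefficient entirely,
# while the l2 solution only shrinks it — the sparsity that tends to make
# l1-regularized solvers cheaper per iteration.
```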
Abstract: Several methodologies for providing maps of surface,
rill and gully erosion features were compared in
research which took place in the Varamin sub-basin, north-east
Tehran, Iran. A photomorphic unit map was produced from
processed satellite images, and four other maps were prepared by the
integration of different data layers, including slope, plant cover,
geology, land use, rock erodibility and land units. Comparison of
ground-truth maps of erosion types with the working unit maps indicated
that the integration of the land use, land unit and rock erodibility layers
with the satellite-image photomorphic unit map provides the best
method of producing erosion-type maps.
Abstract: The gasoline octane number is the standard measure of
the anti-knock properties of a motor fuel. Platforming, one of the
important unit operations in oil refineries, requires the octane
number to be determined, either by online measurement or with CFR
(Cooperative Fuel Research) engines. Online measurement of the
octane number can be done using direct octane number analyzers,
but these are too expensive, so a feasible alternative, such as an
ANFIS estimator, has to be found.
ANFIS is a system in which a neural network is incorporated into a
fuzzy system, learning automatically from data via the learning
algorithms of NNs. ANFIS constructs an input-output mapping based
both on human knowledge and on generated input-output data pairs.
In this research, 31 industrial data sets are used (21 for training
and the rest for generalization). The results show that, in this
simulation, the hybrid training algorithm in ANFIS gives good
agreement between the industrial data and the simulated results.
Abstract: One-way functions are functions that are easy to
compute but hard to invert. Their existence is an open conjecture; it
would imply the existence of intractable problems (i.e. NP-problems
which are not in the P complexity class).
If true, the existence of one-way functions would have an impact
on the theoretical framework of physics, in particular quantum
mechanics. This aspect of one-way functions has never been shown
before.
In the present work, we put forward the following.
We can calculate the microscopic state (say, the particle spin in the
z direction) of a macroscopic system (a measuring apparatus
registering the particle z-spin) by the system macroscopic state (the
apparatus output); let us call this association the function F. The
question is: can we compute the function F in the inverse direction?
In other words, can we compute the macroscopic state of the system
through its microscopic state (the preimage F⁻¹)?
In the paper, we assume that the function F is a one-way function.
The assumption implies that at the macroscopic level the Schrödinger
equation becomes infeasible to compute. This infeasibility plays the
role of a limit on the validity of the linear Schrödinger equation.
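The easy-to-compute/hard-to-invert asymmetry underlying the argument can be illustrated with the standard toy example of multiplication versus factoring; this is a generic sketch, not the measurement function F of the paper.

```python
def forward(p, q):
    """Easy direction: one multiplication."""
    return p * q

def invert(n):
    """Hard direction: recover a factor by trial division, O(sqrt(n)) steps."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return 1, n        # n is prime

n = forward(101, 103)  # instant, even for very large primes
# invert(n) must scan divisors up to sqrt(n); at cryptographic sizes the
# forward map stays trivial while inversion becomes infeasible — the same
# asymmetry the paper assumes for F.
```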