Abstract: This paper presents a distributed intrusion detection system (IDS) based on the concept of communities of specialized distributed agents, where each community groups agents with the same purpose for detecting distributed attacks. The semantics of intrusion events occurring in a predetermined network are defined. Correlation rules describe the process by which the proposed IDS combines captured events that are distributed both spatially and temporally, and then tries to extract significant and broad patterns for a set of well-known attacks. The primary goal of our work is to provide intrusion detection and real-time prevention capability against insider attacks in distributed and fully automated environments.
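A minimal Python sketch of the spatio-temporal correlation such rules describe, assuming hypothetical event fields and a hypothetical three-step attack pattern (an illustration, not the paper's rule engine):

```python
# Events reported by agents on different hosts are grouped into sliding
# time windows; a window covering a known multi-step pattern raises an alert.
from collections import namedtuple

Event = namedtuple("Event", "time host kind")
PATTERN = ("port_scan", "login_fail", "priv_escalation")  # assumed attack steps

def correlate(events, window=60.0):
    """Yield (start_event, hosts) for windows covering PATTERN in order."""
    events = sorted(events, key=lambda e: e.time)
    for i, start in enumerate(events):
        seen, step = set(), 0
        for e in events[i:]:
            if e.time - start.time > window:
                break
            if step < len(PATTERN) and e.kind == PATTERN[step]:
                seen.add(e.host)
                step += 1
        if step == len(PATTERN):
            yield start, seen        # attack was distributed across `seen` hosts

log = [Event(0.0, "h1", "port_scan"), Event(12.5, "h2", "login_fail"),
       Event(30.0, "h3", "priv_escalation"), Event(90.0, "h1", "port_scan")]
for start, hosts in correlate(log):
    print("alert:", start, "hosts:", sorted(hosts))
```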
Abstract: One of the biggest problems of SMEs is their tendency toward financial distress due to an insufficient financial background. In this study, an Early Warning System (EWS) model based on data mining for financial risk detection is presented. The CHAID algorithm has been used for the development of the EWS. Thanks to its automated nature, the developed EWS can serve as a tailor-made financial advisor in the decision-making processes of firms whose managers have an inadequate financial background. In addition, an application of the model was implemented covering 7,853 SMEs, based on Turkish Central Bank (TCB) 2007 data. Using the EWS model, 31 risk profiles, 15 risk indicators, 2 early warning signals, and 4 financial road maps have been determined for financial risk mitigation.
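As a rough illustration of how CHAID grows such a tree, the sketch below selects the split variable via chi-squared tests of independence, which is the core of the CHAID algorithm; the firm data and indicator names are made-up stand-ins for the TCB variables:

```python
# CHAID-style split selection: at each node, pick the categorical
# predictor most associated with the distress label by chi-squared test.
import pandas as pd
from scipy.stats import chi2_contingency

def best_chaid_split(df: pd.DataFrame, target: str) -> tuple[str, float]:
    """Return the predictor with the smallest chi-squared p-value."""
    best, best_p = None, 1.0
    for col in df.columns.drop(target):
        table = pd.crosstab(df[col], df[target])
        _, p, _, _ = chi2_contingency(table)
        if p < best_p:
            best, best_p = col, p
    return best, best_p

firms = pd.DataFrame({
    "liquidity":  ["low", "low", "high", "high", "low", "high"],
    "leverage":   ["high", "high", "low", "low", "high", "high"],
    "distressed": [1, 1, 0, 0, 1, 0],
})
print(best_chaid_split(firms, "distressed"))
```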
Abstract: Cryo-electron microscopy (CEM) in combination with single particle analysis (SPA) is a widely used technique for elucidating structural details of macromolecular assemblies at close-to-atomic resolutions. However, the development of automated software for SPA processing remains vital, since thousands to millions of individual particle images need to be processed. Here, we present our workflow for automated particle picking. Our approach integrates peak shape analysis into the classical correlation, together with an iterative classification approach to separate macromolecules from background. This particle selection workflow furthermore provides a robust means for SPA with little user interaction. The performance of the presented tools is assessed by processing simulated and experimental data.
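As a hedged sketch of the classical correlation step this workflow builds on, the snippet below cross-correlates a template with a micrograph and keeps well-separated peaks; the paper's peak shape analysis and iterative classification stages are not reproduced:

```python
import numpy as np
from scipy.signal import correlate2d
from scipy.ndimage import maximum_filter

def pick_particles(micrograph, template, threshold=0.5, min_dist=20):
    # Correlate with a zero-mean template, then rescale scores to [0, 1].
    t = template - template.mean()
    cc = correlate2d(micrograph, t, mode="same")
    cc = (cc - cc.min()) / (np.ptp(cc) + 1e-12)
    # Keep local maxima above threshold, separated by >= min_dist pixels.
    local_max = maximum_filter(cc, size=min_dist) == cc
    ys, xs = np.nonzero(local_max & (cc > threshold))
    return list(zip(ys, xs))

rng = np.random.default_rng(0)
mic = rng.normal(size=(256, 256))     # stand-in micrograph
tpl = np.ones((15, 15))               # stand-in particle template
print(len(pick_particles(mic, tpl)))
```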
Abstract: Leprosy is an infectious disease caused by Mycobacterium leprae. The disease generally compromises nerve fibers, leading to the development of disabilities. Disabilities are impairments that limit an individual's daily activities or social life. In leprosy, the study of disability considers functional limitation (physical disability), limitation of activity, and social participation, which are measured respectively by the EHF, SALSA, and Participation scales. The objective of this work is to propose an online monitoring system for leprosy patients based on information from the EHF, SALSA, and Participation scales. The proposed system is expected to be applied in monitoring patients during treatment and after the healing therapy of the disease. The correlations the system establishes between the scales produce a variety of information, presenting the state of the patient and the evolution of changes or reductions in disability. The system provides reports with information from each of the scales and the relationships that exist between them. In this way, health professionals with access to patient information can intervene with techniques for the prevention of disability. Through the automated scales, the system shows the patient's level and allows the patient, or their caregiver, to take preventive measures. With an online system, it is possible to take assessments and monitor patients from anywhere.
Abstract: An automated wood recognition system is designed to classify tropical wood species. The wood features are extracted with two feature extractors: the Basic Grey Level Aura Matrix (BGLAM) technique and the Statistical Properties of Pores Distribution (SPPD) technique. Due to the nonlinearity of the tropical wood species separation boundaries, a pre-classification stage is proposed, which consists of K-means clustering and kernel discriminant analysis (KDA). Finally, a Linear Discriminant Analysis (LDA) classifier and a K-Nearest Neighbour (KNN) classifier are implemented for comparison purposes. The study compares the system with and without pre-classification using the KNN and LDA classifiers. The results show that the inclusion of the pre-classification stage improves the accuracy of both the LDA and KNN classifiers by more than 12%.
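A minimal sketch of the two-stage idea, assuming synthetic stand-in features and labels, with plain KNN in place of the kernel discriminant analysis stage:

```python
# K-means pre-clusters the feature vectors, then a classifier is trained
# per cluster; a query is routed to its cluster before classification.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 16))        # stand-in BGLAM/SPPD feature vectors
y = rng.integers(0, 5, size=300)      # stand-in species labels

pre = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
classifiers = {}
for c in range(pre.n_clusters):
    mask = pre.labels_ == c
    classifiers[c] = KNeighborsClassifier(n_neighbors=3).fit(X[mask], y[mask])

def predict(x):
    """Route a sample to its cluster, then classify within it."""
    c = pre.predict(x.reshape(1, -1))[0]
    return classifiers[c].predict(x.reshape(1, -1))[0]

print(predict(X[0]))
```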
Abstract: This paper presents software tools that convert the C/C++ floating-point source code for a DSP algorithm into a fixed-point simulation model that can be used to evaluate the numerical performance of the algorithm on several different fixed-point platforms including microprocessors, DSPs and FPGAs. The tools use a novel system for maintaining binary point information so that the conversion from floating point to fixed point is automated and the resulting fixed-point algorithm achieves maximum possible precision. A configurable architecture is used during the simulation phase so that the algorithm can produce a bit-exact output for several different target devices.
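A minimal sketch of the binary-point bookkeeping such tools automate, using Qm.n fixed-point conventions; this illustrates the representation only, not the paper's conversion toolchain:

```python
def to_fixed(x: float, frac_bits: int, word_bits: int = 16) -> int:
    """Quantize x to a two's-complement word with `frac_bits` fractional bits."""
    scaled = round(x * (1 << frac_bits))
    lo, hi = -(1 << (word_bits - 1)), (1 << (word_bits - 1)) - 1
    return max(lo, min(hi, scaled))          # saturate on overflow

def to_float(q: int, frac_bits: int) -> float:
    return q / (1 << frac_bits)

# Q1.14 multiply: the product of two Q1.14 values is Q2.28; shifting
# right by 14 moves the binary point back to Q1.14.
a, b = to_fixed(0.75, 14), to_fixed(-0.5, 14)
prod = (a * b) >> 14
print(to_float(prod, 14))   # -0.375
```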
Abstract: This paper proposes a bi-objective model for the facility location problem under a congestion system. The model is motivated by applications of locating servers in bank automated teller machines (ATMs), communication networks, and so on. It is specifically suited to situations in which fixed service facilities are congested by stochastic demand within a queueing framework. We formulate the model from two perspectives simultaneously: (i) the customers and (ii) the service provider. The objectives are to minimize (i) the total expected travelling and waiting time and (ii) the average facility idle time. The model is a mixed-integer nonlinear programming problem which belongs to the class of NP-hard problems. To solve it, two metaheuristic algorithms are proposed: the non-dominated sorting genetic algorithm (NSGA-II) and the non-dominated ranking genetic algorithm (NRGA). In addition, to evaluate the performance of the two algorithms, numerical examples are produced and analyzed with several metrics to determine which algorithm performs better.
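As a hedged illustration of the two objectives, the sketch below evaluates a candidate allocation with each open facility modelled as an M/M/1 queue; the travel times, demand rates, and service rate are made-up assumptions, not the paper's data:

```python
def objectives(assign, travel, lam, mu):
    """assign[i] = facility serving customer i. Returns
    (total expected travel + waiting time, average facility idle share)."""
    facilities = set(assign)
    arrival = {f: sum(lam[i] for i, a in enumerate(assign) if a == f)
               for f in facilities}
    total_time = 0.0
    for i, f in enumerate(assign):
        rho = arrival[f] / mu
        assert rho < 1, "facility would be unstable"
        wait = rho / (mu - arrival[f])        # M/M/1 expected wait in queue
        total_time += lam[i] * (travel[i][f] + wait)
    idle = sum(1 - arrival[f] / mu for f in facilities) / len(facilities)
    return total_time, idle

travel = [[2.0, 5.0], [4.0, 1.0], [3.0, 3.0]]   # customer x facility times
lam = [0.4, 0.5, 0.3]                            # customer demand rates
print(objectives([0, 1, 0], travel, lam, mu=2.0))
```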
Abstract: Software maintenance is an extremely important activity in the software development life cycle. It involves considerable human effort, cost, and time. Software maintenance may be further subdivided into different activities such as fault prediction, fault detection, fault prevention, and fault correction. This topic has gained substantial attention due to sophisticated and complex applications, commercial hardware, clustered architectures, and artificial intelligence. In this paper, we survey the work done in the field of software maintenance. Software fault prediction has been studied in the context of fault-prone modules, self-healing systems, developer information, maintenance models, etc. Still, much remains to be explored in the field of fault severity, such as modeling and weighting the impact of different kinds of faults in the various types of software systems.
Abstract: In the recent past, there has been increasing interest in applying evolutionary methods to Knowledge Discovery in Databases (KDD), and a number of successful applications of Genetic Algorithms (GA) and Genetic Programming (GP) to KDD have been demonstrated. The most predominant representation of the discovered knowledge is the standard Production Rule (PR) of the form If P Then D. PRs, however, are unable to handle exceptions and do not exhibit variable precision. Censored Production Rules (CPRs), an extension of PRs proposed by Michalski & Winston, exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form If P Then D Unless C, where C (the censor) is an exception to the rule. Such rules are employed in situations in which the conditional statement 'If P Then D' holds frequently and the assertion C holds rarely. By using a rule of this type, we are free to ignore the exception condition when the resources needed to establish its presence are tight or when there is simply no information available as to whether it holds or not. Thus, the 'If P Then D' part of the CPR expresses the important information, while the Unless C part acts only as a switch that changes the polarity of D to ~D.
This paper presents a classification algorithm based on an evolutionary approach that discovers comprehensible rules with exceptions in the form of CPRs. The proposed approach uses a flexible chromosome encoding, where each chromosome corresponds to a CPR. Appropriate genetic operators are suggested, and a fitness function is proposed that incorporates the basic constraints on CPRs. Experimental results are presented to demonstrate the performance of the proposed algorithm.
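A minimal sketch of how a CPR of the form If P Then D Unless C can be evaluated, with the censor consulted only when resources permit; the rule predicates are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class CPR:
    premise: Callable[[dict], bool]   # P
    decision: str                     # D
    censor: Callable[[dict], bool]    # C (the exception)

    def apply(self, case: dict, check_censor: bool = True):
        if not self.premise(case):
            return None
        # When resources are tight, the (rarely true) censor is skipped
        # and the rule answers D; otherwise the censor flips D to ~D.
        if check_censor and self.censor(case):
            return "not " + self.decision
        return self.decision

rule = CPR(premise=lambda c: c["bird"],
           decision="flies",
           censor=lambda c: c.get("penguin", False))
print(rule.apply({"bird": True}))                    # flies
print(rule.apply({"bird": True, "penguin": True}))   # not flies
```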
Abstract: Heart disease (HD) is a major cause of morbidity and mortality in modern society. Medical diagnosis is an important but complicated task that should be performed accurately and efficiently, and its automation would be very useful. Unfortunately, not all doctors are equally skilled in every subspecialty, and in many places they are a scarce resource. A system for automated medical diagnosis would enhance medical care and reduce costs. In this paper, a new approach based on the coactive neuro-fuzzy inference system (CANFIS) is presented for the prediction of heart disease. The proposed CANFIS model combines the adaptive capabilities of neural networks with the qualitative approach of fuzzy logic, and is then integrated with a genetic algorithm to diagnose the presence of the disease. The performance of the CANFIS model was evaluated in terms of training performance and classification accuracy, and the results showed that the proposed model has great potential in predicting heart disease.
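As a loose illustration of the neuro-fuzzy combination (not the actual CANFIS architecture or its genetic-algorithm integration), the sketch below feeds Gaussian fuzzy membership grades into a trainable linear layer on synthetic stand-in data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                      # stand-in patient features
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(float)    # stand-in diagnosis labels

centers = rng.normal(size=(8, 4))                  # fuzzy set centers
sigma = 1.0

def memberships(X):
    """Gaussian membership grades of each sample in each fuzzy set."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

M = memberships(X)
w = np.linalg.lstsq(M, y, rcond=None)[0]           # fit consequent weights
pred = memberships(X) @ w > 0.5
print("train accuracy:", (pred == y).mean())
```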
Abstract: Owing to the extensive use of hydrogen in refining and petrochemical units, it is essential to manage the hydrogen network in order to make the most efficient use of hydrogen. On the other hand, hydrogen is an important byproduct that is not properly used in petrochemical complexes and is mostly sent to the fuel system. Few works have been reported in the literature on improving hydrogen networks for petrochemical complexes. In this study, a comprehensive analysis is carried out on petrochemical units using a modified automated targeting technique, which is applied to determine the minimum hydrogen consumption. When the modified targeting method was applied to two petrochemical cases, the results showed a significant reduction in the required fresh hydrogen.
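The modified automated targeting technique itself is not reproduced here; as a hedged sketch of the underlying idea, the snippet below poses minimum fresh hydrogen consumption as a small linear program over source-to-sink allocations, with illustrative flows and purities:

```python
import numpy as np
from scipy.optimize import linprog

# Sources: (available flow, hydrogen purity); index 0 is fresh hydrogen
# with unlimited supply. Sinks: (required flow, minimum inlet purity).
sources = [(np.inf, 0.9999), (80.0, 0.75), (50.0, 0.70)]
sinks = [(60.0, 0.80), (70.0, 0.72)]
nS, nK = len(sources), len(sinks)
var = lambda i, j: i * nK + j                  # flow from source i to sink j

c = np.zeros(nS * nK)
c[:nK] = 1.0                                   # minimize total fresh hydrogen

A_eq = np.zeros((nK, nS * nK))
b_eq = np.zeros(nK)
A_ub = np.zeros((nK + nS - 1, nS * nK))
b_ub = np.zeros(nK + nS - 1)
for j, (flow, ymin) in enumerate(sinks):
    b_eq[j] = flow
    for i, (_, y) in enumerate(sources):
        A_eq[j, var(i, j)] = 1.0               # flow balance at sink j
        A_ub[j, var(i, j)] = ymin - y          # purity: sum f*(ymin - y) <= 0
for i in range(1, nS):                          # finite internal source flows
    A_ub[nK + i - 1, i * nK:(i + 1) * nK] = 1.0
    b_ub[nK + i - 1] = sources[i][0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
print(f"minimum fresh hydrogen: {res.fun:.2f}")
```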
Abstract: Starting from the basic pillars of supportability analysis, this paper examines its characteristics in an LCI (Life Cycle Integration) environment. The research methodology consists of a review of modern logistics engineering literature, with the objective of collecting and synthesizing the knowledge relating to standards of supportability design in an e-logistics environment. The results show that the LCI framework has properties that are fully compatible with the requirements of simultaneous logistics support and product-service bundle design. The proposed approach contributes to a more comprehensive and efficient supportability design process. Further contributions are reflected in a greater consistency of collected data, automated creation of reports suitable for different analyses, and the possibility of customizing them according to customer needs. An additional convenience of this approach is its practical use in real time. In a broader sense, LCI allows the integration of enterprises on a worldwide basis, facilitating electronic business.
Abstract: With the prevalence of computers and the development of information technology, Geographic Information Systems (GIS) have long been used for a variety of applications in electrical engineering. GIS are designed to support the analysis, management, manipulation and mapping of spatial data. This paper presents several uses of GIS in power utilities, such as automated route selection for the construction of new power lines, which uses a dynamic programming model for route optimization, and load forecasting and optimal planning of substations' location and capacity with a comprehensive algorithm that involves an accurate small-area electric load forecasting procedure and simulates the different cost functions of substations.
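A minimal sketch of dynamic-programming route selection in the spirit of the power-line routing mentioned above, on a made-up terrain cost grid:

```python
import numpy as np

def cheapest_route(cost: np.ndarray) -> float:
    """Cheapest left-to-right route; each step moves to the same row
    or an adjacent row in the next column (classic DP recurrence)."""
    rows, cols = cost.shape
    best = cost[:, 0].astype(float)        # best cost to reach column 0
    for j in range(1, cols):
        new = np.empty(rows)
        for i in range(rows):
            prev = best[max(i - 1, 0):min(i + 2, rows)]
            new[i] = cost[i, j] + prev.min()
        best = new
    return best.min()

terrain = np.array([[3, 9, 2, 4],
                    [1, 4, 8, 2],
                    [5, 1, 1, 6]])
print(cheapest_route(terrain))   # 5.0 for this grid
```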
Abstract: This paper presents the automated methods employed
for extracting craniofacial landmarks in white light images as part of
a registration framework designed to support three neurosurgical
procedures. The intraoperative space is characterised by white light
stereo imaging while the preoperative plan is performed on CT scans.
The registration aims at aligning these two modalities to provide a
calibrated environment to enable image-guided solutions. The
neurosurgical procedures can then be carried out by mapping the
entry and target points from CT space onto the patient's space. The registration basis adopted consists of natural landmarks (eye corner and ear tragus). A 5 mm accuracy is deemed sufficient for these three
procedures and the validity of the selected registration basis in
achieving this accuracy has been assessed by simulation studies. The
registration protocol is briefly described, followed by a presentation
of the automated techniques developed for the extraction of the
craniofacial features and results obtained from tests on the AR and
FERET databases. Since the three targeted neurosurgical procedures
are routinely used for head injury management, the effect of
bruised/swollen faces on the automated algorithms is assessed. A
user-interactive method is proposed to deal with such unpredictable
circumstances.
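As a hedged sketch of the point-based registration underlying such a framework, the snippet below computes the least-squares rigid transform between CT-space and patient-space landmarks via the standard Kabsch/Procrustes solution; the landmark coordinates are illustrative assumptions:

```python
import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Return R, t minimizing sum ||R @ src_i + t - dst_i||^2."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

ct = np.array([[30.0, 40.0, 10.0],          # stand-in CT-space landmarks
               [70.0, 42.0, 12.0],
               [22.0, 80.0, 33.0],
               [81.0, 78.0, 55.0]])
Rz = np.array([[0.0, -1.0, 0.0],            # synthetic patient-space pose
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
patient = ct @ Rz.T + np.array([5.0, -3.0, 2.0])
R, t = rigid_register(ct, patient)
print(np.allclose(ct @ R.T + t, patient))   # True: points map across spaces
```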
Abstract: Industrial robots are useless without end-effectors, which in many instances take the form of friction grippers. Commonly, friction grippers apply frictional forces to different objects on the basis of programmers' experience. This limits the effectiveness of the gripping force and may result in damage to the object. This paper describes the various stages of the design and development of a low-cost, sensor-based robotic gripper that facilitates the task of applying the right gripping forces to different objects. The gripper is also equipped with range sensors in order to avoid collisions of the gripper with objects. It is a fully functional automated pick-and-place gripper which can be used in many industrial applications, yet it can also be altered or further developed to suit a larger number of industrial activities. The current design could lead to completely automated robot grippers able to improve the efficiency and productivity of industrial robots.
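A minimal sketch of the closed-loop grip-force control such a sensor-based gripper requires; the sensor and actuator interfaces are hypothetical stand-ins for real hardware drivers, not the paper's design:

```python
def grip(read_force, set_position, target_n=5.0, tol=0.2,
         step_mm=0.05, max_steps=200):
    """Step the jaws toward the target normal force, backing off on overshoot."""
    pos = 0.0
    for _ in range(max_steps):
        f = read_force()
        if abs(f - target_n) <= tol:
            return pos                        # holding at the target force
        pos += step_mm if f < target_n else -step_mm
        set_position(pos)
    raise RuntimeError("grip force did not converge")

# Toy plant: force grows linearly once the jaws touch the object at 2 mm.
state = {"pos": 0.0}
read = lambda: max(0.0, (state["pos"] - 2.0) * 4.0)
move = lambda p: state.update(pos=p)
print(f"closed at {grip(read, move):.2f} mm")
```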
Abstract: Many contemporary telemedical applications rely on regular consultations over the phone or by video conferencing, which consume valuable resources such as doctors' time. Some applications or treatments allow automated diagnostics on the patient side, notifying the doctors only when a significant worsening of the patient's condition is measured.
Such programs can save valuable resources, but an important implementation issue is how to ensure effective and cheap diagnostics on the patient side. First, dedicated diagnostic devices on the patient side are expensive, and second, they need to be user-friendly to encourage the patient's cooperation and to reduce usage errors, which may introduce noise into the diagnostic data.
This article proposes the use of modern smartphones and various built-in or attachable sensors as universal diagnostic devices applicable to a wider range of telemedical programs, and demonstrates their application on a case study: a program for schizophrenic relapse prevention.
Abstract: The emergence of the Internet has brought about a revolution in information storage and retrieval. As most of the data on the web is unstructured and contains a mix of text, video, audio, etc., there is a need to mine information that caters to the specific needs of users without losing important hidden information. Thus, developing user-friendly, automated tools that provide relevant information quickly has become a major challenge in web mining research. Most existing web mining algorithms have concentrated on finding frequent patterns while neglecting the less frequent ones, which are likely to contain outlying data such as noise and irrelevant and redundant data. This paper focuses mainly on the Signed approach and on full-word matching against an organized domain dictionary for mining web content outliers. The Signed approach yields the relevant web documents as well as the outlying web documents. Because the dictionary is organized by the number of characters in a word, document search and retrieval take less time and space.
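A minimal sketch of a domain dictionary organized by word length, as described above, with full-word matching probing only the bucket for the query's length; the vocabulary is an illustrative assumption:

```python
from collections import defaultdict

def build_dictionary(words):
    """Bucket the domain vocabulary by word length."""
    buckets = defaultdict(set)
    for w in words:
        buckets[len(w)].add(w.lower())
    return buckets

def relevance(doc_words, buckets):
    """Fraction of document words fully matching the domain dictionary;
    a low score flags the document as a candidate content outlier."""
    hits = sum(1 for w in doc_words if w.lower() in buckets[len(w)])
    return hits / max(len(doc_words), 1)

domain = build_dictionary(["loan", "credit", "interest", "mortgage", "bank"])
doc = "the bank raised the mortgage interest rate".split()
print(f"relevance: {relevance(doc, domain):.2f}")  # matches: bank, mortgage, interest
```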
Abstract: A dead leg is a typical subsea production system component. CFD is required to model heat transfer within the dead leg. Unfortunately, its solution is time-consuming and thus not suitable for fast prediction or repeated simulations. Therefore, there is a need to create a thermal FEA model mimicking the heat flows and temperatures seen in CFD cool-down simulations.
This paper describes the conventional way of tuning such a model and a new automated way using parametric model order reduction (PMOR) together with an optimization algorithm. The tuned FE analyses replicate the steady-state CFD parameters within a maximum heat-flow error of 6% and 3% using the manual and PMOR methods, respectively. During cool-down, the relative temperature error of the tuned FEA models with respect to the CFD is below 5%. In addition, the PMOR method obtained the correct FEA setup five times faster than the manual tuning.
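As a hedged illustration of the tuning idea (not the PMOR method itself), the sketch below treats an unknown decay parameter of a lumped cool-down model as an optimization variable and fits it to a CFD-like reference curve:

```python
import numpy as np
from scipy.optimize import minimize

t = np.linspace(0.0, 10.0, 50)                    # hours
T_cfd = 4.0 + 56.0 * np.exp(-0.35 * t)            # stand-in CFD cool-down curve

def fea_surrogate(params, t):
    """Lumped-capacitance decay as a stand-in for a reduced FEA model."""
    T_amb, T0, k = 4.0, 60.0, params[0]
    return T_amb + (T0 - T_amb) * np.exp(-k * t)

def mismatch(params):
    return np.mean((fea_surrogate(params, t) - T_cfd) ** 2)

res = minimize(mismatch, x0=[0.1], bounds=[(1e-3, 2.0)])
print(f"tuned decay rate: {res.x[0]:.3f} 1/h")    # recovers ~0.35
```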
Abstract: The stereophotogrammetry modality is gaining more widespread use in the clinical setting. Registration and visualization of this data, in conjunction with conventional 3D volumetric image modalities, provide virtual human data with textured soft tissue and internal anatomical and structural information. In this investigation, computed tomography (CT) and stereophotogrammetry data are acquired from 4 anatomical phantoms and registered using the trimmed iterative closest point (TrICP) algorithm. This paper fully addresses the issue of imaging artifacts around the stereophotogrammetry surface edge, using the registered CT data as a reference. Several iterative algorithms are implemented to automatically identify and remove stereophotogrammetry surface edge outliers, improving the overall visualization of the combined stereophotogrammetry and CT data. This paper shows that outliers at the surface edge of stereophotogrammetry data can be successfully removed automatically.
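A minimal sketch of the trimming idea used to flag surface-edge outliers: points whose distance to the registered reference falls in the worst tail are discarded; the point sets are illustrative assumptions, not the phantom data:

```python
import numpy as np
from scipy.spatial import cKDTree

def trim_outliers(surface_pts, reference_pts, keep_fraction=0.9):
    """Keep the `keep_fraction` of surface points closest to the reference."""
    dist, _ = cKDTree(reference_pts).query(surface_pts)
    cutoff = np.quantile(dist, keep_fraction)
    return surface_pts[dist <= cutoff]

rng = np.random.default_rng(1)
ct = rng.uniform(0, 10, size=(500, 3))                   # reference (CT) points
photo = ct[:400] + rng.normal(0, 0.05, size=(400, 3))    # matched surface points
edge_artifacts = rng.uniform(15, 20, size=(40, 3))       # spurious edge points
cleaned = trim_outliers(np.vstack([photo, edge_artifacts]), ct)
print(len(cleaned))   # the edge artifacts fall in the trimmed tail
```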
Abstract: In this paper, we propose a novel algorithm for delineating the endocardial wall from a human heart ultrasound scan. We assume that the gray levels in the ultrasound images are independent and identically distributed random variables with different Rician Inverse Gaussian (RiIG) distributions. Both synthetic and real clinical data will be used for testing the algorithm. Algorithm performance will be evaluated using, first, an expert radiologist's evaluation of a soft copy of an ultrasound scan during the scanning process and, second, the doctor's conclusion after going through a printed copy of the same scan. Successful implementation of this algorithm should make it possible to differentiate normal from abnormal soft tissue and help identify the disease, what stage it is in, and how best to treat the patient. We hope that an automated system using this algorithm will be useful in public hospitals, especially in Third World countries, where shortages of skilled radiologists and of ultrasound machines are common. These public hospitals are usually the first and last stop for most patients in these countries.
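As a loose illustration of the statistical delineation idea, the sketch below fits two different distributions to cavity and wall intensities and labels pixels by maximum likelihood; a gamma distribution stands in for the paper's RiIG model, whose density is more involved:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
blood = rng.gamma(2.0, 8.0, 2000)      # stand-in: darker cavity speckle
tissue = rng.gamma(6.0, 9.0, 2000)     # stand-in: brighter wall speckle

fit_b = stats.gamma.fit(blood, floc=0)
fit_t = stats.gamma.fit(tissue, floc=0)

def label(pixels):
    """1 where the tissue model is more likely, else 0."""
    return (stats.gamma.logpdf(pixels, *fit_t) >
            stats.gamma.logpdf(pixels, *fit_b)).astype(int)

test = np.array([5.0, 20.0, 60.0, 90.0])
print(label(test))   # low intensities -> 0 (cavity), high -> 1 (wall)
```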