Abstract: Identifying and classifying intersections according to severity is essential for implementing safety countermeasures, and effective models are needed to compare and assess severity. Highway safety organizations have placed intersection safety among their priorities. Despite significant advances in highway safety, large numbers of severe crashes still occur on highways. Investigating the factors that influence crashes enables engineers to take steps to reduce crash severity. Previous studies lacked a model capable of simultaneously illustrating the influence of human factors, road and vehicle characteristics, weather conditions, and traffic features, including traffic volume and flow speed, on crash severity. This paper therefore aims to develop models that illustrate the simultaneous influence of these variables on crash severity in urban highways. The models presented in this study were developed as binary logit models, calibrated in SPSS, with SPSS's backward regression method used to identify the significant variables. From the results obtained, it can be concluded that the main factors increasing crash severity in urban highways are driver age, movement in reverse gear, technical defects of the vehicle, collisions with motorcycles and bicycles, bridges, frontal impact collisions, frontal-lateral collisions, and multi-vehicle crashes.
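As a rough illustration (not the paper's SPSS procedure), a binary logit of this kind can be sketched in Python; the feature names and records below are hypothetical placeholders, and the regularized scikit-learn fit does not reproduce SPSS's backward elimination.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical crash records (columns: driver_age, reverse_gear, multi_vehicle);
# both the features and the values are illustrative, not the paper's data.
X = np.array([
    [72, 0, 1], [35, 0, 0], [68, 1, 1], [49, 0, 0],
    [30, 0, 1], [41, 0, 0], [63, 0, 1], [80, 1, 1],
])
y = np.array([1, 0, 1, 0, 1, 0, 0, 1])  # 1 = severe crash, 0 = non-severe

# Binary logit: P(severe | x) = 1 / (1 + exp(-(b0 + b.x)))
clf = LogisticRegression().fit(X, y)
print("coefficients:", clf.coef_, "intercept:", clf.intercept_)
print("P(severe):", clf.predict_proba(X)[:, 1].round(2))
```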
Abstract: One of the most important problems to solve in a driver fatigue monitoring system is eye location. This paper presents an efficient method to achieve fast and accurate eye location in grey-level images obtained under real-world driving conditions. The structure of the eye region is used as a robust cue to find possible eye pairs. Eye pair candidates at different scales are selected by finding regions that roughly match a binary eye pair template. To obtain the true pair, all eye pair candidates are then verified using support vector machines. Finally, the eyes are precisely located by applying binary vertical projection and an eye classifier within the eye pair images. The proposed method is robust to illumination changes, moderate rotations, the wearing of glasses, and different eye states. Experimental results demonstrate its effectiveness.
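A minimal sketch of the binary vertical projection step, assuming a grey-level eye-pair patch in which the eyes are the darkest regions; the threshold and the left/right split are assumptions.

```python
import numpy as np

def locate_eyes(eye_pair_gray: np.ndarray, threshold: int = 80):
    """Return column indices of the two strongest dark clusters (eye centres)."""
    binary = (eye_pair_gray < threshold).astype(np.uint8)  # dark pixels = 1
    projection = binary.sum(axis=0)                        # per-column counts
    mid = projection.size // 2                             # split into halves
    left_eye_x = int(np.argmax(projection[:mid]))
    right_eye_x = mid + int(np.argmax(projection[mid:]))
    return left_eye_x, right_eye_x

# example on a synthetic 20x60 "eye pair" patch with two dark blobs
patch = np.full((20, 60), 200, dtype=np.uint8)
patch[8:14, 12:18] = 30   # left eye
patch[8:14, 42:48] = 30   # right eye
print(locate_eyes(patch))  # -> column positions near 12 and 42
```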
Abstract: Voice over Internet Protocol (VoIP) applications, commonly known as softphones, have been developing an increasingly large market in today's telecommunication world, and the trend is expected to continue with the enhancement of additional features. This includes leveraging existing presence services, location and contextual information to enable more ubiquitous and seamless communications. In this paper, we discuss the concept of seamless session transfer for real-time applications such as VoIP and IPTV, and our prototype implementation of this concept on a selected open source VoIP application. The first part of this paper conducts performance evaluations and assessments across some commonly found open source VoIP applications, namely Ekiga, Kphone, Linphone and Twinkle, so as to identify one of them for implementing our design of seamless session transfer. Subjective testing was carried out to evaluate the audio performance of these VoIP applications and rank them according to their Mean Opinion Score (MOS) results. The second part of this paper discusses the performance evaluation of our prototype implementation of session transfer using Linphone.
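A minimal sketch of the MOS ranking step; MOS is the arithmetic mean of listener opinion scores on the usual 1..5 scale, but the scores below are illustrative placeholders, not the paper's measured results.

```python
import statistics

# Placeholder listener ratings (1..5) per application; not the paper's data.
ratings = {
    "Ekiga":    [4, 3, 4, 5, 3],
    "Kphone":   [3, 3, 2, 4, 3],
    "Linphone": [4, 4, 5, 4, 4],
    "Twinkle":  [3, 4, 3, 3, 4],
}
mos = {app: statistics.mean(scores) for app, scores in ratings.items()}
for app, score in sorted(mos.items(), key=lambda kv: -kv[1]):
    print(f"{app}: MOS = {score:.2f}")   # rank applications by mean score
```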
Abstract: The amount of dissolved oxygen in a river has a great direct effect on aquatic macroinvertebrates, and this indirectly influences the regional ecosystem. In this paper we attempt to predict dissolved oxygen in rivers by employing a simple fuzzy logic modeling approach, the Wang-Mendel method. This model uses only previous records to estimate upcoming values. For this purpose, daily and hourly records of eight stations in the Au Sable watershed in Michigan, United States are employed over periods of 12 years and 50 days, respectively. Calculations indicate that for long-period prediction it is better to increase the input intervals, but for filling missing data it is advisable to decrease the interval. Increasing the partitioning of the input and output features has little influence on accuracy but makes the model very time-consuming, and increasing the number of inputs acts similarly to increasing the number of partitions. A large amount of training data does not essentially modify accuracy, so an optimal training length should be selected.
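A minimal sketch of Wang-Mendel rule generation for one-step-ahead prediction of a series such as dissolved oxygen; the number of fuzzy regions and the triangular membership layout are assumptions, not the paper's settings.

```python
import numpy as np

def triangular_memberships(x, centers, width):
    """Membership of x in each triangular fuzzy region."""
    return np.maximum(0.0, 1.0 - np.abs(x - centers) / width)

def wang_mendel(series, n_regions=7):
    lo, hi = series.min(), series.max()
    centers = np.linspace(lo, hi, n_regions)
    width = centers[1] - centers[0]
    rules = {}  # antecedent region -> (consequent region, rule degree)
    for x_t, x_next in zip(series[:-1], series[1:]):
        mu_in = triangular_memberships(x_t, centers, width)
        mu_out = triangular_memberships(x_next, centers, width)
        a, c = int(mu_in.argmax()), int(mu_out.argmax())
        degree = mu_in[a] * mu_out[c]
        # conflict resolution: keep the highest-degree rule per antecedent
        if a not in rules or degree > rules[a][1]:
            rules[a] = (c, degree)
    return rules, centers

# toy daily record of dissolved oxygen (mg/L); illustrative values only
series = np.array([7.9, 8.1, 8.4, 8.0, 7.6, 7.8, 8.2, 8.5, 8.3])
rules, centers = wang_mendel(series)
print(rules)   # {antecedent_region: (consequent_region, degree), ...}
```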
Abstract: The structure of retinal vessels is a prominent feature that reveals information on the state of disease, reflected in the form of measurable abnormalities in thickness and colour. In this paper, the vascular structure of the retina is analyzed for the implementation of a clinical diabetic retinopathy decision-making system. The retinal vascular structure consists of thin blood vessels, so measurement accuracy is highly dependent on vessel segmentation. In this paper the blood vessel thickness is automatically detected using preprocessing techniques and a vessel segmentation algorithm. First the captured image is binarized to bring out the blood vessel structure clearly; it is then skeletonised to obtain the overall structure of all the terminal and branching nodes of the blood vessels. By identifying the terminal nodes and the branching points automatically, the thickness of the main and branching blood vessels is estimated. Results are presented and compared with those provided by clinical classification on 50 vessels collected from Bejan Singh Eye Hospital.
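A minimal sketch of the binarize/skeletonize/node-detection pipeline described above, using scikit-image; the fixed threshold and the toy image are assumptions (the paper's preprocessing may differ).

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def vessel_nodes(gray: np.ndarray, threshold: float = 0.5):
    binary = gray > threshold                  # crude binarization of vessels
    skeleton = skeletonize(binary)
    # count 8-connected skeleton neighbours of every skeleton pixel
    kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
    neighbours = convolve(skeleton.astype(np.uint8), kernel, mode="constant")
    terminals = skeleton & (neighbours == 1)   # vessel end points
    branches = skeleton & (neighbours >= 3)    # branching points
    return skeleton, terminals, branches

# toy image: a bright vessel trunk with one diagonal branch
img = np.zeros((64, 64))
img[10:50, 32] = 1.0                  # main vessel (trunk)
for i in range(1, 15):
    img[30 + i, 32 + i] = 1.0         # branch leaving the trunk
skel, term, branch = vessel_nodes(img)
print(int(term.sum()), "terminals,", int(branch.sum()), "branch points")
# Local thickness can then be estimated, e.g. as twice the distance from
# each skeleton pixel to the nearest background pixel.
```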
Abstract: This paper describes a text mining technique for automatically extracting association rules from collections of textual documents. The technique is called Extracting Association Rules from Text (EART). It relies on keyword features to discover association rules among the keywords labeling the documents. In this work, the EART system ignores the order in which the words occur, focusing instead on the words and their statistical distributions in documents. The main contributions of the technique are that it integrates XML technology with an Information Retrieval scheme (TF-IDF) for keyword/feature selection, automatically selecting the most discriminative keywords for use in association rule generation, and uses a data mining technique for association rule discovery. It consists of three phases: a Text Preprocessing phase (transformation, filtration, stemming and indexing of the documents), an Association Rule Mining (ARM) phase (applying our designed algorithm for Generating Association Rules based on a Weighting scheme, GARW) and a Visualization phase (visualization of results). Experiments were applied to web-page news documents related to the outbreak of the bird flu disease. The extracted association rules contain important features and describe the informative news included in the document collection. The performance of the EART system was compared with that of another system that uses the Apriori algorithm, in terms of execution time and evaluation of the extracted association rules.
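A minimal sketch of TF-IDF keyword selection followed by a naive confidence-based rule pass; the documents and thresholds are toy placeholders, and this is not the paper's GARW algorithm itself.

```python
from itertools import combinations
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "bird flu outbreak spreads in asia",
    "new bird flu cases reported",
    "vaccine research for flu virus",
]

vec = TfidfVectorizer(max_features=5)      # keep most informative keywords
tfidf = vec.fit_transform(docs)
keywords = vec.get_feature_names_out()

# keyword set per document: keywords with non-zero TF-IDF weight
doc_sets = [{keywords[j] for j in row.nonzero()[1]} for row in tfidf]

min_conf = 0.6                             # assumed confidence threshold
for a, b in combinations(keywords, 2):
    support_a = sum(a in s for s in doc_sets)
    support_ab = sum(a in s and b in s for s in doc_sets)
    if support_a and support_ab / support_a >= min_conf:
        print(f"{a} -> {b}  (confidence {support_ab / support_a:.2f})")
```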
Abstract: An important step in studying the statistics of fingerprint minutiae features is to reliably extract minutiae features from the fingerprint images. A new reliable method of computation for minutiae feature extraction from fingerprint images is presented. A fingerprint image is treated as a textured image, and an orientation flow field of the ridges is computed for it. To accurately locate ridges, a new ridge-orientation-based computation method is proposed. After ridge segmentation, a new method of computation is proposed for smoothing the ridges. The ridge skeleton image is obtained and then smoothed using morphological operators to detect the features. A post-processing stage eliminates a large number of false features from the detected set of minutiae features. The detected features are observed to be reliable and accurate.
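A minimal sketch of a standard gradient-based orientation flow field (the double-angle construction); the block size is an assumption, and the paper's own ridge-orientation method differs in its details.

```python
import numpy as np
from scipy.ndimage import sobel, uniform_filter

def orientation_field(img: np.ndarray, block: int = 16):
    """Smoothed dominant gradient angle per pixel (radians).

    Ridge direction is perpendicular to the returned angle.
    """
    gx = sobel(img.astype(float), axis=1)
    gy = sobel(img.astype(float), axis=0)
    # double-angle terms average orientations without sign cancellation
    vx = uniform_filter(2 * gx * gy, size=block)
    vy = uniform_filter(gx**2 - gy**2, size=block)
    return 0.5 * np.arctan2(vx, vy)

# toy ridge pattern: horizontal stripes -> gradient angle near pi/2
img = np.sin(np.linspace(0, 8 * np.pi, 64))[:, None] * np.ones((1, 64))
print(orientation_field(img)[32, 32])
```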
Abstract: We present a structural study of an aqueous electrolyte for which experimental results are available: a solution of LiCl·6H2O type in the glassy state (120 K), contrasted with pure water at room temperature by means of Partial Distribution Functions (PDFs) obtained from the neutron scattering technique. Based on these partial functions, the Reverse Monte Carlo method (RMC) computes radial and angular correlation functions which allow a number of structural features of the system to be explored. The obtained curves include some artifacts. To remedy this, we propose introducing a screened potential as an additional constraint. The results show a good match between experimental and computed functions and a significant improvement in the PDF curves under the potential constraint, suggesting an efficient fit of the pair distribution function curves.
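A minimal sketch of one Reverse Monte Carlo move, accepting or rejecting a random particle displacement according to the change in chi-squared between the computed and experimental g(r); the box, bin edges and the flat "experimental" curve are toy values, and the screened-potential constraint is not included.

```python
import numpy as np

rng = np.random.default_rng(0)
L, n_atoms, n_bins = 10.0, 64, 50
edges = np.linspace(0.5, L / 2, n_bins + 1)
positions = rng.uniform(0, L, size=(n_atoms, 3))
g_exp = np.ones(n_bins)                        # placeholder experimental g(r)

def g_of_r(pos):
    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)                   # periodic boundary conditions
    r = np.sqrt((d**2).sum(-1))[np.triu_indices(n_atoms, 1)]
    hist, _ = np.histogram(r, bins=edges)
    shell = (4 / 3) * np.pi * (edges[1:]**3 - edges[:-1]**3)
    rho = n_atoms / L**3
    return hist / (0.5 * n_atoms * rho * shell)

chi2 = ((g_of_r(positions) - g_exp)**2).sum()
for _ in range(200):
    i = rng.integers(n_atoms)
    old = positions[i].copy()
    positions[i] = (positions[i] + rng.normal(0, 0.2, 3)) % L
    new_chi2 = ((g_of_r(positions) - g_exp)**2).sum()
    if new_chi2 < chi2 or rng.random() < np.exp(-(new_chi2 - chi2) / 2):
        chi2 = new_chi2                        # accept the move
    else:
        positions[i] = old                     # reject the move
```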
Abstract: Independent component analysis (ICA) in the frequency domain is used for solving the problem of blind source separation (BSS). However, this method has some problems. For example, a general ICA algorithm cannot determine the permutation of signals, which is important in frequency-domain ICA. In this paper, we propose an approach to solving this permutation problem. The idea is to effectively combine two conventional approaches. This approach improves the signal separation performance by exploiting the features of the conventional approaches. We show simulation results using artificial data.
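A minimal sketch of one conventional idea for the permutation problem, reordering the separated sources in each frequency bin so their amplitude envelopes correlate with a reference bin; the array sizes are toy values, and this is only one of the two approaches the paper combines.

```python
import numpy as np

def align_permutations(S):
    """S: array (n_freq, n_src, n_frames) of separated STFT signals."""
    n_freq, n_src, _ = S.shape
    env = np.abs(S)
    ref = env[0]                                   # reference envelopes
    for f in range(1, n_freq):
        # correlation between every (reference, candidate) envelope pair
        corr = np.array([[np.corrcoef(ref[i], env[f, j])[0, 1]
                          for j in range(n_src)] for i in range(n_src)])
        perm = corr.argmax(axis=1)   # greedy; Hungarian is safer for many sources
        S[f] = S[f, perm]
        env[f] = env[f, perm]
    return S

# toy data: 4 frequency bins, 2 sources, 100 frames
rng = np.random.default_rng(1)
S = rng.standard_normal((4, 2, 100)) + 1j * rng.standard_normal((4, 2, 100))
aligned = align_permutations(S.copy())
```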
Abstract: A filter is used to remove undesirable frequency information from a dynamic signal. This paper shows that the Z-notch filtering technique can be applied to remove noise nuisance from a machining signal. In machining, the noise components were identified from the sound produced by the operation of the machine components themselves, such as the hydraulic system, motor and machine environment. By correlating the noise components with the measured machining signal, the components of interest in the measured machining signal, which are less interfered with by the noise, can be extracted. Thus, the filtered signal is more reliable to analyse in terms of noise content than the unfiltered signal. Significantly, the I-kaz method, which comprises a three-dimensional graphical representation and the I-kaz coefficient Z∞, could differentiate between the filtered and the unfiltered signal. A larger scattering space and a higher value of Z∞ indicated that the signal was highly corrupted by noise. This method can be utilised as a proactive tool for evaluating the noise content in a signal. The evaluation of noise content, as well as its elimination, is very important, especially for machining operation fault diagnosis purposes. The Z-notch filtering technique was reliable in extracting the noise component from the measured machining signal with high efficiency. Even though the measured signal was exposed to high noise disruption, the signal generated from the interaction between the cutting tool and the workpiece could still be acquired. Therefore, the interruption of noise that could change the original signal features and consequently deteriorate the useful sensory information can be eliminated.
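A minimal sketch of notch filtering a machining-like signal, using SciPy's standard IIR notch rather than the paper's Z-notch technique; the 50 Hz noise component and the filter Q are assumed example values.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 1000.0                                   # sampling rate (Hz)
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 120 * t)          # "machining" component
noise = 0.8 * np.sin(2 * np.pi * 50 * t)      # machine-borne noise component

b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)       # notch centred on the noise
filtered = filtfilt(b, a, signal + noise)     # zero-phase filtering
print(np.abs(filtered - signal).max())        # residual after noise removal is small
```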
Abstract: A human verification system is presented in this paper. The system consists of several steps: background subtraction, thresholding, line connection, region growing, morphology, star skeletonization, feature extraction, feature matching, and decision making. The proposed system combines the advantages of star skeletonization and simple statistical features. Correlation matching and probability voting are used for verification, followed by a logical operation in the decision-making stage. The proposed system uses a small number of features, and its reliability is convincing.
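A minimal sketch of a star skeleton signature, the distance from the blob centroid to its boundary as a function of angle, whose extrema mark extremities such as limbs and head; the angular bin count and the toy blob are assumptions.

```python
import numpy as np

def star_signature(mask: np.ndarray, n_angles: int = 72):
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()                 # blob centroid
    angles = np.arctan2(ys - cy, xs - cx)
    radii = np.hypot(ys - cy, xs - cx)
    bins = np.linspace(-np.pi, np.pi, n_angles + 1)
    idx = np.digitize(angles, bins) - 1
    signature = np.zeros(n_angles)
    for i in range(n_angles):
        sel = radii[idx == i]                     # farthest pixel per angle bin
        signature[i] = sel.max() if sel.size else 0.0
    return signature

# toy blob: a filled rectangle; its signature peaks at the four corners
mask = np.zeros((50, 50), dtype=bool)
mask[10:40, 15:35] = True
print(star_signature(mask).round(1))
```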
Abstract: The first generation of mobile-agent-based Intrusion Detection Systems had just two components, namely data collection and a single centralized analyzer. The disadvantage of this type of intrusion detection is that if the connection to the analyzer fails, the entire system becomes useless. In this work, we propose a novel hybrid model for a Mobile Agent based Distributed Intrusion Detection System to overcome this problem. The proposed model has new features such as robustness, the capability of detecting intrusions against the IDS itself, and the capability of updating itself to detect new patterns of intrusion. In addition, our proposed model is also capable of tackling some of the weaknesses of centralized Intrusion Detection System models.
Abstract: This paper is a survey of current component-based software technologies and a description of the promotion and inhibition factors in CBSE. The features that software components inherit are also discussed, and quality assurance issues in component-based software are addressed. Research on the quality model of component-based systems starts with the study of what components are, CBSE, its development life cycle, and the pros and cons of CBSE. Various attributes are studied and compared in view of existing models for general systems and CBS. When describing the quality of a software component, an apt set of quality attributes for the description of the system (or components) should be selected. Finally, the research issues that can be extended are tabulated.
Abstract: Bluetooth is a personal wireless communication
technology and is being applied in many scenarios. It is an emerging
standard for short range, low cost, low power wireless access
technology. Existing MAC (Medium Access Control) scheduling schemes provide only best-effort service for all master-slave connections. It is very challenging to provide QoS (Quality of Service) support for different connections due to the Master-Driven TDD (Time Division Duplex) feature, and no solution is yet available that supports both the delay and bandwidth guarantees required by real-time applications. This paper addresses the issue of
how to enhance QoS support in a Bluetooth piconet. The Bluetooth
specification proposes a Round Robin scheduler as a possible solution for scheduling transmissions in a Bluetooth piconet. We propose an algorithm that reduces bandwidth waste and enhances network efficiency. We define token counters to estimate the traffic of
real-time slaves. To increase bandwidth utilization, a back-off
mechanism is then presented for best-effort slaves to decrease the
frequency of polling idle slaves. Simulation results demonstrate that our scheme outperforms Round Robin scheduling.
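A minimal sketch of the token-counter and back-off idea; the increment, decay and back-off rules here are assumptions, not the paper's exact algorithm.

```python
class Slave:
    def __init__(self, name, realtime):
        self.name, self.realtime = name, realtime
        self.tokens = 0       # estimated pending real-time traffic
        self.backoff = 1      # polling interval for best-effort slaves
        self.idle_since = 0

def pick_next(slaves, now):
    # real-time slaves with pending tokens are served first
    rt = [s for s in slaves if s.realtime and s.tokens > 0]
    if rt:
        return max(rt, key=lambda s: s.tokens)
    # best-effort slaves are polled only when their back-off expires
    due = [s for s in slaves if not s.realtime
           and now - s.idle_since >= s.backoff]
    return due[0] if due else None

def after_poll(slave, had_data, now):
    if slave.realtime:
        slave.tokens = max(0, slave.tokens - 1)
    elif had_data:
        slave.backoff = 1                            # reset on activity
    else:
        slave.backoff = min(slave.backoff * 2, 32)   # exponential back-off
        slave.idle_since = now

slaves = [Slave("s1", realtime=True), Slave("s2", realtime=False)]
slaves[0].tokens = 2
print(pick_next(slaves, now=0).name)   # -> "s1" (real-time traffic pending)
```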
Abstract: In this paper we study a system composed of a carbon nanotube (CNT) and a bundle of carbon nanotubes (BuCNT) interacting with a specific fatty acid as a molecular probe. The full system is represented by an open nanotube (or nanotubes) and linoleic acid (LA) relaxing due to the interaction with the CNT and BuCNT. The LA has an asymmetric shape with a COOH termination that promotes close interaction with the BuCNT, mainly through van der Waals forces. The simulations were performed by classical molecular dynamics with standard parameterizations.
Our results show that the BuCNT and CNT are dynamically stable and exhibit a preferential interaction position with LA, resulting in three features: (i) when the LA interacts with the CNT and BuCNT (with either termination, CH2 or COOH), the LA is repelled; (ii) when the LA terminated with CH2 is close to the open extremity of the BuCNT, the LA is also repelled by the interaction between them; and (iii) when the LA terminated with COOH is close to the open extremity of the BuCNT, the LA is encapsulated by the BuCNT. These simulations are part of more extensive work on the search for efficient selective molecular devices and could be useful for reaching this goal.
Abstract: Random Forests are a powerful classification technique consisting of a collection of decision trees. One useful feature of Random Forests is the ability to determine the importance of each variable in predicting the outcome. This is done by permuting each variable and computing the change in prediction accuracy before and after the permutation. This variable importance calculation is similar to a one-factor-at-a-time experiment and is therefore inefficient. In this paper, we use a regular fractional factorial design to determine which variables to permute. Based on the results of the trials in the experiment, we calculate the individual importance of the variables with improved precision over the standard method. The method is illustrated with a study of student attrition at Monash University.
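For contrast, a minimal sketch of the standard one-factor-at-a-time permutation importance that the paper improves upon (the proposed method instead permutes subsets of variables chosen by a regular fractional factorial design); the data here are synthetic.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, n_features=6, random_state=0)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
baseline = rf.score(X, y)

rng = np.random.default_rng(0)
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])       # permute one variable at a time
    drop = baseline - rf.score(Xp, y)          # accuracy drop = importance
    print(f"variable {j}: importance = {drop:.3f}")
```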
Abstract: The social force model, which belongs to the microscopic pedestrian studies, has been considered preeminent by many researchers due to its main feature of reproducing the self-organized phenomena that result from pedestrian dynamics. The preferred force, which is a measure of a pedestrian's motivation to adapt his actual velocity to his desired velocity, is an essential term on which the model was set up. This force has gone through several stages of development: first, Helbing and Molnar (1995) modeled the original force for the normal situation. Second, Helbing and his co-workers (2000) incorporated the panic situation into this force by adding a panic parameter. Third, Lakoba and Kaup (2005) provided the pedestrians with some kind of intelligence by incorporating aspects of decision-making capability. In this paper, the authors analyze the most important extensions of the model regarding the preferred force and compare the different factors of these extensions. Furthermore, to enhance the decision-making ability of the pedestrians, they introduce additional features, such as a familiarity factor, into the preferred force to make it more representative of what actually happens in reality.
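For reference, the original preferred (driving) force of Helbing and Molnar, in its commonly cited form:

$$\mathbf{F}_i^{0} = m_i\,\frac{v_i^{0}\,\mathbf{e}_i^{0} - \mathbf{v}_i}{\tau_i},$$

where $v_i^{0}$ is pedestrian $i$'s desired speed, $\mathbf{e}_i^{0}$ the desired direction, $\mathbf{v}_i$ the actual velocity, $m_i$ the mass, and $\tau_i$ the relaxation time over which the pedestrian adapts the actual velocity to the desired one.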
Abstract: Workload and resource management are two essential functions provided at the service level of the grid software infrastructure. To improve the global throughput of these software environments, workloads have to be evenly scheduled among the available resources. To achieve this goal, several load balancing strategies and algorithms have been proposed. Most strategies were developed with homogeneous sets of sites, linked by homogeneous and fast networks, in mind. For computational grids, however, we must address major new issues, namely heterogeneity, scalability and adaptability. In this paper, we propose a layered algorithm which achieves dynamic load balancing in grid computing. Based on a tree model, our algorithm presents the following main features: (i) it is layered; (ii) it supports heterogeneity and scalability; and (iii) it is totally independent of any physical architecture of a grid.
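A minimal sketch of one layered, tree-based balancing step in which each node evens out the load of its children in proportion to their capacities; the data structure and policy are assumptions, not the paper's algorithm.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    load: float = 0.0
    capacity: float = 1.0            # supports heterogeneous resources
    children: list = field(default_factory=list)

def balance(node):
    """Redistribute child loads proportionally to capacity, bottom-up."""
    for child in node.children:
        balance(child)
    if not node.children:
        return
    total_load = sum(c.load for c in node.children)
    total_cap = sum(c.capacity for c in node.children)
    for c in node.children:
        c.load = total_load * c.capacity / total_cap

root = Node("grid", children=[
    Node("siteA", load=8.0, capacity=2.0),
    Node("siteB", load=1.0, capacity=1.0),
])
balance(root)
print([(c.name, round(c.load, 2)) for c in root.children])
# -> [('siteA', 6.0), ('siteB', 3.0)]
```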
Abstract: It is sometimes difficult to differentiate between
innocent murmurs and pathological murmurs during auscultation. In
these difficult cases, an intelligent stethoscope with decision support
abilities would be of great value. In this study, using a dog model,
phonocardiographic recordings were obtained from 27 boxer dogs
with various degrees of aortic stenosis (AS) severity. As a reference
for severity assessment, continuous wave Doppler was used. The data
were analyzed with recurrence quantification analysis (RQA) with
the aim of finding features able to distinguish innocent murmurs from
murmurs caused by AS. Four out of eight investigated RQA features
showed significant differences between innocent murmurs and
pathological murmurs. Using a plain linear discriminant analysis
classifier, the best pair of features (recurrence rate and entropy)
resulted in a sensitivity of 90% and a specificity of 88%. In
conclusion, RQA provides valid features which can be used for
differentiation between innocent murmurs and murmurs caused by
AS.
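A minimal sketch of two of the RQA features named above, recurrence rate and the Shannon entropy of diagonal line lengths, computed on a scalar series without phase-space embedding; the threshold and minimum line length are toy values, not the study's settings.

```python
import numpy as np

def rqa_features(x, eps=0.2, min_len=2):
    # recurrence matrix: points closer than eps are "recurrent"
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
    rr = R.mean()                                   # recurrence rate
    lengths = []                                    # diagonal line lengths
    for k in range(1, len(x)):                      # skip the main diagonal
        run = 0
        for v in np.append(np.diagonal(R, offset=k), 0):  # 0 flushes last run
            if v:
                run += 1
            else:
                if run >= min_len:
                    lengths.append(run)
                run = 0
    if not lengths:
        return rr, 0.0
    counts = np.bincount(lengths)[min_len:]
    p = counts[counts > 0] / counts.sum()
    return rr, float(-(p * np.log(p)).sum())        # entropy of line lengths

signal = np.sin(np.linspace(0, 8 * np.pi, 400))
print(rqa_features(signal))
```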
Abstract: Compensating for physiological motion in the context of minimally invasive cardiac surgery has become an attractive issue, since such surgery outperforms traditional cardiac procedures, offering remarkable benefits. Owing to space restrictions, computer vision techniques have proven to be the most practical and suitable solution. However, the lack of robustness and efficiency of existing methods makes physiological motion compensation an open and challenging problem. This work focuses on increasing robustness and efficiency by exploring the classes of ℓ1- and ℓ2-regularized optimization, emphasizing the use of explicit regularization. Both approaches are based on natural features of the heart and use intensity information. The results point to the ℓ1-regularized optimization class as the best, since it offered the shortest computational cost and the smallest average error, and it proved to work even under complex deformations.
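As an illustration of the general shape of such problems (a sketch, not the paper's exact formulation), an intensity-based tracking objective with explicit regularization can be written as

$$\min_{\mathbf{p}}\; \sum_i \big( I(\mathbf{x}_i;\mathbf{p}) - I_0(\mathbf{x}_i) \big)^2 \;+\; \lambda\, \|\mathbf{p}\|_q, \qquad q \in \{1, 2\},$$

where $\mathbf{p}$ are the motion/deformation parameters, $I(\cdot;\mathbf{p})$ is the warped image intensity, $I_0$ is the reference template, and $\lambda$ weights the ℓ1 ($q{=}1$) or ℓ2 ($q{=}2$) penalty.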