Abstract: One main drawback of intrusion detection systems is their
inability to detect new attacks that do not have known signatures.
In this paper we discuss an intrusion detection method that combines
independent component analysis (ICA)-based feature selection
heuristics with rough-fuzzy clustering of the data. ICA is used to
separate independent components (ICs) from the monitored variables.
Rough set theory reduces the amount of data and removes redundancy,
while fuzzy methods allow objects to belong to several clusters
simultaneously, with different degrees of membership. Our approach
allows us not only to recognize known attacks but also to detect
activity that may be the result of a new, unknown attack.
Experimental results are reported on the Knowledge Discovery and
Data Mining (KDD Cup 1999) dataset.
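The fuzzy clustering step can be illustrated with the standard fuzzy c-means membership formula, under which an object belongs to every cluster with some degree of membership. This is a minimal sketch of that general idea, not the paper's exact rough-fuzzy algorithm; the cluster centers and the fuzzifier m are illustrative:

```python
import math

def fuzzy_memberships(point, centers, m=2.0):
    """Fuzzy c-means membership degrees of one point to each cluster
    center. Memberships sum to 1 but may all be non-zero, so a point
    can belong to several clusters simultaneously."""
    dists = [math.dist(point, c) for c in centers]
    if 0.0 in dists:  # point coincides with a center: crisp membership
        return [1.0 if d == 0.0 else 0.0 for d in dists]
    p = 2.0 / (m - 1.0)
    return [1.0 / sum((dists[i] / dists[j]) ** p
                      for j in range(len(centers)))
            for i in range(len(centers))]

# A point closer to the first center gets a higher, but not exclusive,
# membership in the first cluster.
memberships = fuzzy_memberships((1.0, 1.0), [(0.0, 0.0), (3.0, 3.0)])
```

In the rough-fuzzy setting, such graded memberships are what allow borderline network events to be flagged as possibly anomalous rather than forced into a single known-attack cluster.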
Abstract: The main problem in recognizing handwritten Persian digits using a Neural Network is extracting an appropriate feature vector from the image matrix. In this research an asymmetrical segmentation pattern is proposed to obtain the feature vector. This pattern can be adjusted as an optimum model thanks to its one degree of freedom as a control point. Since any chosen algorithm depends on digit identity, a Neural Network is used to overcome this dependence. Inputs of this Network are the moment of inertia and the center of gravity, which do not depend on digit identity. Recognizing the digit is carried out using another Neural Network. Simulation results indicate a high recognition rate of 97.6% for the newly introduced pattern in comparison to previous models for digit recognition.
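The digit-independent inputs mentioned above, the center of gravity and the moment of inertia of the image matrix, can be sketched as follows. This is a minimal illustration assuming a grayscale image stored as a list of rows; the paper does not specify its normalization, so none is assumed:

```python
def image_features(img):
    """Center of gravity (cx, cy) and moment of inertia of a grayscale
    image matrix, computed from pixel intensities. Both quantities
    describe the mass distribution of ink, not the digit's identity."""
    total = sum(sum(row) for row in img)
    cy = sum(i * v for i, row in enumerate(img) for v in row) / total
    cx = sum(j * v for row in img for j, v in enumerate(row)) / total
    # Moment of inertia about the center of gravity.
    inertia = sum(v * ((i - cy) ** 2 + (j - cx) ** 2)
                  for i, row in enumerate(img) for j, v in enumerate(row))
    return (cx, cy), inertia

# A symmetric "plus" shape: center of gravity lands on the middle pixel.
(cx, cy), inertia = image_features([[0, 1, 0], [1, 1, 1], [0, 1, 0]])
```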
Abstract: Feature selection plays an important role in applications with high-dimensional data. The assessment of the stability of feature selection/ranking algorithms becomes an important issue when the dataset is small and the aim is to gain insight into the underlying process by analyzing the most relevant features. In this work, we propose a graphical approach that enables analysis of the similarity between feature ranking techniques as well as of their individual stability. Moreover, it works with any stability metric (Canberra distance, Spearman's rank correlation coefficient, Kuncheva's stability index, etc.). We illustrate this visualization technique by evaluating the stability of several feature selection techniques on a spectral binary dataset. Experimental results with a neural-based classifier show that stability and ranking quality may not be linked together, and both issues have to be studied jointly in order to offer answers to the domain experts.
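One of the stability metrics named above, Spearman's rank correlation coefficient, can be sketched for two feature rankings. This is a minimal tie-free implementation of the metric itself, not the paper's visualization technique:

```python
def spearman_rho(rank_a, rank_b):
    """Spearman's rank correlation between two feature rankings,
    given as lists of ranks for the same features (no ties assumed).
    Returns 1 for identical rankings, -1 for fully reversed ones."""
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))
```

Comparing the rankings produced by a feature selector across resampled versions of a small dataset with such a metric is one way to quantify the selector's stability.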
Abstract: Virtualization and high performance computing have been discussed from a performance perspective in recent publications. We present and discuss a flexible and efficient approach to the management of virtual clusters. A virtual machine management tool is extended to function as a fabric for cluster deployment and management. We show how features such as saving the state of a running cluster can be used to avoid disruption. We also compare our approach to the traditional methods of cluster deployment and present benchmarks which illustrate the efficiency of our approach.
Abstract: Ice cover has a significant impact on rivers, as it affects the ice melting capacity, which can result in flooding, restrict navigation, and modify the ecosystem and microclimate. River ice is made up of different ice types with varying thickness, so surveillance of river ice plays an important role. River ice types are captured using an infrared imaging camera, which captures images even during night time. In this paper, river ice infrared texture images are analysed using first-order statistical methods and second-order statistical methods. The second-order statistical methods considered are the spatial gray level dependence method, the gray level run length method, and the gray level difference method. The performance of the feature extraction methods is evaluated using a Probabilistic Neural Network classifier, and it is found that the first-order statistical method and the second-order statistical methods each yield low accuracy on their own. The features extracted by the first-order statistical method and a second-order statistical method are therefore combined, and it is observed that these combined features (first-order statistical method + gray level run length method) provide higher accuracy than the features from either the first-order or a second-order statistical method alone.
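The first-order statistical features can be sketched directly from the pixel values, since they depend only on the gray-level distribution and not on spatial arrangement. This is a minimal illustration; the paper does not list its exact feature set, so mean, variance, skewness and entropy are assumed here:

```python
import math

def first_order_features(pixels):
    """First-order statistical texture features computed from a flat
    list of gray levels: mean, variance, skewness and histogram
    entropy. No pixel-pair (second-order) information is used."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    std = math.sqrt(var)
    skew = (sum((p - mean) ** 3 for p in pixels) / n) / std ** 3 if std else 0.0
    hist = {}
    for p in pixels:
        hist[p] = hist.get(p, 0) + 1
    entropy = -sum((c / n) * math.log2(c / n) for c in hist.values())
    return {"mean": mean, "variance": var, "skewness": skew,
            "entropy": entropy}

features = first_order_features([0, 0, 255, 255])
```

Concatenating such a first-order vector with run-length features, as the abstract describes, simply means joining the two feature dictionaries into one input vector for the classifier.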
Abstract: Prior research has evidenced that unimodal biometric
systems suffer from several limitations, such as noisy data,
intra-class variations, restricted degrees of freedom,
non-universality, spoof attacks, and unacceptable error rates. For a
biometric system to be more secure and to provide high accuracy,
more than one form of biometrics is required. Hence the need arises
for multimodal biometrics using combinations of different biometric
modalities. This
paper introduces a multimodal biometric system (MMBS) based on
fusion of whole dorsal hand geometry and fingerprints that acquires
right and left (Rt/Lt) near-infra-red (NIR) dorsal hand geometry (HG)
shape and (Rt/Lt) index and ring fingerprints (FP). A database of
100 volunteers was acquired using the designed prototype. The
acquired images were found to have good quality for feature and
pattern extraction across all modalities. HG features based on the hand shape
anatomical landmarks were extracted. Robust and fast algorithms for
FP minutia points feature extraction and matching were used. Feature
vectors that belong to similar biometric traits were fused using
feature fusion methodologies. Scores obtained from different
biometric trait matchers were fused using the Min-Max
transformation-based score fusion technique. Final normalized scores
were merged using the sum of scores method to obtain a single
decision about the personal identity based on multiple independent
sources. High individuality of the fused traits and user acceptability
of the designed system along with its experimental high performance
biometric measures showed that this MMBS can be considered for
medium-to-high security biometric identification purposes.
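The Min-Max score normalization and sum-of-scores fusion described above can be sketched as follows. This is a minimal illustration with made-up matcher scores; the paper's actual score ranges and any matcher weighting are not specified:

```python
def min_max_normalize(scores):
    """Min-Max transformation: map one matcher's raw scores onto
    [0, 1] so scores from different matchers become comparable."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def sum_fusion(*matcher_scores):
    """Fuse normalized scores from several independent matchers with
    the sum-of-scores rule: one fused score per candidate identity."""
    normalized = [min_max_normalize(s) for s in matcher_scores]
    return [sum(col) for col in zip(*normalized)]

# Hypothetical scores for three candidate identities from a hand
# geometry matcher and a fingerprint matcher.
fused = sum_fusion([10.0, 20.0, 30.0], [0.2, 0.9, 0.4])
```

The identity with the highest fused score is taken as the final decision, combining evidence from multiple independent biometric sources.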
Abstract: In this paper, a novel road extraction method using the Stationary Wavelet Transform is proposed. To detect road features in color aerial satellite imagery, Mexican hat wavelet filters are used by applying the Stationary Wavelet Transform in a multiresolution, multi-scale sense and forming products of wavelet coefficients at different scales to locate and identify road features across several scales. In addition, the shifting of road feature locations across scales is taken into account for robust road extraction in the presence of asymmetric road feature profiles. The experimental results show that the proposed method provides a useful basis for road feature extraction. Moreover, the method is general and can be applied to other features in imagery.
Abstract: The goal of speech parameterization is to extract from the audio signal the relevant information about what is being spoken. In speech recognition systems, Mel-Frequency Cepstral Coefficients (MFCC) and Relative Spectral Mel-Frequency Cepstral Coefficients (RASTA-MFCC) are the two main techniques used. This paper presents some modifications to the original MFCC method. In our work, the effectiveness of the proposed changes to MFCC, called Modified Function Cepstral Coefficients (MODFCC), was tested and compared against the original MFCC and RASTA-MFCC features. Prosodic features such as jitter and shimmer are added to the baseline spectral features. The above-mentioned techniques were tested with impulsive signals under various noisy conditions within the AURORA databases.
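The prosodic features mentioned above can be sketched with a common cycle-to-cycle perturbation measure. This is a minimal illustration; several jitter/shimmer variants exist, and the paper does not specify which one it uses:

```python
def relative_perturbation(values):
    """Mean absolute difference between consecutive values, relative
    to the mean value. Applied to a sequence of pitch periods this
    gives (local) jitter; applied to a sequence of cycle peak
    amplitudes it gives (local) shimmer."""
    diffs = [abs(b - a) for a, b in zip(values, values[1:])]
    return (sum(diffs) / len(diffs)) / (sum(values) / len(values))

# Hypothetical per-cycle measurements extracted from a voiced segment.
jitter = relative_perturbation([0.010, 0.011, 0.010, 0.012])   # periods (s)
shimmer = relative_perturbation([0.80, 0.75, 0.82, 0.78])      # amplitudes
```

Appending such scalar prosodic measures to a spectral feature vector is a straightforward way to combine the two information sources, as the abstract describes.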
Abstract: The purpose of this work is the development of an
automatic classification system which could be useful for radiologists
in the investigation of breast cancer. The software has been designed
in the framework of the MAGIC-5 collaboration.
In the automatic classification system the suspicious regions with
high probability to include a lesion are extracted from the image as
regions of interest (ROIs). Each ROI is characterized by some
features based on morphological lesion differences.
Classifiers such as a Feed Forward Neural Network, a K-Nearest
Neighbours classifier and a Support Vector Machine are used to
distinguish the pathological records from the healthy ones.
The results obtained in terms of sensitivity (percentage of
pathological ROIs correctly classified) and specificity (percentage
of non-pathological ROIs correctly classified) are presented through
the Receiver Operating Characteristic (ROC) curve. In particular,
the best performance, an area under the ROC curve of 88% ± 1, is
obtained with the Feed Forward Neural Network.
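The sensitivity and specificity measures defined above follow directly from the confusion counts of a classifier. A minimal sketch, with illustrative labels rather than the study's data:

```python
def sensitivity_specificity(labels, predictions):
    """Sensitivity = fraction of pathological ROIs (label 1) correctly
    classified; specificity = fraction of non-pathological ROIs
    (label 0) correctly classified."""
    tp = sum(1 for l, p in zip(labels, predictions) if l == 1 and p == 1)
    tn = sum(1 for l, p in zip(labels, predictions) if l == 0 and p == 0)
    return tp / labels.count(1), tn / labels.count(0)

# Hypothetical classifications of 8 ROIs: 4 pathological, 4 healthy.
sens, spec = sensitivity_specificity([1, 1, 1, 1, 0, 0, 0, 0],
                                     [1, 1, 1, 0, 0, 0, 1, 1])
```

Sweeping the classifier's decision threshold and plotting sensitivity against (1 - specificity) at each setting traces out the ROC curve whose area the abstract reports.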
Abstract: Value engineering is an effective tool for managers to
support their decision making. Value studies give managers a
suitable instrument to reduce life-cycle costs, improve quality,
improve structures, shorten the construction schedule, prolong
longevity, or achieve a combination of these. The pressures managers
face on one hand, and their accountability within their fields
together with the inherent risks and ambiguities of alternative
options on the other, place some decision makers in a dilemma.
Combining risk management and value engineering in project
management, given the complexity of implementing projects, can serve
as a tool to identify and eliminate every item that causes
unnecessary cost and wasted time, without harming the essential
project functions. It should of course be noted that implementing
risk management and value engineering to improve efficiency and
functionality may lengthen the project implementation schedule;
improving the schedule does not always mean shortening it. This
article first deals with the concepts of risk management and value
engineering. The effects observed from their implementation at Iran
Khodro Corporation are then considered, together with the common
features and the integration of the two approaches; finally, the
proposed framework is presented for use in engineering and
industrial projects, including those of Iran Khodro Corporation.
Abstract: Today's children, who are born into a more colorful,
more creative, more abstract and more accessible communication
environment than their ancestors as a result of dizzying advances in
technology, have an interesting capacity to perceive and make sense
of the world. Millennium children, who live in an environment where
all kinds of efforts by marketing communication are more intensive
than ever are, from their early childhood on, subject to all kinds of
persuasive messages. As regards advertising communication, it
outperforms all the other marketing communication efforts in
creating little consumer individuals and, as a result of processing of
codes and signs, plays a significant part in building a world of seeing,
thinking and understanding for children. Children who are raised with
metaphorical expressions such as tales and riddles also meet that fast
and effective meaning communication in advertisements.
Children's perception of metaphors, which help grasp the "product
and its promise" both verbally and visually and facilitate association
between them is the subject of this study. Stimulating and activating
imagination, metaphors have unique advantages in promoting the
product and its promise especially in regard to print advertisements,
which have certain limitations. This study deals comparatively with
both literal and metaphoric versions of print advertisements
belonging to various product groups and attempts to discover to what
extent advertisements are liked, recalled, perceived and are
persuasive. The sample group of the study, which was conducted in
two elementary schools situated in areas that had different socioeconomic
features, consisted of children aged 12.
Abstract: The short-term morphological evolution of Ponta do Tubarão Island (PTI) was investigated through highly accurate surveys based on post-processed kinematic (PPK) relative positioning using Global Navigation Satellite Systems (GNSS). PTI is part of a barrier island system in a high-energy northeast Brazilian coastal environment and is also an area of high environmental sensitivity. Surveys were carried out quarterly over a two-year period from May 2010 to May 2012. This paper statistically assesses the performance of digital elevation models (DEMs) derived from different interpolation methods in representing morphologic features and quantifying volumetric changes; TIN models showed the best results for these purposes. The DEMs allowed quantifying surfaces and volumes in detail, identifying the segments of PTI most vulnerable to erosion and/or accumulation of sediments, and relating the alterations to climate conditions. The coastal setting and geometry of PTI protect a significant mangrove ecosystem, as well as some oil and gas facilities installed in the vicinity, from the damaging effects of strong ocean waves and currents. Thus, the maintenance of PTI is essential, but the prediction of its longevity is uncertain because the results indicate an irregular sedimentary balance and a substantial decline in sediment supply to this coastal area.
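Volumetric change between two DEM surveys can be sketched as a cell-by-cell elevation difference. This is a minimal illustration assuming two aligned regular grids of equal shape; the paper's TIN-based computation is more involved:

```python
def volume_change(dem_before, dem_after, cell_area):
    """Sediment volume change between two DEM grids (lists of rows of
    elevations, identical shape). Cells that rose count as
    accumulation (gain), cells that dropped count as erosion (loss);
    the net balance is gain minus loss."""
    gain = loss = 0.0
    for row_b, row_a in zip(dem_before, dem_after):
        for zb, za in zip(row_b, row_a):
            dv = (za - zb) * cell_area
            if dv > 0:
                gain += dv
            else:
                loss += -dv
    return gain, loss, gain - loss

# Toy 2x2 grids (elevations in m, cell area in m^2): one cell accretes
# by 1 m, one erodes by 1 m, so the net balance is zero.
gain, loss, net = volume_change([[1.0, 1.0], [1.0, 1.0]],
                                [[2.0, 1.0], [0.0, 1.0]], 1.0)
```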
Abstract: A plausible architecture of an ancient genetic code is derived from an extended base-triplet vector space over the Galois field of the extended base alphabet {D, G, A, U, C}, where the letter D represents one or more hypothetical bases with unspecific pairing. We hypothesize that the high degeneracy of a primeval genetic code with five bases, together with the gradual origin and improvement of a primitive DNA repair system, could have made possible the transition from the ancient to the modern genetic code. Our results suggest that the Watson-Crick base pairing and the non-specific base pairing of the hypothetical ancestral base D, used to define the sum and product operations, are sufficient features to determine the coding constraints of the primeval and the modern genetic code, as well as the transition from the former to the latter. Geometrical and algebraic properties of this vector space reveal that the present codon assignment of the standard genetic code could have been induced from a primeval codon assignment. Besides, the Fourier spectrum of the extended DNA genome sequences derived from the multiple sequence alignment suggests that the so-called period-3 property of present coding DNA sequences could also have existed in ancient coding DNA sequences.
Abstract: IPsec has now become a standard information security
technology throughout the Internet society. It provides a well-defined
architecture that takes into account confidentiality, authentication,
integrity, secure key exchange, and protection against replay
attacks. For connectionless security services on a per-packet basis,
the IETF IPsec Working Group has standardized two extension headers
(AH and ESP) as well as key exchange and authentication protocols. It is
also working on a lightweight key exchange protocol and MIBs for
security management. IPsec technology has been implemented on
various platforms in IPv4 and IPv6, gradually replacing old
application-specific security mechanisms. IPv4 and IPv6 are not
directly compatible, so programs and systems designed for one
standard cannot communicate with those designed for the other. We
propose the design and implementation of controlled Internet security
system, which is IPsec-based Internet information security system in
IPv4/IPv6 network and also we show the data of performance
measurement. With the features like improved scalability and
routing, security, ease-of-configuration, and higher performance of
IPv6, the controlled Internet security system provides consistent
security policy and integrated security management on IPsec-based
Internet security system.
Abstract: Recently, permeable breakwaters have been suggested to overcome the disadvantages of fully protective breakwaters. These protective structures have minor impacts on the coastal environment and neighboring beaches, while providing more economical protection from waves and currents. For regular waves, a numerical model (FLOW-3D, VOF) is used to investigate the hydraulic performance of a permeable breakwater. The permeable breakwater model consists of a pair of identical vertical slotted walls with impermeable upper and lower parts, where the draft is a decimal multiple of the total depth. The middle part is permeable with a porosity of 50%. The second barrier is located at distances of 0.5 and 1.5 times the water depth from the first one. The numerical model is validated by comparisons with previous laboratory data and semi-analytical results for the same model. Good agreement between the numerical results and both the laboratory data and the semi-analytical results is shown, and the results indicate the applicability of the numerical model to reproduce most of the important features of the interaction. Through the numerical investigation, the friction factor of the model is discussed in detail.
Abstract: In this paper, a wavelet-based ANFIS for detecting
inter-turn faults in generators is proposed. The detector responds
uniquely to winding inter-turn faults with remarkably high
sensitivity. Discrimination between different percentages of winding
affected by an inter-turn fault is provided via an ANFIS with an
eight-dimensional input vector. This input vector is obtained from
features extracted from the DWT of the inter-turn fault current
leaving the generator phase winding. Training data for the ANFIS are
generated via a simulation of a generator with an inter-turn fault
using MATLAB. The proposed ANFIS-based algorithm gives better
performance than an ANN with selected statistical data of the
decomposed levels of the faulty current.
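The feature extraction step, taking statistics of DWT decomposition levels of the fault current, can be sketched with a one-level Haar transform. This is a minimal stand-in: the paper does not name its mother wavelet, and the per-level statistics assumed here (mean, standard deviation, energy) are illustrative:

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform (a simple
    stand-in for the paper's unspecified wavelet). Returns the
    approximation and detail coefficient sequences."""
    s = math.sqrt(2.0)
    pairs = list(zip(signal[0::2], signal[1::2]))
    approx = [(a + b) / s for a, b in pairs]
    detail = [(a - b) / s for a, b in pairs]
    return approx, detail

def level_features(coeffs):
    """Statistical features of one decomposition level, usable as one
    slice of the ANFIS input vector: mean, standard deviation, energy."""
    n = len(coeffs)
    mean = sum(coeffs) / n
    std = math.sqrt(sum((c - mean) ** 2 for c in coeffs) / n)
    energy = sum(c * c for c in coeffs)
    return mean, std, energy

# A constant current sample has no detail energy; its signal energy is
# preserved in the approximation coefficients.
approx, detail = haar_dwt([1.0, 1.0, 1.0, 1.0])
```

Repeating the decomposition on the approximation coefficients yields deeper levels; stacking a few statistics per level produces a compact fixed-length input vector for the ANFIS.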
Abstract: By systematically applying different engineering
methods, difficult financial problems become approachable. Using a
combination of theory and techniques such as wavelet transform,
time series data mining, Markov chain based discrete stochastic
optimization, and evolutionary algorithms, this work formulated a
strategy to characterize and forecast non-linear time series. It
attempted to extract typical features from the volatility data sets of
S&P100 and S&P500 indices that include abrupt drops, jumps and
other non-linearities. As a result, forecasting accuracy reached an
average of over 75%, surpassing other publicly available results on
the forecasting of financial indices.
Abstract: A method of collecting composition data and examining structural features of pearlite lamellae and the parent austenite at the growth interface in a 13 wt.% manganese steel has been demonstrated using Scanning Transmission Electron Microscopy (STEM). The combination of composition data and the structural features observed at the growth interface shows that available theories of pearlite growth cannot explain all the observations.
Abstract: A novel feature selection strategy to improve recognition accuracy on faces affected by non-uniform illumination, partial occlusions and varying expressions is proposed in this paper. The technique is especially applicable in scenarios where the possibility of obtaining a reliable intra-class probability distribution is minimal due to a small number of training samples. Phase congruency features in an image are defined as the points where the Fourier components of that image are maximally in phase. These features are invariant to the brightness and contrast of the image under consideration. This property makes it possible to achieve lighting-invariant face recognition. Phase congruency maps of the training samples are generated and a novel modular feature selection strategy is implemented. Smaller sub-regions from a predefined neighborhood within the phase congruency images of the training samples are merged to obtain a large set of features. These features are arranged in order of increasing distance between the sub-regions involved in the merging. The assumption behind the proposed region merging and arrangement strategy is that local dependencies among the pixels are more important than global dependencies. The obtained feature sets are then arranged in decreasing order of discriminating capability using a criterion function (the ratio of the between-class variance to the within-class variance of the sample set) in the PCA domain. The results indicate a large improvement in classification performance compared to baseline algorithms.
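The criterion function described above, the ratio of between-class to within-class variance, can be sketched for a single scalar feature. This is a minimal two-class illustration; the paper applies the criterion to whole feature sets in the PCA domain:

```python
def fisher_ratio(class_a, class_b):
    """Between-class to within-class variance ratio for one scalar
    feature over two classes. Larger values mean the feature
    separates the classes better."""
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    ma, mb = mean(class_a), mean(class_b)
    m = mean(class_a + class_b)
    na, nb = len(class_a), len(class_b)
    between = (na * (ma - m) ** 2 + nb * (mb - m) ** 2) / (na + nb)
    within = (na * var(class_a) + nb * var(class_b)) / (na + nb)
    return between / within

# Well-separated classes give a high ratio.
r = fisher_ratio([0.0, 0.0, 1.0, 1.0], [10.0, 10.0, 11.0, 11.0])
```

Ranking candidate feature sets by this ratio and keeping the top-ranked ones is the selection principle the abstract describes.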
Abstract: Iris-based biometric authentication is gaining importance
in recent times. Iris biometric processing, however, is a complex
and computationally expensive process. In the overall processing
of iris biometric in an iris-based biometric authentication system,
feature processing is an important task. In feature processing, we extract
iris features, which are ultimately used in matching. Since there
is a large number of iris features and computational time increases
with the number of features, it is a challenge to develop an iris
processing system with as few features as possible without
compromising correctness.
In this paper, we address this issue and present an approach to feature
extraction and feature matching process. We apply Daubechies D4
wavelet with 4 levels to extract features from iris images. These
features are encoded with 2 bits by quantizing into 4 quantization
levels. With our proposed approach it is possible to represent an
iris template with only 304 bits, whereas existing approaches require
as many as 1024 bits. In addition, we assign different weights to
different iris regions when comparing two iris templates, which
significantly increases the accuracy. Further, we match the iris
templates based on
a weighted similarity measure. Experimental results on several iris
databases substantiate the efficacy of our approach.
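The 2-bit encoding and weighted matching described above can be sketched as follows. This is a minimal illustration; the quantization thresholds and region weights below are assumptions, not the paper's values:

```python
def encode_2bit(coeffs, thresholds):
    """Quantize wavelet coefficients into 4 levels using 3 sorted
    thresholds and encode each level as 2 bits, so k coefficients
    yield a 2k-bit iris template (e.g. 152 coefficients -> 304 bits)."""
    bits = []
    for c in coeffs:
        level = sum(1 for t in thresholds if c >= t)  # 0..3
        bits.extend(divmod(level, 2))                 # (high bit, low bit)
    return bits

def weighted_similarity(bits_a, bits_b, weights):
    """Weighted fraction of matching template bits; larger weights
    emphasise the more reliable iris regions."""
    matched = sum(w for a, b, w in zip(bits_a, bits_b, weights) if a == b)
    return matched / sum(weights)

# Hypothetical coefficients and thresholds: 4 coefficients -> 8 bits.
template = encode_2bit([-1.0, 0.1, 0.6, 2.0], [-0.5, 0.0, 0.5])
```

At match time, the weighted similarity between a probe template and each enrolled template is compared against a decision threshold.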