Abstract: This paper describes an automated, implementable system for the detection and recognition of impulsive signals. The system uses a Digital Signal Processing device for the detection and identification process, analysing the signals in real time in order to produce a specific output when needed. Detection is achieved by normalizing the inputs and comparing the incoming signals to a dynamic threshold, thus avoiding false detections caused by loud or fluctuating ambient noise. Identification is performed by neural network algorithms. During setup, the system can receive signals in order to “learn” certain patterns. Through this “learning”, the system can recognize signals faster and gains flexibility toward new patterns similar to those already known. Sound is captured through a simple jack input, which could be replaced by an enhanced recording device such as a wide-area recorder. Furthermore, a communication module can be added to the apparatus to send alerts to another interface if needed.
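The detection step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the smoothing factor, threshold multiplier and normalization scheme are assumptions.

```python
# Hypothetical sketch of the dynamic-threshold impulse detector: inputs
# are normalized, a running estimate tracks the ambient noise level, and
# a sample is flagged as an impulse only when it exceeds that adaptive
# threshold. Parameter values are illustrative.

def detect_impulses(samples, alpha=0.95, k=4.0):
    """Return indices of samples exceeding a dynamic noise threshold.

    alpha -- smoothing factor of the running noise-level estimate
    k     -- how many times the noise level a sample must exceed
    """
    peak = max(abs(s) for s in samples) or 1.0
    normalized = [s / peak for s in samples]   # normalize inputs

    noise = abs(normalized[0])
    hits = []
    for i, x in enumerate(normalized):
        if abs(x) > k * noise:
            hits.append(i)                     # impulse detected
        else:
            # only quiet samples update the noise estimate, so a loud
            # but steady environment raises the threshold gradually
            noise = alpha * noise + (1 - alpha) * abs(x)
    return hits
```

Because the threshold tracks the ambient level, a steady loud background raises the bar instead of triggering detections.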
Abstract: The goal of a network-based intrusion detection system is to classify network traffic activities into two major categories: normal and attack (intrusive) activities. Nowadays, data mining and machine learning play an important role in many sciences, including intrusion detection systems (IDS), using both supervised and unsupervised techniques. One of the essential steps of data mining is feature selection, which helps in improving the efficiency, performance and prediction rate of the proposed approach. This paper applies the unsupervised K-means clustering algorithm with information gain (IG) for feature selection and reduction to build a network intrusion detection system. For our experimental analysis, we have used the new NSL-KDD dataset, a modified version of the KDDCup 1999 intrusion detection benchmark dataset. With a split of 60.0% for the training set and the remainder for the testing set, a two-class classification (Normal, Attack) has been implemented. The Weka framework, a Java-based open-source collection of machine learning algorithms for data mining tasks, has been used in the testing process. The experimental results show that the proposed approach is very accurate, with a low false positive rate and a high true positive rate, and that it takes less learning time than using the full feature set of the dataset with the same algorithm.
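The information-gain feature-ranking step described above can be sketched as follows; the toy feature names and values are illustrative, not taken from NSL-KDD.

```python
# Hedged sketch of information-gain (IG) feature ranking, the
# feature-reduction step applied before K-means clustering.
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG = H(labels) - weighted sum of H(labels | feature value)."""
    total = entropy(labels)
    n = len(labels)
    for value in set(feature_values):
        subset = [l for f, l in zip(feature_values, labels) if f == value]
        total -= len(subset) / n * entropy(subset)
    return total

# Rank features and keep only the most informative ones.
labels = ["normal", "normal", "attack", "attack"]
proto  = ["tcp", "tcp", "udp", "udp"]    # perfectly separates classes
flag   = ["SF", "S0", "SF", "S0"]        # carries no class information
ranked = sorted([("proto", information_gain(proto, labels)),
                 ("flag", information_gain(flag, labels))],
                key=lambda t: t[1], reverse=True)
```

Features with near-zero IG are dropped before clustering, which is what shortens the learning time reported above.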
Abstract: The goal of speech parameterization is to extract, from the audio signal, the relevant information about what is being spoken. Mel-Frequency Cepstral Coefficients (MFCC) and Relative Spectral Mel-Frequency Cepstral Coefficients (RASTA-MFCC) are the two main techniques used in speech recognition systems. This paper presents some modifications to the original MFCC method. In our work, the effectiveness of the proposed changes to MFCC, called Modified Function Cepstral Coefficients (MODFCC), was tested and compared against the original MFCC and RASTA-MFCC features. Prosodic features such as jitter and shimmer are added to the baseline spectral features. The above-mentioned techniques were tested on impulsive signals under various noisy conditions from the AURORA databases.
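Two building blocks of standard MFCC extraction can be sketched as follows. The hertz-to-mel mapping uses the common O'Shaughnessy constants; this illustrates the baseline MFCC pipeline, not the paper's MODFCC modifications.

```python
# Minimal sketch: the hertz-to-mel frequency mapping, and the DCT-II
# that turns log mel-filterbank energies into cepstral coefficients.
from math import cos, log, pi

def hz_to_mel(hz):
    return 2595.0 * log(1.0 + hz / 700.0, 10)

def dct2(energies, n_coeffs):
    """DCT-II of the log filterbank energies -> cepstral coefficients."""
    n = len(energies)
    return [sum(e * cos(pi * k * (m + 0.5) / n)
                for m, e in enumerate(energies))
            for k in range(n_coeffs)]
```

In a full pipeline these sit after framing, windowing, FFT and mel filterbank stages; only the first dozen or so coefficients are usually kept.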
Abstract: Natural outdoor scene classification is an active and promising research area around the globe. In this study, the classification is carried out in two phases. In the first phase, features are extracted from the images by the wavelet decomposition method and stored in a database as feature vectors. In the second phase, neural classifiers, namely the back-propagation neural network (BPNN) and the resilient back-propagation neural network (RPNN), are employed for the classification of scenes. Four hundred color images of two classes, forest and street, are taken from the MIT database. A comparative study has been carried out on the performance of the two neural classifiers, BPNN and RPNN, for an increasing number of test samples. RPNN showed better classification results than BPNN on the large test samples.
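The abstract does not name the wavelet family used; as a minimal illustration of the feature-extraction phase, a single level of 1-D Haar decomposition is sketched below. The 2-D image case applies this step to rows and then to columns.

```python
# Illustrative single-level Haar wavelet step: splits a signal (e.g. an
# image row) into approximation and detail coefficients, which can be
# stored as part of a feature vector.
from math import sqrt

def haar_step(signal):
    """One Haar decomposition level: (approximation, detail)."""
    approx = [(signal[i] + signal[i + 1]) / sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail
```

Repeating the step on the approximation coefficients yields a multi-level decomposition; summary statistics of each subband are typical feature-vector entries.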
Abstract: This paper proposes a novel methodology for enabling debugging and tracing of production web applications without affecting their normal flow and functionality. This method of debugging enables developers and maintenance engineers to replace a set of existing resources, such as images, server-side scripts and cascading style sheets, with another set of resources per web session. The new resources are active only in the debug session; other sessions are not affected. This methodology helps developers trace defects, especially those that appear only in production environments, and explore the behaviour of the system. A realization of the proposed methodology has been implemented in Java.
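The paper's realization is in Java; the core per-session override idea can be sketched language-agnostically as below. The class and method names are illustrative, not the paper's API.

```python
# Sketch of a resolver that serves replacement resources only to
# sessions flagged for debugging, leaving every other session on the
# original production resources.

class ResourceResolver:
    def __init__(self, resources):
        self.resources = resources    # path -> production content
        self.overrides = {}           # session_id -> {path: content}

    def start_debug(self, session_id, replacements):
        """Attach replacement resources to one session only."""
        self.overrides[session_id] = dict(replacements)

    def resolve(self, session_id, path):
        """Debug sessions see their overrides; others see production."""
        override = self.overrides.get(session_id, {})
        return override.get(path, self.resources[path])
```

Keying the overrides on the session identifier is what keeps the debug resources invisible to every other live user.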
Abstract: This paper describes an effective solution to the task of remote monitoring of super-extended objects (oil and gas pipelines, railways, national frontiers). The suggested solution is based on the principle of simultaneously monitoring the seismoacoustic and optical/infrared physical fields. The principle of simultaneous monitoring of these fields is not new, but in contrast to the known solutions, the suggested approach makes it possible to monitor super-extended objects at very limited operational cost. So-called C-OTDR (Coherent Optical Time Domain Reflectometer) systems are used to monitor the seismoacoustic field, and far-CCTV systems are used to monitor the optical/infrared field. Simultaneous processing of the data provided by both systems allows target activities appearing in the vicinity of the monitored objects to be detected and classified effectively. Results of practical use have shown the high effectiveness of the suggested approach.
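One simple way to combine the two channels, shown purely as a sketch, is weighted score fusion; the weights and threshold below are assumptions, not the paper's values.

```python
# Hedged sketch of two-channel decision fusion: detection scores from
# the seismoacoustic (C-OTDR) channel and the optical/infrared
# (far-CCTV) channel are combined before a target activity is declared.

def fuse(seismo_score, optical_score, w_seismo=0.6, w_optical=0.4,
         threshold=0.5):
    """Weighted fusion of per-channel detection scores in [0, 1]."""
    combined = w_seismo * seismo_score + w_optical * optical_score
    return combined >= threshold   # True -> declare target activity
```

Requiring agreement between independent physical fields is what suppresses the single-channel false alarms.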
Abstract: Face recognition is a technique for automatically identifying or verifying individuals. It receives great attention in identification, authentication, security and many other applications. Diverse methods have been proposed for this purpose, and many comparative studies have been performed. However, researchers have not reached a unified conclusion. In this paper, we report an extensive quantitative accuracy analysis of four of the most widely used face recognition algorithms: Principal Component Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA) and Support Vector Machine (SVM), using the AT&T, Sheffield and Bangladeshi people face databases under diverse conditions such as illumination, alignment and pose variations.
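The PCA step common to eigenface-style recognition can be sketched as follows. For clarity it finds the leading eigenvector of a tiny 2-feature covariance matrix by power iteration; real face vectors have thousands of dimensions and keep many components.

```python
# Minimal PCA sketch: centre the training vectors on their mean and
# project onto the leading principal direction.

def pca_direction(data, iters=100):
    n, d = len(data), len(data[0])
    mean = [sum(row[j] for row in data) / n for j in range(d)]
    centred = [[row[j] - mean[j] for j in range(d)] for row in data]
    # sample covariance matrix
    cov = [[sum(r[i] * r[j] for r in centred) / (n - 1)
            for j in range(d)] for i in range(d)]
    # power iteration for the leading eigenvector
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return mean, v

def project(mean, v, sample):
    """Coordinate of a sample along the principal direction."""
    return sum((s - m) * c for s, m, c in zip(sample, mean, v))
```

Recognition then compares these low-dimensional projections rather than raw pixel vectors.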
Abstract: There is a real threat to the personal pages of VIPs on Social Network Sites (SNS). The main threats to these pages are violation of privacy and identity theft through fake pages that exploit VIPs' names and pictures to attract victims and spread lies. In this paper, we propose a new secure architecture that improves trust and offers an effective solution for reducing fake pages and recognizing genuine VIP pages on SNS. The proposed architecture works as a third party, added to Facebook, that provides a trust service for the personal pages of VIPs. Through this mechanism, it verifies the real identity of the applicant by electronic authentication of personal information, which is stored within the content of the applicant's own website. As a result, the significance of the proposed architecture is that it secures and provides trust to VIPs' personal pages. Furthermore, it can help discover fake pages, protect privacy, reduce identity-theft crimes, and increase the sense of trust and satisfaction of friends and admirers interacting with the SNS.
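The verification idea can be sketched as below: the third party derives a token from the applicant's identity data, the applicant publishes it inside their own website's content, and the service confirms its presence. The hashing scheme is an assumption for illustration, not the paper's exact protocol.

```python
# Illustrative sketch of website-based identity verification for a VIP
# page. make_token() binds the identity data to the SNS page; verify()
# checks the token appears in the VIP's own website content.
from hashlib import sha256

def make_token(name, page_url, secret):
    """Token binding the VIP's identity data to their SNS page."""
    return sha256(f"{name}|{page_url}|{secret}".encode()).hexdigest()

def verify(website_content, name, page_url, secret):
    """True if the expected token appears in the VIP's own website."""
    return make_token(name, page_url, secret) in website_content
```

Because only the real VIP controls their own website, a fake page cannot pass this check.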
Abstract: SQL injection on web applications is a very popular kind of attack. Mechanisms such as intrusion detection systems exist to detect this attack, but these strategies often rely on techniques implemented at the high layers of the application and do not consider the low level of system calls. The problem with considering only the high-level perspective is that an attacker can circumvent the detection tools using techniques such as URL encoding. One technique currently used for detecting low-level attacks on privileged processes is the tracing of system calls. System calls act as a single gate to the Operating System (OS) kernel; they allow critical data to be captured at an appropriate level of detail. Our basic assumption is that any type of application, be it a system service, utility program or web application, “speaks” the language of system calls when having a conversation with the OS kernel. At this level we can see the actual attack while it is happening. We conduct an experiment to demonstrate the suitability of system call analysis for detecting SQL injection, and we are able to detect the attack. We therefore conclude that system calls are not only powerful for detecting low-level attacks but also enable us to detect high-level attacks such as SQL injection.
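One classic way to analyse system-call traces is fixed-length n-gram profiling in the style of Forrest et al.; the abstract does not specify the paper's exact analysis method, so the sketch below is illustrative only.

```python
# Hedged sketch of n-gram system-call anomaly detection: a profile of
# call sequences seen during normal operation is built, and a trace
# containing any unseen n-gram is flagged as anomalous.

def ngrams(trace, n=3):
    return {tuple(trace[i:i + n]) for i in range(len(trace) - n + 1)}

def build_profile(normal_traces, n=3):
    profile = set()
    for trace in normal_traces:
        profile |= ngrams(trace, n)
    return profile

def is_anomalous(trace, profile, n=3):
    """True if the trace contains a system-call n-gram never observed
    during normal operation."""
    return bool(ngrams(trace, n) - profile)
```

An injected SQL payload that forces the server into unusual kernel interactions would surface here as novel n-grams, even if the HTTP-level payload was obfuscated by URL encoding.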
Abstract: An effort estimation model is needed for software-intensive projects that consist of hardware, embedded software, or some combination of the two, as well as for high-level software solutions. This paper first focuses on functional decomposition techniques to measure the functional complexity of a computer system and investigates its impact on system development effort. It then examines the effects of technical difficulty and design team capability factors in order to construct the best effort estimation model. Using the traditional regression analysis technique, the study develops a system development effort estimation model that takes functional complexity, technical difficulty and design team capability as input parameters. Finally, the assumptions of the model are tested.
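A regression model of the assumed form effort = b0 + b1*complexity + b2*difficulty + b3*capability can be fitted by ordinary least squares; the sketch below solves the normal equations with Gaussian elimination. The form and any data fed to it are illustrative, not the paper's calibration.

```python
# Hedged sketch of ordinary least squares for an effort model with an
# intercept and one coefficient per input factor.

def ols(X, y):
    """Least-squares coefficients for y ~ 1 + X (normal equations)."""
    rows = [[1.0] + list(x) for x in X]        # prepend intercept column
    d = len(rows[0])
    # A = X^T X, b = X^T y
    A = [[sum(r[i] * r[j] for r in rows) for j in range(d)]
         for i in range(d)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(d)]
    # Gaussian elimination with partial pivoting
    for col in range(d):
        piv = max(range(col, d), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, d):
            f = A[r][col] / A[col][col]
            for c in range(col, d):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * d
    for r in range(d - 1, -1, -1):
        s = sum(A[r][c] * coeffs[c] for c in range(r + 1, d))
        coeffs[r] = (b[r] - s) / A[r][r]
    return coeffs
```

With three input factors, `ols` returns four coefficients: the intercept followed by one weight per factor.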
Abstract: This paper uses the radial basis function neural network (RBFNN) for system identification of nonlinear systems. Five nonlinear systems are used to examine the performance of the RBFNN in modelling nonlinear systems: a dual tank system, a single tank system, a DC motor system, and two academic models. The feed-forward method is considered in this work for modelling the nonlinear dynamic models. The K-means clustering algorithm is used in this paper to select the centres of the radial basis function network, because it is reliable, offers fast convergence and can handle large data sets. The least mean square method is used to adjust the weights of the output layer, and the Euclidean distance method is used to measure the width of the Gaussian function.
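The RBFNN pipeline described above can be sketched as below: Gaussian hidden units at fixed centres (which K-means would supply) and least mean square (LMS) updates of the output-layer weights. The centres, width, learning rate and target function are illustrative, not the paper's systems.

```python
# Hedged sketch: RBF forward pass plus LMS output-weight adaptation.
from math import exp

def rbf_output(x, centres, width, weights):
    """Forward pass: weighted sum of Gaussian basis activations."""
    phis = [exp(-((x - c) ** 2) / (2 * width ** 2)) for c in centres]
    return sum(w * p for w, p in zip(weights, phis)), phis

def lms_step(weights, phis, error, lr=0.1):
    """LMS rule: move each weight along its activation times the error."""
    return [w + lr * error * p for w, p in zip(weights, phis)]

# Train to approximate a simple nonlinear map y = x**2 on a few points.
centres, width = [0.0, 0.5, 1.0], 0.5
weights = [0.0, 0.0, 0.0]
for _ in range(500):
    for x in [0.0, 0.25, 0.5, 0.75, 1.0]:
        y_hat, phis = rbf_output(x, centres, width, weights)
        weights = lms_step(weights, phis, (x ** 2) - y_hat)
```

In the paper's setting, the centres come from K-means over the system's input data and the widths from Euclidean distances between centres, rather than the fixed values used here.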
Abstract: Cybercrime, alongside traditional crime, is becoming a big challenge in Nigeria. The inability to identify perpetrators is one of the reasons for the growing menace. This paper proposes a design for monitoring internet users’ activities in order to curb cybercrime. It requires redefining the operations of Internet Service Providers (ISPs), which would mandate users to be authenticated before accessing the internet. In implementing this work, which can be adapted to a larger scale, a virtual router application is developed and configured to mimic a real router device. A sign-up portal is developed to allow users to register with the ISP. The portal asks for identification information, including bio-data and government-issued identification data such as a National Identity Card number, et cetera. A unique username and password are chosen by the user to enable access to the internet; these credentials reference the user to the Internet Protocol address (IP address) of any system he uses on the internet, thereby associating him with any criminal act related to that IP address at that particular time. Questions such as “What happens when another user knows the password and uses it to commit crime?” and other pertinent issues are addressed.
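The accountability mechanism described above can be sketched as follows: users authenticate before gaining access, and each session records which username held which IP address over which interval, so a later incident at a given (IP, time) can be traced to a registered user. The data structures and method names are assumptions for demonstration.

```python
# Illustrative sketch of an authenticating access gateway that logs
# username-to-IP associations for later tracing.
import time

class AccessGateway:
    def __init__(self, credentials):
        self.credentials = credentials    # username -> password
        self.sessions = []                # [username, ip, start, end]

    def login(self, username, password, ip):
        if self.credentials.get(username) != password:
            return False                  # no internet access granted
        self.sessions.append([username, ip, time.time(), None])
        return True

    def logout(self, username, ip):
        for s in self.sessions:
            if s[0] == username and s[1] == ip and s[3] is None:
                s[3] = time.time()

    def trace(self, ip, at_time):
        """Which registered user held this IP at the given time?"""
        for user, sip, start, end in self.sessions:
            if sip == ip and start <= at_time <= (end or float("inf")):
                return user
        return None
```

A real deployment would persist these records at the ISP and integrate with the virtual router, which this sketch does not attempt.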