Abstract: In recent years, with the rapid development of the Internet and the Web, more and more web applications have been deployed in fields and organizations such as finance, the military, and government. At the same time, hackers have found ever more subtle ways to attack web applications. According to international statistics, SQL Injection is one of the most common vulnerabilities in web applications. The consequences of this type of attack are serious: sensitive information can be stolen and authentication systems can be bypassed. Several techniques have been adopted to mitigate the problem. In this research, a security solution using an Artificial Neural Network is proposed to protect web applications against this type of attack. The solution has been evaluated on sample datasets and has given promising results. It has also been implemented in a prototype web application firewall called ANNbWAF.
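As an illustration only (not ANNbWAF's actual feature set or architecture), a small neural network for flagging injection-like queries might be sketched as follows, using hypothetical hand-crafted features such as quote and comment-token counts:

```python
import numpy as np

# Hypothetical hand-crafted query features (NOT ANNbWAF's real feature set):
# normalized length, quote count, comment tokens, and suspicious keywords.
def extract_features(query):
    q = query.lower()
    return np.array([
        len(q) / 100.0,
        q.count("'") / 5.0,
        float(q.count("--") + q.count("/*")),
        float(sum(q.count(k) for k in ("union", " or ", "select", "drop"))),
    ])

# One-hidden-layer perceptron trained by gradient descent on cross-entropy.
def train_mlp(X, y, hidden=8, lr=0.5, epochs=2000, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, hidden)
    b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)                   # hidden activations
        p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
        g = (p - y) / len(y)                       # output-layer gradient
        W2 -= lr * h.T @ g
        b2 -= lr * g.sum()
        gh = np.outer(g, W2) * (1 - h ** 2)        # backprop through tanh
        W1 -= lr * X.T @ gh
        b1 -= lr * gh.sum(axis=0)
    return W1, b1, W2, b2

def predict(params, query):
    W1, b1, W2, b2 = params
    h = np.tanh(extract_features(query) @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

benign = ["SELECT name FROM users WHERE id = 4", "UPDATE t SET a = 1"]
malicious = ["' OR '1'='1' --", "1; DROP TABLE users --",
             "' UNION SELECT password FROM users --"]
X = np.array([extract_features(q) for q in benign + malicious])
y = np.array([0.0] * len(benign) + [1.0] * len(malicious))
params = train_mlp(X, y)
print(predict(params, "' OR '1'='1' --") > 0.5)   # flagged as injection
```

The toy queries and thresholds here are invented; a real firewall would need far richer features and a much larger labeled corpus.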
Abstract: Accurate prediction of non-peak traffic is crucial to daily traffic management for all forecasting models. In this paper, least squares support vector machines (LS-SVMs) are investigated to solve this practical problem. This is the first time the approach has been applied and its forecast performance analyzed in this domain. For comparison, two parametric and two non-parametric techniques are selected because of their proven effectiveness in past research. With good generalization ability and guaranteed global minima, LS-SVMs perform better than the other methods. The clear improvement in stability and robustness shows that the approach is practically promising.
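LS-SVM regression, as used above, reduces training to solving a single linear system. A minimal sketch on synthetic traffic-like data (not the paper's actual datasets or tuning) could look like this:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM regression training reduces to one linear system:
    #   [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                     # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy "traffic" series: flow as a smooth function of time-of-day plus noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 40)[:, None]
flow = np.sin(2 * np.pi * t[:, 0]) + 0.05 * rng.normal(size=40)
b, alpha = lssvm_fit(t, flow, gamma=100.0, sigma=0.2)
pred = lssvm_predict(t, b, alpha, t, sigma=0.2)
print(float(np.max(np.abs(pred - flow))) < 0.3)   # close in-sample fit
```

Because every training point gets a nonzero dual weight, LS-SVMs trade the sparsity of standard SVMs for a closed-form solve, which is part of why they are attractive for forecasting problems like this one.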
Abstract: Computing the facility location problem for every location in the country simultaneously is not easy. This paper describes solving the problem using cluster computing. A parallel algorithm based on local search with the single-swap method is designed to solve the problem on clusters. The parallel implementation uses the portable Message Passing Interface (MPI) on a Microsoft Windows Compute Cluster. The paper presents the local-search algorithm with the single-swap method and the MPI-based implementation, on a cluster, of the system that decides which facilities to open. When large datasets are considered, the process of calculating a reasonable cost for a facility becomes time consuming. The results show that the parallel computation of the facility location problem on a cluster achieves good speedup and scales well as the problem size increases.
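The single-swap local search can be sketched serially as follows (the MPI parallelization, which would distribute the candidate cost evaluations across cluster nodes, is omitted; the 1-D positions and costs are hypothetical):

```python
import random

def total_cost(open_facs, clients, facilities, open_cost):
    # Connection cost (each client served by its nearest open facility)
    # plus the cost of opening the chosen facilities.
    conn = sum(min(abs(c - facilities[f]) for f in open_facs) for c in clients)
    return conn + open_cost * len(open_facs)

def single_swap_search(clients, facilities, open_cost, seed=0):
    rng = random.Random(seed)
    current = {rng.randrange(len(facilities))}      # start with one facility
    best = total_cost(current, clients, facilities, open_cost)
    improved = True
    while improved:
        improved = False
        candidates = []
        for f in range(len(facilities)):            # open one more facility
            if f not in current:
                candidates.append(current | {f})
        for f in list(current):
            if len(current) > 1:                    # close an open facility
                candidates.append(current - {f})
            for g in range(len(facilities)):        # swap open for closed
                if g not in current:
                    candidates.append((current - {f}) | {g})
        for cand in candidates:
            c = total_cost(cand, clients, facilities, open_cost)
            if c < best:                            # take first improving move
                current, best, improved = cand, c, True
                break
    return current, best

facilities = [0.0, 10.0]                 # candidate facility positions (1-D)
clients = [0.0, 1.0, 9.0, 10.0]
opened, cost = single_swap_search(clients, facilities, open_cost=2.0)
print(sorted(opened), cost)              # opens both facilities, cost 6.0
```

In the MPI setting, the inner loop over `candidates` is the natural unit to scatter across processes, with a reduction to find the globally best move.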
Abstract: Automatic methods of detecting changes through satellite imaging are the object of growing interest, especially because of numerous applications linked to analysis of the Earth's surface or the environment (monitoring vegetation, updating maps, risk management, etc.). This work implemented spatial analysis techniques using images with different spatial and spectral resolutions acquired on different dates. The work was based on the principle of control charts, setting the upper and lower limits beyond which a change is flagged. The a contrario approach was then applied, testing different thresholds at which the difference calculated between two pixels is significant. Finally, labeled images were considered, yielding a particularly low difference, which meant that the number of “false changes” could be estimated against a given limit.
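The control-chart thresholding described above can be sketched on a pixel-difference image as follows (a minimal illustration with simulated data; the actual satellite imagery and the a contrario step are not reproduced):

```python
import numpy as np

def change_mask(img_t1, img_t2, k=3.0):
    # Pixel-wise difference between the two acquisition dates.
    diff = img_t2.astype(float) - img_t1.astype(float)
    mu, sigma = diff.mean(), diff.std()
    # Control-chart limits: flag pixels falling outside mu +/- k*sigma.
    ucl, lcl = mu + k * sigma, mu - k * sigma
    return (diff > ucl) | (diff < lcl)

rng = np.random.default_rng(1)
before = rng.normal(100, 5, (64, 64))       # synthetic "date 1" image
after = before + rng.normal(0, 1, (64, 64)) # "date 2": sensor noise only...
after[10:14, 10:14] += 60                   # ...plus one genuine change
mask = change_mask(before, after)
print(mask[10:14, 10:14].all(), mask.sum())
```

Pixels whose difference stays inside the limits are treated as unchanged; only the injected 4x4 block exceeds the upper control limit here.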
Abstract: Many high-risk pathogens that cause disease in
humans are transmitted through various food items. Food-borne
disease constitutes a major public health problem. Assessment of the
quality and safety of foods is important in human health. Rapid and
easy detection of pathogenic organisms will facilitate precautionary
measures to maintain healthy food. The Polymerase Chain Reaction
(PCR) is a handy tool for rapid detection of low numbers of bacteria.
We have designed gene-specific primers for the most common food-borne pathogens, such as Staphylococci, Salmonella, and E. coli. Bacteria were isolated from food samples of various food outlets and identified using gene-specific PCRs. We identified Staphylococci, Salmonella, and E. coli O157 in various food samples using gene-specific primers with a rapid, direct PCR technique. This study helps build a complete picture of the various pathogens that threaten to cause and spread food-borne diseases, and it would also enable the establishment of a routine procedure and methodology for rapid identification of food-borne bacteria using direct PCR. The study will also enable us to judge the efficiency of present food safety steps taken by food manufacturers and exporters.
Abstract: The study identified the sources of production inefficiency of the farming sector in district Faisalabad in the Punjab province of Pakistan. The Data Envelopment Analysis (DEA) technique was applied to farm-level survey data from 300 farmers for the year 2009. The overall mean efficiency score was 0.78, indicating 22 percent inefficiency among the sample farmers. The computed efficiency scores were then regressed on farm-specific variables using Tobit regression analysis. Farming experience, education, access to farm credit, herd size, and number of cultivation practices showed a positive and significant effect on the farmers' technical efficiency.
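The input-oriented CCR formulation commonly used in DEA can be sketched as one linear program per farm (a minimal illustration on invented toy data, not the study's survey data; the Tobit second stage is omitted):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    # Input-oriented CCR DEA: minimise theta such that a non-negative
    # combination of peers uses at most theta * inputs of farm k while
    # producing at least its outputs. Variables: [theta, lambda_1..n].
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                     # minimise theta
    # X^T lambda <= theta * x_k  ->  -x_k*theta + X^T lambda <= 0
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])
    b_in = np.zeros(m)
    # Y^T lambda >= y_k          ->  -Y^T lambda <= -y_k
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

# Toy farm data: inputs (land, labour) and a single output (yield).
X = np.array([[2.0, 3.0], [4.0, 6.0], [3.0, 3.0]])
Y = np.array([[10.0], [10.0], [11.0]])
scores = [ccr_efficiency(X, Y, k) for k in range(3)]
print([round(s, 2) for s in scores])   # farm 1 is dominated by farm 0
```

Farm 1 uses exactly twice farm 0's inputs for the same output, so its efficiency score is 0.5, while the other two farms lie on the frontier with score 1.0.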
Abstract: A large number of engineering drawings are referred to during the planning process, and changes to them produce a good engineering design that meets the demand for a new model. The advantage of reusing engineering designs is that it allows continuous product development, further improving the quality of product development and thus reducing development costs. However, retrieving existing engineering drawings is time consuming, involves a complex process, and is prone to errors. An engineering drawing file searching system is proposed to solve this problem. It is essential for engineers and designers to have some medium that enables them to search for drawings in the most effective way. This paper lays out the proposed research project in the area of information extraction from engineering drawings.
Abstract: A recent neurospiking coding scheme for feature extraction from biosonar echoes of various plants is examined with a variety of stochastic classifiers. The derived feature vectors are employed in well-known stochastic classifiers, including nearest-neighborhood, single Gaussian, and a Gaussian mixture with EM optimization. The classifiers' performances are evaluated using cross-validation and bootstrapping techniques. It is shown that the various classifiers perform equivalently and that the modified preprocessing configuration yields considerably improved results.
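The single-Gaussian classifier mentioned above can be sketched as follows (a toy illustration with synthetic 2-D features, not the biosonar echo features of the study):

```python
import numpy as np

def fit_gaussian(X):
    # Per-class single-Gaussian model: mean vector and (regularised)
    # covariance matrix estimated from the training features.
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
    return mu, cov

def log_likelihood(x, mu, cov):
    d = x - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.solve(cov, d) + logdet
                   + len(x) * np.log(2 * np.pi))

def classify(x, models):
    # Maximum-likelihood decision over the class-conditional Gaussians.
    return max(models, key=lambda c: log_likelihood(x, *models[c]))

rng = np.random.default_rng(0)
# Toy 2-D "feature vectors" for two plant classes.
class_a = rng.normal([0, 0], 0.5, (50, 2))
class_b = rng.normal([3, 3], 0.5, (50, 2))
models = {"a": fit_gaussian(class_a), "b": fit_gaussian(class_b)}
print(classify(np.array([2.9, 3.1]), models))   # class "b"
```

The Gaussian-mixture variant would replace `fit_gaussian` with an EM loop over several components per class; the decision rule stays the same.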
Abstract: Although oil-based drilling fluids are of paramount practical and economic interest, they represent a serious source of pollution once released into the environment as drill cuttings. The aim of this study is to assess the capability of isolated microorganisms to degrade gasoil fuel. The commonly used physicochemical and biodegradation remediation techniques for petroleum-contaminated soil were both investigated. The study revealed that natural biodegradation is favorable. However, the presence of heavy metals, the moisture level (8.55%), and nutrient deficiencies put severe constraints on the microorganisms' survival ranges, inhibiting the biodegradation process. The selected strains were able to degrade the diesel fuel at significantly high rates (around 98%).
Abstract: Diffuse viral encephalitis may lack fever and other cardinal signs of infection, and hence its distinction from other acute encephalopathic illnesses is challenging. Often, the EEG changes seen routinely are nonspecific and reflect only diffuse encephalopathic changes. The aim of this study was to use nonlinear dynamic mathematical techniques for analyzing EEG data in order to look for any characteristic diagnostic patterns in diffuse forms of encephalitis. Encephalitis was diagnosed on clinical, imaging, and cerebrospinal fluid criteria in three young male patients. Metabolic and toxic encephalopathies were ruled out through appropriate investigations. Digital EEGs were recorded on the 3rd to 5th day of onset. The digital EEGs of 5 male and 5 female age-matched healthy volunteers served as controls. A two-sample t-test indicated no statistically significant difference in average amplitude between the two groups. However, the standard deviation (or variance) of the EEG signals at FP1-F7 and FP2-F8 was significantly higher for the patients than for the normal subjects. The regularisation dimension was significantly lower for the patients (average between 1.24 and 1.43) than for the normal persons (average between 1.41 and 1.63) for the EEG signals from all locations except the Fz-Cz signal. Similarly, the wavelet dimension was significantly lower (P = 0.05) for the patients (1.122) than for the normal persons (1.458). The patients' EEGs were subdued, with uniform patterns manifested in the values of the regularisation and wavelet dimensions, indicating a decrease in chaotic nature compared to the normal persons.
Abstract: Terminal localization for indoor Wireless Local Area
Networks (WLANs) is critical for the deployment of location-aware
computing inside buildings. A major challenge is obtaining high
localization accuracy in the presence of fluctuations of the received
signal strength (RSS) measurements caused by multipath fading. This paper
focuses on reducing the effect of the distance-varying noise by spatial
filtering of the measured RSS. Two different survey point geometries
are tested with the noise reduction technique: survey points arranged
in sets of clusters and survey points uniformly distributed over the
network area. The results show that the location accuracy improves
by 16% when the filter is used and by 18% when the filter is applied
to a clustered survey set as opposed to a straight-line survey set.
The estimated locations are within 2 m of the true location, which
indicates that clustering the survey points provides better localization
accuracy due to superior noise removal.
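The spatial filtering of measured RSS can be sketched as a simple neighborhood average over survey points (an illustrative filter on a simulated log-distance channel, not the paper's exact method or data):

```python
import numpy as np

def spatial_filter(rss, positions, radius=1.0):
    # Replace each survey point's RSS by the mean over all survey points
    # within `radius`, suppressing multipath-fading fluctuations.
    filtered = np.empty_like(rss)
    for i, p in enumerate(positions):
        d = np.linalg.norm(positions - p, axis=1)
        filtered[i] = rss[d <= radius].mean()
    return filtered

rng = np.random.default_rng(0)
xs = np.arange(0, 25, 0.5)                        # survey points along a line
positions = np.column_stack([xs, np.zeros_like(xs)])
true_rss = -40 - 20 * np.log10(1 + xs)            # log-distance path loss
noisy = true_rss + rng.normal(0, 3, xs.size)      # fading noise, std 3 dB
smoothed = spatial_filter(noisy, positions)
err_raw = np.abs(noisy - true_rss).mean()
err_smooth = np.abs(smoothed - true_rss).mean()
print(err_smooth < err_raw)
```

Clustered survey geometries help precisely because more neighbors fall inside the averaging radius, improving the noise suppression this filter provides.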
Abstract: Accurate loss minimization is a critical component of efficient power flow in electrical distribution. This work presents loss minimization in a power distribution system through feeder restructuring, incorporation of distributed generation (DG), and capacitor placement. The study was conducted on an IEEE distribution network and an Indian Electricity Board benchmark distribution system. Based on the experimental results for the Indian system, the approach is recommended to the board for practical implementation to obtain a regulated, stable output.
Abstract: This paper describes the optimization of a complex
dairy farm simulation model using two quite different methods of
optimization, the Genetic algorithm (GA) and the Lipschitz
Branch-and-Bound (LBB) algorithm. These techniques have been
used to improve an agricultural system model developed by Dexcel
Limited, New Zealand, which describes a detailed representation of
pastoral dairying scenarios and contains an 8-dimensional parameter
space. The model incorporates the sub-models of pasture growth and
animal metabolism, which are themselves complex in many cases.
Each evaluation of the objective function, a composite 'Farm
Performance Index (FPI)', requires simulation of at least a one-year
period of farm operation with a daily time-step, and is therefore
computationally expensive. The problem of visualization of the
objective function (response surface) in high-dimensional spaces is
also considered in the context of the farm optimization problem.
Adaptations of the Sammon mapping and parallel-coordinates
visualization are described, which help visualize some important
properties of the model's output topography. From this study, it is
found that GA requires fewer function evaluations in optimization
than the LBB algorithm.
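A GA of the kind used here can be sketched as follows (a minimal real-coded GA on a cheap stand-in objective; the actual Farm Performance Index simulation and the LBB algorithm are not reproduced):

```python
import random

def genetic_search(fitness, bounds, pop_size=30, gens=80, seed=1):
    # Minimal real-coded GA: elitist selection, blend crossover,
    # Gaussian mutation. `fitness` is maximised.
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        elite = scored[: pop_size // 5]            # keep the best 20 %
        children = list(elite)                     # elites survive unchanged
        while len(children) < pop_size:
            a, b = rng.sample(elite, 2)            # parents from the elite
            child = [(x + y) / 2 + rng.gauss(0, 0.05 * (hi - lo))
                     for x, y, (lo, hi) in zip(a, b, bounds)]
            child = [min(max(v, lo), hi) for v, (lo, hi) in zip(child, bounds)]
            children.append(child)
        pop = children
    return max(pop, key=fitness)

# Stand-in for the (expensive) Farm Performance Index: a smooth
# 8-dimensional objective with a known optimum at 0.7 on every axis.
def fpi(x):
    return -sum((v - 0.7) ** 2 for v in x)

best = genetic_search(fpi, [(0.0, 1.0)] * 8)
print(fpi(best) > -0.05)   # close to the known optimum
```

In the paper's setting each `fitness` call is a year-long daily-time-step farm simulation, which is exactly why the number of function evaluations, rather than per-generation overhead, dominates the comparison with LBB.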
Abstract: This paper presents an application of 5S lean technology to a production facility. Due to increased demand, high product variety, and a push production system, the plant has suffered from excessive waste, unorganized workstations, and an unhealthy work environment. This has translated into increased production cost, frequent delays, and low worker morale. Under such conditions, it has become difficult, if not impossible, to implement effective continuous improvement studies. Hence, the lean project is aimed at diagnosing the production process, streamlining the workflow, removing or reducing process waste, cleaning the production environment, improving plant layout, and organizing workstations. 5S lean technology is utilized to achieve the project objectives. The work was a combination of culture change and tangible, physical changes on the shop floor. The project has drastically changed the plant and developed the infrastructure for a successful implementation of continuous improvement as well as other best practices and quality initiatives.
Abstract: Tread design has evolved over the years into the common tread patterns used on current vehicles. To meet safety and comfort requirements, however, tread design must consider more than one design factor. It must account for grip and drainage, and for ways to reduce rolling noise, which is one of the main factors considered by manufacturers. The main objective of this study was to apply the computational fluid dynamics (CFD) technique to simulate the contact surface between the tire and the ground. The results demonstrated an air-pumping effect and a large pressure drop at the contact surface. They also revealed that the computed pressure can be used to analyze the sound pressure level (SPL).
Abstract: This paper illustrates the use of a combined neural
network model for classification of electrocardiogram (ECG) beats.
We present a trainable neural network ensemble approach to develop
customized electrocardiogram beat classifier in an effort to further
improve the performance of ECG processing and to offer
individualized health care.
We present a three-stage technique for detecting premature
ventricular contractions (PVCs) among normal beats and other heart
diseases. The method comprises denoising, feature extraction, and
classification stages. First, we investigate the application of the
stationary wavelet transform (SWT) for noise reduction of the
electrocardiogram (ECG) signals. The feature extraction module then
extracts 10 ECG morphological features and one timing-interval
feature. Finally, a number of multilayer perceptron (MLP) neural
networks with different topologies are designed.
The performance of the different combination methods as well as
the efficiency of the whole system is presented. Among them,
Stacked Generalization as a proposed trainable combined neural
network model possesses the highest recognition rate of around 95%.
Therefore, this network proves to be a suitable candidate in ECG
signal diagnosis systems. ECG samples attributing to the different
ECG beat types were extracted from the MIT-BIH arrhythmia
database for the study.
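The stacked-generalization idea can be sketched as follows (logistic regression and a nearest-mean rule stand in for the MLP base networks, and the data are synthetic rather than MIT-BIH beats):

```python
import numpy as np

def train_logreg(X, y, lr=0.5, epochs=500):
    # Plain logistic regression via gradient descent (stand-in for an MLP).
    w = np.zeros(X.shape[1] + 1)
    Xb = np.hstack([X, np.ones((len(X), 1))])
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return lambda Z: 1 / (1 + np.exp(-np.hstack([Z, np.ones((len(Z), 1))]) @ w))

def nearest_mean(X, y):
    # Nearest-class-mean rule, a second (weaker) base learner.
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    return lambda Z: (np.linalg.norm(Z - m0, axis=1)
                      > np.linalg.norm(Z - m1, axis=1)).astype(float)

def stack(X, y, base_trainers, k=5):
    # Stacked generalization: out-of-fold base predictions become the
    # meta-learner's training features.
    folds = np.array_split(np.arange(len(y)), k)
    meta_X = np.zeros((len(y), len(base_trainers)))
    for fold in folds:
        mask = np.ones(len(y), bool)
        mask[fold] = False
        for j, trainer in enumerate(base_trainers):
            model = trainer(X[mask], y[mask])
            meta_X[fold, j] = model(X[fold])
    bases = [trainer(X, y) for trainer in base_trainers]
    meta = train_logreg(meta_X, y)
    def predict(Z):
        feats = np.column_stack([m(Z) for m in bases])
        return (meta(feats) > 0.5).astype(int)
    return predict

rng = np.random.default_rng(0)
X = np.vstack([rng.normal([-1, -1], 1.0, (100, 2)),
               rng.normal([1, 1], 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
clf = stack(X, y, [train_logreg, nearest_mean])
acc = (clf(X) == y).mean()
print(acc > 0.85)
```

Training the combiner on out-of-fold predictions, rather than on the base learners' training-set outputs, is what prevents the meta-level from simply memorizing base-learner overfitting.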
Abstract: This paper describes a complex energy signal model
that is isomorphic with digital human fingerprint images. By using
signal models, the problem of fingerprint matching is transformed
into the signal processing problem of finding a correlation between
two complex signals that differ by phase-rotation and time-scaling. A
technique for minutiae matching that is independent of image
translation, rotation and linear-scaling, and is resistant to missing
minutiae is proposed. The method was tested using random data
points. The results show that for matching prints the scaling and
rotation angles are closely estimated and a stronger match will have a
higher correlation.
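The phase-rotation and scaling estimation described above can be sketched for known point correspondences as follows (a toy illustration; real minutiae matching must also solve the pairing and missing-minutiae problems):

```python
import numpy as np

def to_signal(points):
    # Minutiae as complex numbers, translation removed via the centroid.
    z = np.array([complex(x, y) for x, y in points])
    return z - z.mean()

def match(a_pts, b_pts):
    # Least-squares complex factor s = r*exp(i*phi) aligning signal a to b;
    # |s| is the linear scale, angle(s) the rotation. The normalised
    # correlation magnitude serves as the match score.
    a, b = to_signal(a_pts), to_signal(b_pts)
    s = np.vdot(a, b) / np.vdot(a, a)          # sum(conj(a)*b) / sum(|a|^2)
    score = abs(np.vdot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))
    return abs(s), np.angle(s), score

template = [(0, 0), (4, 1), (2, 5), (6, 6), (1, 7)]
theta, scale = np.pi / 6, 1.2
rot = scale * np.exp(1j * theta)
probe = [((rot * complex(x, y)).real + 3, (rot * complex(x, y)).imag - 2)
         for x, y in template]                 # rotated, scaled, shifted copy
est_scale, est_theta, score = match(template, probe)
print(round(est_scale, 3), round(est_theta, 3), round(score, 3))
```

Because the probe is an exact transformed copy, the estimator recovers the scale (1.2) and rotation (pi/6) exactly and the correlation score is 1.0; a genuine but noisy match would yield a lower, though still high, score.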
Abstract: Image-based Rendering (IBR) techniques have recently reached broad fields, which poses a critical challenge: building an IBR-driven visualization platform that meets the requirements of high performance, aggregation and concentration of widely distributed visualization resources, deployment by multiple operators, and CSCW design. This paper presents a unique IBR-based visualization dataflow model that reflects the specific characteristics of IBR techniques, discusses the prominent features of an IBR-driven distributed collaborative visualization (DCV) system, and finally proposes a novel prototype. The prototype provides three well-defined modules, namely the Central Visualization Server, Local Proxy Server, and Visualization Aid Environment, through which data and control for collaboration flow according to the preceding dataflow model. With the aid of this three-level architecture, constructing IBR-oriented applications becomes easy. The augmented collaboration strategy employed not only achieves convenient synchronous control by multiple users and stable processing management, but is also extendable and scalable.
Abstract: The clustering ensembles combine multiple partitions
generated by different clustering algorithms into a single clustering
solution. Clustering ensembles have emerged as a prominent method
for improving robustness, stability and accuracy of unsupervised
classification solutions. So far, many contributions have been made toward finding a consensus clustering. One of the major problems in clustering ensembles is the consensus function. In this paper, we first introduce clustering ensembles, the representation of multiple partitions, their challenges, and a taxonomy of combination algorithms. Second, we describe consensus functions in clustering ensembles, including hypergraph partitioning, the voting approach, mutual information, co-association-based functions, and the finite mixture model, and explain their advantages, disadvantages, and computational complexity. Finally, we compare the characteristics of clustering ensemble algorithms from previous work, such as computational complexity, robustness, simplicity, and accuracy on different datasets.
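The co-association consensus function surveyed above can be sketched as follows (a minimal version that thresholds the co-association matrix and takes connected components; many variants exist):

```python
import numpy as np

def co_association(partitions):
    # Co-association matrix: fraction of base partitions in which each
    # pair of objects falls in the same cluster.
    n = len(partitions[0])
    M = np.zeros((n, n))
    for labels in partitions:
        labels = np.asarray(labels)
        M += (labels[:, None] == labels[None, :]).astype(float)
    return M / len(partitions)

def consensus(partitions, threshold=0.5):
    # Objects co-clustered in more than `threshold` of the partitions
    # end up in the same connected component of the threshold graph.
    M = co_association(partitions) > threshold
    n = len(M)
    labels = [-1] * n
    cur = 0
    for i in range(n):
        if labels[i] == -1:
            stack = [i]
            while stack:              # flood fill over the threshold graph
                j = stack.pop()
                if labels[j] == -1:
                    labels[j] = cur
                    stack.extend(k for k in range(n) if M[j, k])
            cur += 1
    return labels

partitions = [
    [0, 0, 0, 1, 1, 1],   # three base clusterings; cluster IDs are
    [0, 0, 1, 1, 1, 1],   # arbitrary and need not agree across rows
    [1, 1, 1, 0, 0, 0],
]
print(consensus(partitions))   # [0, 0, 0, 1, 1, 1]
```

Note that the co-association matrix is label-invariant, which is why it sidesteps the cluster-correspondence problem that voting-based consensus functions must solve explicitly.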
Abstract: Orthogonal Frequency Division Multiplexing
(OFDM) is an efficient method of data transmission for high speed
communication systems. However, the main drawback of OFDM systems is that they suffer from a high Peak-to-Average Power Ratio (PAPR), which causes inefficient use of the high-power amplifier and can limit transmission efficiency. An OFDM signal consists of a large number of independent subcarriers, and as a result its amplitude can exhibit high peak values. In this paper,
we propose an effective reduction scheme that combines DCT and
SLM techniques. The scheme is composed of the DCT followed by
the SLM using the Riemann matrix to obtain phase sequences for the
SLM technique. The simulation results show PAPR can be greatly
reduced by applying the proposed scheme. Whereas conventional OFDM exhibited a high PAPR of about 10.4 dB, the proposed method achieved a reduction of about 4.7 dB with low computational complexity. This approach also avoids
randomness in phase sequence selection, which makes it simpler to
decode at the receiver. As an added benefit, the matrices can be
generated at the receiver end to obtain the data signal and hence it is
not required to transmit side information (SI).
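The SLM stage of the scheme can be sketched as follows (random phase sequences stand in for the Riemann-matrix sequences, and the DCT precoding step is omitted):

```python
import numpy as np

def papr_db(x):
    # Peak-to-average power ratio of a discrete-time signal, in dB.
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def ofdm_time(symbols, oversample=4):
    # OFDM modulation: oversampled IFFT with zero padding in mid-spectrum.
    n = len(symbols)
    spec = np.concatenate([symbols[: n // 2],
                           np.zeros(n * (oversample - 1)),
                           symbols[n // 2:]])
    return np.fft.ifft(spec)

def slm_select(symbols, phase_sequences, oversample=4):
    # SLM: modulate each phase-rotated copy of the symbol vector and
    # keep the candidate with the lowest PAPR.
    best = None
    for phases in phase_sequences:
        x = ofdm_time(symbols * phases, oversample)
        if best is None or papr_db(x) < papr_db(best):
            best = x
    return best

rng = np.random.default_rng(0)
n = 64
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], n)   # QPSK symbols
candidates = [np.exp(2j * np.pi * rng.random(n)) for _ in range(8)]
plain = ofdm_time(symbols)
reduced = slm_select(symbols, [np.ones(n)] + candidates)
print(papr_db(reduced) <= papr_db(plain))
```

Because the all-ones sequence is included among the candidates, the selected signal can never have a higher PAPR than plain OFDM; deriving the candidate phases from a deterministic matrix, as the paper does, is what removes the need to transmit side information.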