Artificial Neural Network based Web Application Firewall for SQL Injection

In recent years, with the rapid development of the Internet and the Web, more and more web applications have been deployed in fields and organizations such as finance, the military, and government. At the same time, attackers have found ever more subtle ways to attack web applications. According to international statistics, SQL Injection is one of the most common web application vulnerabilities. The consequences of this type of attack can be severe: sensitive information may be stolen and authentication systems may be bypassed. Several techniques have been adopted to mitigate the situation. In this research, a security solution using an Artificial Neural Network is proposed to protect web applications against this type of attack. The solution has been tested on sample datasets and has given promising results. It has also been implemented in a prototype web application firewall called ANNbWAF.
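As a rough illustration of the kind of detector the abstract describes (the paper's own network architecture and features are not given here), the following minimal sketch trains a small feed-forward neural network on character n-gram features of query strings; the sample requests and all parameter choices are illustrative assumptions, not the ANNbWAF configuration.

# Illustrative sketch only: a small feed-forward network that flags SQL injection
# in query strings. Feature choices, samples and hyperparameters are assumptions,
# not the configuration used in ANNbWAF.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Tiny hand-made training set (0 = benign, 1 = SQL injection attempt).
requests = [
    "id=42&sort=name",
    "user=alice&page=3",
    "q=cheap+flights",
    "id=1' OR '1'='1",
    "name=x'; DROP TABLE users;--",
    "id=1 UNION SELECT username, password FROM users",
]
labels = [0, 0, 0, 1, 1, 1]

model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(1, 3)),  # character n-grams
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
)
model.fit(requests, labels)

print(model.predict(["id=7&sort=date", "id=0 UNION SELECT * FROM accounts"]))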

Comparison of Parametric and Nonparametric Techniques for Non-peak Traffic Forecasting

Accurately predicting non-peak traffic is crucial for any daily traffic forecasting model. In this paper, least squares support vector machines (LS-SVMs) are investigated to solve this practical problem. To our knowledge, this is the first time the approach has been applied and its forecasting performance analyzed in this domain. For comparison purposes, two parametric and two non-parametric techniques are selected because their effectiveness has been demonstrated in past research. Owing to their good generalization ability and guarantee of a global minimum, LS-SVMs perform better than the other techniques. The clear improvements in stability and robustness show that the approach is practically promising.
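For readers unfamiliar with LS-SVMs, the sketch below solves the standard LS-SVM regression linear system with an RBF kernel on a toy series; the kernel width, regularization value and synthetic data are assumptions, not the paper's settings.

# Minimal LS-SVM regression sketch (RBF kernel), solving the standard
# LS-SVM dual linear system; toy data and hyperparameters are illustrative only.
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    # Block system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy "daily traffic" series: predict flow at t from flow at t-1.
flow = np.array([310., 295., 280., 300., 330., 360., 340., 320., 305., 290.])
X, y = flow[:-1, None], flow[1:]
b, alpha = lssvm_fit(X, y)
print(lssvm_predict(X, b, alpha, np.array([[315.0]])))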

Solving Facility Location Problem on Cluster Computing

Computing the facility location problem for every location in a country simultaneously is not easy. This paper describes how the problem can be solved using cluster computing. The technique is a parallel algorithm based on local search with single-swap moves. The parallel implementation uses portable parallel programming with the Message Passing Interface (MPI) on a Microsoft Windows Compute Cluster. The paper presents the local search algorithm with the single-swap method and its MPI implementation on the cluster for deciding which facilities to open. For large datasets, the process of calculating a reasonable cost for a facility becomes time consuming. The results show that the parallel computation of the facility location problem on the cluster achieves good speedup and scales well as the problem size increases.
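The serial core of the approach, local search with single-swap moves, can be sketched as follows; the cost model (distance to the nearest open facility), the toy coordinates and the fixed number of open facilities are assumptions, and the MPI layer that distributes candidate evaluations across cluster nodes is not shown.

# Illustrative single-swap local search for a facility location instance:
# keep p facilities open, and repeatedly swap one open facility for a closed
# one whenever that lowers total client-to-nearest-facility cost.
# Toy data and the cost model are assumptions, not the paper's instance.
import itertools
import random

def total_cost(clients, open_facilities, dist):
    return sum(min(dist(c, f) for f in open_facilities) for c in clients)

def single_swap_search(clients, facilities, p, dist):
    open_set = set(random.sample(facilities, p))
    best = total_cost(clients, open_set, dist)
    improved = True
    while improved:
        improved = False
        closed = [f for f in facilities if f not in open_set]
        for out_f, in_f in itertools.product(list(open_set), closed):
            candidate = (open_set - {out_f}) | {in_f}
            cost = total_cost(clients, candidate, dist)
            if cost < best:                 # accept the first improving swap
                open_set, best, improved = candidate, cost, True
                break
    return open_set, best

if __name__ == "__main__":
    random.seed(0)
    clients = [(random.random(), random.random()) for _ in range(200)]
    facilities = [(random.random(), random.random()) for _ in range(30)]
    euclid = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    opened, cost = single_swap_search(clients, facilities, p=5, dist=euclid)
    print(len(opened), round(cost, 3))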

Effects of Road Disturbance on Plant Biodiversity

Urbanization and related anthropogenic modifications cause extensive habitat fragmentation and directly lead to declines in local biodiversity. Conservation biologists advocate corridor creation as one approach to rescuing biodiversity. Here we examine the utility of roads as corridors in preserving plant diversity by investigating roadside vegetation in the Yellow River Delta (YRD), China. We examined the spatio-temporal distribution patterns of plant species richness, diversity and composition along roadsides. The results suggest that roads, as dispersal conduits, increase the probability that new settlers become established in an area; meanwhile, roads accumulate greater propagule pressure and provide favourable survival conditions during the operation phase. As a result, more species, including native and alien plants, non-halophyte and halophyte species, and threatened and cosmopolitan species, were found thriving at roadsides. Roadsides may therefore serve as a refuge for more species, and the pattern of vegetation distribution is affected by road age and the distance from the road verge.

Over-Height Vehicle Detection in Low Headroom Roads Using Digital Video Processing

In this paper we present a new method for over-height vehicle detection in low headroom streets and highways using digital video processing. Its accuracy, its lower price compared with existing detectors such as laser radars, and its ability to provide extra information such as speed and height measurements make this method more reliable and efficient. In this algorithm, features are selected and tracked using the KLT algorithm. A blob extraction algorithm is also applied, using background estimation and subtraction. Then the world coordinates of the features that lie inside the blobs are estimated using a novel calibration method. Once the heights of the features are calculated, we apply a threshold to select over-height features and eliminate the others. The over-height features are segmented using association criteria and grouped using an undirected graph. They are then tracked through sequential frames. The resulting groups correspond to over-height vehicles in the scene.
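The feature-selection, KLT-tracking and blob-extraction steps can be illustrated with standard OpenCV calls as below; the video path and parameter values are placeholders, and the calibration that maps tracked features to world coordinates and the height thresholding are specific to the paper and not reproduced here.

# Sketch of the tracking front end only: Shi-Tomasi corners tracked with
# pyramidal Lucas-Kanade (KLT) plus a background-subtraction foreground mask.
# "video.mp4" and all parameters are placeholders; calibration and height
# thresholding described in the paper are omitted.
import cv2

cap = cv2.VideoCapture("video.mp4")                    # placeholder input path
backsub = cv2.createBackgroundSubtractorMOG2()

ok, prev = cap.read()
if not ok:
    raise SystemExit("could not read input video")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300,
                                 qualityLevel=0.01, minDistance=7)

while points is not None and len(points) > 0:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    fg_mask = backsub.apply(frame)                     # foreground blobs (moving vehicles)

    new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    tracked = new_points[status.flatten() == 1].reshape(-1, 2)

    # Keep only tracked features that fall inside foreground blobs; these are the
    # candidates for the calibration and height-threshold stages (not shown).
    h, w = fg_mask.shape
    candidates = [p for p in tracked
                  if 0 <= int(p[1]) < h and 0 <= int(p[0]) < w
                  and fg_mask[int(p[1]), int(p[0])] > 0]

    prev_gray, points = gray, tracked.reshape(-1, 1, 2)

cap.release()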

On the Need to have an Additional Methodology for the Psychological Product Measurement and Evaluation

Cognitive science appeared about 40 years ago, in response to the challenge of artificial intelligence, as common territory for several scientific disciplines: IT, mathematics, psychology, neurology, philosophy, sociology, and linguistics. The newborn science was justified, on the one hand, by the complexity of the problems related to human knowledge and, on the other, by the fact that none of the above-mentioned sciences could explain mental phenomena on its own. Based on data supplied by experimental sciences such as psychology and neurology, cognitive science builds models of how the human mind operates. These models are implemented in computer programs and/or electronic circuits (specific to artificial intelligence), that is, cognitive systems, whose competences and performances are compared with human ones, leading to the reinterpretation of psychological and neurological data and to the construction of new models. In these processes, psychology provides the experimental basis, while philosophy and mathematics provide the level of abstraction necessary for the interplay of the sciences mentioned. The general problematic of the cognitive approach comprises two important directions: the computational one, which starts from the idea that mental phenomena can be reduced to binary (0 and 1) computational operations, and the connectionist one, which considers the products of thinking to be the result of the interaction among all the component (included) systems. In psychology, measurements in the computational register use classical questionnaires and psychometric tests, generally based on computational methods. Considering both sides of cognitive science, we notice a gap in the possibilities for measuring psychological products from the connectionist perspective, which requires a unitary understanding of the quality-quantity whole. In such an approach, measurement by calculation proves inefficient. Our research, carried out over more than 20 years, leads to the conclusion that measurement by forms properly fits the laws and principles of connectionism.

Autonomous Robots' Visual Perception in Underground Terrains Using Statistical Region Merging

Robots' visual perception is a field that is gaining increasing attention from researchers. This is partly due to emerging trends in the commercial availability of 3D scanning systems and devices that produce highly accurate information for a variety of applications. In the history of mining, the mortality rate of mine workers has been alarming, and robots show great potential for tackling safety issues in mines. However, an effective vision system is crucial for safe autonomous navigation in underground terrains. This work investigates robots' perception in underground terrains (mines and tunnels) using the statistical region merging (SRM) model. SRM reconstructs the main structural components of an image through a simple but effective statistical analysis. An investigation is conducted on different regions of the mine, such as the shaft, stope and gallery, using publicly available mine frames together with a stream of locally captured mine images. An investigation is also conducted on a stream of underground tunnel image frames, using the Xbox Kinect 3D sensor. The Kinect sensor produces streams of red, green and blue (RGB) and depth images at 640 x 480 resolution and 30 frames per second. Integrating the depth information into the drivability analysis gives a strong cue, producing 3D results that augment the drivable and non-drivable regions detected in 2D. The results of the 2D and 3D experiments with different terrains, mines and tunnels, together with the qualitative and quantitative evaluation, reveal that good drivable regions can be detected in dynamic underground terrains.
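A rough sketch of SRM-style region merging on a greyscale image is given below; the simplified merging predicate and the Q and confidence values are commonly used approximations and assumptions, not necessarily the exact settings of the paper, and the depth integration and drivability labelling are not shown.

# Rough sketch of SRM-style region merging: adjacent pixel pairs are processed
# in order of intensity difference and merged when a simplified SRM predicate
# holds. Predicate form, Q and delta are assumptions/simplifications.
import numpy as np

def srm_segment(img, Q=32.0, g=256.0):
    h, w = img.shape
    n = h * w
    parent = np.arange(n)
    size = np.ones(n)
    mean = img.astype(float).ravel().copy()
    delta = 1.0 / (6.0 * n ** 2)          # assumed confidence parameter

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def b(region):                         # merging threshold for one region
        return g * np.sqrt(np.log(2.0 / delta) / (2.0 * Q * size[region]))

    # 4-connected neighbour pairs, sorted by intensity difference.
    pairs = []
    for y in range(h):
        for x in range(w):
            i = y * w + x
            if x + 1 < w:
                pairs.append((abs(float(img[y, x]) - float(img[y, x + 1])), i, i + 1))
            if y + 1 < h:
                pairs.append((abs(float(img[y, x]) - float(img[y + 1, x])), i, i + w))
    pairs.sort()

    for _, i, j in pairs:
        ri, rj = find(i), find(j)
        if ri != rj and (mean[ri] - mean[rj]) ** 2 <= b(ri) ** 2 + b(rj) ** 2:
            # merge rj into ri, updating region size and mean intensity
            mean[ri] = (mean[ri] * size[ri] + mean[rj] * size[rj]) / (size[ri] + size[rj])
            size[ri] += size[rj]
            parent[rj] = ri

    return np.array([find(i) for i in range(n)]).reshape(h, w)

labels = srm_segment(np.random.randint(0, 255, (40, 40)))
print(len(np.unique(labels)), "regions")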

AJcFgraph - AspectJ Control Flow Graph Builder for Aspect-Oriented Software

The ever-growing use of the aspect-oriented development methodology in software engineering requires tool support for both research environments and industry. So far, tool support has been proposed for many activities in aspect-oriented software development, to automate and facilitate development. For instance, AJaTS provides a transformation system to support aspect-oriented development and refactoring. In particular, it is well established that the abstract interpretation of programs in any paradigm, as pursued in static analysis, is best served by a high-level program representation such as the Control Flow Graph (CFG). With such a representation, the analysis can more easily locate common programming idioms for which helpful transformations are already known, and the association between the input program and the intermediate representation can be maintained more closely. However, although current research defines, to some extent, sound concepts and foundations for control flow analysis of aspect-oriented programs, it does not provide a concrete tool dedicated to constructing the CFG of these programs. Furthermore, most of these works focus on other issues in Aspect-Oriented Software Development (AOSD), such as testing or data flow analysis, rather than on the CFG itself. Therefore, this study is dedicated to building an aspect-oriented control flow graph construction tool called AJcFgraph Builder. The tool can be applied to many software engineering tasks in the context of AOSD, such as software testing and software metrics.
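To make the idea of an aspect-aware CFG concrete, the sketch below (purely illustrative, in Python rather than the AspectJ/Java setting of AJcFgraph, and not the tool's actual design) shows a minimal graph structure in which a "before" advice node is spliced in ahead of a join-point node; all node names are hypothetical.

# Purely illustrative data-structure sketch (not AJcFgraph itself, which targets
# AspectJ/Java): a control flow graph whose nodes carry a kind tag so that
# "before" advice can be spliced in ahead of a join-point node.
from collections import defaultdict

class CFG:
    def __init__(self):
        self.succ = defaultdict(list)   # node -> list of successor nodes
        self.kind = {}                  # node -> "stmt" | "call" | "advice"

    def add_node(self, node, kind="stmt"):
        self.kind[node] = kind

    def add_edge(self, src, dst):
        self.succ[src].append(dst)

    def weave_before(self, join_point, advice):
        """Redirect every edge into `join_point` through a new advice node."""
        self.add_node(advice, kind="advice")
        for src in list(self.succ):
            self.succ[src] = [advice if d == join_point else d for d in self.succ[src]]
        self.add_edge(advice, join_point)

cfg = CFG()
for n in ["entry", "call:Account.debit", "exit"]:
    cfg.add_node(n, "call" if n.startswith("call:") else "stmt")
cfg.add_edge("entry", "call:Account.debit")
cfg.add_edge("call:Account.debit", "exit")
cfg.weave_before("call:Account.debit", "advice:before-logging")
print(dict(cfg.succ))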

A New Dimension in Software Risk Management

A dynamic risk management framework for software projects is presented. Currently available software risk management frameworks and risk assessment models are static in nature and lack feedback capability. Such risk management frameworks are not capable of assessing future changes in risk events. A dynamic risk management framework for software projects is therefore needed that provides a forward-looking assessment of risk events.

An Ensemble of Weighted Support Vector Machines for Ordinal Regression

Instead of traditional (nominal) classification, we investigate the subject of ordinal classification, or ranking. An enhanced method based on an ensemble of Support Vector Machines (SVMs) is proposed. Each binary classifier is trained with specific weights for each object in the training data set. Experiments on benchmark datasets and synthetic data indicate that the performance of our approach is comparable to state-of-the-art kernel methods for ordinal regression. The ensemble method, which is straightforward to implement, provides a very good sensitivity-specificity trade-off for the highest and lowest ranks.
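The general ensemble construction can be sketched as follows: one binary SVM per rank threshold ("is the rank greater than k?"), each trained with per-object weights, and the binary outputs recombined into rank probabilities. The toy data and the particular weighting rule below are illustrative assumptions, not the weighting scheme of the paper.

# Illustrative ensemble for ordinal classification with weighted binary SVMs.
import numpy as np
from sklearn.svm import SVC

def fit_ordinal_ensemble(X, y, ranks, weight_fn):
    models = {}
    for k in ranks[:-1]:                       # thresholds between consecutive ranks
        target = (y > k).astype(int)
        weights = weight_fn(y, k)              # per-object training weights
        clf = SVC(kernel="rbf", probability=True, random_state=0)
        clf.fit(X, target, sample_weight=weights)
        models[k] = clf
    return models

def predict_ranks(models, X, ranks):
    # P(y > k) for each threshold; consecutive differences give P(y = r).
    p_gt = {k: m.predict_proba(X)[:, 1] for k, m in models.items()}
    probs = []
    for r in ranks:
        lo = p_gt[r - 1] if r - 1 in p_gt else np.ones(len(X))
        hi = p_gt[r] if r in p_gt else np.zeros(len(X))
        probs.append(np.clip(lo - hi, 0.0, None))
    return np.array(ranks)[np.argmax(np.vstack(probs), axis=0)]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.digitize(X[:, 0] + 0.3 * rng.normal(size=200), [-0.7, 0.0, 0.7]) + 1  # ranks 1..4
ranks = [1, 2, 3, 4]
# Example weighting: emphasise objects whose rank is adjacent to the threshold.
weight_fn = lambda y, k: 1.0 + (np.abs(y - (k + 0.5)) < 1.0)
models = fit_ordinal_ensemble(X, y, ranks, weight_fn)
print(predict_ranks(models, X[:5], ranks))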

Grid Based and Random Based Ant Colony Algorithms for Automatic Hose Routing in 3D Space

Ant colony algorithms have been applied to difficult combinatorial optimization problems such as the travelling salesman problem and the quadratic assignment problem. In this paper, grid-based and random-based ant colony algorithms are proposed for automatic 3D hose routing, and their pros and cons are discussed. The algorithms use a tessellated format for the obstacles and the generated hoses in order to detect collisions. Representing obstacles and hoses in the tessellated format greatly helps the algorithms handle free-form objects and speeds up computation. The performance of the algorithms has been tested on a number of 3D models.
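A minimal grid-based ant colony sketch for routing between two points in a 3D voxel space is shown below; the tessellated collision checks and hose bending constraints of the paper are not modelled, obstacles are simply blocked cells, and all parameters are illustrative.

# Minimal grid-based ACO sketch: ants walk from START to GOAL on a 3D grid,
# choosing neighbours by pheromone and a distance heuristic; shorter paths
# deposit more pheromone. Parameters and the slab obstacle are assumptions.
import random

SIZE = 10
START, GOAL = (0, 0, 0), (9, 9, 9)
blocked = {(x, y, 4) for x in range(3, 8) for y in range(3, 8)}   # a slab obstacle
pheromone = {}

def neighbours(c):
    x, y, z = c
    for dx, dy, dz in [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]:
        n = (x + dx, y + dy, z + dz)
        if all(0 <= v < SIZE for v in n) and n not in blocked:
            yield n

def heuristic(c):
    return 1.0 / (1 + sum(abs(a - b) for a, b in zip(c, GOAL)))

def run_ant(alpha=1.0, beta=2.0, max_steps=400):
    path, visited, cur = [START], {START}, START
    for _ in range(max_steps):
        if cur == GOAL:
            return path
        options = [n for n in neighbours(cur) if n not in visited]
        if not options:
            return None                                 # dead end, abandon ant
        weights = [(pheromone.get((cur, n), 1.0) ** alpha) * (heuristic(n) ** beta)
                   for n in options]
        cur = random.choices(options, weights=weights)[0]
        path.append(cur)
        visited.add(cur)
    return None

best = None
for it in range(200):                                   # colony iterations
    for key in list(pheromone):
        pheromone[key] *= 0.95                          # evaporation
    for _ in range(10):                                 # ants per iteration
        path = run_ant()
        if path:
            deposit = 100.0 / len(path)
            for a, b in zip(path, path[1:]):
                pheromone[(a, b)] = pheromone.get((a, b), 1.0) + deposit
            if best is None or len(path) < len(best):
                best = path
print("best route length:", len(best) if best else None)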

Empowering Communication-Challenged Users Using Development Kits

The rapid pace of technological advancement and the consequent widening of the digital divide have resulted in the marginalization of disabled people, especially the communication-challenged. The dearth of suitable technologies for developing assistive solutions has served to further marginalize the communication-challenged user population and widen this chasm even further. Given the varying levels of disability, customized solutions are required. This paper explains the use of Software Development Kits (SDKs) to bridge this communication divide, employing popular industry communication SDKs to identify the requirements of communication-challenged users as well as appropriate frameworks for future development initiatives.

Comparison of Conventional and “ECO” Transportation Pavements in Cyprus Using a Life Cycle Approach

The road industry has taken up the challenge of eco-construction, and pavements may fit within the framework of sustainable development. Hence, this research assesses the environmental impacts of conventional pavements using a life cycle approach. To meet global, and often national, targets on pollution control, newly introduced pavement designs are under study. This is the case for the Cyprus demonstration, which took place within the EcoLanes project. The alternative pavement differs in its concrete layer, which is reinforced with a tire recycling product. Processing post-consumer tires produces steel fibres that improve resistance to cracking, so maintenance works are limited in comparison with flexible pavement. According to the outputs of the current study, this makes the design more eco-friendly. More specifically, the life cycle processes of the proposed concrete pavement emit 15% less air pollutants and consume 28% less embodied energy than those of the asphalt pavement. In addition, costs are reduced by 0.06%.

Multiscale Analysis and Change Detection Based on a Contrario Approach

Automatic methods of detecting changes through satellite imaging are the object of growing interest, especially because of numerous applications linked to analysis of the Earth's surface or the environment (monitoring vegetation, updating maps, risk management, etc.). This work implemented spatial analysis techniques using images with different spatial and spectral resolutions acquired on different dates. The work was based on the principle of control charts in order to set the upper and lower limits beyond which a change would be noted. The a contrario approach was then used, by testing different thresholds at which the difference calculated between two pixels is significant. Finally, labeled images were considered, giving a particularly low difference, which meant that the number of “false changes” could be estimated according to a given limit.
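The a contrario decision can be illustrated with a number-of-false-alarms computation under a simple background noise model, as sketched below; the Gaussian null model, sigma and epsilon are assumptions, not the paper's exact formulation.

# Sketch of an a contrario decision for per-pixel change detection: a difference
# is kept only if its expected number of false alarms (NFA) under a background
# noise model stays below epsilon. Noise model, sigma and epsilon are assumptions.
import numpy as np
from scipy.stats import norm

def significant_changes(img_t1, img_t2, sigma=10.0, epsilon=1.0):
    diff = img_t2.astype(float) - img_t1.astype(float)
    n_tests = diff.size                       # one test per pixel
    # Probability of a difference at least this large under H0 (no change).
    p_values = 2.0 * norm.sf(np.abs(diff), scale=sigma)
    nfa = n_tests * p_values                  # expected number of false alarms
    return nfa < epsilon                      # "epsilon-meaningful" changes

rng = np.random.default_rng(0)
before = rng.normal(120, 10, size=(100, 100))
after = before + rng.normal(0, 10, size=(100, 100))
after[40:60, 40:60] += 60                     # simulated real change
mask = significant_changes(before, after)
print(mask.sum(), "pixels flagged as meaningful changes")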

Improvement of Synchronous Machine Dynamic Characteristics via Neural Network Based Controllers

This paper presents a simulation and experimental study aimed at investigating the effectiveness of an adaptive artificial neural network stabilizer in enhancing the damping torque of a synchronous generator. For this purpose, a power system comprising a synchronous generator feeding a large power system through a short tie line is considered. The proposed adaptive neuro-control system consists of two multi-layered feed-forward neural networks, which work as a plant model identifier and a controller. It generates supplementary control signals to be utilized by conventional controllers. The details of the interfacing circuits, sensors and transducers, which have been designed and built for use in the tests, are presented. The synchronous generator is tested to investigate the effect of tuning a Power System Stabilizer (PSS) on its dynamic stability. The obtained simulation and experimental results verify the basic theoretical concepts.

PCR-Based Detection of Food-Borne Pathogens

Many high-risk pathogens that cause disease in humans are transmitted through various food items. Food-borne disease constitutes a major public health problem. Assessment of the quality and safety of foods is important for human health. Rapid and easy detection of pathogenic organisms facilitates precautionary measures to maintain healthy food. The Polymerase Chain Reaction (PCR) is a handy tool for rapid detection of low numbers of bacteria. We have designed gene-specific primers for the most common food-borne pathogens, such as Staphylococci, Salmonella and E. coli. Bacteria were isolated from food samples from various food outlets and identified using gene-specific PCR. We identified Staphylococci, Salmonella and E. coli O157 in various food samples using gene-specific primers with a rapid, direct PCR technique. This study helps provide a complete picture of the various pathogens that threaten to cause and spread food-borne diseases, and it would also enable the establishment of a routine procedure and methodology for rapid identification of food-borne bacteria using direct PCR. The study will also enable us to judge the efficiency of the food safety steps currently taken by food manufacturers and exporters.

Geometric Modeling of Illumination on the TFT-LCD Panel Using a Bezier Surface

In this paper, we propose a geometric model of the illumination on a patterned image containing etched transistors. The image is captured by a commercial camera during the inspection of a TFT-LCD panel. Defect inspection is an important process in LCD panel production, but regional differences in brightness, which have a negative effect on inspection, arise from the uneven illumination environment. In order to solve this problem, we present a geometric illumination model consisting of interpolation using the least squares method and 3D modeling using a Bezier surface. By using a sampling method, our computation time is shorter than that of previous methods. Moreover, the model can be used to correct brightness in every patterned image.
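The core idea, fitting a Bezier surface to sampled brightness by linear least squares and using it to even out illumination, can be sketched as follows; the bicubic degree, sample spacing and synthetic panel image are assumptions, not the paper's settings.

# Sketch: sample image brightness on a sparse grid, fit 4x4 Bezier control
# points by linear least squares (the surface is linear in its control points),
# then divide the image by the fitted surface to correct uneven illumination.
import numpy as np
from math import comb

def bernstein(n, i, t):
    return comb(n, i) * (t ** i) * ((1 - t) ** (n - i))

def design_row(u, v, n=3, m=3):
    return np.array([bernstein(n, i, u) * bernstein(m, j, v)
                     for i in range(n + 1) for j in range(m + 1)])

def fit_illumination(img, step=16):
    h, w = img.shape
    rows, targets = [], []
    for y in range(0, h, step):               # sparse sampling for speed
        for x in range(0, w, step):
            rows.append(design_row(x / (w - 1), y / (h - 1)))
            targets.append(img[y, x])
    ctrl, *_ = np.linalg.lstsq(np.array(rows), np.array(targets, float), rcond=None)
    # Evaluate the fitted surface over the full image grid.
    uu, vv = np.meshgrid(np.arange(w) / (w - 1), np.arange(h) / (h - 1))
    surface = np.zeros_like(uu)
    for idx, (i, j) in enumerate([(i, j) for i in range(4) for j in range(4)]):
        surface += ctrl[idx] * bernstein(3, i, uu) * bernstein(3, j, vv)
    return surface

# Synthetic panel image: a repeating pattern under brightness drifting left to right.
h, w = 128, 160
yy, xx = np.mgrid[0:h, 0:w]
pattern = 100 + 20 * ((xx // 8 + yy // 8) % 2)
img = pattern * (0.6 + 0.4 * (xx / w))
flat = img / fit_illumination(img)            # brightness-corrected image
print("left/right brightness ratio before:",
      round(img[:, :20].mean() / img[:, -20:].mean(), 3),
      "after:", round(flat[:, :20].mean() / flat[:, -20:].mean(), 3))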

An Exploratory Environment for Concurrency Control Algorithms

Designing, implementing, and debugging concurrency control algorithms in a real system is a complex, tedious, and error-prone process. Further, understanding concurrency control algorithms and distributed computations is itself a difficult task. Visualization can help with both of these problems. Thus, we have developed an exploratory environment in which people can prototype and test various versions of concurrency control algorithms, study and debug distributed computations, and view performance statistics of distributed systems. In this paper, we describe the exploratory environment and show how it can be used to explore concurrency control algorithms for the interactive steering of distributed computations.

Pathogen Removal Under the Influence of Iron

Drinking water is one of the most valuable resources available to mankind, and the presence of pathogens in drinking water is highly undesirable. Because of the lateritic soil, iron concentrations in the ground water were high. High concentrations of iron and other trace elements can restrict bacterial growth and modify bacterial metabolic patterns. The bacterial growth rate was reduced in the presence of iron in the water. This paper presents the results of a controlled laboratory study conducted to assess the inhibition of micro-organisms (pathogens) in well waters in the presence of dissolved iron. Synthetic samples were studied in the laboratory and the results compared with field samples. A predictive model for microbial inhibition in the presence of iron is presented. The results for bore wells, open wells and field samples varied, probably due to the nature of the micro-organisms utilizing the iron in the well waters.

Clustering Mixed Data Using Non-normal Regression Tree for Process Monitoring

In the semiconductor manufacturing process, large amounts of data are collected from various sensors in multiple facilities. The collected sensor data have several different characteristics due to variables such as product types, preceding processes and recipes. In general, Statistical Quality Control (SQC) methods assume normality of the data to detect out-of-control states of processes. Because the collected data have different characteristics, using them directly as SQC inputs increases the variation of the data, requires wide control limits, and decreases the ability to detect out-of-control states. Therefore, it is necessary to separate similar data groups from the mixed data for more accurate process control. In this paper, we propose a regression tree with a splitting algorithm based on the Pearson distribution system to handle non-normal distributions in a parametric manner. The regression tree finds groups of data with similar properties across different variables. Experiments using real semiconductor manufacturing process data show improved fault detection performance.
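The paper's Pearson-system split algorithm is not reproduced here; the sketch below only illustrates the downstream idea: once mixed sensor data have been separated into similar groups, a non-normal (here Pearson type III) distribution can be fitted per group and group-specific control limits derived, instead of one wide set of limits over the mixture. The data and quantiles are assumptions.

# Illustrative only: group-wise SQC with a skewed parametric fit per group,
# contrasted with global limits over the mixture. Not the paper's split method.
import numpy as np
from scipy import stats

# Mixed data from two hypothetical "recipes" with different skewed distributions.
groups = {
    "recipe_A": stats.pearson3.rvs(skew=1.0, loc=10.0, scale=1.0, size=500, random_state=1),
    "recipe_B": stats.pearson3.rvs(skew=0.5, loc=14.0, scale=2.0, size=500, random_state=2),
}

mixture = np.concatenate(list(groups.values()))
print("global limits :", np.quantile(mixture, [0.00135, 0.99865]).round(2))

for name, data in groups.items():
    skew, loc, scale = stats.pearson3.fit(data)        # per-group parametric fit
    lcl, ucl = stats.pearson3.ppf([0.00135, 0.99865], skew, loc=loc, scale=scale)
    print(f"{name} limits:", round(lcl, 2), round(ucl, 2))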