Artificial Neural Network based Web Application Firewall for SQL Injection

In recent years, with the rapid development of the Internet and the Web, more and more web applications have been deployed in many fields and organizations such as finance, the military, and government. At the same time, hackers have found ever more subtle ways to attack web applications. According to international statistics, SQL Injection is one of the most common vulnerabilities in web applications. The consequences of this type of attack are quite serious: sensitive information can be stolen or authentication systems can be bypassed. To mitigate the situation, several techniques have been adopted. In this research, a security solution based on an Artificial Neural Network is proposed to protect web applications against this type of attack. The solution has been evaluated on sample datasets and has given promising results. It has also been implemented in a prototype web application firewall called ANNbWAF.
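
As a rough illustration of the kind of classifier such a firewall can build on, the sketch below trains a small feed-forward network on hand-crafted lexical features of request parameters. The feature set, the extract_features helper and the network size are illustrative assumptions, not the design of ANNbWAF.

```python
# Illustrative sketch only (not the paper's ANNbWAF implementation): a small
# feed-forward network that flags request parameter values as SQL injection.
import numpy as np
from sklearn.neural_network import MLPClassifier

SQL_TOKENS = ("select", "union", "insert", "drop", " or ", "--", ";", "'")

def extract_features(param: str) -> list:
    """Hypothetical lexical features: length plus token/metacharacter counts."""
    lowered = param.lower()
    return [len(param)] + [lowered.count(t) for t in SQL_TOKENS]

# Toy training data: benign values versus classic injection payloads.
samples = ["alice", "id=42", "' OR '1'='1", "1; DROP TABLE users--"]
labels = [0, 0, 1, 1]
X = np.array([extract_features(s) for s in samples])

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, labels)

print(clf.predict([extract_features("' UNION SELECT password FROM users--")]))
```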

Comparison of Parametric and Nonparametric Techniques for Non-peak Traffic Forecasting

Accurately predicting non-peak traffic is crucial for any model that forecasts daily traffic. In this paper, least squares support vector machines (LS-SVMs) are investigated to solve this practical problem. To the best of our knowledge, this is the first time the approach has been applied and its forecasting performance analyzed in this domain. For comparison purposes, two parametric and two non-parametric techniques are selected because their effectiveness has been demonstrated in past research. With good generalization ability and guaranteed global minima, LS-SVMs perform better than the other techniques. The considerable improvement in stability and robustness shows that the approach is practically promising.
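
A minimal sketch of the LS-SVM idea referred to above: instead of a quadratic program, the dual variables come from a single linear system. The RBF kernel, the toy lag-2 traffic series and the hyperparameters are assumptions for illustration, not the paper's configuration.

```python
# Minimal LS-SVM regression sketch (illustrative, not the paper's exact setup):
# the dual solution is obtained from one linear system instead of a QP.
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    # Block system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # b, alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy non-peak traffic series: predict flow at time t from the two previous hours.
flow = np.array([120., 90., 70., 60., 65., 80., 150., 300.])
X = np.column_stack([flow[:-2], flow[1:-1]])
y = flow[2:]
b, alpha = lssvm_fit(X, y)
print(lssvm_predict(X, b, alpha, np.array([[65., 80.]])))
```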

Solving Facility Location Problem on Cluster Computing

Computing the facility location problem for every location in a country simultaneously is not easy. This paper describes how to solve the problem using cluster computing. The technique is a parallel algorithm based on local search with a single-swap method, designed to solve the problem on clusters. The parallel implementation uses portable parallel programming with the Message Passing Interface (MPI) on a Microsoft Windows Compute Cluster. The paper presents the local search algorithm with the single-swap method and the MPI implementation of the system that decides which facilities to open. When large datasets are considered, the process of calculating a reasonable cost for a facility becomes time consuming. The results show that the parallel computation of the facility location problem on a cluster achieves good speedups and scales well as the problem size increases.
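
The local search with the single-swap method can be sketched as follows. This serial version is only illustrative; the MPI parallelization described in the paper (e.g. scattering candidate swaps across nodes) is indicated only in the comments, and the cost data and helper names are made up for the example.

```python
# Illustrative single-swap local search for facility location (serial sketch).
# The paper evaluates candidate swaps in parallel with MPI; an MPI version
# would scatter the candidate swaps across nodes (e.g. with mpi4py) and
# reduce the best-cost solution.
import itertools
import numpy as np

def total_cost(open_set, open_cost, assign_cost):
    """Facility opening cost plus cheapest assignment of every client."""
    idx = sorted(open_set)
    return open_cost[idx].sum() + assign_cost[:, idx].min(axis=1).sum()

def single_swap_local_search(open_cost, assign_cost, k):
    n_fac = len(open_cost)
    current = set(range(k))                      # arbitrary initial solution
    best = total_cost(current, open_cost, assign_cost)
    improved = True
    while improved:
        improved = False
        for out_f, in_f in itertools.product(current, set(range(n_fac)) - current):
            candidate = (current - {out_f}) | {in_f}
            cost = total_cost(candidate, open_cost, assign_cost)
            if cost < best:
                current, best, improved = candidate, cost, True
                break
    return current, best

open_cost = np.array([4., 3., 4., 5.])
assign_cost = np.array([[1., 3., 5., 4.],        # clients x facilities
                        [4., 1., 2., 6.],
                        [5., 4., 1., 2.]])
print(single_swap_local_search(open_cost, assign_cost, k=2))
```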

Effects of Road Disturbance on Plant Biodiversity

Urbanization and related anthropogenic modifications cause extensive habitat fragmentation and directly lead to a decline in local biodiversity. Conservation biologists advocate corridor creation as one approach to rescuing biodiversity. Here we examine the utility of roads as corridors in preserving plant diversity by investigating roadside vegetation in the Yellow River Delta (YRD), China. We examined the spatio-temporal distribution pattern of plant species richness, diversity and composition along roadsides. The results suggest that roads, as dispersal conduits, increase the probability that new settlers establish in an area; meanwhile, roads accumulate greater propagule pressure and provide favourable survival conditions during the operation phase. As a result, more species, including native and alien plants, non-halophyte and halophyte species, and threatened and cosmopolitan species, were found thriving at roadsides. Roadsides may therefore be a refuge for more species, and the pattern of vegetation distribution is affected by road age and the distance from the road verge.

Over-Height Vehicle Detection in Low Headroom Roads Using Digital Video Processing

In this paper we present a new method for over-height vehicle detection in low-headroom streets and highways using digital video processing. Its accuracy, its lower price compared with existing detectors such as laser radars, and its capability of providing extra information such as speed and height measurements make this method more reliable and efficient. In this algorithm, features are selected and tracked using the KLT algorithm. A blob extraction algorithm is also applied, using background estimation and subtraction. Then the world coordinates of the features that lie inside the blobs are estimated using a novel calibration method. Once the heights of the features are calculated, we apply a threshold to select over-height features and eliminate the others. The over-height features are segmented using association criteria, grouped using an undirected graph, and then tracked through sequential frames. The obtained groups correspond to over-height vehicles in the scene.
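
A minimal OpenCV sketch of the two low-level building blocks mentioned above, feature tracking with KLT and blob extraction by background subtraction; the calibration, height estimation and grouping stages of the actual method are not reproduced, and the video file name and area threshold are placeholders.

```python
# Sketch of the two low-level building blocks described above, using OpenCV:
# KLT feature tracking and blob extraction via background subtraction.
import cv2

cap = cv2.VideoCapture("road.avi")          # placeholder video file
subtractor = cv2.createBackgroundSubtractorMOG2()

ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
features = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                   qualityLevel=0.01, minDistance=7)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # KLT: track the previously selected features into the current frame.
    features, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, features, None)

    # Blob extraction: foreground mask from background estimation/subtraction.
    fg_mask = subtractor.apply(frame)
    contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blobs = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]

    # Features falling inside blobs would next be mapped to world coordinates
    # and thresholded on height to isolate over-height vehicles.
    prev_gray = gray
```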

On the Need to Have an Additional Methodology for Psychological Product Measurement and Evaluation

Cognitive Science appeared about 40 years ago, following the challenge of Artificial Intelligence, as common territory for several scientific disciplines such as IT, mathematics, psychology, neurology, philosophy, sociology, and linguistics. The newborn science was justified, on the one hand, by the complexity of the problems related to human knowledge and, on the other, by the fact that none of the above-mentioned sciences could explain mental phenomena alone. Based on the data supplied by experimental sciences such as psychology or neurology, models of how the human mind operates are built in cognitive science. These models are implemented in computer programs and/or electronic circuits (specific to artificial intelligence) – cognitive systems – whose competences and performances are compared to human ones, leading to the reinterpretation of psychological and neurological data and to the construction of new models. In these processes, psychology provides the experimental basis, while philosophy and mathematics provide the level of abstraction absolutely necessary for the mediation between the mentioned sciences. The general problematic of the cognitive approach comprises two important types of approach: the computational one, starting from the idea that mental phenomena can be reduced to calculus operations on 1s and 0s, and the connectionist one, which considers the products of thinking to be the result of the interaction between all the component (included) systems. In psychology, measurements in the computational register use classical questionnaires and psychometric tests, generally based on calculation methods. Considering both sides that represent cognitive science, we can notice a gap in the possibilities for measuring psychological products from the connectionist perspective, which requires a unitary understanding of the quality-quantity whole. In such an approach, measurement by calculation proves inefficient. Our research, carried out for more than 20 years, leads to the conclusion that measuring by forms properly fits the laws and principles of connectionism.

Autonomous Robots' Visual Perception in Underground Terrains Using Statistical Region Merging

Robots' visual perception is a field that is gaining increasing attention from researchers. This is partly due to emerging trends in the commercial availability of 3D scanning systems or devices that produce highly accurate information for a variety of applications. In the history of mining, the mortality rate of mine workers has been alarming, and robots show great potential to tackle safety issues in mines. However, an effective vision system is crucial to safe autonomous navigation in underground terrains. This work investigates robots' perception in underground terrains (mines and tunnels) using the statistical region merging (SRM) model. SRM reconstructs the main structural components of an image by a simple but effective statistical analysis. An investigation is conducted on different regions of the mine, such as the shaft, stope and gallery, using publicly available mine frames together with a stream of locally captured mine images. An investigation is also conducted on a stream of underground tunnel image frames, using the XBOX Kinect 3D sensor. The Kinect sensor produces streams of red, green and blue (RGB) and depth images at 640 x 480 resolution and 30 frames per second. Integrating the depth information into the drivability analysis gives a strong cue, yielding 3D results that augment the drivable and non-drivable regions detected in 2D. The results of the 2D and 3D experiments on different terrains, mines and tunnels, together with the qualitative and quantitative evaluation, reveal that a good drivable region can be detected in dynamic underground terrains.
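
As an illustration of how depth can augment a 2D drivability result, the sketch below rejects pixels whose Kinect depth jumps abruptly relative to the row below them; this heuristic and its threshold are assumptions made for demonstration and do not reproduce the paper's SRM-based pipeline.

```python
# Illustrative sketch of augmenting a 2D drivable-region mask with Kinect
# depth (not the paper's SRM implementation): pixels whose depth rises
# smoothly from the bottom of the frame are kept as drivable floor.
import numpy as np

def augment_with_depth(drivable_mask_2d, depth, max_jump_mm=150):
    """drivable_mask_2d: HxW bool mask from the 2D segmentation stage.
    depth: HxW depth image in millimetres (640x480 for the Kinect).
    Rejects pixels whose depth jumps abruptly w.r.t. the row below,
    i.e. likely obstacles rather than floor."""
    ok = np.ones_like(drivable_mask_2d, dtype=bool)
    # Compare each row with the row below it (the row nearer to the robot).
    jump = np.abs(depth[:-1, :].astype(int) - depth[1:, :].astype(int))
    ok[:-1, :] = jump < max_jump_mm
    return drivable_mask_2d & ok & (depth > 0)   # depth 0 = no Kinect reading

# Toy example: 480x640 frames as produced by the Kinect sensor.
depth = np.full((480, 640), 2000, dtype=np.uint16)
mask2d = np.zeros((480, 640), dtype=bool)
mask2d[240:, :] = True                           # lower half flagged drivable in 2D
print(augment_with_depth(mask2d, depth).sum())
```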

AJcFgraph - AspectJ Control Flow Graph Builder for Aspect-Oriented Software

The ever-growing use of the aspect-oriented development methodology in software engineering requires tool support for both research environments and industry. So far, tool support for many activities in aspect-oriented software development has been proposed to automate and facilitate development. For instance, AJaTS provides a transformation system to support aspect-oriented development and refactoring. In particular, it is well established that the abstract interpretation of programs, in any paradigm, as pursued in static analysis, is best served by a high-level program representation such as the Control Flow Graph (CFG). Such analysis can more easily locate common programmatic idioms for which helpful transformations are already known, and the association between the input program and the intermediate representation can be maintained more closely. However, although current research defines, to some extent, sound concepts and foundations for the control flow analysis of aspect-oriented programs, it does not provide a concrete tool that can, on its own, construct the CFG of such programs. Furthermore, most of these works focus on other issues in Aspect-Oriented Software Development (AOSD), such as testing or data flow analysis, rather than on the CFG itself. Therefore, this study is dedicated to building an aspect-oriented control flow graph construction tool called AJcFgraph Builder. The tool can be applied in many software engineering tasks in the context of AOSD, such as software testing, software metrics, and so forth.

Numerical Investigation of the Thermal Separation in a Vortex Tube

This work has been carried out in order to provide an understanding of the physical behaviour of the flow and of the variation of pressure and temperature in a vortex tube. A computational fluid dynamics model is used to predict the flow fields and the associated temperature separation within a Ranque–Hilsch vortex tube. The CFD model is a steady axisymmetric model (with swirl) that uses the standard k-ε turbulence model. Second-order numerical schemes were used to carry out all the computations. A vortex tube with a circumferential inlet stream, an axial (cold) outlet stream and a circumferential (hot) outlet stream was considered. Performance curves (temperature separation versus cold outlet mass fraction) were obtained for a specific vortex tube with a given inlet mass flow rate. Simulations were carried out for varying cold outlet mass flow rates. The model results show good agreement with experimental data.
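
For reference, the transport equations of the standard k-ε model employed in such a simulation are usually written as follows (standard textbook form with the usual constants; the exact formulation in the CFD code used by the authors may differ in detail):

\[
\frac{\partial(\rho k)}{\partial t} + \nabla\cdot(\rho k\,\mathbf{u})
  = \nabla\cdot\!\left[\left(\mu+\frac{\mu_t}{\sigma_k}\right)\nabla k\right] + G_k - \rho\varepsilon,
\]
\[
\frac{\partial(\rho\varepsilon)}{\partial t} + \nabla\cdot(\rho\varepsilon\,\mathbf{u})
  = \nabla\cdot\!\left[\left(\mu+\frac{\mu_t}{\sigma_\varepsilon}\right)\nabla\varepsilon\right]
  + C_{1\varepsilon}\,\frac{\varepsilon}{k}\,G_k - C_{2\varepsilon}\,\rho\,\frac{\varepsilon^2}{k},
  \qquad \mu_t = \rho\, C_\mu \frac{k^2}{\varepsilon},
\]

with $C_\mu = 0.09$, $C_{1\varepsilon} = 1.44$, $C_{2\varepsilon} = 1.92$, $\sigma_k = 1.0$, $\sigma_\varepsilon = 1.3$, and $G_k$ the production of turbulence kinetic energy.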

Empowering Communication-Challenged Users Using Development Kits

The rapid pace of technological advancement and the consequent widening of the digital divide have resulted in the marginalization of the disabled, especially the communication-challenged. The dearth of suitable technologies for the development of assistive solutions has served to marginalize the communication-challenged user population further and widen this chasm. Disability levels vary widely, and each level brings its own requirement for customized solutions. This paper explains the use of Software Development Kits (SDKs) to bridge this communication divide, using industry-popular communication SDKs to identify the requirements of communication-challenged users as well as appropriate frameworks for future development initiatives.

Multiscale Analysis and Change Detection Based on a Contrario Approach

Automatic methods of detecting changes through satellite imaging are the object of growing interest, especially because of the numerous applications linked to analysis of the Earth's surface or the environment (monitoring vegetation, updating maps, risk management, etc.). This work implemented spatial analysis techniques using images with different spatial and spectral resolutions acquired on different dates. The work was based on the principle of control charts in order to set the upper and lower limits beyond which a change is declared. The a contrario approach was then used, by testing different thresholds at which the difference calculated between two pixels is considered significant. Finally, labelled images were considered, giving a particularly low difference, which meant that the number of "false changes" could be estimated with respect to a given limit.
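
For orientation, a generic a contrario decision of this kind can be written as a bound on the expected number of false alarms (NFA); the sketch below uses the usual binomial-tail form, which may differ from the exact formulation adopted in this work.

```python
# Generic a contrario significance test (illustrative; the paper's exact
# formulation may differ): a candidate change region is accepted only if its
# expected number of false alarms (NFA) under a naive "no change" model is
# below a user-chosen limit epsilon.
from scipy.stats import binom

def nfa(n_pixels, k_changed, p_change, n_tests):
    """NFA = (number of tested regions) * P[Binomial(n, p) >= k]."""
    return n_tests * binom.sf(k_changed - 1, n_pixels, p_change)

def is_meaningful(n_pixels, k_changed, p_change, n_tests, epsilon=1.0):
    return nfa(n_pixels, k_changed, p_change, n_tests) <= epsilon

# Example: a 100-pixel region where 40 pixels exceed the difference threshold,
# while under the naive model a pixel exceeds it with probability 0.1.
print(nfa(100, 40, 0.1, n_tests=10_000))
print(is_meaningful(100, 40, 0.1, n_tests=10_000))
```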

Improvement of Synchronous Machine Dynamic Characteristics via Neural Network Based Controllers

This paper presents a simulation and experimental study aimed at investigating the effectiveness of an adaptive artificial neural network stabilizer in enhancing the damping torque of a synchronous generator. For this purpose, a power system comprising a synchronous generator feeding a large power system through a short tie line is considered. The proposed adaptive neuro-control system consists of two multi-layered feed-forward neural networks, which work as a plant model identifier and a controller. It generates supplementary control signals to be utilized by conventional controllers. The details of the interfacing circuits, sensors and transducers, which have been designed and built for use in the tests, are presented. The synchronous generator is tested to investigate the effect of tuning a Power System Stabilizer (PSS) on its dynamic stability. The obtained simulation and experimental results verify the basic theoretical concepts.

Role of Credit on Production Efficiency of the Farming Sector in Pakistan (A Data Envelopment Analysis)

The study identified the sources of production inefficiency of the farming sector in district Faisalabad in the Punjab province of Pakistan. The Data Envelopment Analysis (DEA) technique was applied to farm-level survey data from 300 farmers for the year 2009. The overall mean efficiency score was 0.78, indicating 22 percent inefficiency among the sample farmers. The computed efficiency scores were then regressed on farm-specific variables using Tobit regression analysis. Farming experience, education, access to farming credit, herd size and the number of cultivation practices showed a positive and significant effect on the farmers' technical efficiency.
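
For context, the efficiency score of each farm in such an analysis is typically obtained from the input-oriented envelopment linear program of the CCR model (standard DEA formulation; the study may use a variable-returns-to-scale variant). For farm $0$ with inputs $x_{i0}$ and outputs $y_{r0}$:

\[
\theta^{*} \;=\; \min_{\theta,\;\lambda}\ \theta
\quad\text{s.t.}\quad
\sum_{j=1}^{300}\lambda_j x_{ij} \;\le\; \theta\, x_{i0}\ \ \forall i,
\qquad
\sum_{j=1}^{300}\lambda_j y_{rj} \;\ge\; y_{r0}\ \ \forall r,
\qquad
\lambda_j \ge 0 .
\]

A score of $\theta^{*}=0.78$ thus means the average farm could proportionally reduce its inputs by about 22 percent while maintaining its outputs.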

Clustering Mixed Data Using Non-normal Regression Tree for Process Monitoring

In the semiconductor manufacturing process, large amounts of data are collected from the various sensors of multiple facilities. The data collected from the sensors have several different characteristics due to variables such as product types, preceding processes and recipes. In general, Statistical Quality Control (SQC) methods assume normality of the data in order to detect out-of-control states of processes. If data with different characteristics are used directly as SQC inputs, the variation of the data increases, wide control limits are required, and the ability to detect out-of-control states decreases. Therefore, it is necessary to separate similar data groups from the mixed data for more accurate process control. In this paper, we propose a regression tree with a split algorithm based on the Pearson distribution to handle non-normal distributions in a parametric way. The regression tree finds similar properties of the data across different variables. Experiments using real semiconductor manufacturing process data show improved fault detection performance.
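
One way such a distribution-based split criterion could look is sketched below: each candidate binary split is scored by the gain in log-likelihood when a Pearson-family distribution (here Pearson type III from SciPy) is fitted separately to the two children. This is an illustrative assumption, not the paper's actual split algorithm.

```python
# Illustrative distribution-based split score (not the paper's algorithm):
# a binary split is scored by how much the fitted log-likelihood of the two
# children improves on that of the parent node.
import numpy as np
from scipy.stats import pearson3

def loglik(sample):
    skew, loc, scale = pearson3.fit(sample)
    return pearson3.logpdf(sample, skew, loc=loc, scale=scale).sum()

def split_gain(y, left_mask):
    parent = loglik(y)
    return loglik(y[left_mask]) + loglik(y[~left_mask]) - parent

# Toy sensor readings coming from two recipes with different, skewed behaviour.
rng = np.random.default_rng(0)
y = np.concatenate([rng.gamma(2.0, 1.0, 200), rng.gamma(9.0, 0.5, 200)])
recipe_is_a = np.arange(400) < 200
print(split_gain(y, recipe_is_a))          # large positive gain -> good split
```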

Simulation and 40 Years of Object-Oriented Programming

2007 is a jubilee year: in 1967, the programming language SIMULA 67 was presented, containing all aspects of what was later called object-oriented programming. The present paper describes the development towards object-oriented programming, the role of simulation in this development, and other tools that appeared in SIMULA 67 and that are nowadays called super-object-oriented programming.

Dose due to the Incorporation of Radionuclides, Using Teeth as Bioindicators, near the Caetité Uranium Mines

Uranium mining and processing in Brazil take place in a northeastern area near Caetité, Bahia. Several Non-Governmental Organizations claim that uranium mining in this region is a source of pollution and poses health risks to the local population, but those in charge of the extraction complex and the production of "yellow cake" for nuclear power plant fuel reject these allegations. This study aimed at identifying potential problems caused by mining for the population of Caetité. In this work, the concentrations of the 238U, 232Th and 40K radioisotopes in the teeth of the Caetité population were determined by ICP-MS, teeth being used as bioindicators of incorporated radionuclides. Cumulative radiation doses in the skeleton were also determined. The concentration values were below 0.008 ppm, and the annual effective doses due to the radioisotopes are below the reference values. Therefore, it is not possible to state that the mining process in Caetité increases pollution or radiation exposure in a meaningful way.

Heat and Mass Transfer in a Solar Dryer with Biomass Backup Burner

The majority of pepper farmers in Malaysia use the open-sun method for drying pepper berries. This method is time consuming and exposes the berries to rain and contamination. A maintenance-friendly and properly enclosed dryer is therefore desired. A dryer design with a solar collector and a chimney was studied and adapted to suit the needs of small-scale pepper farmers in Malaysia. The dryer provides an environment with an optimum operating temperature for drying pepper berries. The dryer model was evaluated using commercially available computational fluid dynamics (CFD) software in order to understand the heat and mass transfer inside the dryer. Natural convection was the only mode of heat transport considered in this study, in accordance with the idea of a simple and maintenance-friendly design. To compensate for the low buoyancy found in natural convection dryers, a biomass burner was integrated into the solar dryer design.

A Model of Market Segmentation for the Customers of Mellat Bank in Iran

If an organization like Mellat Bank wants to identify its customer market completely in order to reach its specified goals, it can segment the market so as to offer the right product package to the right segment. Our objective is to offer a segmentation model for the Iranian banking market from Mellat Bank's point of view. The methodology of this project combines "segmentation on the basis of four part-quality variables" and "segmentation on the basis of difference in means". The required data were gathered from e-systems and the researcher's personal observation. Finally, the research recommends that the organization first form a four-dimensional matrix with 756 segments, using the four variables value-based, behavioral, activity style and activity level, and then calculate the mean profit for every cell of the matrix at two distinct work levels (α1: normal conditions and α2: high-pressure conditions) and compare the segments by checking two conditions: (1) homogeneity of every segment with its sub-segments and (2) heterogeneity with the other segments; in this way the necessary segmentation process can be carried out. The final recommendation (further explained by an operational example and a feedback algorithm) is to keep testing and updating the model because of the dynamic environment, technology, and banking system.

Data Mining Classification Methods Applied in Drug Design

Data mining incorporates a group of statistical methods used to analyze a set of information, or a data set. It operates with models and algorithms, which are powerful tools with great potential. They can help people to understand the patterns in a certain chunk of information, so it is obvious that data mining tools have a wide range of applications. For example, in theoretical chemistry, data mining tools can be used to predict molecular properties or to improve computer-assisted drug design. Classification analysis is one of the major data mining methodologies. The aim of this contribution is to create a classification model that is able to deal with a huge data set with high accuracy. For this purpose, logistic regression, Bayesian logistic regression and random forest models were built using the R software; a Bayesian logistic regression model was also created in the Latent GOLD software. These classification methods belong to the supervised learning methods. It was necessary to reduce the dimension of the data matrix before constructing the models, and thus factor analysis (FA) was used. The models were applied to predict the biological activity of molecules that are potential new drug candidates.
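
A compact sketch of the described workflow, using scikit-learn as a stand-in for the R and Latent GOLD models of the paper: factor analysis to reduce the descriptor matrix, followed by logistic regression and a random forest. The data set here is synthetic and the dimensions are placeholders.

```python
# Sketch of the described workflow with scikit-learn (a stand-in for the
# R / Latent GOLD models of the paper): factor analysis for dimension
# reduction, then logistic regression and random forest classifiers.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 200))            # 500 molecules x 200 descriptors
y = (X[:, :3].sum(axis=1) > 0).astype(int) # toy "active / inactive" label

models = {
    "logistic regression": make_pipeline(FactorAnalysis(n_components=20),
                                         LogisticRegression(max_iter=1000)),
    "random forest": make_pipeline(FactorAnalysis(n_components=20),
                                   RandomForestClassifier(n_estimators=200,
                                                          random_state=0)),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```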

A New Self-Adaptive EP Approach for ANN Weights Training

Evolutionary Programming (EP) is a branch of Evolutionary Algorithms (EA) in which mutation is the main reproduction operator. This paper presents a novel EP approach to Artificial Neural Network (ANN) learning. The proposed strategy consists of two components: a self-adaptive component, which carries phenotype information, and a dynamic component, which is described by the genotype. Self-adaptation is achieved by adding a value, called the network weight, which depends on the total number of hidden layers and the average number of neurons in the hidden layers. The dynamic component changes its value depending on the fitness of the chromosome exposed to mutation. Thus, the mutation step size is controlled by two components, encapsulated in the algorithm, which adjust it according to the characteristics of a predefined ANN architecture and the fitness of the particular chromosome. A comparative analysis of the proposed approach and classical EP (Gaussian mutation) showed that a significant acceleration of the evolution process is achieved by using both phenotype and genotype information in the mutation strategy.
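
The mutation idea can be sketched as follows; the concrete formulas for the network weight and the fitness-dependent factor below are illustrative assumptions, since the paper's exact update rules are not reproduced here.

```python
# Illustrative sketch of a mutation step controlled by a phenotype term
# (ANN architecture) and a genotype term (chromosome fitness).
# Both formulas below are assumptions made for demonstration only.
import numpy as np

rng = np.random.default_rng(0)

def network_weight(n_hidden_layers, avg_neurons):
    # Phenotype (self-adaptive) component derived from the ANN architecture.
    return 1.0 / np.sqrt(n_hidden_layers * avg_neurons)

def mutate(chromosome, error, n_hidden_layers, avg_neurons):
    # Dynamic (genotype) component: larger training error -> larger step.
    dynamic = error / (1.0 + error)
    step = network_weight(n_hidden_layers, avg_neurons) * dynamic
    return chromosome + rng.normal(0.0, step, size=chromosome.shape)

parent = rng.normal(size=50)              # flattened ANN weights of one parent
child = mutate(parent, error=0.8, n_hidden_layers=2, avg_neurons=10)
print(np.abs(child - parent).mean())
```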