Identify Features and Parameters to Devise an Accurate Intrusion Detection System Using Artificial Neural Network

This article explains how attack features can be extracted from network packets, how feature vectors can be built from them, and how these vectors can be applied to the input of any analysis stage. For the analysis stage, the work deploys a feedforward backpropagation neural network acting as a misuse-based intrusion detection system, using ten types of attacks as examples for training and testing. The work explains how the packets are analyzed to extract features, and shows how selecting the right features, building correct vectors, and correctly choosing the training method and the number of nodes in the hidden layer of the neural network affect the accuracy of the system. In addition, the work shows how to obtain optimal weight values and use them to initialize the artificial neural network.
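As a toy illustration of turning parsed packet fields into a fixed-length vector for the neural network input, the sketch below maps a packet (represented as a dict) to normalised features. The field names, the protocol set, and the normalisation constants are illustrative assumptions, not the features used in the paper:

```python
# Hypothetical packet-to-feature-vector mapping; the chosen fields and
# scaling are illustrative assumptions for this sketch.

PROTOCOLS = {"tcp": 0, "udp": 1, "icmp": 2}

def packet_to_vector(pkt):
    """Map a parsed packet (dict) to a normalised feature vector."""
    proto_onehot = [0.0, 0.0, 0.0]
    proto_onehot[PROTOCOLS[pkt["protocol"]]] = 1.0
    return proto_onehot + [
        pkt["length"] / 65535.0,     # packet length, scaled
        pkt["src_port"] / 65535.0,   # source port, scaled
        pkt["dst_port"] / 65535.0,   # destination port, scaled
        float(pkt["syn"]),           # SYN flag
        float(pkt["ack"]),           # ACK flag
    ]

vec = packet_to_vector({"protocol": "tcp", "length": 60,
                        "src_port": 4321, "dst_port": 80,
                        "syn": 1, "ack": 0})
print(len(vec), vec[:3])
```

Vectors built this way have a fixed length, which is what lets them be fed directly to the input layer of a feedforward network.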

Region Segmentation based on Gaussian Dirichlet Process Mixture Model and its Application to 3D Geometric Structure Detection

Image-based 3D scenes can now be found in many popular vision systems, computer games, and virtual reality tours, so it is important to segment the region of interest (ROI) from input scenes as a preprocessing step for geometric structure detection in a 3D scene. In this paper, we propose a method for segmenting the ROI based on tensor voting and a Dirichlet process mixture model. In particular, to estimate geometric structure information for a 3D scene from a single outdoor image, we apply tensor voting and the Dirichlet process mixture model to image segmentation. Tensor voting is used based on the fact that the pixels of a homogeneous region in an image usually lie close together on a smooth region, so the tokens corresponding to the centers of these regions have high saliency values. The proposed approach is a novel nonparametric Bayesian segmentation method that uses a Gaussian Dirichlet process mixture model to automatically segment various natural scenes. Finally, our method labels regions of the input image into coarse categories: "ground", "sky", and "vertical" for 3D applications. The experimental results show that our method successfully segments coarse regions in many complex natural scene images for 3D.

The Fundamental Reliance of Iterative Learning Control on Stability Robustness

Iterative learning control (ILC) aims to achieve zero tracking error of a specific command. This is accomplished by iteratively adjusting the command given to a feedback control system, based on the tracking error observed in the previous iteration. One would like the iterations to converge to zero tracking error in spite of any error present in the model used to design the learning law. First, this need for stability robustness is discussed, and then the need for robustness of the property that the transients are well behaved. Methods of producing the needed robustness to parameter variations and to singular perturbations are presented. Then a method involving reverse-time runs is given that lets real-world behavior produce the ILC gains in such a way as to eliminate the need for a mathematical model. Since the real world is producing the gains, there is no issue of model error. Provided the world behaves linearly, the approach gives an ILC law with both stability robustness and good transient robustness, without the need to generate a model.
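The basic iteration-to-iteration update underlying ILC can be sketched on a toy plant. Below, the update u_{k+1}(t) = u_k(t) + L·e_k(t) is applied to a first-order discrete plant; the plant coefficients, horizon, and learning gain L are illustrative assumptions (convergence here requires |1 − L·b| < 1, which holds):

```python
# Toy ILC loop: each iteration the whole run is replayed and the stored
# input trajectory is corrected by the previous run's tracking error.

a, b, L = 0.5, 1.0, 0.8          # plant y[t] = a*y[t-1] + b*u[t], gain L
T = 20
ref = [1.0] * T                  # desired output trajectory

def run_plant(u):
    y, out = 0.0, []
    for t in range(T):
        y = a * y + b * u[t]
        out.append(y)
    return out

u = [0.0] * T
for k in range(50):              # learning iterations
    e = [r - y for r, y in zip(ref, run_plant(u))]
    u = [ui + L * ei for ui, ei in zip(u, e)]   # ILC update

final_err = max(abs(r - y) for r, y in zip(ref, run_plant(u)))
print(final_err)                 # tracking error shrinks toward zero
```

Note that the plant model is used only to simulate the "world" here; the update law itself needs only the measured error, which is the point the abstract makes about model-free gain generation.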

Vibration Damping of High-Chromium Ferromagnetic Steel

The aim of the present work is to study the effect of annealing on the vibration damping capacity of high-chromium (16%) ferromagnetic steel. The alloys were prepared from raw materials of 99.9% purity melted in a high-frequency induction furnace under high vacuum. The samples were heat-treated in vacuum at various temperatures (800 to 1200 °C) for 1 hour, followed by slow cooling (120 °C/h). The inverted torsional pendulum method was used to evaluate the vibration damping capacity. The results indicated that the vibration damping capacity of the alloys is influenced by annealing and that a critical annealing temperature exists near 1000 °C. Below the critical temperature, the damping capacity increases quickly with annealing temperature, since the magnetic domains move more easily.

Automated ECG Segmentation Using Piecewise Derivative Dynamic Time Warping

Electrocardiogram (ECG) segmentation is necessary to help reduce the time-consuming task of manually annotating ECGs. Several algorithms have been developed to segment the ECG automatically. We first review several such methods, and then present a new single-lead segmentation method based on adaptive piecewise constant approximation (APCA) and piecewise derivative dynamic time warping (PDDTW). The results are tested on the QT database and compared to Laguna's two-lead method. Our proposed approach has a comparable mean error, but yields a slightly higher standard deviation than Laguna's method.
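The core of any DTW-based segmenter is the dynamic-programming alignment. The sketch below shows classic DTW plus the derivative preprocessing step that turns it into a piecewise-derivative variant, so sequences are matched on local slope rather than raw amplitude; the toy sequences are invented for illustration:

```python
def dtw(x, y):
    """Classic DTW distance with absolute-difference local cost."""
    n, m = len(x), len(y)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def derivative(x):
    """Estimated derivative at each interior sample (Keogh-style)."""
    return [((x[i] - x[i - 1]) + (x[i + 1] - x[i - 1]) / 2) / 2
            for i in range(1, len(x) - 1)]

a = [0, 1, 2, 1, 0]
b = [0, 1, 1, 2, 1, 0]            # same shape, one repeated sample
print(dtw(a, b))                  # → 0.0 (warping absorbs the repeat)
print(dtw(derivative(a), derivative(b)))
```

Derivative DTW helps exactly where ECG segmentation needs it: two beats can have very different amplitudes yet the same sequence of upstrokes and downstrokes around each wave boundary.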

Numerical Simulation of the Transient Shape Variation of a Rotating Liquid Droplet

The transient shape variation of a rotating liquid droplet is simulated numerically. The three-dimensional Navier–Stokes equations were solved using the level set method. The shape variation from the sphere to the rotating ellipsoid, and then to the two-lobed shape, is simulated, and the elongation of the two-lobed droplet is discussed. The two-lobed shape after the initial transient is found to be stable, and the elongation is almost the same for cases with different initial rotation rates. The relationship between the elongation and the rotation rate is obtained by averaging the transient shape variation. It is shown that the elongation of the two-lobed shape is in good agreement with existing experimental data. It is found that transient numerical simulation is necessary for analyzing the largely elongated two-lobed shape of a rotating droplet.

Contourlet versus Wavelet Transform for a Robust Digital Image Watermarking Technique

In this paper, a watermarking algorithm that uses the wavelet transform with Multiple Description Coding (MDC) and Quantization Index Modulation (QIM) concepts is introduced. The paper also investigates the role of the Contourlet Transform (CT) versus the Wavelet Transform (WT) in providing robust image watermarking. Two measures are utilized in the comparison between the wavelet-based and the contourlet-based methods: Peak Signal-to-Noise Ratio (PSNR) and Normalized Cross-Correlation (NCC). Experimental results reveal that the introduced algorithm is robust against different attacks and compares favorably with the contourlet-based algorithm.
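The QIM idea is independent of the transform used: each selected coefficient is quantised to one of two interleaved lattices, one per bit value. A minimal sketch (the step size Δ and the example coefficients are illustrative choices, not the paper's parameters):

```python
# Minimal QIM embed/detect on scalar coefficients; in a real watermarker
# these would be wavelet or contourlet coefficients.

DELTA = 4.0   # quantization step (illustrative)

def qim_embed(c, bit):
    """Quantise coefficient c to the lattice associated with `bit`."""
    offset = 0.0 if bit == 0 else DELTA / 2
    return round((c - offset) / DELTA) * DELTA + offset

def qim_detect(c):
    """Recover the bit as the nearer of the two lattices."""
    d0 = abs(c - qim_embed(c, 0))
    d1 = abs(c - qim_embed(c, 1))
    return 0 if d0 <= d1 else 1

coeffs = [10.3, -7.8, 2.1, 5.5]
bits = [1, 0, 1, 1]
marked = [qim_embed(c, b) for c, b in zip(coeffs, bits)]
print([qim_detect(c) for c in marked])  # → [1, 0, 1, 1]
```

Because each marked coefficient sits exactly on its lattice, detection survives any perturbation smaller than Δ/4, which is the source of QIM's robustness to mild attacks.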

A Descent-projection Method for Solving Monotone Structured Variational Inequalities

In this paper, a new descent-projection method with a new search direction for monotone structured variational inequalities is proposed. The method is simple, requiring only projections and some function evaluations, so its computational load is very small. Under mild conditions on the problem's data, the method is proved to converge globally. Some preliminary computational results are also reported to illustrate the efficiency of the method.
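The simplest member of this family of methods is the fixed-step projection iteration x_{k+1} = P_C(x_k − β·F(x_k)). The sketch below applies it to a toy monotone affine operator on a box; the operator, box, and step size are illustrative assumptions (the paper's search direction and step rule are more refined):

```python
# Basic projection method for VI(C, F): find x in C with
# <F(x), y - x> >= 0 for all y in C.

def F(x):
    """Monotone affine operator F(x) = A·x - q, A = [[2,1],[1,2]] (SPD)."""
    return [2 * x[0] + x[1] - 3, x[0] + 2 * x[1] - 3]

def project(x, lo=0.0, hi=5.0):
    """Projection onto the box [lo, hi]^2 (componentwise clipping)."""
    return [min(max(xi, lo), hi) for xi in x]

beta = 0.3                       # step size, chosen below 2 / lambda_max(A)
x = [5.0, 5.0]
for _ in range(200):
    fx = F(x)
    x = project([xi - beta * fi for xi, fi in zip(x, fx)])

print(x)                         # → close to the solution (1, 1)
```

Each iteration costs one function evaluation and one projection, which is the "very small computational load" the abstract refers to; descent-projection methods refine the direction and step while keeping this per-iteration cost.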

Synthesis and Characterization of PEG-Silane Functionalized Iron Oxide Nanoparticle as MRI T2 Contrast Agent

Iron oxide nanoparticles were synthesized by a reactive-precipitation method followed by high-speed centrifugation and phase transfer in order to stabilize the nanoparticles in the solvent. The particle size of the SPIO was 8.2 nm by SEM, and the hydrodynamic radius was 17.5 nm by dynamic light scattering. Coercivity and saturation magnetization were determined by VSM (vibrating sample magnetometer); the coercivity of the nanoparticles was lower than 10 Hc, and the saturation magnetization was higher than 65 emu/g. The stabilized SPIO was then transferred to the aqueous phase by reaction with an excess amount of poly(ethylene glycol) (PEG) silane. After filtration and dialysis, the SPIO T2 contrast agent was ready to use. The hydrodynamic radius of the final product was about 70–100 nm, and the relaxation rate R2 (1/T2) measured by magnetic resonance imaging (MRI) was larger than 200 s⁻¹.

Design of a Single-Phase BLDC Motor and Finite-Element Analysis of Stator Slot Structure Effects on the Efficiency

In this paper, the effects of stator slot structure and switching angle on a cylindrical single-phase brushless direct current (BLDC) motor are analyzed. A BLDC motor with three different stator slot structures is designed using RMxprt software, and the efficiency of each structure under full-load conditions is presented. The motor is then modeled under different conditions in Maxwell 3D software and analyzed electromagnetically with the finite element method. Finally, the influence of the switching angle on motor performance is investigated using MATLAB, and the optimal angle is determined. The results indicate that, with the correct choice of stator slot structure and switching angle, maximum efficiency can be achieved.

Lung Nodule Detection in CT Scans

In this paper we describe a computer-aided diagnosis (CAD) system for automated detection of pulmonary nodules in computed-tomography (CT) images. After extracting the pulmonary parenchyma using a combination of image processing techniques, a region growing method is applied to detect nodules based on 3D geometric features. We applied the CAD system to CT scans collected in a screening program for lung cancer detection. Each scan consists of a sequence of about 300 slices stored in DICOM (Digital Imaging and Communications in Medicine) format. All malignant nodules were detected and a low false-positive detection rate was achieved.
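As a toy illustration of the region-growing step, the sketch below grows a 4-connected region from a seed on a small 2D intensity grid; the grid, seed, and tolerance are invented for illustration (the paper's version operates on 3D CT volumes and filters candidates by 3D geometric features):

```python
from collections import deque

def region_grow(img, seed, tol):
    """Grow a region from `seed`, accepting 4-neighbours whose intensity
    lies within `tol` of the seed value (breadth-first)."""
    h, w = len(img), len(img[0])
    base = img[seed[0]][seed[1]]
    seen = {seed}
    q = deque([seed])
    while q:
        r, c = q.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and (nr, nc) not in seen
                    and abs(img[nr][nc] - base) <= tol):
                seen.add((nr, nc))
                q.append((nr, nc))
    return seen

img = [[0, 0, 9, 9],
       [0, 1, 9, 9],
       [0, 0, 0, 9]]
region = region_grow(img, (0, 0), 1)
print(len(region))   # → 7 (the connected low-intensity blob)
```

In a nodule detector, each grown region would then be scored on size, sphericity, and similar 3D descriptors to separate nodules from vessels and parenchyma.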

On Methodologies for Analysing Sickness Absence Data: An Insight into a New Method

Sickness absence represents a major economic and social issue. The analysis of sick leave data is a recurrent challenge to analysts because of the complexity of the data structure, which is often time dependent, highly skewed, and clumped at zero. Ignoring these features when making statistical inference is likely to be inefficient and misguided, and traditional approaches do not address these problems. In this study, we discuss modeling methodologies in terms of statistical techniques for addressing the difficulties with sick leave data. We also introduce and demonstrate a new method by performing a longitudinal assessment of long-term absenteeism, using as a working example a large registration dataset from the Helsinki Health Study on municipal employees in Finland during the period 1990-1999. We present a comparative study on model selection and a critical analysis of the temporal trends and the occurrence and degree of long-term sickness absences among municipal employees. The strengths of this working example include the large sample size over a long follow-up period, providing strong evidence in support of the new model. Our main goal is to propose a way to select an appropriate model and to introduce a new methodology for analysing sickness absence data, as well as to demonstrate the model's applicability to complicated longitudinal data.

Modeling Concave Globoidal Cam with Swinging Roller Follower: A Case Study

This paper describes a computer-aided design procedure for a concave globoidal cam with cylindrical rollers and a swinging follower. Four models with different modeling methods are made from the same input data. The input data are the angular input and output displacements of the cam and the follower, together with some other geometrical parameters of the globoidal cam mechanism. The best cam model is the one that has no interference with the rollers when their motions are simulated in assembly conditions. The angular output displacement of the follower for the best cam is also compared with that in the input data to check errors. In this study, Pro/ENGINEER® Wildfire 2.0 is used for modeling the cam, simulating motions, and checking interference and errors of the system.

Shoplifting in Riyadh, Saudi Arabia

The research was conducted using self-reports of shoplifters who were apprehended in supermarkets while stealing. Over three years, 943 shoplifters were interviewed right after the act of stealing and before the police were called. The aim of the study is to determine the characteristics of shoplifting in Saudi Arabia, including the traits of shoplifters and the situations of the supermarkets where the stealing takes place. The analysis is based on the written information about each thief, following the documentary research method. Descriptive statistics as well as some inferential statistics were employed. The results show that there are differences between genders, age groups, occupations, times of day, days of the week, months, ways of stealing, individual versus groups of thieves, and other supermarket situations in the type of items stolen, their total price, and the count of items. The results and recommendations will serve as a guide for retailers on where, when, and whom to watch in order to prevent shoplifting.

Efficient Detection Using Sequential Probability Ratio Test in Mobile Cognitive Radio Systems

This paper proposes a smart design strategy for a sequential detector to reliably detect the primary user's signal, especially in fast fading environments. We study the computation of the log-likelihood ratio for coping with fast-changing received signal and noise sample variances, which are considered random variables. First, we analyze the detectability of the conventional generalized log-likelihood ratio (GLLR) scheme when considering the fast-changing statistics of unknown parameters caused by fast fading effects. Secondly, we propose an efficient sensing algorithm for performing the sequential probability ratio test in a robust and efficient manner when the channel statistics are unknown. Finally, the proposed scheme is compared to the conventional method through simulation, with respect to the average number of samples required to reach a detection decision.
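The plain sequential probability ratio test that this work builds on can be sketched in a few lines. Here the two hypotheses are noise-only N(0,1) versus signal-present N(μ,1); the mean μ and the error targets α, β are illustrative assumptions (the paper's contribution is handling unknown, fast-changing variances, which this fixed-parameter sketch does not do):

```python
import math

def sprt(samples, mu=1.0, alpha=0.01, beta=0.01):
    """Wald's SPRT for H0: N(0,1) vs H1: N(mu,1).
    Returns (decision, number of samples used)."""
    upper = math.log((1 - beta) / alpha)     # accept H1 boundary
    lower = math.log(beta / (1 - alpha))     # accept H0 boundary
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # per-sample log-likelihood ratio for unit-variance Gaussians
        llr += mu * x - mu * mu / 2
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", n

print(sprt([2.0] * 100))    # strong signal → ('H1', 4)
print(sprt([-1.0] * 100))   # clearly noise-like → ('H0', 4)
```

The attraction over fixed-sample-size detection, and the metric the abstract compares on, is exactly this early stopping: clear-cut inputs terminate after a handful of samples.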

Atmospheric Water Vapour as a Main Fresh Water Resource in the Arid Zones of Central Asia

It is shown that solving the water shortage problem in Central Asia is closely connected with including atmospheric water vapour in the system of water resources assessment and management. Some methods of extracting water from the atmosphere are discussed.

Reliability Analysis of Underground Pipelines Using Subset Simulation

An advanced Monte Carlo simulation method, called Subset Simulation (SS), for time-dependent reliability prediction of underground pipelines is presented in this paper. SS provides better resolution at low failure probability levels by efficiently investigating the rare failure events that are commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used for computing probabilistic performance through statistical variables. SS gains its efficiency by expressing a small failure probability as a product of larger conditional probabilities of a sequence of intermediate events. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment. It is hoped that this development work can promote the use of SS tools for uncertainty propagation in the decision-making process for underground pipeline network reliability prediction.
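The product-of-conditional-probabilities idea can be sketched on a scalar toy problem: estimating P(X > 3.5) for X ~ N(0,1), a rare event of order 1e-4. The sample size N, level probability p0, proposal width, and limit state are all illustrative tuning choices, and the modified-Metropolis step is deliberately minimal:

```python
import math, random

def subset_simulation(g, threshold, N=1000, p0=0.1, seed=1):
    """Minimal Subset Simulation for P(g(X) > threshold), X ~ N(0,1)."""
    rng = random.Random(seed)
    n_seed = int(N * p0)
    x = [rng.gauss(0, 1) for _ in range(N)]    # level 0: plain Monte Carlo
    prob = 1.0
    for _ in range(20):                        # at most 20 conditional levels
        y = sorted(x, key=g, reverse=True)
        level = g(y[n_seed - 1])               # intermediate threshold
        if level >= threshold:                 # final level reached
            n_fail = sum(1 for xi in x if g(xi) > threshold)
            return prob * n_fail / N
        prob *= p0                             # accumulate conditional prob.
        # Metropolis chains from the p0-fraction of seeds, kept in the subset
        x = []
        for s in y[:n_seed]:
            cur = s
            for _ in range(N // n_seed):
                cand = cur + rng.uniform(-1, 1)
                accept = rng.random() < math.exp((cur * cur - cand * cand) / 2)
                if accept and g(cand) > level:
                    cur = cand
                x.append(cur)
    return prob

est = subset_simulation(lambda x: x, 3.5)
print(est)   # exact P(X > 3.5) ≈ 2.3e-4
```

Each level only has to estimate a probability around p0 = 0.1, which is why SS reaches rare-event probabilities with far fewer samples than direct Monte Carlo would need.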

A Hybrid Approach for Color Image Quantization Using K-means and Firefly Algorithms

Color image quantization (CQ) is an important problem in computer graphics and image processing. The aim of quantization is to reduce the number of colors in an image with minimum distortion. Clustering is a widely used technique for color quantization, in which all colors in an image are grouped into small clusters. In this paper, we propose a new hybrid approach for color quantization using the firefly algorithm (FA) and the K-means algorithm. The firefly algorithm is a swarm-based algorithm that can be used for solving optimization problems. The proposed method can overcome the drawbacks of both algorithms, such as convergence to local optima in K-means and the premature convergence of the firefly algorithm. Experiments on three commonly used images and the comparison results show that the proposed algorithm surpasses both the baseline K-means clustering and the original firefly algorithm.
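The K-means half of the hybrid is plain Lloyd's algorithm on RGB triples: pixels are assigned to the nearest palette color, and each palette color is moved to the mean of its pixels. A minimal sketch on an invented six-pixel "image" (in the hybrid, the firefly algorithm would then perturb and refine these centres to escape local optima):

```python
def kmeans(pixels, k, iters=20):
    """Lloyd's algorithm; the first k pixels serve as initial centres
    (a deterministic choice for this sketch)."""
    centers = list(pixels[:k])
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pixels:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            clusters[i].append(p)          # assign to nearest centre
        for i, cl in enumerate(clusters):
            if cl:                          # move centre to cluster mean
                centers[i] = tuple(sum(ch) / len(cl) for ch in zip(*cl))
    return centers

pixels = [(250, 10, 10), (10, 10, 240), (240, 20, 15),
          (5, 25, 250), (255, 0, 5), (0, 0, 255)]   # reds and blues mixed
centers = sorted(kmeans(pixels, 2))
print(centers)   # one bluish and one reddish palette colour
```

Quantizing the image then means replacing every pixel by its nearest centre, so the distortion being minimized is exactly the clustering objective.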

Mode III Interlaminar Fracture in Woven Glass/Epoxy Composite Laminates

In the present study, the fracture behavior of woven fabric-reinforced glass/epoxy composite laminates under mode III crack growth was experimentally investigated and numerically modeled. Two methods were used for the calculation of the strain energy release rate: the experimental compliance calibration (CC) method and the virtual crack closure technique (VCCT). To this end, the edge crack torsion (ECT) test was used to evaluate fracture toughness in mode III loading (out-of-plane shear) at different crack lengths. Load–displacement curves and the associated energy release rates were obtained for the various cases of interest. To calculate the fracture toughness JIII, two criteria were considered, based on the non-linearity and maximum points of the load–displacement curve, and it is observed that JIII increases with increasing crack length. Both the experimental compliance method and the virtual crack closure technique proved applicable for interpreting the fracture mechanics data of woven glass/epoxy laminates in mode III.

A New Approach for Prioritization of Failure Modes in Design FMEA using ANOVA

The traditional Failure Mode and Effects Analysis (FMEA) uses Risk Priority Number (RPN) to evaluate the risk level of a component or process. The RPN index is determined by calculating the product of severity, occurrence and detection indexes. The most critically debated disadvantage of this approach is that various sets of these three indexes may produce an identical value of RPN. This research paper seeks to address the drawbacks in traditional FMEA and to propose a new approach to overcome these shortcomings. The Risk Priority Code (RPC) is used to prioritize failure modes, when two or more failure modes have the same RPN. A new method is proposed to prioritize failure modes, when there is a disagreement in ranking scale for severity, occurrence and detection. An Analysis of Variance (ANOVA) is used to compare means of RPN values. SPSS (Statistical Package for the Social Sciences) statistical analysis package is used to analyze the data. The results presented are based on two case studies. It is found that the proposed new methodology/approach resolves the limitations of traditional FMEA approach.
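The RPN collision that motivates the paper is easy to demonstrate: the product of severity, occurrence, and detection discards which index contributed the risk. A two-line sketch with invented index values:

```python
def rpn(severity, occurrence, detection):
    """Traditional FMEA Risk Priority Number: the plain product S*O*D."""
    return severity * occurrence * detection

# Two very different failure modes, identical RPN:
# high-severity but rare vs. low-severity but hard to detect.
print(rpn(9, 2, 2), rpn(2, 2, 9))   # → 36 36
```

Any prioritization scheme built on the raw product alone cannot separate these two cases, which is the gap the proposed RPC ranking and ANOVA comparison are meant to close.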