Moving towards Positive Security Model for Web Application Firewall

The proliferation of web applications and the pervasiveness of mobile technology make web-based attacks even more attractive and easier to launch. A Web Application Firewall (WAF) is an intermediary between the web server and its users that provides comprehensive protection for web applications. A WAF follows a negative security model, in which the detection and prevention mechanisms are based on predefined or user-defined attack signatures and patterns. However, a WAF alone is not adequate to offer the best defense against web vulnerabilities, which grow in number and complexity daily. This paper presents a methodology to automatically build a positive security model, which identifies and allows only legitimate web queries. The paper shows that a true positive rate of more than 90% can be achieved.
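As a hedged illustration of the positive security idea (not the paper's actual method), a profile of legitimate queries can be learned and everything outside it rejected. The parameter names and value classes below are our own illustrative choices:

```python
import re

# Sketch of a positive security model: learn the set of parameter names
# and simple value classes (numeric, alphabetic, ...) from known-good
# training queries, then allow only queries matching the learned profile.

def value_class(v):
    if re.fullmatch(r"\d+", v):
        return "numeric"
    if re.fullmatch(r"[A-Za-z]+", v):
        return "alpha"
    return "other"

def learn_profile(legit_queries):
    profile = {}
    for q in legit_queries:
        for name, value in q.items():
            profile.setdefault(name, set()).add(value_class(value))
    return profile

def is_legitimate(query, profile):
    # Every parameter must be known and its value must fall into a class
    # observed during training.
    return all(name in profile and value_class(value) in profile[name]
               for name, value in query.items())

profile = learn_profile([{"id": "42", "lang": "en"}, {"id": "7", "lang": "fr"}])
print(is_legitimate({"id": "13", "lang": "de"}, profile))        # expected True
print(is_legitimate({"id": "1 OR 1=1", "lang": "en"}, profile))  # expected False
```

A real system would learn far richer profiles (lengths, character distributions, orderings), but the allowlisting principle is the same: only what was seen as legitimate passes.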

Comparative Study of Three DGS Unit Shapes and Compact Microstrip Low-Pass and Band-Pass Filters Designs

In this paper, three types of defected ground structure (DGS) units are investigated: triangular-head (TH), rectangular-head (RH) and U-shape (US). They are then used in low-pass and band-pass filter (LPF and BPF) designs, and the obtained performances are examined. The LPF employing the RH-DGS geometry offers the advantages of compact size, low insertion loss and a wide stopband compared to the other filters. It provides a cutoff frequency of 2.5 GHz, the largest 20 dB rejection bandwidth, extending from 2.98 to 8.76 GHz, the smallest transition region and the sharpest cutoff. The BPF based on the RH-DGS has the highest bandwidth (BW), about 0.74 GHz, and the lowest center frequency, 3.24 GHz, whereas the other BPFs have BWs of less than 0.7 GHz.

An Automatic Tool for Checking Consistency between Data Flow Diagrams (DFDs)

The system development life cycle (SDLC) is a process used during the development of any system. The SDLC consists of four main phases: analysis, design, implementation and testing. During the analysis phase, a context diagram and data flow diagrams are used to produce the process model of a system. Consistency between the context diagram and the lower-level data flow diagrams is very important for a smooth system development process. However, manually checking the consistency between the context diagram and lower-level data flow diagrams with a checklist is time-consuming. At the same time, the limited human ability to spot errors is one of the factors that affect the correctness and balancing of the diagrams. This paper presents a tool that automates the consistency check between Data Flow Diagrams (DFDs) based on the rules of DFDs. The tool serves two purposes: as an editor for drawing the diagrams, and as a checker for verifying the correctness of the diagrams drawn. The consistency check from the context diagram down to the lower-level data flow diagrams is embedded in the tool to overcome the manual checking problem.
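One of the DFD balancing rules such a checker must enforce can be sketched as follows: the data flows entering and leaving a parent process must equal the external flows of its child diagram. The data structures and diagram names here are illustrative, not the tool's:

```python
# Balancing check: a child (lower-level) diagram is consistent with its
# parent process if the flows crossing the child's boundary match the
# flows attached to the parent process.

def external_flows(diagram, internal_processes):
    """Flows with exactly one end outside the diagram's own processes."""
    flows = set()
    for src, dst, label in diagram:
        if (src in internal_processes) != (dst in internal_processes):
            flows.add(label)
    return flows

def is_balanced(parent_flows, child_diagram, child_processes):
    return set(parent_flows) == external_flows(child_diagram, child_processes)

# Context level: process P1 receives "order" and emits "invoice".
parent = {"order", "invoice"}
# Child diagram of P1, decomposed into subprocesses P1.1 and P1.2.
child = [("customer", "P1.1", "order"),
         ("P1.1", "P1.2", "checked order"),
         ("P1.2", "customer", "invoice")]
print(is_balanced(parent, child, {"P1.1", "P1.2"}))  # expected True
```

The internal flow "checked order" is correctly ignored: only boundary-crossing flows take part in the balance.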

Use of Persuasive Technology to Change End-Users' IT Security Aware Behaviour: A Pilot Study

Persuasive technology has been applied in marketing, health, environmental conservation, safety and other domains, and has been found to be quite effective in changing people's attitudes and behaviours. This research extends the application domains of persuasive technology to information security awareness and uses a theory-driven approach to evaluate the effectiveness of a web-based program, developed on the principles of persuasive technology, in improving the information security awareness of end users. The findings confirm a very strong effect of the web-based program in improving users' attitudes towards information-security-aware behavior. This finding is useful to IT researchers and practitioners in developing appropriate and effective education strategies for improving the information security attitudes of end users.

Inventory Control for a Joint Replenishment Problem with Stochastic Demand

Most papers model the Joint Replenishment Problem (JRP) as a (kT, S) policy, where kT is a multiple of a common review period T and S is a predefined order-up-to level. In general, the (T, S) policy is characterized by a long out-of-control period, which requires a large amount of safety stock compared to the (R, Q) policy. In this paper a probabilistic model is built in which the item with the shortest order time between intervals (T), call it item i, is modeled under an (R, Q) policy and its inventory is continuously reviewed, while the remaining items j are periodically reviewed at a definite time corresponding to item

Wind Speed Data Analysis using Wavelet Transform

Renewable energy systems are becoming a topic of great interest and investment around the world, and in recent years wind power generation has developed very rapidly worldwide. For the planning and successful implementation of good wind power plant projects, wind potential measurements are required. In these projects, of great importance are the effective choice of the micro-location for wind potential measurements, the installation of the measurement station with the appropriate measuring equipment, its maintenance, and the analysis of the gathered data on wind potential characteristics. In this paper, the wavelet transform is applied to analyze wind speed data, to gain insight into the characteristics of the wind and to support the selection of suitable locations for wind farm construction. This approach shows that the wavelet transform can be a useful tool in the investigation of wind potential.
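The paper does not specify its wavelet family, but the basic mechanism can be sketched with a one-level Haar transform, the simplest discrete wavelet, on a toy wind speed series. The series values are invented for illustration; a real study would use deeper decompositions and smoother wavelets (e.g. Daubechies):

```python
from math import sqrt

# One-level Haar discrete wavelet transform: the approximation
# coefficients capture the slow trend of the wind speed, the detail
# coefficients capture fast fluctuations such as gusts.

def haar_dwt(signal):
    assert len(signal) % 2 == 0
    approx = [(signal[i] + signal[i + 1]) / sqrt(2)
              for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / sqrt(2)
              for i in range(0, len(signal), 2)]
    return approx, detail

wind = [5.0, 5.2, 9.8, 4.9, 5.1, 5.0, 5.3, 5.2]  # a gust around sample 2
a, d = haar_dwt(wind)
# The largest-magnitude detail coefficient flags where the gust occurred.
print(max(range(len(d)), key=lambda i: abs(d[i])))  # expected 1
```

Detail coefficient 1 corresponds to the sample pair (9.8, 4.9), i.e. the gust: this localization of fluctuations in both time and scale is what makes wavelets attractive for wind data.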

Inclusion of Enterococcus faecalis and Enterococcus faecium in UF White Cheese

Lighvan cheese is traditionally made from sheep milk on the Sahand mountainside, located in the north-west of Iran. The main objective of this study was to investigate the effect of enterococci isolated from traditional Lighvan cheese on the quality of Iranian UF white cheese during ripening. The experimental design was a split plot based on randomized complete blocks; the main plots were four types of starters and the subplots were different ripening durations. Addition of Enterococcus spp. did not significantly (P

A High Bitrate Information Hiding Algorithm for Video in Video

In high-bitrate information hiding techniques, 1 bit is embedded within each 4 x 4 Discrete Cosine Transform (DCT) coefficient block by means of vector quantization, and the hidden bit can then be effectively extracted at the terminal end. In this paper, high-bitrate information hiding algorithms are summarized, and a video-in-video scheme is implemented. Experimental results show that a host video in which a large amount of auxiliary information is embedded suffers little decline in visual quality. The luma Peak Signal-to-Noise Ratio (PSNR-Y) of the host video degrades by only 0.22 dB on average, while the hidden information has a high survival rate and remains highly robust under H.264/AVC compression; the average Bit Error Rate (BER) of the hidden information is 0.015%.
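The one-bit-per-block idea can be sketched with a simplified stand-in for the paper's scheme: force the parity of one quantized mid-frequency coefficient to encode the bit. The DCT step and the vector quantization are omitted here for brevity (we operate directly on already-quantized coefficients), and the block values and position are illustrative:

```python
# Embed 1 bit per 4x4 coefficient block by adjusting the parity of one
# mid-frequency coefficient; extraction just reads that parity back.

def embed_bit(block, bit, pos=(1, 2)):
    r, c = pos
    out = [row[:] for row in block]         # copy, leave the input intact
    if out[r][c] % 2 != bit:                # nudge coefficient so its parity
        out[r][c] += 1                      # encodes the hidden bit
    return out

def extract_bit(block, pos=(1, 2)):
    r, c = pos
    return block[r][c] % 2

block = [[16, 11, 10, 16],
         [12, 12, 14, 19],
         [14, 13, 16, 24],
         [14, 17, 22, 29]]
stego = embed_bit(block, 1)
print(extract_bit(stego))  # expected 1
```

A change of at most one quantization step per block is what keeps the host video's PSNR degradation small in such schemes.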

Modeling and Investigation of Elongation in Free Explosive Forming of Aluminum Alloy Plate

Because of their high ductility, aluminum alloys have been widely used as an important base of the metal forming industries. But the main weak point of these alloys is their low strength, so forming them with conventional methods like deep drawing, hydroforming, etc., has always faced problems such as fracture during the forming process. For this reason, the explosive forming method has recently been recommended for forming these plates. In this paper, free explosive forming of A2024 aluminum alloy is numerically simulated, and the explosion wave propagation process is studied. The results of this simulation can be effective in predicting product quality. These results are compared with an experimental test and show the superiority of this method over similar methods such as hydroforming and deep drawing.

The Variation of Software Development Productivity 1995-2005

Software development has experienced remarkable progress in the past decade. However, due to the rising complexity and magnitude of projects, development productivity has not improved consistently. By analyzing the latest ISBSG data repository of 4,106 projects, we discovered that software development productivity actually underwent irregular variations between 1995 and 2005. Considering the factors significant to productivity, we found that its variations are primarily caused by variations in average team size and by the unbalanced use of the less productive 3GL languages.

An Index based Forward Backward Multiple Pattern Matching Algorithm

Pattern matching is one of the fundamental operations in molecular biology, and searching DNA-related data is a common activity for molecular biologists. In this paper we explore the applicability of a new pattern matching technique, the Index based Forward Backward Multiple Pattern Matching algorithm (IFBMPM), to DNA sequences. Our approach avoids unnecessary comparisons in the DNA sequence; because of this, the number of comparisons of the proposed algorithm is much smaller than that of other existing popular methods. The number of comparisons decreases rapidly, execution time decreases accordingly, and the algorithm shows better performance.
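A hedged sketch of the index-plus-forward-backward idea (our reading of the approach, not the authors' exact IFBMPM algorithm): an index over first characters skips windows that cannot match, and each candidate window is compared from both ends at once, so a mismatch at either end abandons it after fewer comparisons:

```python
# Index-based forward-backward matching sketch for DNA sequences.

def build_index(text):
    """Map each character to the positions where it occurs."""
    index = {}
    for i, ch in enumerate(text):
        index.setdefault(ch, []).append(i)
    return index

def fb_match(text, pattern, index):
    hits = []
    m = len(pattern)
    for start in index.get(pattern[0], []):   # only windows that can match
        if start + m > len(text):
            continue
        lo, hi = 0, m - 1
        ok = True
        while lo <= hi:                       # scan from both ends inward
            if text[start + lo] != pattern[lo] or text[start + hi] != pattern[hi]:
                ok = False
                break
            lo += 1
            hi -= 1
        if ok:
            hits.append(start)
    return hits

dna = "ACGTACGTGACG"
idx = build_index(dna)
print(fb_match(dna, "ACG", idx))  # expected [0, 4, 9]
```

Building one index and reusing it per pattern is what makes the scheme attractive for multiple-pattern searches over the same sequence.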

Development of an ADMIRE Longitudinal Quasi-Linear Model Using a State Transformation Approach

This paper presents a longitudinal quasi-linear model for the ADMIRE model. The ADMIRE model is a nonlinear model of an aircraft flying at a high angle of attack, so it cannot be treated as approximately linear. In this paper, to obtain the longitudinal quasi-linear model of the ADMIRE, a state transformation based on differentiable functions of the non-scheduling states and control inputs is performed, with the goal of removing any nonlinear terms not dependent on the scheduling parameter. Since it needs no linear approximation and can obtain the exact transformations of the nonlinear states, the above-mentioned approach is considered appropriate for establishing the mathematical model of the ADMIRE. To verify this conclusion, simulation experiments were performed, and the results show that this quasi-linear model is accurate enough.

Agent Decision using Granular Computing in Traffic System

In recent years multi-agent systems have emerged as one of the most interesting architectures facilitating distributed collaboration and distributed problem solving. Each node (agent) of the network might pursue its own agenda, exploit its environment, develop its own problem-solving strategy and establish the required communication strategies. Within each node of the network, one could encounter a diversity of problem-solving approaches. Quite commonly, agents realize their processing at the level of information granules that is most suitable from their local points of view. Information granules can come at various levels of granularity. Each agent could exploit a certain formalism of information granulation, engaging the machinery of fuzzy sets, interval analysis or rough sets, to name just a few dominant technologies of granular computing. With this in mind, a fundamental issue arises: forming effective interaction linkages between the agents so that they fully broadcast their findings and benefit from interacting with others.

Exploiting Global Self Similarity for Head-Shoulder Detection

People detection from images has a variety of applications, such as video surveillance and driver assistance systems, but it is still a challenging task, and is more difficult in crowded environments such as shopping malls, in which occlusion of the lower parts of the human body often occurs. The lack of full-body information requires more effective features than common ones such as HOG. In this paper, new features are introduced that exploit the global self-similarity (GSS) characteristic of head-shoulder patterns. The features encode the similarity or difference of color histograms and oriented gradient histograms between two vertically symmetric blocks. These domain-specific features are rapid to compute from integral images in the Viola-Jones cascade-of-rejectors framework. The proposed features are evaluated on our own head-shoulder dataset, which, in part, consists of the well-known INRIA pedestrian dataset. Experimental results show that the GSS features are effective in reducing false alarms, and that the gradient GSS features are preferred more often than the color GSS ones in feature selection.
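A toy sketch of the GSS idea (block layout, bin counts and the similarity measure are our illustrative choices, not the paper's): compare the intensity histogram of a block with that of its vertically mirrored counterpart. Head-shoulder patterns are roughly left-right symmetric, so for them the two histograms should be close:

```python
# Compare histograms of two vertically symmetric blocks; a high score
# suggests the left-right symmetry typical of head-shoulder patterns.

def histogram(block, bins=4, max_val=256):
    h = [0] * bins
    for v in block:
        h[v * bins // max_val] += 1
    return h

def gss_similarity(left_block, right_block):
    hl, hr = histogram(left_block), histogram(right_block)
    # Histogram intersection, normalized to [0, 1].
    return sum(min(a, b) for a, b in zip(hl, hr)) / sum(hl)

left = [10, 20, 200, 210, 30, 40]
right_sym = [12, 18, 205, 202, 28, 44]          # near-mirror of `left`
right_asym = [100, 110, 120, 130, 140, 150]     # unrelated content
print(gss_similarity(left, right_sym) > gss_similarity(left, right_asym))  # expected True
```

The paper computes such block histograms from integral images, which makes evaluating many block pairs per detection window cheap.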

An Assessment of Technological Competencies on Professional Service Firms Business Performance

This study was initiated with a three-pronged objective: first, to identify the relationship between technological competency factors (technical capability, firm innovativeness and e-business practices) and professional service firms' business performance; second, to investigate the predictors of professional service firms' business performance; and finally, to evaluate the predictors of business performance according to the type of professional service firm. A survey questionnaire was deployed to collect empirical data. The questionnaire was distributed to the owners of professional small and medium-sized service enterprises in the accounting, legal, engineering and architecture sectors. Analysis showed that all three technological competency factors have a moderate effect on business performance. In addition, the regression models indicate that technical capability is the most influential factor in determining business performance, followed by e-business practices and firm innovativeness. Accordingly, the main predictor of business performance for all types of firms is technical capability.

Simulation Modeling of Manufacturing Systems for the Serial Route and the Parallel One

In this paper we discuss the influence of the degree of route flexibility, the open rate of operations and the production type coefficient on makespan. The flexible job-open shop scheduling problem FJOSP (an extension of classical job shop scheduling) is analyzed. For the analysis of the production process we used a hybrid heuristic combining GRASP (greedy randomized adaptive search procedure) with a simulated annealing algorithm. Experiments with different levels of the factors were conducted and compared. The GRASP+SA algorithm has been tested and is illustrated with results for both the serial route and the parallel one.

Separation of Manganese and Cadmium from Cobalt Electrolyte Solution by Solvent Extraction

Impurity metals such as manganese and cadmium were selectively removed from a high-tenor cobalt electrolyte solution by solvent extraction using Co-D2EHPA, after converting the functional group of D2EHPA with Co2+ ions. Process parameters such as pH, organic concentration, O/A ratio and kinetics were investigated, and the experiments were conducted as batch tests at laboratory bench scale. Results showed that a significant amount of manganese and cadmium can be extracted using Co-D2EHPA for the optimum processing of cobalt electrolyte solution at an equilibrium pH of about 3.5. The McCabe-Thiele diagram constructed from the extraction studies showed that 100% of the impurities can be extracted in four stages for manganese and three stages for cadmium, using O/A ratios of 0.65 and 1.0, respectively. From the stripping study, it was found that 100% of the manganese and cadmium can be stripped from the loaded organic using 0.4 M H2SO4 in a single contact. The loading capacity of Co-D2EHPA for manganese and cadmium was also investigated with different O/A ratios as well as with the number of stages of contact between the aqueous and organic phases. Valuable information was obtained for the design of an impurity removal process for the production of pure cobalt with less trouble in the electrowinning circuit.

Characteristics of Hemodynamics in a Bileaflet Mechanical Heart Valve using an Implicit FSI Method

Human heart valves diseased by congenital heart defects, rheumatic fever, bacterial infection or cancer may develop stenosis or insufficiency. Treatment may be with medication but often involves valve repair or replacement (insertion of an artificial heart valve). Bileaflet mechanical heart valves (BMHVs) are widely implanted to replace diseased heart valves, but still suffer from complications such as hemolysis, platelet activation, tissue overgrowth and device failure. These complications are closely related to both the flow characteristics through the valves and the leaflet dynamics. In this study, the physiological flow interacting with the moving leaflets in a BMHV is simulated with a strongly coupled implicit fluid-structure interaction (FSI) method, newly organized on the basis of the Arbitrary Lagrangian-Eulerian (ALE) approach and the dynamic mesh (remeshing) method of FLUENT. The simulated results are in good agreement with previous experimental studies. This study shows the applicability of the present FSI model to the complicated physics of the interaction between fluid flow and a moving boundary.

The Framework of BeeBot: Binus Multi-Client of Intelligent Telepresence Robot

We present BeeBot, the Binus Multi-client Intelligent Telepresence Robot, a custom-built robot system specifically designed for teleconferencing with multiple persons using an omnidirectional actuator. The robot is controlled over a computer network, so a manager/supervisor can direct it to the intended person to start a discussion or inspection. People tracking and autonomous navigation are intelligent features of this robot. We built a web application for controlling the multi-client telepresence robot, working together with the open-source teleconference system used. Experimental results are presented and the system's performance is evaluated.

A Third Drop Level For TCP-RED Congestion Control Strategy

This work presents the Risk Threshold RED (RTRED) congestion control strategy for TCP networks. In addition to the maximum and minimum thresholds of existing RED-based strategies, we add a third dropping level: the risk threshold, which works with the actual and average queue sizes to detect imminent congestion at gateways. RTRED reacts to congestion on time: neither too early, which would cause unfair packet losses, nor too late, which would cause packet drops from time-outs. We compared our novel strategy with the RED and ARED strategies for TCP congestion handling using an NS-2 simulation script, and found that RTRED outperformed both.
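The three-level dropping decision can be sketched as follows. The branch structure reflects the abstract's description; the threshold values and the exact placement of the risk check are our illustrative assumptions, not the paper's parameters:

```python
import random

# RTRED-style decision: below min_th never drop (unless the *actual*
# queue has already burst past the risk threshold while the average
# still lags behind); between min_th and max_th drop probabilistically
# as in classic RED; at or above max_th always drop.

def rtred_decision(avg_q, actual_q, min_th, max_th, risk_th, max_p,
                   rng=random.random):
    if avg_q < min_th:
        if actual_q >= risk_th:      # sudden burst caught by risk threshold
            return "drop"
        return "enqueue"
    if avg_q >= max_th:
        return "drop"
    p = max_p * (avg_q - min_th) / (max_th - min_th)  # RED's linear ramp
    return "drop" if rng() < p else "enqueue"

print(rtred_decision(avg_q=3, actual_q=4, min_th=5, max_th=15, risk_th=20, max_p=0.1))   # expected enqueue
print(rtred_decision(avg_q=3, actual_q=25, min_th=5, max_th=15, risk_th=20, max_p=0.1))  # expected drop
print(rtred_decision(avg_q=18, actual_q=18, min_th=5, max_th=15, risk_th=20, max_p=0.1)) # expected drop
```

The second call shows the point of the extra level: the averaged queue is still calm, but the instantaneous queue reveals a burst, so RTRED drops before time-outs would force the sources to back off.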