Web Application Security, Attacks and Mitigation

Today's technology is heavily dependent on web applications, which users are adopting at a rapid pace and which have made our work more efficient. Examples include webmail, online retail, online gaming, wikis, train and flight departure and arrival information, and many more. These applications are developed in languages such as PHP, Python, C# and ASP.NET, together with client-side technologies such as HTML and JavaScript. Attackers develop tools and techniques to exploit web applications and legitimate websites, which has led to the rise of web application security, broadly classified into declarative security and program security. The most common attacks on applications are SQL injection and XSS, which give unauthorized users access that can severely damage or destroy the system. This paper presents a detailed literature review and analysis of web application security, examples of attacks, and steps to mitigate the vulnerabilities.
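
A minimal illustration of the standard mitigation for SQL injection, parameterized queries, using Python's sqlite3 module; the table, data and query below are hypothetical and serve only to contrast vulnerable string concatenation with a parameterized statement:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, password TEXT)")
conn.execute("INSERT INTO users (name, password) VALUES (?, ?)", ("alice", "s3cret"))

def find_user_unsafe(name):
    # Vulnerable: string concatenation lets input like "' OR '1'='1" rewrite the query.
    query = "SELECT id, name FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # Mitigation: a parameterized query treats the input strictly as data.
    return conn.execute("SELECT id, name FROM users WHERE name = ?", (name,)).fetchall()

print(find_user_unsafe("' OR '1'='1"))  # returns every row: the injection succeeds
print(find_user_safe("' OR '1'='1"))    # returns an empty list: the injection fails
```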

Performance Comparison of Particle Swarm Optimization with Traditional Clustering Algorithms used in Self-Organizing Map

The self-organizing map (SOM) is a well-known data reduction technique used in data mining. It can reveal structure in data sets through data visualization that is otherwise hard to detect from the raw data alone. However, interpretation through visual inspection is prone to errors and can be very tedious. There are several techniques for the automatic detection of clusters among the code vectors found by SOM, but they generally do not take into account the distribution of the code vectors; this may lead to unsatisfactory clustering and poor definition of cluster boundaries, particularly where the density of data points is low. In this paper, we propose the use of an adaptive heuristic particle swarm optimization (PSO) algorithm for finding cluster boundaries directly from the code vectors obtained from SOM. The application of our method to several standard data sets demonstrates its feasibility. The PSO algorithm utilizes the U-matrix of the SOM to determine cluster boundaries; the results of this novel automatic method compare very favorably with boundary detection by the traditional algorithms normally used to interpret SOM output, namely k-means and hierarchical clustering.
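
The U-matrix-based fitness function used by the proposed PSO is not reproduced in the abstract; as a hedged stand-in, the sketch below runs a standard global-best PSO over candidate cluster centres for a set of code vectors, minimizing the total distance to the nearest centre. The data, swarm size and coefficients are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(0)
code_vectors = rng.normal(size=(100, 2))           # stand-in for SOM code vectors
K, D, n_particles, iters = 3, 2, 20, 100

def fitness(centroids):
    # Stand-in objective: total distance of code vectors to their nearest centroid.
    d = np.linalg.norm(code_vectors[:, None, :] - centroids[None, :, :], axis=2)
    return d.min(axis=1).sum()

# Each particle encodes K candidate cluster centres (flattened into one vector).
pos = rng.normal(size=(n_particles, K * D))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p.reshape(K, D)) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5                          # inertia and acceleration weights
for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    f = np.array([fitness(p.reshape(K, D)) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best fitness:", pbest_f.min())
```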

Effect of Laser Power and Powder Flow Rate on Properties of Laser Metal Deposited Ti6Al4V

Laser Metal Deposition (LMD) is an additive manufacturing process whose capabilities include producing a new part directly from a three-dimensional Computer Aided Design (3D CAD) model, building a new part on an existing component, and repairing existing high-value components that would have been discarded in the past. Despite these capabilities and its advantages over other additive manufacturing techniques, the underlying physics of the LMD process is yet to be fully understood, probably because of the strong interaction between the processing parameters; studying many parameters at the same time makes the process even harder to understand. In this study, the effect of laser power and powder flow rate on the physical properties (deposition height and deposition width), metallurgical property (microstructure) and mechanical property (microhardness) of laser-deposited Ti6Al4V, the most widely used aerospace alloy, is studied. Because Ti6Al4V is very expensive and LMD is capable of reducing the buy-to-fly ratio of aerospace parts, the material utilization efficiency is also studied. Four sets of experiments were performed and repeated to establish repeatability, using laser powers of 1.8 kW and 3.0 kW and powder flow rates of 2.88 g/min and 5.67 g/min, while keeping the gas flow rate and scanning speed constant at 2 l/min and 0.005 m/s respectively. The deposition height and width are found to increase with increasing laser power and powder flow rate. Material utilization is favoured by higher laser power, while a higher powder flow rate reduces material utilization. The results are presented and fully discussed.

Kernel Matching versus Inverse Probability Weighting: A Comparative Study

A recent quasi-experimental evaluation of the Canadian Active Labour Market Policies (ALMP) by Human Resources and Skills Development Canada (HRSDC) has provided an opportunity to examine alternative methods for estimating the incremental effects of Employment Benefits and Support Measures (EBSMs) on program participants. The focus of this paper is to assess the efficiency and robustness of inverse probability weighting (IPW) relative to kernel matching (KM) in the estimation of program effects. To accomplish this objective, the authors compare 1,080 pairs of estimates, along with their associated standard errors, to assess which type of estimate is generally more efficient and robust. In the interest of practicality, the authors also document the computational time it took to produce the IPW and KM estimates, respectively.
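
The IPW estimator of the average treatment effect on the treated is straightforward to state; the following sketch, on synthetic data rather than the EBSM data, estimates propensity scores with a logistic model and reweights the comparison group (covariates, sample size and effect size are assumptions for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=(n, 3))                        # hypothetical covariates
p = 1 / (1 + np.exp(-(x @ [0.8, -0.5, 0.3])))      # true participation propensity
d = rng.binomial(1, p)                             # program participation indicator
y = 2.0 * d + x @ [1.0, 0.5, -0.2] + rng.normal(size=n)   # outcome, true effect = 2

ps = LogisticRegression().fit(x, d).predict_proba(x)[:, 1]

# IPW estimator of the average treatment effect on the treated (ATT):
# treated units get weight 1, comparison units get ps / (1 - ps).
w = np.where(d == 1, 1.0, ps / (1 - ps))
att_ipw = y[d == 1].mean() - np.average(y[d == 0], weights=w[d == 0])
print("IPW ATT estimate:", round(att_ipw, 3))
```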

A Fuzzy Logic Based Model to Predict Surface Roughness of A Machined Surface in Glass Milling Operation Using CBN Grinding Tool

Nowadays, the demand for high product quality focuses extensive attention on the quality of machined surfaces. CNC milling machines provide a wide variety of parameter set-ups, making the machining process on glass well suited to manufacturing complicated special products compared to other machining processes. The application of a grinding process on the CNC milling machine can be an ideal solution to improve product quality, but adopting the right machining parameters is required. In glass milling operations, several machining parameters are considered significant in affecting surface roughness. These parameters include the lubrication pressure, spindle speed, feed rate and depth of cut. In this research work, a fuzzy logic model is offered to predict the surface roughness of a machined surface in a glass milling operation using a CBN grinding tool. Four membership functions are allocated to each input of the model. The predicted results achieved via the fuzzy logic model are compared with the experimental results. The comparison demonstrated agreement between the fuzzy model and the experimental results, with 93.103% accuracy.
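
The paper's rule base and membership parameters are not given in the abstract; the sketch below is a zero-order Sugeno-style fuzzy predictor with hypothetical triangular sets over two of the four inputs, intended only to make the inference mechanics concrete (the actual model uses four membership functions per input and the CBN grinding data set):

```python
import numpy as np

def trimf(x, a, b, c):
    # Triangular membership function with feet a, c and peak b.
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def predict_roughness(spindle_speed, feed_rate):
    # Hypothetical fuzzy sets for two of the four inputs (illustration only).
    low_s, high_s = trimf(spindle_speed, 0, 1000, 2000), trimf(spindle_speed, 1000, 2000, 3000)
    low_f, high_f = trimf(feed_rate, 0, 50, 100), trimf(feed_rate, 50, 100, 150)
    # Zero-order Sugeno rules: each rule proposes a roughness value (hypothetical, in um).
    rules = [
        (min(low_s, high_f), 1.8),   # low speed, high feed  -> rougher surface
        (min(high_s, low_f), 0.4),   # high speed, low feed  -> smoother surface
        (min(low_s, low_f), 1.0),
        (min(high_s, high_f), 1.2),
    ]
    w = sum(r[0] for r in rules)
    return sum(r[0] * r[1] for r in rules) / (w + 1e-12)

print(predict_roughness(spindle_speed=1500, feed_rate=75))
```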

Multimedia Games for Elementary/Primary School Education and Entertainment

Computers are increasingly being used as educational tools in elementary/primary schools worldwide. A specific application of such computer use is that of multimedia games, where the aim is to combine pedagogy and entertainment. This study reports on a case study in which an educational multimedia game was developed for use by elementary school children. The stages of the application's design, implementation and evaluation are presented. The strengths of the game are identified and discussed, and its weaknesses are identified, allowing for suggestions for future redesigns. The results show that the use of games can engage children in the learning process for longer periods of time, with the added benefit of the entertainment factor.

Energy Efficient Resource Allocation in Distributed Computing Systems

The problem of mapping tasks onto a computational grid with the aim of minimizing the power consumption and the makespan, subject to the constraints of deadlines and architectural requirements, is considered in this paper. To solve this problem, we propose a solution from cooperative game theory based on the concept of the Nash Bargaining Solution. The proposed game-theoretic technique is compared against several traditional techniques. The experimental results show that when the deadline constraints are tight, the proposed technique outperforms the traditional techniques and remains competitive with the optimal solution.
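
As a toy illustration of the Nash Bargaining idea with two objectives, the sketch below enumerates task-to-machine mappings and selects the one maximizing the product of gains in makespan and energy over a disagreement point; the paper's grid model, deadlines and architectural constraints are not represented, and all numbers are hypothetical:

```python
import itertools, random

random.seed(0)
tasks = [random.uniform(1, 5) for _ in range(6)]          # hypothetical task lengths
machines = [(1.0, 1.0), (1.5, 2.0)]                       # (speed, power) per machine

def evaluate(assign):
    # Return (makespan, energy) of a task-to-machine assignment.
    finish = [0.0] * len(machines)
    energy = 0.0
    for t, m in zip(tasks, assign):
        speed, power = machines[m]
        runtime = t / speed
        finish[m] += runtime
        energy += runtime * power
    return max(finish), energy

mappings = list(itertools.product(range(len(machines)), repeat=len(tasks)))
results = [evaluate(a) for a in mappings]
# Disagreement point: the worst makespan and worst energy over all mappings.
d_mk, d_en = max(r[0] for r in results), max(r[1] for r in results)

def nash_product(r):
    # Product of gains over the disagreement point (both objectives are minimized).
    return max(d_mk - r[0], 0) * max(d_en - r[1], 0)

best = max(range(len(mappings)), key=lambda i: nash_product(results[i]))
print("chosen mapping:", mappings[best], "makespan/energy:", results[best])
```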

Septic B-spline Collocation Method for Solving One-dimensional Hyperbolic Telegraph Equation

It has recently been found that the telegraph equation is more suitable than the ordinary diffusion equation for modelling reaction-diffusion processes in several branches of science. In this paper, a numerical solution of the one-dimensional hyperbolic telegraph equation using the collocation method with septic splines is proposed. The scheme works in a similar fashion to finite difference methods. The scheme is validated on test problems by calculating the L2 and L∞ error norms. The accuracy of the presented method is demonstrated by two test problems. The numerical results are found to be in good agreement with the exact solutions.
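
The septic B-spline collocation scheme cannot be reconstructed from the abstract alone; as a hedged stand-in, the sketch below solves a telegraph test problem with a known exact solution using a conventional explicit central-difference scheme and reports the L2 and L∞ error norms used for validation. The coefficients, grid sizes and test solution are illustrative choices:

```python
import numpy as np

# Test problem: u_tt + 2*u_t + u = u_xx + f, with f(x,t) = exp(-t)*sin(x),
# on [0, pi] with exact solution u = exp(-t)*sin(x) and zero Dirichlet boundaries.
alpha, beta = 1.0, 1.0
nx, dx = 51, np.pi / 50
dt, nt = 1e-3, 1000
x = np.linspace(0.0, np.pi, nx)

u_prev = np.sin(x)                                   # u(x, 0)
u_curr = u_prev + dt * (-np.sin(x))                  # first-order start from u_t(x, 0)

for n in range(1, nt):
    t = n * dt
    u_next = np.empty_like(u_curr)
    u_next[0] = u_next[-1] = 0.0
    uxx = (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]) / dx**2
    f = np.exp(-t) * np.sin(x[1:-1])
    rhs = (uxx + f + 2 * u_curr[1:-1] / dt**2
           - u_prev[1:-1] / dt**2 + alpha * u_prev[1:-1] / dt
           - beta**2 * u_curr[1:-1])
    u_next[1:-1] = rhs / (1 / dt**2 + alpha / dt)
    u_prev, u_curr = u_curr, u_next

u_exact = np.exp(-nt * dt) * np.sin(x)
err = u_curr - u_exact
print("L2 error norm:  ", np.sqrt(dx * np.sum(err**2)))
print("Linf error norm:", np.max(np.abs(err)))
```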

Solving One-dimensional Hyperbolic Telegraph Equation Using Cubic B-spline Quasi-interpolation

In this paper, the telegraph equation is solved numerically by cubic B-spline quasi-interpolation. We obtain the numerical scheme by using the derivative of the quasi-interpolant to approximate the spatial derivative of the dependent variable and a low-order forward difference to approximate its temporal derivative. The advantage of the resulting scheme is that the algorithm is very simple and therefore very easy to implement. The results of numerical experiments are presented and compared with analytical solutions by calculating the L2 and L∞ error norms to confirm the good accuracy of the presented scheme.
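
The explicit coefficients of the cubic B-spline quasi-interpolant are not given in the abstract; the sketch below only illustrates the underlying idea of taking spatial derivatives from a spline fit of nodal values before advancing in time with a forward difference, using SciPy's interpolating cubic spline as a stand-in for the quasi-interpolant:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Approximate the spatial derivatives of sampled data with a cubic spline,
# standing in for the cubic B-spline quasi-interpolant used in the paper.
x = np.linspace(0.0, np.pi, 41)
u = np.exp(-0.5) * np.sin(x)                       # a sampled snapshot of u(x, t)

spline = CubicSpline(x, u)
ux = spline.derivative(1)(x)                       # first spatial derivative
uxx = spline.derivative(2)(x)                      # second spatial derivative

# One forward-difference (Euler) step in time for u_t = g(u, u_x, u_xx, ...):
dt = 1e-3
# u_next = u + dt * g(u, ux, uxx)                  # g depends on the PDE's first-order form

print(np.max(np.abs(uxx + u)))                     # check: (exp(-1/2) sin x)'' = -exp(-1/2) sin x
```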

Economic Factors Affecting Rice Export of Thailand

The primary purpose of this study was to assess how important economic factors, namely the Thai export price of white rice, the exchange rate, and world rice consumption, affect overall Thai white rice exports, using historical data for the period 1989-2013 from the Thai Rice Exporters Association and the Food and Agriculture Organization of the United Nations. The co-integration method, regression analysis, and an error correction model were applied to investigate the econometric model. The findings indicated that in the long run, world rice consumption, the exchange rate, and the Thai export price of white rice were, in that order, the important factors affecting the export quantity of Thai white rice, as indicated by their significant coefficients. Meanwhile, the rice export price was an important factor affecting the export quantity of Thai white rice in the short run. This information is useful for businesses, export opportunities, price competitiveness, and policymakers in Thailand.
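
A hedged sketch of the two-step Engle-Granger style analysis (a long-run co-integrating regression followed by an error correction model) on synthetic stand-in series; the actual Thai rice data and the paper's exact model specification are not reproduced here:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(7)
n = 25                                              # annual data, roughly 1989-2013
price = np.cumsum(rng.normal(size=n))               # stand-ins for the real series
fx = np.cumsum(rng.normal(size=n))
consumption = np.cumsum(rng.normal(size=n))
exports = 0.8 * consumption - 0.5 * price + 0.3 * fx + rng.normal(scale=0.5, size=n)

# Step 1 (Engle-Granger): long-run regression and a unit-root test on its residuals.
X = sm.add_constant(np.column_stack([price, fx, consumption]))
long_run = sm.OLS(exports, X).fit()
resid = long_run.resid
print("ADF p-value on residuals:", adfuller(resid, maxlag=2)[1])   # small p => co-integration

# Step 2: error correction model in first differences with the lagged residual.
dy = np.diff(exports)
dX = sm.add_constant(np.column_stack([np.diff(price), np.diff(fx),
                                      np.diff(consumption), resid[:-1]]))
ecm = sm.OLS(dy, dX).fit()
print(ecm.params)   # the last coefficient is the error-correction (adjustment) term
```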

A New Framework and a Model for Product Development with an Application in the Telecommunications Services Sector

This paper argues that a product development exercise involves, in addition to the conventional stages, several decisions regarding other aspects. These aspects should be addressed simultaneously in order to develop a product that responds to customer needs and helps realize the objectives of the stakeholders in terms of profitability, market share and the like. We present a framework that encompasses these different development dimensions. The framework shows that a product development methodology such as Quality Function Deployment (QFD) is the basic tool that allows definition of the target specifications of a new product. Creativity is the first dimension that enables the development exercise to come to life and end successfully; a number of group processes need to be followed by the development team in order to ensure enough creativity and innovation. Secondly, packaging is considered to be an important extension of the product. Branding strategies, quality and standardization requirements, identification technologies, design technologies, production technologies, and costing and pricing are also integral parts of the development exercise. These dimensions constitute the proposed framework. The paper also presents a mathematical model used to calculate the design targets based on the target costing principle. The framework is used to study a case of new product development in the telecommunications services sector.
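
A minimal numerical illustration of the target costing principle behind the design-target model, with hypothetical prices, margins and QFD-style importance weights (not the paper's case data):

```python
# Hypothetical numbers for illustration: a telecom service priced from market research.
target_price   = 30.0      # monthly fee the market will bear
target_margin  = 0.25      # required profit share of the price
allowable_cost = target_price * (1 - target_margin)   # 22.5 per subscriber per month

# QFD-style importance weights of the service features (hypothetical), used to
# allocate the allowable cost into per-feature design cost targets.
features = {"network capacity": 0.40, "customer support": 0.25,
            "billing platform": 0.20, "packaging/branding": 0.15}
cost_targets = {f: round(w * allowable_cost, 2) for f, w in features.items()}
print(cost_targets)
```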

Matrix Based Synthesis of EXOR dominated Combinational Logic for Low Power

This paper discusses a new, systematic approach to the synthesis of an NP-hard class of non-regenerative Boolean networks, described by FON[FOFF]={mi}[{Mi}], where for every mj[Mj]∈{mi}[{Mi}] there exists another mk[Mk]∈{mi}[{Mi}] such that their Hamming distance HD(mj, mk)=HD(Mj, Mk)=O(n), where 'n' represents the number of distinct primary inputs. The method automatically ensures exact minimization for certain important self-dual functions with 2^(n-1) points in their one-set. The elements meant for grouping are determined from a newly proposed weighted incidence matrix. Then the binary value corresponding to the candidate pair is correlated with the proposed binary value matrix to enable direct synthesis. We recommend algebraic factorization operations as a post-processing step to enable a reduction in literal count. The algorithm can be implemented in any high-level language and achieves the best cost optimization for the problem dealt with, irrespective of the number of inputs. For other cases, the method is iterated to subsequently reduce the problem to one of O(n-1), O(n-2), ... and then solved. In addition, it leads to optimal results for problems exhibiting a higher degree of adjacency, with a different interpretation of the heuristic, and the results are comparable with other methods. In terms of literal cost, at the technology-independent stage, the circuits synthesized using our algorithm enabled net savings over AOI (AND-OR-Invert) logic, AND-EXOR logic (EXOR Sum-of-Products or ESOP forms) and AND-OR-EXOR logic of 45.57%, 41.78% and 41.78% respectively for the various problems. Circuit-level simulations were performed for a wide variety of case studies at 3.3 V and 2.5 V supplies to validate the performance of the proposed method and the quality of the resulting synthesized circuits at two different voltage corners. Power estimation was carried out for a 0.35 micron TSMC CMOS process technology. In comparison with AOI logic, the proposed method enabled mean savings in power of 42.46%. With respect to AND-EXOR logic, the proposed method yielded power savings of 31.88%, while in comparison with AND-OR-EXOR level networks, average power savings of 33.23% were obtained.
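
The weighted incidence matrix and binary value matrix of the method are not reproduced in the abstract; the sketch below only illustrates the stated pairing criterion, identifying minterm pairs of a hypothetical ON-set whose Hamming distance equals n:

```python
from itertools import combinations

n = 4                                              # number of primary inputs
one_set = [0b0000, 0b1111, 0b0101, 0b1010]         # hypothetical ON-set minterms

def hamming(a, b):
    # Number of bit positions in which two minterms differ.
    return bin(a ^ b).count("1")

# Candidate pairs for grouping: minterms whose Hamming distance equals n,
# the pairing criterion stated for the class of functions the method targets.
pairs = [(a, b) for a, b in combinations(one_set, 2) if hamming(a, b) == n]
print([(format(a, f"0{n}b"), format(b, f"0{n}b")) for a, b in pairs])
```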

Influencing Attitude Change for Sustainability through Persuasion

Food mileage is one of the important issues concerning environmental sustainability. In this research we utilized a prototype platform with iterative user-centered testing. With these findings we demonstrate the use of persuasive methods in context to influence users' attitudes towards the concept of sustainability.

Geometric and Material Nonlinear Analysis of Reinforced Concrete Structure Considering Soil-Structure Interaction

In the present research, a finite element model is presented to study the geometric and material nonlinear behavior of reinforced concrete plane frames considering soil-structure interaction. The nonlinear behavior of concrete and reinforcing steel is considered both in compression and in tension, up to failure. The model also accounts for the number, diameter, and distribution of rebar along every cross section. Soil behavior is taken into consideration using four different models, namely the linear and nonlinear Winkler models and the linear and nonlinear continuum models. A computer program (NARC) was specially developed in order to perform the analysis. The results achieved by the present model show good agreement with both theoretical and experimental results in the published literature. The nonlinear behavior of a rectangular frame resting on soft soil up to failure, analysed using the proposed model, is presented for demonstration.

Analysis of the Ambient Media Approach of Advertisement Samples from the Adman Awards and Symposium under the Category of Outdoor and Ambience

This research studies the types of products and services that employ ambient media, and the respective techniques used in their advertisement materials. Data collection was done via analyses of a total of 62 advertisements that employed the ambient media approach in Thailand during the years 2004 to 2011. The 62 advertisements were qualifying entries of the Adman Awards & Symposium under the category of Outdoor & Ambience. The analysis results reveal that a total of 14 products and services choose to utilize ambient media in their advertisements. Amongst all ambient media techniques, 'intrusion', which uses the value of a medium in its representation of content, is employed most often. Following intrusion is 'interaction', where consumers are invited to participate and interact with the advertising materials. 'Illusion' ranks third in its ability to subject viewers to distortions of reality that make the division between reality and fantasy less clear.

Gender Perspective Considerations in Disasters like Earthquakes and Floods of Pakistan

For many decades, human beings have suffered from a plethora of natural disasters. The occurrence of disasters is a frequent process, and it changes conceptual myths as more and more advancements are made. Although we live in a technological era, in developing countries like Pakistan the impact of disasters is shaped by socially constructed roles. The need is to understand the most vulnerable group of society, i.e. females; their issues are complex in nature because of their undermined gender status in society. There is a need to identify as many issues regarding females as possible and to enhance the achievement of the Millennium Development Goals (MDGs). Gender issues are of great concern all around the globe, including in Pakistan, where female visibility in society is low. During disasters, there is a failure to understand the reality of women's double burden of productive and reproductive care. Women contribute a great deal to society, so we need to make them more disaster resilient. For this, non-structural measures such as awareness, training and education must be carried out. In both rural and urban settings, in any disaster like an earthquake or flood, elements such as gender, age, physical health and demographic factors contribute towards vulnerability. In Pakistan, gender issues in disasters received little attention before the 2005 earthquake and the 2010 floods. Significant achievements were made after the 2010 floods, when a gender and child cell was created to provide facilities to women and girls. The aim of this study is to highlight all the facilities necessary in a disaster to build coping mechanisms in females, from basic rights to an advanced level including education.

A Technique for Improving the Performance of Median Smoothers at the Corners Characterized by Low Order Polynomials

Median filters with larger windows offer greater smoothing and are more robust than the median filters of smaller windows. However, the larger median smoothers (the median filters with the larger windows) fail to track low order polynomial trends in the signals. Due to this, constant regions are produced at the signal corners, leading to the loss of fine details. In this paper, an algorithm, which combines the ability of the 3-point median smoother in preserving the low order polynomial trends and the superior noise filtering characteristics of the larger median smoother, is introduced. The proposed algorithm (called the combiner algorithm in this paper) is evaluated for its performance on a test image corrupted with different types of noise and the results obtained are included.
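
The exact combining rule is not given in the abstract; the sketch below computes a 3-point and a 9-point median smoother on a one-dimensional test signal and, as an assumed selection rule for illustration, keeps the small-window output wherever the two smoothers disagree strongly, which flags a corner being flattened:

```python
import numpy as np
from scipy.signal import medfilt

# Test signal: a triangular peak (a corner formed by low-order polynomial trends)
# corrupted with sparse impulsive noise.
x = np.concatenate([np.linspace(0.0, 1.0, 50), np.linspace(1.0, 0.0, 50)])
noisy = x.copy()
noisy[::10] += 0.8                                  # impulses at every 10th sample

small = medfilt(noisy, kernel_size=3)               # tracks the trend, keeps the peak
large = medfilt(noisy, kernel_size=9)               # smooths more, but clips the corner

# Assumed combining rule (illustration only, not the paper's exact algorithm):
# keep the large-window output except where it departs from the 3-point output,
# which signals that a corner is being flattened.
threshold = 0.02
combined = np.where(np.abs(large - small) > threshold, small, large)
print("mean abs error:", np.abs(combined - x).mean(), "vs", np.abs(large - x).mean())
```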

Determination of Constant Coefficients to Relate Total Dissolved Solids to Electrical Conductivity

Salinity is a measure of the amount of salts in water. Total Dissolved Solids (TDS), as a salinity parameter, is often determined using laborious and time-consuming laboratory tests, but it may be more appropriate and economical to develop a method which uses a simpler soil salinity index. Because dissolved ions increase salinity as well as conductivity, the two measures are related. The aim of this research was to determine constant coefficients for predicting Total Dissolved Solids (TDS) from Electrical Conductivity (EC), evaluated using the correlation coefficient, root mean square error, maximum error, mean bias error, mean absolute error, relative error and coefficient of residual mass. For this purpose, two experimental areas (S1, S2) in Khuzestan province, Iran, were selected, and four treatments with three replications were applied using series of double rings. The treatments consisted of 25 cm, 50 cm, 75 cm and 100 cm water applications. The results showed that the values 16.3 and 12.4 were the best constant coefficients for predicting TDS from EC in pilots S1 and S2, with correlation coefficients of 0.977 and 0.997 and root mean square errors (RMSE) of 191.1 and 106.1, respectively.
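
A short numerical illustration of the linear salinity model TDS = k * EC with the reported coefficients 16.3 and 12.4, evaluated with some of the error statistics listed above; the paired EC-TDS measurements below are hypothetical, not the Khuzestan field data:

```python
import numpy as np

# Hypothetical paired measurements (the paper's field data are not reproduced here).
ec = np.array([1.2, 2.5, 4.0, 6.3, 8.1])            # electrical conductivity
tds_observed = np.array([20.1, 40.8, 66.0, 101.5, 133.0])

def evaluate(k, ec, tds):
    # Linear salinity model TDS = k * EC and some of the error statistics used above.
    pred = k * ec
    rmse = np.sqrt(np.mean((pred - tds) ** 2))       # root mean square error
    mbe = np.mean(pred - tds)                        # mean bias error
    mae = np.mean(np.abs(pred - tds))                # mean absolute error
    r = np.corrcoef(pred, tds)[0, 1]                 # correlation coefficient
    return rmse, mbe, mae, r

for k in (16.3, 12.4):                               # coefficients reported for S1 and S2
    print(k, evaluate(k, ec, tds_observed))
```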

Computer-Based Assessment of Pre-assigned Individual Education Plans in Special Education

Assessment of the IEP (Individual Education Plan) is an important stage in the area of special education. This paper deals with this problem by introducing computer software which processes the data gathered from the application of IEPs. The software is intended to be used by special education institutions in Turkey and allows assessment of school and family trainings. The software has a user-friendly interface, and its design includes graphical developer tools.

Fuzzy Fingerprint Vault using Multiple Polynomials

The fuzzy fingerprint vault is a recently developed cryptographic construct based on the polynomial reconstruction problem, used to secure critical data with fingerprint data. However, previous approaches are not applicable to fingerprints having few minutiae, since they use a fixed degree of the polynomial without considering the number of fingerprint minutiae. To solve this problem, we use an adaptive degree of the polynomial that takes into account the number of minutiae extracted from each user. Also, we apply multiple polynomials to avoid the possible degradation of security of a simple solution (i.e., using a low-degree polynomial). Based on the experimental results, our method makes the possible attack 2^192 times harder than using a single low-degree polynomial, while still being able to verify users having few minutiae.
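
A hedged sketch of the vault-locking step with a minutia-dependent polynomial degree; the field size, degree rule, chaff count and minutia encodings are illustrative assumptions, and the unlocking (polynomial reconstruction) step is omitted:

```python
import random

P = 65537                                           # small prime field for illustration

def lock_vault(minutiae, secret_coeffs, n_chaff=40):
    # Genuine points lie on the secret polynomial; chaff points deliberately do not.
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(secret_coeffs)) % P
    genuine = [(m, poly(m)) for m in minutiae]
    chaff = []
    while len(chaff) < n_chaff:
        x, y = random.randrange(P), random.randrange(P)
        if x not in minutiae and y != poly(x):
            chaff.append((x, y))
    vault = genuine + chaff
    random.shuffle(vault)
    return vault

minutiae = [101, 2050, 3333, 4777, 5120, 6001, 7919, 9001]   # hypothetical encodings
# Adaptive degree: fewer minutiae -> lower-degree polynomial so decoding stays feasible.
degree = max(2, len(minutiae) // 3)
secret = [random.randrange(P) for _ in range(degree + 1)]    # secret = the coefficients
vault = lock_vault(minutiae, secret)
print("vault size:", len(vault), "polynomial degree:", degree)
```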