Abstract: In Australia, cycling for exercise, recreation, and commuting to work is the fourth most popular physical activity, with a 10.6% increase in participants between 2001 and 2007. As with other means of transport, accidents and injuries remain common even though mandatory bicycle helmet wearing has been introduced. This research aims to develop a face surrogate, made of a sandwich of rigid foam and rubber sheets, to represent human facial bone under blunt impact. The facial surrogate will serve as an important test device for the further development of facial-impact protection for cyclists. A test procedure was developed to simulate the impact energy and to record data for evaluating the effect of impact on facial bones. Drop tests were performed to establish a suitable combination of materials. It was found that a sandwich structure of rigid extruded-polystyrene foam (density of 40 kg/m3 with a pattern of 6-mm holes), Neoprene rubber sponge, and Abrasaflex rubber backing had impact characteristics comparable to those of human facial bone. In particular, foam thicknesses of 30 mm and 25 mm were found suitable to represent the human zygoma (cheekbone) and maxilla (upper-jaw bone), respectively.
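As a rough guide to how such drop tests relate impact energy to drop settings, the sketch below computes the ideal free-fall impact energy E = mgh; the drop mass and height are hypothetical values for illustration, not the paper's test configuration.

def impact_energy_j(mass_kg, drop_height_m, g=9.81):
    """Kinetic energy at impact for an ideal free drop (E = m*g*h)."""
    return mass_kg * g * drop_height_m

# Hypothetical settings: a 5 kg drop mass released from 0.5 m delivers ~24.5 J.
print(impact_energy_j(5.0, 0.5))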
Abstract: In this paper we analyze the application of a formal proof system to the discrete logarithm problem used in public-key cryptography. Specifically, we explore a computer verification of the ElGamal encryption scheme with the formal proof system Isabelle/HOL; the functional correctness of this algorithm is formally verified with computer support. In addition, we present a formalization of the DSA signature scheme in the Isabelle/HOL system. We show that this scheme is correct, which is a necessary condition for the usefulness of any cryptographic signature scheme.
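The functional correctness property verified in the paper, namely that decryption inverts encryption, can be illustrated with a toy Python sketch of ElGamal; the parameters below are illustrative only (real systems use large primes and a proper group generator).

import random

# Toy parameters for illustration only.
p, g = 467, 2

x = random.randrange(2, p - 1)        # private key
y = pow(g, x, p)                      # public key y = g^x mod p

def encrypt(m, y):
    k = random.randrange(2, p - 1)    # ephemeral randomness
    return pow(g, k, p), (m * pow(y, k, p)) % p

def decrypt(c1, c2, x):
    s = pow(c1, x, p)                 # shared secret g^(x*k)
    return (c2 * pow(s, p - 2, p)) % p  # multiply by s^-1 (Fermat inverse)

m = 123
assert decrypt(*encrypt(m, y), x) == m  # functional correctness: Dec(Enc(m)) = m
print("ok")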
Abstract: In China, rapid urbanization, industrialization, and highly accelerated economic development have resulted in the degradation of water resources. Water quality deterioration usually results from eutrophication, so treating this type of polluted water efficiently is an urgent task. Unlike traditional technologies, constructed wetlands are effective treatment systems that can be very useful because of their technical simplicity and low operational cost. A pilot-scale treatment system including constructed wetlands was constructed at XingYun Lake, Yuxi, China, and operated as a primary treatment measure before eutrophic lake water was drained to the riverine landscape. Water quality indices were determined during the experiment. The results indicated that removal efficiencies were high for nitrate nitrogen, chlorophyll-a, and algae, with final removal efficiencies reaching 95.20%, 93.33%, and 99.87%, respectively, whereas the removal efficiencies of total phosphorus and total nitrogen reached only 68.83% and 50.00%, respectively.
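For reference, removal efficiencies of this kind follow the standard definition based on inflow and outflow concentrations; the minimal Python sketch below uses hypothetical concentration values (chosen so the result matches the chlorophyll-a figure above).

def removal_efficiency(c_in, c_out):
    """Percentage removal between inflow and outflow concentrations."""
    return 100.0 * (c_in - c_out) / c_in

# Hypothetical inflow/outflow values giving ~93.33% removal.
print(removal_efficiency(30.0, 2.0))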
Abstract: For a generalized Hermite sinusoidal/hyperbolic Gaussian beam passing through an ABCD optical system with a finite aperture, the propagation properties are derived using the Collins integral. The results are presented as intensity graphs, indicating that previously demonstrated rules of reciprocity remain applicable, while the presence of the aperture accelerates this transformation.
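The abstract does not reproduce the integral itself; for orientation, the Collins diffraction integral in its common one-dimensional form (up to a constant phase factor, and with sign conventions that vary across texts) is

\[
E_2(x_2) = \frac{1}{\sqrt{i\lambda B}} \int_{-a}^{a} E_1(x_1)\, \exp\!\left[\frac{ik}{2B}\left(A x_1^2 - 2 x_1 x_2 + D x_2^2\right)\right] \mathrm{d}x_1,
\]

where A, B, and D are elements of the system's ABCD ray matrix and the finite limits ±a model the hard aperture.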
Abstract: This study presents a general methodology to predict the performance of a continuous near-critical fluid extraction process for removing compounds from aqueous solutions using hollow fiber membrane contactors. A comprehensive 2D mathematical model was developed to study the Porocritical extraction process. The system studied in this work is a membrane-based extractor of ethanol and acetone from aqueous solutions using near-critical CO2. Predictions of extraction percentages obtained by simulation have been compared to the experimental values reported by Bothun et al. [5]. Simulated extraction percentages of ethanol and acetone show average differences of 9.3% and 6.5%, respectively, from the experimental data. The more accurate predictions for acetone could be explained by a better estimation of the transport properties in the aqueous phase, which controls the extraction of this solute.
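The abstract does not state the model equations; a typical steady-state formulation for solute transport in the fiber lumen of a hollow fiber contactor (an assumption for illustration, not necessarily the paper's exact 2D model) is

\[
D\left(\frac{\partial^2 C}{\partial r^2} + \frac{1}{r}\frac{\partial C}{\partial r}\right) = v_z(r)\,\frac{\partial C}{\partial z}, \qquad v_z(r) = 2\bar{v}\left[1 - \left(\frac{r}{R}\right)^2\right],
\]

where C is the solute concentration, D its diffusion coefficient in the aqueous phase, and v_z(r) the fully developed laminar velocity profile inside a fiber of inner radius R.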
Abstract: The distillation column is one of the most common operations in the process industries and, at the same time, one of the most energy-intensive units. Many ideas have been presented in the related literature for optimizing energy consumption in distillation columns. This paper studies different heat integration methods in a distillation column that separates benzene, toluene, xylene, and C9+. Three heat integration schemes, the indirect sequence (IQ), the indirect sequence with forward energy integration (IQF), and the indirect sequence with backward energy integration (IQB), have been studied. Using the shortcut method, these heat integration schemes were simulated with Aspen HYSYS software and compared with each other with regard to economic considerations. The results show that energy consumption is reduced by 33% in IQF and by 28% in IQB in comparison with the IQ scheme. The economic results show that the total annual cost is reduced by 12% in IQF and by 8% in IQB relative to the IQ scheme. The IQF scheme is therefore more economical than either the IQB or the IQ scheme.
Abstract: Today's technology is heavily dependent on web applications, which are being accepted by users at a very rapid pace and have made our work more efficient. They include webmail, online retail, online gaming, wikis, train and flight departure and arrival boards, and many more. They are developed in languages such as PHP, Python, C#, and ASP.NET, together with client-side technologies such as HTML and JavaScript. Attackers develop tools and techniques to exploit web applications and legitimate websites. This has led to the rise of web application security, which can be broadly classified into declarative security and program security. The most common attacks on applications are SQL injection and XSS, which give access to unauthorized users who can damage or destroy the system. This paper presents a detailed literature review and analysis of web application security, examples of attacks, and steps to mitigate the vulnerabilities.
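As a minimal illustration of the SQL injection class of attack and its standard mitigation, the following Python/sqlite3 sketch (table and values are hypothetical) contrasts a string-built query with a parameterized one.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

payload = "' OR '1'='1"  # classic injection string

# Vulnerable: string concatenation lets the payload rewrite the WHERE clause.
rows = conn.execute("SELECT * FROM users WHERE name = '%s'" % payload).fetchall()
print(len(rows))  # 1 -- the injected condition matched every row

# Mitigated: a parameterized query treats the payload as a literal value.
rows = conn.execute("SELECT * FROM users WHERE name = ?", (payload,)).fetchall()
print(len(rows))  # 0 -- no user is literally named "' OR '1'='1"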
Abstract: The present paper discusses the selection of process parameters for obtaining an optimal nanocrystallite size in the CuO-ZrO2 catalyst. Several parameters alter the inorganic structure by influencing the hydrolysis and condensation reactions. A statistical design-of-experiments method is implemented in order to optimize the experimental conditions of CuO-ZrO2 nanoparticle preparation, applying the standard L16 orthogonal array to the experiments. The crystallite size is taken as the response index and is analyzed as the parameters are varied. The effects of pH, H2O/precursor molar ratio (R), calcination time and temperature, chelating agent, and alcohol volume are investigated in particular among all the parameters. According to the Taguchi results, temperature has the greatest impact on particle size. The pH and the H2O/precursor molar ratio have a low influence compared with temperature, while the alcohol volume and the time have almost no effect compared with all other parameters. Temperature also influences the morphology and the amorphous structure of zirconia. The optimal conditions are determined using the Taguchi method. The nanocatalyst is characterized by DTA-TG, XRD, EDS, SEM, and TEM. The results of this research indicate that it is possible to vary the structure, morphology, and properties of the sol-gel product by controlling the above-mentioned parameters.
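For readers unfamiliar with the Taguchi analysis referenced above, factor rankings are typically based on signal-to-noise ratios; since a smaller crystallite size is desired here, the smaller-the-better form applies. A minimal Python sketch follows (the replicate sizes are hypothetical, not the paper's data).

import numpy as np

def sn_smaller_is_better(y):
    """Taguchi smaller-the-better S/N ratio in dB: -10*log10(mean(y^2))."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Hypothetical crystallite sizes (nm) from replicates of one L16 run.
print(sn_smaller_is_better([12.1, 11.4, 13.0]))  # higher S/N = smaller size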
Abstract: The self-organizing map (SOM) is a well-known data reduction technique used in data mining. It can reveal structure in
data sets through data visualization that is otherwise hard to detect
from raw data alone. However, interpretation through visual
inspection is prone to errors and can be very tedious. There are
several techniques for the automatic detection of clusters of code
vectors found by SOM, but they generally do not take into account
the distribution of code vectors; this may lead to unsatisfactory
clustering and poor definition of cluster boundaries, particularly
where the density of data points is low. In this paper, we propose the
use of an adaptive heuristic particle swarm optimization (PSO)
algorithm for finding cluster boundaries directly from the code
vectors obtained from SOM. The application of our method to
several standard data sets demonstrates its feasibility. The PSO algorithm utilizes the so-called U-matrix of the SOM to determine cluster boundaries; the results of this novel automatic method compare very favorably with boundary detection through traditional algorithms, namely k-means and hierarchical clustering, which are normally used to interpret the output of a SOM.
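For concreteness, the U-matrix referred to above assigns each SOM node the average distance between its code vector and those of its grid neighbours, so ridges of high values mark candidate cluster boundaries. A minimal sketch (using a random stand-in codebook, not data from the paper) is:

import numpy as np

def u_matrix(codebook):
    """Mean distance from each code vector to its 4-connected grid
    neighbours; high values mark likely cluster boundaries."""
    rows, cols, dim = codebook.shape
    u = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            dists = []
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    dists.append(np.linalg.norm(codebook[i, j] - codebook[ni, nj]))
            u[i, j] = np.mean(dists)
    return u

# Hypothetical 10x10 SOM codebook with 3-dimensional code vectors.
print(u_matrix(np.random.rand(10, 10, 3)).shape)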
Abstract: Kernel functions, which allow the formulation of nonlinear variants of any algorithm that can be cast in terms of dot products, have enabled Support Vector Machines (SVM) to be applied successfully in many fields, e.g., classification and regression. The importance of the kernel has motivated many studies on its composition. It is well known that the reproducing kernel (R.K) is a useful kernel function which possesses many properties, e.g., positive definiteness, the reproducing property, and the ability to compose complex R.K.s by simple operations. There are two popular ways to compute an R.K in explicit form. One is to construct and solve a specific differential equation with boundary values, whose drawback is that it cannot yield a unified form of the R.K. The other uses a piecewise integral of the Green function associated with a differential operator L. The latter facilitates the computation of an R.K with a unified explicit form and supports theoretical analysis, but it has been studied only relatively recently and practical computations are fewer. In this paper, a new algorithm for computing an R.K is presented. It can obtain the unified explicit form of the R.K in a general reproducing kernel Hilbert space (RKHS), avoids constructing and solving complex differential equations manually, and enables an automatic, flexible, and rigorous computation for more general RKHSs. In order to validate that the R.K computed by the algorithm works well in SVMs, some illustrative examples and a comparison between the R.K and the Gaussian kernel (RBF) in support vector regression are presented. The results show that the performance of the R.K is close to, or slightly better than, that of the RBF.
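To illustrate how an explicit R.K can be plugged into support vector regression, the sketch below uses the textbook reproducing kernel k(x, y) = 1 + min(x, y) of one Sobolev space W2^1[0,1] (chosen here for illustration; it is not the kernel computed by the paper's algorithm) as a callable kernel in scikit-learn's SVR, alongside the RBF baseline.

import numpy as np
from sklearn.svm import SVR

def rk_kernel(X, Y):
    """Gram matrix of k(x, y) = 1 + min(x, y) for 1-D inputs in [0, 1]."""
    return 1.0 + np.minimum(X, Y.T)

x = np.linspace(0, 1, 50).reshape(-1, 1)
y = np.sin(2 * np.pi * x).ravel()

for name, kernel in (("R.K", rk_kernel), ("RBF", "rbf")):
    model = SVR(kernel=kernel).fit(x, y)
    print(name, round(model.score(x, y), 3))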
Abstract: Laser Metal Deposition (LMD) is an additive manufacturing process whose capabilities include producing a new part directly from a 3-Dimensional Computer-Aided Design (3D CAD) model, building a new part on an existing component, and repairing existing high-value components that would have been discarded in the past. Despite these capabilities and its advantages over other additive manufacturing techniques, the underlying physics of the LMD process is not yet fully understood, probably because of the strong interaction between the processing parameters; studying many parameters at the same time makes the process even more complex to understand. In this study, the effects of laser power and powder flow rate on the physical properties (deposition height and deposition width), metallurgical property (microstructure), and mechanical property (microhardness) of laser-deposited Ti6Al4V, the most widely used aerospace alloy, are studied. Because Ti6Al4V is very expensive and LMD is capable of reducing the buy-to-fly ratio of aerospace parts, the material utilization efficiency is also studied. Four sets of experiments were performed and repeated to establish repeatability, using laser powers of 1.8 kW and 3.0 kW and powder flow rates of 2.88 g/min and 5.67 g/min, while keeping the gas flow rate and scanning speed constant at 2 l/min and 0.005 m/s, respectively. The deposition height and width are found to increase with increasing laser power and powder flow rate. Material utilization is favoured by higher laser power, while a higher powder flow rate reduces it. The results are presented and fully discussed.
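One common way to quantify the material utilization efficiency discussed above is the ratio of deposited mass to the powder mass fed during the deposition time; the sketch below uses hypothetical numbers, not the study's measurements.

def material_utilization(deposited_mass_g, powder_rate_g_min, time_min):
    """Fraction of the fed powder that ends up in the deposit."""
    return deposited_mass_g / (powder_rate_g_min * time_min)

# Hypothetical: 4.2 g deposited in 2 min at a 2.88 g/min powder flow rate.
print(material_utilization(4.2, 2.88, 2.0))  # ~0.73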
Abstract: Medical images require special safety and confidentiality because critical judgments are made on the information they provide. Transmission of medical images via the internet or mobile phones demands strong security and copyright protection in telemedicine applications. Here, a highly secure and robust watermarking technique is proposed for the transmission of image data via the internet and mobile phones. The Region of Interest (ROI) and Non-Region of Interest (RONI) of the medical image are separated, and only the RONI is used for watermark embedding. This technique achieves exact recovery of the watermark for standard 512x512 medical database images, giving a correlation factor equal to 1. The correlation factor for different attacks such as noise addition, filtering, rotation, and compression ranges from 0.90 to 0.95. The PSNR with a weighting factor of 0.02 is up to 48.53 dB. The presented scheme is non-blind and embeds a hospital logo of size 64x64.
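The two quality metrics quoted above, PSNR and the correlation factor, have standard definitions; the numpy sketch below uses one common normalized-correlation form (the paper's exact metric may differ) and random stand-in images.

import numpy as np

def psnr(original, marked, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((original.astype(float) - marked.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def correlation_factor(w, w_ext):
    """Normalized correlation between embedded and extracted watermarks."""
    w, w_ext = w.ravel().astype(float), w_ext.ravel().astype(float)
    return (w @ w_ext) / np.sqrt((w @ w) * (w_ext @ w_ext))

img = np.random.randint(0, 256, (512, 512))
marked = np.clip(img + np.random.randint(-2, 3, img.shape), 0, 255)
print(psnr(img, marked))             # high PSNR = little visible change
print(correlation_factor(img, img))  # identical arrays give exactly 1.0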
Abstract: A recent quasi-experimental evaluation of the Canadian Active Labour Market Policies (ALMP) by Human Resources and Skills Development Canada (HRSDC) has provided an opportunity to examine alternative methods of estimating the incremental effects of Employment Benefits and Support Measures (EBSMs) on program participants. The focus of this paper is to assess the efficiency and robustness of inverse probability weighting (IPW) relative to kernel matching (KM) in the estimation of program effects. To accomplish this objective, the authors compare pairs of 1,080 estimates, along with their associated standard errors, to assess which type of estimate is generally more efficient and robust. In the interest of practicality, the authors also document the computational time it took to produce the IPW and KM estimates, respectively.
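As background on the IPW estimator being compared, a minimal Python sketch on synthetic data follows; the data-generating process and effect size are invented for illustration, and the paper's EBSM analysis is far richer.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=(n, 2))
t = rng.binomial(1, 1.0 / (1.0 + np.exp(-x[:, 0])))  # confounded treatment
y = 1.5 * t + x[:, 0] + rng.normal(size=n)           # true effect = 1.5

e = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]  # propensity scores

# Normalized (Hajek) IPW estimate of the average treatment effect.
ate = (np.average(y[t == 1], weights=1.0 / e[t == 1])
       - np.average(y[t == 0], weights=1.0 / (1.0 - e[t == 0])))
print(ate)  # close to 1.5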
Abstract: R&D risk management has been suggested as one of the management approaches for accomplishing the goals of public R&D investment. Investment in basic science and core technology development is an essential role of government in securing the social base needed for continuous economic growth. It is also an important role of the science and technology policy sector to create a positive environment in which the outcomes of public R&D can be diffused in a stable fashion, by controlling in advance the uncertainties and risk factors that may arise when such achievements are applied to society and industry. Various policies have already been implemented to manage uncertainties and variables that may have a negative impact on accomplishing public R&D investment goals. However, new policy measures for complementing the existing policies and for exploring future directions may be derived by analyzing them as a policy package from the viewpoint of R&D risk management.
Abstract: Text mining applies knowledge discovery techniques to unstructured text and is also termed knowledge discovery in text (KDT) or text data mining. In neural networks that address classification problems, the training set, the testing set, and the learning rate are key elements: the collections of input/output patterns used to train the network and to assess its performance, and the rate at which weight adjustments are made. This paper describes a proposed back-propagation neural network classifier that performs cross-validation on the original neural network in order to optimize classification accuracy and reduce training time. The feasibility and benefits of the proposed approach are demonstrated by means of five data sets: contact-lenses, cpu, weather symbolic, weather, and labor-nega-data. It is shown that, compared to the existing neural network, training is more than 10 times faster when the data set is larger than cpu or the network has many hidden units, while accuracy ('percent correct') was the same for all data sets except contact-lenses, which is the only one with missing attributes. For contact-lenses, the accuracy of the proposed neural network was on average around 0.3% lower than that of the original neural network. This algorithm is independent of specific data sets, so many of its ideas and solutions can be transferred to other classifier paradigms.
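A minimal scikit-learn sketch of the kind of k-fold cross-validation of a back-propagation classifier described above follows; the iris data set is a stand-in for the Weka data sets named in the paper, and the network settings are illustrative.

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

clf = MLPClassifier(hidden_layer_sizes=(10,), learning_rate_init=0.01,
                    max_iter=2000, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
print(scores.mean())  # mean 'percent correct' across folds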
Abstract: Computers are increasingly being used as educational tools in elementary/primary schools worldwide. A specific application of such computer use is multimedia games, where the aim is to combine pedagogy and entertainment. This study reports on a case study in which an educational multimedia game was developed for use by elementary school children. The stages of the application's design, implementation, and evaluation are presented. The strengths of the game are identified and discussed, and its weaknesses are identified, allowing for suggestions for future redesigns. The results show that the use of games can engage children in the learning process for longer periods of time, with the added benefit of the entertainment factor.
Abstract: The present paper develops and validates a numerical procedure for the calculation of turbulent combustive flow in converging and diverging ducts; through simulation of the heat transfer processes, the production and spread of the NOx pollutant are computed. A marching integration solution procedure employing the TDMA is used to solve the discretized equations. The turbulence model is the Prandtl mixing-length method. The combustion process is modeled using the Arrhenius and eddy-dissipation methods, and the thermal mechanism is used to model the formation of nitrogen oxides. The finite difference method and the Genmix numerical code are used for the numerical solution of the equations. Our results indicate the important influence of the limiting divergence angle of the diffuser on the pressure recovery coefficient. Moreover, because the NOx pollutant depends strongly on the maximum temperature in the domain, the NOx level is also at its maximum under this condition.
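The TDMA mentioned above is the Thomas algorithm for tridiagonal systems; a minimal Python sketch follows, applied to a generic tridiagonal test system rather than the paper's discretized equations.

import numpy as np

def tdma(a, b, c, d):
    """Thomas algorithm: a is the sub-diagonal, b the diagonal,
    c the super-diagonal, d the right-hand side."""
    n = len(d)
    cp, dp = np.zeros(n), np.zeros(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Test: tridiag(-1, 2, -1) x = 1 on a 5-point grid (a 1D Poisson-like system).
n = 5
a = np.full(n, -1.0); a[0] = 0.0
c = np.full(n, -1.0); c[-1] = 0.0
print(tdma(a, np.full(n, 2.0), c, np.ones(n)))  # [2.5, 4.0, 4.5, 4.0, 2.5]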
Abstract: The problem of mapping tasks onto a computational grid with the aim of minimizing the power consumption and the makespan, subject to the constraints of deadlines and architectural requirements, is considered in this paper. To solve this problem, we propose a solution from cooperative game theory based on the concept of the Nash Bargaining Solution. The proposed game-theoretical technique is compared against several traditional techniques. The experimental results show that when the deadline constraints are tight, the proposed technique achieves superior performance over the traditional techniques and remains competitive with the optimal solution.
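The core of the Nash Bargaining Solution is to pick, from the feasible set, the point maximizing the product of each objective's gain over a disagreement point. The toy Python sketch below applies this to a 3-task, 2-machine mapping with invented power and time costs; the deadline and architectural constraints treated in the paper are omitted.

import itertools
import numpy as np

# Hypothetical per-(task, machine) costs for 3 tasks on 2 machines.
power = np.array([[2.0, 3.5], [1.5, 2.0], [3.0, 1.0]])  # energy cost
time = np.array([[4.0, 2.0], [3.0, 3.5], [2.0, 5.0]])   # execution time

def objectives(mapping):
    total_power = sum(power[task, m] for task, m in enumerate(mapping))
    makespan = max(sum(time[task, m] for task, m in enumerate(mapping) if m == mach)
                   for mach in range(2))
    return total_power, makespan

mappings = list(itertools.product(range(2), repeat=3))
# Disagreement point: the worst value of each objective over all mappings.
d = tuple(max(objectives(m)[k] for m in mappings) for k in range(2))
# Nash Bargaining Solution: maximize the product of gains over d.
best = max(mappings, key=lambda m: (d[0] - objectives(m)[0]) * (d[1] - objectives(m)[1]))
print(best, objectives(best))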
Abstract: This paper presents a comparative emission study of a newly introduced gasoline/LPG bi-fuel automotive engine in the Indian market. Emissions were tested as per the LPG Bharat Stage III driving cycle, covering both the urban cycle and the extra-urban cycle; the total time for the urban and extra-urban cycles was 1180 s. The engine was run in LPG mode using a conversion system. Emissions were tested as per the standard procedure and were compared; corrected emissions were computed by deducting the ambient reading from the sample reading. The paper describes the detailed emission test procedure and the results obtained. CO emissions were in the range of 38.9 to 111.3 ppm, HC emissions were in the range of 18.2 to 62.6 ppm, NOx emissions were 0.8 to 3.9 ppm, and CO2 emissions were from 6719.2 to 8051 ppm. The paper throws light on the emission results of LPG vehicles recently introduced in the Indian automobile market. The objectives of this experimental study were to measure the emissions of the engine in gasoline and LPG modes and to compare them.
Abstract: In this paper, the telegraph equation is solved numerically by cubic B-spline quasi-interpolation. We obtain the numerical scheme by using the derivative of the quasi-interpolation to approximate the spatial derivative of the dependent variable and a low-order forward difference to approximate the temporal derivative of the dependent variable. The advantage of the resulting scheme is that the algorithm is very simple, so it is very easy to implement. The results of numerical experiments are presented and compared with analytical solutions by calculating the errors in the L2 and L∞ norms, confirming the good accuracy of the presented scheme.
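For reference, the second-order telegraph equation is commonly written in one dimension (a standard form assumed here; the paper's exact coefficients may differ) as

\[
\frac{\partial^2 u}{\partial t^2} + 2\alpha \frac{\partial u}{\partial t} + \beta^2 u = \frac{\partial^2 u}{\partial x^2} + f(x,t), \qquad \alpha, \beta > 0,
\]

so the scheme approximates the spatial derivative on the right-hand side with the derivative of the cubic B-spline quasi-interpolant and the temporal derivatives with forward differences.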