Abstract: The distillation column is one of the most common operations in the process industries and, at the same time, the most expensive unit in terms of energy consumption. Many ideas have been presented in the related literature for optimizing energy consumption in distillation columns. This paper studies different heat integration methods in a distillation column which separates benzene, toluene, xylene, and C9+. Three heat integration schemes, the indirect sequence (IQ), the indirect sequence with forward energy integration (IQF), and the indirect sequence with backward energy integration (IQB), are studied. Using the shortcut method, these heat integration schemes were simulated in Aspen HYSYS and compared with each other with regard to economic considerations. The results show that energy consumption is reduced by 33% in IQF and 28% in IQB in comparison with the IQ scheme. The economic results show that the total annual cost is reduced by 12% in IQF and 8% in IQB relative to the IQ scheme. Therefore, the IQF scheme is more economical than the IQB and IQ schemes.
Abstract: Today's technology is heavily dependent on web applications, which users are adopting at a very rapid pace. They have made our work efficient, and include webmail, online retail, online gaming, wikis, train and flight departure and arrival information, and much more. Web applications are developed in languages such as PHP, Python, C# and ASP.NET, using HTML and scripts such as JavaScript. Attackers develop tools and techniques to exploit web applications and legitimate websites. This has led to the rise of web application security, which can be broadly classified into declarative security and program security. The most common attacks on applications are SQL injection and XSS, which grant access to unauthorized users who can damage or destroy the system. This paper presents a detailed literature description and analysis of web application security, examples of attacks, and steps to mitigate the vulnerabilities.
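As a minimal illustration of one mitigation step the abstract alludes to, the sketch below contrasts string-built SQL with a parameterized query; the table and column names are hypothetical, not taken from the paper:

```python
import sqlite3

def find_user(conn, username):
    """Look up a user safely. Unsafe alternative (vulnerable to SQL
    injection): f"SELECT id, name FROM users WHERE name = '{username}'".
    With a parameterized query the driver binds the value, so input
    like "x' OR '1'='1" is treated as data, not as SQL."""
    cur = conn.execute("SELECT id, name FROM users WHERE name = ?",
                       (username,))
    return cur.fetchall()
```

An in-memory database is enough to see the difference: a classic injection string simply matches no rows instead of bypassing the filter.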
Abstract: The present paper discusses the selection of process parameters for obtaining the optimal nanocrystallite size in a CuO-ZrO2 catalyst. Some parameters that change the inorganic structure influence the hydrolysis and condensation reactions. A statistical design-of-experiments method is implemented in order to optimize the experimental conditions of CuO-ZrO2 nanoparticle preparation. The experiments are designed using the standard L16 orthogonal array, with the crystallite size as the index used for the analysis as the parameters vary. The effects of pH, the H2O/precursor molar ratio (R), the time and temperature of calcination, the chelating agent and the alcohol volume are particularly investigated among all other parameters. According to the Taguchi results, temperature has the greatest impact on particle size. The pH and the H2O/precursor molar ratio have a small influence compared with temperature, and the alcohol volume and the time have almost no effect compared with all other parameters. Temperature also influences the morphology and amorphous structure of zirconia. The optimal conditions are determined using the Taguchi method. The nanocatalyst is studied by DTA-TG, XRD, EDS, SEM and TEM. The results of this research indicate that it is possible to vary the structure, morphology and properties of the sol-gel by controlling the above-mentioned parameters.
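A Taguchi-style main-effects analysis of an orthogonal-array experiment can be sketched as below: for each factor, average the response over the runs at each level; the factor with the largest range of level means has the strongest effect. The factor matrix and responses here are illustrative, not the paper's data:

```python
import numpy as np

def main_effects(levels, response):
    """For each factor (a column of `levels`), return the mean response
    at each level and the range of those means (a larger range means a
    stronger main effect on the index, e.g. crystallite size)."""
    levels = np.asarray(levels)
    response = np.asarray(response, float)
    effects = {}
    for j in range(levels.shape[1]):
        means = {lv: float(response[levels[:, j] == lv].mean())
                 for lv in np.unique(levels[:, j])}
        effects[j] = (means, max(means.values()) - min(means.values()))
    return effects
```

With a toy 4-run array where factor 0 dominates the response, the analysis correctly ranks factor 0 above factor 1.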
Abstract: The self-organizing map (SOM) is a well-known data reduction technique used in data mining. It can reveal structure in data sets through data visualization that is otherwise hard to detect from raw data alone. However, interpretation through visual inspection is prone to errors and can be very tedious. There are several techniques for the automatic detection of clusters among the code vectors found by SOM, but they generally do not take into account the distribution of the code vectors; this may lead to unsatisfactory clustering and poor definition of cluster boundaries, particularly where the density of data points is low. In this paper, we propose the use of an adaptive heuristic particle swarm optimization (PSO) algorithm for finding cluster boundaries directly from the code vectors obtained from SOM. The application of our method to several standard data sets demonstrates its feasibility. The PSO algorithm utilizes the so-called U-matrix of the SOM to determine cluster boundaries; the results of this novel automatic method compare very favorably to boundary detection through traditional algorithms, namely k-means and hierarchical approaches, which are normally used to interpret the output of SOM.
Abstract: The kernel function, which allows the formulation of nonlinear variants of any algorithm that can be cast in terms of dot products, has made Support Vector Machines (SVM) successful in many fields, e.g. classification and regression. The importance of the kernel has motivated many studies on its composition. It is well known that the reproducing kernel (R.K.) is a useful kernel function which possesses many properties, e.g. positive definiteness, the reproducing property, and the ability to compose complex R.K.s from simple operations. There are two popular ways to compute an R.K. in explicit form. One is to construct and solve a specific differential equation with boundary values; its drawback is that it cannot obtain a unified form of the R.K. The other uses a piecewise integral of the Green function associated with a differential operator L. The latter enables the computation of an R.K. with a unified explicit form and theoretical analysis, but such studies are relatively recent and practical computations are few. In this paper, a new algorithm for computing an R.K. is presented. It obtains the unified explicit form of the R.K. in a general reproducing kernel Hilbert space (RKHS), avoids constructing and solving complex differential equations manually, and enables automatic, flexible and rigorous computation for more general RKHSs. In order to validate that the R.K. computed by the algorithm works well in SVM, some illustrative examples and a comparison between the R.K. and the Gaussian kernel (RBF) in support vector regression are presented. The results show that the performance of the R.K. is close to, or slightly better than, that of the RBF kernel.
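As a concrete textbook example of a reproducing kernel with a known explicit form (for illustration only; this is not the paper's algorithm): K(x, y) = min(x, y) is the R.K. of the space of absolutely continuous functions f on [0, 1] with f(0) = 0 and f' in L^2, under the inner product <f, g> = integral of f'(t) g'(t) dt. The sketch below verifies the reproducing property <K(x, .), f> = f(x) numerically:

```python
import numpy as np

# K(x, y) = min(x, y); d/dt min(x, t) = 1 for t < x and 0 for t > x,
# so <K(x, .), f> = int_0^x f'(t) dt = f(x) - f(0) = f(x).
def reproduce(fp, x, n=200001):
    """Numerically evaluate <K(x, .), f> via a Riemann sum, given the
    derivative fp of f; the result should recover f(x)."""
    t = np.linspace(0.0, 1.0, n)
    kx_prime = (t < x).astype(float)  # derivative of K(x, .) in t
    return float(np.sum(kx_prime * fp(t)) * (t[1] - t[0]))
```

For f = sin (so fp = cos) and x = 0.7, the inner product reproduces sin(0.7) to within the quadrature error.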
Abstract: A recent quasi-experimental evaluation of the Canadian Active Labour Market Policies (ALMP) by Human Resources and Skills Development Canada (HRSDC) has provided an opportunity to examine alternative methods for estimating the incremental effects of Employment Benefits and Support Measures (EBSMs) on program participants. The focus of this paper is to assess the efficiency and robustness of inverse probability weighting (IPW) relative to kernel matching (KM) in the estimation of program effects. To accomplish this objective, the authors compare 1,080 pairs of estimates, along with their associated standard errors, to assess which type of estimate is generally more efficient and robust. In the interest of practicality, the authors also document the computational time it took to produce the IPW and KM estimates, respectively.
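The basic IPW estimator the paper evaluates can be sketched as the textbook average-treatment-effect version with given propensity scores (the paper's EBSM application, with estimated propensities and standard errors, is far more elaborate):

```python
import numpy as np

def ipw_ate(y, d, p):
    """Inverse-probability-weighted average treatment effect: treated
    outcomes (d = 1) are weighted by 1/p and controls (d = 0) by
    1/(1 - p), where p is the propensity score P(d = 1 | covariates)."""
    y, d, p = (np.asarray(a, float) for a in (y, d, p))
    return float(np.mean(d * y / p) - np.mean((1 - d) * y / (1 - p)))
```

With a tiny balanced example (constant propensity 0.5, treated outcomes exceeding controls by 2), the estimator recovers the effect of 2 exactly.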
Abstract: R&D risk management has been suggested as one of the management approaches for accomplishing the goals of public R&D investment. Investment in basic science and core technology development is an essential role of government in securing the social base needed for continuous economic growth. It is also an important role of the science and technology policy sectors to create a positive environment in which the outcomes of public R&D can be diffused in a stable fashion, by controlling in advance the uncertainties and risk factors that may arise when such achievements are applied to society and industry. Various policies have already been implemented to manage uncertainties and variables that may have a negative impact on accomplishing public R&D investment goals. However, new policy measures may be derived for complementing the existing policies and exploring directions for progress by analyzing them as a policy package from the viewpoint of R&D risk management.
Abstract: Applying knowledge discovery techniques to unstructured text is termed knowledge discovery in text (KDT), text data mining, or text mining. In neural networks that address classification problems, the training set, the testing set and the learning rate are key elements: the collection of input/output patterns used to train the network, the patterns used to assess network performance, and the rate at which weight adjustments are made, respectively. This paper describes a proposed back-propagation neural network classifier that performs cross-validation for the original neural network, in order to optimize classification accuracy and reduce training time. The feasibility and benefits of the proposed approach are demonstrated on five data sets: contact-lenses, cpu, weather.symbolic, weather and labor-neg-data. It is shown that, compared to the existing neural network, training time is reduced by more than a factor of ten when the data set is larger than cpu or the network has many hidden units, while the accuracy ('percent correct') was the same for all data sets except contact-lenses, which is the only one with missing attributes. For contact-lenses, the accuracy of the proposed neural network was on average around 0.3% lower than that of the original neural network. The algorithm is independent of specific data sets, so many of its ideas and solutions can be transferred to other classifier paradigms.
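The cross-validation machinery underlying such a classifier can be sketched as a k-fold index splitter; this is a generic version, not the authors' code, and any classifier (here left abstract) would be trained on each `train` split and scored on the matching `test` split:

```python
import numpy as np

def k_fold_indices(n, k, seed=0):
    """Split indices 0..n-1 into k shuffled folds and yield
    (train, test) index arrays, one pair per fold."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    folds = np.array_split(idx, k)
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, test
```

Every sample appears in exactly one test fold, and no fold leaks test indices into its training set.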
Abstract: In this paper, we first show a relationship between two
stabilizing controllers, which presents an extended feedback system
using two stabilizing controllers. Then, we apply this relationship to
the two-stage compensator design. We consider single-input single-output plants. On the other hand, we do not assume the
coprime factorizability of the model. Thus, the results of this paper
are based on the factorization approach only, so that they can be
applied to numerous linear systems.
Abstract: This paper presents a comparative emission study of a newly introduced gasoline/LPG bi-fuel automotive engine in the Indian market. Emissions were tested as per the LPG Bharat Stage III driving cycle, for both the urban cycle and the extra-urban cycle; the total time for the urban and extra-urban cycles was 1180 s. The engine was run in LPG mode using a conversion system. Emissions were tested as per the standard procedure and compared. Corrected emissions were computed by deducting the ambient reading from the sample reading. The paper describes the detailed emission test procedure and the results obtained. CO emissions were in the range of 38.9 to 111.3 ppm, HC emissions in the range of 18.2 to 62.6 ppm, NOx emissions from 0.8 to 3.9 ppm, and CO2 emissions from 6719.2 to 8051 ppm. The paper sheds light on the emission results of LPG vehicles recently introduced in the Indian automobile market. The objectives of this experimental study were to measure the emissions of the engine in gasoline and LPG modes and compare them.
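The correction step described above (sample reading minus ambient reading) can be sketched as follows; the gas names and values are illustrative, not the paper's measurements:

```python
def corrected_emissions(sample_ppm, ambient_ppm):
    """Corrected reading = sample minus ambient, floored at zero
    (a negative corrected concentration is not physical)."""
    return {gas: max(sample_ppm[gas] - ambient_ppm.get(gas, 0.0), 0.0)
            for gas in sample_ppm}
```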
Abstract: This paper argues that a product development exercise involves, in addition to the conventional stages, several decisions regarding other aspects. These aspects should be addressed
simultaneously in order to develop a product that responds to the
customer needs and that helps realize objectives of the stakeholders
in terms of profitability, market share and the like. We present a
framework that encompasses these different development
dimensions. The framework shows that a product development
methodology such as the Quality Function Deployment (QFD) is the
basic tool which allows definition of the target specifications of a
new product. Creativity is the first dimension, enabling the development exercise to take off and conclude successfully. A number of
group processes need to be followed by the development team in
order to ensure enough creativity and innovation. Secondly,
packaging is considered to be an important extension of the product.
Branding strategies, quality and standardization requirements,
identification technologies, design technologies, production
technologies, and costing and pricing are also integral parts of the development exercise. These dimensions constitute the proposed
framework. The paper also presents a mathematical model used to
calculate the design targets based on the target costing principle. The
framework is used to study a case of a new product development in
the telecommunications services sector.
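The target-costing idea behind the design-target model can be sketched as below. The allocation of the allowable cost to components in proportion to QFD-derived importance weights is an illustrative simplification, not the paper's mathematical model, and the names and numbers are hypothetical:

```python
def design_targets(target_price, profit_margin, weights):
    """Target costing: allowable cost = target price minus desired
    profit; allocate it to components in proportion to their
    importance weights (e.g. from a QFD analysis)."""
    allowable = target_price * (1.0 - profit_margin)
    total = sum(weights.values())
    return {comp: allowable * w / total for comp, w in weights.items()}
```

For a product priced at 100 with a 20% margin and components weighted 3:1, the allowable cost of 80 splits into targets of 60 and 20.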
Abstract: The organizational structure of Turkish state universities is a form of bureaucracy, a highly efficient system of rational and formal control. According to the dimensional approach, bureaucracy can occur in an organization to a degree, as some bureaucratic characteristics can be stronger than others. In addition, the units of an organization, owing to their different specific characteristics, can perceive the bureaucracy differently. In this study, Hall's Organizational Inventory, which was developed for evaluating the degree of bureaucratization from the dimensional perspective, is used to find out whether there is a difference in the perception of the bureaucracy between academicians working in three different departments and two faculties of the same university.
Abstract: For many decades, human beings have suffered from a plethora of natural disasters. Disasters occur frequently, and conceptual myths about them change as more and more advances are made. Although we live in a technological era, in developing countries like Pakistan disasters are shaped by socially constructed roles. The need is to understand the most vulnerable group of society, i.e., females; their issues are complex in nature because of their undermined gender status in society. There is a need to identify the full range of issues regarding females and to advance the achievement of the Millennium Development Goals (MDGs). Gender issues are of great concern all around the globe, including in Pakistan, where female visibility in society is low, also during disasters, and where there is a failure to recognize the double burden women carry of productive and reproductive work. Women contribute greatly to society, so we need to make them more disaster resilient; for this, non-structural measures such as awareness, training and education must be carried out. In rural and urban settings alike, in any disaster such as an earthquake or flood, factors such as gender, age, physical health and demographic issues contribute to vulnerability. In Pakistan, gender issues in disasters received little attention before the 2005 earthquake and 2010 floods. Significant achievements were made after the 2010 floods, when a gender and child cell was created to provide facilities to women and girls. The aim of this study is to highlight all the facilities necessary in a disaster to build coping mechanisms in females, from basic rights to advanced levels including education.
Abstract: The objective of this research is to investigate the advantages of using large-diameter 0.7 inch prestressing strands in pretensioned applications. The advantages of large-diameter strands are most pronounced in heavy construction applications. Bridges and tunnels are subjected to higher daily traffic with an exponential increase in trucks' ultimate weight, which raises the demand for higher structural capacity of bridges and tunnels. In this research, precast prestressed I-girders were considered as a case study. The flexural capacities of girders fabricated using 0.7 inch strands and different concrete strengths were calculated and compared to the capacities of girders with 0.6 inch strands fabricated using equivalent concrete strengths. The effect of the bridge deck concrete strength on the composite deck-girder section capacity was investigated due to its possible effect on the final section capacity. Finally, a comparison was made between the bridge cross-section of girders designed using regular 0.6 inch strands and that using the large-diameter 0.7 inch strands. The research findings showed that the structural advantages of 0.7 inch strands allow for fewer bridge girders, reduced material quantity, and lighter members. The structural advantages of 0.7 inch strands are maximized when high strength concrete (HSC) is used in girder fabrication and concrete of at least 5 ksi compressive strength is used in pouring bridge decks. The use of 0.7 inch strands in the bridge industry can partially contribute to the improvement of bridge conditions, minimize construction cost, and reduce the construction duration of a project.
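A back-of-the-envelope comparison of the two strand sizes can be sketched as below. The nominal areas (0.217 in^2 for 0.6 inch strand, 0.294 in^2 for 0.7 inch strand), the Grade 270 ultimate strength and the 0.75 fpu stressing limit are commonly cited values, used here as illustrative assumptions rather than the paper's design data:

```python
def strand_jacking_force(area_in2, fpu_ksi=270.0, stress_ratio=0.75):
    """Jacking force (kips) of one strand: nominal area times the
    stressing limit (0.75 * fpu is a common limit for low-relaxation
    strand)."""
    return area_in2 * fpu_ksi * stress_ratio

A_06, A_07 = 0.217, 0.294  # commonly cited nominal strand areas, in^2
```

The area ratio (about 1.35) indicates roughly 35% more prestressing force per strand, which is the mechanism behind using fewer girders or fewer strands per girder.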
Abstract: Median filters with larger windows offer greater smoothing and are more robust than median filters with smaller windows. However, the larger median smoothers (median filters with larger windows) fail to track low-order polynomial trends in the signal. As a result, constant regions are produced at signal corners, leading to the loss of fine detail. In this paper, an algorithm which combines the ability of the 3-point median smoother to preserve low-order polynomial trends with the superior noise-filtering characteristics of the larger median smoother is introduced. The proposed algorithm (called the combiner algorithm in this paper) is evaluated for its performance on a test image corrupted with different types of noise, and the results obtained are included.
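The combination rule below is a hypothetical sketch of such a combiner, not the paper's algorithm: it uses the large-window median output except where it deviates strongly from the 3-point median, which is taken as a sign that a genuine trend is being flattened:

```python
import numpy as np

def running_median(x, w):
    """Running median with odd window w; edges handled by reflection."""
    pad = w // 2
    xp = np.pad(np.asarray(x, float), pad, mode="reflect")
    return np.array([np.median(xp[i:i + w]) for i in range(len(x))])

def combiner(x, w_large=7, tol=1.0):
    """Hypothetical combiner: prefer the large-window median (better
    noise rejection) but fall back to the 3-point median wherever the
    two disagree by more than tol (trend preservation)."""
    m3 = running_median(x, 3)
    mL = running_median(x, w_large)
    return np.where(np.abs(mL - m3) > tol, m3, mL)
```

On a ramp corrupted by one impulse, the combiner removes the spike while leaving the linear trend intact away from the spike.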
Abstract: Automatic Vehicle Identification (AVI) has many applications in traffic systems (highway electronic toll collection, red light violation enforcement, border and customs checkpoints, etc.). License plate recognition is an effective form of AVI system. In this study, a smart and simple algorithm is presented for a vehicle license plate recognition system. The proposed algorithm consists of three major parts: extraction of the plate region, segmentation of the characters, and recognition of the plate characters. For extracting the plate region, edge detection and smearing algorithms are used. In the segmentation part, smearing, filtering and some morphological algorithms are used. Finally, statistics-based template matching is used for recognition of the plate characters. The performance of the proposed algorithm has been tested on real images. Based on the experimental results, we note that our algorithm shows superior performance in car license plate recognition.
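The recognition step can be sketched as normalized-correlation matching of a segmented character against a template bank; this is a generic illustration of template matching (the toy 5x5 bitmaps below are hypothetical, not the paper's templates):

```python
import numpy as np

def match_character(glyph, templates):
    """Return the label of the template with the highest normalized
    correlation to the binarized character glyph (all images must
    share the same shape)."""
    g = glyph.astype(float)
    g = (g - g.mean()) / (g.std() + 1e-12)
    best, best_score = None, -np.inf
    for label, t in templates.items():
        t = t.astype(float)
        t = (t - t.mean()) / (t.std() + 1e-12)
        score = float((g * t).mean())  # normalized cross-correlation
        if score > best_score:
            best, best_score = label, score
    return best
```

A noisy "I" glyph (one spurious pixel) still correlates far more strongly with the "I" template than with an "O" ring.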
Abstract: We propose a reduced-order model for the instantaneous hydrodynamic force on a cylinder. The model consists of a system of two ordinary differential equations (ODEs), which can be integrated in time to yield very accurate histories of the resultant force and its direction. In contrast to several existing models, the proposed model considers the actual (total) hydrodynamic force rather than its perpendicular or parallel projection (the lift and drag), and captures the complete force rather than the oscillatory part only. We study and describe the relationship between the model parameters, evaluated using results from numerical simulations, and the Reynolds number, so that the model can be used at any value within the considered range of 100 to 500 to provide an accurate representation of the force without the need to perform time-consuming simulations and solve the partial differential equations (PDEs) governing the flow field.
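Integrating such a two-ODE model in time can be sketched with a classical Runge-Kutta scheme. Since the paper's equations are not reproduced here, a Stuart-Landau oscillator serves as a hypothetical stand-in whose limit cycle mimics a periodic force history; both the equations and the parameter values (mu, omega) are illustrative assumptions:

```python
import numpy as np

def rk4(f, y0, t):
    """Classical 4th-order Runge-Kutta integration of dy/dt = f(t, y)
    over the time grid t, starting from y0."""
    y = np.zeros((len(t), len(y0)))
    y[0] = y0
    for k in range(len(t) - 1):
        h = t[k + 1] - t[k]
        k1 = f(t[k], y[k])
        k2 = f(t[k] + h / 2, y[k] + h / 2 * k1)
        k3 = f(t[k] + h / 2, y[k] + h / 2 * k2)
        k4 = f(t[k] + h, y[k] + h * k3)
        y[k + 1] = y[k] + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return y

def stuart_landau(t, y, mu=0.1, omega=1.0):
    """Hypothetical stand-in for a two-ODE force model: a Stuart-Landau
    oscillator, which settles onto a limit cycle of radius sqrt(mu)."""
    a, b = y
    r2 = a * a + b * b
    return np.array([(mu - r2) * a - omega * b,
                     omega * a + (mu - r2) * b])
```

Starting from a small perturbation, the integrated state converges to the limit cycle of radius sqrt(mu), analogous to a force history settling onto its periodic attractor.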
Abstract: Salinity is a measure of the amount of salts in water. Total Dissolved Solids (TDS), as a salinity parameter, is often determined using laborious and time-consuming laboratory tests, but it may be more appropriate and economical to develop a method based on a simpler salinity index. Because dissolved ions increase salinity as well as conductivity, the two measures are related. The aim of this research was to determine constant coefficients for predicting Total Dissolved Solids (TDS) from Electrical Conductivity (EC), evaluated with the statistics of the correlation coefficient, root mean square error, maximum error, mean bias error, mean absolute error, relative error and coefficient of residual mass. For this purpose, two experimental areas (S1, S2) of Khuzestan province, Iran, were selected, and four treatments with three replications were applied using series of double rings. The treatments consisted of 25 cm, 50 cm, 75 cm and 100 cm water applications. The results showed that the values 16.3 and 12.4 were the best constant coefficients for predicting TDS from EC in pilots S1 and S2, with correlation coefficients of 0.977 and 0.997 and root mean square errors (RMSE) of 191.1 and 106.1, respectively.
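The single-coefficient model TDS ≈ k · EC reported above can be fitted by zero-intercept least squares as sketched below, together with two of the evaluation statistics the abstract names (RMSE and the correlation coefficient); the sample data in the test are illustrative, not the paper's measurements:

```python
import numpy as np

def fit_tds_coefficient(ec, tds):
    """Least-squares fit of the single coefficient k in TDS = k * EC
    (zero-intercept linear model, one constant per site), returning
    k, the RMSE and the correlation coefficient of the fit."""
    ec, tds = np.asarray(ec, float), np.asarray(tds, float)
    k = float(ec @ tds / (ec @ ec))   # closed-form zero-intercept LS
    pred = k * ec
    rmse = float(np.sqrt(np.mean((tds - pred) ** 2)))
    r = float(np.corrcoef(tds, pred)[0, 1])
    return k, rmse, r
```

On data generated exactly as TDS = 16.3 · EC, the fit recovers the coefficient with zero residual error.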
Abstract: Nanotechnology is the science of creating, using and manipulating objects which have at least one dimension in the range of 0.1 to 100 nanometers. In other words, nanotechnology means reconstructing a substance from its individual atoms and arranging them in a way that is desirable for our purpose.
The main reason nanotechnology has been attracting attention is the unique properties that objects show when they are formed at the nano-scale. The characteristics that nano-scale materials show, compared to their naturally occurring forms, are both useful for creating high-quality products and dangerous when the materials come into contact with the body or spread into the environment.
In order to control and lower the risk of such nano-scale particles, the following three main topics should be considered:
1) First, these materials may cause long-term diseases whose effects appear years after the particles have penetrated human organs; since this science has only recently developed at industrial scale, not enough information is available about its hazards to the body.
2) Second, these particles can easily spread into the environment and remain in air, soil or water for a very long time, in addition to their high ability to penetrate the skin and cause new kinds of diseases.
3) Third, to protect the body and the environment against the danger of these particles, protective barriers must be finer than these small objects, and such defenses are hard to achieve.
This paper reviews, discusses and assesses the risks that humans and the environment face as this new science develops at a high rate.
Abstract: Assessment of the IEP (Individual Education Plan) is an important stage in the area of special education. This paper addresses the problem by introducing computer software which processes the data gathered from the application of an IEP. The software is intended to be used by special education institutions in Turkey and allows assessment of school and family training. The software has a user-friendly interface, and its design includes graphical development tools.