Abstract: Recent quasi-experimental evaluations of Canadian Active Labour Market Policies (ALMP) by Human Resources and Skills Development Canada (HRSDC) have provided an opportunity to examine alternative methods for estimating the incremental effects of Employment Benefits and Support Measures (EBSMs) on program participants. The focus of this paper is to assess the efficiency and robustness of inverse probability weighting (IPW) relative to kernel matching (KM) in the estimation of program effects. To accomplish this objective, the authors compare 1,080 pairs of estimates, along with their associated standard errors, to assess which type of estimate is generally more efficient and robust. In the interest of practicality, the authors also document the computational time it took to produce the IPW and KM estimates, respectively.
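The IPW estimator compared above has a standard closed form for the average treatment effect on the treated (ATT). The sketch below is a minimal illustration, not HRSDC's implementation: propensity scores are taken as given (in practice they would be estimated, e.g. by logistic regression), and all data are synthetic.

```python
# Illustrative sketch of inverse probability weighting (IPW) for the
# average treatment effect on the treated (ATT). Propensity scores
# `pscores` are assumed already estimated; not the authors' code.

def ipw_att(outcomes, treated, pscores):
    """ATT via IPW: treated units get weight 1; controls get p/(1-p)."""
    treat_sum = treat_n = 0.0
    ctrl_sum = ctrl_w = 0.0
    for y, d, p in zip(outcomes, treated, pscores):
        if d:
            treat_sum += y
            treat_n += 1
        else:
            w = p / (1.0 - p)  # odds weight re-balances controls
            ctrl_sum += w * y
            ctrl_w += w
    return treat_sum / treat_n - ctrl_sum / ctrl_w
```

Kernel matching instead averages control outcomes with kernel weights centered on each treated unit's propensity score, which is one source of the computational-cost gap the paper documents.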
Abstract: Applying knowledge discovery techniques to unstructured text is termed knowledge discovery in text (KDT), text data mining, or text mining. In neural networks that address classification problems, the training set, the testing set and the learning rate are key elements: the collection of input/output patterns used to train the network, the set used to assess network performance, and the rate at which weight adjustments are made. This paper describes a proposed back-propagation neural network classifier that performs cross-validation on the original neural network in order to optimize classification accuracy and training time. The feasibility and benefits of the proposed approach are demonstrated by means of five data sets: contact-lenses, cpu, weather.symbolic, weather and labor-neg-data. It is shown that, compared to the existing neural network, training time is reduced by more than a factor of 10 when the dataset is larger than cpu or the network has many hidden units, while accuracy ('percent correct') was the same for all datasets except contact-lenses, the only one with missing attributes. For contact-lenses, the accuracy of the proposed neural network was on average about 0.3% lower than that of the original neural network. The algorithm is independent of specific data sets, so many of its ideas and solutions can be transferred to other classifier paradigms.
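The cross-validation procedure central to the abstract above can be sketched as a k-fold loop around any train/predict pair. The skeleton below is illustrative, not the paper's implementation; the classifier is passed in as a stub where the back-propagation network would go.

```python
# Minimal k-fold cross-validation skeleton. `train_fn` and
# `predict_fn` are placeholders for the paper's back-propagation
# network; any classifier with that interface can be plugged in.

def k_fold_indices(n, k):
    """Split range(n) into k contiguous folds of near-equal size."""
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(data, labels, k, train_fn, predict_fn):
    """Return mean accuracy ('percent correct') over k held-out folds."""
    accs = []
    for fold in k_fold_indices(len(data), k):
        test = set(fold)
        train_x = [x for i, x in enumerate(data) if i not in test]
        train_y = [y for i, y in enumerate(labels) if i not in test]
        model = train_fn(train_x, train_y)
        correct = sum(predict_fn(model, data[i]) == labels[i] for i in fold)
        accs.append(correct / len(fold))
    return sum(accs) / len(accs)
```

A trivial majority-class stub is enough to exercise the loop; the training-time savings the paper reports come from how the network itself is retrained per fold, which this skeleton leaves abstract.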
Abstract: Computers are increasingly being used as educational
tools in elementary/primary schools worldwide. A specific
application of such computer use, is that of multimedia games, where
the aim is to combine pedagogy and entertainment. This study
reports on a case study in which an educational multimedia game was
developed for use by elementary school children. The stages of the
application's design, implementation and evaluation are presented.
The game's strengths and weaknesses are identified and discussed,
allowing for suggestions for future redesigns.
The results show that the use of games can engage children
in the learning process for longer periods of time with the added
benefit of the entertainment factor.
Abstract: The present paper develops and validates a numerical procedure for the calculation of turbulent combusting flow in converging and diverging ducts; through simulation of the heat transfer processes, the production and spread of the NOx pollutant are computed. A marching integration solution procedure employing the TDMA is used to solve the discretized equations. The turbulence model is the Prandtl mixing-length method. The combustion process is modeled using the Arrhenius and eddy dissipation methods. The thermal mechanism is utilized for modeling the formation of nitrogen oxides. The finite difference method and the Genmix numerical code are used for the numerical solution of the equations. Our results indicate the important influence of the diffuser's limiting divergence angle on the pressure recovery coefficient. Moreover, because the NOx pollutant depends strongly on the maximum temperature in the domain, the NOx level also reaches its maximum under these conditions.
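The TDMA mentioned above is the standard tridiagonal matrix (Thomas) algorithm used to solve each line of the discretized equations. A minimal sketch, independent of the paper's Genmix code:

```python
# Thomas algorithm (TDMA) for a tridiagonal system A x = d,
# where a is the sub-diagonal, b the main diagonal, c the
# super-diagonal; a[0] and c[-1] are unused.

def tdma(a, b, c, d):
    """Solve a tridiagonal system in O(n) via forward sweep + back-substitution."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```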
Abstract: This paper presents a comparative emission study of a
newly introduced gasoline/LPG bi-fuel automotive engine in the Indian
market. Emissions were tested as per the LPG-Bharat Stage III driving
cycle. Emission tests were carried out for the urban cycle and the
extra-urban cycle; total time for both cycles was 1,180 s. The engine
was run in LPG mode using a conversion system. Emissions were tested
as per the standard procedure and compared. Corrected emissions were
computed by deducting the ambient reading from the sample reading.
The paper describes the detailed emission test procedure and the
results obtained. CO emissions were in the range of 38.9 to 111.3
ppm, HC emissions in the range of 18.2 to 62.6 ppm, NOx emissions
from 0.8 to 3.9 ppm, and CO2 emissions from 6719.2 to 8051 ppm. The
paper sheds light on the emission results of LPG vehicles recently
introduced in the Indian automobile market. The objectives of this
experimental study were to measure engine emissions in gasoline and
LPG modes and to compare them.
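The correction described above (sample reading minus ambient reading) can be expressed as a one-line helper; flooring at zero for readings below ambient is an added assumption, not stated in the abstract.

```python
# Corrected emission = sample reading - ambient reading, as per the
# procedure described above. Flooring at zero is an assumption for
# the case where the sample reads below ambient.

def corrected_emission(sample_ppm, ambient_ppm):
    """Corrected concentration in ppm."""
    return max(sample_ppm - ambient_ppm, 0.0)
```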
Abstract: This paper argues that a product development exercise
involves, in addition to the conventional stages, several decisions
regarding other aspects. These aspects should be addressed
simultaneously in order to develop a product that responds to the
customer needs and that helps realize objectives of the stakeholders
in terms of profitability, market share and the like. We present a
framework that encompasses these different development
dimensions. The framework shows that a product development
methodology such as the Quality Function Deployment (QFD) is the
basic tool which allows definition of the target specifications of a
new product. Creativity is the first dimension that enables the
development exercise to proceed and conclude successfully. A number of
group processes need to be followed by the development team in
order to ensure enough creativity and innovation. Secondly,
packaging is considered to be an important extension of the product.
Branding strategies, quality and standardization requirements,
identification technologies, design technologies, production
technologies, and costing and pricing are also integral parts of the
development exercise. These dimensions constitute the proposed
framework. The paper also presents a mathematical model used to
calculate the design targets based on the target costing principle. The
framework is used to study a case of a new product development in
the telecommunications services sector.
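The target-costing principle mentioned above sets an allowable cost from the market price and required margin and, in QFD-based models, allocates it to components by importance weight. The function below is an illustrative sketch, not the paper's mathematical model; the component names, weights and figures are assumptions.

```python
# Illustrative target-costing allocation (not the paper's model):
# allowable cost = target price * (1 - target margin), then split
# across components in proportion to their importance weights.

def allocate_target_costs(target_price, target_margin, weights):
    """Return a dict of component -> allocated target cost."""
    allowable = target_price * (1.0 - target_margin)
    total = sum(weights.values())
    return {name: allowable * w / total for name, w in weights.items()}
```

For example, a product priced at 100 with a 20% margin leaves an allowable cost of 80 to distribute across components.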
Abstract: The organizational structure of Turkish state
universities is a form of bureaucracy, a highly efficient system of
rational and formal control. According to the dimensional approach,
bureaucracy can occur in an organization to varying degrees, as some
bureaucratic characteristics can be stronger than others. In
addition, the units of an organization, owing to their specific
characteristics, may perceive the bureaucracy differently. In this
study, Hall's Organizational Inventory, which was developed for
evaluating the degree of bureaucratization from the dimensional
perspective, is used to find out if there is a difference in the
perception of the bureaucracy between the academicians working in
three different departments and two faculties in the same university.
Abstract: This paper discusses a new, systematic approach to
the synthesis of a NP-hard class of non-regenerative Boolean
networks, described by FON[FOFF]={mi}[{Mi}], where for every
mj[Mj]∈{mi}[{Mi}], there exists another mk[Mk]∈{mi}[{Mi}], such
that their Hamming distance HD(mj, mk)=HD(Mj, Mk)=O(n), (where
'n' represents the number of distinct primary inputs). The method
automatically ensures exact minimization for certain important
self-dual functions with 2n-1 points in their one-sets. The elements meant for
grouping are determined from a newly proposed weighted incidence
matrix. Then the binary value corresponding to the candidate pair is
correlated with the proposed binary value matrix to enable direct
synthesis. We recommend algebraic factorization operations as a post
processing step to enable reduction in literal count. The algorithm
can be implemented in any high level language and achieves best
cost optimization for the problem dealt with, irrespective of the
number of inputs. For other cases, the method is iterated to
subsequently reduce it to a problem of O(n-1), O(n-2),.... and then
solved. In addition, it leads to optimal results for problems exhibiting
higher degree of adjacency, with a different interpretation of the
heuristic, and the results are comparable with other methods.
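The defining property of the function class above, every minterm having a partner at Hamming distance O(n), can be checked directly on minterm indices. These helpers are illustrative only and are not part of the authors' synthesis algorithm:

```python
# Illustrative check of the class property described above: every
# minterm mi has a partner mk with HD(mi, mk) >= threshold. Minterms
# are represented by their integer indices; not the authors' code.

def hamming_distance(a, b):
    """Number of differing bits between two minterm indices."""
    return bin(a ^ b).count("1")

def has_far_partner(minterms, threshold):
    """True if every minterm has a partner at Hamming distance >= threshold."""
    return all(any(hamming_distance(m, k) >= threshold
                   for k in minterms if k != m)
               for m in minterms)
```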
In terms of literal cost, at the technology independent stage, the
circuits synthesized using our algorithm enabled net savings over
AOI (AND-OR-Invert) logic, AND-EXOR logic (EXOR Sum-of-
Products or ESOP forms) and AND-OR-EXOR logic by 45.57%,
41.78% and 41.78% respectively for the various problems.
Circuit level simulations were performed for a wide variety of
case studies at 3.3V and 2.5V supply to validate the performance of
the proposed method and the quality of the resulting synthesized
circuits at two different voltage corners. Power estimation was
carried out for a 0.35micron TSMC CMOS process technology. In
comparison with AOI logic, the proposed method enabled mean
savings in power by 42.46%. With respect to AND-EXOR logic, the
proposed method yielded power savings to the tune of 31.88%, while
in comparison with AND-OR-EXOR networks, average power savings of
33.23% were obtained.
Abstract: In the present research, a finite element model is
presented to study the geometrical and material nonlinear behavior of
reinforced concrete plane frames considering soil-structure
interaction. The nonlinear behaviors of concrete and reinforcing steel
are considered both in compression and tension up to failure. The
model also accounts for the number, diameter, and distribution of
rebars along every cross section. Soil behavior is taken into
consideration using four different models, namely the linear and
nonlinear Winkler models and the linear and nonlinear continuum
models. A
computer program (NARC) is specially developed in order to
perform the analysis. The results achieved by the present model show
good agreement with both theoretical and experimental published
literature. The nonlinear behavior of a rectangular frame resting on
soft soil up to failure using the proposed model is introduced for
demonstration.
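Of the four soil models above, the Winkler idealization is the simplest to sketch: the soil is replaced by independent springs whose reaction pressure is p = k_s * w for settlement w, and a nonlinear variant caps the pressure at an assumed bearing limit. The cap and all values below are illustrative assumptions, not NARC's formulation.

```python
# Winkler subgrade model sketch (illustrative, not NARC's code):
# linear reaction p = k_s * w; the optional cap p_ult gives a crude
# nonlinear (elastic-perfectly-plastic) variant.

def winkler_reaction(w, k_s, p_ult=None):
    """Soil reaction pressure for settlement w; optionally capped at +/-p_ult."""
    p = k_s * w
    if p_ult is not None:
        p = max(min(p, p_ult), -p_ult)
    return p
```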
Abstract: For many decades, human beings have suffered from a
plethora of natural disasters. Disasters occur frequently, and
conceptual myths about them change as more and more advancements are
made. Although we live in a technological era, in developing
countries like Pakistan the impacts of disasters are shaped by
socially constructed roles. There is a need to understand the most
vulnerable group of society, namely females, whose issues are complex
in nature because of their undermined gender status in society. There
is a need to identify as many issues regarding females as possible
and to advance the achievement of the Millennium Development Goals
(MDGs). Gender issues are of great concern all around the globe,
including in Pakistan, where female visibility in society is low,
also during disasters, and where the double burden of productive and
reproductive care is poorly understood. Women contribute a great deal
to society, so we need to make them more disaster resilient. For
this, non-structural measures like awareness, training and education
must be carried out. In both rural and urban settings, in any
disaster such as an earthquake or flood, elements like gender, age,
physical health and demographic factors contribute to vulnerability.
In Pakistan, gender issues in disasters received little attention
before the 2005 earthquake and the 2010 floods. Significant
achievements were made after the 2010 floods, when a gender and child
cell was created to provide facilities to women and girls. The aim of
the study is to highlight all the facilities necessary in a disaster
to build coping mechanisms in females, from basic rights to advanced
levels, including education.
Abstract: The objective of this research is to investigate the
advantages of using large-diameter 0.7 inch prestressing strands in
pretensioning applications. Large-diameter strands are mainly
beneficial in heavy construction applications. Bridges and tunnels
are subjected to higher daily traffic and an exponential increase in
trucks' ultimate weights, which raises the demand for higher
structural capacity of bridges and tunnels. In this research, precast
prestressed I-girders were considered as a case study. Flexural
capacities of girders fabricated using 0.7 inch strands and different
concrete strengths were calculated and compared to the capacities of
girders fabricated using 0.6 inch strands and equivalent concrete
strengths. The effect of bridge deck concrete strength on the
composite deck-girder section capacity was investigated due to its
possible effect on the final section capacity. Finally, the bridge
cross-sections of girders designed using regular 0.6 inch strands
were compared with those using large-diameter 0.7 inch strands. The
research findings showed that the structural advantages of 0.7 inch
strands allow for fewer bridge girders, reduced material quantities,
and lighter members. The structural advantages of 0.7 inch strands
are maximized when high strength concrete (HSC) is used in girder
fabrication and concrete of at least 5 ksi compressive strength is
used in pouring bridge decks. The use of 0.7 inch strands in the
bridge industry can partially contribute to the improvement of bridge
conditions, minimize construction costs, and reduce construction
duration.
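The flexural-capacity comparison described above rests on standard section analysis; a simplified rectangular-stress-block estimate, Mn = Aps*fps*(dp - a/2), shows why the larger 0.7 inch strand area raises capacity. The single-strand simplification and all numeric values are illustrative assumptions, not the paper's design calculations.

```python
# Simplified nominal moment of a prestressed rectangular section
# (rectangular stress block). Units: in, ksi, kip-in. Illustrative
# only; real girder design includes many more checks.

def nominal_moment(aps, fps, dp, fc, b):
    """Mn = Aps*fps*(dp - a/2), with stress-block depth a = Aps*fps/(0.85*fc*b)."""
    a = aps * fps / (0.85 * fc * b)
    return aps * fps * (dp - a / 2.0)
```

With a single 0.7 inch strand (area about 0.294 in^2) versus a 0.6 inch strand (about 0.217 in^2), the moment scales roughly with strand area, which is the structural advantage the abstract exploits.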
Abstract: We propose a reduced-order model for the instantaneous
hydrodynamic force on a cylinder. The model consists of a system of
two ordinary differential equations (ODEs), which can be integrated
in time to yield very accurate histories of the resultant force and
its direction. In contrast to several existing models, the proposed
model considers the actual (total) hydrodynamic force rather than its
perpendicular or parallel projection (the lift and drag), and captures
the complete force rather than the oscillatory part only. We study
and provide descriptions of the relationship between the model
parameters, evaluated utilizing results from numerical simulations,
and the Reynolds number, so that the model can be used at any
arbitrary value within the considered range of 100 to 500 to provide
an accurate representation of the force without the need to perform
time-consuming simulations or to solve the partial differential
equations (PDEs) governing the flow field.
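The abstract does not give the two ODEs' functional form, so the sketch below uses a van der Pol-type oscillator, a common template in reduced-order wake-force models, purely as a stand-in, integrated in time with a classical RK4 scheme.

```python
# Integrating a system of two ODEs in time, as the abstract describes.
# The van der Pol form below is a hypothetical stand-in for the
# (unstated) model; mu, the initial condition and step size are
# illustrative assumptions.

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k1)])
    k3 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h*ki for yi, ki in zip(y, k3)])
    return [yi + h/6*(a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def van_der_pol(mu):
    """Hypothetical force model: q'' - mu*(1 - q^2)*q' + q = 0, as a system."""
    def f(t, y):
        q, qdot = y
        return [qdot, mu*(1.0 - q*q)*qdot - q]
    return f

# Integrate and record a "force" history (illustrative only)
f = van_der_pol(mu=0.2)
t, y, h = 0.0, [0.5, 0.0], 0.01
history = []
for _ in range(2000):
    y = rk4_step(f, t, y, h)
    t += h
    history.append(y[0])
```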
Abstract: Salinity is a measure of the amount of salts in the
water. Total Dissolved Solids (TDS), as a salinity parameter, is
often determined using laborious and time-consuming laboratory tests,
but it may be more appropriate and economical to develop a method
that uses a simpler soil salinity index. Because dissolved ions
increase salinity as well as conductivity, the two measures are
related. The aim of this research was to determine constant
coefficients for predicting Total Dissolved Solids (TDS) from
Electrical Conductivity (EC), using the correlation coefficient, root
mean square error, maximum error, mean bias error, mean absolute
error, relative error and coefficient of residual mass as statistics.
For this purpose, two experimental areas (S1, S2) of Khuzestan
province, Iran, were selected, and four treatments with three
replications using series of double rings were applied. The
treatments consisted of 25 cm, 50 cm, 75 cm and 100 cm water
applications. The results showed that the values 16.3 and 12.4 were
the best constant coefficients for predicting TDS from EC in pilots
S1 and S2, with correlation coefficients of 0.977 and 0.997 and root
mean square errors (RMSE) of 191.1 and 106.1, respectively.
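The constant coefficient described above is the least-squares slope of TDS on EC through the origin, and the RMSE follows from the residuals. The data below are synthetic, not the study's measurements.

```python
# Least-squares fit of TDS ~= k * EC (slope through the origin) and
# the RMSE of the fit. Data are synthetic placeholders.
import math

def fit_tds_coefficient(ec, tds):
    """Return (k, rmse) with k = sum(ec*tds) / sum(ec^2)."""
    k = sum(e*t for e, t in zip(ec, tds)) / sum(e*e for e in ec)
    resid = [t - k*e for e, t in zip(ec, tds)]
    rmse = math.sqrt(sum(r*r for r in resid) / len(resid))
    return k, rmse

ec = [1.0, 2.0, 3.0, 4.0]
tds = [16.0, 33.0, 49.0, 66.0]   # roughly 16.3 * EC, with noise
k, rmse = fit_tds_coefficient(ec, tds)
```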
Abstract: We report the results of a lattice Boltzmann
simulation of magnetohydrodynamic damping of sidewall convection
in a rectangular enclosure filled with a porous medium. In particular
we investigate the suppression of convection when a steady magnetic
field is applied in the vertical direction. The left and right vertical
walls of the cavity are kept at constant but different temperatures
while both the top and bottom horizontal walls are insulated. The
effects of the controlling parameters involved in the heat transfer and
hydrodynamic characteristics are studied in detail. The heat and mass
transfer mechanisms and the flow characteristics inside the enclosure
depend strongly on the strength of the magnetic field and the Darcy
number. The average Nusselt number decreases with rising values of
the Hartmann number, while it increases with increasing values of the
Darcy number.
Abstract: Nanotechnology is the science of creating, using and
manipulating objects that have at least one dimension in the range of
0.1 to 100 nanometers. In other words, nanotechnology is
reconstructing a substance from its individual atoms and arranging
them in a way that is desirable for our purpose.
The main reason that nanotechnology has been attracting attention
is the unique properties that objects show when they are formed at
the nano-scale. The differing characteristics that nano-scale
materials show compared to their naturally occurring forms are both
useful in creating high quality products and dangerous when the
materials come into contact with the body or spread into the
environment.
In order to control and lower the risk of such nano-scale particles,
the following three topics should be considered:
1) First, these materials can cause long-term diseases whose effects
may appear years after the particles have penetrated human organs,
and since this science has only recently developed at an industrial
scale, not enough information is available about their hazards to the
body.
2) Second, these particles can easily spread into the environment
and remain in air, soil or water for a very long time; moreover, they
have a high ability to penetrate the skin and cause new kinds of
diseases.
3) Third, to protect the body and the environment against the
danger of these particles, protective barriers must be finer than
these small objects, and such defenses are hard to accomplish.
This paper reviews, discusses and assesses the risks that humans and
the environment face as this new science develops at a high rate.
Abstract: The fuzzy fingerprint vault is a recently developed cryptographic construct based on the polynomial reconstruction problem, used to secure critical data with fingerprint data. However, previous research is not applicable to fingerprints having few minutiae, since it uses a fixed polynomial degree without considering the number of fingerprint minutiae. To solve this problem, we use an adaptive polynomial degree that considers the number of minutiae extracted from each user. We also apply multiple polynomials to avoid the possible degradation of security in a simple solution (i.e., one using a low-degree polynomial). Based on the experimental results, our method can make a possible attack 2^192 times more difficult than using a low-degree polynomial, while still verifying users having few minutiae.
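The polynomial-reconstruction core of the vault can be sketched as follows: the secret is encoded as polynomial coefficients over a prime field, genuine minutiae map to points on the polynomial, and enough matching points recover the secret by Lagrange interpolation. The field size and names below are illustrative assumptions; a real vault also adds chaff points and error-tolerant decoding, omitted here.

```python
# Sketch of the polynomial lock/unlock at the heart of a fuzzy vault.
# Illustrative only: a real vault adds chaff points, quantizes
# minutiae, and tolerates mismatches. P is an arbitrary small prime.
P = 65537

def poly_eval(coeffs, x, p=P):
    """Evaluate a polynomial (lowest-degree coefficient first) at x mod p."""
    y, xp = 0, 1
    for c in coeffs:
        y = (y + c * xp) % p
        xp = (xp * x) % p
    return y

def lock(secret_coeffs, minutiae_xs, p=P):
    """Bind the secret: each genuine minutia becomes a point on the polynomial."""
    return [(x, poly_eval(secret_coeffs, x, p)) for x in minutiae_xs]

def unlock(points, p=P):
    """Recover coefficients by Lagrange interpolation from exact points."""
    k = len(points)
    coeffs = [0] * k
    for i, (xi, yi) in enumerate(points):
        basis = [1]   # numerator polynomial of the i-th Lagrange basis
        denom = 1
        for j, (xj, _) in enumerate(points):
            if j == i:
                continue
            nxt = [0] * (len(basis) + 1)
            for t, c in enumerate(basis):   # multiply basis by (x - xj)
                nxt[t] = (nxt[t] - xj * c) % p
                nxt[t + 1] = (nxt[t + 1] + c) % p
            basis = nxt
            denom = (denom * (xi - xj)) % p
        scale = yi * pow(denom, p - 2, p) % p   # modular inverse via Fermat
        for t, c in enumerate(basis):
            coeffs[t] = (coeffs[t] + scale * c) % p
    return coeffs
```

The paper's adaptive-degree idea corresponds to choosing len(secret_coeffs) based on how many genuine minutiae a user can supply, so that enough matching points always exist for interpolation.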
Abstract: Trust is essential for further and wider acceptance of
contemporary e-services. It was first addressed almost thirty years
ago in the Trusted Computer System Evaluation Criteria standard by
the US DoD, but this and other proposed approaches of that period
were actually addressing security. Roughly ten years ago,
methodologies followed that addressed the trust phenomenon at its
core; they were based on Bayesian statistics and its derivatives,
while some approaches were based on game theory. However, trust is a
manifestation of judgment and reasoning processes. It has to be dealt
with in accordance with this fact and adequately supported in cyber
environments. On the basis of results in the field of psychology and
our own findings, a methodology called qualitative algebra has been
developed, which deals with so far overlooked elements of the trust
phenomenon. It complements existing methodologies and provides a
basis for a practical technical solution that supports the management
of trust in contemporary computing environments. Such a solution is
also presented at the end of this paper.
Abstract: Several methods have been proposed for color image
compression, but the reconstructed images had very low
signal-to-noise ratios, which made them inefficient. This paper
describes a lossy compression technique for color images that
overcomes these drawbacks. The technique works in the spatial domain,
where the pixel values of the RGB planes of the input color image are
mapped onto two-dimensional planes. The proposed technique produced
better results than JPEG2000 and 2DPCA; a comparative study is
reported based on image quality measures such as PSNR and MSE.
Experiments on real-time images compare this methodology with
previous ones and demonstrate its advantages.
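The quality measures cited above have standard definitions; a minimal sketch for 8-bit images stored as nested lists (not the paper's evaluation code):

```python
# Standard MSE and PSNR definitions for 8-bit images, as used in
# comparisons like the one above. Images are nested lists of pixels.
import math

def mse(a, b):
    """Mean squared error between two equally sized images."""
    flat_a = [p for row in a for p in row]
    flat_b = [p for row in b for p in row]
    return sum((x - y) ** 2 for x, y in zip(flat_a, flat_b)) / len(flat_a)

def psnr(a, b, max_val=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the original."""
    m = mse(a, b)
    return float('inf') if m == 0 else 10.0 * math.log10(max_val ** 2 / m)
```

For color images these measures are typically computed per RGB plane or on the flattened planes, which matches the plane-wise mapping the abstract describes.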
Abstract: The dynamics of Min proteins plays a central role in
accurate cell division. Although the nucleoid may presumably play an
important role in prokaryotic cell division, there is a lack of
models accounting for its participation. In this work, we apply the
lattice Boltzmann method to investigate protein oscillation based on
a mesoscopic model that takes the nucleoid's role into account. We
found that our numerical results are in reasonably good agreement
with previous experimental results. In comparison with other
computational models without the presence of nucleoids, the highlight
of our finding is that the local densities of MinD and MinE on the
cytoplasmic membrane increase, especially along the cell width, when
the size of the obstacle increases, leading to a more distinct
cap-like structure at the poles. This feature indicates a realistic
pattern and reflects the combination of Min protein dynamics and the
nucleoid's role.
Abstract: Renewable water resources are crucial production
variables in arid and semi-arid regions where intensive agriculture is
practiced to meet ever-increasing demand for food and fiber. This is
crucial for the Dez and Moghan command areas where water delivery
problems and adverse environmental issues are widespread. This
paper aims to identify major problem areas using on-farm surveys of
200 farmers, agricultural extensionists and water suppliers,
complemented by secondary data and field observations during the
2010-2011 cultivating season. The SPSS package was used to analyze
and synthesize the data. Results indicated inappropriate canal
operations in both schemes, though there was no unanimity about the
underlying causes. Inequitable and inflexible distribution was found
to be rooted in deficient hydraulic structures, particularly in the
main and secondary canals. The inadequacy and inflexibility of the
water scheduling regime were the underlying causes of recurring pest
and disease spread, which often led to declines in crop yield and
quality. Although these problems were not disputed, the water
suppliers were not prepared to link them to the deficiencies in the
operation of the main and secondary canals. They rather attributed
them to the prevailing salinity, alkalinity, water table fluctuations
and the leaching of valuable agro-chemical inputs from the plants'
root zone, with far-reaching consequences. Examples of these include
the pollution of ground and surface water resources due to
over-irrigation at the farm level, which falls under the growers' own
responsibility. Poor irrigation efficiency and adverse environmental
problems were attributed to deficient and outdated farming practices
that were in turn rooted in poor extension programs and irrational
water charges.