Abstract: The study identified the sources of production
inefficiency of the farming sector in the Faisalabad district of the
Punjab province of Pakistan. The Data Envelopment Analysis (DEA)
technique was applied to farm-level survey data from 300 farmers for the year
2009. The overall mean efficiency score was 0.78 indicating 22
percent inefficiency of the sample farmers. Computed efficiency
scores were then regressed on farm specific variables using Tobit
regression analysis. Farming experience, education, access to
farm credit, herd size, and the number of cultivation practices showed
a positive and significant effect on the farmers' technical
efficiency.
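The efficiency scores described above come from solving one linear program per farm. As a minimal sketch of the input-oriented CCR DEA model (the two-farm, single-input/single-output data below are invented for illustration, not the paper's survey data):

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency scores.

    X: (n_dmu, n_inputs) input matrix, Y: (n_dmu, n_outputs) output matrix.
    For each unit o, solve: min theta s.t. X'lam <= theta*x_o, Y'lam >= y_o.
    """
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]
        # input constraints: sum_j lam_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # output constraints: -sum_j lam_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(0, None)] * (n + 1))
        scores.append(res.fun)
    return np.array(scores)

X = np.array([[2.0], [4.0]])  # single input (e.g. land); illustrative
Y = np.array([[2.0], [2.0]])  # single output (e.g. yield); illustrative
print(dea_ccr_input(X, Y))    # farm 0 is efficient (1.0), farm 1 is not (0.5)
```

A second-stage Tobit regression of these scores on farm characteristics, as in the paper, would then treat each score as a censored dependent variable.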
Abstract: Designing, implementing, and debugging concurrency
control algorithms in a real system is a complex, tedious, and
error-prone process. Further, understanding concurrency control
algorithms and distributed computations is itself a difficult task.
Visualization can help with both of these problems. Thus, we have
developed an exploratory environment in which people can prototype
and test various versions of concurrency control algorithms, study
and debug distributed computations, and view performance statistics
of distributed systems. In this paper, we describe the exploratory
environment and show how it can be used to explore concurrency
control algorithms for the interactive steering of distributed
computations.
Abstract: Since 2008, a new economic crisis has affected the
entire planet. This crisis touches several domains of economic as
well as social life. Consumption decreases because households lack
the resources to increase their expenditures. Car manufacturing is
one of the main industrial activities in the European Union (EU),
and the present crisis affects it in particular. The present study
examines the correlations between several socio-economic indicators
and the car market in the European Union. The aim is to determine
the impact of the present economic crisis on the car market in the EU.
Abstract: Evolutionary Programming (EP) represents a
methodology of Evolutionary Algorithms (EA) in which mutation is
considered as a main reproduction operator. This paper presents a
novel EP approach for Artificial Neural Networks (ANN) learning.
The proposed strategy consists of two components: the self-adaptive
component, which contains phenotype information, and the dynamic
component, which is described by the genotype. Self-adaptation is achieved by the addition of
a value, called the network weight, which depends on a total number
of hidden layers and an average number of neurons in hidden layers.
The dynamic component changes its value depending on the fitness
of a chromosome, exposed to mutation. Thus, the mutation step size
is controlled by two components, encapsulated in the algorithm,
which adjust it according to the characteristics of a predefined ANN
architecture and the fitness of a particular chromosome. The
comparative analysis of the proposed approach and the classical EP
(Gaussian mutation) showed that a significant acceleration of the
evolution process is achieved by using both phenotype and genotype
information in the mutation strategy.
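For contrast, the classical self-adaptive EP with Gaussian mutation (the baseline in the paper's comparison) can be sketched as follows. The sphere objective and all parameter values are illustrative assumptions standing in for the ANN weight-learning setting:

```python
import numpy as np

rng = np.random.default_rng(0)

def ep_minimize(f, dim, pop=20, gens=200):
    """Classical self-adaptive EP: log-normal step-size adaptation + Gaussian mutation."""
    x = rng.normal(0.0, 1.0, (pop, dim))      # candidate solutions
    sigma = np.full((pop, dim), 0.5)          # per-component mutation step sizes
    tau = 1.0 / np.sqrt(2.0 * np.sqrt(dim))   # standard learning rates
    tau_p = 1.0 / np.sqrt(2.0 * dim)
    for _ in range(gens):
        # mutate step sizes first, then solutions
        s = sigma * np.exp(tau_p * rng.normal(size=(pop, 1))
                           + tau * rng.normal(size=(pop, dim)))
        child = x + s * rng.normal(size=(pop, dim))
        # (mu + mu) elitist selection: keep the best half of parents + offspring
        allx, alls = np.vstack([x, child]), np.vstack([sigma, s])
        fit = np.apply_along_axis(f, 1, allx)
        keep = np.argsort(fit)[:pop]
        x, sigma = allx[keep], alls[keep]
    return x[0], f(x[0])

# Sphere function as a stand-in for an ANN training error surface.
best, val = ep_minimize(lambda w: np.sum(w ** 2), dim=5)
print(val)
```

The paper's contribution replaces the purely self-adaptive step size with one also modulated by network topology (phenotype) and chromosome fitness (genotype).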
Abstract: The ability to distinguish missense nucleotide
substitutions that contribute to harmful effect from those that do not
is a difficult problem usually accomplished through functional in
vivo analyses. In this study, instead current biochemical methods, the
effects of missense mutations upon protein structure and function
were assayed by means of computational methods and information
from the databases. For this order, the effects of new missense
mutations in exon 5 of PTEN gene upon protein structure and
function were examined. The gene coding for PTEN was identified
and localized on chromosome region 10q23.3 as the tumor
suppressor gene. The utilization of these methods were shown that
c.319G>A and c.341T>G missense mutations that were recognized in
patients with breast cancer and Cowden disease, could be pathogenic.
This method could be use for analysis of missense mutation in others
genes.
Abstract: The concentrations of aliphatic and polycyclic aromatic hydrocarbons (PAH) were determined in atmospheric aerosol samples collected at a rural site in Hungary (K-puszta, summer 2008), a boreal forest (Hyytiälä,
April 2007) and a polluted rural area in Italy (San Pietro Capofiume, Po Valley, April 2008). A clear distinction between "clean" and "polluted" periods was observed. Concentrations obtained for Hyytiälä are significantly lower than those for the other two sites. Source reconciliation was performed using diagnostic parameters, such as the carbon preference index and ratios between PAH. The presence of an unresolved complex mixture of hydrocarbons, especially for the Finnish and Italian samples, is indicative of petrogenic inputs. In K-puszta, the aliphatic hydrocarbons are dominated by leaf wax n-alkanes. The long range transport of anthropogenic pollution contributed to the Finnish aerosol. Industrial activities and vehicular emissions represent major sources in San Pietro Capofiume. PAH in K-puszta consist of both pyrogenic and petrogenic compounds.
Abstract: This paper describes the optimization of a complex
dairy farm simulation model using two quite different methods of
optimization, the Genetic algorithm (GA) and the Lipschitz
Branch-and-Bound (LBB) algorithm. These techniques have been
used to improve an agricultural system model developed by Dexcel
Limited, New Zealand, which describes a detailed representation of
pastoral dairying scenarios and contains an 8-dimensional parameter
space. The model incorporates the sub-models of pasture growth and
animal metabolism, which are themselves complex in many cases.
Each evaluation of the objective function, a composite 'Farm
Performance Index (FPI)', requires simulation of at least a one-year
period of farm operation with a daily time-step, and is therefore
computationally expensive. The problem of visualization of the
objective function (response surface) in high-dimensional spaces is
also considered in the context of the farm optimization problem.
Adaptations of Sammon mapping and parallel coordinates
visualization are described which help visualize some important
properties of the model's output topography. From this study, it is
found that the GA requires fewer function evaluations in optimization
than the LBB algorithm.
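The Sammon mapping mentioned above can be sketched as plain gradient descent on the Sammon stress. The random 8-D input (standing in for sampled points of the model's 8-dimensional parameter space), the step size, and the iteration count are all illustrative assumptions:

```python
import numpy as np

def sammon_stress(DX, Y):
    """Sammon stress between input-space distances DX and a 2-D layout Y."""
    dy = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
    iu = np.triu_indices(len(Y), 1)
    return ((DX[iu] - dy[iu]) ** 2 / DX[iu]).sum() / DX[iu].sum()

def sammon(X, Y0, iters=500, lr=2.0):
    """Project X (n, d) to 2-D by fixed-step gradient descent on the stress."""
    DX = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    c = DX[np.triu_indices(len(X), 1)].sum()   # normalising constant
    np.fill_diagonal(DX, 1.0)                  # dummy value; diagonal masked below
    Y = Y0.copy()
    for _ in range(iters):
        dy = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
        np.fill_diagonal(dy, 1.0)
        w = (dy - DX) / (DX * dy)              # pairwise push/pull weights
        np.fill_diagonal(w, 0.0)
        grad = (2.0 / c) * (w[:, :, None] * (Y[:, None] - Y[None, :])).sum(axis=1)
        Y -= lr * grad
    return Y

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 8))        # stand-in for sampled 8-D model outputs
DX = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
Y0 = rng.normal(size=(30, 2))       # random initial 2-D layout
Y = sammon(X, Y0)
print(sammon_stress(DX, Y0), "->", sammon_stress(DX, Y))
```

Production implementations usually add the second-derivative normalization from Sammon's original algorithm; the fixed small step above keeps the sketch short.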
Abstract: This paper presents an application of 5S lean technology to a production facility. Due to increased demand, high product variety, and a push production system, the plant has suffered from excessive waste, unorganized workstations, and an unhealthy work environment. This has translated into increased production cost, frequent delays, and low worker morale. Under such conditions, it has become difficult, if not impossible, to implement effective continuous improvement studies. Hence, the lean project is aimed at diagnosing the production process, streamlining the workflow, removing/reducing process waste, cleaning the production environment, improving plant layout, and organizing workstations. 5S lean technology is utilized for achieving the project objectives. The work was a combination of both culture changes and tangible/physical changes on the shop floor. The project has drastically changed the plant and developed the infrastructure for a successful implementation of continuous improvement as well as other best practices and quality initiatives.
Abstract: Due to the environmental and price issues of the current
energy crisis, scientists and technologists around the globe are
intensively searching for new, environmentally less harmful forms of
clean energy that will reduce the high dependency on fossil fuels.
In particular, hydrogen can be produced economically from biomass
via thermochemical processes, including pyrolysis and gasification,
and the yield can be further enhanced through in-situ carbon dioxide
removal using calcium oxide. This work focuses on
the synthesis and development of the flowsheet for the enhanced
biomass gasification process in PETRONAS's iCON process
simulation software. The hydrogen prediction model is run at
operating temperatures between 600 and 1000 °C at atmospheric
pressure. Effects of temperature, steam-to-biomass ratio and
adsorbent-to-biomass ratio were studied and 0.85 mol fraction of
hydrogen is predicted in the product gas. Comparisons of the results
are also made with experimental data from literature. The
preliminary economic potential of the developed system is RM 12.57 ×
10^6 (equivalent to USD 3.77 × 10^6) annually, which shows the
economic viability of this process.
Abstract: This paper illustrates the use of a combined neural
network model for classification of electrocardiogram (ECG) beats.
We present a trainable neural network ensemble approach to develop
a customized electrocardiogram beat classifier in an effort to further
improve the performance of ECG processing and to offer
individualized health care.
We propose a three-stage technique for the detection of premature
ventricular contraction (PVC) beats from normal beats and other heart
diseases. The method comprises denoising, feature extraction, and
classification stages. At first we investigate the application of stationary
wavelet transform (SWT) for noise reduction of the
electrocardiogram (ECG) signals. The feature extraction module then
extracts 10 ECG morphological features and one timing interval
feature. A number of multilayer perceptron (MLP) neural
networks with different topologies are then designed.
The performance of the different combination methods as well as
the efficiency of the whole system is presented. Among them,
stacked generalization, the proposed trainable combined neural
network model, achieves the highest recognition rate of around 95%.
Therefore, this network proves to be a suitable candidate in ECG
signal diagnosis systems. ECG samples attributing to the different
ECG beat types were extracted from the MIT-BIH arrhythmia
database for the study.
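The stacked-generalization combination of MLPs can be sketched with scikit-learn. The synthetic 11-feature data (mirroring 10 morphological + 1 timing feature), the network sizes, and the logistic-regression combiner are assumptions, not the paper's MIT-BIH setup:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for beat feature vectors: 10 morphological + 1 timing.
X, y = make_classification(n_samples=600, n_features=11, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Level-0 learners: MLPs with different topologies, as in the ensemble idea;
# level-1: a trainable combiner fitted on cross-validated level-0 outputs.
base = [(f"mlp{h}", MLPClassifier(hidden_layer_sizes=(h,), max_iter=2000,
                                  random_state=h)) for h in (5, 10, 20)]
stack = StackingClassifier(estimators=base,
                           final_estimator=LogisticRegression(), cv=3)
stack.fit(X_tr, y_tr)
print(stack.score(X_te, y_te))
```

The `cv=3` argument is what makes this stacked generalization rather than simple voting: the combiner is trained on out-of-fold predictions of the base MLPs.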
Abstract: This work concerns the topological optimization
problem for determining the optimal petroleum refinery
configuration. We are interested in further investigating and
hopefully advancing the existing optimization approaches and
strategies employing logic propositions to conceptual process
synthesis problems. In particular, we seek to contribute to this
increasingly exciting area of chemical process modeling by
addressing the following potentially important issues: (a) how the
formulation of design specifications in a mixed-logical-and-integer
optimization model can be employed in a synthesis problem to enrich
the problem representation by incorporating past design experience,
engineering knowledge, and heuristics; and (b) how structural
specifications on the interconnectivity relationships by space (states)
and by function (tasks) in a superstructure should be properly
formulated within a mixed-integer linear programming (MILP)
model. The proposed modeling technique is illustrated on a case
study involving the alternative processing routes of naphtha, in which
significant improvement in the solution quality is obtained.
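The flavor of a logic proposition embedded in an MILP superstructure model can be sketched with SciPy. The three-unit "superstructure", the profit/cost numbers, and the single implication constraint are invented for illustration; they are not the paper's refinery model:

```python
import numpy as np
from scipy.optimize import Bounds, LinearConstraint, milp

# y_i = 1 if hypothetical processing unit i is included in the flowsheet.
profit = np.array([5.0, 4.0, 3.0])
cost = np.array([4.0, 3.0, 2.0])

A = np.array([
    cost,              # budget: 4*y0 + 3*y1 + 2*y2 <= 6
    [-1.0, 1.0, 0.0],  # logic proposition "y1 implies y0": y1 - y0 <= 0
])
con = LinearConstraint(A, -np.inf, [6.0, 0.0])

res = milp(c=-profit,                      # milp minimizes, so negate profit
           constraints=con,
           integrality=np.ones(3),         # all variables binary
           bounds=Bounds(0, 1))
print(res.x, -res.fun)                     # -> [1. 0. 1.] 8.0
```

Design specifications such as "unit B requires unit A" or "at most one of these routes" all reduce to linear inequalities over binary variables in exactly this way.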
Abstract: Unlike their conventional counterparts, Islamic banks
are forbidden by Islamic principles from taking any interest-related
income, which makes deposits from depositors an important source of
funds for their operations and financing. Consequently, the risk of
deposit withdrawal by depositors is an important aspect that should
be well managed in Islamic banking. This paper aims to investigate
factors that influence depositors' withdrawal behavior in Islamic banks,
particularly in Malaysia, using the framework of theory of reasoned
action. A total of 368 respondents from Klang valley are involved in
the analysis. The paper finds that all the construct variables, i.e.,
normative beliefs, subjective norms, behavioral beliefs, and attitude
towards behavior are perceived to be distinct by the respondents. In
addition, the structural equation model is able to verify the structural
relationships between subjective norms, attitude towards behavior
and behavioral intention. Subjective norms exert more influence on
depositors' decisions on deposit withdrawal than attitude towards
behavior.
Abstract: In this article we establish a moment inequality for
dependent random variables, and furthermore some theorems on the
strong law of large numbers and complete convergence for sequences
of dependent random variables. In particular, the Marcinkiewicz law
of large numbers for independent and identically distributed random
variables is generalized to the case of m0-dependent sequences.
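For reference, the classical i.i.d. Marcinkiewicz-Zygmund strong law that the paper generalizes can be stated as follows (standard textbook formulation, not quoted from the paper):

```latex
% Classical Marcinkiewicz–Zygmund SLLN (i.i.d. case)
\text{Let } X, X_1, X_2, \dots \text{ be i.i.d.\ with } \mathbb{E}|X|^p < \infty
\text{ for some } 0 < p < 2.\ \text{Then}
\qquad
\frac{S_n - nc}{n^{1/p}} \xrightarrow{\ \text{a.s.}\ } 0,
\qquad S_n = \sum_{i=1}^{n} X_i,
\qquad
c = \begin{cases} \mathbb{E}X, & 1 \le p < 2,\\ 0, & 0 < p < 1. \end{cases}
```

The paper's contribution is to obtain the same rate under an m0-dependence assumption in place of independence.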
Abstract: A virtual collaborative classroom was created at East Carolina University, using videoconference technology via the regular internet to bring students from 18 different countries, 2 at a time, to the ECU classroom in real time to learn about each other's culture. Students from two countries are partnered one on one; they meet for 4-5 weeks and submit a joint paper. Then the same process is repeated for two other countries. Lectures and student discussions are managed with pre-determined topics and questions. Classes are conducted in English and reading assignments are placed on the website. Administratively, all partners are independent; students pay fees and get credits at their home institutions. Familiarity with technology, knowledge in cultural understanding, and attitude change were assessed; only attitude changes are reported in this paper. After taking this course, all students stated their comfort level in working with, and their desire to interact with, culturally different others grew stronger and their xenophobia and isolationist attitudes decreased.
Abstract: The purpose of this paper is to consider the
introduction of online courses to replace the current classroom-based
staff training. The current training is practical, and must be
completed before access to the financial computer system is
authorized. The long term objective is to measure the efficacy,
effectiveness and efficiency of the training, and to establish whether
a transfer of knowledge back to the workplace has occurred. This
paper begins with an overview explaining the importance of staff
training in an evolving, competitive business environment and
defines the problem facing this particular organization. A summary
of the literature review is followed by a brief discussion of the
research methodology and objective. The implementation of the
alpha version of the online course is then described. This paper may
be of interest to those seeking insights into, or new theory regarding,
practical interventions of online learning in the real world.
Abstract: Coarse and fine particulate matter were collected at a
residential area at Vashi, Navi Mumbai and the filter samples were
analysed for trace elements using the PIXE technique. Particulate
matter concentrations were higher during winter than during summer
and the monsoon. High concentrations of elements related to soil and
sea salt were found in PM10 and PM2.5. High levels of zinc and
sulphur were also found in the particulates
of both the size fractions. EF analysis showed enrichment of Cu, Cr
and Mn only in the fine fraction suggesting their origin from
anthropogenic sources. The EF value was observed to be maximum
for As, Pb and Zn in the fine particulates. However, crustal derived
elements showed very low EF values indicating their origin from
soil. The PCA-based multivariate analysis identified soil, sea salt,
combustion, and Se as common sources for the coarse and fine
fractions; an industrial source was additionally identified for the
fine particles.
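The enrichment factor (EF) used above is conventionally computed relative to a crustal reference element (commonly Al or Fe; the abstract does not state which reference was used, so this is the generic definition rather than the paper's exact choice):

```latex
\mathrm{EF}_X \;=\;
\frac{\left( C_X / C_{\mathrm{ref}} \right)_{\text{aerosol}}}
     {\left( C_X / C_{\mathrm{ref}} \right)_{\text{crust}}}
```

Values near 1 indicate a crustal (soil) origin, consistent with the low EFs reported for crustal elements, while values well above 10, as reported for As, Pb, and Zn in the fine fraction, point to anthropogenic enrichment.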
Abstract: Most electrical distribution systems incur large
losses because loads are widespread and reactive power compensation
facilities are inadequate and improperly controlled. A typical static
VAR compensator consists of a capacitor bank in binary sequential
steps operated in conjunction with a thyristor controlled reactor of the
smallest step size. This SVC facilitates stepless control of reactive
power closely matching with load requirements so as to maintain
power factor near unity. This type of SVC requires an
appropriately controlled TCR. This paper deals with an air-cored
reactor suitable for a distribution transformer of 3-phase, 50 Hz,
Dy11, 11 kV/433 V, 125 kVA capacity. Air-cored reactors are designed,
built, tested and operated in conjunction with capacitor bank in five
binary sequential steps. It is established how the delta connected TCR
minimizes the harmonic components and the operating range for
various electrical quantities as a function of firing angle is
investigated. In particular, the variation with firing angle of line
and phase currents, DC components, THDs, active and reactive powers,
odd and even triplen harmonics, and dominant characteristic
harmonics is investigated, and a range of firing angles is fixed for
satisfactory operation. The harmonic spectra for phase and line
quantities at specified firing angles are given. Operating the TCR
within the bounds specified in this paper, which were established
through simulation studies, yields the best possible operating
condition, particularly one free from all dominant harmonics.
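The stepless control referred to above follows from the standard fundamental-frequency susceptance of a TCR as a function of the firing angle α (measured from the voltage zero crossing, α ∈ [90°, 180°]); this is the textbook relation, not a formula quoted from the paper:

```latex
B_{\mathrm{TCR}}(\alpha) \;=\; \frac{2(\pi - \alpha) + \sin 2\alpha}{\pi X_L}
```

At α = 90° the reactor conducts fully and B = 1/X_L; at α = 180° the susceptance is zero, so varying α sweeps the reactive absorption continuously between these limits. The delta connection noted in the abstract confines the triplen (zero-sequence) harmonic currents to circulate inside the delta loop, keeping them out of the line currents.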
Abstract: The shortest path (SP) problem concerns finding the shortest path from a specified origin to a specified destination in a given network while minimizing the total cost associated with the path. This problem has widespread applications. Important applications of the SP problem include vehicle routing in transportation systems, particularly in the field of in-vehicle Route Guidance Systems (RGS), and the traffic assignment problem (in transportation planning). Evolutionary methods such as Genetic Algorithms (GA), Ant Colony Optimization, and Particle Swarm Optimization (PSO) have been applied to complex optimization problems to overcome the shortcomings of existing shortest path analysis methods. It has been reported by various researchers that PSO performs better than other evolutionary optimization algorithms in terms of success rate and solution quality. Further, Geographic Information Systems (GIS) have emerged as key information systems for geospatial data analysis and visualization. This research paper focuses on the application of PSO for solving the shortest path problem between multiple points of interest (POI) based on spatial data of Allahabad City and traffic speed data collected using GPS. Geovisualization of the results of the analysis is carried out in GIS.
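One common way to apply PSO to the shortest path problem is priority-based encoding: each particle is a vector of node priorities, decoded greedily into a path. The toy network, penalty scheme, and PSO coefficients below are illustrative assumptions (the paper uses Allahabad City road and GPS speed data):

```python
import numpy as np

# Hypothetical small road network: {node: {neighbour: travel cost}}.
GRAPH = {0: {1: 4, 2: 2}, 1: {0: 4, 2: 1, 3: 5},
         2: {0: 2, 1: 1, 3: 8, 4: 10}, 3: {1: 5, 2: 8, 4: 2, 5: 6},
         4: {2: 10, 3: 2, 5: 3}, 5: {3: 6, 4: 3}}

def decode(priority, src, dst):
    """Greedy path: always step to the unvisited neighbour with top priority."""
    path, node = [src], src
    while node != dst:
        nxt = [v for v in GRAPH[node] if v not in path]
        if not nxt:
            return path, float("inf")      # dead end -> penalised
        node = max(nxt, key=lambda v: priority[v])
        path.append(node)
    return path, sum(GRAPH[a][b] for a, b in zip(path, path[1:]))

def pso_shortest_path(src, dst, n_particles=40, iters=150, seed=0):
    rng = np.random.default_rng(seed)
    n = len(GRAPH)
    x = rng.uniform(0, 1, (n_particles, n))   # node-priority vectors
    v = np.zeros_like(x)
    pbest = x.copy()
    pcost = np.array([decode(p, src, dst)[1] for p in x])
    g = pbest[np.argmin(pcost)].copy()
    for _ in range(iters):
        r1, r2 = rng.uniform(size=(2, n_particles, n))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = x + v
        cost = np.array([decode(p, src, dst)[1] for p in x])
        better = cost < pcost
        pbest[better], pcost[better] = x[better], cost[better]
        g = pbest[np.argmin(pcost)].copy()
    return decode(g, src, dst)

path, cost = pso_shortest_path(0, 5)
print(path, cost)
```

On a network this small, exact methods such as Dijkstra's algorithm are of course preferable; the encoding is what scales the idea to the constrained, multi-POI settings described in the abstract.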
Abstract: Testing accounts for the major percentage of technical
contribution in the software development process. Typically, it
consumes more than 50 percent of the total cost of developing a
piece of software. The selection of software tests is a very important
activity within this process to ensure the software reliability
requirements are met. Generally tests are run to achieve maximum
coverage of the software code and very little attention is given to the
achieved reliability of the software. Using an existing methodology,
this paper describes how to use Bayesian Belief Networks (BBNs) to
select unit tests based on their contribution to the reliability of the
module under consideration. In particular the work examines how the
approach can enhance test-first development by assessing the quality
of test suites resulting from this development methodology and
providing insight into additional tests that can significantly
improve the achieved reliability. In this way the method can produce an
optimal selection of inputs and the order in which the tests are
executed to maximize the software reliability. To illustrate this
approach, a belief network is constructed for a modern software
system incorporating the expert opinion, expressed through
probabilities of the relative quality of the elements of the software,
and the potential effectiveness of the software tests. The steps
involved in constructing the Bayesian network are explained, as is a
method to account for the test suite resulting from test-driven
development.
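The belief update underlying such a network can be illustrated with a deliberately tiny model: one "module reliable" node and conditionally independent test-outcome nodes. All probabilities below are invented for illustration; the paper's BBN, built from expert opinion, is far richer:

```python
# Toy two-layer belief network: reliability node -> independent test outcomes.
PRIOR_RELIABLE = 0.7                  # expert prior that the module is reliable
P_PASS = {True: 0.95, False: 0.40}    # P(test passes | reliable / faulty)

def posterior_reliability(outcomes, prior=PRIOR_RELIABLE):
    """Sequential Bayes update of P(reliable) from pass/fail observations."""
    p = prior
    for passed in outcomes:
        like_r = P_PASS[True] if passed else 1 - P_PASS[True]    # reliable case
        like_f = P_PASS[False] if passed else 1 - P_PASS[False]  # faulty case
        p = like_r * p / (like_r * p + like_f * (1 - p))
    return p

print(posterior_reliability([True, True, True]))   # passes raise belief
print(posterior_reliability([True, True, False]))  # one failure pulls it down
```

Test selection in this spirit ranks candidate tests by how much their expected outcome would move the posterior reliability, which is the quantity the paper's method seeks to maximize.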
Abstract: Software developed for a specific customer under contract
typically undergoes a period of testing by the customer before
acceptance. This is known as user acceptance testing and the process
can reveal both defects in the system and requests for changes to
the product. This paper uses nonhomogeneous Poisson processes to
model a real user acceptance data set from a recently developed
system. In particular a split Poisson process is shown to provide an
excellent fit to the data. The paper explains how this model can be
used to aid the allocation of resources through the accurate
prediction of defect and change-request occurrences both during the
acceptance testing phase and before this activity begins.
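Fitting an NHPP of this kind amounts to fitting a mean value function to cumulative event counts. The sketch below uses a standard Goel-Okumoto form on synthetic data rather than the paper's split Poisson model and real data set, which the abstract does not reproduce; all numbers are illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    """NHPP mean value function: expected cumulative events by time t."""
    return a * (1.0 - np.exp(-b * t))

rng = np.random.default_rng(0)
t = np.arange(1, 41, dtype=float)                 # 40 days of acceptance testing
true = goel_okumoto(t, 100.0, 0.08)               # hypothetical ground truth
counts = true + rng.normal(0.0, 2.0, t.size)      # noisy cumulative counts

(a_hat, b_hat), _ = curve_fit(goel_okumoto, t, counts, p0=(50.0, 0.1))
remaining = a_hat - goel_okumoto(40.0, a_hat, b_hat)  # predicted events left
print(a_hat, b_hat, remaining)
```

The fitted `a_hat` estimates the total number of events the process will ever produce, so `remaining` is the kind of forward prediction the paper uses for resource allocation; a split process would fit separate intensities for defects and change requests.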