Abstract: This study identified the sources of production
inefficiency in the farming sector of district Faisalabad in the Punjab
province of Pakistan. The Data Envelopment Analysis (DEA) technique
was applied to farm-level survey data from 300 farmers for the year
2009. The overall mean efficiency score was 0.78, indicating 22
percent inefficiency among the sample farmers. The computed efficiency
scores were then regressed on farm-specific variables using Tobit
regression analysis. Farming experience, education, access to
farm credit, herd size, and the number of cultivation practices showed
a positive and significant effect on the farmers' technical
efficiency.
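The DEA efficiency scores mentioned above come from solving one small linear program per farm. As a minimal generic sketch (not the study's implementation), an input-oriented CCR model can be solved with `scipy`; `X` and `Y` are hypothetical input/output matrices:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency score for each decision-making unit.
    X: (n, m) array of inputs, Y: (n, s) array of outputs."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.zeros(1 + n)
        c[0] = 1.0                              # minimize theta
        A_ub, b_ub = [], []
        for i in range(m):                      # sum_j lam_j * x_ij <= theta * x_io
            A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
            b_ub.append(0.0)
        for r in range(s):                      # sum_j lam_j * y_rj >= y_ro
            A_ub.append(np.concatenate(([0.0], -Y[:, r])))
            b_ub.append(-Y[o, r])
        res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
                      bounds=[(0, None)] * (1 + n), method="highs")
        scores.append(res.x[0])
    return np.array(scores)
```

A fully efficient farm gets a score of 1; a farm producing the same output from twice the input gets 0.5.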
Abstract: In this paper, we propose a geometric model of the
illumination on a patterned image containing etched transistors. The
image is captured by a commercial camera during the inspection of
a TFT-LCD panel. Defect inspection is an important process in the
production of LCD panels, but regional differences in brightness,
caused by an uneven illumination environment, have a negative effect
on the inspection. To solve this problem, we present
a geometric model of the illumination consisting of an interpolation
using the least squares method and 3D modeling using a Bézier surface.
By using a sampling method, our computational time is shorter
than that of previous methods. Moreover, the model can be further used
to correct brightness in every patterned image.
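The Bézier-surface component of such an illumination model can be illustrated with a direct Bernstein-basis evaluation. This is a generic sketch, not the authors' code; the control net `P` stands in for sampled brightness values:

```python
import numpy as np
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def bezier_surface(P, u, v):
    """Evaluate a Bézier surface with scalar control net P of shape
    (n+1, m+1) at parameters (u, v) in [0, 1] x [0, 1]."""
    n, m = P.shape[0] - 1, P.shape[1] - 1
    return sum(bernstein(n, i, u) * bernstein(m, j, v) * P[i, j]
               for i in range(n + 1) for j in range(m + 1))
```

Because the Bernstein weights sum to one, a constant control net reproduces a constant brightness field, and the surface interpolates the corner control points.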
Abstract: Designing, implementing, and debugging concurrency
control algorithms in a real system is a complex, tedious, and
error-prone process. Further, understanding concurrency control
algorithms and distributed computations is itself a difficult task.
Visualization can help with both of these problems. Thus, we have
developed an exploratory environment in which people can prototype
and test various versions of concurrency control algorithms, study
and debug distributed computations, and view performance statistics
of distributed systems. In this paper, we describe the exploratory
environment and show how it can be used to explore concurrency
control algorithms for the interactive steering of distributed
computations.
Abstract: Drinking water is one of the most valuable resources
available to mankind. The presence of pathogens in drinking water is
highly undesirable. Because of the lateritic soil, iron
concentrations are high in the groundwater. High concentrations of iron
and other trace elements can restrict bacterial growth and modify
bacterial metabolic patterns as well; the bacterial growth rate is
reduced in the presence of iron in water. This paper presents the
results of a controlled laboratory study conducted to assess the
inhibition of micro-organisms (pathogens) in well waters in the
presence of dissolved iron. Synthetic samples were studied in the
laboratory and the results compared with field samples. A predictive
model for microbial inhibition in the presence of iron is presented.
The bore well, open well, and field results varied, probably owing to
the nature of the micro-organisms utilizing the iron in well waters.
Abstract: In the semiconductor manufacturing process, large
amounts of data are collected from various sensors across multiple
facilities. The collected sensor data have several different
characteristics owing to variables such as product type, preceding
processes, and recipes. In general, Statistical Quality Control (SQC)
methods assume normality of the data in order to detect out-of-control
states of processes. Because the collected data have different
characteristics, using them directly as SQC inputs increases the
variation of the data, requires wide control limits, and decreases the
ability to detect out-of-control states. It is therefore necessary to
separate similar data groups from the mixed data for more accurate
process control. In this paper, we propose a regression tree using a
split algorithm based on the Pearson distribution to handle non-normal
distributions in a parametric manner. The regression tree finds
similar properties of data across different variables. Experiments
using real semiconductor manufacturing process data show improved
fault-detection performance.
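The core of any regression-tree split search is scanning candidate thresholds and scoring the resulting partition. The paper's criterion is based on the Pearson distribution family; as a generic stand-in, the sketch below uses plain within-node variance reduction to show the mechanics:

```python
import numpy as np

def best_split(x, y):
    """Return the threshold on feature x that minimizes the pooled
    within-node variance of the response y (generic CART-style split,
    not the paper's Pearson-distribution criterion)."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_t, best_score = None, np.inf
    for k in range(1, len(xs)):
        if xs[k] == xs[k - 1]:
            continue                       # cannot split between equal values
        left, right = ys[:k], ys[k:]
        score = len(left) * left.var() + len(right) * right.var()
        if score < best_score:
            best_score, best_t = score, (xs[k - 1] + xs[k]) / 2
    return best_t
```

A Pearson-based variant would replace the variance score with a goodness-of-fit measure of each child node against a fitted Pearson-family distribution.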
Abstract: The aim of this article is to assess the existing
business models used by banks operating in the CEE countries in
the period from 2006 to 2011.
To obtain the research results, the authors performed a
qualitative analysis of the scientific literature on bank business
models, which have been grouped into clusters consisting of the
following components: 1) capital and reserves; 2) assets; 3) deposits;
and 4) loans.
In turn, the bank business models have been developed based on
the types of core activities of the banks and divided into
four groups: Wholesale, Investment, Retail, and Universal Banks.
Descriptive statistics have been used to analyse the models,
determining the mean, minimal, and maximal values of the constituent
cluster components, as well as the standard deviation. The analysis of
the data is based on such bank performance indicators as Return on
Assets (ROA) and Return on Equity (ROE).
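The descriptive statistics and ratio indicators used in the analysis can be sketched as follows; this is a minimal illustration with hypothetical variable names, not the study's code:

```python
import numpy as np

def descriptive_stats(values):
    """Mean, min, max, and sample standard deviation of a cluster component."""
    v = np.asarray(values, dtype=float)
    return {"mean": v.mean(), "min": v.min(),
            "max": v.max(), "std": v.std(ddof=1)}

def roa(net_income, total_assets):
    """Return on Assets: net income over total assets."""
    return net_income / total_assets

def roe(net_income, equity):
    """Return on Equity: net income over shareholders' equity."""
    return net_income / equity
```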
Abstract: A large number of engineering drawings are referred to during the planning process, and changes to them produce a good engineering design to meet the demand for a new model. The advantage of reusing engineering designs is that it allows continuous product development, further improving the quality of product development and thus reducing development costs. However, retrieving existing engineering drawings is time consuming, a complex process, and prone to errors. An engineering drawing file searching system is proposed to solve this problem. It is essential for engineers and designers to have some medium that enables them to search for drawings in the most effective way. This paper lays out the proposed research project in the area of information extraction from engineering drawings.
Abstract: A recent neuro-spiking coding scheme for feature extraction from biosonar echoes of various plants is examined with a variety of stochastic classifiers. The derived feature vectors are employed in well-known stochastic classifiers, including nearest-neighborhood, single Gaussian, and Gaussian mixture classifiers with EM optimization. The classifiers' performances are evaluated using cross-validation and bootstrapping techniques. It is shown that the various classifiers perform equivalently and that the modified preprocessing configuration yields considerably improved results.
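Of the classifiers mentioned, the single-Gaussian one is the easiest to show compactly: fit one Gaussian per class and assign each feature vector to the class with the highest log-likelihood. This is a generic numpy sketch, not the study's implementation:

```python
import numpy as np

class GaussianClassifier:
    """Per-class single-Gaussian (quadratic discriminant) classifier."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            mu = Xc.mean(axis=0)
            # small ridge keeps the covariance invertible
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])
            self.params_[c] = (mu, np.linalg.inv(cov),
                               np.linalg.slogdet(cov)[1])
        return self

    def predict(self, X):
        scores = []
        for c in self.classes_:
            mu, prec, logdet = self.params_[c]
            d = X - mu
            # log-likelihood up to a shared constant
            ll = -0.5 * (np.einsum('ij,jk,ik->i', d, prec, d) + logdet)
            scores.append(ll)
        return self.classes_[np.argmax(np.stack(scores), axis=0)]
```

A Gaussian mixture extends this by modeling each class with several such components, with weights and parameters fit by EM.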
Abstract: Uranium mining and processing in Brazil occur in a
northeastern area near Caetité-BA. Several Non-Governmental
Organizations claim that uranium mining in this region is a pollutant
causing health risks to the local population, but those in charge of
the complex extraction and production of "yellow cake" for generating
fuel for nuclear power plants reject these allegations. This study
aimed to identify potential problems caused by the mining to the
population of Caetité. In this work, the concentrations of the 238U,
232Th, and 40K radioisotopes in the teeth of the Caetité population
were determined by ICP-MS. Teeth are used as bioindicators of
incorporated radionuclides. Cumulative radiation doses in the
skeleton were also determined. The concentration values were below
0.008 ppm, and the annual effective doses due to the radioisotopes are
below the reference values. Therefore, it is not possible to state
that the mining process in Caetité increases pollution or radiation
exposure in a meaningful way.
Abstract: This paper utilizes finite element analysis to study
the bearing capacity of ring footings on a two-layered soil. The upper
layer, on which the footing is placed, is soft clay, and the layer
underneath is cohesionless sand. The Mohr–Coulomb plastic yield
criterion is employed to model the soils. The effects of two factors,
the clay layer thickness and the ratio of the internal radius of the
ring footing to its external radius, have been analyzed. It is found
that the bearing capacity decreases as the value of ri / ro increases.
The bearing capacity also decreases gradually as the clay layer
thickness increases.
Abstract: If an organization like Mellat Bank wants to identify its
customer market completely in order to reach its specified goals, it
can segment the market so as to offer the right product package to the
right segment. Our objective is to offer a segmentation model for the
Iranian banking market from Mellat Bank's point of view. The
methodology of this project combines "segmentation on the basis of
four qualitative variables" and "segmentation on the basis of
differences in means". The required data were gathered from e-systems
and the researcher's personal observation. Finally, the research
proposes that the organization first form a four-dimensional matrix
with 756 segments using four variables, namely value-based,
behavioral, activity style, and activity level; then, in a second
step, calculate the mean profit for every cell of the matrix at two
distinguished work levels (level α1: normal conditions and level α2:
high-pressure conditions) and compare the segments by checking two
conditions: 1) homogeneity of every segment with its sub-segments, and
2) heterogeneity with other segments; the necessary segmentation
process can then be carried out. Finally, the last proposal (explained
further through an operational example and a feedback algorithm) is to
test and update the model, given the dynamic environment, technology,
and banking system.
Abstract: Since 2008, a new economic crisis has been present across
the entire planet. This crisis affects several domains of economic as
well as social life. Consumption decreases because households lack the
resources necessary to increase their expenditures. Car manufacturing
is one of the main industrial activities in the European Union (EU),
and the present crisis affects it in particular. The present study
examines the correlations between several socio-economic indicators
and the car market in the European Union. The aim is to determine the
impact of the present economic crisis on the car market in the EU.
Abstract: Data mining incorporates a group of statistical
methods used to analyze a set of information, or a data set. It
operates with models and algorithms, which are powerful tools with
great potential. They can help people understand the patterns in a
given chunk of information, so data mining tools clearly have a wide
area of application. For example, in theoretical chemistry, data
mining tools can be used to predict molecular properties or improve
computer-assisted drug design. Classification analysis is one of the
major data mining methodologies. The aim of this contribution is to
create a classification model able to deal with a huge data set with
high accuracy. For this purpose, logistic regression, Bayesian
logistic regression, and random forest models were built using the R
software. A Bayesian logistic regression model in the Latent GOLD
software was created as well. These classification methods belong to
the supervised learning methods.
It was necessary to reduce the dimension of the data matrix before
constructing the models, and thus factor analysis (FA) was used. The
models were applied to predict the biological activity of molecules,
i.e. potential new drug candidates.
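The simplest of the models listed, plain logistic regression, can be sketched in a few lines; this generic numpy gradient-descent version is only an illustration of the technique, not the R or Latent GOLD models used in the study:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Binary logistic regression fit by batch gradient descent.
    A bias column is prepended to X; y holds 0/1 labels."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))      # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)      # gradient of the log-loss
    return w

def predict_logistic(w, X):
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) >= 0.5).astype(int)
```

In the study's setting, the columns of `X` would be the factor scores produced by the FA dimension-reduction step, and `y` the binary biological-activity label.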
Abstract: Evolutionary Programming (EP) represents a
methodology of Evolutionary Algorithms (EA) in which mutation is the
main reproduction operator. This paper presents a novel EP approach to
Artificial Neural Network (ANN) learning. The proposed strategy
consists of two components: a self-adaptive component, which contains
phenotype information, and a dynamic component, which is described by
the genotype. Self-adaptation is achieved by the addition of a value,
called the network weight, which depends on the total number of hidden
layers and the average number of neurons in the hidden layers. The
dynamic component changes its value depending on the fitness of the
chromosome exposed to mutation. Thus, the mutation step size is
controlled by two components, encapsulated in the algorithm, which
adjust it according to the characteristics of the predefined ANN
architecture and the fitness of a particular chromosome. A comparative
analysis of the proposed approach and classical EP (Gaussian mutation)
showed that a significant acceleration of the evolution process is
achieved by using both phenotype and genotype information in the
mutation strategy.
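The classical EP baseline referred to above perturbs each weight with zero-mean Gaussian noise whose step size scales with the chromosome's (error) fitness. A minimal generic sketch, with `beta` a hypothetical scaling constant and not part of the paper:

```python
import numpy as np

def ep_mutate(weights, fitness, rng, beta=0.1):
    """Classical EP Gaussian mutation: perturb every weight with
    zero-mean noise of standard deviation sqrt(beta * fitness),
    so worse (higher-error) individuals take larger steps."""
    sigma = np.sqrt(beta * fitness)
    return weights + rng.normal(0.0, sigma, size=weights.shape)
```

The paper's variant additionally rescales this step using the phenotype-derived network weight (a function of hidden-layer count and width) together with the fitness-driven dynamic component.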
Abstract: The ability to distinguish missense nucleotide
substitutions that contribute to harmful effects from those that do
not is a difficult problem, usually tackled through functional in
vivo analyses. In this study, instead of current biochemical methods,
the effects of missense mutations on protein structure and function
were assayed by means of computational methods and information from
databases. To this end, the effects of new missense mutations in exon
5 of the PTEN gene on protein structure and function were examined.
The gene coding for PTEN has been identified and localized on
chromosome region 10q23.3 as a tumor suppressor gene. The application
of these methods showed that the c.319G>A and c.341T>G missense
mutations, which were recognized in patients with breast cancer and
Cowden disease, could be pathogenic. This method could be used for the
analysis of missense mutations in other genes.
Abstract: This paper seeks to develop a simple yet practical and
efficient control scheme that enables cooperating arms to handle a
flexible beam. Specifically, the problem studied herein is that of two
arms rigidly grasping a flexible beam and thus capable of generating
forces/moments in such a way as to move the beam along a
predefined trajectory. The paper develops a sliding mode control law
that provides robustness against model imperfection and uncertainty,
and it also provides an implicit stability proof. Simulation results
for two three-joint arms moving a flexible beam are presented to
validate the theoretical results.
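The sliding-mode idea can be illustrated on a far simpler plant than the two-arm/flexible-beam system: a double integrator regulated with the standard surface s = e' + λe and switching law u = -K·sign(s). This is a generic sketch of the technique, not the paper's controller:

```python
import numpy as np

def simulate_smc(x0=1.0, v0=0.0, x_ref=0.0, lam=2.0, K=5.0, dt=1e-3, T=5.0):
    """Sliding-mode regulation of the double integrator x'' = u.
    Sliding surface: s = e_dot + lam * e, with e = x - x_ref.
    Switching law:   u = -K * sign(s).
    Returns the final position after T seconds of Euler integration."""
    x, v = x0, v0
    for _ in range(int(T / dt)):
        e, e_dot = x - x_ref, v
        s = e_dot + lam * e
        u = -K * np.sign(s)     # discontinuous control drives s to 0
        v += u * dt
        x += v * dt
    return x
```

Once the state reaches the surface, e decays exponentially at rate λ regardless of bounded model error, which is the robustness property the paper exploits (with a boundary layer or similar smoothing typically added in practice to limit chattering).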
Abstract: Particle detection in very noisy and low-contrast images
is an active field of research in image processing. In this article, a
method is proposed for the efficient detection and sizing of
subsurface spherical particles, and it is applied to the processing of
softly fused Au nanoparticles. Transmission Electron Microscopy (TEM)
is used for imaging the nanoparticles, and the proposed algorithm has
been tested with the two-dimensional projected TEM images obtained.
The results are compared with data obtained by transmission optical
spectroscopy, as well as with conventional circular object detection
algorithms.
Abstract: This paper covers various aspects of Internet film
piracy. To deal successfully with this matter, it is necessary to
recognize and explain the various motivational factors related to film
piracy. Thus, this study proposes groups of economic,
socio-psychological, and other factors that can motivate individuals
to engage in piracy. The paper also studies the interactions
between downloaders and uploaders and traces the causality between the
motivational factors and their effects on the film industry.
Moreover, the study proposes a scheme relating the downloading of
movies to its possible effect on box office revenues.
Abstract: Although oil-based drilling fluids are of paramount practical and economic interest, they represent a serious source of pollution once released into the environment as drill cuttings. The aim of this study is to assess the capability of isolated microorganisms to degrade gasoil fuel. The commonly used physicochemical and biodegradation remediation techniques for petroleum-contaminated soil were both investigated. The study revealed that natural biodegradation is favorable. However, the presence of heavy metals, the moisture level (8.55%), and nutrient deficiencies put severe constraints on the microorganisms' survival ranges, inhibiting the biodegradation process. The selected strains were able to degrade the diesel fuel at significantly high rates (around 98%).
Abstract: Unified Speech and Audio Coding (USAC), the latest MPEG standard for unified speech and audio coding, uses a speech/audio classification algorithm to distinguish the speech and audio segments of the input signal. Owing to a shortcoming of the system, introducing a well-designed orchestra/percussion classification and modifying the subsequent processing can considerably increase the quality of the recovered audio. This paper proposes an orchestra/percussion classification algorithm for the USAC system that extracts only 3 scales of Mel-Frequency Cepstral Coefficients (MFCCs) rather than the traditional 13 scales and uses the Iterative Dichotomiser 3 (ID3) decision tree rather than more complex learning methods; the proposed algorithm therefore has lower computational complexity than most existing algorithms. Considering that frequent switching between the two classes may lead to quality loss in the recovered audio signal, this paper also designs a modified subsequent process that helps the whole classification system reach an accuracy of 97%, comparable to the 99% of classical approaches.
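The ID3 criterion mentioned above grows the tree by repeatedly splitting on the attribute with the highest information gain. A minimal generic sketch of that criterion (not the paper's implementation, which operates on the 3-scale MFCC features):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Information gain of splitting on discrete attribute index `attr`:
    parent entropy minus the weighted entropy of the child partitions."""
    n = len(rows)
    by_value = {}
    for row, lab in zip(rows, labels):
        by_value.setdefault(row[attr], []).append(lab)
    remainder = sum(len(ls) / n * entropy(ls) for ls in by_value.values())
    return entropy(labels) - remainder
```

An attribute that perfectly predicts the orchestra/percussion label yields a gain equal to the full label entropy, while an uninformative attribute yields a gain of zero.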