Abstract: Instead of traditional (nominal) classification, we investigate
ordinal classification, or ranking. An enhanced method based on an
ensemble of Support Vector Machines (SVMs) is proposed. Each binary
classifier is trained with specific weights for each object in the
training data set. Experiments on benchmark datasets and synthetic data
indicate that the performance of our approach is comparable to
state-of-the-art kernel methods for ordinal regression. The ensemble
method, which is straightforward to implement, provides a very good
sensitivity-specificity trade-off for the highest and lowest ranks.
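The rank decoding behind such an ensemble can be sketched as follows. This is a minimal illustration of the standard reduction of a K-rank ordinal problem to K-1 binary subproblems ("is the rank greater than k?"); simple one-dimensional threshold rules stand in for the paper's weighted SVMs, and all data below are invented for illustration.

```python
# Minimal sketch: ordinal classification via K-1 binary subproblems.
# Simple 1-D threshold rules stand in for the weighted binary SVMs.

def train_binary(X, y, k):
    """'Train' the k-th binary classifier: is the rank greater than k?
    Here this is just a midpoint threshold on a 1-D feature."""
    lo = [x for x, r in zip(X, y) if r <= k]
    hi = [x for x, r in zip(X, y) if r > k]
    return (max(lo) + min(hi)) / 2.0

def predict_rank(thresholds, x):
    """Predicted rank = 1 + number of binary classifiers voting 'greater'."""
    return 1 + sum(x > t for t in thresholds)

# Toy data: one feature, ranks 1..3.
X = [0.1, 0.2, 0.4, 0.5, 0.8, 0.9]
y = [1, 1, 2, 2, 3, 3]
thresholds = [train_binary(X, y, k) for k in range(1, 3)]
print([predict_rank(thresholds, x) for x in [0.15, 0.45, 0.85]])  # → [1, 2, 3]
```

The combination rule (predicted rank = one plus the number of "greater than k" votes) is what gives each end of the rank scale its own dedicated binary decision boundary, which is consistent with the sensitivity-specificity behaviour reported for the extreme ranks.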
Abstract: The rapid pace of technological advancement and the
consequent widening of the digital divide have resulted in the
marginalization of the disabled, especially the communication
challenged. The dearth of suitable technologies for the development
of assistive technologies has served to further marginalize the
communication-challenged user population and widen this chasm even
further. The varying levels of disability create an associated
requirement for customized solutions. This paper explains the use of
Software Development Kits (SDKs) to bridge this communications
divide, drawing on industry-popular communications SDKs to identify
the requirements of communication-challenged users as well as
appropriate frameworks for future development initiatives.
Abstract: Automatic methods of detecting changes through
satellite imaging are the object of growing interest, especially
because of numerous applications linked to analysis of the Earth’s
surface or the environment (monitoring vegetation, updating maps,
risk management, etc.). This work implemented spatial analysis
techniques by using images with different spatial and spectral
resolutions on different dates. The work was based on the principle
of control charts in order to set the upper and lower limits beyond
which a change would be noted. Later, the a contrario approach was
used. This was done by testing different thresholds for which the
difference calculated between two pixels was significant. Finally,
labeled images were considered, giving a particularly low difference
which meant that the number of “false changes” could be estimated
according to a given limit.
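The control-chart step can be sketched as follows, assuming the classic mean ± 3σ limits (the paper's exact limits are not specified); all difference values below are invented for illustration.

```python
# Control-chart change detection: limits are set from a reference
# "no-change" sample of pixel differences; any new difference outside
# the limits is flagged as a change. The mean +/- 3*sigma rule is an
# assumption; the paper's actual limits may differ.
from statistics import mean, stdev

def control_limits(ref_diffs, k=3.0):
    m, s = mean(ref_diffs), stdev(ref_diffs)
    return m - k * s, m + k * s  # lower / upper control limits

def flag_changes(diffs, lcl, ucl):
    return [i for i, d in enumerate(diffs) if d < lcl or d > ucl]

# Reference differences from an area known to be unchanged:
ref = [0.1, -0.2, 0.0, 0.15, -0.1, 0.05, -0.05, 0.1, 0.0, -0.15]
lcl, ucl = control_limits(ref)

# New differences between the two dates; indices 2 and 4 lie outside
# the limits and are flagged as changes.
print(flag_changes([0.2, -0.1, 5.0, 0.0, -0.4], lcl, ucl))  # → [2, 4]
```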
Abstract: Many high-risk pathogens that cause disease in
humans are transmitted through various food items. Food-borne
disease constitutes a major public health problem. Assessment of the
quality and safety of foods is important in human health. Rapid and
easy detection of pathogenic organisms will facilitate precautionary
measures to maintain healthy food. The Polymerase Chain Reaction
(PCR) is a handy tool for rapid detection of low numbers of bacteria.
We have designed gene-specific primers for the most common
food-borne pathogens such as Staphylococci, Salmonella and E. coli.
Bacteria were isolated from food samples of various food outlets and
identified using gene-specific PCRs. We identified Staphylococci,
Salmonella and E. coli O157 in various food samples using
gene-specific primers and a rapid, direct PCR technique. This study
helps to build a complete picture of the various pathogens that
threaten to cause and spread food-borne disease, and it would also
enable the establishment of a routine procedure and methodology for
rapid identification of food-borne bacteria using direct PCR. This
study will also enable us to judge the efficiency of the food safety
steps currently taken by food manufacturers and exporters.
Abstract: The study identified the sources of production
inefficiency of the farming sector in district Faisalabad in the Punjab
province of Pakistan. The Data Envelopment Analysis (DEA) technique
was applied to farm-level survey data of 300 farmers for the year
2009. The overall mean efficiency score was 0.78, indicating 22
percent inefficiency among the sample farmers. Computed efficiency
scores were then regressed on farm-specific variables using Tobit
regression analysis. Farming experience, education, access to farm
credit, herd size and number of cultivation practices showed a
positive and significant effect on the farmers' technical
efficiency.
Abstract: A recent neurospiking coding scheme for feature extraction from biosonar echoes of various plants is examined with a variety of stochastic classifiers. The derived feature vectors are employed in well-known stochastic classifiers, including nearest-neighbor, single Gaussian and a Gaussian mixture with EM optimization. Classifiers' performances are evaluated using cross-validation and bootstrapping techniques. It is shown that the various classifiers perform equivalently and that the modified preprocessing configuration yields considerably improved results.
Abstract: Data mining incorporates a group of statistical methods
used to analyze a set of information, or a data set. It operates
with models and algorithms, which are powerful tools with great
potential. They can help people understand the patterns in a given
chunk of information, so data mining tools have a wide area of
applications. For example, in theoretical chemistry, data mining
tools can be used to predict molecule properties or improve
computer-assisted drug design. Classification analysis is one of the
major data mining methodologies. The aim of this contribution is to
create a classification model able to deal with a huge data set with
high accuracy. For this purpose, logistic regression, Bayesian
logistic regression and random forest models were built using the R
software. A Bayesian logistic regression model in the Latent GOLD
software was created as well. These classification methods belong to
supervised learning methods.
It was necessary to reduce the data matrix dimension before
constructing the models, and thus factor analysis (FA) was used. The
models were applied to predict the biological activity of molecules
that are potential new drug candidates.
Abstract: Particle detection in very noisy and low contrast images
is an active field of research in image processing. In this article, a
method is proposed for the efficient detection and sizing of
subsurface spherical particles; it is applied to the processing of
softly fused Au nanoparticles. Transmission Electron Microscopy
(TEM) is used for
imaging the nanoparticles, and the proposed algorithm has been
tested with the two-dimensional projected TEM images obtained.
Results are compared with the data obtained by transmission optical
spectroscopy, as well as with conventional circular object detection
algorithms.
Abstract: This paper covers various aspects of Internet film piracy.
In order to deal successfully with this matter, it is necessary to
recognize and explain the various motivational factors related to
film piracy. Thus, this study proposes groups of economic,
socio-psychological and other factors that could motivate
individuals to engage in pirate activities. The paper also studies
the interactions between downloaders and uploaders and traces the
causality from the motivational factors to their effects on the film
industry. Moreover, the study also focuses on a proposed scheme
relating movie downloading to possible effects on box office
revenues.
Abstract: Unified Speech Audio Coding (USAC), the latest MPEG standard for unified speech and audio coding, uses a speech/audio classification algorithm to distinguish speech and audio segments of the input signal. Owing to a shortcoming of the system, introducing a well-designed orchestra/percussion classification and modifying the subsequent processing can enormously increase the quality of the recovered audio. This paper proposes an orchestra/percussion classification algorithm for the USAC system which extracts only 3 Mel-Frequency Cepstral Coefficients (MFCCs) rather than the traditional 13 and uses an Iterative Dichotomiser 3 (ID3) decision tree rather than more complex learning methods; the proposed algorithm therefore has lower computational complexity than most existing algorithms. Considering that frequent switching of the classification decision may degrade the recovered audio signal, this paper also designs a modified subsequent process that helps the whole classification system reach an accuracy as high as 97%, comparable to the 99% of classical methods.
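The ID3 criterion at the heart of such a tree chooses, at each node, the attribute with the highest information gain (entropy reduction). A minimal stdlib sketch; the attribute names and labels below are invented toy data, not drawn from the USAC experiments.

```python
# ID3 splitting criterion: pick the attribute with the highest
# information gain. Features and labels are illustrative only.
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from splitting (rows, labels) on attribute attr."""
    by_value = {}
    for row, lab in zip(rows, labels):
        by_value.setdefault(row[attr], []).append(lab)
    remainder = sum(len(subset) / len(labels) * entropy(subset)
                    for subset in by_value.values())
    return entropy(labels) - remainder

# Toy frames: (spectral_band, attack) -> orchestra / percussion.
rows = [("low", "soft"), ("low", "hard"), ("high", "soft"), ("high", "hard")]
labels = ["orchestra", "percussion", "orchestra", "percussion"]
# Attribute 1 (attack) separates the classes perfectly; attribute 0 does not.
print(information_gain(rows, labels, 0), information_gain(rows, labels, 1))  # → 0.0 1.0
```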
Abstract: A new genetic algorithm, termed the 'optimum individual monogenetic genetic algorithm' (OIMGA), is presented whose properties have been deliberately designed to be well suited to hardware implementation. Specific design criteria were to ensure fast access to the individuals in the population, to keep the required silicon area for hardware implementation to a minimum and to incorporate flexibility in the structure for the targeting of a range of applications. The first two criteria are met by retaining only the current optimum individual, thereby guaranteeing a small memory requirement that can easily be stored in fast on-chip memory. Also, OIMGA can be easily reconfigured to allow the investigation of problems that normally warrant either large GA populations or individuals that are many genes in length. Local convergence is achieved in OIMGA by retaining elite individuals, while population diversity is ensured by continually searching for the best individuals in fresh regions of the search space. The results given in this paper demonstrate that both the performance of OIMGA and its convergence time are superior to those of a range of existing hardware GA implementations.
Abstract: Diffuse viral encephalitis may lack fever and other cardinal signs of infection, and hence its distinction from other acute encephalopathic illnesses is challenging. Often, the EEG changes seen routinely are nonspecific and reflect diffuse encephalopathic changes only. The aim of this study was to use nonlinear dynamic mathematical techniques for analyzing the EEG data in order to look for any characteristic diagnostic patterns in diffuse forms of encephalitis. Encephalitis was diagnosed on clinical, imaging and cerebrospinal fluid criteria in three young male patients. Metabolic and toxic encephalopathies were ruled out through appropriate investigations. Digital EEGs were done on the 3rd to 5th day of onset. The digital EEGs of 5 male and 5 female age- and sex-matched healthy volunteers served as controls. A two-sample t-test indicated that there was no statistically significant difference in average amplitude between the two groups. However, the standard deviation (or variance) of the EEG signals at FP1-F7 and FP2-F8 was significantly higher for the patients than for the normal subjects. The regularisation dimension was significantly lower for the patients (average between 1.24-1.43) than for the normal persons (average between 1.41-1.63) for the EEG signals from all locations except the Fz-Cz signal. Similarly, the wavelet dimension was significantly lower (P = 0.05) for the patients (1.122) than for the normal persons (1.458). The patients' EEGs are subdued, with uniform patterns manifested in the values of the regularisation and wavelet dimensions, when compared to the normal persons, indicating a decrease in chaotic nature.
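The two-sample comparison reported above can be sketched with the Welch t statistic, which accommodates unequal group variances; the amplitude values below are invented for illustration, not the study's data.

```python
# Sketch of the two-sample (Welch) t statistic used to compare EEG
# amplitudes between patients and controls; all values are invented.
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two samples with unequal variances."""
    return (mean(a) - mean(b)) / sqrt(variance(a) / len(a) + variance(b) / len(b))

patients = [12.1, 11.8, 13.0]                 # amplitudes, arbitrary units
controls = [11.9, 12.2, 12.0, 11.8, 12.1]
print(round(welch_t(patients, controls), 3))  # → 0.816
```

A |t| this small (relative to the critical value for the relevant degrees of freedom) is what "no statistically significant difference in average amplitude" corresponds to.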
Abstract: Terminal localization for indoor Wireless Local Area
Networks (WLANs) is critical for the deployment of location-aware
computing inside of buildings. A major challenge is obtaining high
localization accuracy in the presence of fluctuations in the received signal
strength (RSS) measurements caused by multipath fading. This paper
focuses on reducing the effect of the distance-varying noise by spatial
filtering of the measured RSS. Two different survey point geometries
are tested with the noise reduction technique: survey points arranged
in sets of clusters and survey points uniformly distributed over the
network area. The results show that the location accuracy improves
by 16% when the filter is used and by 18% when the filter is applied
to a clustered survey set as opposed to a straight-line survey set.
The estimated locations are within 2 m of the true location, which
indicates that clustering the survey points provides better localization
accuracy due to superior noise removal.
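The noise-reduction idea can be sketched as a centered moving average over neighbouring survey-point RSS samples; the window size and dBm values below are illustrative, and the paper's actual filter may differ.

```python
# Sketch: spatial smoothing of RSS samples over neighbouring survey
# points to suppress multipath-induced fluctuations. The centered
# moving average, window size and dBm values are all illustrative.
def spatial_filter(rss, window=3):
    """Centered moving average; edges use the samples available."""
    half = window // 2
    out = []
    for i in range(len(rss)):
        lo, hi = max(0, i - half), min(len(rss), i + half + 1)
        out.append(sum(rss[lo:hi]) / (hi - lo))
    return out

rss = [-60, -75, -62, -61, -74, -63]  # raw RSS in dBm, with multipath spikes
print(spatial_filter(rss))
```

Clustered survey points give the filter more genuinely neighbouring samples to average over, which is one plausible reading of why the clustered geometry removed noise better than the straight-line survey.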
Abstract: An application of the highly sensitive and selective biosensor based on a pH-sensitive field effect transistor and immobilized urease for urea analysis was demonstrated in this work. The main analytical characteristics of the biosensor developed were determined; the conditions of urea measurement in real samples of blood were optimized. A conceptual possibility of application of the biosensor for detection of urea concentration in blood serum of patients suffering from renal insufficiency was shown.
Abstract: The aim of the study was to identify seat belt wearing
factors among road users in Malaysia. An evidence-based approach
through in-depth crash investigation was utilised to meet the
intended objectives. The scope was limited to crashes investigated
by the Malaysian Institute of Road Safety Research (MIROS) involving
passenger vehicles between 2007 and 2010. Crash information on a
total of 99 crash cases involving 240 vehicles and 864 occupants was
obtained during the study period. Statistical tests
and logistic regression analysis have been performed. Results of the
analysis revealed that gender, seat position and age were associated
with seat belt wearing compliance in Malaysia. Males were 97.6%
more likely to wear a seat belt than females (95% CI 1.317 to
2.964). By seat position, the findings indicate that front-seat
occupants were 82 times more likely to be wearing a seat belt (95%
CI 30.199 to 225.342) than rear occupants. It is also important to
note that the odds of seat belt wearing increased by about 2.64%
(95% CI 1.0176 to 1.0353) for every one-year increase in age. This
study is essential in understanding the Malaysian tendency to belt
up while travelling in a vehicle. The factors highlighted in this
study should be emphasized in road safety education in order to
increase seat belt wearing rate in this country and ultimately in
preventing deaths due to road crashes.
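The reported per-year age effect can be read back as a logistic-regression odds ratio, OR = exp(β). A minimal sketch; the coefficient below is back-derived from the reported 2.64% figure, and the 10-year extrapolation is purely illustrative.

```python
# Reading the reported age effect as a logistic-regression odds ratio:
# OR = exp(beta). The coefficient is back-derived from the reported
# 2.64% per-year increase, for illustration only.
from math import exp, log

beta_age = log(1.0264)                    # implied per-year coefficient
odds_ratio_per_year = exp(beta_age)       # a 2.64% increase in the odds
odds_ratio_10_years = exp(10 * beta_age)  # effect of a 10-year age gap
print(round(odds_ratio_per_year, 4), round(odds_ratio_10_years, 3))  # → 1.0264 1.298
```

Because coefficients add on the log-odds scale, a 10-year age difference multiplies the odds of belt wearing by about 1.30 under this model.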
Abstract: This paper presents an application of 5S lean technology to a production facility. Due to increased demand, high product variety, and a push production system, the plant has suffered from excessive waste, unorganized workstations, and an unhealthy work environment. This has translated into increased production cost, frequent delays, and low worker morale. Under such conditions, it has become difficult, if not impossible, to implement effective continuous improvement studies. Hence, the lean project is aimed at diagnosing the production process, streamlining the workflow, removing/reducing process waste, cleaning the production environment, improving plant layout, and organizing workstations. 5S lean technology is utilized for achieving the project objectives. The work was a combination of both culture changes and tangible/physical changes on the shop floor. The project has drastically changed the plant and developed the infrastructure for a successful implementation of continuous improvement as well as other best practices and quality initiatives.
Abstract: In the present communication, we have developed suitable
constraints for the given mean codeword length and the measures of
entropy. This development proves that Rényi's entropy gives the
minimum value of the log of the harmonic mean and the log of the
power mean. We have also developed an important relation between the
best 1:1 code and the uniquely decipherable code by using different
measures of entropy.
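The Rényi entropy of order α, which the bounds above concern, can be computed directly; the distribution below is illustrative. The quantity decreases in α and tends to the Shannon entropy as α approaches 1.

```python
# Renyi entropy of order alpha (base 2), H_a(P) = log2(sum p_i^a)/(1-a).
# The probability distribution is invented for illustration.
from math import log2

def renyi_entropy(p, alpha):
    assert alpha > 0 and alpha != 1
    return log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

def shannon_entropy(p):
    return -sum(pi * log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p))      # → 1.5 bits
print(renyi_entropy(p, 0.5))   # above the Shannon value
print(renyi_entropy(p, 2.0))   # below the Shannon value
```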
Abstract: Due to the environmental and price issues of the current
energy crisis, scientists and technologists around the globe are
intensively searching for new, environmentally lower-impact forms of
clean energy that will reduce the high dependency on fossil fuels.
In particular, hydrogen can be produced from biomass via
thermochemical processes, including pyrolysis and gasification,
owing to their economic advantage, and production can be further
enhanced through in-situ carbon dioxide removal using calcium oxide.
This work focuses on the synthesis and development of the flowsheet
for the enhanced biomass gasification process in PETRONAS's iCON
process simulation software. This hydrogen prediction model operates
at temperatures between 600 and 1000 °C at atmospheric pressure. The
effects of temperature, steam-to-biomass ratio and
adsorbent-to-biomass ratio were studied, and a hydrogen mole
fraction of 0.85 is predicted in the product gas. Comparisons of the
results are also made with experimental data from the literature.
The preliminary economic potential of the developed system is RM
12.57 x 10^6 (equivalent to USD 3.77 x 10^6) annually, showing the
economic viability of this process.
Abstract: Tread design has evolved over the years to achieve the common tread patterns used in current vehicles. However, to meet safety and comfort requirements, tread design considers more than one design factor. Tread design must consider grip and drainage, and the manner in which to reduce rolling noise, which is one of the main factors considered by manufacturers. The main objective of this study was the application of the computational fluid dynamics (CFD) technique to simulate the contact surface of the tire and the ground. The results demonstrated an air-pumping and large pressure-drop effect at the contact surface. The results also revealed that the pressure can be used to analyze the sound pressure level (SPL).
Abstract: In a world worried about water resources with the
shadow of drought and famine looming all around, the quality of
water is as important as its quantity. The source of all concerns is the
constant reduction of per capita quality water for different uses.
With an average annual precipitation of 250 mm compared to
the 800 mm world average, Iran is considered a water scarce country
and the disparity in the rainfall distribution, the limitations of
renewable resources and the population concentration in the margins
of desert and water scarce areas have intensified the problem.
The shortage of per capita renewable freshwater and its poor
quality in large areas of the country, which have saline, brackish or
hard water resources, and the profusion of natural and artificial
pollutants has caused the deterioration of water quality.
Among methods of treatment and use of these waters one can refer
to the application of membrane technologies, which have come into
focus in recent years due to their great advantages. This process is
quite efficient in eliminating multivalent ions, and due to the
possibilities of production at different capacities, application as
treatment process in points of use, and the need for less energy in
comparison to Reverse Osmosis processes, it can revolutionize the
water and wastewater sector in years to come. The article studies
the different capacities of water resources in the Persian Gulf and
Oman Sea watershed basins, and assesses the possibility of using the
nanofiltration process to treat brackish and non-conventional waters
in these basins.