Abstract: With increasing energy demand and growing concern for
environmentally friendly technology, renewable biofuels are a
promising alternative to petroleum products. In the present study,
linseed oil was used as an alternative diesel engine fuel and the
results were compared with baseline data for neat diesel.
Performance parameters such as brake thermal efficiency (BTE) and
brake specific fuel consumption (BSFC), and emission parameters
such as CO, unburned hydrocarbons (UBHC), NOx, CO2 and exhaust
temperature, were compared. The BTE of the engine was lower and
the BSFC higher when the engine was fueled with linseed oil rather
than diesel fuel. The emission characteristics were better than those
of diesel fuel; in particular, NOx formation with linseed oil was
lower than with diesel fuel during the experiment. Linseed oil is a
non-edible oil, so it can be used as a diesel fuel extender for small
and medium energy needs.
Abstract: In today's world, the efficient utilization of wood
resources is an increasing concern for forest owners, and ensuring
an efficient harvest of those resources is a very complex challenge.
This is one of the topics the project “Virtual Forest II” addresses.
Its core is a database describing forests with approximately 260
million trees located in North Rhine-Westphalia (NRW). Based on
these data, tree growth simulations and wood mobilization
simulations can be conducted. This paper focuses on the latter. It
describes a discrete-event simulation, with an attached real-time
3-D visualization, which simulates timber harvest using trees from
the database with different crop resources. The simulation can be
displayed in 3-D to show the progress of the wood crop, and all
data gathered during the simulation are presented afterwards as a
detailed summary. This summary includes cost-benefit calculations
and can be compared with those of previous runs to optimize the
financial outcome of the timber harvest by exchanging crop
resources or modifying their parameters.
Abstract: This paper presents the implementation of a QoS
policy-based system that applies Access Control List (ACL) rules to
a Layer 3 (L3) switch. The architecture of the implementation, the
tools used, and the results gathered are also presented. The system
architecture is able to control ACL rules installed inside an
external L3 switch. The ACL rules dictate how access control is
executed for all traffic passing through that particular switch.
The main advantage of this approach is that a single point of
failure can be avoided when ACL rules inside L3 switches change.
Another advantage is that the agent can apply ACL rules
automatically, based on changes occurring in the policy database,
without configuring them one by one. Furthermore, when the QoS
policy-based system is implemented in a distributed environment,
the monitoring process can be synchronized easily, because the
agent automates the process across the external policy devices.
Abstract: Retrieval of the surface reflectance is important in
remotely sensed data analysis for obtaining the atmospheric
reflectance or performing atmospheric correction. The relationship
between visible and mid-infrared reflectance over land was
investigated and developed in this study. The surface reflectances
of the two visible bands were measured using a handheld
spectroradiometer around Penang Island. We use the assumption that
the 2.1 μm band is not affected by aerosol, since it is transparent
to most aerosol types (except dust); therefore the satellite-observed
signal in the 2.1 μm band is the same as the surface signal. A
correlation was established between the surface reflectance measured
by the spectroradiometer in the blue and red regions and the 2.1 μm
reflectance observed by the satellite. Five dates of Landsat TM
scenes were investigated in this study. The findings indicate that
the visible surface reflectance can be retrieved from the 2.1 μm band.
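A relationship of the kind the abstract describes is often expressed as a least-squares linear fit between the 2.1 μm reflectance and a visible band. The sketch below is illustrative only: the paired reflectance values (and hence the fitted coefficients) are synthetic assumptions, not the study's measurements.

```python
# Least-squares linear fit rho_vis = a * rho_2p1 + b, a minimal sketch of
# how a visible-band / 2.1 um reflectance correlation might be established.
# The sample reflectance pairs below are hypothetical, not the paper's data.

def linear_fit(x, y):
    """Ordinary least-squares fit y = a*x + b; returns (a, b)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Hypothetical paired measurements: satellite 2.1 um reflectance vs.
# spectroradiometer red-band surface reflectance.
rho_2p1 = [0.10, 0.15, 0.20, 0.25, 0.30]
rho_red = [0.05, 0.075, 0.10, 0.125, 0.15]

a, b = linear_fit(rho_2p1, rho_red)
print(a, b)  # slope 0.5, intercept 0.0 for this synthetic data
```

With real field data the slope and intercept would of course be estimated per band from the collected spectroradiometer/satellite pairs.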
Abstract: This work proposes a set of actions to assist the redesign
of existing Electric and Electronic Equipment (EEE) products. The
aim is to improve their environmental behavior after withdrawal, in
the End-of-Life (EOL) phase. First, data collection takes place.
Then the optimal EOL Treatment Strategy (EOL_TS) is selected and
implemented, and its results are evaluated with respect to the
environment. In parallel, the product design characteristics that
can be altered are selected based on their significance for the
environment in the EOL stage. All results from the previous stages
are combined, and possible redesign actions are formulated for
further examination and subsequent configuration in the design
stage. The method applied to perform these tasks is Lean Thinking
(LT). Finally, results concerning the application of the proposed
method to a distribution transformer are presented.
Abstract: Although Vietnamese catfish farming has grown at very
high rates in recent years, the industry has also faced many
problems affecting its sustainability. This paper studies the
perceptions of catfish farmers regarding risk and risk management
strategies in their production activities. Specifically, the study aims
to measure the consequences, likelihoods, and levels of risks as well
as the efficacy of risk management in Vietnamese catfish farming.
Data for the study were collected through a sample of 261 catfish
farmers in the Mekong Delta, Vietnam using a questionnaire survey
in 2008. Results show that, in general, price and production risks
were perceived as the most important risks. Farm management and
technical measures were perceived as more effective than other
kinds of risk management strategies in risk reduction. Although price risks
were rated as important risks, price risk management strategies were
not perceived as important measures for risk mitigation. The results
of the study are discussed to provide implications for various
industry stakeholders, including policy makers, processors, advisors,
and developers of new risk management strategies.
Abstract: Learning from labeled and unlabeled data has received a
considerable amount of attention in the machine learning community
due to its potential for reducing the need for expensive labeled
data. In this work we present a new method for combining labeled
and unlabeled data based on classifier ensembles. The model we
propose assumes that each classifier in the ensemble observes the
input through a different set of features. The classifiers are
initially trained using some labeled samples. The trained
classifiers then learn further by labeling the unknown patterns
using a teaching signal generated from the decision of the
classifier ensemble; that is, the classifiers self-supervise each
other. Experiments on a set of object images are presented,
investigating different classifier models, fusing techniques,
training sizes and input features. The experimental results reveal
that the proposed self-supervised ensemble learning approach reduces
the classification error relative to both a single classifier and
the traditional ensemble classifier approach.
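The self-supervision loop described above can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the nearest-centroid classifiers, the two feature views, and the toy samples are all assumptions made for the sketch.

```python
# Minimal sketch (not the paper's exact method) of classifiers that
# self-supervise each other: each ensemble member sees a different feature
# subset, is trained on a few labeled samples, then labels unlabeled
# samples with the ensemble decision and retrains. Data are synthetic.

def centroid_fit(samples, labels):
    """Nearest-centroid model: mean feature vector per class."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def centroid_predict(model, x):
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(c, x))
    return min(model, key=lambda y: dist2(model[y]))

# Each classifier observes a different feature subset (views 0-1 and 2-3).
views = [(0, 1), (2, 3)]
labeled = [([0.0, 0.1, 0.0, 0.2], 0), ([1.0, 0.9, 1.1, 1.0], 1)]
unlabeled = [[0.1, 0.0, 0.1, 0.1], [0.9, 1.0, 0.9, 1.1]]

models = [centroid_fit([[x[i] for i in v] for x, _ in labeled],
                       [y for _, y in labeled]) for v in views]

# Teaching signal: the ensemble (majority) decision over all members.
def ensemble_label(x):
    votes = [centroid_predict(m, [x[i] for i in v])
             for m, v in zip(models, views)]
    return max(set(votes), key=votes.count)

pseudo = [(x, ensemble_label(x)) for x in unlabeled]

# Retrain every member on labeled plus pseudo-labeled samples.
all_data = labeled + pseudo
models = [centroid_fit([[x[i] for i in v] for x, _ in all_data],
                       [y for _, y in all_data]) for v in views]
print([y for _, y in pseudo])  # pseudo-labels assigned by the ensemble
```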
Abstract: This paper analyses the performance of the disbursement
procedure in public works projects in Thailand. The research results
were summarised from contracts, submitted invoices, inspection
dates, and copies of disbursement dates between clients and their
main contractors, as well as interviews with persons involved in
central and local government projects in Thailand during 1994-2008.
The data were collected to investigate the disbursement procedure in
relation to disbursement performance during the construction period
(planned contract duration against the actual execution date in each
month). A graphical duration analysis of the projects illustrated
significant disbursement patterns in each project. It was
established that staff shortages, the financial stability of
clients, bureaucracy, the method of disbursement and the economic
situation play a major role in the performance of disbursement to
main contractors.
Abstract: The most important property of the Gene Ontology is its
terms. These controlled vocabularies are defined to provide
consistent descriptions of gene products that are shareable and
computationally accessible by humans, software agents, or other
machine-readable metadata. Each term is associated with information
such as a definition, synonyms, database references, amino acid
sequences, and relationships to other terms. This information has
made the Gene Ontology broadly applied in microarray and proteomic
analysis. However, the terms are still searched using a traditional
approach based on keyword matching. The weaknesses of this approach
are that it ignores the semantic relationships between terms and
that it depends heavily on a specialist to find similar terms.
Therefore, this study combines a semantic similarity measure and a
genetic algorithm to perform a better retrieval process for
searching semantically similar terms. The semantic similarity
measure is used to compute the strength of the similarity between
two terms. The genetic algorithm is then employed to perform batch
retrievals and to handle the large search space of the Gene Ontology
graph. Computational results are presented to show the effectiveness
of the proposed algorithm.
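One common family of semantic similarity measures scores two ontology terms by the shortest is-a path through a common ancestor. The sketch below uses a toy hierarchy and a simple path-based measure; both are assumptions for illustration, not necessarily the measure used in the paper.

```python
# A minimal sketch of one common semantic similarity measure between
# ontology terms: similarity decreases with the shortest is-a path length
# through a common ancestor. The toy hierarchy below is illustrative
# only, not the Gene Ontology itself.

from collections import deque

# child -> parents (toy is-a edges)
parents = {
    "binding": ["molecular_function"],
    "protein_binding": ["binding"],
    "dna_binding": ["binding"],
    "catalytic_activity": ["molecular_function"],
}

def ancestors(term):
    """The term plus all its ancestors, with distance from the term."""
    dist = {term: 0}
    queue = deque([term])
    while queue:
        t = queue.popleft()
        for p in parents.get(t, []):
            if p not in dist:
                dist[p] = dist[t] + 1
                queue.append(p)
    return dist

def path_similarity(t1, t2):
    """1 / (1 + shortest path length via a common ancestor)."""
    d1, d2 = ancestors(t1), ancestors(t2)
    common = set(d1) & set(d2)
    if not common:
        return 0.0
    shortest = min(d1[c] + d2[c] for c in common)
    return 1.0 / (1.0 + shortest)

# Sibling terms share the parent "binding" (path length 2) ...
print(path_similarity("protein_binding", "dna_binding"))  # 1/3
# ... and are more similar than terms meeting only at the root.
print(path_similarity("protein_binding", "catalytic_activity"))  # 0.25
```

A batch retrieval, as in the abstract, would evaluate such a score between a query term and candidate terms while a genetic algorithm explores the large term space.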
Abstract: The overall service performance of an I/O-intensive
system depends mainly on the workload on its storage system. In a
heterogeneous storage environment, where storage elements from
different vendors with different capacities and performance are put
together, the workload should be distributed according to storage
capability. This paper addresses the data placement issue in a
short-video sharing website. The workload contributed by a video is
estimated from the number of views and the lifetime span of existing
videos in the same category. An experiment was conducted on 42,000
video titles over six weeks. The results showed that the proposed
algorithm distributed the workload and maintained balance better
than the round-robin and random algorithms.
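The estimate-then-place idea can be sketched as follows, under illustrative assumptions: here the workload estimate is the average views per day of same-category videos, and each new video goes to the storage element with the lowest workload-to-capability ratio. The numbers, element names, and the exact estimator are hypothetical, not the paper's.

```python
# Minimal sketch of capability-aware placement: a new video's workload is
# estimated from the views-per-day of existing videos in its category, and
# the video is placed on the storage element with the lowest ratio of
# assigned workload to capability. All values here are illustrative.

# (views, lifetime_days) of existing videos per category
history = {
    "music":  [(9000, 30), (12000, 40)],
    "sports": [(2000, 20), (1000, 10)],
}

def estimated_workload(category):
    """Average views per day over existing videos in the same category."""
    rates = [views / days for views, days in history[category]]
    return sum(rates) / len(rates)

# Heterogeneous storage elements: name -> relative capability
capability = {"fast_array": 4.0, "slow_array": 1.0}
load = {name: 0.0 for name in capability}

def place(category):
    """Assign a new video to the least relatively-loaded element."""
    target = min(load, key=lambda n: load[n] / capability[n])
    load[target] += estimated_workload(category)
    return target

placements = [place("music"), place("music"), place("sports")]
print(placements)
print(load)
```

Round-robin or random placement, by contrast, would ignore both the per-video workload estimate and the capability weights.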
Abstract: This paper presents a comparative analysis of the
dissolution data of nimesulide microparticles prepared with
ethylcellulose, hydroxypropyl methylcellulose (HPMC), chitosan and
poly(D,L-lactide-co-glycolide) as polymers. The analysis of the
release profiles showed that the variations noted in the release
behavior of nimesulide from the various microparticulate
formulations are due to the nature of the polymer used. In addition,
the maximum retardation of nimesulide release was observed with
HPMC (floating particles). Thus, HPMC microparticles may preferably
be employed for the development of sustained-release dosage forms.
Abstract: In general, reports are a form of representing data in
such a way that the user gets the information he needs. They can be
built in various ways, from the simplest (“select from”) to the most
complex (results derived from different sources/tables with complex
formulas applied). Furthermore, the calculation rules can be written
as hard-coded program logic or built into the database for use by
dynamic code. This paper introduces two types of reports, defined in
the DB structure. The main goal is to manage calculations in an
optimal way, keeping the maintenance of reports as simple and smooth
as possible.
Abstract: The novelty proposed in this study is twofold: the development of a new color similarity metric based on the human visual system, and a new color indexing scheme based on a textual approach. Because the new similarity metric is based on the color perception of the human visual system, the results returned by the indexing system can fulfill user expectations as much as possible. We developed a web application to collect users' judgments about the similarities between colors, and its results are used to estimate the metric proposed in this study. To index an image's colors, we used a text indexing engine, which facilitates the integration of visual features into a database of text documents. The textual signature is built by weighting the image's colors according to their occurrence in the image. The use of a textual indexing engine provides a simple, fast and robust solution for indexing images. A typical use of the proposed system is the development of applications whose data are both visual and textual. To evaluate the proposed method we chose a price comparison engine as a case study, collecting a series of commercial offers, each containing a textual description and an image representing a specific commercial offer.
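The textual-signature construction, weighting an image's colors by occurrence and emitting text a standard indexing engine can consume, might be sketched as follows. The coarse 3x3x3 quantization and the repeat-token weighting are illustrative assumptions, not the authors' exact scheme.

```python
# Minimal sketch of the textual-signature idea: quantize each pixel to a
# color token, weight tokens by their occurrence in the image, and emit a
# small text document that a text indexing engine can index alongside the
# textual description. Quantization and weighting here are assumptions.

from collections import Counter

def color_token(rgb):
    """Quantize an (r, g, b) pixel into a coarse 3x3x3 color bin token."""
    return "c" + "".join(str(min(c * 3 // 256, 2)) for c in rgb)

def textual_signature(pixels, scale=10):
    """Weight each color token by occurrence; repeat tokens so that a
    term-frequency text engine sees weights proportional to coverage."""
    counts = Counter(color_token(p) for p in pixels)
    total = sum(counts.values())
    terms = []
    for token, n in counts.most_common():
        weight = max(1, round(scale * n / total))
        terms.extend([token] * weight)
    return " ".join(terms)

# A tiny "image": mostly red pixels, a few blue.
pixels = [(250, 10, 10)] * 8 + [(10, 10, 250)] * 2
sig = textual_signature(pixels)
print(sig)
```

The resulting string can be stored in the same text index as the offer's description, which is what makes purely textual retrieval infrastructure reusable for visual features.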
Abstract: Economic models are complex dynamic systems with many uncertainties and fuzzy data. Conventional modeling approaches using well-known methods and techniques cannot provide realistic and satisfactory answers to today's challenging economic problems. Qualitative modeling using fuzzy logic and intelligent system theories can be used to build macroeconomic models. Fuzzy Cognitive Maps (FCMs) are a new method for modeling the dynamic behavior of complex systems. For the first time, FCMs and the Mamdani model of intelligent control are used together to model macroeconomic systems. This new model is referred to as the Mamdani Rule-Based Fuzzy Cognitive Map (MBFCM) and provides the academic and research community with a promising new integrated advanced computational model. A new economic model is developed for a qualitative approach to macroeconomic modeling, fuzzy controllers for such models are designed, and simulation results for an economic scenario are provided and extensively discussed.
Abstract: Islamic institutions in Malaysia play a variety of
socioeconomic roles such as poverty alleviation. To perform this role,
these institutions face a major task in identifying the poverty group.
Most of these institutions measure and operationalize poverty from
the monetary perspective using variables such as income, expenditure
or consumption. In practice, most Islamic institutions in Malaysia use
the monetary approach in measuring poverty through the
conventional Poverty Line Income (PLI) method and recently, the
had al kifayah (HAK) method using total necessities of a household
from an Islamic perspective. The objective of this paper is to present
the PLI method and also the HAK method. This micro-data study
highlights the similarities and differences between the two methods.
A survey aided by a structured questionnaire was carried out on 260
selected heads of households in the state of Selangor. The paper
highlights several demographic factors that are associated with the
three monetary indicators in the study, namely income, PLI and
HAK. In addition, the study found that these monetary variables are
significantly related to each other.
Abstract: Many problems in computer vision and image
processing present potential for parallel implementations through one
of the three major paradigms of geometric parallelism, algorithmic
parallelism and processor farming. Static process scheduling
techniques are used successfully to exploit geometric and algorithmic
parallelism, while dynamic process scheduling is better suited to
dealing with the independent processes inherent in the processor
farming paradigm. This paper considers the application of parallel
computers (multicomputers) to a class of problems exhibiting the
spatial data characteristic of the geometric paradigm. However, by
using the processor farming paradigm, a dynamic scheduling technique
is developed to suit the MIMD structure of the multicomputers. A
hybrid scheme of scheduling is also developed and compared with
the other schemes. The specific problem chosen for the investigation
is the Hough transform for line detection.
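For reference, the Hough transform on which the scheduling schemes are evaluated can be sketched in serial form. Each point's votes are independent, which is what makes farming chunks of points out to workers natural; the accumulator resolution and the synthetic points below are illustrative choices.

```python
# A minimal sketch of the Hough transform for line detection. Each edge
# point votes for all (theta, rho) line parameters passing through it;
# collinear points produce a peak in the accumulator. Because each
# point's votes are independent, chunks of points can be farmed out to
# workers and their partial accumulators summed.

import math

def hough_votes(points, n_theta=180, rho_step=1.0):
    """Accumulate (theta_index, rho_bin) votes for a set of points."""
    acc = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            key = (t, round(rho / rho_step))
            acc[key] = acc.get(key, 0) + 1
    return acc

# Points on the vertical line x = 5 (theta = 0, rho = 5).
points = [(5, y) for y in range(10)]
acc = hough_votes(points)
peak = max(acc, key=acc.get)
# The bin (0, 5) receives a vote from every point.
print(peak, acc[peak])
```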
Abstract: In data mining, association rules are used to search for
relations among the items in a transaction database. Once the data
have been collected and stored, valuable rules can be found through
association rule mining, assisting managers in marketing strategy
and market planning. In this paper, we apply fuzzy partition methods
and decide the membership functions of the quantitative values of
each transaction item. In addition, managers can express the
importance of items as linguistic terms, which are transformed into
fuzzy sets of weights. Fuzzy weighted frequent pattern growth
(FWFP-Growth) is then used to complete the data mining process. This
method is expected to improve on the Apriori algorithm with better
efficiency over the whole association rule process. An example is
given to clearly illustrate the proposed approach.
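The fuzzy partition step, in which a quantitative transaction value is mapped to linguistic terms, can be sketched with triangular membership functions. The term names and breakpoints below are illustrative assumptions, not the paper's.

```python
# Minimal sketch of a fuzzy partition for quantitative transaction
# values: a purchased quantity is mapped to membership degrees in the
# linguistic terms Low / Middle / High via triangular membership
# functions. The breakpoints are illustrative assumptions.

def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzify(quantity):
    """Membership degrees of a quantity in three linguistic terms."""
    return {
        "Low":    triangular(quantity, -1, 0, 6),
        "Middle": triangular(quantity, 0, 6, 11),
        "High":   triangular(quantity, 6, 11, 12),
    }

m = fuzzify(4)
print(m)  # quantity 4 is partly Low, mostly Middle
```

The resulting degrees, multiplied by the item-importance weights, are what a fuzzy weighted FP-Growth would count in place of crisp item occurrences.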
Abstract: The direct synthesis of dimethyl ether (DME) from syngas
in slurry reactors is considered promising because of its advantages
in heat transfer. In this paper, the influences of the operating
conditions (temperature, pressure and weight hourly space velocity)
on the conversion of CO and the selectivity of DME and methanol
were studied in a stirred autoclave over a Cu-Zn-Al-Zr slurry
catalyst, which is far more suitable for the liquid-phase DME
synthesis process than the commercial bifunctional catalyst. A
Langmuir-Hinshelwood mechanism-type global kinetics model for
liquid-phase direct DME synthesis, based on methanol synthesis
models and a methanol dehydration model, was investigated by fitting
our experimental data. The model parameters were estimated with a
MATLAB program based on genetic algorithms and the
Levenberg-Marquardt method; the model fits the experimental data
well, and its reliability was verified by statistical tests and
residual error analysis.
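The parameter-estimation step can be illustrated with a small Levenberg-Marquardt fit. The rate expression below is a simplified one-variable Langmuir-Hinshelwood-type form with synthetic data, a stand-in for, not a reproduction of, the paper's coupled methanol-synthesis and dehydration model.

```python
# Minimal Levenberg-Marquardt sketch for fitting a hypothetical
# Langmuir-Hinshelwood-type rate r = k*p / (1 + K*p)^2 to synthetic data.
# The paper's actual model couples methanol synthesis and dehydration
# with several partial pressures; this is only an illustration.

def rate(p, k, K):
    """Hypothetical LH-type rate expression in one pressure variable."""
    return k * p / (1.0 + K * p) ** 2

def residuals(params, data):
    k, K = params
    return [r_obs - rate(p, k, K) for p, r_obs in data]

def lm_fit(data, params, n_iter=100):
    """Levenberg-Marquardt for two parameters with a numeric Jacobian."""
    lam, h = 1e-3, 1e-6
    for _ in range(n_iter):
        r = residuals(params, data)
        cost = sum(ri * ri for ri in r)
        # numeric Jacobian of the residuals (m x 2)
        J = []
        for i in range(len(data)):
            row = []
            for j in range(2):
                bumped = list(params)
                bumped[j] += h
                row.append((residuals(bumped, data)[i] - r[i]) / h)
            J.append(row)
        # damped normal equations: (JtJ + lam*I) delta = -Jt r
        JtJ = [[sum(J[i][a] * J[i][b] for i in range(len(J)))
                for b in range(2)] for a in range(2)]
        Jtr = [sum(J[i][a] * r[i] for i in range(len(J))) for a in range(2)]
        A = [[JtJ[0][0] + lam, JtJ[0][1]], [JtJ[1][0], JtJ[1][1] + lam]]
        det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
        delta = [(-Jtr[0] * A[1][1] + Jtr[1] * A[0][1]) / det,
                 (-Jtr[1] * A[0][0] + Jtr[0] * A[1][0]) / det]
        trial = [params[0] + delta[0], params[1] + delta[1]]
        if sum(ri * ri for ri in residuals(trial, data)) < cost:
            params, lam = trial, lam * 0.5   # accept step, relax damping
        else:
            lam *= 2.0                        # reject step, add damping
    return params

# Synthetic "experimental" data generated from k=2.0, K=0.5.
data = [(p, rate(p, 2.0, 0.5)) for p in [0.5, 1.0, 2.0, 4.0, 8.0]]
k_est, K_est = lm_fit(data, [1.0, 1.0])
print(round(k_est, 3), round(K_est, 3))
```

A genetic algorithm, as in the abstract, would typically supply the starting point for this local refinement.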
Abstract: In this work, new experimental data for slugging
frequency in inclined gas-liquid flow are reported, and a new
correlation is proposed. Scale experiments were carried out using a
mixture of air and water in a 6 m long pipe. Two different pipe
diameters were used, namely 38 and 67 mm. The data were taken
with capacitance-type sensors at a data acquisition frequency of 200
Hz over an interval of 60 seconds. For the range of flow conditions
studied, the liquid superficial velocity is observed to influence the
frequency strongly. A comparison of the present data with
correlations available in the literature reveals a lack of agreement. A
new correlation for slug frequency has been proposed for the inclined
flow, which represents the main contribution of this work.
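Slug frequency from a probe record of this kind is commonly obtained by counting threshold crossings; the sketch below works under that assumption, with a synthetic trace standing in for the 200 Hz, 60 s capacitance records and an illustrative threshold choice.

```python
# Minimal sketch of extracting slug frequency from a capacitance-probe
# time series: count upward crossings of a holdup threshold and divide by
# the observation interval. The synthetic signal stands in for the
# 200 Hz / 60 s records; the threshold choice is an assumption.

import math

def slug_frequency(signal, sample_rate_hz, threshold):
    """Slugs per second, counted as upward threshold crossings."""
    crossings = sum(
        1 for a, b in zip(signal, signal[1:]) if a < threshold <= b
    )
    return crossings / (len(signal) / sample_rate_hz)

# Synthetic holdup trace: 2 slugs per second for 60 s sampled at 200 Hz.
fs, duration, f_slug = 200, 60, 2.0
signal = [0.5 + 0.4 * math.sin(2 * math.pi * f_slug * t / fs)
          for t in range(fs * duration)]

f_est = slug_frequency(signal, fs, threshold=0.6)
print(f_est)  # 2.0 Hz
```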
Abstract: The HIV-1 genome is highly heterogeneous, and because of
this variation its features span a wide range. As a result, the
infectivity of the virus depends on which chemokine coreceptors it
can use: R5 HIV viruses use the CCR5 coreceptor, X4 viruses use
CXCR4, and R5X4 viruses can utilize both coreceptors. Recently,
bioinformatics studies have attempted to classify R5X4 viruses using
experiments on the HIV-1 genome.
In this study, R5X4 HIV viruses were classified using an
Auto-Regressive (AR) model together with Artificial Neural Networks
(ANNs). The statistical data of R5X4, R5 and X4 viruses were
analyzed using signal processing methods and ANNs. The accessible
residues of these virus sequences were obtained and modeled with the
AR model, since the number of residues is large and differs from
sequence to sequence. Finally, the pre-processed data were used to
train various ANN structures for identifying R5X4 viruses.
Furthermore, ROC analysis was applied to the ANNs to show their real
performance. The results indicate that R5X4 viruses were
successfully classified, with high sensitivity and specificity
values in the training and testing ROC analyses for the RBF network,
which gives the best performance among the ANN structures.
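The role of the AR model, reducing variable-length residue sequences to a fixed-length feature vector suitable for an ANN input layer, can be sketched as follows. The least-squares order-2 estimator and the synthetic sequence are illustrative assumptions, not the study's exact pre-processing.

```python
# Minimal sketch of the AR-feature idea: sequences of different lengths
# are each reduced to a fixed number of autoregressive coefficients,
# which can then feed a fixed-size ANN input layer. The least-squares
# estimator and the order-2 model are illustrative choices.

def ar_coefficients(x, order=2):
    """Least-squares AR fit: x[n] ~ a1*x[n-1] + ... + a_order*x[n-order]."""
    n = len(x)
    # Normal equations R a = b (covariance method).
    R = [[sum(x[t - i - 1] * x[t - j - 1] for t in range(order, n))
          for j in range(order)] for i in range(order)]
    b = [sum(x[t] * x[t - i - 1] for t in range(order, n))
         for i in range(order)]
    # Solve the 2x2 system directly (order is fixed at 2 in this sketch).
    det = R[0][0] * R[1][1] - R[0][1] * R[1][0]
    a1 = (b[0] * R[1][1] - b[1] * R[0][1]) / det
    a2 = (b[1] * R[0][0] - b[0] * R[1][0]) / det
    return [a1, a2]

# Synthetic sequence obeying x[n] = 1.5*x[n-1] - 0.9*x[n-2] exactly.
x = [1.0, 0.5]
for _ in range(200):
    x.append(1.5 * x[-1] - 0.9 * x[-2])

a_est = ar_coefficients(x)
print([round(a, 6) for a in a_est])  # [1.5, -0.9]
```

However long the numeric residue sequence is, the model always yields `order` coefficients, which is what makes the features comparable across sequences.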