Abstract: Freeways were originally designed to provide high
mobility to road users. However, growth in population and vehicle
numbers has led to increasing congestion around the world. Daily
recurrent congestion substantially reduces freeway capacity when it
is most needed. Building new highways and expanding existing ones is
expensive and impractical in many situations. Intelligent,
vision-based techniques can, however, be efficient tools for
monitoring highways and increasing the capacity of existing
infrastructure. The crucial step in highway monitoring is vehicle
detection. In this paper, we propose one such technique, based on
artificial neural networks (ANN) for vehicle detection and counting.
The detection process uses freeway video images and starts by
automatically extracting the image background from successive video
frames. Once the background is identified, subsequent frames are
used to detect moving objects through image subtraction. The result
is segmented using the Sobel operator for edge detection. The ANN is
then used in the detection and counting phase. Applying this
technique to the busiest freeway in Riyadh (King Fahd Road) achieved
higher than 98% detection accuracy despite light intensity changes,
occlusions, and shadows.
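The pipeline described above (background extraction from successive frames, image subtraction, Sobel edge segmentation) can be sketched as follows. The median-based background estimate and the difference threshold are illustrative assumptions, since the abstract does not specify these details, and the ANN classification stage is omitted:

```python
import numpy as np

def estimate_background(frames):
    # Pixel-wise median over successive frames approximates the static
    # background (moving vehicles are outvoted by road pixels).
    return np.median(np.stack(frames).astype(float), axis=0)

def moving_object_mask(frame, background, thresh=25.0):
    # Image subtraction: pixels far from the background are "moving".
    diff = np.abs(frame.astype(float) - background)
    return (diff > thresh).astype(np.uint8)

def sobel_edges(img):
    # Plain-numpy Sobel gradient magnitude for edge segmentation.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = img.shape
    padded = np.pad(img.astype(float), 1, mode="edge")
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            win = padded[i:i + h, j:j + w]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return np.hypot(gx, gy)
```

In the full system, the edge-segmented blobs would then be passed to the ANN for vehicle detection and counting.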
Abstract: Rice, the staple food in Sierra Leone, is consumed on a
daily basis. It is the most important food crop, grown extensively
by farmers across all ecologies in the country. Although much
attention is now given to rice grain production through the
smallholder commercialization programme (SHCP), no attention has
been given to investigating the limitations faced by rice producers.
This paper contributes to attempts to overcome the development
challenges caused by food insecurity. The objective of this paper is
thus to analyze the relationship between rice production and the
domestic retail price of rice. The study employed a log-linear model
in which the quantity of rice produced is the dependent variable,
with the quantity of rice imported, the price of imported rice, and
the price of domestic rice as explanatory variables. Findings showed
that locally produced rice is more expensive per ton than imported
rice, and that almost all inhabitants of the capital city, which
hosts about 65% of the country's entire population, favor imported
rice, as it is free from stones and other impurities. Therefore, to
control prices and simultaneously increase rice production, the
government should purchase rice from farmers and then sell it to private retailers.
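The log-linear specification can be written as ln(Q_prod) = b0 + b1 ln(Q_imp) + b2 ln(P_imp) + b3 ln(P_dom), where b1..b3 read as elasticities. A minimal sketch of fitting such a model by ordinary least squares (the variable names are assumptions based on the abstract; the paper's actual data series are not reproduced here):

```python
import numpy as np

def fit_log_linear(q_produced, q_imported, p_imported, p_domestic):
    """OLS fit of ln(q_produced) on the logs of the regressors.
    Returns [b0, b1, b2, b3]; b1..b3 are interpreted as elasticities."""
    y = np.log(q_produced)
    X = np.column_stack([
        np.ones(len(y)),        # intercept b0
        np.log(q_imported),     # b1: import-quantity elasticity
        np.log(p_imported),     # b2: imported-price elasticity
        np.log(p_domestic),     # b3: domestic-price elasticity
    ])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta
```

A positive b2, for example, would mean that a 1% rise in the imported-rice price is associated with a b2% rise in domestic production.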
Abstract: The effects of downslope steepness on soil splash distribution under water drop impact were investigated in this study. The equipment used comprised a burette to simulate water drops, a splash cup filled with sandy soil forming the source area, and a splash board to collect the ejected particles. The results show that the apparent mass increased with increasing downslope angle, following a linear regression equation with a high coefficient of determination. Likewise, the radial soil splash distribution over distance was analyzed statistically, and an exponential function gave the best fit to the relationship for the different slope angles. The curves and the regression equations validate the well-known FSDF and extend the theory of Van Dijk.
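The two reported relationships, a linear fit of splash mass against downslope angle and an exponential fit of splash mass against radial distance, can be estimated as sketched below (the data passed in would be the measured masses; nothing from the paper's dataset is reproduced here):

```python
import numpy as np

def fit_linear(angle_deg, mass):
    # mass = a * angle + b, plus the coefficient of determination R^2
    a, b = np.polyfit(angle_deg, mass, 1)
    pred = a * np.asarray(angle_deg) + b
    ss_res = np.sum((np.asarray(mass) - pred) ** 2)
    ss_tot = np.sum((np.asarray(mass) - np.mean(mass)) ** 2)
    return a, b, 1 - ss_res / ss_tot

def fit_exponential(distance, mass):
    # mass = a * exp(b * distance), linearized as ln(mass) = ln(a) + b*d
    b, ln_a = np.polyfit(distance, np.log(mass), 1)
    return np.exp(ln_a), b
```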
Abstract: Text-based games are expected to be low-resource
applications that deliver good performance compared to
graphics-intensive games. However, some online text-based games
nowadays do not offer performance acceptable to users. Therefore, an
online text-based game called Star_Quest has been developed in order
to analyze its behavior under different performance measurements.
Performance metrics such as throughput, scalability, response time
and page loading time are captured to characterize the performance
of the game. The load-testing techniques are also described to show
the viability of our work. The results obtained are compared against
accepted performance levels to determine the performance level of
the game. The study reveals that the developed game managed to meet
all the performance objectives set forth.
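The abstract does not disclose the measurement code, but a minimal load-testing harness of the kind described, issuing concurrent requests and deriving throughput and response-time statistics, could look like this (the request function and parameters are placeholders, not the study's actual tooling):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def load_test(request_fn, n_requests=100, concurrency=10):
    """Issue n_requests calls to request_fn with the given concurrency
    and report throughput and response-time statistics."""
    latencies = []

    def timed(_):
        t0 = time.perf_counter()
        request_fn()                    # e.g. an HTTP GET of a game page
        latencies.append(time.perf_counter() - t0)

    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        list(pool.map(timed, range(n_requests)))
    elapsed = time.perf_counter() - start
    return {
        "throughput_rps": n_requests / elapsed,
        "avg_response_s": sum(latencies) / len(latencies),
        "max_response_s": max(latencies),
    }
```

Scalability would then be assessed by repeating the run at increasing concurrency levels and observing how the metrics degrade.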
Abstract: PPX (Pretty Printer for XML) is a query language that offers a concise method for formatting XML data into HTML. In this paper, we propose a simple formatting specification that combines automatic layout operators and variables in the layout expression of the GENERATE clause of PPX. This method can automatically format irregular XML data contained within an XML document, using layout decision rules that refer to the DTD. In our experiment, a quick comparison shows that PPX requires far less description than XSLT or XQuery programs performing the same tasks.
Abstract: A business process model describes the process flow of a
business and can be seen as a requirement for developing a software
application. This paper discusses a BPM2CD guideline which
complements the Model Driven Architecture concept by suggesting how
to create a platform-independent software model, in the form of a
UML class diagram, from a business process model. An important step
is the identification of UML classes from the business process
model. A technique for object-oriented analysis called domain
analysis is borrowed: key concepts in the business process model are
discovered and proposed as candidate classes for the class diagram.
The paper enhances this step by using ontology search to help
identify important classes for the business domain. Since an
ontology is a source of knowledge for a particular domain that can
itself link to ontologies of related domains, the search can give a
refined set of candidate classes for the resulting class diagram.
Abstract: The chemically defined Schlegel's medium was modified to
improve cell growth and the production of other metabolites by the
fluorescent pseudomonad strain R62. The modified medium does not
require pH control, as pH changes remain within ±0.2 units of the
initial pH 7.1 during fermentation. Siderophore production was
optimized for the fluorescent pseudomonad strain in the modified
medium containing 1% glycerol as the major carbon source,
supplemented with 0.05% succinic acid and 0.5% L-tryptophan.
Indole-3-acetic acid (IAA) production was higher when L-tryptophan
was used at 0.5%. Production of 2,4-diacetylphloroglucinol (DAPG)
was higher when the medium was amended with three trace elements.
The optimized medium produced 2.28 g/l of dry cell mass and 900 mg/l
of siderophore at the end of 36 h of cultivation, while the
production levels of IAA and DAPG were 65 mg/l and 81 mg/l,
respectively, at the end of 48 h of cultivation.
Abstract: This article outlines the conceptualization and
implementation of an intelligent system capable of extracting
knowledge from databases. The use of hybridized features of both
Rough and Fuzzy Set theory renders the developed system flexible in
dealing with discrete as well as continuous datasets. A raw data set
provided to the system is initially transformed into a
computer-legible format, followed by pruning of the data set. The
refined data set is then processed through various Rough Set
operators which enable the discovery of parameter relationships and
interdependencies. The discovered knowledge is automatically
transformed into a rule base expressed in Fuzzy terms. Two exemplary
cancer repository datasets (for Breast and Lung Cancer) have been
used to test the proposed framework.
Abstract: Camera calibration is an indispensable step for augmented
reality or image-guided applications where quantitative information
must be derived from the images. Usually, a camera calibration is
obtained by taking images of a special calibration object and
extracting the image coordinates of projected calibration marks,
enabling the calculation of the projection from 3D world coordinates
to 2D image coordinates. Such a procedure thus involves typical
steps, including feature point localization in the acquired images,
camera model fitting, correction of the distortion introduced by the
optics, and finally an optimization of the model's parameters. In
this paper we propose to extend this list by a further step: the
identification of the optimal subset of images yielding the smallest
overall calibration error. For this, we present a Monte Carlo based
algorithm, along with a deterministic extension, that automatically
determines the images yielding an optimal calibration. Finally, we
present results showing that the calibration can be significantly
improved by automated image selection.
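The Monte Carlo image selection step can be sketched as follows; here calibrate is assumed to be a black-box routine that runs a full calibration on a subset of images and returns its mean reprojection error (the paper's deterministic extension is omitted):

```python
import random

def select_image_subset(images, calibrate, subset_size=10, n_trials=200,
                        seed=0):
    """Monte Carlo search for the image subset yielding the smallest
    overall calibration error. `calibrate(subset)` is assumed to return
    the mean reprojection error of a calibration on that subset."""
    rng = random.Random(seed)
    best_subset, best_err = None, float("inf")
    for _ in range(n_trials):
        subset = rng.sample(images, min(subset_size, len(images)))
        err = calibrate(subset)
        if err < best_err:
            best_subset, best_err = subset, err
    return best_subset, best_err
```

A deterministic refinement could then greedily drop or swap single images from the best subset for as long as the error keeps decreasing.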
Abstract: Evolutionary algorithms are population-based, stochastic
search techniques, widely used as efficient global optimizers.
However, many real-life optimization problems require finding
optimal solutions to complex, high-dimensional, multimodal problems
involving computationally very expensive fitness function
evaluations. The use of evolutionary algorithms in such problem
domains is thus practically prohibitive. An attractive alternative
is to build meta-models, i.e., approximations of the actual fitness
functions, which are orders of magnitude cheaper to evaluate than
the actual function. Many regression and interpolation tools are
available to build such meta-models. This paper briefly discusses
the architectures and use of such meta-modeling tools in an
evolutionary optimization context. We further present two
evolutionary algorithm frameworks which involve the use of
meta-models for fitness function evaluation. The first framework,
the Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model [14],
reduces computation time by the controlled use of meta-models (in
this case an approximate model generated by Support Vector Machine
regression) to partially replace the actual function evaluation with
approximate function evaluation. However, the underlying assumption
in DAFHEA is that the training samples for the meta-model are
generated from a single uniform model, which does not take into
account uncertain scenarios involving noisy fitness functions. The
second model, DAFHEA-II, an enhanced version of the original DAFHEA
framework, incorporates a multiple-model based learning approach for
the support vector machine approximator to handle noisy functions
[15]. Empirical results obtained by evaluating the frameworks on
several benchmark functions demonstrate their efficiency.
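The core surrogate idea in DAFHEA, fitting a cheap regressor on individuals evaluated with the true fitness and using its predictions for most of the population, can be sketched as below. An RBF interpolator stands in for the Support Vector Machine regression used in the actual framework, purely to keep the sketch self-contained:

```python
import numpy as np

def fit_surrogate(X_train, y_train, gamma=1.0):
    """Fit an RBF interpolant on truly-evaluated individuals and return
    a cheap predict() that stands in for the expensive fitness."""
    X_train = np.asarray(X_train, float)
    y_train = np.asarray(y_train, float)

    def kernel(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    # Small jitter keeps the kernel system numerically solvable.
    K = kernel(X_train, X_train) + 1e-8 * np.eye(len(X_train))
    w = np.linalg.solve(K, y_train)

    def predict(X):
        return kernel(np.atleast_2d(np.asarray(X, float)), X_train) @ w

    return predict
```

In the EA loop, only a controlled fraction of each generation would receive the true (expensive) evaluation; those samples periodically refresh the surrogate, and the rest of the population is ranked by predict().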
Abstract: This article discusses the customs and traditions of
Turkestan in the late nineteenth and early twentieth centuries. With
its long history, Turkestan is well known as the birthplace of many
nations and nationalities. The name Turkestan was given to it for a
reason: it is the land of the Turkic peoples who inhabited Central
Asia and united together. Currently, the nations and nationalities
of the Turkestan region have formed their own sovereign states, and
every year they assert their countries' names in the world
community. The political and economic importance of Turkestan, which
served as a golden link between Asia and Europe, was always very
high. Consequently, various aggressive actions were systematically
undertaken by several great powers. Turkestan, as a colonial
territory, emerged as a result of the expansionist colonization
policy of the Russian Empire.
Abstract: Studies in neuroscience suggest that both global and
local feature information are crucial for the perception and
recognition of faces. It is widely believed that local features are
less sensitive to variations caused by illumination and expression.
In this paper, we focus on designing and learning local features for
face recognition. We designed three types of local features:
semi-global features, local patch features and tangent shape
features. The semi-global feature is designed to take advantage of
global-like features while avoiding hindering the AdaBoost algorithm
in boosting weak classifiers established from small local patches.
The local patch feature is designed to select discriminative
features automatically, and thus differs from traditional
approaches, in which local patches are usually selected manually to
cover the salient facial components. A shape feature is also
considered in this paper for frontal-view face recognition. These
features are selected and combined under the framework of a boosting
algorithm and a cascade structure. The experimental results
demonstrate that the proposed approach outperforms the standard
eigenface method and the Bayesian method. Moreover, the selected
local features and the observations made in the experiments are
instructive for research on local feature design in face
recognition.
Abstract: Based on the ability of the auto-disturbance-rejection controller (ADRC) to compensate dynamically for model disturbances and uncertainty, a new ADRC-based method is proposed for the decoupling control of dispenser longitudinal motion over a large flight envelope. Developed directly from the nonlinear model, ADRC is especially suitable for dynamic models subject to large disturbances. Furthermore, this scheme can simplify the design of the flight control system, since the structure and parameters of the controller need not be changed across the flight envelope. Simulation results over a large flight envelope show that the system achieves high dynamic and steady-state performance and that the controller has strong robustness.
Abstract: This paper proposes a specialized Web robot that automatically collects objectionable Web content for use in an objectionable Web content classification system, which maintains a URL database of such content. The robot aims at shortening the update period of the database, increasing the number of URLs it contains, and enhancing the accuracy of its information.
Abstract: The objectives of this research were 1) to study the
opinions of newspaper journalists about their trust in the National
Press Council of Thailand (NPCT) and the NPCT's success in
regulating professional ethics; and 2) to study the differences
among the mean vectors of the variables of trust in the NPCT and
opinions on the NPCT's success in regulating professional ethics
among samples working in different positions and from different
newspaper organizations. The results showed that 1) interaction
effects between the variables of work position and affiliation were
not statistically significant at the 0.05 level, and 2) there was a
statistically significant difference (p
Abstract: This paper describes a simple implementation of a
homotopy (also called continuation) algorithm for determining the resistance required for a resistor to dissipate energy at a specified rate in an electric circuit. Homotopy algorithms can be considered a development of classical numerical methods such as the Newton-Raphson and fixed-point methods. In homotopy methods, an embedding parameter is used to control the convergence. The method proposed in this work utilizes a special homotopy called the Newton homotopy. A numerical example solved in MATLAB is given to show the effectiveness of the proposed method.
Abstract: Fuzzy logic control (FLC) systems have been tested in
many technical and industrial applications as a useful modeling tool
that can handle the uncertainties and nonlinearities of modern
control systems. The main drawback of FLC methodologies in
industrial environments is the challenge of selecting the optimal
tuning parameters.
In this paper, a method is proposed for finding the optimal
membership functions of a fuzzy system using the particle swarm
optimization (PSO) algorithm. A hybrid algorithm combining fuzzy
logic control and PSO is used to design a controller for a
continuous stirred tank reactor (CSTR) with the aim of achieving
accurate and acceptable results. To exhibit the effectiveness of the
proposed algorithm, it is used to optimize the Gaussian membership
functions of the fuzzy model of a nonlinear CSTR system as a case
study. The optimized membership functions (MFs) provide better
performance than a fuzzy model of the same system whose MFs are
heuristically defined.
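A minimal sketch of the PSO component follows: each particle encodes the centers and widths of the Gaussian membership functions, and the cost function is minimized over those parameters. The PSO constants and the placeholder cost are assumptions; in the paper the cost would be the closed-loop error of the CSTR model:

```python
import numpy as np

def gaussian_mf(x, center, sigma):
    # The Gaussian membership function whose (center, sigma) are tuned.
    return np.exp(-0.5 * ((x - center) / sigma) ** 2)

def pso(cost, dim, n_particles=20, iters=60, bounds=(0.0, 1.0), seed=0):
    """Standard global-best PSO minimizing cost over [lo, hi]^dim."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_cost = np.array([cost(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration weights
    for _ in range(iters):
        r1 = rng.random(x.shape)
        r2 = rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        costs = np.array([cost(p) for p in x])
        better = costs < pbest_cost
        pbest[better] = x[better]
        pbest_cost[better] = costs[better]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, pbest_cost.min()
```

For the fuzzy controller, a particle of dimension 2m would hold the (center, sigma) pair of each of the m MFs, and cost would simulate the CSTR under the resulting controller.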
Abstract: Web usage mining has become a popular research area, as a
huge amount of data is available online. These data can be used for
several purposes, such as web personalization, web structure
enhancement, web navigation prediction, etc. However, raw log files
are not directly usable; they have to be preprocessed in order to
transform them into a suitable format for different data mining
tasks. One of the key issues in the preprocessing phase is
identifying web users. Identifying users based on web log files is
not a straightforward problem, so various methods have been
developed. Several difficulties have to be overcome, such as
client-side caching and changing or shared IP addresses. This paper
presents three different methods for identifying web users. Two of
them are the most commonly used methods in web log mining systems,
whereas the third one is our novel approach that uses a complex
cookie-based method to identify web users. Furthermore, we also take
steps towards identifying the individuals behind the impersonal web
users. To demonstrate the efficiency of the new method, we developed
an implementation called the Web Activity Tracking (WAT) system,
which aims at a more precise distinction of web users based on log
data. We present statistical analyses created by the WAT on real
data about the behavior of Hungarian web users, together with a
comprehensive analysis and comparison of the three methods.
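The cookie-based identification idea can be sketched as follows; the log field names are illustrative assumptions, not the WAT system's actual schema. A persistent cookie id identifies a user directly, with a fallback to the (IP address, user agent) pair used by the classical methods:

```python
def identify_users(log_entries):
    """Label each request dict with a user id: the cookie id when
    present, otherwise the (IP, user-agent) heuristic. Returns the
    number of distinct users found."""
    ids = {}
    for entry in log_entries:
        if entry.get("cookie"):
            key = ("cookie", entry["cookie"])
        else:
            # Classical fallback: requests sharing IP and agent merge.
            key = ("ip-ua", entry["ip"], entry["agent"])
        entry["user_id"] = ids.setdefault(key, len(ids))
    return len(ids)
```

This distinguishes two users behind one shared IP (different cookies), where the purely IP-based method would merge them into a single user.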
Abstract: Innovation, technology and knowledge form a trilogy for
meeting the challenges arising from uncertainty. The evidence
suggested an opportunity to ask how to manage in an environment of
constant innovation. In an attempt to obtain an answer from the
field of Management Sciences, grounded in Contingency Theory,
research was conducted with phenomenological and descriptive
approaches, using the Case Study Method and its usual procedures,
involving a focus group composed of managers and employees working
in the pharmaceutical field. The problem situation was raised, the
state of the art was interpreted, and the facts were dissected. Four
establishments were involved in these tasks. The results indicate
that the ventures studied have been managed empirically by their
founders and are experimenting with the agility described in this
work. The expectation of this study is to improve concepts for
stakeholders regarding creativity in business.
Abstract: A decentralized eco-sanitation system is a promising and sustainable alternative to the century-old centralized conventional sanitation system. The decentralized concept relies on an environmentally and economically sound management of water, nutrient and energy fluxes. Source-separation systems for urban waste management collect different solid waste and wastewater streams separately to facilitate the recovery of valuable resources (energy, nutrients) from wastewater. A resource recovery centre serving 20,000 people would act as the functional unit for the treatment of urban waste in a high-density population community such as Singapore. The decentralized system includes urine treatment, co-digestion of faeces and food waste, and treatment of horticultural waste and the organic fraction of municipal solid waste in composting plants. A design model is developed to estimate the inputs and outputs in terms of materials and energy. The inputs of urine (yellow water, YW) and faeces (brown water, BW) are calculated by considering the daily mean production of urine and faeces by humans and the water consumption of a no-mix vacuum toilet (0.2 and 1 L of flushing water for urine and faeces, respectively). Food waste (FW) production is estimated at 150 g wet weight/person/day. The YW is collected and discharged by gravity into a tank. It was found that two days are required for urine hydrolysis and struvite precipitation. The maximum nitrogen (N) and phosphorus (P) recoveries are 150-266 kg/day and 20-70 kg/day, respectively. In contrast, BW and FW are mixed for co-digestion in a thermophilic acidification tank, and a decentralized/centralized methanogenic reactor is later used for biogas production. It is determined that 6.16-15.67 m3/h of methane is produced, equivalent to 0.07-0.19 kWh/ca/day. The digestion residues are treated with horticultural waste and the organic fraction of municipal waste in co-composting plants.
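The reported methane-to-energy equivalence can be checked with a short calculation. Assuming a heating value of about 10 kWh per m³ of methane (an assumed round figure; the abstract does not state the exact conversion factor used) and the 20,000-person functional unit:

```python
PEOPLE = 20_000
KWH_PER_M3_CH4 = 10.0   # assumed approximate heating value of methane

def per_capita_energy(methane_m3_per_hour):
    # m3/h -> m3/day -> kWh/day -> kWh per person per day
    return methane_m3_per_hour * 24 * KWH_PER_M3_CH4 / PEOPLE

low = per_capita_energy(6.16)    # lower bound of reported production
high = per_capita_energy(15.67)  # upper bound of reported production
```

Under this assumption the bounds come out to roughly 0.07 and 0.19 kWh/ca/day, consistent with the range stated in the abstract.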