Abstract: The purpose of this study was to present a reliable means of human-computer interfacing based on finger gestures made in two dimensions, which could be interpreted and used to control a remote robot's movement. The gestures were captured and interpreted using an algorithm based on trigonometric functions that calculates the angular displacement from one point of touch to another as the user's finger moves within a time interval, thereby allowing pattern spotting of the captured gesture. This paper presents the design and implementation of such a gesture-based user interface, utilizing the aforementioned algorithm. These techniques were then used to control a remote mobile robot's movement. A resistive touch screen was selected as the gesture sensor, and a programmed microcontroller was used to interpret the captured gestures.
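The angle-based interpretation described above can be sketched minimally as follows (a Python illustration; the four direction labels and 90-degree sectors are hypothetical, not the paper's actual gesture vocabulary):

```python
import math

def angular_displacement(p0, p1):
    """Angle (degrees) of the movement vector from touch point p0 to p1,
    measured counter-clockwise from the positive x-axis (y-up assumed)."""
    dx = p1[0] - p0[0]
    dy = p1[1] - p0[1]
    return math.degrees(math.atan2(dy, dx))

def classify_stroke(angle):
    """Map an angle to one of four coarse directions (hypothetical labels)."""
    if -45 <= angle < 45:
        return "right"
    if 45 <= angle < 135:
        return "up"
    if -135 <= angle < -45:
        return "down"
    return "left"
```

Sampling successive touch points over a time interval and classifying each displacement yields a sequence of direction symbols in which gesture patterns can be spotted.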
Abstract: Unsteady magnetohydrodynamics (MHD) boundary
layer flow and heat transfer over a continuously stretching surface in
the presence of radiation is examined. By similarity transformation,
the governing partial differential equations are transformed to a set of
ordinary differential equations. Numerical solutions are obtained by
employing the Runge-Kutta-Fehlberg scheme with a shooting
technique in the Maple software environment. The effects of the
unsteadiness parameter, radiation parameter, magnetic parameter, and
Prandtl number on the heat transfer characteristics are obtained and
discussed. It is found that the heat transfer rate at the surface
increases as the Prandtl number and unsteadiness parameter increase,
but decreases with the magnetic and radiation parameters.
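The shooting idea behind such a numerical solution can be sketched on a simple two-point boundary value problem. This is a generic illustration only: the ODE below (f'' = -f with f(0)=0, f(1)=1) is not the MHD boundary layer system, and classical RK4 with bisection stands in for Runge-Kutta-Fehlberg:

```python
import math

def rk4(f, y0, t0, t1, n=200):
    """Classical RK4 integration of the system y' = f(t, y) from t0 to t1."""
    h = (t1 - t0) / n
    t, y = t0, list(y0)
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k1)])
        k3 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k2)])
        k4 = f(t + h, [yi + h*ki for yi, ki in zip(y, k3)])
        y = [yi + h/6*(a + 2*b + 2*c + d)
             for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]
        t += h
    return y

def shoot(slope):
    """Integrate f'' = -f with f(0) = 0, f'(0) = slope; return f(1)."""
    rhs = lambda t, y: [y[1], -y[0]]
    return rk4(rhs, [0.0, slope], 0.0, 1.0)[0]

def solve_bvp(target=1.0, lo=0.0, hi=5.0, tol=1e-10):
    """Shooting: bisect on the unknown initial slope until f(1) = target."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if (shoot(mid) - target) * (shoot(lo) - target) <= 0:
            hi = mid  # sign change: root lies in [lo, mid]
        else:
            lo = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2
```

The similarity-transformed ODE system is treated the same way: guess the missing initial conditions, integrate to the far boundary, and adjust the guess until the boundary condition there is satisfied.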
Abstract: Automatic keyphrase extraction is useful in efficiently
locating specific documents in online databases. While several
techniques have been introduced over the years, improvement on
accuracy rate is minimal. This research examines attribute scores for
author-supplied keyphrases to better understand how the scores affect
the accuracy rate of automatic keyphrase extraction. Five attributes
are chosen for examination: Term Frequency, First Occurrence, Last
Occurrence, Phrase Position in Sentences, and Term Cohesion
Degree. The results show that First Occurrence is the most reliable
attribute. Term Frequency, Last Occurrence and Term Cohesion
Degree display a wide range of variation but are still usable with
suggested tweaks. Only Phrase Position in Sentences shows a totally
unpredictable pattern. The results imply that the commonly used
ranking approach, which directly extracts the top-ranked potential phrases
from the candidate keyphrase list as the keyphrases, may not be reliable.
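Two of the attributes examined above can be sketched as follows (a minimal Python illustration; the paper's exact scoring formulas are not reproduced here):

```python
def first_occurrence(phrase, document_words):
    """First Occurrence attribute: relative position (0..1) of the phrase's
    first appearance in the tokenized document; lower means earlier."""
    n = len(document_words)
    k = len(phrase)
    for i in range(n - k + 1):
        if document_words[i:i + k] == phrase:
            return i / n
    return 1.0  # phrase absent: worst (latest) possible score

def term_frequency(phrase, document_words):
    """Term Frequency attribute: number of occurrences of the phrase."""
    k = len(phrase)
    return sum(document_words[i:i + k] == phrase
               for i in range(len(document_words) - k + 1))
```

Scores like these are computed for each candidate phrase and combined to rank candidates, which is exactly the ranking step whose reliability the study questions.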
Abstract: Mathematical, graphical and intuitive models are often
constructed in the development process of computational systems.
The Unified Modeling Language (UML) is one of the most popular
modeling languages used by practicing software engineers. This
paper critically examines UML models and suggests an augmented
use case view with the addition of new constructs for modeling
software. It also shows how a use case diagram can be enhanced. The
improved modeling constructs are presented with examples for
clarifying important design and implementation issues.
Abstract: The main objective of this paper is to compare the
Wolf Pack Search (WPS), a newly introduced intelligent algorithm,
with several other known algorithms,
including Particle Swarm Optimization (PSO), Shuffled Frog
Leaping (SFL), and Binary and Continuous Genetic Algorithms. All
algorithms are applied to two benchmark cost functions. The aim is
to identify the best algorithm in terms of speed and accuracy in
finding the solution, where speed is measured in terms of function
evaluations. The simulation results show that the SFL algorithm,
with fewer function evaluations, ranks first when simulation time is
important, while when accuracy is the significant issue, WPS and PSO
perform better.
Abstract: In this paper, we propose a novel adaptive voltage control strategy for a boost converter via Inverse LQ (ILQ) servo-control. Our strategy is based on an analytical formula of the Inverse Linear Quadratic design method, which does not require solving Riccati's equation directly. An optimal and adaptive controller for the voltage control system is designed, and its stability and robustness are analyzed. An analytical solution for optimal and robust voltage control is obtained through the natural angular velocity as a single parameter, so the responses can be shaped easily via ILQ control theory. Our method provides effective results: the responses are stable, and the response times do not drift even when the operating condition changes widely.
Abstract: The zero-inflated strict arcsine model is a newly developed
model that is appropriate for modeling overdispersed
count data. In this study, we extend the zero-inflated strict arcsine model
to a zero-inflated strict arcsine regression model by taking into
consideration the extra variability caused by excess zeros and
covariates in count data. The maximum likelihood estimation method is
used to estimate the parameters of this zero-inflated strict arcsine
regression model.
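The structure of a zero-inflated likelihood can be sketched as follows. Note the heavy assumptions: a Poisson pmf stands in for the strict arcsine pmf purely to illustrate the mixture form, and the crude grid search is not the paper's estimation procedure:

```python
import math

def zi_loglik(counts, pi, lam):
    """Log-likelihood of a zero-inflated count model:
    P(0) = pi + (1 - pi) * f(0),  P(k) = (1 - pi) * f(k) for k >= 1,
    where a Poisson pmf f stands in for the strict arcsine pmf."""
    def pois(k):
        return math.exp(-lam) * lam**k / math.factorial(k)
    ll = 0.0
    for k in counts:
        if k == 0:
            ll += math.log(pi + (1 - pi) * pois(0))
        else:
            ll += math.log((1 - pi) * pois(k))
    return ll

def fit_zi(counts, grid=50):
    """Crude maximum-likelihood fit by grid search over (pi, lam)."""
    best = None
    for i in range(grid):
        pi = i / grid             # zero-inflation probability in [0, 1)
        for j in range(1, grid + 1):
            lam = 5.0 * j / grid  # count-distribution mean in (0, 5]
            ll = zi_loglik(counts, pi, lam)
            if best is None or ll > best[0]:
                best = (ll, pi, lam)
    return best[1], best[2]
```

The regression extension replaces the constant parameters with functions of the covariates and maximizes the same kind of likelihood over the regression coefficients.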
Abstract: Apart from geometry, functionality is one of the most
significant hallmarks of a product. The functionality of a product can
be considered the fundamental justification for the product's
existence. Therefore, a functional analysis, including a complete and
reliable descriptor, has a high potential to improve the product
development process in various fields, especially in knowledge-based
design. One of the important applications of functional analysis
and indexing is in retrieval and design reuse. More than 75%
of design activity in new product development involves reusing
earlier and existing design know-how. Thus, analysis and
categorization of product functions, concluded by functional
indexing, directly influence design optimization. This paper
elucidates and evaluates the major classes of functional analysis by
discussing their major methods. Moreover, it concludes by
presenting a novel hybrid approach for functional analysis.
Abstract: This paper considers the autonomous navigation
problem of multiple n-link nonholonomic mobile manipulators within
an obstacle-ridden environment. We present a set of nonlinear
acceleration controllers, derived from a Lyapunov-based control
scheme, which generate collision-free trajectories of the mobile
manipulators from initial configurations to final configurations in a
constrained environment cluttered with stationary solid objects of
different shapes and sizes. We demonstrate the efficiency of the
control scheme and the resulting acceleration controllers of the
mobile manipulators through computer simulations of an
illustrative scenario.
Abstract: Recent advancements in sensor technologies and
Wireless Body Area Networks (WBANs) have led to the
development of cost-effective healthcare devices which can be used
to monitor and analyse a person's physiological parameters from
remote locations. These advancements provide a unique opportunity
to overcome current healthcare challenges of low-quality service
provisioning, lack of easy access to service varieties, high costs
of services, and the growing elderly population experienced
globally. This paper reports on a prototype implementation of an
architecture that seamlessly integrates Wireless Body Area Network
(WBAN) with Web services (WS) to proactively collect
physiological data of remote patients to recommend diagnostic
services. Technologies based upon WBAN and WS can provide
ubiquitous accessibility to a variety of services by allowing
distributed healthcare resources to be massively reused to provide
cost-effective services without individuals physically moving to the
locations of those resources. In addition, these technologies can
reduce costs of healthcare services by allowing individuals to access
services to support their healthcare. The prototype uses WBAN body
sensors implemented on Arduino Fio platforms, worn by the
patient, and an Android smartphone as a personal server. The
physiological data are collected and uploaded through GPRS/Internet
to the Medical Health Server (MHS) to be analysed. The prototype
monitors the activities, location, and physiological parameters, such as
SpO2 and heart rate, of the elderly and patients in rehabilitation.
Medical practitioners would have real time access to the uploaded
information through a web application.
Abstract: This paper presents optimization-based damping controllers of a Unified Power Flow Controller (UPFC) for damping power system oscillations. The design problem of the UPFC damping controller and system configuration is formulated as an optimization with a time-domain-based objective function by means of the Adaptive Tabu Search (ATS) technique. The UPFC is installed in a Single Machine Infinite Bus (SMIB) system for performance analysis of the power system and simulated using MATLAB's Simulink. The simulation results of these studies show that the designed controller has a tremendous capability in damping power system oscillations.
Abstract: A hybrid knowledge model is suggested as an underlying
framework for product development management. It can support such
hybrid features as ontologies and rules. Effective collaboration in a
product development environment depends on sharing and reasoning over
product information as well as engineering knowledge. Many studies
have considered product information and engineering knowledge.
However, most previous research has focused either on building the
ontology of product information or on rule-based systems of engineering
knowledge. This paper shows that an F-logic-based knowledge model can
support these desirable features in a hybrid way.
Abstract: Ontology is widely used as a tool for organizing
information, creating relations between the subjects within a
defined knowledge domain. Various fields, such as civil engineering,
biology, and management, have successfully integrated ontologies into
decision support systems for managing domain knowledge and for
assisting their decision makers. Gross pollutant traps (GPTs) are devices
used to trap large items or hazardous particles and prevent them from
polluting and entering our waterways. However, choosing and
specifying a GPT is a challenge in Malaysia, as few
GPT data repositories are captured and shared. Hence, an ontology is
needed to capture, organize, and represent this knowledge as
meaningful information, which can contribute to the efficiency of
GPT selection in Malaysia's urbanization. A GPT ontology framework
is therefore built as the first step to capture GPT knowledge, which
will then be integrated into the decision support system. This paper
provides several examples of the GPT ontology and explains how
it is constructed using the Protégé tool.
Abstract: The Tropical Data Hub (TDH) is a virtual research environment that provides researchers with an e-research infrastructure to congregate significant tropical data sets for data reuse, integration, searching, and correlation. However, researchers often require data and metadata synthesis across disciplines for cross-domain analyses and knowledge discovery. A triplestore offers a semantic layer to achieve a more intelligent method of search to support these synthesis requirements by automating latent linkages in the data and metadata. Presently, the benchmarks that aid the decision of which triplestore is best suited for an application environment like the TDH are limited to performance. This paper describes a new evaluation tool developed to analyze both features and performance. The tool comprises a weighted decision matrix to evaluate the interoperability, functionality, performance, and support availability of a range of integrated and native triplestores and to rank them according to the requirements of the TDH.
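A weighted decision matrix of this kind can be sketched in a few lines (the criterion weights and store scores below are hypothetical, chosen only to show the mechanics):

```python
def rank_candidates(scores, weights):
    """Weighted decision matrix: each candidate has a score per criterion;
    candidates are ranked by the weighted sum of their scores."""
    totals = {
        name: sum(weights[c] * s for c, s in crit.items())
        for name, crit in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical triplestores scored on the four criteria named above.
weights = {"interoperability": 0.3, "functionality": 0.3,
           "performance": 0.3, "support": 0.1}
scores = {
    "store_a": {"interoperability": 8, "functionality": 7,
                "performance": 9, "support": 6},
    "store_b": {"interoperability": 9, "functionality": 8,
                "performance": 7, "support": 9},
}
ranking = rank_candidates(scores, weights)
```

Adjusting the weights to match an environment's requirements (as the TDH evaluation does) can change which triplestore ranks first.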
Abstract: In the semiconductor manufacturing process, large
amounts of data are collected from various sensors of multiple
facilities. The collected data from sensors have several different characteristics
due to variables such as types of products, former processes
and recipes. In general, Statistical Quality Control (SQC) methods
assume the normality of the data to detect out-of-control states of
processes. Although the collected data have different characteristics,
using them directly as inputs to SQC increases data variation,
requires wide control limits, and degrades the ability to detect
out-of-control states. Therefore, it is necessary to separate similar data groups
from the mixed data for more accurate process control. In this paper,
we propose a regression tree using a split algorithm based on the Pearson
distribution to handle non-normal distributions in a parametric method.
The regression tree finds similar properties of data across different
variables. Experiments using real semiconductor manufacturing
process data show improved fault detection performance.
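The tree-splitting step can be sketched as follows; note that plain variance (SSE) reduction replaces the paper's Pearson-distribution-based criterion, purely to illustrate how a split threshold separates dissimilar data groups:

```python
def sse(ys):
    """Sum of squared errors of a group around its mean."""
    if not ys:
        return 0.0
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys)

def best_split(xs, ys):
    """Find the threshold on one variable that most reduces total SSE.
    (Variance reduction stands in for the Pearson-distribution criterion.)"""
    pairs = sorted(zip(xs, ys))
    best = (None, sse(ys))  # (threshold, resulting total SSE)
    for i in range(1, len(pairs)):
        left = [y for _, y in pairs[:i]]
        right = [y for _, y in pairs[i:]]
        total = sse(left) + sse(right)
        if total < best[1]:
            thr = (pairs[i - 1][0] + pairs[i][0]) / 2
            best = (thr, total)
    return best
```

Applied recursively over the process variables, such splits partition the mixed sensor data into groups homogeneous enough for separate control limits.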
Abstract: Property investment in the real estate industry carries
high risk and high cost due to the uncertainty factors that affect the
decisions made. The analytic hierarchy process has long relied on
experts' opinions to measure the uncertainty of the risk factors for
risk analysis. Therefore, different levels of expert experience
create differing opinions and lead to conflict among the experts in
the field. The objective of this paper is to propose a new technique
to measure the uncertainty of the risk factors, based on a
multidimensional data model and data mining techniques, as a
deterministic approach. The proposed technique consists of a basic
framework which includes four modules: user, technology, end-user
access tools, and applications. The property investment risk analysis
in this paper is defined as a micro-level analysis, as the features of
the property are considered in the analysis.
Abstract: In this paper we present the Semantic Assistant Agent
(SAA), an open-source digital library agent that takes a user query
for finding information in the digital library, retrieves resources'
metadata, and stores it semantically. SAA uses Semantic Web
technologies to improve browsing and searching for resources in the
digital library. All metadata stored in the library are available in RDF
format for querying and processing by SemanSreach, which is part of
the SAA architecture. The architecture includes a generic RDF-based model
that represents relationships among objects and their components.
Queries against these relationships are supported by an RDF triple
store.
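A toy stand-in for such a triple store, with wildcard pattern queries, can be sketched as follows (the identifiers are hypothetical, and a real deployment would use a proper RDF library rather than this in-memory set):

```python
class TripleStore:
    """Minimal in-memory RDF-style triple store.
    None acts as a wildcard in query patterns."""

    def __init__(self):
        self.triples = set()

    def add(self, s, p, o):
        self.triples.add((s, p, o))

    def query(self, s=None, p=None, o=None):
        """Return all (subject, predicate, object) triples matching the pattern."""
        return [t for t in self.triples
                if (s is None or t[0] == s)
                and (p is None or t[1] == p)
                and (o is None or t[2] == o)]

store = TripleStore()
store.add("doc:42", "dc:title", "Semantic Digital Libraries")
store.add("doc:42", "dc:creator", "author:7")
store.add("doc:99", "dc:creator", "author:7")
```

Relationship queries like "all documents by author:7" then reduce to pattern matches over the triples, which is the kind of query a production triple store answers with SPARQL.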
Abstract: Control chart pattern recognition is one of the most important tools for identifying the process state in statistical process control. An abnormal process state can be classified by recognizing the unnatural patterns that arise from assignable causes. In this study, a wavelet-based neural network approach is proposed for the recognition of control chart patterns with various characteristics. The procedure of the proposed control chart pattern recognizer comprises three stages. First, multi-resolution wavelet analysis is used to generate time-shape and time-frequency coefficients that carry detailed information about the patterns. Second, distance-based features are extracted by a bi-directional Kohonen network to produce reduced and robust information. Third, a back-propagation network classifier is trained on these features. The accuracy of the proposed method is demonstrated by a performance evaluation with numerical results.
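The multi-resolution step of stage one can be sketched with the Haar wavelet (an illustration only; the paper's wavelet choice and the Kohonen and back-propagation stages are not reproduced here):

```python
def haar_step(signal):
    """One level of the (unnormalized) Haar wavelet transform:
    approximation coefficients are local averages of adjacent samples,
    detail coefficients are local half-differences."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def haar_features(signal, levels=2):
    """Multi-resolution decomposition: detail coefficients per level plus
    the final coarse approximation, usable as pattern features."""
    features = []
    for _ in range(levels):
        signal, detail = haar_step(signal)
        features.append(detail)
    features.append(signal)
    return features
```

Slow trends in a control chart show up in the coarse approximation while sudden shifts concentrate in the detail coefficients, which is why such coefficients discriminate between unnatural patterns.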
Abstract: This paper presents an algorithm for the recognition
and tracking of moving objects; a 1/10-scale model car is used to verify
the performance of the algorithm. The presented algorithm merges the
SURF algorithm with the Lucas-Kanade algorithm. The SURF algorithm
is robust to contrast, size, and rotation changes and can recognize
objects, but it is slow due to its computational complexity. The
Lucas-Kanade algorithm is fast, but it cannot recognize objects; its
optical flow compares the previous and current frames so that
the movement of a pixel can be tracked. To address the problems that
arise when fusing the two algorithms, a Kalman filter is used to
estimate the next location and to compensate for the accumulated
error. The resolution of the camera (vision sensor) is fixed at
640x480. To verify the performance of the fusion algorithm, it is
compared against the SURF algorithm in three situations: driving
straight, driving along a curve, and recognizing cars behind
obstacles. The model vehicle makes it possible to test situations
similar to actual driving. The proposed fusion algorithm showed
better performance and accuracy than existing object recognition
and tracking algorithms. We will further improve the performance
of the algorithm so that it can be tested on images of actual
road environments.
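The Kalman prediction/correction step used for location estimation can be sketched in one dimension (a constant-velocity model on scalar positions; the full SURF/Lucas-Kanade fusion and the 2-D image geometry are not reproduced):

```python
def kalman_track(measurements, dt=1.0, q=1e-3, r=1.0):
    """1-D constant-velocity Kalman filter: state is (position, velocity);
    noisy position measurements in, filtered position estimates out."""
    x = [measurements[0], 0.0]        # state estimate
    P = [[1.0, 0.0], [0.0, 1.0]]      # state covariance
    out = []
    for z in measurements:
        # Predict: x <- F x, P <- F P F^T + Q, with F = [[1, dt], [0, 1]]
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q]]
        # Update with position measurement z (H = [1, 0], noise variance r)
        S = P[0][0] + r
        K = [P[0][0] / S, P[1][0] / S]
        y = z - x[0]                  # innovation
        x = [x[0] + K[0] * y, x[1] + K[1] * y]
        P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
             [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]
        out.append(x[0])
    return out
```

When the slow recognizer falls behind, the predicted state bridges the gap, and each new measurement corrects the accumulated drift, which is the role the Kalman filter plays in the fusion above.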
Abstract: Glaucoma diagnosis involves extracting three features
of the fundus image: the optic cup, the optic disc, and the vasculature. Present
manual diagnosis is expensive, tedious, and time-consuming. A
number of studies have been conducted to automate this process.
However, the variability between the diagnostic capability of an
automated system and an ophthalmologist has yet to be established. This
paper discusses the efficiency of, and variability between, the
ophthalmologists' opinions and a digital technique, thresholding. The
efficiency and variability measures are based on image quality
grading: poor, satisfactory, or good. The images are separated into
four channels: gray, red, green, and blue. A scientific investigation
was conducted with three ophthalmologists, who graded the images
based on image quality. The images are thresholded using multi-thresholding
and graded in the same way as by the ophthalmologists. A
comparison of the grades from the ophthalmologists and the thresholding is made.
The results show that there is small variability between the results of the
ophthalmologists and the digital thresholding.