Abstract: A parallel block method based on Backward
Differentiation Formulas (BDF) is developed for the parallel solution
of stiff Ordinary Differential Equations (ODEs). Most common
methods for solving stiff systems of ODEs are based on implicit
formulae and solved using Newton iteration, which requires the
repeated solution of systems of linear equations with coefficient
matrix I - hβJ, where J is the Jacobian matrix of the problem. In this
paper, the matrix operations are parallelized in order to reduce the
cost of the iterations. Numerical results are given comparing the
speedup and efficiency of the parallel algorithm with those of the
sequential algorithm.
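The inner Newton iteration described above can be sketched as follows. This is a minimal serial version using backward Euler, the simplest member of the BDF family; the tolerance, step size and test problem are illustrative assumptions, not the paper's parallel block method:

```python
import numpy as np

def newton_bdf_step(f, jac, y_prev, y_guess, h, beta, tol=1e-12, max_iter=20):
    """One implicit step in backward-Euler form, y = y_prev + h*beta*f(y),
    solved by Newton iteration. Each iteration solves a linear system
    with coefficient matrix I - h*beta*J, where J is the Jacobian of f."""
    y = y_guess.astype(float).copy()
    n = y.size
    for _ in range(max_iter):
        residual = y - y_prev - h * beta * f(y)
        A = np.eye(n) - h * beta * jac(y)   # I - h*beta*J
        delta = np.linalg.solve(A, -residual)
        y = y + delta
        if np.linalg.norm(delta) < tol:
            break
    return y

# Stiff scalar test problem y' = -1000*y; Newton converges in one
# iteration here because the problem is linear.
f = lambda y: -1000.0 * y
jac = lambda y: np.array([[-1000.0]])
y0 = np.array([1.0])
y1 = newton_bdf_step(f, jac, y0, y0, h=0.01, beta=1.0)
```

In the parallel setting the expensive parts are the Jacobian evaluation and the linear solve, which is where the matrix operations would be distributed.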
Abstract: The Andaman Sea can be studied using an oceanic model;
therefore, a grid covering the study area must be generated. This
research aims to generate a grid covering the Andaman Sea, situated
between longitudes 90◦E to 101◦E and latitudes 1◦N to 18◦N. The
horizontal grid is orthogonal curvilinear with 87 × 217 grid points.
The methods used in this study are cubic spline and bilinear
interpolation. The boundary grid points are generated by spline
interpolation, while the interior grid points are specified by the
bilinear interpolation method. The vertical grid is a sigma coordinate
with 15 layers of water column.
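A minimal sketch of the interior-point step, assuming the bilinear interpolation is the standard bilinear (Coons) blending of four boundary curves; the unit-square boundaries below are illustrative stand-ins for the spline-generated Andaman Sea boundary points:

```python
import numpy as np

def coons_interior(bottom, top, left, right):
    """Fill interior grid points by bilinear (Coons) blending of four
    boundary curves. bottom/top are (n, 2) arrays of (x, y) points,
    left/right are (m, 2); the curves must meet at the corners."""
    n, m = bottom.shape[0], left.shape[0]
    s = np.linspace(0.0, 1.0, n)[:, None, None]   # parameter along bottom/top
    t = np.linspace(0.0, 1.0, m)[None, :, None]   # parameter along left/right
    grid = ((1 - t) * bottom[:, None, :] + t * top[:, None, :]
            + (1 - s) * left[None, :, :] + s * right[None, :, :]
            - ((1 - s) * (1 - t) * bottom[0] + s * (1 - t) * bottom[-1]
               + (1 - s) * t * top[0] + s * t * top[-1]))
    return grid  # shape (n, m, 2): interpolated (x, y) at each grid point

# Unit-square boundaries: the blended interior is the regular grid.
n, m = 5, 7
u = np.linspace(0, 1, n)
v = np.linspace(0, 1, m)
bottom = np.stack([u, np.zeros(n)], axis=1)
top    = np.stack([u, np.ones(n)], axis=1)
left   = np.stack([np.zeros(m), v], axis=1)
right  = np.stack([np.ones(m), v], axis=1)
grid = coons_interior(bottom, top, left, right)
```

For a real orthogonal curvilinear grid the boundary arrays would come from the cubic-spline fit of the coastline and open boundaries, and the blended interior would then typically be smoothed further.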
Abstract: The main aim of Supply Chain Management (SCM) is
to produce, distribute, logistics and deliver goods and equipment in
right location, right time, right amount to satisfy costumers, with
minimum time and cost waste. So implementing techniques that
reduce project time and cost, and improve productivity and
performance is very important. Emerging technologies such as the
Radio Frequency Identification (RFID) are now making it possible to
automate supply chains in a real time manner and making them more
efficient than the simple supply chain of the past for tracing and
monitoring goods and products and capturing data on movements of
goods and other events. This paper considers concepts, components
and RFID technology characteristics by concentration of warehouse
and inventories management. Additionally, utilization of RFID in the
role of improving information management in supply chain is
discussed. Finally, the facts of installation and this technology-s
results in direction with warehouse and inventory management and
business development will be presented.
Abstract: This paper deals with e-government issues at several
levels. Initially we look at the concept of e-government itself in order
to give it a sound framework. Than we look at the e-government
issues at three levels, first we analyse it at the global level, second we
analyse it at the level of transition economies, and finally we take a
closer look on developments in Croatia. The analysis includes actual
progress being made in selected transition economies given the Euro
area averages, along with e-government potential in future
demanding period.
Abstract: In this paper, we develop an accurate and efficient Haar wavelet method for the well-known FitzHugh-Nagumo equation. The proposed scheme can be applied to a wide class of nonlinear reaction-diffusion equations, and the power of this manageable method is confirmed. Moreover, the use of Haar wavelets is found to be accurate, simple, fast, flexible and convenient, with small computational costs, making it computationally attractive.
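The FitzHugh-Nagumo scheme itself is not reproduced in the abstract, but the core Haar machinery can be sketched. The recursion below builds the normalized Haar matrix, whose orthonormal rows give exact expansion and reconstruction of a sampled signal (the signal is an illustrative example):

```python
import numpy as np

def haar_matrix(n):
    """Normalized Haar matrix of order n (n a power of two), built by the
    standard recursion: scaling rows from H_{n/2} ⊗ (1, 1), detail rows
    from I_{n/2} ⊗ (1, -1), both scaled by 1/sqrt(2)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])
    bottom = np.kron(np.eye(n // 2), [1.0, -1.0])
    return np.vstack([top, bottom]) / np.sqrt(2.0)

H = haar_matrix(8)
# Orthonormal rows: H @ H.T = I, so the Haar coefficients of a sampled
# signal are c = H @ f and the signal is recovered as f = H.T @ c.
f = np.sin(np.linspace(0.0, 1.0, 8))
c = H @ f
f_rec = H.T @ c
```

In a wavelet collocation scheme of the kind described, matrices like H (and integrated versions of it) turn the PDE into small algebraic systems, which is where the low computational cost comes from.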
Abstract: Explosive forming is one of the unconventional
techniques in which, most commonly, water is used as the
pressure-transmission medium. One of the newest methods in
explosive forming is gas detonation forming, which uses a normal
shock wave derived from a gas detonation to form sheet metals. For this
purpose a detonation is developed from the reaction of H2+O2
mixture in a long cylindrical detonation tube. The detonation wave
goes through the detonation tube and acts as a blast load on the steel
blank and forms it. Experimental results are compared with a finite
element model, and the comparison of the experimental and
numerical results obtained for strain, thickness variation and
deformed geometry is carried out. Numerical and experimental
results showed approximately 75–90% similarity in formability of
the desired shape. The optimum gas mixture was obtained at
68% H2 and 32% O2.
Abstract: This paper presents preliminary results on modeling
and control of a quadrotor UAV. With aerodynamic concepts, a
mathematical model is firstly proposed to describe the dynamics
of the quadrotor UAV. Parameters of this model are identified through
experiments with the Matlab Identification Toolbox. A group of PID controllers
are then designed based on the developed model. To verify
the developed model and controllers, simulations and experiments for
altitude control, position control and trajectory tracking are carried
out. The results show that the quadrotor UAV follows the reference
commands well, which clearly demonstrates the effectiveness of the
proposed approach.
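A PID altitude loop of the kind described can be sketched as follows. The gains and the unit-mass double-integrator plant are illustrative assumptions, not the identified quadrotor model from the paper:

```python
class PID:
    """Textbook discrete PID controller (the gains used below are
    illustrative, not the identified quadrotor parameters)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Altitude loop on a hypothetical unit-mass double integrator z'' = u - g,
# with gravity feed-forward; a stand-in for the real quadrotor dynamics.
dt, g = 0.01, 9.81
pid = PID(kp=8.0, ki=2.0, kd=5.0, dt=dt)
z, vz = 0.0, 0.0
for _ in range(3000):  # 30 s of simulated flight toward a 1 m setpoint
    u = pid.update(setpoint=1.0, measurement=z) + g
    vz += (u - g) * dt
    z += vz * dt
```

Position control and trajectory tracking would add similar loops on x and y, with the attitude controllers nested inside.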
Abstract: Both minimum energy consumption and smoothness, which
is quantified as a function of jerk, are generally needed in many
dynamic systems such as the automobile and the pick-and-place robot
manipulator that handles fragile equipment. Nevertheless, many
researchers concern themselves with either the minimum-energy or the
minimum-jerk trajectory alone. This research paper proposes a simple
yet interesting approach that combines the minimum energy and the
jerk, via an indirect-jerk approach, in designing the time-dependent
system, yielding an alternative optimal solution. Extremal solutions
for the cost functions of minimum energy, minimum jerk, and their
combination are found using dynamic optimization methods together
with numerical approximation. This allows us to simulate and compare,
visually and statistically, the time histories of the state inputs
produced by the combined minimum-energy-and-jerk designs. The
numerical solutions of the combined minimum direct-jerk and energy
problem are exactly the same; the solutions of the minimum-energy
problem alone are similar, especially in terms of tendency.
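One plausible discrete form of the combined cost is sketched below; the weights, discretization and quintic test profile are assumptions for illustration, since the paper's exact functional is not given in the abstract:

```python
import numpy as np

def combined_cost(x, dt, w_energy=1.0, w_jerk=1.0):
    """Discrete version of J = integral of (w_e*acc^2 + w_j*jerk^2) dt,
    treating acceleration as the control effort whose square measures
    energy, and its time derivative (jerk) as the smoothness penalty."""
    acc = np.diff(x, n=2) / dt**2    # finite-difference acceleration
    jerk = np.diff(x, n=3) / dt**3   # finite-difference jerk
    return w_energy * np.sum(acc**2) * dt + w_jerk * np.sum(jerk**2) * dt

t = np.linspace(0.0, 1.0, 101)
dt = t[1] - t[0]
# Classic rest-to-rest minimum-jerk point-to-point profile (quintic):
x_quintic = 10 * t**3 - 15 * t**4 + 6 * t**5
energy_part = combined_cost(x_quintic, dt, w_energy=1.0, w_jerk=0.0)
jerk_part = combined_cost(x_quintic, dt, w_energy=0.0, w_jerk=1.0)
```

For this quintic the analytic values are 120/7 ≈ 17.14 for the energy term and 720 for the jerk term; the finite-difference sums land close to them, with the jerk sum a little lower because the endpoints are not sampled.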
Abstract: Recently, data mining has been applied to scientific bibliographic databases to analyze the pathways of knowledge or the core scientific relevance of a Nobel laureate or a country. This specific case of data mining has been named citation mining, and it is the integration of citation bibliometrics and text mining. In this paper we present an improved web implementation of statistical-physics algorithms to perform the text-mining component of citation mining. In particular, we use an entropic-like distance between compressed texts as an indicator of the similarity between them. Finally, we have included the recently proposed h-index to characterize scientific production. We have used this web implementation to identify users, applications and the impact of the Mexican scientific institutions located in the State of Morelos.
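Both the compression-based distance and the h-index can be sketched briefly. The zlib compressor and the sample strings are illustrative choices, not the authors' exact implementation:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance, an entropic-like distance:
    similar texts compress better together than dissimilar ones."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

def h_index(citations):
    """h-index: the largest h such that h papers have >= h citations each."""
    cites = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(cites, start=1) if c >= i)

a = b"citation mining integrates bibliometrics and text mining " * 20
b = b"an unrelated sentence about oceanic grid generation methods " * 20
d_same = ncd(a, a)   # near 0: a text is maximally similar to itself
d_diff = ncd(a, b)   # larger: unrelated texts share little structure
```

Clustering documents by such a distance is what lets the text-mining component group citing papers without hand-built features.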
Abstract: User-based Collaborative filtering (CF), one of the
most prevailing and efficient recommendation techniques, provides
personalized recommendations to users based on the opinions of other
users. Although the CF technique has been successfully applied in
various applications, it suffers from serious sparsity problems. The
cloud-model approach addresses the sparsity problems by
constructing the user's global preference, represented by a cloud
eigenvector. The user-based CF approach works well with dense
datasets, while the cloud-model CF approach performs better when the
dataset is sparse. In this paper, we present a
hybrid approach that integrates the predictions from both the
user-based CF and the cloud-model CF approaches. The experimental
results show that the proposed hybrid approach can ameliorate the
sparsity problem and provide an improved prediction quality.
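A minimal sketch of the blend, assuming a cosine-similarity user-based predictor and treating the cloud-model prediction as a given number (the cloud-eigenvector construction is not reproduced here; the ratings matrix and the blending weight are illustrative):

```python
import numpy as np

def user_cf_predict(R, user, item, k=2):
    """User-based CF: predict R[user, item] from the k most similar
    users who rated the item (cosine similarity on co-rated items);
    0 marks an unrated item."""
    sims = []
    for u in range(R.shape[0]):
        if u == user or R[u, item] == 0:
            continue
        mask = (R[user] > 0) & (R[u] > 0)
        if not mask.any():
            continue
        a, b = R[user, mask], R[u, mask]
        sims.append((a @ b / (np.linalg.norm(a) * np.linalg.norm(b)), u))
    sims.sort(reverse=True)
    top = sims[:k]
    if not top:
        return 0.0
    return sum(s * R[u, item] for s, u in top) / sum(s for s, _ in top)

def hybrid_predict(p_user_cf, p_cloud_cf, w=0.5):
    """Weighted blend of the two predictors; w is an illustrative choice."""
    return w * p_user_cf + (1.0 - w) * p_cloud_cf

# Tiny ratings matrix: 3 users x 4 items, 0 = unrated.
R = np.array([
    [5, 3, 4, 0],
    [4, 3, 5, 2],
    [1, 1, 2, 5],
])
p_cf = user_cf_predict(R, user=0, item=3)
p_hybrid = hybrid_predict(p_cf, 3.0, w=0.5)
```

In practice the weight would be tuned (or made dependent on data density) so the cloud-model term dominates exactly where the neighborhood-based term has too few co-rated items.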
Abstract: The Kumamoto area, Kyushu, Japan covers 1,041 km2 and has a
population of about one million. It is the largest area in Japan that
depends on groundwater for all of its drinking water. Local
groundwater use is about 200 MCM per year. The main recharge area of
the groundwater is understood to lie in the rice-field zone, located
in the middle of the Shira River Basin, which has a high infiltration
rate of more than 100 mm/day of irrigated water. However, as the
paddy-rice planting area decreased through urbanization and an
acreage-reduction policy, the groundwater balance deteriorated.
Kumamoto city and four companies have therefore provided financial
support since 2004 to increase recharge to the groundwater by ponding
water in the fields. In this paper, the author reports on the recovery
of the groundwater due to this recharge and estimates the efficiency
of the recharge by statistical methods.
Abstract: The Spiral development model has been used successfully in
many commercial systems and in a good number of defense systems,
owing to its cost-effective incremental commitment of funds (via an
analogy of the spiral model to stud poker) and to the fact that it
can be used to develop hardware or to integrate software, hardware,
and systems. To support adaptive, semantic collaboration between
domain experts and knowledge engineers, a new knowledge engineering
process, called Spiral_OWL, is proposed. This model is based on the
idea of iterative refinement, annotation and structuring of the
knowledge base. The Spiral_OWL model is generated based on the spiral
model and knowledge engineering methodology. A central paradigm of
the Spiral_OWL model is its concentration on risk-driven
determination of the knowledge engineering process. The collaboration
aspect comes into play during the knowledge acquisition and knowledge
validation phases. The design rationale for the Spiral_OWL model is
an easy-to-implement, well-organized, iterative development cycle
that expands as a spiral.
Abstract: In mobile environments, unspecified numbers of transactions
arrive in continuous streams. To prove the correctness of their
concurrent execution, a method of modelling an infinite number of
transactions is needed. Standard database techniques model fixed
finite schedules of transactions. Lately, techniques based on temporal
logic have been proposed as suitable for modelling infinite schedules.
The drawback of these techniques is that proving the basic
serializability correctness condition is impractical, as encoding (the
absence of) conflict cyclicity within large sets of transactions results
in prohibitively large temporal logic formulae. In this paper, we show
that, under certain common assumptions on the graph structure of
data items accessed by the transactions, conflict cyclicity need only
be checked within all possible pairs of transactions. This results in
formulae of considerably reduced size in any temporal-logic-based
approach to proving serializability, and scales to arbitrary numbers
of transactions.
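The pairwise check can be sketched for finite schedules; the temporal-logic encoding for infinite schedules is not reproduced here, and this finite conflict-graph analogue is only an illustration of the idea:

```python
def conflict_edges(schedule):
    """schedule: list of (txn, op, item) in execution order, op in {'r','w'}.
    An edge ti -> tj means an operation of ti precedes and conflicts with
    one of tj (same data item, at least one of the two is a write)."""
    edges = set()
    for a, (ta, opa, ia) in enumerate(schedule):
        for tb, opb, ib in schedule[a + 1:]:
            if ta != tb and ia == ib and 'w' in (opa, opb):
                edges.add((ta, tb))
    return edges

def pairwise_serializable(schedule):
    """Check conflict cyclicity only within pairs: a 2-cycle
    ti -> tj -> ti witnesses non-serializability. Restricting the check
    to pairs is sound in general only under structural assumptions on
    the accessed data items, like those made in the paper."""
    edges = conflict_edges(schedule)
    return all((tj, ti) not in edges for ti, tj in edges)

# Classic lost-update interleaving on item x (not conflict-serializable):
s_bad = [("T1", "r", "x"), ("T2", "r", "x"), ("T1", "w", "x"), ("T2", "w", "x")]
# Serial execution of the same operations:
s_ok = [("T1", "r", "x"), ("T1", "w", "x"), ("T2", "r", "x"), ("T2", "w", "x")]
```

The payoff claimed in the abstract is exactly this reduction: a formula (or check) per pair of transactions instead of one over all subsets, which is what keeps the temporal-logic encoding small.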
Abstract: This paper presents the design and implementation of
the WebGD, a CORBA-based document classification and retrieval
system on the Internet. The WebGD makes use of such techniques as the
Web, CORBA, Java, NLP, fuzzy techniques, knowledge-based processing
and database technology. Its main features include a unified
classification and retrieval model, classification and retrieval with
a single reasoning engine, and flexible working-mode configuration. The
architecture of WebGD, the unified classification and retrieval model,
the components of the WebGD server and the fuzzy inference engine
are discussed in this paper in detail.
Abstract: This paper describes an automatic algorithm to restore
the shape of three-dimensional (3D) left ventricle (LV) models created
from magnetic resonance imaging (MRI) data using a geometry-driven
optimization approach. Our basic premise is to restore the LV shape
such that the LV epicardial surface is smooth after the restoration. A
geometrical measure known as the Minimum Principal Curvature (κ2)
is used to assess the smoothness of the LV. This measure is used to
construct the objective function of a two-step optimization process.
The objective of the optimization is to achieve a smooth epicardial
shape by iterative in-plane translation of the MRI slices.
Quantitatively, this yields a minimum sum of the magnitudes of κ2
where κ2 is negative. A limited-memory quasi-Newton algorithm,
L-BFGS-B, is used to solve the optimization problem. We tested our
algorithm on an in vitro theoretical LV model and 10 in vivo
patient-specific models which contain significant motion artifacts. The
results show that our method is able to automatically restore the shape
of LV models back to smoothness without altering the general shape of
the model. The magnitudes of in-plane translations are also consistent
with existing registration techniques and experimental findings.
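The optimization step can be sketched in one dimension. This analogue is purely illustrative (an offset per slice and a second-difference smoothness cost, not the LV mesh or the κ2 measure), but it shows the L-BFGS-B usage:

```python
import numpy as np
from scipy.optimize import minimize

# 1-D analogue of the geometry-driven idea: each MRI "slice" i has an
# in-plane offset x[i]; the objective is the summed squared discrete
# curvature (second difference) of the offset profile, so minimizing
# it restores a smooth stack from motion-corrupted offsets.
rng = np.random.default_rng(0)
corrupted = rng.normal(0.0, 1.0, 10)   # simulated motion-artifact offsets

def smoothness_cost(x):
    curvature = np.diff(x, n=2)        # discrete second derivative
    return np.sum(curvature ** 2)

res = minimize(smoothness_cost, corrupted, method="L-BFGS-B")
restored = res.x
```

In the real problem the variables are the in-plane translations of the slices and the cost is built from κ2 on the epicardial surface, but the solver call has the same shape.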
Abstract: The aim of this research is to design a collaborative
framework that integrates risk analysis activities into the geospatial
database design (GDD) process. Risk analysis is rarely undertaken
iteratively as part of the present GDD methods in conformance to
requirement engineering (RE) guidelines and risk standards.
Accordingly, when risk analysis is performed during the GDD, some
foreseeable risks may be overlooked and not reach the output
specifications especially when user intentions are not systematically
collected. This may lead to ill-defined requirements and ultimately
to higher risks of geospatial data misuse. The adopted approach consists
of 1) reviewing risk analysis process within the scope of RE and
GDD, 2) analyzing the challenges of risk analysis within the context
of GDD, and 3) presenting the components of a risk-based
collaborative framework that improves the collection of the
intended/forbidden usages of the data and helps geo-IT experts to
discover implicit requirements and risks.
Abstract: In order to develop forest management strategies for
tropical forests in Malaysia, surveying the forest resources and
monitoring the forest areas affected by logging activities are
essential. Tremendous effort has been made in the classification of
land cover related to forest resource management in this country, as
it is a priority in all aspects of forest mapping using remote
sensing and related technology such as GIS. In fact, classification
is a compulsory step in any remote sensing research. Therefore, the
main objective of this paper is to assess the classification accuracy
of a classified forest map from Landsat TM data using different numbers of
reference data (200 and 388 reference data). This comparison was
made through observation (200 reference data), and interpretation
and observation approaches (388 reference data). Five land cover
classes, namely primary forest, logged-over forest, water bodies,
bare land and agricultural crop/mixed horticulture, can be identified
by differences in spectral wavelength. Results showed that the overall
accuracy from 200 reference data was 83.5 % (kappa value
0.7502459; kappa variance 0.002871), which was considered
acceptable or good for optical data. However, when the 200 reference
data were increased to 388 in the confusion matrix, the accuracy
improved slightly from 83.5% to 89.17%, with the Kappa statistic
increasing from 0.7502459 to 0.8026135. The accuracy achieved in this
classification suggests that the strategy for the selection of
training areas, the interpretation approaches and the number of
reference data used are important for producing better classification
results.
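The two accuracy measures reported above are standard and can be computed directly from a confusion matrix. The 3-class matrix below is illustrative, since the paper's 5-class matrix is not reproduced in the abstract:

```python
import numpy as np

def overall_accuracy(cm):
    """Overall accuracy: correctly classified samples over all samples."""
    return np.trace(cm) / cm.sum()

def kappa(cm):
    """Cohen's Kappa from a confusion matrix: chance-corrected agreement,
    (p_o - p_e) / (1 - p_e), with p_e from the row/column marginals."""
    n = cm.sum()
    p_o = np.trace(cm) / n
    p_e = np.sum(cm.sum(axis=0) * cm.sum(axis=1)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Illustrative 3-class matrix (rows = reference, columns = classified).
cm = np.array([
    [50,  2,  3],
    [ 4, 40,  1],
    [ 2,  3, 45],
])
acc = overall_accuracy(cm)
k = kappa(cm)
```

Kappa being lower than overall accuracy, as in the paper's figures (0.75 vs 0.835 and 0.80 vs 0.8917), is expected: Kappa discounts the agreement that would occur by chance.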
Abstract: Construction projects generally take place in
uncontrolled and dynamic environments where construction waste is
a serious environmental problem in many large cities. The total
amount of waste and carbon dioxide emissions from transportation
vehicles are still out of control due to increasing construction
projects, massive urban development projects and the lack of
effective tools for minimizing adverse environmental impacts in
construction. This research is about utilization of the integrated
applications of automated advanced tracking and data storage
technologies in the area of environmental management to monitor
and control adverse environmental impacts such as construction
waste and carbon dioxide emissions. Radio Frequency Identification
(RFID) integrated with the Global Position System (GPS) provides
an opportunity to uniquely identify materials, components, and
equipment and to locate and track them using minimal or no worker
input. The transmission of data to the central database will be carried
out with the help of Global System for Mobile Communications
(GSM).
Abstract: Statistical distributions are used to model the nature of
various types of data sets. Although these distributions are mostly
uni-modal, it is quite common to see multiple modes in the observed
distribution of the underlying variables, which makes precise
modeling unrealistic. The observed data may lack smoothness not
necessarily because of randomness, but because of non-randomness,
resulting in zigzag curves, oscillations, humps, etc. The present
paper argues that trigonometric functions, which have not been used
in the probability functions of distributions so far, have the
potential to take care of this if incorporated in the distribution
appropriately. A simple distribution (named the Sinoform
Distribution), involving trigonometric functions, is illustrated in the
paper with a data set. The paper demonstrates the importance of
trigonometric functions, which have the ability to make statistical
distributions exotic. It is possible to have multiple modes,
oscillations and zigzag curves in the density, which may be suitable
for explaining the underlying nature of selected data sets.
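One way a trigonometric term can enter a density is sketched below. This toy construction is an illustration in the spirit described, not the paper's actual Sinoform density, which is not reproduced in the abstract:

```python
import numpy as np

# A flat density on [0, 1] modulated by a sine term: because the sine
# integrates to zero over whole periods, f remains a valid density
# (non-negative and integrating to 1), and it has k modes.
a, k = 0.8, 3  # amplitude |a| < 1 keeps f non-negative; k sets the modes

def sinoform_pdf(x):
    return 1.0 + a * np.sin(2.0 * np.pi * k * x)

x = np.linspace(0.0, 1.0, 10001)
f = sinoform_pdf(x)
area = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x))  # trapezoidal rule
```

With k = 3 the density has three modes, exactly the kind of multi-modal, oscillating shape a single conventional uni-modal family cannot produce.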
Abstract: Model Predictive Control (MPC) is increasingly being
proposed for real-time applications and embedded systems. However,
compared to the PID controller, the implementation of MPC in
miniaturized devices such as Field Programmable Gate Arrays (FPGAs)
and microcontrollers has historically been very limited due to its
implementation complexity and computation-time requirements.
At the same time, such embedded technologies have become an
enabler for future manufacturing enterprises as well as a transformer
of organizations and markets. Recently, advances in microelectronics
and software have allowed such techniques to be implemented in
embedded systems. In this work, we take advantage of these recent
advances to deploy one of the most studied and applied control
techniques in industrial engineering. Specifically, in this paper we
propose an efficient framework for the implementation of Generalized
Predictive Control (GPC) on the STM32 microcontroller. The STM32 Keil
starter kit, based on a JTAG interface and the STM32 board, was used
to implement the proposed GPC firmware. Besides the GPC, a PID
anti-windup algorithm was also implemented using the Keil development
tools designed for ARM processor-based microcontroller devices,
working with the C/C++ language. A performance comparison study
between the two firmwares shows good execution speed and low
computational burden. These results encourage the development of
simple predictive algorithms to be programmed on industrial standard
hardware.
The main features of the proposed framework are illustrated
through two examples and compared with the anti-windup PID
controller.
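The anti-windup PID used as the comparison baseline can be sketched as follows. The firmware itself is written in C for the STM32; this Python version uses back-calculation, one common anti-windup scheme, with an illustrative first-order plant and gains:

```python
class AntiWindupPID:
    """PID with back-calculation anti-windup: when the actuator
    saturates, the integrator is pulled back in proportion to the gap
    between the saturated and unsaturated outputs, so it cannot wind up
    while the actuator is at its limit. Gains are illustrative."""
    def __init__(self, kp, ki, kd, dt, u_min, u_max, kb=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.u_min, self.u_max, self.kb = u_min, u_max, kb
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        u_sat = min(max(u, self.u_min), self.u_max)
        # Back-calculation: bleed the integrator while saturated.
        self.integral += (error + self.kb * (u_sat - u)) * self.dt
        return u_sat

# First-order plant x' = -x + u with a saturated actuator.
dt = 0.01
pid = AntiWindupPID(kp=2.0, ki=1.0, kd=0.0, dt=dt, u_min=-1.5, u_max=1.5)
x = 0.0
for _ in range(2000):  # 20 s toward a unit setpoint
    u = pid.update(setpoint=1.0, measurement=x)
    x += (-x + u) * dt
```

A GPC firmware replaces the three-term law with a receding-horizon prediction and a small quadratic solve each sample, which is where the extra execution time measured in the comparison comes from.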