Abstract: The purpose of this study was to present a reliable means of human-computer interfacing based on finger gestures made in two dimensions, which could be interpreted and used to control a remote robot's movement. The gestures were captured and interpreted by an algorithm based on trigonometric functions that calculates the angular displacement from one point of touch to the next as the user's finger moves within a time interval, thereby allowing the captured gesture's pattern to be spotted. This paper presents the design and implementation of such a gesture-based user interface built on this algorithm, which was then used to control a remote mobile robot's movement. A resistive touch screen was selected as the gesture sensor, with a programmed microcontroller interpreting the captured gestures.
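The angular-displacement idea described above can be sketched in a few lines of Python. The function names and the four-direction labelling below are illustrative assumptions, not the paper's actual firmware; note also that real touch screens usually have the y-axis growing downward, which would flip "up" and "down".

```python
import math

def angular_displacement(p0, p1):
    """Angle (degrees) of the motion vector from touch point p0 to p1,
    measured counter-clockwise from the positive x-axis."""
    dx = p1[0] - p0[0]
    dy = p1[1] - p0[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

def classify_gesture(points):
    """Map a sequence of sampled touch points to a coarse direction
    (right/up/left/down) by averaging successive angular displacements."""
    angles = [angular_displacement(a, b) for a, b in zip(points, points[1:])]
    # Average via unit vectors so that e.g. 359 deg and 1 deg do not cancel.
    sx = sum(math.cos(math.radians(a)) for a in angles)
    sy = sum(math.sin(math.radians(a)) for a in angles)
    mean = math.degrees(math.atan2(sy, sx)) % 360.0
    for label, centre in (("right", 0), ("up", 90), ("left", 180), ("down", 270)):
        if min(abs(mean - centre), 360 - abs(mean - centre)) <= 45:
            return label
    return "right"  # wrap-around case near 360 deg
```

A microcontroller version would use the same `atan2`-based computation on successive touch samples taken at a fixed interval.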
Abstract: Cosmic rays, during their transit through space, produce sub-products as a result of interactions with the intergalactic or interstellar medium which, after entering the Earth's atmosphere, generate cascades of secondary particles called Extensive Air Showers (EAS). Detection and analysis of high-energy particle showers involve a plethora of theoretical and experimental work with a host of constraints that introduce inaccuracies into the measurements. There is therefore a need to develop a readily available system based on soft-computational approaches which can be used for EAS analysis, since soft-computing tools such as Artificial Neural Networks (ANNs) can be trained as classifiers to adapt to and learn the surrounding variations. Single classifiers, however, fail to reach optimal decision making in many situations, for which Multiple Classifier Systems (MCS) are preferred to enhance the system's ability to adjust its decisions to finer variations. This work describes the formation of an MCS using a Multi-Layer Perceptron (MLP), a Recurrent Neural Network (RNN) and a Probabilistic Neural Network (PNN), with data inputs from correlation-mapping Self-Organizing Map (SOM) blocks and the output optimized by another SOM. The results show that the setup can be adopted for real-time practical applications for predicting the primary energy and location of an EAS from density values captured using detectors in a circular grid.
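As a toy illustration of the multiple-classifier idea, the member networks' class posteriors can be fused by a combiner. The sketch below uses simple posterior averaging as a hypothetical stand-in for the SOM-optimized combiner described above; the class names are invented for illustration.

```python
def mcs_combine(posteriors):
    """Fuse the outputs of several member classifiers (e.g. MLP, RNN,
    PNN) by averaging their per-class posteriors; the class with the
    highest average wins. A simplified stand-in for the SOM combiner."""
    n = len(posteriors)
    classes = posteriors[0].keys()
    avg = {c: sum(p[c] for p in posteriors) / n for c in classes}
    return max(avg, key=avg.get), avg
```

A member classifier that is unsure (a flat posterior) then contributes little to the final decision, which is the usual motivation for fusing classifiers.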
Abstract: This paper tests the level of market integration between the Malaysia and Singapore stock markets and the world market. The Kalman Filter (KF) methodology is applied to the International Capital Asset Pricing Model (ICAPM), and the pricing errors estimated within the ICAPM framework are used as a measure of market integration or segmentation. The advantage of the KF technique is that it allows for time-varying coefficients in estimating the ICAPM and hence is able to capture the varying degree of market integration. Empirical results show clear evidence of a varying degree of market integration for both Malaysia and Singapore. Furthermore, the results show that changes in the level of market integration coincide with certain economic events that took place. The findings provide evidence of the practicability of the KF technique for estimating stock market integration. In the comparison between the Malaysia and Singapore stock markets, the trends of the market integration indices look similar through time, but the magnitudes are notably different, with the Malaysia stock market showing a greater degree of market integration. Finally, the significant evidence of a varying degree of market integration shows that OLS is inappropriate for estimating the level of market integration.
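The time-varying coefficient estimation described above can be sketched with a scalar Kalman filter in which the ICAPM beta follows a random walk; the innovation at each step plays the role of the pricing error. The noise variances `q` and `r` and the initialization below are illustrative assumptions, not the paper's calibrated values.

```python
def kalman_tv_beta(returns, market, q=1e-4, r=1e-2):
    """Scalar Kalman filter for a random-walk time-varying beta in
    r_t = beta_t * m_t + e_t,  beta_t = beta_{t-1} + w_t,
    with q = Var(w) and r = Var(e). Returns filtered betas and the
    innovations (pricing errors)."""
    beta, p = 0.0, 1.0           # vague initial state
    betas, errors = [], []
    for y, m in zip(returns, market):
        p = p + q                # predict step: state variance grows
        e = y - beta * m         # innovation = pricing error under ICAPM
        s = m * m * p + r        # innovation variance
        k = p * m / s            # Kalman gain
        beta = beta + k * e      # measurement update
        p = (1 - k * m) * p
        betas.append(beta)
        errors.append(e)
    return betas, errors
```

Unlike OLS, which fits one constant beta over the whole sample, the filtered `betas` trace out the varying degree of integration through time.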
Abstract: This paper invites dialogue and reflection on innovation and entrepreneurship by presenting concepts of innovation leading to the introduction of a complex theoretical framework: Cooperative Innovation (CO-IN). CO-IN is a didactic model enhancing and scaffolding processes of cooperation that create innovation, drawing on a Scandinavian tradition. CO-IN is based on a cross-sectorial and multidisciplinary approach. We introduce the concept of complementarity to help capture the validity of diversity, and we suggest the concept of "the space in between" to understand the creation of identity as a collective mind. We see dialogue and the use of multimodal techniques as essential tools for conceptualization, making it possible to clarify the complexity and diversity that lead to decision making based on knowledge as commons. We introduce the didactic design and present our empirical findings from an innovation workshop in Argentina. In a final section we reflect on the design as a support for the development of common ground, collective mind and collective action, and for the creation of knowledge as commons to facilitate innovation and entrepreneurship.
Abstract: The following paper presents an interactive tool whose main purpose is to teach how to play a flute. It consists of three stages: the first is the instruction and teaching process through a software application; the second is the practice stage, in which the user starts to play the flute (hardware specially designed for this application), which is capable of capturing how it is being played; and the final stage is the one in which the captured data are sent to the software and the user is evaluated in order to give him or her a correction or an acceptance.
Abstract: One of the most used assumptions in logic programming
and deductive databases is the so-called Closed World Assumption
(CWA), according to which the atoms that cannot be inferred
from the programs are considered to be false (i.e. a pessimistic
assumption). One of the most successful semantics of conventional
logic programs based on the CWA is the well-founded semantics.
However, the CWA is not applicable in all circumstances when
information is handled. That is, the well-founded semantics, if
conventionally defined, would behave inadequately in different cases.
The solution we adopt in this paper is to extend the well-founded
semantics in order for it to be based also on other assumptions. The
basis of (default) negative information in the well-founded semantics
is given by the so-called unfounded sets. We extend this concept
by considering optimistic, pessimistic, skeptical and paraconsistent
assumptions, used to complete missing information from a program.
Our semantics, called the extended well-founded semantics, also expresses imperfect information considered to be missing/incomplete, uncertain and/or inconsistent, by using bilattices as multivalued logics. We provide a method of computing the extended well-founded semantics, show that the Kripke-Kleene semantics is captured by considering a skeptical assumption, and show that our semantics can be computed in polynomial time.
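For conventional programs (before the extension with bilattice-based assumptions), the well-founded semantics can be computed by Van Gelder's alternating-fixpoint construction, which repeatedly applies the reduct operator Gamma. A small Python sketch for propositional programs follows; the rule encoding is our own, not the paper's notation.

```python
def gamma(rules, interp):
    """Least model of the reduct of `rules` w.r.t. `interp`: a negative
    literal `not a` is taken to hold iff a is not in `interp`.
    Each rule is (head, positive_body_atoms, negative_body_atoms)."""
    true, changed = set(), True
    while changed:
        changed = False
        for head, pos, neg in rules:
            if head not in true and pos <= true and not (neg & interp):
                true.add(head)
                changed = True
    return true

def well_founded(rules, atoms):
    """Iterate Gamma^2 from the empty set: its least fixpoint gives the
    true atoms; atoms outside Gamma(lfp) form the greatest unfounded
    set and are false; all remaining atoms are undefined."""
    t = set()
    while True:
        t2 = gamma(rules, gamma(rules, t))
        if t2 == t:
            break
        t = t2
    possibly_true = gamma(rules, t)
    return t, atoms - possibly_true
```

For the program `p :- not q. q :- not p. r :- not s.`, the atoms `p` and `q` (a negative cycle) stay undefined, while `r` comes out true and `s` false, which is exactly the CWA-style behaviour the abstract describes.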
Abstract: A power transformer consists of components which are under constant thermal and electrical stresses. The major
component which degrades under these stresses is the paper
insulation of the power transformer. At site, lightning impulses and
cable faults may cause the winding deformation. In addition, the
winding may deform due to impact during transportation. A
deformed winding will exert more stress on its insulating paper and thus degrade it. Insulation degradation will shorten the life-span of the transformer. Currently there are two methods of detecting winding deformation: Sweep Frequency Response Analysis (SFRA) and the Low Voltage Impulse (LVI) test. The latter injects current pulses into the winding and captures the admittance plot. In this paper, a transformer which experienced overheating and
arcing was identified, and both SFRA and LVI were performed.
Next, the transformer was brought to the factory for untanking. The
untanking results revealed that the LVI is more accurate than the
SFRA method for this case study.
Abstract: This paper proposes a method, combining color and layout features, for identifying documents captured from low-resolution handheld devices. On one hand, the document image color density surface is estimated and represented with an equivalent ellipse; on the other hand, the document's shallow layout structure is computed and hierarchically represented. The combined color and layout features are arranged in a symbolic file, which is unique for each document and is called the document's visual signature. Our identification method first uses the color information in the signatures to focus the search space on documents having a similar color distribution, and then selects the document having the most similar layout structure in the remaining search space. Our experiments consider slide documents, which are often captured using handheld devices.
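One common way to reduce a 2-D density surface to an "equivalent ellipse" is through its first- and second-order moments; the abstract does not spell out the exact fitting procedure, so the moment-based estimator below is a plausible stand-in rather than the paper's method.

```python
import math

def equivalent_ellipse(density):
    """Summarize a 2-D density surface (list of rows of non-negative
    values) by its moment-equivalent ellipse: centroid, axis lengths,
    and orientation (radians from the x-axis)."""
    m00 = mx = my = 0.0
    for y, row in enumerate(density):
        for x, v in enumerate(row):
            m00 += v
            mx += v * x
            my += v * y
    cx, cy = mx / m00, my / m00
    mu20 = mu02 = mu11 = 0.0        # central second-order moments
    for y, row in enumerate(density):
        for x, v in enumerate(row):
            mu20 += v * (x - cx) ** 2
            mu02 += v * (y - cy) ** 2
            mu11 += v * (x - cx) * (y - cy)
    mu20, mu02, mu11 = mu20 / m00, mu02 / m00, mu11 / m00
    # eigenvalues of the covariance matrix give the squared semi-axes
    common = math.sqrt(((mu20 - mu02) / 2) ** 2 + mu11 ** 2)
    l1 = (mu20 + mu02) / 2 + common
    l2 = (mu20 + mu02) / 2 - common
    theta = 0.5 * math.atan2(2 * mu11, mu20 - mu02)
    return (cx, cy), (2 * math.sqrt(l1), 2 * math.sqrt(max(l2, 0.0))), theta
```

The resulting triple (centroid, axes, orientation) is compact enough to store in a symbolic signature file and cheap to compare across documents.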
Abstract: This research proposes an algorithm for the simulation of time-periodic unsteady problems via the solution of the unsteady Euler and Navier-Stokes equations. This algorithm, called the Time Spectral method, uses a Fourier representation in time and hence solves for the periodic state directly, without resolving the transients that consume most of the resources in a time-accurate scheme. The mathematical tools used here are discrete Fourier transforms. By enforcing periodicity and using a Fourier representation in time, leading to spectral accuracy, the method has shown tremendous potential for reducing the computational cost compared to conventional time-accurate methods. The accuracy and efficiency of this technique are verified by Euler and Navier-Stokes calculations for pitching airfoils. Because of the turbulent nature of the flow, the Baldwin-Lomax turbulence model has been used in the viscous flow analysis. The results of the Time Spectral method are compared with experimental data and verify that only a small number of time intervals per pitching cycle is required to capture the flow physics.
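The core of the Time Spectral idea, differentiating in time through a discrete Fourier representation, can be sketched for a scalar periodic signal. The pure-Python O(N^2) DFT below is for clarity only; a production solver would couple this spectral time-derivative operator to the flow residual at every spatial point.

```python
import cmath
import math

def spectral_time_derivative(u, period=2 * math.pi):
    """Differentiate a periodic signal sampled at N equispaced points:
    forward DFT, multiply each mode by i*k*(2*pi/period), inverse DFT.
    For smooth periodic data this is spectrally accurate."""
    n = len(u)
    # forward DFT (normalized)
    U = [sum(u[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n)) / n
         for k in range(n)]
    base = 2 * math.pi / period
    du = []
    for j in range(n):
        s = 0
        for k in range(n):
            kk = k if k <= n // 2 else k - n   # signed wavenumber
            s += 1j * kk * base * U[k] * cmath.exp(2j * math.pi * k * j / n)
        du.append(s.real)
    return du
```

With an odd number of samples, a single harmonic is differentiated exactly, which is why so few time intervals per pitching cycle suffice once the flow is dominated by a handful of harmonics.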
Abstract: A fusion classifier composed of two modules, one made by a hidden Markov model (HMM) and the other by a support vector machine (SVM), is proposed to recognize faces with pose variations in open-set recognition settings. The HMM module captures the evolution of facial features across a subject's face using the subject's facial images only, without reference to the faces of others. Because of the captured evolutionary process of facial features, the HMM module retains a certain robustness against pose variations, yielding low false rejection rates (FRR) for recognizing faces across poses. This is, however, at the price of poor false acceptance rates (FAR) when recognizing other faces, because it is built upon within-class samples only. The SVM module in the proposed model is developed following a special design able to substantially diminish the FAR and further lower the FRR. The proposed fusion classifier has been evaluated using the CMU PIE database and proven effective for open-set face recognition with pose variations. Experiments have also shown that it outperforms a face classifier made by an HMM or an SVM alone.
Abstract: Distant-talking voice-based HCI systems suffer from performance degradation due to the mismatch between the acoustic speech (runtime) and the acoustic model (training). The mismatch is
caused by the change in the power of the speech signal as observed at
the microphones. This change is greatly influenced by the change in
distance, affecting speech dynamics inside the room before reaching
the microphones. Moreover, as the speech signal is reflected, its
acoustical characteristic is also altered by the room properties. In
general, power mismatch due to distance is a complex problem. This
paper presents a novel approach in dealing with distance-induced
mismatch by intelligently sensing instantaneous voice power variation
and compensating model parameters. First, the distant-talking speech
signal is processed through microphone array processing, and the
corresponding distance information is extracted. Distance-sensitive
Gaussian Mixture Models (GMMs), pre-trained to capture both
speech power and room property are used to predict the optimal
distance of the speech source. Consequently, pre-computed statistical priors corresponding to the optimal distance are selected to correct the statistics of the generic model, which was frozen during training. The combined model parameters are thus post-conditioned to match the power of the instantaneous speech acoustics at runtime. This results in an improved likelihood of predicting the correct speech command at
farther distances. We experiment using real data recorded inside two
rooms. Experimental evaluation shows voice recognition performance
using our method is more robust to the change in distance compared
to the conventional approach. In our experiment, under the most
acoustically challenging environment (i.e., Room 2: 2.5 meters), our
method achieved 24.2% improvement in recognition performance
against the best-performing conventional method.
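The distance-selection step can be illustrated as a likelihood comparison over pre-trained, distance-tagged models. For brevity the sketch below models each distance with a single 1-D Gaussian over received power; the paper uses full distance-sensitive GMMs, and the numbers in `models` are invented.

```python
import math

def gaussian_loglik(x, mean, var):
    """Log-likelihood of observation x under a 1-D Gaussian."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def predict_distance(power_db, distance_models):
    """Pick the speaker distance whose pre-trained model best explains
    the instantaneous voice power (simplified stand-in for the
    distance-sensitive GMMs described above)."""
    return max(distance_models,
               key=lambda d: gaussian_loglik(power_db, *distance_models[d]))

# hypothetical models: distance (m) -> (mean received power in dB, variance)
models = {0.5: (-20.0, 4.0), 1.5: (-28.0, 4.0), 2.5: (-34.0, 4.0)}
```

Once the most likely distance is chosen, the priors pre-computed for that distance are the ones used to post-condition the frozen generic model.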
Abstract: As part of the development of a numerical method for close-capture exhaust systems for machining devices, a test rig recreating a situation similar to a grinding operation, but in a
perfectly controlled environment, is used. The properties of the
obtained spray of solid particles are initially characterized using
particle tracking velocimetry (PTV), in order to obtain input and
validation parameters for numerical simulations. The dispersion of a
tracer gas (SF6) emitted simultaneously with the particle jet is then
studied experimentally, as the dispersion of such a gas is
representative of that of finer particles, whose aerodynamic response
time is negligible. Finally, complete modeling of the test rig is
achieved to allow comparison with experimental results and thus to
progress towards validation of the models used to describe a two-phase flow generated by a machining operation.
Abstract: An improved processing description to be employed in biosonar signal processing in a cochlea model is proposed and examined. It is compared to conventional models using a modified discrimination analysis, and both are tested. Their performances are evaluated with echo data captured from natural targets (trees). Results indicate that the phase characteristics of the low-pass filters employed in the echo processing have a significant effect on class separability for this data.
Abstract: Lake Nasser is one of the largest reservoirs in the
world. Over 120 million metric tons of sediments are deposited in its
dead storage zone every year. The main objective of the present work
was to determine the physical and chemical characteristics of Lake
Nasser sediments. The sample had a relatively low surface area of 2.9
m²/g, which increased more than 3-fold upon chemical activation. The
main chemical elements of the raw sediments were C, O and Si with
some traces of Al, Fe and Ca. The organic functional groups for the
tested sample included O-H, C=C, C-H and C-O, with indications of
Si-O and other metal-C and/or metal-O bonds normally associated
with clayey materials. Potentiometric titration of the sample in
different ionic strength backgrounds revealed an alkaline material with
very strong positive surface charge at pH values just a little less than
the pH of zero charge which is ~9. Surface interactions of the
sediments with the background electrolyte were significant. An
advanced surface complexation model was able to capture these
effects, employing a single-site approach to represent protolysis
reactions in aqueous solution, and to determine the significant surface
species in the pH range of environmental interest.
Abstract: The performance of sucrose-based H2 production in a completely stirred tank reactor (CSTR) was modeled by a neural network back-propagation (BP) algorithm. The H2 production was monitored over a period of 450 days at 35±1 °C. The proposed model
predicts H2 production rates based on hydraulic retention time
(HRT), recycle ratio, sucrose concentration and degradation, biomass
concentrations, pH, alkalinity, oxidation-reduction potential (ORP),
acids and alcohols concentrations. Artificial neural networks (ANNs)
have an ability to capture non-linear information very efficiently. In
this study, a predictive controller was proposed for management and
operation of large scale H2-fermenting systems. The relevant control
strategies can be activated by this method. The BP-based ANN modeling results were very successful, and an excellent match was obtained between the measured and the predicted rates. Efficient H2 production and system control can be provided by the predictive control method combined with the robust BP-based ANN modeling tool.
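Back-propagation training of the kind used above can be sketched with a tiny one-input network. The paper's actual model takes many inputs (HRT, pH, ORP, concentrations, etc.); the architecture, learning rate and toy target below are illustrative assumptions only.

```python
import math
import random

def train_bp(samples, hidden=5, lr=0.05, epochs=5000, seed=1):
    """One-hidden-layer perceptron trained with classic back-propagation:
    stochastic per-sample updates, tanh hidden units, linear output."""
    rnd = random.Random(seed)
    w1 = [rnd.uniform(-1, 1) for _ in range(hidden)]   # input -> hidden
    b1 = [0.0] * hidden
    w2 = [rnd.uniform(-1, 1) for _ in range(hidden)]   # hidden -> output
    b2 = 0.0
    for _ in range(epochs):
        for x, t in samples:
            h = [math.tanh(w1[i] * x + b1[i]) for i in range(hidden)]
            y = sum(w2[i] * h[i] for i in range(hidden)) + b2
            err = y - t                                 # dE/dy for E = err^2/2
            for i in range(hidden):
                grad_h = err * w2[i] * (1 - h[i] ** 2)  # error propagated back
                w2[i] -= lr * err * h[i]
                w1[i] -= lr * grad_h * x
                b1[i] -= lr * grad_h
            b2 -= lr * err
    def predict(x):
        return sum(w2[i] * math.tanh(w1[i] * x + b1[i])
                   for i in range(hidden)) + b2
    return predict
```

The same loop generalizes to vector inputs, which is how a trained net can map operating conditions to predicted H2 production rates for use inside a predictive controller.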
Abstract: This paper introduces an intelligent system which can be applied to the monitoring of vehicle speed using a single camera. The ability to track motion is extremely useful in many automation problems, and the solution to this problem will open up many future applications. One of the most common problems in our daily life is the speed detection of vehicles on a highway. In this paper, a novel technique is developed to track multiple moving objects and estimate their speeds using a sequence of video frames. A field test has been conducted to capture real-life data, and the processed results are presented. Multiple-object problems and noise in the data are also considered. Implementing this system in real time is straightforward. The proposed method can accurately evaluate the position and orientation of moving objects in real time. The transformations and calibration between the 2D image and the actual road are also considered.
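Once the image-to-road calibration is known, the speed estimate reduces to displacement over elapsed time. A minimal sketch under a constant meters-per-pixel assumption follows; the paper's actual transformation handles perspective, which this toy does not.

```python
def estimate_speed_kmh(track, fps, meters_per_pixel):
    """Estimate vehicle speed from a tracked image trajectory.
    `track` is a list of (x, y) pixel centroids, one per video frame;
    `meters_per_pixel` comes from camera-to-road calibration (assumed
    constant here, i.e. a top-down or pre-rectified view)."""
    if len(track) < 2:
        return 0.0
    (x0, y0), (x1, y1) = track[0], track[-1]
    pixels = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    seconds = (len(track) - 1) / fps
    return pixels * meters_per_pixel / seconds * 3.6   # m/s -> km/h
```

Using the first and last centroid of the track averages out frame-to-frame tracking noise, at the cost of assuming roughly constant speed over the window.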
Abstract: Owing to stringent environmental legislation,
CO2 capture and sequestration is one of the viable solutions to reduce
the CO2 emissions from various sources. In this context, Ionic liquids
(ILs) are being investigated as suitable absorption media for CO2
capture. Due to their non-evaporative, non-toxic, and non-corrosive
nature, these ILs have the potential to replace the existing solvents
like aqueous amine solutions for CO2 separation technologies. Thus,
the present work aims at studying important aspects such as the interactions of the CO2 molecule with different anions (F-, Br-, Cl-, NO3-, BF4-, PF6-, Tf2N-, and CF3SO3-) that are commonly used in ILs, through molecular modeling. Here, the minimum-energy structures have been obtained using ab initio calculations at the MP2 (Møller-Plesset perturbation) level. The results revealed various degrees of distortion of the CO2 molecule (from its linearity) with the anions studied, most likely due to the Lewis acid-base interactions between CO2 and the anion. Furthermore, binding energies for the anion-CO2 complexes were also calculated. The implication of anion-CO2 interactions for the solubility of CO2 in ionic liquids is also discussed.
Abstract: Recommender systems are usually regarded as an important marketing tool in e-commerce. They use important information about users to facilitate accurate recommendation. This information includes user context such as location, time and interest, for the personalization of mobile users. Information about location and time is easily collected because mobile devices communicate with the base station of the service provider. However, information about user interest cannot be easily collected, because user interest cannot be captured automatically without the user's approval. User interest is usually represented as a need. In this study, we classify needs into two types according to prior research, and investigate the usefulness of data mining techniques for classifying user need type for recommendation systems. We employ several data mining techniques including artificial neural networks, decision trees, case-based reasoning, and multivariate discriminant analysis. Experimental results show that the CHAID algorithm outperforms the other models for classifying user need type. This study performs the McNemar test to examine the statistical significance of the differences between the classification results; its results also show that CHAID performs better than the other models with statistical significance.
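CHAID grows trees by choosing the split most strongly associated with the target according to a chi-square test. A stripped-down illustration of that split criterion follows (raw statistic instead of Bonferroni-adjusted p-values, no category merging, and field names invented for the example):

```python
def chi_square(table):
    """Pearson chi-square statistic for a contingency table (list of rows)."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    n = sum(row_tot)
    stat = 0.0
    for i, r in enumerate(table):
        for j, obs in enumerate(r):
            exp = row_tot[i] * col_tot[j] / n
            if exp > 0:
                stat += (obs - exp) ** 2 / exp
    return stat

def best_chaid_split(records, features, target):
    """Pick the categorical feature most associated with the target:
    CHAID's core split-selection idea, greatly simplified."""
    def stat(f):
        rows = sorted({r[f] for r in records})
        cols = sorted({r[target] for r in records})
        table = [[sum(1 for r in records if r[f] == rv and r[target] == cv)
                  for cv in cols] for rv in rows]
        return chi_square(table)
    return max(features, key=stat)
```

A feature that perfectly predicts the user's need type yields a large chi-square statistic and is chosen for the split, while an independent feature scores near zero.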
Abstract: In Thailand, the practice of pre-hospital Emergency Medical Service (EMS) in each area reveals different growth rates and levels of effectiveness, seen in the diverse quality and quantity of the practices. To shorten the learning curve and speed up the practices in other areas, storytelling and lessons learnt from the effective practices are valued as meaningful knowledge. This paper set out to ascertain the factors, lessons learnt and best practices that contribute to the success of the pre-hospital EMS system. These were formalized as a model to speed up the practice in other areas. To develop the model, the Malcolm Baldrige National Quality Award (MBNQA), which is widely recognized as a framework for organizational quality assessment and improvement, was chosen as the discussion framework. Notably, this study was based on the consideration of knowledge capture; it was not intended to complete the loop of knowledge activities, but rather to highlight the recognition of knowledge capture, which is the initiation of knowledge management.
Abstract: Aspect Oriented Programming promises many advantages at the programming level by separating crosscutting concerns into distinct units, called aspects. Join points are distinguishing features of Aspect Oriented Programming, as they define the points where core requirements and crosscutting concerns are (inter)connected. Currently, there is a problem with the composition of multiple aspects at the same join point, which introduces issues such as the ordering and controlling of these superimposed aspects. Dynamic strategies are required to handle these issues as early as possible. The state chart is an effective modeling tool for capturing dynamic behavior at the high-level design stage. This paper provides a methodology for formulating strategies for multiple-aspect composition at a high level, which helps to better implement these strategies at the coding level. It also highlights the need to design shared join points at a high level, by providing solutions to these issues using state chart diagrams in UML 2.0. A high-level design representation of shared join points also helps to implement the designed strategy in a systematic way.
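The ordering problem at a shared join point can be made concrete with Python decorators standing in for aspects; the `security` and `logging` names are invented for illustration, and real AOP frameworks such as AspectJ declare aspect precedence rather than relying on application order.

```python
def aspect(name, log):
    """A toy 'aspect' as a decorator: advice that runs before and after
    the core method, which acts as the shared join point."""
    def wrap(fn):
        def advised(*args, **kwargs):
            log.append(f"{name}:before")
            out = fn(*args, **kwargs)
            log.append(f"{name}:after")
            return out
        return advised
    return wrap

log = []

@aspect("security", log)   # applied last -> runs outermost
@aspect("logging", log)    # applied first -> runs innermost
def transfer(amount):
    log.append("core")
    return amount

transfer(10)
```

Swapping the two decorator lines reverses the before/after ordering in `log`, which is exactly the kind of composition decision the abstract argues should be fixed by an explicit high-level strategy rather than by accident of superimposition.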