Abstract: This paper presents an Extended Kalman Filter
implementation of a single-camera Visual Simultaneous Localization
and Mapping algorithm, a novel approach to the simultaneous
localization and mapping problem widely studied in the mobile
robotics field. The algorithm is vision- and odometry-based.
Odometry data is incremental and therefore accumulates error over
time, since the robot may slip or be lifted; consequently, if
odometry is used alone, the robot position cannot be estimated
accurately. In this paper we show that combining odometry and
visual landmarks via the extended Kalman filter can improve the robot
position estimate. We use a Pioneer II robot and motorized pan-tilt
camera models to implement the algorithm.
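The idea of correcting drifting odometry with landmark observations can be illustrated with a minimal one-dimensional Kalman filter sketch. All values (noise covariances, the slip bias, the single known landmark) are invented for illustration; the paper's actual EKF-SLAM state is far richer.

```python
# Minimal 1-D odometry/landmark fusion with a Kalman filter (illustrative).

def ekf_predict(x, P, u, Q):
    """Propagate the state by the odometry increment u; uncertainty grows by Q."""
    return x + u, P + Q

def ekf_update(x, P, z, landmark, R):
    """Correct with a range measurement z = landmark - x + noise."""
    z_pred = landmark - x          # predicted measurement
    H = -1.0                       # Jacobian of the measurement w.r.t. x
    S = H * P * H + R              # innovation covariance
    K = P * H / S                  # Kalman gain
    x = x + K * (z - z_pred)
    P = (1 - K * H) * P
    return x, P

# The robot truly moves 1.0 per step, but odometry over-reports 1.1 (slip).
truth, x, P = 0.0, 0.0, 0.01
landmark, Q, R = 10.0, 0.05, 0.1
for _ in range(5):
    truth += 1.0
    x, P = ekf_predict(x, P, 1.1, Q)                        # biased odometry
    x, P = ekf_update(x, P, landmark - truth, landmark, R)  # noise-free range, for clarity
```

Odometry alone would report position 5.5 after five steps; the landmark corrections keep the filtered estimate close to the true position 5.0.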
Abstract: In this paper we discuss a set of guidelines that
could be adopted when designing an audio user interface for the
visually impaired. It is based on an audio environment that is
focused on audio positioning. Unlike current applications, which only
interpret a Graphical User Interface (GUI) for the visually impaired,
this particular audio environment bypasses the GUI to provide direct
auditory output. It offers the capability of two-dimensional (2D)
navigation on audio interfaces. This paper highlights the significance
of a 2D audio environment with spatial information in the context
of the visually impaired. A thorough usability study has been conducted
to prove the applicability of the proposed design guidelines for
these auditory interfaces. In the course of validating these guidelines,
previously unexplored design aspects were also revealed.
Abstract: This paper presents a new approach for probability density function estimation using the Support Vector Machines (SVM) and Expectation Maximization (EM) algorithms. In the proposed approach, an advanced algorithm for SVM density estimation which incorporates Mean Field theory in the learning process is used. Instead of using ad-hoc values for the parameters of the kernel function used by the SVM algorithm, the proposed approach uses the EM algorithm for an automatic optimization of the kernel. Experimental evaluation using a simulated data set shows encouraging results.
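To make the EM component concrete in isolation, the sketch below fits a two-component one-dimensional Gaussian mixture by EM. It shows how EM tunes density parameters automatically; the paper's actual coupling of EM with the SVM/Mean Field density estimator is not reproduced here, and the data set is invented.

```python
# EM for a two-component 1-D Gaussian mixture (illustrative stand-alone sketch).
import math

def em_gmm(data, iters=50):
    mu = [min(data), max(data)]          # simple initialization
    sigma, w = [1.0, 1.0], [0.5, 0.5]    # std devs and mixing weights
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[k] / (sigma[k] * math.sqrt(2 * math.pi)) *
                 math.exp(-(x - mu[k]) ** 2 / (2 * sigma[k] ** 2))
                 for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate means, variances and weights
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            sigma[k] = max(math.sqrt(var), 1e-3)   # floor to avoid collapse
            w[k] = nk / len(data)
    return mu, sigma, w

data = [-2.1, -2.0, -1.9, 1.9, 2.0, 2.05, 2.1]
mu, sigma, w = em_gmm(data)
```

The recovered means land on the two clusters near -2.0 and 2.0, which is the same automatic-parameter-estimation behaviour the paper exploits for kernel optimization.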
Abstract: In this paper, we consider the control of time delay systems
by a Proportional-Integral (PI) controller. Using the Hermite-
Biehler theorem, which is applicable to quasi-polynomials, we seek
a stability region of the controller for first order delay systems. The
essence of this work resides in the extension of this approach to
second order delay systems, in the determination of their stability region
and in the computation of the optimum PI parameters. We have used
genetic algorithms to handle the complexity of the optimization
problem.
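As a hedged illustration of the optimization step, the sketch below tunes PI gains for a generic first-order-plus-delay plant with a simple genetic algorithm minimizing the integral of squared error (ISE). The plant constants, GA settings, and cost function are invented for the example and are not the paper's setup.

```python
# GA tuning of PI gains for G(s) = K e^{-Ls} / (T s + 1)  (illustrative values).
import random

def ise(kp, ki, K=1.0, T=1.0, L=0.3, dt=0.01, horizon=10.0):
    """Integral of squared error for a unit step, via Euler simulation."""
    n, delay = int(horizon / dt), int(L / dt)
    u_hist = [0.0] * (delay + 1)       # buffer realizing the input delay
    x, integ, cost = 0.0, 0.0, 0.0
    for _ in range(n):
        e = 1.0 - x                    # unit step reference
        integ += e * dt
        u_hist.append(kp * e + ki * integ)   # PI control law
        u_delayed = u_hist.pop(0)
        x += dt * (-x + K * u_delayed) / T   # Euler step of the plant
        cost += e * e * dt
    return cost

def ga(pop_size=20, gens=30, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 3), rng.uniform(0, 3)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda g: ise(*g))
        elite = pop[: pop_size // 2]         # elitist selection
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = rng.sample(elite, 2)
            kp = 0.5 * (a[0] + b[0]) + rng.gauss(0, 0.1)  # crossover + mutation
            ki = 0.5 * (a[1] + b[1]) + rng.gauss(0, 0.1)
            children.append((max(kp, 0.0), max(ki, 0.0)))
        pop = elite + children
    return min(pop, key=lambda g: ise(*g))

best = ga()
```

The evolved gains drive the ISE well below that of the uncontrolled plant, which is the role the genetic algorithm plays in the paper's parameter search.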
Abstract: Caching has been suggested as a solution for reducing bandwidth utilization and minimizing query latency in mobile environments. Over the years, different caching approaches have been proposed: some rely on the server to periodically broadcast reports informing clients of updated data, while others allow the clients to request data whenever needed. Recently, a hybrid cache consistency scheme, the Scalable Asynchronous Cache Consistency Scheme (SACCS), was proposed, which combines the benefits of the two approaches and has been shown to be more efficient and scalable. Nevertheless, caching has its limitations too: the limited cache size and the limited bandwidth make the choice of cache replacement strategy an important aspect of improving cache consistency algorithms. In this paper, we propose a new cache replacement strategy, the Least Unified Value (LUV) strategy, to replace the Least Recently Used (LRU) strategy on which SACCS was based. The paper studies the advantages and drawbacks of the newly proposed strategy, comparing it with different categories of cache replacement strategies.
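The contrast between recency-only and value-based replacement can be sketched as follows. The `ValueCache` value function (frequency divided by size) is a simplified stand-in chosen for illustration, not the paper's LUV formula, and the class names are invented.

```python
# Recency-based (LRU) vs. value-based cache replacement (illustrative sketch).
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity, self.store = capacity, OrderedDict()
    def access(self, key):
        hit = key in self.store
        if hit:
            self.store.move_to_end(key)         # refresh recency
        else:
            if len(self.store) >= self.capacity:
                self.store.popitem(last=False)  # evict least recently used
            self.store[key] = True
        return hit

class ValueCache:
    """Evicts the entry with the lowest value = frequency / size
    (a simplified stand-in for a LUV-style value function)."""
    def __init__(self, capacity):
        self.capacity, self.freq, self.size = capacity, {}, {}
    def access(self, key, size=1.0):
        hit = key in self.freq
        self.freq[key] = self.freq.get(key, 0) + 1
        self.size[key] = size
        if len(self.freq) > self.capacity:
            # never evict the key that was just touched
            victim = min((k for k in self.freq if k != key),
                         key=lambda k: self.freq[k] / self.size[k])
            del self.freq[victim]
            del self.size[victim]
        return hit
```

On the access pattern a, a, b, c the value-based cache evicts the low-frequency item b and keeps the popular item a, whereas LRU decides purely by recency.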
Abstract: Domain-specific languages describe specific solutions to problems in the application domain. Traditionally they form a solution by composing black-box abstractions together. This usually involves non-deep transformations over the target model. In this paper we argue that it is potentially powerful to operate with grey-box abstractions to build a domain-specific software system. We present parametric code templates as grey-box abstractions and conceptual tools to encapsulate and manipulate these templates. Manipulations introduce templates' merging routines and can be defined in a generic way. This involves reasoning mechanisms at the code template level. We introduce the concept of the Neurath Modelling Language (NML), which operates with parametric code templates and specifies a visualisation mapping mechanism for target models. Finally we provide an example of calculating a domain-specific software system with predefined NML elements.
Abstract: The neuro-fuzzy hybridization scheme has attracted
research interest in pattern classification over the past decade. The
present paper proposes a novel Modified Adaptive Fuzzy Inference
Engine (MAFIE) for pattern classification. A modified Apriori
algorithm technique is utilized to derive a minimal set of decision
rules based on input-output data sets. A TSK-type fuzzy inference
system is constructed by the automatic generation of membership
functions and rules by fuzzy c-means clustering and the Apriori
algorithm technique, respectively. The generated adaptive fuzzy
inference engine is adjusted by the least-squares fit and a conjugate
gradient descent algorithm towards better performance with a
minimal set of rules. The proposed MAFIE is able to reduce the
number of rules, which increases exponentially when more input
variables are involved. The performance of the proposed MAFIE is
compared with other existing pattern classification
schemes using Fisher's Iris and Wisconsin breast cancer data sets and is
shown to be very competitive.
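The membership-generation step rests on fuzzy c-means clustering, which can be sketched in one dimension as below. This is the textbook FCM update, shown as a hedged illustration; the paper's coupling of FCM with the Apriori-style rule reduction is not reproduced, and the data are invented.

```python
# Textbook 1-D fuzzy c-means (illustrative of the membership-generation step).
def fcm(data, c=2, m=2.0, iters=50):
    centers = [min(data), max(data)]          # simple initialization
    for _ in range(iters):
        # u[i][j]: degree of membership of point i in cluster j
        u = []
        for x in data:
            d = [abs(x - v) + 1e-9 for v in centers]   # avoid div-by-zero
            u.append([1.0 / sum((d[j] / d[k]) ** (2 / (m - 1))
                                for k in range(c))
                      for j in range(c)])
        # centers: membership-weighted means
        centers = [sum(u[i][j] ** m * data[i] for i in range(len(data))) /
                   sum(u[i][j] ** m for i in range(len(data)))
                   for j in range(c)]
    return centers, u

centers, u = fcm([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
```

The two centers settle on the two data clumps; in a fuzzy inference engine each center would seed one membership function per input variable.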
Abstract: This paper explores the scalability issues associated
with solving the Named Entity Recognition (NER) problem using
Support Vector Machines (SVM) and high-dimensional features. The
performance results of a set of experiments conducted using binary
and multi-class SVM with increasing training data sizes are
examined. The NER domain chosen for these experiments is the
biomedical publications domain, especially selected due to its
importance and inherent challenges. A simple machine learning
approach is used that eliminates prior language knowledge such as
part-of-speech or noun phrase tagging thereby allowing for its
applicability across languages. No domain-specific knowledge is
included. The accuracy measures achieved are comparable to those
obtained using more complex approaches, which constitutes a
motivation to investigate ways to improve the scalability of multi-class
SVM in order to make the solution more practical and usable.
Improving the training time of multi-class SVM would make support
vector machines a more viable and practical machine learning
solution for real-world problems with large datasets. An initial
prototype yields a substantial improvement in training time at the
expense of increased memory requirements.
Abstract: The shortest path routing problem is a multiobjective nonlinear optimization problem with constraints. It has been addressed by considering the Quality of Service parameters, delay and cost, as separate objectives or as a weighted sum of both. Multiobjective evolutionary algorithms can find multiple Pareto-optimal solutions in a single run, and this ability makes them attractive for solving problems with multiple conflicting objectives. This paper uses an elitist multiobjective evolutionary algorithm based on the Non-dominated Sorting Genetic Algorithm (NSGA) for solving the dynamic shortest path routing problem in computer networks. A priority-based encoding scheme is proposed for population initialization. Elitism ensures that the best solution does not deteriorate in subsequent generations. Results for a sample test network are presented to demonstrate the capability of the proposed approach to generate well-distributed Pareto-optimal solutions of the dynamic routing problem in a single run. The results obtained by NSGA are compared with the single objective weighting factor method, for which a Genetic Algorithm (GA) was applied.
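The core of NSGA-style algorithms is non-dominated sorting, which for the first front amounts to extracting the Pareto set. The sketch below applies it to invented (delay, cost) pairs for candidate paths; it illustrates the dominance relation only, not the paper's encoding or full algorithm.

```python
# Pareto-front extraction over (delay, cost) objectives, both minimized.
def dominates(a, b):
    """a dominates b: no worse in both objectives, strictly better in one.
    (The a != b guard assumes distinct objective vectors.)"""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def pareto_front(points):
    return [p for p in points if not any(dominates(q, p) for q in points)]

paths = [(3, 9), (5, 4), (4, 7), (6, 6), (2, 12)]   # (delay, cost) per path
front = pareto_front(paths)
```

Here (6, 6) is dominated by (5, 4) and drops out, while the four mutually non-dominated trade-offs remain; NSGA would rank further fronts the same way after removing this one.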
Abstract: Data warehousing success rates are not high enough. User
dissatisfaction and failure to adhere to time frames and budgets are
too common. Most traditional information systems practices are
rooted in hard systems thinking. Today, the great systems thinkers
are forgotten by information systems developers. A data warehouse
is still a system, and it is worth investigating whether systems
thinkers such as Churchman can enhance our practices today. This
paper investigates data warehouse development practices from a
systems thinking perspective. An empirical investigation is conducted in
order to understand the everyday practices of data warehousing
professionals from a systems perspective. The paper presents a
model for the application of Churchman's systems approach in data
warehouse development.
Abstract: Dynamic location referencing is an important technique for shielding against map differences. The method references objects of the road network by utilizing a condensed selection of their real-world geographic properties stored in a digital map database, which overcomes the defects of pre-coded location referencing methods. High attribute-completeness requirements and complicated reference-point selection algorithms are the main problems of recent research. Therefore, a dynamic location referencing algorithm is proposed that combines intersection points compulsorily selected at the extremities with road link points selected according to a link partition principle. An experimental system based on this theory was implemented. Tests using the Beijing digital map database showed satisfactory results and thus verified the feasibility and practicability of the method.
Abstract: MANEMO is the integration of Network Mobility
(NEMO) and Mobile Ad Hoc Networks (MANET). A MANEMO
node has an interface to both a MANET and a NEMO network, and
therefore should choose the optimal interface for packet delivery;
however, such a handover between interfaces will introduce packet
loss. We define the steps necessary for a MANEMO handover,
using Mobile IP and NEMO to signal the new binding to the
relevant Home Agent(s). The handover steps aim to minimize
packet loss by avoiding waiting for Duplicate Address Detection
and Neighbour Unreachability Detection. We present expressions for
handover delay and packet loss, and then use numerical examples to
evaluate a MANEMO handover. The analysis shows how the packet
loss depends on the level of nesting within NEMO, the delay between
Home Agents and the load on the MANET, and hence can be used
to develop optimal MANEMO handover algorithms.
Abstract: Ontology-based modelling of multi-formatted
software application content is a challenging area in content
management. When the number of software content units is huge and
in a continuous process of change, content change management is
important. The management of content in this context requires
targeted access and manipulation methods. We present a novel
approach to deal with model-driven content-centric information
systems and access to their content. At the core of our approach is an
ontology-based semantic annotation technique for diversely
formatted content that can improve the accuracy of access and
systems evolution. Domain ontologies represent domain-specific
concepts and conform to metamodels. Different ontologies - from
application domain ontologies to software ontologies - capture and
model the different properties and perspectives on a software content
unit. Interdependencies between domain ontologies, the artifacts and
the content are captured through a trace model. The annotation traces
are formalised and a graph-based system is selected for the
representation of the annotation traces.
Abstract: This paper quantifies the impact of providing short-term
excess active power support from a variable speed wind turbine
(VSWT), and the effect of a superconducting magnetic energy storage
(SMES) unit, on frequency control, particularly the temporary minimum
frequency (TMF) term. To demonstrate the effect of these factors on the
power system frequency, a three-area power system is considered as a test system.
Abstract: Regression testing is a maintenance activity applied to
modified software to provide confidence that the changed parts are
correct and that the unchanged parts have not been adversely affected
by the modifications. Regression test selection techniques reduce the
cost of regression testing by selecting a subset of an existing test
suite to use in retesting modified programs. This paper presents the
first general regression-test-selection technique, which is based on code
and allows selecting test cases for programs written in any
programming language; it also handles incomplete programs. We
also describe RTSDiff, a regression-test-selection system that
implements the proposed technique. The results of the empirical
studies performed in four programming languages (Java, C#, C++
and Visual Basic) show that the technique is efficient and effective
in reducing the size of the test suite.
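The essence of code-based regression test selection is keeping only the tests whose coverage intersects the changed code entities. The sketch below shows that core idea with invented test and function names; it is not the RTSDiff implementation.

```python
# Core of code-based regression test selection: coverage-vs-change intersection.
def select_tests(coverage, changed):
    """coverage: test name -> set of covered entities (functions, lines, ...);
    changed: set of entities modified since the last run."""
    return {t for t, cov in coverage.items() if cov & changed}

# Hypothetical coverage map for a small test suite.
coverage = {
    "test_login":  {"auth.check", "auth.hash"},
    "test_report": {"report.render"},
    "test_audit":  {"auth.check", "report.render"},
}
selected = select_tests(coverage, {"auth.check"})
```

Only the two tests that exercise the modified `auth.check` are retained, shrinking the suite while preserving the tests that could expose a regression.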
Abstract: This paper attempts to explore a new method to
improve the teaching of algorithmics for beginners. It is well known
that algorithmics is a difficult field to teach for teachers and complex to
assimilate for learners. These difficulties are due to the intrinsic
characteristics of this field and to the manner in which most teachers
apprehend its foundations. However, in a Technology Enhanced
Learning (TEL) environment, assessment, which is important and
indispensable, is the most delicate phase to implement, because of
all the problems it generates (noise, etc.). Our objective lies at the
confluence of these two axes. For this purpose, EASEL focuses
essentially on elaborating an approach for assessing algorithmic
competences in a TEL environment. This approach consists in
modeling an algorithmic solution in terms of basic, elementary
operations, which lets learners construct their own solution steps with
full autonomy and independently of any programming language. The
approach supports a trilateral assessment: summative, formative and
diagnostic.
Abstract: Image watermarking has become an important tool for
intellectual property protection and authentication. In this paper a
watermarking technique is suggested that incorporates two
watermarks in a host image for improved protection and robustness.
A watermark, in the form of a PN sequence (called the secondary
watermark), is embedded in the wavelet domain of a primary
watermark before the latter is embedded in the host image. The technique
has been tested using the Lena image as a host and the cameraman
image as the primary watermark. The embedded PN sequence was
detectable through correlation against five other sequences, and a PSNR of
44.1065 dB was measured. Furthermore, to test the robustness of the
technique, the watermarked image was exposed to four types of
attacks, namely compression, low-pass filtering, salt-and-pepper noise
and luminance change. In all cases the secondary watermark was
easy to detect even when the primary one was severely distorted.
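The correlation-based detection principle can be shown on a one-dimensional signal: embed a seeded +/-1 PN sequence, then correlate the result against a set of candidate sequences and pick the strongest response. The embedding strength, signal model, and seeds below are invented for clarity; the paper works in the wavelet domain of an image.

```python
# 1-D sketch of PN-sequence embedding and correlation detection.
import random

def pn_sequence(seed, n):
    """Pseudo-random +/-1 sequence, reproducible from its seed."""
    rng = random.Random(seed)
    return [rng.choice((-1, 1)) for _ in range(n)]

def embed(host, pn, alpha):
    """Additive embedding with strength alpha."""
    return [h + alpha * p for h, p in zip(host, pn)]

def correlate(signal, pn):
    """Mean-removed correlation between signal and a candidate sequence."""
    mean = sum(signal) / len(signal)
    return sum((s - mean) * p for s, p in zip(signal, pn)) / len(signal)

n = 1024
rng = random.Random(42)
host = [rng.gauss(128.0, 5.0) for _ in range(n)]   # stand-in for host data
watermarked = embed(host, pn_sequence(7, n), alpha=2.0)

# Detection: correlate against candidate seeds; the embedded one stands out.
scores = {seed: correlate(watermarked, pn_sequence(seed, n))
          for seed in range(10)}
detected = max(scores, key=scores.get)
```

Only the sequence generated from the embedded seed correlates strongly with the watermarked signal; wrong seeds produce near-zero responses, which is the detection criterion the abstract describes.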
Abstract: One of the fastest-growing areas in the embedded community is multimedia devices. Multimedia devices incorporate a number of complicated functions for their operation, such as motion estimation. A multitude of different implementations have been proposed to reduce motion estimation complexity, such as spiral search. We have studied the implementations of spiral search and identified areas of improvement. We propose a modified spiral search algorithm with lower computational complexity than the original spiral search. We have implemented our algorithm on an embedded ARM-based architecture with a custom memory hierarchy. The resulting system yields an energy consumption reduction of up to 64% and a performance increase of up to 77%, with a small average penalty of 2.3 dB in video quality compared with the original spiral search algorithm.
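The baseline idea of spiral search is to visit candidate motion vectors ring by ring from the predicted position outward, so a nearby match is found early and the scan can stop at a good-enough cost. The sketch below shows that candidate ordering with an invented toy cost function standing in for a block SAD; it is the generic baseline, not the paper's modified algorithm.

```python
# Spiral-order candidate generation and early-exit search (generic baseline).
def spiral_offsets(radius):
    """Yield motion-vector candidates ring by ring, starting at (0, 0)."""
    yield (0, 0)
    for r in range(1, radius + 1):
        for dx in range(-r, r + 1):
            for dy in range(-r, r + 1):
                if max(abs(dx), abs(dy)) == r:   # ring at Chebyshev distance r
                    yield (dx, dy)

def motion_search(cost, radius, threshold=0):
    """Scan offsets in spiral order; stop once a good-enough match is found."""
    best, best_cost = (0, 0), cost((0, 0))
    for off in spiral_offsets(radius):
        c = cost(off)
        if c < best_cost:
            best, best_cost = off, c
        if best_cost <= threshold:
            break                                # early exit saves computation
    return best

# True motion is (1, -1); Manhattan distance stands in for a block SAD.
mv = motion_search(lambda o: abs(o[0] - 1) + abs(o[1] + 1), radius=4)
```

Because the true motion sits in the first ring, the search terminates after a handful of candidates instead of scanning the full window, which is where the complexity savings come from.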
Abstract: This paper describes a novel method for the automatic
estimation of the contours of weld defects in radiography images.
Generally, contour detection is the first operation applied in a
visual recognition system. Our approach can be described as a
region-based maximum likelihood formulation of parametric
deformable contours. This formulation provides robustness against
poor image quality and allows simultaneous estimation of the
contour parameters together with other parameters of the model.
Implementation is performed by a deterministic iterative algorithm
with minimal user intervention. Results testify to the very good
performance of the approach, especially on synthetic weld defect
images.
Abstract: An application framework provides a reusable design
and implementation for a family of software systems. Application
developers extend the framework to build their particular
applications using hooks. Hooks are the places identified to show
how to use and customize the framework. Hooks define the
Framework Interface Classes (FICs) and their possible specifications,
which helps in building reusable test cases for the implementations of
these classes. This paper introduces a novel technique called all-paths-state
to generate state-based test cases to test the FICs at the class
level. The technique is experimentally evaluated. The empirical
evaluation shows that the all-paths-state technique produces test cases
with a higher degree of coverage for the specifications of the
implemented FICs compared to test cases generated using the round-trip
path and all-transition techniques.
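The round-trip-path baseline mentioned above can be sketched as path enumeration over a class's state machine: follow transitions from the start state and cut each path when a state repeats, turning each path into a candidate test sequence. The toy FSM below is invented for illustration; this is the baseline, not the all-paths-state technique itself.

```python
# Round-trip-style test-sequence generation over a toy class state machine.
def round_trip_paths(fsm, start):
    """Enumerate event paths from start, ending a path when a state repeats
    or has no outgoing transitions."""
    results = []
    def dfs(state, path, seen):
        outgoing = fsm.get(state, [])
        if not outgoing:
            results.append(path)                 # dead end: path complete
            return
        for event, nxt in outgoing:
            step = path + [event]
            if nxt in seen:
                results.append(step)             # loop closed: path complete
            else:
                dfs(nxt, step, seen | {nxt})
    dfs(start, [], {start})
    return results

# Hypothetical stack-like FIC specification: state -> [(event, next state)].
fsm = {
    "Empty":  [("push", "Loaded")],
    "Loaded": [("push", "Loaded"), ("pop", "Empty")],
}
tests = round_trip_paths(fsm, "Empty")
```

Each returned event sequence ("push, push" and "push, pop" here) becomes one state-based test case; coverage-oriented techniques differ mainly in how far past the first state revisit they keep exploring.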