Abstract: This paper presents the design and prototype implementation of an intelligent data processing framework for ubiquitous sensor networks. Much of the focus is on how to handle sensor data streams and on the interoperability between low-level sensor data and application clients. Our framework first provides systematic middleware that mediates the interaction between the application layer and low-level sensors, analyzing large volumes of sensor data and filtering and integrating them to create value-added context information. An agent-based architecture is then proposed for real-time data distribution, efficiently forwarding a specific event to the appropriate application registered in the directory service via an open interface. The prototype implementation demonstrates that our framework can host sophisticated applications on a ubiquitous sensor network and can autonomously evolve into new middleware, taking advantage of promising technologies such as software agents, XML, cloud computing, and the like.
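To make the agent-based distribution concrete, a minimal sketch follows; the names (Directory, register, forward_event) are illustrative assumptions, not the framework's actual API.

    # Minimal sketch of directory-based event forwarding (hypothetical names).
    from collections import defaultdict

    class Directory:
        """Directory service mapping event types to registered applications."""
        def __init__(self):
            self.registry = defaultdict(list)

        def register(self, event_type, app_callback):
            # An application registers interest in a specific event type.
            self.registry[event_type].append(app_callback)

        def forward_event(self, event_type, payload):
            # The agent forwards a filtered, value-added event only to the
            # applications that registered for it.
            for callback in self.registry.get(event_type, []):
                callback(payload)

    directory = Directory()
    directory.register("room_temperature_high", lambda e: print("HVAC app received:", e))
    directory.forward_event("room_temperature_high", {"sensor": "s17", "celsius": 31.2})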
Abstract: IMCS is an Integrated Monitoring and Control System for thermal power plants. The system consists mainly of two parts, controllers and the OIS (Operator Interface System), connected by Ethernet-based communication. The controller side of the communication is managed by the CNet module and the OIS side by the OIS data server. The CNet module sends controller data to the data server and receives command data from it. To minimize and balance the load on the data server, the module buffers the data created by the controller at every cycle and sends the buffered data to the data server on request. When multiple data servers are used, the module manages a connection to each data server and responds to each server's requests. The CNet module is included in each controller of a redundant system; when a controller fail-over occurs, the module can provide the controller's data to the data server without loss. This paper presents the three main features of the CNet module that carry out these functions: separation of the get task, use of a ring buffer, and monitoring of communication status.
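The ring-buffer feature can be pictured with a minimal sketch, assuming a fixed-size buffer that the controller task fills once per cycle and the data-server request drains in bulk; this is illustrative code, not the CNet implementation.

    # Illustrative fixed-size ring buffer: the controller task appends one record
    # per cycle; the data-server request drains all buffered records at once.
    class RingBuffer:
        def __init__(self, capacity):
            self.capacity = capacity
            self.items = [None] * capacity
            self.head = 0      # next slot to write
            self.count = 0     # number of valid records

        def put(self, record):
            # Overwrite the oldest record when the buffer is full, so the
            # controller cycle never blocks on a slow data server.
            self.items[self.head] = record
            self.head = (self.head + 1) % self.capacity
            self.count = min(self.count + 1, self.capacity)

        def drain(self):
            # Return buffered records oldest-first and mark the buffer empty.
            start = (self.head - self.count) % self.capacity
            out = [self.items[(start + i) % self.capacity] for i in range(self.count)]
            self.count = 0
            return out

    buf = RingBuffer(capacity=4)
    for cycle in range(6):
        buf.put({"cycle": cycle, "value": cycle * 1.5})
    print(buf.drain())   # only the 4 most recent cycles survive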
Abstract: In this paper we investigate a number of Internet congestion control algorithms that have been developed in the last few years. Many of these algorithms were designed to treat Internet traffic merely as a train of consecutive packets, while a few others were specifically tailored to handle the congestion caused by media traffic carrying audiovisual content. This latter set of algorithms can be considered aware of the nature of the media content. In this context, we briefly explain a number of congestion control algorithms and categorize them into two groups: i) media congestion control algorithms, and ii) common congestion control algorithms. We recommend the use of media congestion control algorithms because they are media content-aware, unlike the common algorithms that handle such traffic blindly. We show that the spread of such media content-aware algorithms over the Internet will lead to better congestion control in the coming years, given the emerging era of digital convergence in which media traffic will form the majority of Internet traffic.
Abstract: This paper presents a protocol aimed at proving that an encryption system contains structural weaknesses without disclosing any information about those weaknesses. A verifier can check in polynomial time that a given property of the cipher system's output has been effectively realized. The property is chosen by the prover in such a way that it cannot be achieved by known attacks or exhaustive search, but only if the prover indeed knows some undisclosed weaknesses that may effectively endanger the cryptosystem's security. We call this protocol a zero-knowledge-like proof of cryptanalysis. In this paper, we apply the protocol to the Bluetooth core encryption algorithm E0, used in many mobile environments, and thus suggest that its security can seriously be put into question.
Abstract: In this paper we describe a hybrid technique combining Minimax search with aggregate Mahalanobis distance function synthesis to evolve an Awale game player. The hybrid technique helps suggest a move in a short amount of time without consulting an endgame database. However, the effectiveness of the technique depends heavily on the training dataset of Awale strategies used. The evolved player was tested against the Awale shareware program, and the results are promising.
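The aggregate Mahalanobis distance component might look roughly like the following sketch, which assumes board positions are encoded as numeric feature vectors; it is a generic illustration, not the authors' implementation.

    # Generic sketch: score a candidate game position by its Mahalanobis distance
    # to the centroid of winning positions from a training set of strategies.
    import numpy as np

    def mahalanobis_score(candidate, winning_positions):
        """Smaller distance = candidate looks more like known winning positions."""
        mean = winning_positions.mean(axis=0)
        cov = np.cov(winning_positions, rowvar=False)
        cov_inv = np.linalg.pinv(cov)   # pseudo-inverse guards against singular covariance
        diff = candidate - mean
        return float(np.sqrt(diff @ cov_inv @ diff))

    # Toy 12-pit Awale-like feature vectors (seed counts), purely illustrative.
    rng = np.random.default_rng(0)
    winning = rng.integers(0, 5, size=(50, 12)).astype(float)
    candidate = rng.integers(0, 5, size=12).astype(float)
    print(mahalanobis_score(candidate, winning))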
Abstract: PARIS (Personal Archiving and Retrieving Image System) is an experimental personal photograph library that includes more than 80,000 consumer photographs accumulated over approximately five years, metadata based on our proposed MPEG-7 annotation architecture, Dozen Dimensional Digital Content (DDDC), and a relational database structure. The DDDC architecture is specially designed to facilitate the management, browsing and retrieval of personal digital photograph collections. In the annotation process, we also utilize a proposed Spatial and Temporal Ontology (STO) designed around the general characteristics of personal photograph collections. This paper explains the PARIS system.
Abstract: In this paper we improve the quasilinearization method with barycentric Lagrange interpolation, chosen for its numerical stability and computational speed, to achieve a stable semi-analytical solution. We then apply the improved method to the fin problem, a nonlinear equation that arises in heat transfer. In the quasilinearization approach, the nonlinear differential equation is treated by approximating the nonlinear terms with a sequence of linear expressions. The modified QLM is iterative but not perturbative and gives stable semi-analytical solutions to nonlinear problems without depending on the existence of a smallness parameter. Comparison with numerical solutions shows that the present method is applicable.
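For reference, the barycentric form of Lagrange interpolation used in such a scheme can be sketched as follows; this is a generic illustration of the interpolation step, not the authors' code.

    # Generic sketch of barycentric Lagrange interpolation (second form),
    # evaluated at a point x given nodes xs and values ys.
    import numpy as np

    def barycentric_weights(xs):
        # w_j = 1 / prod_{k != j} (x_j - x_k)
        n = len(xs)
        w = np.ones(n)
        for j in range(n):
            for k in range(n):
                if k != j:
                    w[j] /= (xs[j] - xs[k])
        return w

    def barycentric_eval(x, xs, ys, w):
        # p(x) = sum_j (w_j / (x - x_j)) y_j / sum_j (w_j / (x - x_j))
        diff = x - xs
        exact = np.where(diff == 0)[0]
        if exact.size:                  # x coincides with a node
            return ys[exact[0]]
        t = w / diff
        return np.dot(t, ys) / np.sum(t)

    xs = np.linspace(-1.0, 1.0, 7)      # interpolation nodes
    ys = np.tanh(5 * xs)                # sampled nonlinear profile (illustrative)
    w = barycentric_weights(xs)
    print(barycentric_eval(0.3, xs, ys, w), np.tanh(1.5))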
Abstract: Testable software has two inherent properties, observability and controllability. Observability facilitates observation of the internal behavior of software to the required degree of detail, while controllability allows the creation of difficult-to-achieve states prior to the execution of various tests. In this paper, we describe COTT, a Controllability and Observability Testing Tool, for creating testable object-oriented software. COTT provides a framework that helps the user instrument object-oriented software to build in the required controllability and observability. During testing, the tool facilitates the creation of difficult-to-achieve states required for testing difficult-to-test conditions and the observation of internal details of execution at the unit, integration and system levels. The execution observations are logged in a test log file, which is used for post-analysis and for generating test coverage reports.
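The observability side of such instrumentation can be pictured with a generic sketch; the decorator name and the Account example are hypothetical, not COTT's actual interface.

    # Generic sketch of an observability hook: log calls and results of
    # instrumented methods to a test log file for later analysis.
    import functools, logging

    logging.basicConfig(filename="test_log.txt", level=logging.INFO)

    def observe(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            logging.info("CALL %s args=%r kwargs=%r", func.__name__, args, kwargs)
            result = func(*args, **kwargs)
            logging.info("RETURN %s -> %r", func.__name__, result)
            return result
        return wrapper

    class Account:
        def __init__(self, balance):
            self.balance = balance

        @observe
        def withdraw(self, amount):
            # A controllability hook could force the hard-to-reach overdraft state here.
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount
            return self.balance

    Account(100).withdraw(40)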
Abstract: Face recognition is a technique for automatically identifying or verifying individuals. It receives great attention in identification, authentication, security and many other applications. Diverse methods have been proposed for this purpose and many comparative studies have been performed; however, researchers have not reached a unified conclusion. In this paper, we report an extensive quantitative accuracy analysis of four of the most widely used face recognition algorithms: Principal Component Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA) and Support Vector Machine (SVM), using the AT&T, Sheffield and Bangladeshi people face databases under diverse conditions such as illumination, alignment and pose variations.
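As an illustration of the first compared method, a PCA (eigenfaces) pipeline with a linear SVM can be sketched with scikit-learn on synthetic stand-in data; this is not the experimental code of the study.

    # Generic PCA + classifier sketch on synthetic "face" vectors.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 32 * 32))        # stand-in for flattened face images
    y = rng.integers(0, 10, size=200)          # stand-in identity labels

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    pca = PCA(n_components=40).fit(X_tr)       # project faces onto 40 eigenfaces
    clf = SVC(kernel="linear").fit(pca.transform(X_tr), y_tr)
    print("accuracy:", clf.score(pca.transform(X_te), y_te))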
Abstract: Competitive learning is an adaptive process in which the neurons in a neural network gradually become sensitive to different input pattern clusters. Competitive learning is the basic idea behind Kohonen's Self-Organizing Feature Maps (SOFM). SOFM can generate mappings from high-dimensional signal spaces to lower-dimensional topological structures. The main features of such mappings are topology preservation, feature mapping and approximation of the probability distribution of the input patterns. To overcome some limitations of SOFM, e.g., a fixed number of neural units and a topology of fixed dimensionality, a Growing Self-Organizing Neural Network (GSONN) can be used. A GSONN can change its topological structure during learning: it grows by learning and shrinks by forgetting. To speed up training and convergence, a new variant of GSONN, twin growing cell structures (TGCS), is presented here. This paper first gives an introduction to competitive learning, SOFM and its variants. Then, we discuss some GSONNs with fixed dimensionality, including growing cell structures, its variants and the author's model, TGCS. The paper ends with a comparison of test results and conclusions.
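A minimal sketch of the competitive-learning update underlying SOFM follows, using a one-dimensional chain of units; it is generic illustrative code, not the author's TGCS implementation.

    # Minimal SOFM-style competitive learning on a 1-D chain of units.
    import numpy as np

    rng = np.random.default_rng(1)
    data = rng.random((500, 2))                 # 2-D input patterns
    units = rng.random((10, 2))                 # 10 units, initialised at random

    for t, x in enumerate(data):
        lr = 0.5 * (1 - t / len(data))          # decaying learning rate
        bmu = np.argmin(np.linalg.norm(units - x, axis=1))   # best-matching unit
        for j in range(len(units)):
            h = np.exp(-((j - bmu) ** 2) / 2.0) # neighbourhood on the chain topology
            units[j] += lr * h * (x - units[j])

    print(np.round(units, 2))                   # units spread over the input space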
Abstract: Recently, services such as television and the Internet have come to be received on a variety of terminals. It would be convenient to receive these services on a cellular phone while away from home and to continue receiving the same services on a large-screen digital television after returning home. At present, however, the user must go through the same authentication processing again when switching to the television at home. In this study, we have developed an authentication method that enables users to switch terminals in environments in which the user receives a service from a server through a terminal. Specifically, the method simplifies authentication on the server side when switching from one terminal to another by reusing previously authenticated information.
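One way to picture reusing previously authenticated information is a short-lived, server-signed handoff token that the second terminal presents instead of repeating full authentication; the sketch below is a hypothetical illustration, not the paper's protocol.

    # Hypothetical sketch of a terminal-switching handoff token (HMAC-signed,
    # short-lived); not the paper's actual protocol.
    import hashlib, hmac, json, time

    SERVER_KEY = b"server-secret"

    def issue_handoff_token(user_id, ttl=60):
        # Issued to the already-authenticated terminal (e.g. the phone).
        body = json.dumps({"user": user_id, "exp": time.time() + ttl}).encode()
        tag = hmac.new(SERVER_KEY, body, hashlib.sha256).hexdigest()
        return body, tag

    def verify_handoff_token(body, tag):
        # Presented by the new terminal (e.g. the TV); skips full re-authentication.
        expected = hmac.new(SERVER_KEY, body, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(tag, expected):
            return False
        return json.loads(body)["exp"] > time.time()

    body, tag = issue_handoff_token("alice")
    print(verify_handoff_token(body, tag))      # True while the token is fresh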
Abstract: Many approaches have been proposed for solving Sudoku puzzles. One of them models the puzzles as block-world problems, and three models for Sudoku solvers have been based on this approach, each expressing the solver as a parameterized multi-agent system. In this work, we propose a new model that improves on the existing ones. This paper presents the development of a Sudoku solver that implements all the proposed models, and experiments have been conducted to determine the performance of each model.
Abstract: This paper presents the design and implementation of WebGD, a CORBA-based document classification and retrieval system on the Internet. WebGD makes use of techniques such as the Web, CORBA, Java, NLP, fuzzy techniques, knowledge-based processing and database technology. A unified classification and retrieval model, classification and retrieval with a single reasoning engine, and flexible working-mode configuration are some of its main features. The architecture of WebGD, the unified classification and retrieval model, the components of the WebGD server and the fuzzy inference engine are discussed in detail.
Abstract: In mobile environments, unspecified numbers of transactions
arrive in continuous streams. To prove the correctness of their concurrent execution, a method of modelling an infinite number of transactions is needed. Standard database techniques model fixed
finite schedules of transactions. Lately, techniques based on temporal
logic have been proposed as suitable for modelling infinite schedules.
The drawback of these techniques is that proving the basic
serializability correctness condition is impractical, as encoding (the
absence of) conflict cyclicity within large sets of transactions results
in prohibitively large temporal logic formulae. In this paper, we show
that, under certain common assumptions on the graph structure of
data items accessed by the transactions, conflict cyclicity need only
be checked within all possible pairs of transactions. This results in
formulae of considerably reduced size in any temporal-logic-based
approach to proving serializability, and scales to arbitrary numbers
of transactions.
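The pairwise reduction can be illustrated with a small sketch that builds conflict edges from a concrete schedule and then checks only pairs of transactions for cycles; this is a generic illustration, not the paper's temporal-logic encoding.

    # Generic sketch: build conflict-graph edges from an interleaved schedule and
    # check serializability by looking only for two-transaction cycles, as the
    # pairwise reduction suggests (illustrative, not the temporal-logic encoding).
    from itertools import combinations

    def conflict_edges(schedule):
        edges = set()
        for i, (ti, op_i, x_i) in enumerate(schedule):
            for tj, op_j, x_j in schedule[i + 1:]:
                if ti != tj and x_i == x_j and "w" in (op_i, op_j):
                    edges.add((ti, tj))   # ti's op precedes tj's conflicting op
        return edges

    schedule = [("T1", "r", "x"), ("T2", "w", "x"), ("T1", "w", "x")]  # non-serializable
    edges = conflict_edges(schedule)
    for a, b in combinations({t for t, _, _ in schedule}, 2):
        if (a, b) in edges and (b, a) in edges:
            print("conflict cycle between", a, "and", b)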
Abstract: User-based collaborative filtering (CF), one of the most prevalent and efficient recommendation techniques, provides personalized recommendations to users based on the opinions of other users. Although the CF technique has been successfully applied in various applications, it suffers from serious sparsity problems. The cloud-model approach addresses the sparsity problem by constructing the user's global preference, represented by a cloud eigenvector. The user-based CF approach works well with dense datasets, while the cloud-model CF approach performs better when the dataset is sparse. In this paper, we present a hybrid approach that integrates the predictions from both the user-based CF and the cloud-model CF approaches. The experimental results show that the proposed hybrid approach can ameliorate the sparsity problem and provide improved prediction quality.
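A minimal sketch of the hybrid prediction step follows, using a density-based weighting chosen here purely for illustration; the paper's exact combination scheme may differ.

    # Generic sketch of blending two rating predictors: a user-based CF estimate
    # and a cloud-model estimate, weighted by how dense the user's ratings are.
    def hybrid_predict(user_cf_pred, cloud_model_pred, n_ratings, n_items):
        density = n_ratings / n_items        # fraction of items the user has rated
        # Dense profiles lean on user-based CF; sparse profiles lean on the cloud model.
        return density * user_cf_pred + (1 - density) * cloud_model_pred

    print(hybrid_predict(user_cf_pred=4.2, cloud_model_pred=3.6,
                         n_ratings=30, n_items=1000))   # sparse user -> closer to 3.6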
Abstract: Recently, data mining has been applied to scientific bibliographic databases to analyze the pathways of knowledge or the core scientific relevance of a Nobel laureate or of a country. This specific case of data mining has been named citation mining, and it is the integration of citation bibliometrics and text mining. In this paper we present an improved Web implementation of statistical physics algorithms to perform the text mining component of citation mining. In particular, we use an entropic-like distance between compressed texts as an indicator of their similarity. Finally, we have included the recently proposed h-index to characterize scientific production. We have used this Web implementation to identify the users, applications and impact of the Mexican scientific institutions located in the State of Morelos.
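The compression-based distance can be illustrated with the normalized compression distance, a common formulation offered here as a sketch; the paper's exact definition may differ.

    # Normalized compression distance between two texts, using zlib as the
    # compressor; offered as an illustration of a compression-based similarity.
    import zlib

    def c(s):
        return len(zlib.compress(s.encode("utf-8")))

    def ncd(x, y):
        cx, cy, cxy = c(x), c(y), c(x + y)
        return (cxy - min(cx, cy)) / max(cx, cy)

    a = "statistical physics of complex networks and citation analysis"
    b = "citation analysis with statistical physics of networks"
    c_ = "recipes for sourdough bread and pastry"
    print(ncd(a, b), ncd(a, c_))   # related texts should yield the smaller distance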
Abstract: The purpose of this paper is to detect humans in images. The paper proposes a method for extracting human body feature descriptors consisting of series of projected edge components. The feature descriptor can express the appearance and shape of a human through the local and global distribution of edges. Our method was evaluated with a linear SVM classifier on the Daimler-Chrysler pedestrian dataset and tested with various sub-region sizes. The results show that the accuracy of the proposed method is similar to that of the Histogram of Oriented Gradients (HOG) feature descriptor, while its feature extraction process is simpler and faster than existing methods.
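The projected-edge idea can be sketched roughly as follows, assuming per-sub-region row and column projections of gradient magnitudes; this is a simplified illustration, not the authors' exact descriptor.

    # Simplified sketch of a projected-edge descriptor: per sub-region, project
    # edge magnitudes onto rows and columns and concatenate the projections.
    import numpy as np

    def projected_edge_descriptor(image, region=(8, 8)):
        # Simple gradient-magnitude "edge" image.
        gy, gx = np.gradient(image.astype(float))
        edges = np.hypot(gx, gy)
        h, w = edges.shape
        rh, rw = region
        features = []
        for r in range(0, h - rh + 1, rh):
            for c in range(0, w - rw + 1, rw):
                block = edges[r:r + rh, c:c + rw]
                features.append(block.sum(axis=1))   # projection onto rows
                features.append(block.sum(axis=0))   # projection onto columns
        return np.concatenate(features)

    img = np.random.default_rng(0).random((36, 18))  # detection-window-sized stand-in
    print(projected_edge_descriptor(img).shape)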
Abstract: This paper proposes a method that predicts attractive evaluation objects. In the learning phase, the method inductively acquires trend rules from complex sequential data. The data is composed of two types: numerical sequential data, of which each evaluation object has its own series, and text sequential data, in which each evaluation object is described. The trend rules represent changes of numerical values related to the evaluation objects. In the prediction phase, the method applies new text sequential data to the trend rules and evaluates which evaluation objects are attractive. This paper verifies the effect of the proposed method using stock price sequences and news headline sequences, in which each stock brand corresponds to an evaluation object. The paper discusses the validity of the predicted attractive evaluation objects, the processing time of each phase, and possible application tasks.
Abstract: A chord of a simple polygon P is a line segment [xy]
that intersects the boundary of P only at both endpoints x and y. A
chord of P is called an interior chord provided the interior of [xy] lies
in the interior of P. P is weakly visible from [xy] if for every point v
in P there exists a point w in [xy] such that [vw] lies in P. In this
paper star-shaped, L-convex, and convex polygons are characterized
in terms of weak visibility properties from internal chords and star-shaped
subsets of P. A new Krasnoselskii-type characterization of
isothetic star-shaped polygons is also presented.
Abstract: This article describes Uruk, the virtual museum of
Iraq that we developed for visual exploration and retrieval of image
collections. The system largely exploits the loosely structured hierarchy of XML documents, which provides a useful representation method for storing semi-structured or unstructured data that does not fit easily into existing databases. The system offers users the
capability to mine and manage the XML-based image collections
through a web-based Graphical User Interface (GUI). In a typical interactive session with the system, the user can browse a visual
structural summary of the XML database in order to select interesting
elements. Using this intermediate result, queries combining structure
and textual references can be composed and presented to the system.
After query evaluation, the full set of answers is presented in a visual
and structured way.
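The combination of structural and textual query conditions can be pictured with a small sketch using Python's standard ElementTree; the element names are hypothetical, since the actual Uruk schema and query engine are not given in the abstract.

    # Hypothetical sketch: filter an XML image collection by structure (an <image>
    # with a <period> child) and by text (keyword in the <description>).
    import xml.etree.ElementTree as ET

    xml = """
    <museum>
      <image id="u1"><period>Uruk</period><description>clay tablet with pictographs</description></image>
      <image id="u2"><period>Akkadian</period><description>cylinder seal impression</description></image>
    </museum>
    """
    root = ET.fromstring(xml)
    for image in root.findall("./image[period='Uruk']"):
        desc = image.findtext("description", default="")
        if "tablet" in desc:
            print(image.get("id"), desc)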