Abstract: The main mission of Ezilla is to provide a friendly interface for accessing virtual machines and quickly deploying a high-performance computing environment. Ezilla has been developed by the Pervasive Computing Team at the National Center for High-performance Computing (NCHC). Ezilla integrates Cloud middleware, virtualization technology, and a Web-based Operating System (WebOS) to form a virtual computer in a distributed computing environment. To improve dataset handling and speed, we propose a sensor observation system that manages a huge amount of data in the Cassandra database. The sensor observation system is built on Ezilla to store raw sensor data in a distributed database. We adopt the Ezilla Cloud service to create virtual machines and log in to them to deploy the sensor observation system. Integrating the sensor observation system with Ezilla makes it possible to quickly deploy an experiment environment and to access huge amounts of data in a distributed database whose replication mechanism protects the data.
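A minimal sketch of how raw sensor readings might be written to Cassandra from inside an Ezilla virtual machine, using the DataStax Python driver. The keyspace, table, and column names are illustrative assumptions, not the system's actual schema; the replication factor of 3 illustrates the replication mechanism the abstract mentions.

```python
# Illustrative sketch: storing raw sensor readings in Cassandra.
# Keyspace/table/column names are assumptions for illustration only.
from datetime import datetime, timezone
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])   # contact point of a Cassandra node
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS sensors
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 3}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS sensors.readings (
        sensor_id text, ts timestamp, value double,
        PRIMARY KEY (sensor_id, ts))
""")

insert = session.prepare(
    "INSERT INTO sensors.readings (sensor_id, ts, value) VALUES (?, ?, ?)")
session.execute(insert, ("station-01", datetime.now(timezone.utc), 23.7))
```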
Abstract: The Random Oracle Model (ROM) is an effective method for measuring the practical security of cryptography. In this paper, we apply it to information hiding systems (IHS). Because an IHS has its own properties, the ROM must be modified before it can be applied to an IHS. Firstly, we discuss in full why and how to modify each part of the ROM. The main changes include: 1) dividing the attacks that an IHS may suffer into two phases, and the attacks of each phase into several kinds; 2) distinguishing Oracles and Black-boxes clearly; 3) defining the Oracle and the four Black-boxes that the IHS uses; 4) proposing the formalized adversary model; and 5) giving the definition of the judge.
Secondly, based on the ROM of IHS, security against the known original cover attack (KOCA-security) is defined. Then, we give an actual information hiding scheme and prove that it is KOCA-secure. Finally, we conclude the paper and propose open problems for further research.
Abstract: This paper presents a Faults Forecasting System (FFS) that utilizes statistical forecasting techniques to analyze process-variable data in order to forecast fault occurrences. FFS proposes a new idea in fault detection: current techniques are based on analyzing the present status of the system variables to check whether that status is faulty or not, whereas FFS uses forecasting techniques to predict the future timing of faults before they happen. The proposed model applies a subset-modeling strategy and a Bayesian approach in order to reduce the dimensionality of the process variables and improve fault forecasting accuracy. A practical experiment was designed and implemented at Okayama University, Japan, and the comparison shows that our proposed model achieves high forecasting accuracy and predicts faults ahead of time (before-time).
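The abstract does not spell out the forecasting model, so the sketch below uses a plain least-squares autoregressive (AR) forecast with a limit check as a generic stand-in for the idea of flagging a fault before it happens; the signal, order, and limit are illustrative assumptions.

```python
# Generic sketch: forecast a process variable and flag a future fault when
# the forecast crosses a limit. Plain AR(p), not the paper's subset/Bayesian model.
import numpy as np

def fit_ar(x, p):
    """Fit AR(p) coefficients by least squares: x[t] ~ sum_k a_k * x[t-k]."""
    n = len(x)
    X = np.column_stack([x[p - 1 - k : n - 1 - k] for k in range(p)])
    coeffs, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coeffs

def forecast(x, coeffs, steps):
    """Iterate the fitted AR model forward `steps` time steps."""
    hist = list(x)
    for _ in range(steps):
        hist.append(float(np.dot(coeffs, hist[-1:-len(coeffs) - 1:-1])))
    return np.array(hist[len(x):])

# Example: a slowly drifting variable that will eventually exceed a limit.
t = np.arange(200)
signal = 0.02 * t + np.sin(0.3 * t) + 0.1 * np.random.randn(200)
ahead = np.nonzero(forecast(signal, fit_ar(signal, p=5), steps=50) > 5.0)[0]
if ahead.size:
    print(f"fault forecast {ahead[0] + 1} steps ahead of time")
```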
Abstract: The wavelet transform provides several important characteristics which can be used in texture analysis and classification. In this work, an efficient texture classification method, which combines concepts from wavelets and co-occurrence matrices, is presented. A Euclidean distance classifier is used to evaluate the various classification methods. A comparative study is essential to determine the ideal method. Based on this study, we developed a novel feature set for texture classification and demonstrate its effectiveness.
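A minimal sketch of the kind of combination the abstract describes: wavelet sub-band energies joined with co-occurrence (GLCM) statistics, classified by Euclidean distance to class means. The exact features and their weighting in the paper are not specified, so this particular selection is an assumption.

```python
# Sketch: wavelet + co-occurrence texture features with a Euclidean
# nearest-mean classifier. Feature choice is illustrative.
import numpy as np
import pywt
from skimage.feature import graycomatrix, graycoprops

def texture_features(img):
    """img: 2-D uint8 array. Returns a small wavelet + GLCM feature vector."""
    cA, (cH, cV, cD) = pywt.dwt2(img.astype(float), "db1")
    energies = [np.mean(b * b) for b in (cA, cH, cV, cD)]  # sub-band energies
    glcm = graycomatrix(img, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    stats = [graycoprops(glcm, p)[0, 0]
             for p in ("contrast", "homogeneity", "energy", "correlation")]
    return np.array(energies + stats)

def classify(sample, class_means):
    """Assign to the class whose mean feature vector is nearest (Euclidean)."""
    return min(class_means,
               key=lambda c: np.linalg.norm(sample - class_means[c]))
```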
Abstract: A neural network's performance can be measured by its efficiency and accuracy. The major disadvantages of the neural network approach are that the generalization capability of neural networks is often significantly low, and that it may take a very long time to tune the weights in the net to generate an accurate model for highly complex and nonlinear systems. This paper presents a novel neuro-fuzzy architecture based on the Extended Kalman filter. To test the performance and applicability of the proposed neuro-fuzzy model, a simulation study of a nonlinear complex dynamic system is carried out. The proposed method can be applied to on-line incremental adaptive learning for the prediction of financial time series. A benchmark case study is used to demonstrate that the proposed model is a superior neuro-fuzzy modeling technique.
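The abstract does not give the filter equations, so the following is a generic Extended Kalman filter parameter-update step of the kind typically used to tune neuro-fuzzy weights on-line; the model function h and its Jacobian are placeholders, and the linear example at the end is purely illustrative.

```python
# Generic EKF step for tuning parameters w so that h(w, x) tracks y.
# The neuro-fuzzy model itself is abstracted into h and its Jacobian.
import numpy as np

def ekf_update(w, P, x, y, h, jac, R=1e-2, Q=1e-6):
    """One EKF parameter update.
    w: parameter vector, P: parameter covariance,
    h(w, x): scalar model output, jac(w, x): dh/dw."""
    H = jac(w, x).reshape(1, -1)
    P = P + Q * np.eye(len(w))            # process noise on the weights
    S = H @ P @ H.T + R                   # innovation covariance
    K = (P @ H.T) / S                     # Kalman gain (n x 1)
    w = w + (K * (y - h(w, x))).ravel()   # correct the weights
    P = (np.eye(len(w)) - K @ H) @ P      # update the covariance
    return w, P

# Example: on-line incremental tuning of a linear-in-parameters model.
rng = np.random.default_rng(0)
true_w = np.array([1.5, -0.7])
w, P = np.zeros(2), np.eye(2)
for _ in range(500):
    x = rng.normal(size=2)
    y = true_w @ x + 0.01 * rng.normal()
    w, P = ekf_update(w, P, x, y, h=lambda w, x: w @ x, jac=lambda w, x: x)
print(w)   # converges toward true_w
```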
Abstract: Real-time 3D applications have to guarantee interactive rendering speed. The number of polygons that can be rendered is restricted by the performance of the graphics hardware and graphics algorithms. Generally, rendering performance increases drastically when only the dynamic 3D models are handled, since they are far fewer than the static ones. Because the shapes and colors of static objects do not change while the viewing direction is fixed, their information can be reused. We render huge numbers of polygons that cannot be handled by conventional rendering techniques in real time by using a static-object image and merging it with the rendering result of the dynamic objects. Performance necessarily drops whenever the static-object image must be updated, which includes removing a static object that starts to move and re-rendering the other static objects overlapped by the moving ones. Based on the visibility of the object beginning to move, we can skip this updating process. As a result, we enhance rendering performance and reduce the differences in rendering speed between frames. The proposed method renders a total of 200,000,000 polygons, consisting of 500,000 dynamic polygons and the rest static, at about 100 frames per second.
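A minimal sketch of the core merging step: a cached color and depth image of the static scene is depth-composited with the freshly rendered dynamic objects. The buffer layout (H x W color and depth arrays) is an assumption for illustration.

```python
# Sketch: reuse a cached image + depth of the static scene and
# depth-composite only the freshly rendered dynamic objects onto it.
import numpy as np

def composite(static_color, static_depth, dyn_color, dyn_depth):
    """Per-pixel depth test: take the dynamic fragment where it is closer."""
    closer = dyn_depth < static_depth                 # dynamic object in front
    out_color = np.where(closer[..., None], dyn_color, static_color)
    out_depth = np.where(closer, dyn_depth, static_depth)
    return out_color, out_depth
```

Each frame only the dynamic objects are re-rendered into dyn_color/dyn_depth; the static buffers are rebuilt only when a static object starts moving and its pixels are actually visible, which is the update the abstract says can otherwise be skipped.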
Abstract: This paper is concerned with motion recognition based on a fuzzy WP (Wavelet Packet) feature extraction approach applied to Vicon physical data sets. For this purpose, we use an efficient fuzzy mutual-information-based WP transform for feature extraction. This method estimates the required mutual information using a novel approach based on a fuzzy membership function. The physical action data set includes 10 normal and 10 aggressive physical actions that measure human activity. The data have been collected from 10 subjects using the Vicon 3D tracker. The experiments cover running, sitting, and walking as the physical activity motions among the various activities. The experimental results reveal that the presented feature extraction approach achieves good recognition performance.
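A sketch of the overall pipeline: wavelet-packet energy features from a 1-D motion signal, ranked by mutual information with the action label. The paper's fuzzy-membership-based mutual information estimator is replaced here by scikit-learn's standard estimator, and the random data shapes are illustrative only.

```python
# Sketch: wavelet-packet energies as features, ranked by mutual information.
# Standard MI estimator stands in for the paper's fuzzy estimator.
import numpy as np
import pywt
from sklearn.feature_selection import mutual_info_classif

def wp_energies(signal, level=3, wavelet="db1"):
    """Log-energy of each wavelet-packet sub-band at the given level."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    return np.array([np.log1p(np.sum(n.data ** 2)) for n in nodes])

# X: one row of sub-band energies per recorded motion; y: action label
# (e.g. 0 = walking, 1 = running). Random data stands in for Vicon tracks.
rng = np.random.default_rng(1)
X = np.vstack([wp_energies(rng.normal(size=256)) for _ in range(40)])
y = rng.integers(0, 2, size=40)
scores = mutual_info_classif(X, y, random_state=0)
best = np.argsort(scores)[::-1][:4]   # keep the most informative sub-bands
```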
Abstract: In this paper, we propose a novel adaptive voltage control strategy for a boost converter via Inverse LQ Servo-Control. Our strategy is based on an analytical formulation of the Inverse Linear Quadratic (ILQ) design method, which does not require solving Riccati's equation directly. The optimal and adaptive controller of the voltage control system is designed, and its stability and robustness are analyzed. An analytical solution for optimal and robust voltage control is obtained through the natural angular velocity as a single design parameter, so the responses can be shaped easily via ILQ control theory. Our method provides effective results: the responses remain stable and the response times do not drift even when the operating conditions change widely.
Abstract: In pattern recognition applications, low-level segmentation and high-level object recognition are generally considered two separate steps. This paper presents a method that bridges the gap between low-level segmentation and high-level object recognition. It is based on a Bayesian network representation and a network propagation algorithm. At the low level it uses a hierarchical structure of quadratic spline wavelet image bases. The method is demonstrated on a simple circuit-diagram component identification problem.
Abstract: We aim to provide a solution for version control of documents in web services, which is why we propose a new approach designed specially for XML documents. The new approach is applied in a centralized repository that coexists with other repositories in a decentralized system. To achieve the activities of this approach in a standard model we use ECA active rules. We also show how Event-Condition-Action rules (ECA rules) have been incorporated as a mechanism for the version control of documents. ECA rules are needed because they provide a clear declarative semantics and induce an immediate operational realization in the system without the need for human intervention.
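A minimal sketch of an ECA rule for version control: the event is a document update, the condition checks that the content actually changed, and the action records a new version. The repository structure is an illustrative assumption.

```python
# Sketch of one ECA rule: on "document updated" (Event), if the content
# really changed (Condition), store a new version (Action).
from dataclasses import dataclass, field

@dataclass
class Repository:
    versions: dict = field(default_factory=dict)   # doc_id -> [xml strings]

    def on_update(self, doc_id, new_xml):          # Event
        history = self.versions.setdefault(doc_id, [])
        if not history or history[-1] != new_xml:  # Condition: content changed
            history.append(new_xml)                # Action: record new version

repo = Repository()
repo.on_update("doc1", "<a>v1</a>")
repo.on_update("doc1", "<a>v1</a>")   # no-op: condition fails
repo.on_update("doc1", "<a>v2</a>")
print(len(repo.versions["doc1"]))     # 2
```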
Abstract: We present an analysis of the spatial patterns of generic disease spread simulated by a stochastic long-range-correlation SIR model, in which individuals can be infected at long distances following a power-law distribution. We integrated various tools, namely perimeter, circularity, fractal dimension, and aggregation index, to characterize and investigate spatial pattern formation. Our primary goal was to understand, for a given model of interest, which tool has an advantage over the others and to what extent. We found that perimeter and circularity give information only in the case of strong correlation, while the fractal dimension and aggregation index exhibit the growth rule of pattern formation, depending on the degree of the correlation exponent (β). The aggregation index method is used as an alternative way to describe the degree of the pathogenic ratio (α). This study may provide a useful approach to characterizing and analyzing the pattern formation of epidemic spreading.
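A minimal sketch of one way to realize the long-range infection step on a lattice; the lattice size, rates, and the Pareto draw standing in for a power-law jump length (tail exponent β) are illustrative assumptions, not the paper's exact model.

```python
# Sketch: one step of a stochastic lattice SIR model where an infected
# site infects another at distance r with probability falling off ~ r**(-beta).
import numpy as np

S, I, R = 0, 1, 2

def step(grid, beta=2.0, p0=0.5, recover=0.2, rng=np.random.default_rng()):
    n = grid.shape[0]
    new = grid.copy()
    ys, xs = np.nonzero(grid == I)
    for y, x in zip(ys, xs):
        # long-range contact: Pareto(beta - 1) gives a tail exponent of beta
        r = max(1, int(rng.pareto(beta - 1) + 1))
        ang = rng.uniform(0, 2 * np.pi)
        ty = (y + int(r * np.sin(ang))) % n
        tx = (x + int(r * np.cos(ang))) % n
        if grid[ty, tx] == S and rng.random() < p0:
            new[ty, tx] = I                       # long-range infection
        if rng.random() < recover:
            new[y, x] = R                         # recovery
    return new

grid = np.zeros((100, 100), dtype=int)
grid[50, 50] = I                                  # single initial case
for _ in range(50):
    grid = step(grid)
```

The resulting infected/recovered clusters are what the perimeter, circularity, fractal dimension, and aggregation index tools would then be applied to.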
Abstract: Ant-colony-based routing algorithms are known to guarantee packet delivery, but they suffer from the huge overhead of the control messages needed to discover routes. In this paper we utilize the positions of the network nodes to group the nodes into connected clusters, and we use cluster heads only for forwarding the route discovery control messages. Our simulations show that the new algorithm decreases the overhead dramatically without affecting the delivery rate.
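The clustering and head-election rules are not detailed in the abstract; the sketch below uses a simple grid partition of node positions, with the node nearest each cell centre elected head, as one plausible realization.

```python
# Sketch: group nodes into grid cells by position and elect one cluster
# head per cell; only heads rebroadcast route-discovery control messages.
import numpy as np

def elect_cluster_heads(positions, cell=100.0):
    """positions: (n, 2) array of node coordinates. Returns head indices."""
    cells = np.floor(positions / cell).astype(int)
    heads = {}
    for i, c in enumerate(map(tuple, cells)):
        centre = (np.array(c) + 0.5) * cell
        d = np.linalg.norm(positions[i] - centre)
        if c not in heads or d < heads[c][0]:
            heads[c] = (d, i)        # keep the node nearest the cell centre
    return [i for _, i in heads.values()]

rng = np.random.default_rng(2)
pos = rng.uniform(0, 500, size=(60, 2))
print(elect_cluster_heads(pos))      # only these nodes forward discovery ants
```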
Abstract: In this paper we present a new method for coin identification. The proposed method adopts a hybrid scheme using the Eigenvalues of the covariance matrix, the Circular Hough Transform (CHT), and Bresenham's circle algorithm. The statistical and geometrical properties of the small and large Eigenvalues of the covariance matrix of a set of edge pixels over a connected region of support are explored for the purpose of circular object detection. A sparse matrix technique is used to perform the CHT: since sparse matrices omit zero elements and store only the small number of non-zero elements, they save matrix storage space and computational time. A neighborhood suppression scheme is used to find the valid Hough peaks. The accurate positions of the circumference pixels are identified using a raster scan algorithm which exploits the geometrical symmetry property. After finding the circular objects, the proposed method uses the texture on the surface of the coins, called textons, which are unique properties of coins and refer to the fundamental micro-structures in generic natural images. The method has been tested on several real-world images, including coin and non-coin images. Its performance is also evaluated based on its noise-withstanding capability.
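A minimal sketch of the sparse-accumulator idea: CHT voting for one known radius with a scipy sparse matrix as the accumulator, so untouched bins cost no storage. The edge-pixel input, radius, and angular sampling are illustrative.

```python
# Sketch: Circular Hough Transform accumulation over a sparse matrix.
import numpy as np
from scipy.sparse import lil_matrix

def cht_accumulator(edge_ys, edge_xs, radius, shape, n_angles=90):
    acc = lil_matrix(shape, dtype=np.int32)        # mostly-zero vote array
    thetas = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    for y, x in zip(edge_ys, edge_xs):
        # every edge pixel votes for circle centres at distance `radius`
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < shape[0]) & (cx >= 0) & (cx < shape[1])
        for a, b in zip(cy[ok], cx[ok]):
            acc[a, b] += 1
    return acc   # peaks (after neighborhood suppression) are circle centres
```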
Abstract: Classification is one of the primary themes in computational biology. The accuracy of classification strongly depends on the quality of a dataset, and we need some method to evaluate this quality. In this paper, we propose a new graphical analysis method using the 'Membership-Deviation Graph (MDG)' for analyzing the quality of a dataset. The MDG represents the degree of membership and the deviations for the instances of a class in the dataset. The result of the MDG analysis is used for understanding specific features and for selecting the best features for classification.
Abstract: Current tools for data migration between document-oriented and relational databases have several disadvantages. We propose a new approach for data migration between document-oriented and relational databases. During data migration, the relational schema of the target (relational) database is automatically created from a collection of XML documents. The proposed approach is verified on data migration between the document-oriented database IBM Lotus Notes/Domino and a relational database implemented in the relational database management system (RDBMS) MySQL.
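A minimal sketch of the schema-derivation step: union the leaf-element names found across a collection of XML documents and emit MySQL DDL. A real migration must handle nesting, types, and attributes; the document shapes here are illustrative.

```python
# Sketch: derive a flat relational schema from a collection of XML docs.
import xml.etree.ElementTree as ET

def infer_columns(xml_docs):
    cols = set()
    for doc in xml_docs:
        for el in ET.fromstring(doc).iter():
            if len(el) == 0 and el.text and el.text.strip():
                cols.add(el.tag)                    # leaf element -> column
    return sorted(cols)

docs = ["<note><author>kim</author><body>hi</body></note>",
        "<note><author>lee</author><date>2013-01-01</date></note>"]
ddl = ("CREATE TABLE note (id INT AUTO_INCREMENT PRIMARY KEY, "
       + ", ".join(f"{c} TEXT" for c in infer_columns(docs)) + ");")
print(ddl)   # columns: author, body, date
```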
Abstract: In this work, a novel approach to color image segmentation using higher-order entropy as a textural feature for determining thresholds over a two-dimensional image histogram is discussed. A similar approach is applied to achieve multi-level thresholding in both grayscale and color images. The paper discusses two methods of color image segmentation using RGB space as the standard processing space. The threshold for segmentation is decided by maximizing the conditional entropy in the two-dimensional histogram of the color image separated into the three grayscale images of R, G, and B. The features are first developed independently for the three (R, G, B) planes and then combined to obtain the different color-component segmentations. Considering local maxima instead of the global maximum of conditional entropy yields multiple thresholds for the same image, which forms the basis for multilevel thresholding.
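A sketch of the two-dimensional-histogram idea: each pixel contributes its gray level and its 3x3 local mean to a 2D histogram, and the threshold maximizing the summed block entropies is selected per R/G/B plane. For brevity the threshold pair is searched on the diagonal; the paper's exact objective may differ in detail.

```python
# Sketch: entropy-based threshold from the 2-D histogram of gray level
# vs. 3x3 local mean, applied independently to each color plane.
import numpy as np
from scipy.ndimage import uniform_filter

def entropy_threshold_2d(plane, bins=64):
    """plane: 2-D uint8 grayscale plane (one of R, G, B)."""
    local = uniform_filter(plane.astype(float), size=3)   # 3x3 local mean
    h, _, _ = np.histogram2d(plane.ravel(), local.ravel(),
                             bins=bins, range=[[0, 256], [0, 256]])
    p = h / h.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, bins):
        blocks = (p[:t, :t], p[t:, t:])          # background / foreground
        if any(b.sum() == 0 for b in blocks):
            continue
        hsum = 0.0
        for b in blocks:
            q = b[b > 0] / b.sum()
            hsum -= np.sum(q * np.log(q))        # sum of block entropies
        if hsum > best_h:
            best_t, best_h = t, hsum
    return best_t * (256 // bins)                # back to the 0..255 scale

# Per-plane segmentation of a color image `img` (H x W x 3, uint8):
# mask = np.stack([img[..., c] > entropy_threshold_2d(img[..., c])
#                  for c in range(3)], axis=-1)
```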
Abstract: This paper proposes a technique to protect against email bombing. The technique employs a statistical approach, Naïve Bayes (NB), and neural networks to show that it is possible to differentiate between good and bad traffic and thus protect against email bombing attacks. Neural networks and Naïve Bayes can be trained using many email messages that include both input and output data for legitimate and non-legitimate emails. The input to the model includes the contents of the body of the message, the subject, and the headers. This information is used to determine whether the email is normal or an attack email. Preliminary tests suggest that Naïve Bayes can be trained to produce an accurate response that confirms which emails represent an attack.
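A minimal sketch of the Naïve Bayes side of the technique, using scikit-learn on message text (in practice body, subject, and headers concatenated). The tiny corpus and labels are illustrative only.

```python
# Sketch: train a Naive Bayes filter to separate attack mail from normal mail.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

mails = ["WIN cash now click here", "meeting moved to 3pm",
         "free offer limited click", "lunch tomorrow?"]
labels = [1, 0, 1, 0]                      # 1 = attack, 0 = legitimate

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(mails, labels)
print(clf.predict(["click to win a free offer"]))   # -> [1]
```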
Abstract: Linear discriminant analysis (LDA) can conveniently be generalized into a nonlinear form, kernel LDA (KLDA), by using kernel functions. But KLDA often leads to a generalized eigenvalue problem that is singular. To avoid this complication, this paper proposes an iterative algorithm for two-class KLDA. The proposed KLDA is used as a nonlinear discriminant classifier, and experiments show that it has performance comparable to SVM.
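For reference, a minimal sketch of the standard (non-iterative) two-class kernel Fisher discriminant that the paper's iterative algorithm is designed to avoid; a ridge term stands in for the singularity handling, and the RBF kernel choice is an assumption.

```python
# Sketch: direct two-class kernel Fisher discriminant (not the paper's
# iterative scheme), regularized so the solve stays well posed.
import numpy as np

def rbf(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row-sample arrays A and B."""
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def klda_fit(X, y, gamma=1.0, lam=1e-3):
    """y must contain class labels 0 and 1. Returns expansion coefficients."""
    K = rbf(X, X, gamma)
    M, N = [], np.zeros((len(y), len(y)))
    for c in (0, 1):
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        M.append(Kc.mean(axis=1))              # class mean in kernel space
        N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
    # lam * I is the ridge term keeping the (possibly singular) solve stable
    return np.linalg.solve(N + lam * np.eye(len(y)), M[0] - M[1])

def klda_project(alpha, X_train, X_new, gamma=1.0):
    """1-D discriminant score; threshold it (e.g. at the midpoint of the
    projected class means) to classify."""
    return rbf(X_new, X_train, gamma) @ alpha
```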
Abstract: The histogram plays an important statistical role in digital image processing. However, existing quantum image models are ill-suited to this kind of statistical image processing because different gray scales are not distinguishable. In this paper, a novel quantum image representation model is first proposed, in which pixels with different gray scales can be distinguished and operated on simultaneously. Based on the new model, a fast quantum algorithm for constructing the histogram of a quantum image is designed. Performance comparison reveals that the new quantum algorithm achieves an approximately quadratic speedup over its classical counterpart. The proposed quantum model and algorithm are significant for future research in quantum image processing.
Abstract: This paper proposes a view-point-insensitive human pose recognition system using a neural network. The recognition system consists of a silhouette-image capturing module, a data-driven database, and a neural network. The advantages of our system are as follows. First, it is possible to capture multiple-view-point silhouette images of a 3D human model automatically; this automatic capture module helps reduce the time-consuming task of database construction. Second, we develop a huge feature database to offer view-point insensitivity in pose recognition. Third, we use a neural network to recognize human poses from multiple views, because every pose from each model has similar feature patterns even though each model has a different appearance and view-point. To construct the database, we create 3D human models using 3D modeling tools. The contour shape is used to convert each silhouette image into a 12-dimensional feature vector. This extraction task is processed semi-automatically, which has the benefit that capturing images in a real environment and converting them to silhouette images is unnecessary. We demonstrate the effectiveness of our approach with experiments in a virtual environment.
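The abstract does not specify the contour descriptor, so the sketch below uses a common radial signature as one plausible way to turn a silhouette into a 12-dimensional contour feature: the centroid-to-boundary distance sampled at 12 equally spaced angles.

```python
# Sketch: a 12-dimensional contour feature from a binary silhouette,
# via the maximum centroid-to-pixel distance in 12 angular sectors.
import numpy as np

def radial_signature(mask, n=12):
    """mask: 2-D boolean silhouette. Returns n normalized radial distances."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()                 # silhouette centroid
    ang = np.arctan2(ys - cy, xs - cx)
    rad = np.hypot(ys - cy, xs - cx)
    bins = ((ang + np.pi) / (2 * np.pi) * n).astype(int) % n
    feat = np.zeros(n)
    for b in range(n):
        sel = rad[bins == b]
        feat[b] = sel.max() if sel.size else 0.0  # farthest boundary point
    return feat / (feat.max() or 1.0)             # scale invariance
```

Feature vectors like this, computed from many view-points of many models, would populate the database that the neural network is trained on.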