Abstract: The lack of an inherent “natural” dissimilarity measure between objects in a categorical dataset presents special difficulties in cluster analysis. However, each categorical attribute of a given dataset provides a natural probability distribution and, hence, information in the sense of Shannon. In this paper, we propose a novel method that heuristically converts categorical attributes to numerical values by exploiting this associated information. We conduct an experimental study with a real-life categorical dataset. The experiment demonstrates the effectiveness of our approach.
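The abstract does not specify the conversion rule; a minimal sketch of one plausible Shannon-information encoding, in which each category is replaced by its self-information estimated from the data (an assumption, not the paper's stated method), could look like this:

```python
import math
from collections import Counter

def information_encode(column):
    """Replace each category by its Shannon self-information,
    -log2 p(c), estimated from the empirical frequencies.
    (Illustrative sketch; the paper's exact mapping may differ.)"""
    n = len(column)
    freq = Counter(column)
    return [-math.log2(freq[c] / n) for c in column]

# Rare categories receive larger numerical values than common ones.
values = information_encode(["red", "red", "red", "blue"])
```

Under this encoding, "blue" (probability 1/4) maps to 2 bits while "red" (probability 3/4) maps to about 0.415 bits, so rarer categories stand out numerically.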
Abstract: An alternative approach to the use of the Discrete Fourier Transform (DFT) for Magnetic Resonance Imaging (MRI) reconstruction is the use of parametric modeling techniques. This method is suitable for problems in which the image can be modeled by explicitly known source functions with a few adjustable parameters. Despite the success reported in the use of modeling techniques as an alternative MRI reconstruction approach, two important problems pose challenges to the applicability of this method: estimation of the model order and determination of the model coefficients. In this paper, five of the suggested methods for evaluating the model order are assessed: the Final Prediction Error (FPE), the Akaike Information Criterion (AIC), Residual Variance (RV), Minimum Description Length (MDL), and the Hannan-Quinn (HNQ) criterion. These criteria were evaluated on MRI data sets using the Transient Error Reconstruction Algorithm (TERA). The result for each criterion is compared to the result obtained with a fixed-order technique, and three measures of similarity were evaluated. The results show that MDL gives the highest measure of similarity to that obtained by the fixed-order technique.
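The five criteria have standard textbook forms; a sketch of how they might be computed from the residual variance of a fitted model (the paper's exact variants may differ slightly):

```python
import math

def model_order_criteria(residual_variance, n_samples, order):
    """Classic model-order selection criteria in their common
    textbook forms (the paper's exact variants may differ)."""
    V, N, p = residual_variance, n_samples, order
    return {
        "FPE": V * (N + p) / (N - p),
        "AIC": N * math.log(V) + 2 * p,
        "MDL": N * math.log(V) + p * math.log(N),
        "HNQ": N * math.log(V) + 2 * p * math.log(math.log(N)),
    }

# In practice, fit models of several candidate orders and pick the
# order that minimizes the chosen criterion.
```

MDL's stronger `p * log(N)` penalty tends to select lower orders than AIC on large data sets, which is consistent with its role as the most parsimonious of the criteria compared here.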
Abstract: In recent years, numerous Human-Computer Interaction applications have exploited the capabilities of Time-of-Flight (ToF) cameras to achieve increasingly comfortable and precise interaction. Gesture recognition, in particular, is one of the most active fields. This work presents a new method for interacting with a virtual object in a 3D space. Our approach is based on the fusion of depth data, supplied by a ToF camera, with color information, supplied by an HD webcam. The hand detection procedure does not require any learning phase and can concurrently manage gestures of two hands. The system is robust to the presence of other objects or people in the scene, thanks to the use of a Kalman filter to maintain tracking of the hands.
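The abstract names the Kalman filter but gives no parameters; a generic constant-velocity sketch for tracking one hand's (x, y) position, with illustrative noise settings, might be:

```python
import numpy as np

class HandTracker2D:
    """Constant-velocity Kalman filter over (x, y) hand positions.
    A generic sketch; the paper's noise settings are not given,
    so q and r here are illustrative."""
    def __init__(self, q=1e-2, r=1.0):
        self.x = np.zeros(4)                       # [px, py, vx, vy]
        self.P = np.eye(4) * 1e3                   # large initial uncertainty
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = 1.0          # position += velocity
        self.H = np.zeros((2, 4))
        self.H[0, 0] = self.H[1, 1] = 1.0          # we observe position only
        self.Q = np.eye(4) * q
        self.R = np.eye(2) * r

    def step(self, z):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with measurement z = (px, py)
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z, dtype=float) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

Feeding the filter one detection per frame yields a smoothed trajectory, and the predict step lets the tracker bridge frames in which a hand is briefly occluded by other objects or people.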
Abstract: Network on a chip (NoC) has been proposed as a viable solution to counter the inefficiency of buses in current VLSI on-chip interconnects. However, as silicon chips accommodate more transistors, the probability of transient faults increases, making fault tolerance a key concern in scaling chips. In packet-based on-chip communication, transient failures can corrupt data packets and hence undermine the accuracy of data communication. In this paper, we present a comparative analysis of transient fault-tolerant techniques, including end-to-end, node-by-node, and stochastic communication based on the flooding principle.
Abstract: The rapid expansion of the web is causing the constant growth of information, leading to several problems, such as the increased difficulty of extracting potentially useful knowledge. Web content mining confronts this problem by gathering explicit information from different web sites for access and knowledge discovery. Query interfaces of web databases share common building blocks. After extracting information with a parsing approach, we use a new data mining algorithm to match a large number of database schemas at a time. Using this algorithm increases the speed of information matching. In addition, instead of simple 1:1 matching, it performs complex (m:n) matching between query interfaces. In this paper we present a novel correlation mining algorithm that matches correlated attributes at lower cost. This algorithm uses the Jaccard measure to distinguish positively and negatively correlated attributes. After that, the system matches the user query against different query interfaces in a specific domain and finally chooses the query interface closest to the user query to answer it.
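The Jaccard measure itself is standard; a sketch of how it could separate positively and negatively correlated attributes by their co-occurrence across query interfaces (the thresholds here are illustrative, not the paper's):

```python
def jaccard(occ_a, occ_b):
    """Jaccard similarity between the sets of query interfaces in
    which two attributes occur."""
    a, b = set(occ_a), set(occ_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def correlation_kind(occ_a, occ_b, pos_threshold=0.5, neg_threshold=0.1):
    """Classify an attribute pair by co-occurrence. The thresholds
    are illustrative; the paper does not state its values."""
    j = jaccard(occ_a, occ_b)
    if j >= pos_threshold:
        return "positive"      # attributes that tend to appear together
    if j <= neg_threshold:
        return "negative"      # rarely co-occur, e.g. synonym candidates
    return "uncorrelated"
```

Negatively correlated attributes (those that almost never appear in the same interface) are natural candidates for m:n groupings, since interfaces typically use one of several synonymous attribute names.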
Abstract: Wireless Sensor Networks consist of small battery-powered devices with limited energy resources. Once deployed, the small sensor nodes are usually inaccessible to the user, and thus replacement of the energy source is not feasible. Hence, one of the most important issues that needs to be addressed in order to improve the life span of the network is energy efficiency. To overcome this limitation, much research has been done, and clustering is one of the representative approaches: cluster heads gather data from nodes and send them to the base station. In this paper, we introduce a dynamic clustering algorithm using a genetic algorithm. This algorithm takes different parameters into consideration to increase the network lifetime. To prove the efficiency of the proposed algorithm, we simulated it and compared it with the LEACH algorithm using MATLAB.
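The abstract does not detail the chromosome encoding or fitness function; one common formulation, sketched here with illustrative parameters, encodes cluster-head selection as a bitstring over the sensor nodes:

```python
import random

def fitness(chromosome, positions, base=(50.0, 50.0)):
    """Illustrative fitness: reward a small number of cluster heads
    that lie, on average, close to the base station. The weights and
    base-station location are assumptions, not the paper's values."""
    heads = [positions[i] for i, g in enumerate(chromosome) if g]
    if not heads:
        return 0.0
    mean_d = sum(((x - base[0]) ** 2 + (y - base[1]) ** 2) ** 0.5
                 for x, y in heads) / len(heads)
    return 1.0 / (1.0 + mean_d + 2.0 * len(heads))

def evolve(positions, pop_size=20, generations=50, seed=0):
    """Elitist GA with one-point crossover and point mutation."""
    rng = random.Random(seed)
    n = len(positions)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, positions), reverse=True)
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]
            child[rng.randrange(n)] ^= 1       # point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda c: fitness(c, positions))
```

In a full simulation, the fitness would also incorporate the residual energy of candidate heads and intra-cluster distances, which is what allows the GA to outperform LEACH's purely probabilistic head rotation.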
Abstract: This paper presents a general trainable framework for fast and robust upright human face and non-human object detection and verification in static images. To enhance the performance of the detection process, the technique we develop is based on the combination of a fast neural network (FNN) and a classical neural network (CNN). In the FNN, a useful correlation between the input image and the weights of the hidden neurons is exploited to sustain a high level of detection accuracy. This enables the use of the Fourier transform, which significantly speeds up detection time. The CNN is responsible for verifying the face region. A bootstrap algorithm is used to collect non-human objects, feeding false detections back into the training process for both human and non-human objects. Experimental results on test images with both simple and complex backgrounds demonstrate that the proposed method achieves a high detection rate and a low false positive rate in detecting both human faces and non-human objects.
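The speed-up the abstract attributes to the Fourier transform rests on a standard identity: the sliding dot product of a neuron's weights with every image window is a cross-correlation, computable as a single frequency-domain multiplication. A 1D sketch of that identity:

```python
import numpy as np

def correlate_fft(image_row, weights):
    """Valid-mode cross-correlation computed in the Fourier domain.
    The sliding dot product of hidden-neuron weights with every
    sub-window collapses to one multiplication in frequency space,
    which is the core trick behind fast neural networks."""
    image_row = np.asarray(image_row, dtype=float)
    weights = np.asarray(weights, dtype=float)
    n = len(image_row) + len(weights) - 1
    # Cross-correlation = convolution with the reversed kernel.
    spec = np.fft.rfft(image_row, n) * np.fft.rfft(weights[::-1], n)
    full = np.fft.irfft(spec, n)
    start = len(weights) - 1
    return full[start:start + len(image_row) - len(weights) + 1]
```

For an image of length N and a window of length M, this replaces O(N*M) sliding products with O(N log N) work, and the same identity extends to 2D image windows.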
Abstract: The coloured Petri net (CPN) has been widely adopted in various areas of Computer Science, including protocol specification, performance evaluation, distributed systems, and coordination in multi-agent systems. It provides a graphical representation of a system and has a strong mathematical foundation for proving various properties. This paper proposes a novel representation of a coloured Petri net using an extension of logic programming called abductive logic programming (ALP), which is purely based on classical logic. Under such a representation, an implementation of a CPN can be directly obtained, in which every inference step can be treated as a kind of equivalence-preserving transformation. We describe how to implement a CPN under such a representation using common meta-programming techniques in Prolog. We call our framework CPN-LP and illustrate its application in modeling an intelligent agent.
Abstract: Extracting thematic (semantic) roles is one of the
major steps in representing text meaning. It refers to finding the
semantic relations between a predicate and syntactic constituents in a
sentence. In this paper we present a rule-based approach to extract
semantic roles from Persian sentences. The system exploits a two-phase architecture to (1) identify the arguments and (2) label them for each predicate. For the first phase, we developed a rule-based shallow parser to chunk Persian sentences, and for the second phase, we developed a knowledge-based system to assign 16 selected thematic roles to the chunks. The experimental results of testing each phase are presented at the end of the paper.
Abstract: A new automatic system for the recognition and reconstruction of rescaled and/or rotated, partially occluded objects is presented. The objects to be recognized are described by 2D views, and each view is occluded by several half-planes. The whole object views and their visible parts (linear cuts) are then stored in a database. To establish whether a region R of an input image represents a possibly occluded object, the system generates a set of linear cuts of R and compares them with the elements in the database. Each linear cut of R is associated with the most similar database linear cut. R is recognized as an instance of the object O if the majority of the linear cuts of R are associated with a linear cut of views of O. In the case of recognition, the system reconstructs the occluded part of R and determines the scale factor and the orientation in the image plane of the recognized object view. The system has been tested on two different datasets of objects, showing good performance in terms of both recognition and reconstruction accuracy.
Abstract: A new Markovianity approach is introduced in this paper. This approach reduces the response time of the classic Markov Random Field approach. First, one region is determined by a clustering technique. Then, this region is excluded from the study. The remaining pixels form the study zone and are selected for a Markovian segmentation task. With the Selective Markovianity approach, the segmentation process is faster than the classic one.
Abstract: When a small H/W IP is designed, we can develop an appropriate verification environment by observing the simulated signal waves or by using serial test vectors for the fixed output. In the case of design and verification of a massive parallel processor with multiple IPs, it is difficult to build a verification system with the existing common verification environment, and to verify each partial IP. The TestDrive verification environment can build an easy and reliable verification system that produces highly intuitive results by applying ModelSim and SystemVerilog's DPI. It shows many advantages; for example, a high-level design of a GPGPU processor can be migrated to an FPGA board immediately.
Abstract: Resource discovery is one of the chief services of a grid. This article proposes a new approach to discovering resources in a grid through learning automata. The objective of this resource-discovery service is to select a resource based on the user's applications and economic criteria, that is, to choose a provider that can accomplish the user's tasks in the most economical manner. The service operates in two phases. First, we propose an application-based categorization by means of an artificial neural network: the user presents his or her application as the input vector of the network, and the output vector describes the suitability of each resource for the given task. Second, the most economical of the options put forward in the previous stage that can fulfill the task is picked out. The resource choice is carried out by means of the presented algorithm based on learning automata.
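The abstract does not say which learning automata scheme is used for the final resource choice; the classic linear reward-inaction (L_RI) update, shown here purely as an illustration, is a common choice:

```python
def lri_update(probs, chosen, rewarded, a=0.1):
    """Linear reward-inaction (L_RI) update for a learning automaton:
    on reward, shift probability mass toward the chosen action; on
    penalty, leave the probability vector unchanged. The learning
    rate a is illustrative; the paper does not state its scheme."""
    if not rewarded:
        return list(probs)
    return [p + a * (1.0 - p) if i == chosen else p * (1.0 - a)
            for i, p in enumerate(probs)]
```

Repeatedly rewarding the automaton whenever a selected resource completes the user's task economically makes the probability of choosing that resource converge toward 1.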
Abstract: The application of Information Technology (IT) has revolutionized the functioning of business all over the world. Its impact has been felt most strongly among information-dependent industries, and tourism is one such industry. The conceptual framework in this study represents an innovative travel-information searching system on mobile devices, used as a tool to deliver travel information (such as hotels, restaurants, tourist attractions, and souvenir shops) to each user: travelers are segmented with a data mining technique that identifies tourists' behavior patterns, which are then matched with tourism products and services. The system is designed for incremental knowledge learning. It is a marketing strategy to support businesses in responding to travelers' demands effectively.
Abstract: In this paper, phase-control antenna array synthesis is presented. The problem is formulated as a constrained optimization problem that imposes nulls at prescribed levels while maintaining the sidelobes at a prescribed level. For efficient use of algorithm memory compared to the well-known Particle Swarm Optimization (PSO), Accelerated Particle Swarm Optimization (APSO) is used to estimate the phase parameters of the synthesized array. The objective function is formed from a main objective and a set of constraints with penalty factors that measure each feasible solution's violation of each constraint in the search space. In this case, the obtained feasible solution is guaranteed to satisfy all the constraints. Simulation results show significant performance gains and decreased randomness in the parameter search space compared with a single-objective conventional particle swarm optimization.
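Both the penalty formulation and APSO are standard techniques; a sketch combining a penalized objective with Yang's accelerated PSO update (all parameters here are illustrative, not the paper's):

```python
import random

def penalized_objective(main, constraints, penalties, x):
    """Main objective plus quadratic penalty terms for violated
    constraints, where g(x) <= 0 counts as satisfied. The penalty
    weights are illustrative."""
    value = main(x)
    for g, w in zip(constraints, penalties):
        value += w * max(0.0, g(x)) ** 2
    return value

def apso_minimize(f, dim, n=25, iters=200, alpha=0.2, beta=0.5, seed=0):
    """Accelerated PSO (Yang's variant): each particle moves toward
    the global best plus a shrinking random perturbation; no
    per-particle velocity or personal best is stored, which is the
    memory saving the abstract refers to."""
    rng = random.Random(seed)
    swarm = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n)]
    best = min(swarm, key=f)
    for t in range(iters):
        a = alpha * 0.95 ** t              # decaying step size
        for p in swarm:
            for d in range(dim):
                p[d] = (1 - beta) * p[d] + beta * best[d] \
                       + a * rng.gauss(0, 1)
        best = min(swarm + [best], key=f)
    return best
```

For the array-synthesis problem, `main` would measure sidelobe level and each `g` the deviation of a prescribed null from its target depth; here a toy constrained quadratic stands in for them.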
Abstract: Methods for organizing web data into groups in order to analyze web-based hypertext data and facilitate data availability are very important given the number of documents available online. Thus, the task of clustering web-based document structures has many applications, e.g., improving information retrieval on the web, better understanding of user navigation behavior, improving the servicing of web users' requests, and increasing web information accessibility. In this paper we investigate a new approach for clustering web-based hypertexts on the basis of their graph structures. The hypertexts are represented as so-called generalized trees, which are more general than the usual directed rooted trees, e.g., DOM trees. As an important preprocessing step, we measure the structural similarity between the generalized trees on the basis of a similarity measure d. Then we apply agglomerative clustering to the obtained similarity matrix in order to create clusters of hypertext graph patterns representing navigation structures. In the present paper we run our approach on a data set of hypertext structures and obtain good results in Web Structure Mining. Furthermore, we outline the application of our approach in Web Usage Mining as future work.
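Given the matrix produced by the similarity measure d, the agglomerative step is standard; a single-linkage sketch on a precomputed distance matrix (the linkage choice is an assumption, as the abstract does not state it):

```python
def agglomerative(dist, k):
    """Single-linkage agglomerative clustering on a precomputed
    distance matrix, merging the closest pair of clusters until k
    remain. In the paper's setting, `dist` would be derived from
    the structural similarity measure d between generalized trees."""
    clusters = [{i} for i in range(len(dist))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist[a][b] for a in clusters[i]
                        for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] |= clusters[j]         # merge the closest pair
        del clusters[j]
    return clusters
```

Each resulting cluster collects hypertext graph patterns whose navigation structures are mutually similar under d.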
Abstract: A big organization may have multiple branches spread across different locations. Processing data from these branches becomes a huge task when innumerable transactions take place. Branches may also be reluctant to forward their raw data for centralized processing but be ready to pass on their association rules. Local mining may, in turn, generate a large number of rules. Further, it is not practically possible for all local data sources to be of the same size. A model is proposed for discovering valid rules from differently sized data sources, where the valid rules are high-weighted rules. These rules can be obtained from the high-frequency rules generated at each of the data sources. A data-source selection procedure is employed in order to synthesize rules efficiently. Support equalization is another proposed method, which focuses on eliminating low-frequency rules at the local sites themselves, thus reducing the number of rules significantly.
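The abstract does not give the synthesis formula; one plausible sketch weights each rule's support by the relative size of the sources that report it and keeps only rules reported often enough (both choices are assumptions, not the paper's definitions):

```python
def synthesize(rule_sets, sizes, min_freq=0.5):
    """Synthesize high-weighted rules from differently sized sources:
    keep rules reported by at least a `min_freq` fraction of the
    sources, then weight each rule's support by the relative size of
    every source that reports it. Illustrative sketch only."""
    total = sum(sizes)
    counts, weighted = {}, {}
    for rules, size in zip(rule_sets, sizes):
        for rule, support in rules.items():
            counts[rule] = counts.get(rule, 0) + 1
            weighted[rule] = weighted.get(rule, 0.0) + support * size / total
    n = len(rule_sets)
    return {r: w for r, w in weighted.items() if counts[r] / n >= min_freq}
```

The size weighting keeps a small branch from dominating the synthesized rule set, while the frequency cutoff plays the role of discarding rules that are only locally high-frequency.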
Abstract: Ethnicity identification from face images is of interest in many areas of application, but existing methods are few and limited. This paper presents a fusion scheme that uses block-based uniform local binary patterns and the Haar wavelet transform to combine local and global features. In particular, the LL subband coefficients of the whole face are fused with the histograms of uniform local binary patterns from block partitions of the face. We applied principal component analysis to the fused features and reduced the dimensionality of the feature space from 536 down to around 15 without sacrificing much accuracy. We have conducted a number of preliminary experiments using a collection of 746 subject face images. The test results show good accuracy and demonstrate the potential of fusing global and local features. The fusion approach is robust, making it easy to further improve the identification at both the feature and score levels.
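The PCA step the abstract describes is standard; a sketch of reducing fused feature vectors from 536 to 15 dimensions via the SVD of the centered data, run here on synthetic stand-in data:

```python
import numpy as np

def pca_reduce(X, k=15):
    """Project feature vectors onto the top-k principal components.
    k=15 mirrors the reduction from 536 dimensions reported in the
    abstract; the data below is synthetic, not real face features."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data: rows of Vt are the principal axes,
    # ordered by explained variance.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

rng = np.random.default_rng(0)
features = rng.normal(size=(50, 536))    # stand-in for fused LBP+LL features
reduced = pca_reduce(features, k=15)
```

In the paper's pipeline, the projection matrix would be learned on training images and then applied to the fused LBP histograms and LL coefficients of each test face before classification.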
Abstract: A logic model for analyzing the stability of complex systems is very useful in many areas of science. In the real world, we draw insight from natural phenomena such as the “biosphere”, the “food chain”, and “ecological balance”. Through research and practice, and taking advantage of the orthogonality and symmetry defined by the theory of multilateral matrices, we put forward a logic analysis model of the stability of complex systems with three relations, and prove it by means of mathematics. This logic model is usually successful in analyzing the stability of a complex system. The structure of the logic model is not only clear and simple, but can also be easily used to study and solve many stability problems of complex systems. As an application, some examples are given.
Abstract: A potentially serious problem with current payment systems is that the underlying hard problems from number theory may be solved by either a quantum computer or unanticipated future advances in algorithms and hardware. A new quantum payment system is proposed in this paper. The suggested system makes use of fundamental principles of quantum mechanics to ensure unconditional security without prior arrangements between customers and vendors. More specifically, the new system uses Greenberger-Horne-Zeilinger (GHZ) states and Quantum Key Distribution to authenticate the vendors and guarantee transaction integrity.