Abstract: Wireless mobile communications have experienced
phenomenal growth over the last decades. Advances in
wireless mobile technologies have brought about a demand for high-quality
multimedia applications and services. For such applications
and services to work, a signaling protocol is required for establishing,
maintaining, and tearing down multimedia sessions. The Session
Initiation Protocol (SIP) is an application-layer signaling protocol
based on a request/response transaction model. This paper considers
the SIP INVITE transaction over an unreliable medium, since it has
recently been modified in Request for Comments (RFC) 6026. To
help ensure the functional correctness of this modification,
the SIP INVITE transaction is modeled and analyzed using
Colored Petri Nets (CPNs). Based on the model analysis, it is
concluded that the SIP INVITE transaction is free of livelocks and
dead code, while at the same time it has both desirable and
undesirable deadlocks. Therefore, the SIP INVITE transaction should be
subjected to additional updates in order to eliminate the undesirable
deadlocks. To reduce the cost of implementing and
maintaining SIP, additional remodeling of the SIP INVITE
transaction is also recommended.
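The behaviour analyzed here centres on the INVITE client transaction state machine, to which RFC 6026 added the Accepted state. A minimal sketch follows (events and timer handling are heavily simplified; this illustrates the state structure only, not the paper's CPN model):

```python
# Minimal sketch of the RFC 6026 INVITE client transaction state machine.
# Events are simplified response classes; timer behaviour is omitted.
TRANSITIONS = {
    ("Calling", "1xx"): "Proceeding",
    ("Calling", "2xx"): "Accepted",        # Accepted state added by RFC 6026
    ("Calling", "300-699"): "Completed",
    ("Proceeding", "1xx"): "Proceeding",
    ("Proceeding", "2xx"): "Accepted",
    ("Proceeding", "300-699"): "Completed",
    ("Accepted", "2xx"): "Accepted",       # retransmitted 2xx is absorbed
    ("Accepted", "timer_M"): "Terminated",
    ("Completed", "300-699"): "Completed",
    ("Completed", "timer_D"): "Terminated",
}

def step(state, event):
    """Advance the transaction; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "Calling"
for event in ("1xx", "2xx", "2xx", "timer_M"):
    state = step(state, event)
# state is now "Terminated"
```

The Accepted state is the key RFC 6026 change: retransmitted 2xx responses are absorbed by the transaction rather than being treated as erroneous, which is exactly the behaviour whose correctness the CPN analysis targets.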
Abstract: This paper establishes that multi-state network
classification is essential for enhancing the performance of
transport protocols over satellite-based networks. A
model that classifies the multi-state network condition, taking into
consideration both congestion and channel error, is developed. To
arrive at this model, an analysis of the impact of congestion and
channel error on RTT values was carried out using ns2. The
analysis results are also reported in the paper. The inference drawn
from this analysis is used to develop a novel statistical RTT-based
model for multi-state network classification.
An Adaptive Multi State Proactive Transport Protocol, consisting
of Proactive Slow Start, State-based Error Recovery, Timeout Action,
and Proactive Reduction, is proposed, which uses the multi-state
network classification model. This paper also confirms, through
detailed simulation and analysis, that prior knowledge of the
overall characteristics of the network helps enhance the
performance of the protocol over a satellite channel, which is
significantly affected by channel noise and congestion.
The ns2 simulator was augmented as necessary to simulate
the multi-state network classification logic. This
simulation has been used for a detailed evaluation of the protocol under
varied levels of congestion and channel noise. The performance
enhancement of this protocol with respect to established protocols,
namely TCP SACK and Vegas, is discussed. The results
discussed in this paper clearly reveal that the proposed protocol
consistently outperforms its peers and shows a significant improvement
under very high error conditions, as envisaged in the design of the protocol.
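The abstract does not give the statistical model itself, but the underlying idea, that congestion inflates RTT while channel error produces loss without a matching RTT increase, can be illustrated with a toy classifier (the function name, thresholds, and state labels here are hypothetical assumptions, not the paper's model):

```python
import statistics

def classify_state(rtt_samples, baseline_rtt, loss_rate):
    """Illustrative multi-state classifier (thresholds are hypothetical).

    Congestion inflates RTT relative to the baseline; channel error
    shows up as loss without a corresponding RTT increase.
    """
    mean_rtt = statistics.mean(rtt_samples)
    congested = mean_rtt > 1.5 * baseline_rtt
    if congested and loss_rate > 0.01:
        return "congestion+error"
    if congested:
        return "congestion"
    if loss_rate > 0.01:
        return "channel_error"
    return "normal"

classify_state([520, 540, 510], baseline_rtt=300, loss_rate=0.0)   # "congestion"
classify_state([300, 310, 295], baseline_rtt=300, loss_rate=0.05)  # "channel_error"
```

A proactive protocol can then condition its recovery action on the returned state, e.g. reducing the window only for congestion-related states.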
Abstract: Clustering large populations is an important problem
when the data contain noise and clusters of different shapes. A good
clustering algorithm or approach should be efficient enough to detect
clusters sensitively. Besides space complexity, time complexity also
gains importance as the data size grows. Using hierarchies, we developed
a new algorithm that splits attributes according to the values they
take, choosing the dimension for splitting so as to divide the database
into roughly equal parts as far as possible. At each node we
calculate certain descriptive statistical features of the data
residing there, and by pruning we generate the natural clusters with a
complexity of O(n).
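A minimal sketch of the splitting idea: pick the dimension whose median split divides the data most evenly, and store descriptive statistics at each node for later pruning (the function names and the particular statistics chosen are illustrative assumptions, not the paper's algorithm):

```python
import statistics

def best_split_dimension(points):
    """Pick the dimension whose median split divides the data most evenly."""
    n, dims = len(points), len(points[0])
    best_dim, best_balance = 0, n + 1
    for d in range(dims):
        med = statistics.median(p[d] for p in points)
        left = sum(1 for p in points if p[d] <= med)
        balance = abs(left - (n - left))   # 0 means a perfect half split
        if balance < best_balance:
            best_dim, best_balance = d, balance
    return best_dim

def node_stats(points, dim):
    """Descriptive statistics stored at a tree node for later pruning."""
    vals = [p[dim] for p in points]
    return {"count": len(vals),
            "mean": statistics.fmean(vals),
            "stdev": statistics.pstdev(vals)}
```

Because each point is examined a bounded number of times on its way down the tree, this style of construction is compatible with the O(n) complexity claimed above.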
Abstract: With major technological advances, and in order to reduce the
cost of training apprentices for real-time critical systems, it became
necessary to develop Intelligent Tutoring Systems for
training apprentices in these systems. Such systems generally have
interactive features so that learning is actually more efficient,
making the learner more familiar with the mechanism in question. In
the initial stage of learning, tests are performed to measure the
student's performance and their use of the system. The aim of this
paper is to present a framework for modeling an Intelligent Tutoring
System using the UML language. In the various steps of the analysis,
the diagrams required to build a general model are considered, the
purpose of which is to present the different perspectives of its development.
Abstract: This paper presents the decoder design for the single-error-correcting and double-error-detecting code proposed by the authors in an earlier paper. The speed of error detection and correction of a code is largely dependent upon the associated encoder and decoder circuits. The complexity and the speed of such circuits are determined by the number of 1's in the parity check matrix (PCM). The number of 1's in the parity check matrix of the code proposed by the authors is smaller than in any currently known single-error-correcting/double-error-detecting code. This results in simplified encoding and decoding circuitry for error detection and correction.
Abstract: This paper proposes a low-power SRAM based on a
five-transistor SRAM cell. The proposed SRAM uses a novel word-line
decoding scheme such that, during a read/write operation, only the selected
cell is connected to the bit-line, whereas in a conventional SRAM (CV-SRAM)
all cells in the selected row are connected to their bit-lines, which
develops differential voltages across all bit-lines and wastes
energy on the unselected ones. In the proposed SRAM, the
memory array is divided into two halves, which reduces the data-line
capacitance. The proposed SRAM also uses a single bit-line and
thus has lower bit-line leakage than the CV-SRAM.
Furthermore, the proposed SRAM incurs no area overhead and has
read/write performance comparable to the CV-SRAM.
Simulation results in a standard 0.25 μm CMOS technology show that, in the
worst case, the proposed SRAM has 80% lower dynamic energy
consumption per cycle than the CV-SRAM. In addition, the per-cycle energy
consumption of the proposed SRAM and the CV-SRAM was
investigated analytically, and the results are in good agreement
with the simulation results.
Abstract: This paper describes the optimization of a complex
dairy farm simulation model using two quite different methods of
optimization, the Genetic Algorithm (GA) and the Lipschitz
Branch-and-Bound (LBB) algorithm. These techniques have been
used to improve an agricultural system model developed by Dexcel
Limited, New Zealand, which describes a detailed representation of
pastoral dairying scenarios and contains an 8-dimensional parameter
space. The model incorporates the sub-models of pasture growth and
animal metabolism, which are themselves complex in many cases.
Each evaluation of the objective function, a composite 'Farm
Performance Index (FPI)', requires simulation of at least a one-year
period of farm operation with a daily time-step, and is therefore
computationally expensive. The problem of visualization of the
objective function (response surface) in high-dimensional spaces is
also considered in the context of the farm optimization problem.
Adaptations of the Sammon mapping and parallel-coordinates
visualization are described which help visualize some important
properties of the model's output topography. From this study, it is
found that the GA requires fewer function evaluations in optimization
than the LBB algorithm.
Abstract: In this paper, we develop a spatio-temporal graph as
a key component of our knowledge representation scheme. We
design an integrated representation scheme to depict not only the present
and the past but also the future, in parallel with the spaces, in an effective and
intuitive manner. The resulting multi-dimensional comprehensive
knowledge structure accommodates a multi-layered virtual world
developing in time, maximizing the diversity of situations in their
historical context. This knowledge representation scheme is to be used
as the basis for simulating the situations composing the virtual world
and for implementing the virtual agents' knowledge used to judge and
evaluate those situations. To provide natural contexts
for situated learning or simulation games, the virtual stage set by this
spatio-temporal graph is to be populated by interrelated and changing
agents and other objects, which are abstracted in the ontology.
Abstract: This paper covers various aspects of Internet film
piracy. In order to deal with this matter successfully, it is necessary to
recognize and explain the various motivational factors related to film
piracy. Thus, this study proposes groups of economic, socio-psychological,
and other factors that could motivate individuals
to engage in piracy. The paper also studies the interactions
between downloaders and uploaders and discusses the causality
between the motivational factors and their effects on the film industry.
Moreover, the study also focuses on a proposed scheme of relations
between movie downloading and its possible effect on box-office
revenues.
Abstract: Particle detection in very noisy and low-contrast images
is an active field of research in image processing. In this article, a
method is proposed for the efficient detection and sizing of subsurface
spherical particles, which is used for the processing of softly fused
Au nanoparticles. Transmission Electron Microscopy (TEM) is used for
imaging the nanoparticles, and the proposed algorithm has been
tested with the resulting two-dimensional projected TEM images.
Results are compared with data obtained by transmission optical
spectroscopy, as well as with conventional circular object detection
algorithms.
Abstract: Evolutionary Programming (EP) represents a
methodology of Evolutionary Algorithms (EA) in which mutation is
the main reproduction operator. This paper presents a
novel EP approach to Artificial Neural Network (ANN) learning.
The proposed strategy consists of two components: a self-adaptive
component, which contains phenotype information, and a dynamic
component, which is described by the genotype. Self-adaptation is
achieved by the addition of a value, called the network weight, which
depends on the total number of hidden layers and the average number of
neurons in the hidden layers. The dynamic component changes its value
depending on the fitness of the chromosome exposed to mutation. Thus,
the mutation step size is controlled by two components, encapsulated in
the algorithm, which adjust it according to the characteristics of a
predefined ANN architecture and the fitness of a particular chromosome.
A comparative analysis of the proposed approach and classical EP
(Gaussian mutation) showed that a significant acceleration of
the evolution process is achieved by using both phenotype and
genotype information in the mutation strategy.
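The abstract does not give the exact formulas, but the two-component idea can be sketched as follows, with the phenotype term derived from the architecture and the genotype term from the chromosome's fitness (the product form and the scaling of the step size below are illustrative assumptions, not the paper's equations):

```python
import random

def network_weight(num_hidden_layers, avg_neurons_per_layer):
    # Phenotype component: the paper's exact formula is not given in
    # the abstract; this product form is an illustrative assumption.
    return num_hidden_layers * avg_neurons_per_layer

def mutate(weights, fitness_error, num_hidden_layers,
           avg_neurons_per_layer, rng=random):
    # Gaussian mutation whose step size shrinks for larger networks
    # (phenotype) and for fitter chromosomes, i.e. smaller error (genotype).
    nw = network_weight(num_hidden_layers, avg_neurons_per_layer)
    step = fitness_error / (1.0 + nw)
    return [w + rng.gauss(0.0, step) for w in weights]

mutated = mutate([0.5, -0.2, 0.1], fitness_error=0.3,
                 num_hidden_layers=2, avg_neurons_per_layer=5)
```

The point of the design is visible even in this toy form: the same Gaussian operator becomes architecture-aware and fitness-aware without any extra strategy parameters in the chromosome.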
Abstract: A recent neurospiking coding scheme for feature extraction from biosonar echoes of various plants is examined with a variety of stochastic classifiers. The derived feature vectors are employed in well-known stochastic classifiers, including nearest-neighbor, single Gaussian, and a Gaussian mixture with EM optimization. The classifiers' performances are evaluated using cross-validation and bootstrapping techniques. It is shown that the various classifiers perform equivalently and that the modified preprocessing configuration yields considerably improved results.
Abstract: 2007 is a jubilee year: in 1967, the programming language SIMULA 67 was presented, which contained all aspects of what was later called object-oriented programming. The present paper contains a description of the development toward object-oriented programming, the role of simulation in this development, and other tools that appeared in SIMULA 67 and that are nowadays called super-object-oriented programming.
Abstract: In the semiconductor manufacturing process, large
amounts of data are collected from various sensors across multiple
facilities. The data collected from the sensors have several different
characteristics due to variables such as product types, preceding
processes, and recipes. In general, Statistical Quality Control (SQC)
methods assume the normality of the data in order to detect out-of-control
states of processes. Since the collected data have different
characteristics, using them directly as inputs to SQC will increase the
variation of the data, require wide control limits, and decrease the
ability to detect out-of-control states. Therefore, it is necessary to
separate similar data groups from the mixed data for more accurate
process control. In this paper, we propose a regression tree using a
split algorithm based on the Pearson distribution to handle non-normal
distributions in a parametric manner. The regression tree finds similar
properties of the data across different variables. Experiments using
real semiconductor manufacturing process data show improved
fault-detection performance.
Abstract: Designing, implementing, and debugging concurrency
control algorithms in a real system is a complex, tedious, and
error-prone process. Further, understanding concurrency control
algorithms and distributed computations is itself a difficult task.
Visualization can help with both of these problems. Thus, we have
developed an exploratory environment in which people can prototype
and test various versions of concurrency control algorithms, study
and debug distributed computations, and view performance statistics
of distributed systems. In this paper, we describe the exploratory
environment and show how it can be used to explore concurrency
control algorithms for the interactive steering of distributed
computations.
Abstract: Automatic methods for detecting changes through
satellite imaging are the object of growing interest, especially
because of the numerous applications linked to analysis of the Earth's
surface or the environment (monitoring vegetation, updating maps,
risk management, etc.). This work implemented spatial analysis
techniques using images with different spatial and spectral
resolutions taken on different dates. The work was based on the principle
of control charts, setting the upper and lower limits beyond
which a change is flagged. The a contrario approach was then
applied, testing different thresholds at which the
difference calculated between two pixels becomes significant. Finally,
labeled images were considered, giving a particularly low difference,
so that the number of “false changes” could be estimated
according to a given limit.
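The control-chart principle described, flagging pixel differences that fall outside upper and lower control limits, can be sketched as follows (the mu ± k·sigma limits and the sample values are illustrative, not the study's data):

```python
import statistics

def change_mask(diff_values, k=3.0):
    """Flag differences outside the mu +/- k*sigma control limits."""
    mu = statistics.fmean(diff_values)
    sigma = statistics.pstdev(diff_values)
    ucl, lcl = mu + k * sigma, mu - k * sigma
    return [not (lcl <= d <= ucl) for d in diff_values]

# Pixel differences between two dates; only the outlier 40
# falls outside the limits and is flagged as a change.
diffs = [0, 1, -1, 2, 0, 1, -2, 1, 0, -1, 2, 40]
mask = change_mask(diffs)
```

In practice the limits would be estimated from a no-change reference area rather than from the tested differences themselves; this sketch only shows the thresholding step.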
Abstract: Ant Colony Algorithms have been applied to difficult
combinatorial optimization problems such as the travelling salesman
problem and the quadratic assignment problem. In this paper, grid-based
and random-based ant colony algorithms are proposed for
automatic 3D hose routing, and their pros and cons are discussed. The
algorithm uses a tessellated format for the obstacles and the
generated hoses in order to detect collisions. Representing
obstacles and hoses in tessellated form greatly helps the
algorithm handle free-form objects and speeds up
computation. The performance of the algorithm has been tested on a
number of 3D models.
Abstract: Instead of traditional (nominal) classification, we investigate
ordinal classification, or ranking. An enhanced
method based on an ensemble of Support Vector Machines (SVMs)
is proposed. Each binary classifier is trained with specific weights
for each object in the training data set. Experiments on benchmark
datasets and synthetic data indicate that the performance of our
approach is comparable to state-of-the-art kernel methods for
ordinal regression. The ensemble method, which is straightforward
to implement, provides a very good sensitivity-specificity trade-off
for the highest and lowest ranks.
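The abstract does not detail the decomposition, but a common way to build such an ensemble (in the style of Frank and Hall) turns K ordinal ranks into K−1 binary "rank > k" problems; the per-object weighting used by the paper is omitted from this sketch:

```python
def binary_targets(labels, k):
    """Binary problem k: does the object's rank exceed k?"""
    return [1 if y > k else 0 for y in labels]

def predict_rank(prob_greater):
    """Combine the K-1 binary estimates P(rank > k), k = 1..K-1,
    into a single rank: 1 + number of thresholds judged exceeded."""
    return 1 + sum(1 for p in prob_greater if p > 0.5)

# Ranks 1..4 give three binary problems:
labels = [1, 2, 3, 4, 2]
binary_targets(labels, 2)        # [0, 0, 1, 1, 0]
predict_rank([0.9, 0.8, 0.2])    # rank 3
```

Any binary classifier that outputs probability estimates, such as an SVM with calibrated outputs, can be plugged into each of the K−1 problems.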
Abstract: The ever-growing use of the aspect-oriented
development methodology in software engineering
requires tool support for both research environments and industry. So
far, tool support for many activities in aspect-oriented software
development has been proposed in order to automate and facilitate
development. For instance, AJaTS provides a transformation
system to support aspect-oriented development and refactoring. In
particular, it is well established that the abstract interpretation of
programs, in any paradigm, as pursued in static analysis, is best served
by a high-level program representation such as the Control Flow Graph
(CFG). With such a representation, the analysis can more easily locate
common programmatic idioms for which helpful transformations are already
known, and the association between the input program and the
intermediate representation can be maintained more closely.
However, although current research defines, to some extent, sound
concepts and foundations for control-flow analysis of aspect-oriented
programs, it does not provide a concrete tool that can
construct the CFG of these programs on its own. Furthermore, most of
these works focus on other issues in Aspect-Oriented Software
Development (AOSD), such as testing or data-flow analysis, rather
than on the CFG itself. Therefore, this study is dedicated to
building an aspect-oriented control flow graph construction tool called
AJcFgraph Builder. The tool can be applied in many software
engineering tasks in the context of AOSD, such as software testing,
software metrics, and so forth.
Abstract: Robots' visual perception is a field that is gaining
increasing attention from researchers. This is partly due to emerging
trends in the commercial availability of 3D scanning systems and
devices that produce highly accurate information for a variety of
applications. Throughout the history of mining, the mortality rate of
mine workers has been alarming, and robots exhibit a great deal of
potential for tackling safety issues in mines. However, an effective
vision system is crucial to safe autonomous navigation in underground
terrains. This work investigates robots' perception in underground
terrains (mines and tunnels) using the statistical region merging (SRM)
model. SRM reconstructs the main structural components of an image
by a simple but effective statistical analysis. An investigation is
conducted on different regions of the mine, such as the shaft, stope,
and gallery, using publicly available mine frames together with a stream
of locally captured mine images. An investigation is also conducted on a
stream of underground tunnel image frames, using the XBOX Kinect
3D sensor. The Kinect sensor produces streams of red, green and
blue (RGB) and depth images at 640 x 480 resolution and 30 frames per
second. Integrating the depth information into the drivability analysis
gives a strong cue, yielding 3D results that augment the drivable and
non-drivable regions detected in 2D. The results of the 2D and 3D
experiments on different terrains, mines and tunnels, together with the
qualitative and quantitative evaluation, reveal that a good drivable
region can be detected in dynamic underground terrains.
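SRM's core is a statistical merging predicate: adjacent regions are merged when the difference of their mean intensities falls below a bound that shrinks as regions grow. A compact sketch on a grayscale image follows (the merging bound is simplified from Nock and Nielsen's formulation; Q controls segmentation coarseness):

```python
import math

def srm_segment(img, Q=32.0, g=256.0):
    """Compact sketch of Statistical Region Merging on a 2D grayscale
    image (list of rows). Returns a region label per pixel."""
    h, w = len(img), len(img[0])
    n = h * w
    parent = list(range(n))               # union-find over pixels
    size = [1] * n
    mean = [float(img[i // w][i % w]) for i in range(n)]

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def b(region):                        # simplified per-region bound
        return g * math.sqrt(math.log(6.0 * n * n) / (2.0 * Q * size[region]))

    # 4-connected edges, processed in order of increasing intensity gap
    edges = []
    for y in range(h):
        for x in range(w):
            i = y * w + x
            if x + 1 < w:
                edges.append((abs(img[y][x] - img[y][x + 1]), i, i + 1))
            if y + 1 < h:
                edges.append((abs(img[y][x] - img[y + 1][x]), i, i + w))
    edges.sort()

    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj and abs(mean[ri] - mean[rj]) <= math.sqrt(b(ri) ** 2 + b(rj) ** 2):
            # merge rj into ri, updating the running mean
            mean[ri] = (mean[ri] * size[ri] + mean[rj] * size[rj]) / (size[ri] + size[rj])
            size[ri] += size[rj]
            parent[rj] = ri
    return [find(i) for i in range(n)]
```

On a depth or intensity frame, the resulting regions are exactly the "main structural components" mentioned above, which the drivability analysis then labels as drivable or not.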