Abstract: This paper presents a new fingerprint coding technique
based on contourlet transform and multistage vector quantization.
Wavelets have shown their ability to represent natural images that
contain smooth areas separated by edges. However, wavelets
cannot efficiently take advantage of the fact that the edges usually
found in fingerprints are smooth curves. This issue is addressed by
directional transforms, known as contourlets, which have the
property of preserving edges. The contourlet transform is a new
extension to the wavelet transform in two dimensions using
nonseparable and directional filter banks. The computation and
storage requirements are the major difficulty in implementing a
vector quantizer. In the full-search algorithm, the computation and
storage complexity is an exponential function of the number of bits
used in quantizing each frame of spectral information. Multistage
vector quantization requires considerably less storage than
full-search vector quantization. The coefficients of the contourlet
transform are quantized by multistage vector quantization. The
quantized coefficients are encoded by Huffman coding. The results
obtained are tabulated and compared with those of existing
wavelet-based techniques.
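As an illustration of the multistage quantizer's structure, the following is a minimal sketch of two-stage vector quantization in which each stage quantizes the residual left by the previous one. The codebooks and input vector are illustrative toy values, not trained contourlet data.

```python
# Hypothetical sketch of multistage (here two-stage) vector quantization:
# stage 1 quantizes the input vector, stage 2 quantizes the residual.
# Codebooks below are toy values, not trained on contourlet coefficients.

def nearest(codebook, vec):
    """Return index of the codeword closest to vec (squared Euclidean distance)."""
    def dist(cw):
        return sum((a - b) ** 2 for a, b in zip(cw, vec))
    return min(range(len(codebook)), key=lambda i: dist(codebook[i]))

def msvq_encode(vec, stage_codebooks):
    """Encode vec as one index per stage; each stage quantizes the residual."""
    residual = list(vec)
    indices = []
    for cb in stage_codebooks:
        i = nearest(cb, residual)
        indices.append(i)
        residual = [r - c for r, c in zip(residual, cb[i])]
    return indices

def msvq_decode(indices, stage_codebooks):
    """Reconstruct by summing the selected codewords from every stage."""
    out = [0.0] * len(stage_codebooks[0][0])
    for i, cb in zip(indices, stage_codebooks):
        out = [o + c for o, c in zip(out, cb[i])]
    return out

stage1 = [[0.0, 0.0], [1.0, 1.0], [-1.0, -1.0], [2.0, 0.0]]
stage2 = [[0.0, 0.0], [0.25, 0.0], [0.0, 0.25], [-0.25, 0.0]]
idx = msvq_encode([1.2, 1.1], [stage1, stage2])
rec = msvq_decode(idx, [stage1, stage2])
```

Two stages with N codewords each behave like a single codebook of N² entries while storing and searching only 2N codewords, which is the source of the storage and computation savings over full search.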
Abstract: Discovering new biological knowledge from high-throughput biological data is a major challenge for bioinformatics today. To address this challenge, we developed a new approach to protein classification. Proteins that are evolutionarily, and thereby functionally, related are said to belong to the same classification. Identifying protein classifications is of fundamental importance for documenting the diversity of the known protein universe. It also provides a means to determine the functional roles of newly discovered protein sequences. Our goal is to predict the functional classification of novel protein sequences based on a set of features extracted from each protein sequence. The proposed technique uses datasets extracted from the Structural Classification of Proteins (SCOP) database. A set of spectral-domain features based on the Fast Fourier Transform (FFT) is used. The proposed classifier uses a multilayer back-propagation (MLBP) neural network for protein classification. The maximum classification accuracy is about 91% when applying the classifier to the full four levels of the SCOP database; however, it reaches a maximum of 96% when the classification is limited to the family level. The classification results reveal that the spectral domain contains information that can be used for classification with high accuracy. In addition, the results emphasize that sequence similarity measures are of great importance, especially at the family level.
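The spectral feature extraction step can be sketched as follows; the hydrophobicity scale, the choice of numerical encoding, and the number of retained frequency bins are assumptions for illustration, not the paper's exact design. A naive DFT stands in for the FFT since the two compute the same transform.

```python
import cmath

# Hypothetical sketch: turn a protein sequence into spectral-domain features.
# The hydrophobicity values and feature choice are illustrative assumptions.

HYDRO = {'A': 1.8, 'C': 2.5, 'G': -0.4, 'L': 3.8, 'K': -3.9, 'S': -0.8}

def dft_magnitudes(signal):
    """Naive DFT; return the magnitude of each frequency bin."""
    n = len(signal)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, x in enumerate(signal))) for k in range(n)]

def spectral_features(seq, n_bins=4):
    """Encode residues numerically, take DFT magnitudes, keep low-frequency bins."""
    signal = [HYDRO.get(aa, 0.0) for aa in seq]
    mags = dft_magnitudes(signal)
    return mags[:n_bins]   # low-frequency bins as a fixed-length feature vector

feats = spectral_features("ACKLGS")
```

A fixed-length vector like `feats` is what a back-propagation network would consume, regardless of the original sequence length.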
Abstract: This paper proposes an innovative methodology for
Acceptance Sampling by Variables, which is a particular category of
Statistical Quality Control dealing with the assurance of product
quality. Our contribution lies in the exploitation of machine learning
techniques to address the complexity and remedy the drawbacks of
existing approaches. More specifically, the proposed methodology
exploits Artificial Neural Networks (ANNs) to aid decision making
about the acceptance or rejection of an inspected sample. For any
type of inspection, ANNs are trained on data from the corresponding
tables of a standard's sampling plan schemes. Once trained, the ANNs
can give closed-form solutions for any acceptance quality level and
sample size, thus automating the reading of the sampling plan tables
without the need to compromise with the values of the specific
standard chosen each time. The proposed
methodology provides enough flexibility to quality control engineers
during the inspection of their samples, allowing the consideration of
specific needs, while it also reduces the time and the cost required for
these inspections. Its applicability and advantages are demonstrated
through two numerical examples.
Abstract: This paper proposes a novel game theoretical
technique to address the problem of data object replication in
large-scale distributed computing systems. The proposed technique draws
inspiration from computational economic theory and employs the
extended Vickrey auction. Specifically, players in a non-cooperative
environment compete for server-side scarce memory space to
replicate data objects so as to minimize the total network object
transfer cost, while maintaining object concurrency. Optimization of
such a cost in turn leads to load balancing, fault-tolerance and
reduced user access time. The method is experimentally evaluated
against four well-known techniques from the literature: branch and
bound, greedy, bin-packing and genetic algorithms. The experimental
results reveal that the proposed approach outperforms the four
techniques in both execution time and solution quality.
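A minimal sketch of the second-price (Vickrey) mechanism at the heart of the technique, assuming each player bids the network-transfer cost it would save by winning a replica slot; the player names and values are hypothetical.

```python
# Sketch of a sealed-bid Vickrey (second-price) auction: the highest bidder
# wins a server's scarce memory slot but pays only the second-highest bid,
# which makes truthful bidding a dominant strategy for the players.

def vickrey_auction(bids):
    """bids: {player: value}. Return (winner, price = second-highest bid)."""
    ranked = sorted(bids, key=bids.get, reverse=True)
    winner = ranked[0]
    price = bids[ranked[1]] if len(ranked) > 1 else 0.0
    return winner, price

# Hypothetical players bid the transfer cost they would save by replicating.
savings = {"obj_A": 40.0, "obj_B": 55.0, "obj_C": 30.0}
winner, price = vickrey_auction(savings)
```

Because the price is independent of the winner's own bid, non-cooperative players have no incentive to misreport their savings, which is what makes the auction useful for decentralized replica placement.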
Abstract: Decisions are made regularly, both during projects and in
daily life. Some decisions are critical and have a direct impact on
project or human success. Formal evaluation is thus required,
especially for crucial decisions, to arrive at the optimal solution
among the alternatives. According to microeconomic
theory, all people's decisions can be modeled as indifference curves.
The proposed approach supports formal analysis and decision-making by
constructing an indifference curve model from previous experts'
decision criteria. The knowledge embedded in the system can be
reused to help naïve users select alternative solutions to similar
problems. Moreover, the method is flexible enough to cope with an
unlimited number of factors influencing the decision-making. In
preliminary experiments, the alternatives selected accurately
matched the experts' decisions.
Abstract: Attitude Determination (AD) of a spacecraft using the
phase measurements of the Global Navigation Satellite System
(GNSS) is an active area of research. Various attitude determination
algorithms have been developed over the years for spacecraft using
different sensors, but the last two decades have witnessed a
phenomenal increase in research on GPS receivers as a
stand-alone sensor for determining the attitude of a satellite using
the phase measurements of the signals from GNSS. GNSS-based
attitude determination algorithms have been tested in many
real missions. The problem of AD using GNSS phase
measurements has two important parts: ambiguity resolution and
attitude determination. Ambiguity resolution is the most widely
addressed topic in the literature on implementing AD algorithms
using GNSS phase measurements, since it is the key to achieving
millimeter-level accuracy. This paper broadly reviews the different
techniques for resolving the integer ambiguities encountered in AD
using GNSS phase measurements.
Abstract: Multi-user interference (MUI) is the main cause of performance degradation in Spectral Amplitude Coding Optical Code Division Multiple Access (SAC-OCDMA) systems. MUI increases with the number of simultaneous users, resulting in a higher bit error probability and limiting the maximum number of simultaneous users. In addition, phase-induced intensity noise (PIIN), which originates from the spontaneous emission of the broadband source and is aggravated by MUI, severely limits system performance and must be addressed as well. Since MUI is caused by the interference of simultaneous users, keeping its value as small as possible is desirable. In this paper, an extensive study of system performance in terms of MUI and PIIN reduction is carried out. Vector Combinatorial (VC) code families are adopted as signature sequences for the performance analysis, and a comparison with previously reported codes is performed. The results show that as the received power increases, the PIIN noise for all the codes increases linearly. They also show that the effect of PIIN can be minimized by increasing the code weight, which preserves an adequate signal-to-noise ratio and hence the bit error probability. A comparative study between the proposed code and existing codes such as Modified Frequency Hopping (MFH) and Modified Quadratic Congruence (MQC) has been carried out.
Abstract: User-based Collaborative filtering (CF), one of the
most prevailing and efficient recommendation techniques, provides
personalized recommendations to users based on the opinions of other
users. Although the CF technique has been successfully applied in
various applications, it suffers from serious sparsity problems. The
cloud-model approach addresses the sparsity problems by
constructing the user's global preference, represented by a cloud
eigenvector. The user-based CF approach works well with dense
datasets, while the cloud-model CF approach performs better
when the dataset is sparse. In this paper, we present a
hybrid approach that integrates the predictions from both the
user-based CF and the cloud-model CF approaches. The experimental
results show that the proposed hybrid approach can ameliorate the
sparsity problem and provide an improved prediction quality.
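The hybrid prediction can be sketched as a weighted blend of two predictors. The cloud-model component is replaced here by a simple user-mean stand-in, and the toy ratings, the blending weight `lam`, and the cosine similarity choice are illustrative assumptions, not the paper's exact design.

```python
import math

# Hypothetical sketch of a hybrid recommender: blend a user-based CF
# prediction with a global-preference prediction (a user-mean stand-in
# for the cloud-model eigenvector, which is not reproduced here).

ratings = {
    "u1": {"i1": 4, "i2": 5, "i3": 1},
    "u2": {"i1": 5, "i2": 4, "i3": 2, "i4": 4},
    "u3": {"i2": 1, "i3": 5, "i4": 2},
}

def cosine_sim(a, b):
    """Cosine similarity over the items both users have rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    num = sum(a[i] * b[i] for i in common)
    den = (math.sqrt(sum(a[i] ** 2 for i in common))
           * math.sqrt(sum(b[i] ** 2 for i in common)))
    return num / den

def cf_predict(user, item):
    """User-based CF: similarity-weighted average of neighbors' ratings."""
    num = den = 0.0
    for other, r in ratings.items():
        if other != user and item in r:
            s = cosine_sim(ratings[user], r)
            num += s * r[item]
            den += abs(s)
    return num / den if den else 0.0

def hybrid_predict(user, item, lam=0.6):
    """Blend CF with the user's mean rating (stand-in global preference)."""
    mean = sum(ratings[user].values()) / len(ratings[user])
    return lam * cf_predict(user, item) + (1 - lam) * mean

p = hybrid_predict("u1", "i4")
```

On sparse data the CF term has few neighbors to draw on, so the global-preference term stabilizes the prediction; a dense neighborhood lets the CF term dominate, which is the intuition behind the blend.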
Abstract: This paper presents the design and prototype implementation of an intelligent data processing framework for ubiquitous sensor networks. Much of the focus is on how to handle the sensor data stream and on interoperability between low-level sensor data and application clients. Our framework first provides systematic middleware that mediates the interaction between the application layer and the low-level sensors, analyzing large volumes of sensor data by filtering and integrating it to create value-added context information. Then, an agent-based architecture is proposed for real-time data distribution, efficiently forwarding each event to the appropriate applications registered in the directory service via an open interface. The prototype implementation demonstrates that our framework can host sophisticated applications on the ubiquitous sensor network and can autonomously evolve into new middleware, taking advantage of promising technologies such as software agents, XML, cloud computing, and the like.
Abstract: The article is devoted to Kazakh repatriates and their
migration to Kazakhstan as their historical homeland, and also addresses
the problem of the migrants' adaptation in the republic, particularly in
Almaty oblast (region). The authors used up-to-date statistics and
materials of the Department of the Migration Committee to analyze the
number of newcomers and the patterns of repatriate settlement in this
oblast. Having studied this region, they were able to identify the main
reasons why the Kazakh diaspora in Central Asia, Iran, Afghanistan and
Turkey is eager to return to its historic homeland, along with the
repatriates' adaptation to the republic.
Abstract: The model-based approach to user interface design
relies on developing separate models capturing various aspects about
users, tasks, application domain, presentation and dialog structures.
This paper presents a task modeling approach for user interface
design and aims at exploring mappings between task, domain and
presentation models. The basic idea of our approach is to identify
typical configurations in task and domain models and to investigate
how they relate to each other. A special emphasis is put on
application-specific functions and mappings between domain objects
and operational task structures. In this respect, we will address two
layers in task decomposition: a functional (planning) layer and an
operational layer.
Abstract: How to coordinate the behaviors of the agents through
learning is a challenging problem within multi-agent domains.
Because of its complexity, recent work has focused on how
coordinated strategies can be learned. Here we are interested in using
reinforcement learning techniques to learn the coordinated actions of a
group of agents, without requiring explicit communication among
them. However, traditional reinforcement learning methods are based
on the assumption that the environment can be modeled as a Markov
Decision Process, an assumption that usually cannot be satisfied when
multiple agents coexist in the same environment. Moreover, to
effectively coordinate each agent's behavior so as to achieve the goal,
it is necessary to augment the state of each agent with information
about the other agents. However, as the number of agents in a
multi-agent environment increases, the state space of each agent grows
exponentially, causing a combinatorial explosion.
Profit sharing is one of the reinforcement learning methods that allow
agents to learn effective behaviors from their experiences even within
non-Markovian environments. In this paper, to remedy the drawback
of the original profit sharing approach, which needs a large amount of
memory to store each state-action pair during the learning process, we
first present an on-line rational profit sharing algorithm. We then
integrate the advantages of a modular learning architecture with the
on-line rational profit sharing algorithm and propose a new modular
reinforcement learning model. The effectiveness of the technique is
demonstrated using the pursuit problem.
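The basic (episode-batch) profit sharing update can be sketched as follows. The geometric credit function and its decay rate are common illustrative choices, not the paper's on-line modular variant, which avoids storing the whole episode.

```python
# Hypothetical sketch of basic profit sharing: the episode's state-action
# sequence is stored, and when a reward is obtained every pair on the path
# is reinforced with a geometrically decaying credit (a common choice of
# credit-assignment function; the decay rate must be small enough for the
# learned policy to remain rational).

def profit_sharing_update(weights, episode, reward, decay=0.2):
    """Reinforce each (state, action) on the path; later pairs get more credit."""
    credit = reward
    for state, action in reversed(episode):
        weights[(state, action)] = weights.get((state, action), 0.0) + credit
        credit *= decay
    return weights

w = {}
episode = [("s0", "right"), ("s1", "right"), ("s2", "up")]
profit_sharing_update(w, episode, reward=10.0)
# The final pair receives the full reward; earlier pairs receive decayed credit.
```

Because credit flows only along experienced paths rather than through bootstrapped value estimates, the update does not rely on the Markov property, which is why profit sharing tolerates non-Markovian multi-agent environments.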
Abstract: This paper presents key challenges reported by a
group of Australian undergraduate Physical Education students in
conducting a program for persons with an intellectual disability.
Strategies adopted to address these challenges are presented together
with representative feedback given by the Physical Education
students at the completion of the program. The significance of the
program’s findings is summarized.
Abstract: While compressing text files is useful, compressing
still image files is almost a necessity. A typical image takes up much
more storage than a typical text message and without compression
images would be extremely clumsy to store and distribute. The
amount of information required to store pictures on modern
computers is quite large in relation to the bandwidth
commonly available to transmit them over the Internet. Image
compression addresses the problem of reducing
the amount of data required to represent a digital image. The
performance of any image compression method can be evaluated by
measuring the root-mean-square error and peak signal-to-noise
ratio. The method of
image compression that will be analyzed in this paper is based on the
lossy JPEG image compression technique, the most popular
compression technique for color images. JPEG compression is able to
greatly reduce file size with minimal image degradation by throwing
away the least "important" information. In standard JPEG, both chroma
components are downsampled simultaneously; in this paper we
compare the results when only a single chroma component is
downsampled. We demonstrate that a higher compression ratio is
achieved when the blue chrominance is downsampled than when the
red chrominance is downsampled, whereas the peak signal-to-noise
ratio is higher when the red chrominance is downsampled than when
the blue chrominance is downsampled. In particular, we use the image
hats.jpg as a demonstration of JPEG compression with a low-pass
filter and show that, with both methods, the image is compressed with
barely any visible differences.
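The core measurement loop of such an experiment, downsampling one chroma plane and scoring the reconstruction with RMSE-derived PSNR, can be sketched as follows; the 4x4 plane, the 2x2 averaging filter, and the replication upsampling are simplifying assumptions rather than the exact JPEG pipeline.

```python
import math

# Hypothetical sketch: downsample one chroma plane by 2x2 averaging,
# reconstruct it by sample replication, and measure distortion with PSNR.
# The 4x4 plane is toy data standing in for a real Cb (or Cr) channel.

def downsample_2x2(plane):
    """Average each 2x2 block into one sample (a simple low-pass filter)."""
    return [[(plane[r][c] + plane[r][c+1] + plane[r+1][c] + plane[r+1][c+1]) / 4.0
             for c in range(0, len(plane[0]), 2)]
            for r in range(0, len(plane), 2)]

def upsample_2x(small):
    """Replicate each sample into a 2x2 block."""
    return [[small[r // 2][c // 2] for c in range(2 * len(small[0]))]
            for r in range(2 * len(small))]

def psnr(orig, rec, peak=255.0):
    """Peak signal-to-noise ratio in dB from the mean squared error."""
    mse = sum((o - x) ** 2 for ro, rr in zip(orig, rec) for o, x in zip(ro, rr))
    mse /= len(orig) * len(orig[0])
    return float("inf") if mse == 0 else 10.0 * math.log10(peak ** 2 / mse)

cb = [[100, 102, 110, 112],
      [100, 102, 110, 112],
      [130, 132, 140, 142],
      [130, 132, 140, 142]]
quality = psnr(cb, upsample_2x(downsample_2x2(cb)))
```

Running the same loop once on the Cb plane and once on the Cr plane of a real image is how the compression-ratio versus PSNR trade-off described above would be measured.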
Abstract: This paper addresses issues of integral steering of
vehicles with two steering axles, where the rear wheels are pivoted in
the direction of the front wheels, but also in the opposite direction.
The steering box of the rear axle is presented with simple linkages
(single contour) that correlate the pivoting of the rear wheels
according to the direction of the front wheels and to the
rotation angle of the steering wheel. The functionality of the system
is analyzed, namely the extent to which the requirements of integral
steering are met by the considered and proposed mechanisms. The paper
highlights the suitability of single-contour linkages with two
driving elements for meeting these requirements, emphasizing diagrams
of mechanisms with two driving elements. Cam variants are analyzed and
proposed for the rear axle steering box. Cam profiles are determined
by various factors.
Abstract: This paper proposes a modeling methodology for the
development of data analysis solutions. The authors introduce an
approach to addressing data warehousing issues at the enterprise level.
The methodology covers the requirements elicitation and analysis
stage as well as the initial design of the data warehouse. The paper
reviews an extended business process model that satisfies the needs of
data warehouse development. The authors consider the use of
business process models necessary, as they reflect both enterprise
information systems and business functions, both of which are
important for data analysis. The described approach divides
development into three steps with different levels of model
elaboration, and it makes it possible to gather requirements and
present them to business users in an accessible manner.
Abstract: This paper addresses one important aspect of
combustion system analysis: the modeling of spray
evaporation and dispersion. In this study we consider an
empty cylinder, which serves as a simulator for a ramjet
engine, and study it under cold-flow conditions. Four nozzles
located at the cylinder entrance perform the injection. Air
flows into the cylinder from one side and the fuel is injected.
By varying the injection velocity and the inlet air velocity,
we study the droplet sizing and the efficient mass fraction of
fuel vapor near and at the exit area, where the efficient mass
fraction is defined as the mass of fuel vapor within the
flammability limits. We then decrease the initial temperature
of the fuel droplets and repeat the investigation. The
calculations were performed with a modified version of
KIVA-3V.
Abstract: The direct implementation of interleaver functions
in WiMAX is not hardware efficient due to the presence of complex
functions. The conventional alternative, storing the permutation
tables in memories, consumes significant silicon area. This work
presents a 2-D transformation of the WiMAX channel interleaver
functions which reduces the overall hardware complexity of
computing the interleaver addresses on the fly. A fully
reconfigurable architecture for address generation in the WiMAX
channel interleaver is presented, which consumes 1.1 k-gates in
total. It can be configured for any block size and any modulation
scheme in WiMAX. The presented architecture can run at a
frequency of 200 MHz, thus fully supporting the high bandwidth
requirements of WiMAX.
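The address computation that such an architecture evaluates on the fly can be sketched in software from the two-step permutation commonly given for the 802.16 channel interleaver; the exact formula below should be checked against the standard and is stated here as an assumption.

```python
# Sketch of the 802.16 (WiMAX) channel-interleaver address computation that
# hardware can evaluate on the fly instead of storing permutation tables.
# ncbps is the number of coded bits per block and ncpc the coded bits per
# subcarrier (2 for QPSK, 4 for 16-QAM, 6 for 64-QAM); d = 12 columns.
# The two-step formula is as commonly stated for 802.16 and is an
# assumption to verify against the specification.

def interleave_addresses(ncbps, ncpc, d=12):
    s = max(ncpc // 2, 1)
    addrs = []
    for k in range(ncbps):
        m = (ncbps // d) * (k % d) + k // d                    # 1st permutation
        j = s * (m // s) + (m + ncbps - (d * m // ncbps)) % s  # 2nd permutation
        addrs.append(j)
    return addrs

addrs = interleave_addresses(192, 2)   # e.g. a QPSK block of 192 coded bits
# Every address appears exactly once, so the mapping is a valid permutation.
```

Because both steps use only modulo, integer division, and small constant multiplies, the address generator reduces to counters and shifts, which is what makes a table-free, reconfigurable implementation feasible.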
Abstract: The pedagogical project has proven to be an active
learning method, used to develop learners' skills and
knowledge. The use of technology in education has filled
several gaps in the implementation of teaching methods and in
the online evaluation of learners. However, the project
methodology presents challenges for the online assessment of
learners.
Indeed, interoperability between e-learning platforms (LMSs) is
one of the major challenges of project-based learning assessment.
We first review the characteristics of online assessment in the
context of project-based teaching and address the constraints
encountered during the peer evaluation process.
Our approach is to propose a meta-model describing a language
dedicated to the design of peer assessment scenarios in
project-based learning. We then illustrate our proposal with an
instantiation of the meta-model through a business process in a
collaborative online assessment scenario.
Abstract: In this paper we present a novel approach to human
body configuration based on the silhouette. We propose to address
this problem within the Bayesian framework, using an effective
model-based MCMC (Markov Chain Monte Carlo) method to solve
the configuration problem, in which the best configuration is
defined as the MAP (maximum a posteriori) estimate in the Bayesian
model. The model-based MCMC utilizes the human body model to
drive the MCMC sampling of the solution space: it converts the
original high-dimensional space into a restricted sub-space
constructed from the human model and uses a hybrid sampling
algorithm. We choose an explicit human model and carefully select
the likelihood functions to represent the best configuration
solution. The experiments show that this method obtains accurate
configurations efficiently for different humans from multiple views.
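The MAP-by-MCMC idea can be sketched with a random-walk Metropolis sampler; the Gaussian toy posterior below stands in for a real silhouette-matching likelihood, and the pose dimensionality, step size, and step count are illustrative assumptions.

```python
import math
import random

# Hypothetical sketch of MAP estimation by MCMC: random-walk Metropolis
# sampling over a pose vector, keeping the best posterior sample as the
# MAP estimate. The quadratic "log-posterior" is a toy stand-in for a
# silhouette-matching likelihood times a pose prior.

def log_posterior(pose, target):
    """Toy log-posterior: peaks when the pose matches the target configuration."""
    return -sum((p - t) ** 2 for p, t in zip(pose, target))

def mcmc_map(target, dim=3, steps=2000, step_size=0.3, seed=0):
    rng = random.Random(seed)
    pose = [0.0] * dim
    lp = log_posterior(pose, target)
    best_pose, best_lp = list(pose), lp
    for _ in range(steps):
        cand = [p + rng.gauss(0.0, step_size) for p in pose]
        clp = log_posterior(cand, target)
        # Metropolis acceptance: always accept uphill, sometimes downhill.
        if clp > lp or rng.random() < math.exp(clp - lp):
            pose, lp = cand, clp
            if lp > best_lp:
                best_pose, best_lp = list(pose), lp
    return best_pose

est = mcmc_map(target=[1.0, -0.5, 2.0])
```

A model-driven sampler improves on this plain random walk by proposing candidates only inside the sub-space of poses the body model allows, which is what keeps the search tractable in high dimensions.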