Abstract: QoS routing aims to find paths between senders and
receivers that satisfy the QoS requirements of the application
while using network resources efficiently; the underlying routing
algorithm must therefore find low-cost paths that meet the given
QoS constraints. The problem of finding a least-cost path subject
to a delay constraint is known to be NP-complete, and several
algorithms have been proposed to find near-optimal solutions.
However, these heuristics either impose relationships among the
link metrics to reduce the complexity of the problem, which may
limit their general applicability, or are too costly in execution
time to be applicable to large networks. In this paper, we analyze
two algorithms: Characterized Delay Constrained Routing (CDCR) and
Optimized Delay Constrained Routing (ODCR). CDCR takes an approach
to delay-constrained routing that captures the trade-off between
cost minimization and the level of risk regarding the delay
constraint. ODCR uses an adaptive path weight function together
with an additional constraint imposed on the path cost to restrict
the search space, and hence finds a near-optimal solution in much
less time.
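The abstract does not give the CDCR or ODCR procedures themselves, so the sketch below solves only the underlying problem they target, delay-constrained least-cost path selection, by exhaustive search with simple pruning. The graph encoding (per-edge cost and delay pairs) is an illustrative assumption; exhaustive search is exponential in general, which is precisely why heuristics such as CDCR and ODCR exist.

```python
def dcr_brute_force(graph, src, dst, delay_bound):
    """Exhaustively enumerate simple paths and return (cost, path) for
    the least-cost path whose total delay meets the bound.
    graph: {node: {neighbor: (cost, delay)}} -- an assumed encoding."""
    best = (float("inf"), None)

    def dfs(node, visited, cost, delay, path):
        nonlocal best
        if delay > delay_bound or cost >= best[0]:
            return  # prune: delay bound violated or already costlier
        if node == dst:
            best = (cost, path)
            return
        for nxt, (c, d) in graph[node].items():
            if nxt not in visited:
                dfs(nxt, visited | {nxt}, cost + c, delay + d, path + [nxt])

    dfs(src, {src}, 0, 0, [src])
    return best

graph = {
    "A": {"B": (1, 5), "C": (4, 1)},
    "B": {"D": (1, 5)},
    "C": {"D": (4, 1)},
    "D": {},
}
# The cheapest path A-B-D (cost 2) has delay 10, so with a bound of 5
# the search must settle for A-C-D.
print(dcr_brute_force(graph, "A", "D", delay_bound=5))  # (8, ['A', 'C', 'D'])
```

Relaxing the delay bound to 10 in this toy graph lets the cheaper path A-B-D win instead, illustrating the cost/risk trade-off CDCR formalizes.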
Abstract: The aim of this paper is to discuss a low-cost methodology that can predict traffic flow conflicts and quantitatively rank crash expectancies (based on relative probability) for various traffic facilities. The paper focuses on the application of statistical distributions to model traffic flow and of Monte Carlo techniques to simulate traffic, and discusses how to create a tool to predict the possibility of a traffic crash. A low-cost data collection methodology is discussed for the prevailing heterogeneous traffic flow, and a GIS platform is proposed to thematically represent simulated traffic flow and crash probability. Furthermore, the dynamism of the model is discussed with reference to its adaptability, adequacy, economy, and efficiency, in order to ensure its adoption.
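The paper's distributions and conflict definitions are not given in the abstract; as a minimal sketch of the Monte Carlo idea, the following assumes Poisson vehicle arrivals (exponential headways) and counts headways shorter than a critical gap as potential conflicts. Both the arrival model and the critical-gap threshold are assumptions for illustration, not the authors' calibrated model.

```python
import random

def conflict_probability(flow_veh_per_s, critical_gap_s, n_trials=100_000, seed=1):
    """Monte Carlo sketch: draw vehicle headways from an exponential
    distribution (Poisson arrivals) and count how often a headway falls
    below an assumed critical gap, used here as a proxy for a potential
    rear-end conflict."""
    rng = random.Random(seed)
    conflicts = sum(
        rng.expovariate(flow_veh_per_s) < critical_gap_s
        for _ in range(n_trials)
    )
    return conflicts / n_trials

# 720 veh/h = 0.2 veh/s; the analytic value P(headway < 2 s) is
# 1 - exp(-0.2 * 2), about 0.33, which the simulation should approach.
print(conflict_probability(0.2, 2.0))
```

Estimates like this, computed per facility, are what a GIS layer could then rank and display thematically.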
Abstract: In this paper, the performance of two adaptive
observers applied to interconnected systems is studied. The
nonlinearity of the systems can be written in fractional form. The
first adaptive observer is an adaptive sliding mode observer for a
Lipschitz nonlinear system, and the second is an adaptive sliding
mode observer using a filtered error as its sliding surface. After
comparing their performance on the inverted pendulum on a cart
system, it is shown that the second observer is more robust in
estimating the state.
Abstract: This paper presents an adaptive feedback linearization approach to control a helicopter. Ideal feedback linearization is defined for the case when the system model is known. Adaptive feedback linearization is employed to obtain asymptotically exact cancellation of the inherent uncertainty in the knowledge of the system parameters. The control algorithm is implemented using the feedback linearization technique together with an adaptive method. The controller parameters are initially unknown, and an adaptive control law drives them towards their ideal values to provide perfect model matching between the reference model and the closed-loop plant model. The converged controller parameters then provide good estimates of the unknown plant parameters.
Abstract: Mobile systems are powered by batteries, and reducing
system power consumption is key to increasing their autonomy. Most
such systems deal with time-varying signals. We therefore aim to
achieve power efficiency by smartly adapting the system's
processing activity to the local characteristics of the input
signal. This is done by completely rethinking the processing
chain, adopting signal-driven sampling and processing. In this
context, a signal-driven filtering technique based on
level-crossing sampling is devised. It adapts the sampling
frequency and the filter order by analysing the local variations
of the input signal, thereby correlating processing activity with
signal variation. This leads to a drastic computational gain for
the proposed technique compared to the classical one.
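The abstract's filtering technique is not specified further, but the level-crossing sampling it builds on can be sketched directly: a sample is recorded only when the signal moves by at least one quantum delta, so sampling activity tracks local signal variation. The uniform quantum and the toy signal below are illustrative.

```python
def level_crossing_sample(signal, delta):
    """Keep a sample only when the signal has moved by at least one
    quantum `delta` since the last kept sample, so the output rate
    follows the signal's local activity."""
    last = signal[0]
    samples = [(0, last)]
    for i, x in enumerate(signal[1:], start=1):
        if abs(x - last) >= delta:
            last = x
            samples.append((i, x))
    return samples

# A signal with a quiet start, a fast edge, and a quiet plateau:
sig = [0.0, 0.05, 0.1, 0.9, 1.0, 1.02, 1.01, 0.2]
print(level_crossing_sample(sig, delta=0.5))  # [(0, 0.0), (3, 0.9), (7, 0.2)]
```

Only three of the eight instants produce samples; the quiet regions cost no downstream processing, which is the source of the computational gain the abstract claims.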
Abstract: This study conducts a preliminary investigation to determine the topic to focus on in developing the Virtual Laboratory For Biology (VLab-Bio). The samples who answered the questionnaire are form five students (equivalent to A-Level) and biology teachers. The time and economic resources needed to set up and build scientific laboratories can be reduced by adopting virtual laboratories as an educational tool. It is therefore hoped that the proposed virtual laboratory will help students learn abstract concepts in biology. Findings show that the difficult topic chosen is Cell Division, and the learning objective on which the virtual lab will focus is “Describe the application of knowledge on mitosis in cloning".
Abstract: In the last decade digital watermarking procedures have
become increasingly applied to implement the copyright protection
of multimedia digital contents distributed on the Internet. To this
end, it is worth noting that many of the watermarking procedures
for images and videos proposed in the literature are based on spread
spectrum techniques. However, some scepticism about the robustness
and security of such watermarking procedures has arisen because
of some documented attacks which claim to render the inserted
watermarks undetectable. On the other hand, web content providers
wish to exploit watermarking procedures characterized by flexible and
efficient implementations and which can be easily integrated in their
existing web services frameworks or platforms. This paper presents
how a simple spread spectrum watermarking procedure for MPEG-2
videos can be modified to be exploited in web contexts. To this end,
the proposed procedure has been made secure and robust against some
well-known and dangerous attacks. Furthermore, its basic scheme
has been optimized by making the insertion procedure adaptive with
respect to the terminals used to open the videos and the network transactions
carried out to deliver them to buyers. Finally, two different
implementations of the procedure have been developed: the former
is a high performance parallel implementation, whereas the latter is
a portable Java and XML based implementation. Thus, the paper
demonstrates that a simple spread spectrum watermarking procedure,
with limited and appropriate modifications to the embedding scheme,
can still represent a valid alternative to many other well-known and
more recent watermarking procedures proposed in the literature.
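The paper's MPEG-2 procedure is not reproduced in the abstract; the sketch below shows only the generic spread-spectrum idea it builds on (in the style of Cox et al.): a key-seeded pseudo-random +/-1 sequence is spread multiplicatively over significant transform coefficients and later detected by correlation. The coefficient list, the strength alpha, and the detector form are illustrative assumptions.

```python
import random

def embed(coeffs, alpha, key, n):
    """Spread a key-seeded pseudo-random +/-1 sequence multiplicatively
    over the first n (assumed most significant) coefficients."""
    rng = random.Random(key)
    wm = [rng.choice((-1, 1)) for _ in range(n)]
    marked = list(coeffs)
    for i in range(n):
        marked[i] = coeffs[i] * (1 + alpha * wm[i])
    return marked, wm

def detect(coeffs, original, alpha, wm):
    """Correlate the extracted sequence with the watermark; a value
    near 1 indicates the watermark is present."""
    est = [(c - o) / (alpha * o) for c, o in zip(coeffs, original)]
    return sum(e * w for e, w in zip(est, wm)) / len(wm)

orig = [50.0, -40.0, 30.0, -20.0, 10.0]  # stand-ins for DCT coefficients
marked, wm = embed(orig, alpha=0.1, key=42, n=5)
print(detect(marked, orig, alpha=0.1, wm=wm))  # ~ 1.0
```

An adaptive variant, as the paper proposes, would choose alpha and the coefficient set per terminal and per delivery transaction rather than fixing them.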
Abstract: In this paper, a model for an information retrieval
system is proposed which takes into account that knowledge about
documents and the information needs of users are dynamic. Two
methods are combined, one qualitative or symbolic and the other
quantitative or numeric, which are deemed suitable for many
clustering contexts, data analysis, concept exploration and
knowledge discovery. These two methods may be classified as
inductive learning techniques. In this model, they are introduced to
build “long term" knowledge about past queries and concepts in a
collection of documents. The “long term" knowledge can guide
and assist the user to formulate an initial query and can be
exploited in the process of retrieving relevant information. The
different kinds of knowledge are organized in different points of
view. This may be considered an enrichment of the exploration
level which is coherent with the concept of document/query
structure.
Abstract: Heating systems are a necessity in regions that face
extreme cold weather throughout the year. To maintain a comfortable
temperature inside a given place, heating systems based on hydronic
boilers are used; their working principle is that of a single-pipe
system. These heating systems must control the room temperature in
order to maintain a warm environment. In this paper, regulation of
the room temperature over a wide range is established using an
Adaptive Fuzzy Controller (AFC). The fuzzy controller automatically
detects changes in the outside temperature and correspondingly
maintains the inside temperature at a comfortable value. Two
separate AFCs carry out this function: one determines the quantity
of heat needed to reach the prospective temperature and sets the
desired temperature; the other controls the position of the valve,
which is directly proportional to the error between the present
room temperature and the user-desired temperature. Fuzzy logic thus
controls the valve position according to the heat requirement. The
amount by which the valve opens or closes is controlled by five
knob positions, varying from minimum to maximum, thereby regulating
the amount of heat flowing through the valve. For the given
test-system data, different defuzzifier methods have been
implemented and their results compared. To validate the
effectiveness of the proposed approach, a fuzzy controller has been
designed using test data obtained from a real-time system. The
simulations are performed in MATLAB and verified with standard
system data. The proposed approach can be implemented in real-time
applications.
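The paper's rule base and membership functions are not given in the abstract; the sketch below only illustrates the comparison of defuzzifier methods it mentions, using an assumed temperature-error input, three illustrative triangular sets for valve position, and two common defuzzifiers (centroid and mean of maxima).

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def defuzzify(levels, method="centroid"):
    """levels: (valve_position, membership) pairs for the fired rules."""
    if method == "centroid":
        den = sum(m for _, m in levels)
        return sum(p * m for p, m in levels) / den if den else 0.0
    if method == "mom":  # mean of maxima
        peak = max(m for _, m in levels)
        maxima = [p for p, m in levels if m == peak]
        return sum(maxima) / len(maxima)
    raise ValueError(method)

error = 3.0  # desired minus measured room temperature, deg C (assumed)
levels = [
    (0.0, tri(error, -10, -5, 0)),  # rule: large negative error -> close
    (0.5, tri(error, -5, 0, 5)),    # rule: small error -> half open
    (1.0, tri(error, 0, 5, 10)),    # rule: large error -> fully open
]
print(defuzzify(levels, "centroid"), defuzzify(levels, "mom"))
```

For the same fired rules the two methods yield different valve positions (here 0.8 versus 1.0), which is exactly the kind of difference the paper's defuzzifier comparison examines.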
Abstract: A protein residue contact map is a compact
two-dimensional representation of protein structure. Because of
the information held in the contact map, it has drawn the
attention of researchers in the field, and plenty of work has been
done over the past decade. Artificial intelligence approaches have
been widely adopted in related work, including neural networks,
genetic programming, Hidden Markov models and support vector
machines. However, prediction performance has not generalized
well, as it probably depends on the data used to train and
generate the prediction model. This situation shows the importance
of the features, or information, used in determining prediction
performance. In this research, a support vector machine was used
to predict protein residue contact maps from different
combinations of features, in order to show and analyze the
effectiveness of those features.
Abstract: The advancement of wireless technology and the wide
use of mobile devices have drawn the attention of the research and
technological communities towards wireless environments, such as
Wireless Local Area Networks (WLANs), Wireless Wide Area
Networks (WWANs), mobile systems and ad-hoc networks. Wired and
wireless networks, however, differ substantially in link
reliability, bandwidth and propagation delay, and adopting new
solutions for these enhanced telecommunications will provide
superior quality, efficiency and opportunities where wireless
communications were otherwise unfeasible. Some researchers define
4G as a significant improvement on 3G, in which current cellular
networks' issues will be solved and data transfer will play a more
significant role. For others, 4G unifies cellular and wireless
local area networks, introducing new routing techniques, efficient
solutions for sharing dedicated frequency bands, and increased
mobility and bandwidth capacity. This paper discusses the possible
solutions and enhancements proposed to improve the performance of
the Transmission Control Protocol (TCP) over different wireless
networks, and investigates the advantages and disadvantages of
each approach.
Abstract: This paper focuses on reducing the power consumption
of wireless sensor networks. To this end, the communication protocol
LEACH (Low-Energy Adaptive Clustering Hierarchy) is modified.
We extend LEACH's stochastic cluster-head selection algorithm
by modifying the probability of each node becoming a cluster-head
based on the energy it requires to transmit to the sink. We present
an efficient energy-aware routing algorithm for wireless sensor
networks. Our contribution consists in rotating the cluster-head
selection while considering first the distance of the nodes from
the sink and then the nodes' residual energy. This choice allows a
better distribution of the transmission energy in the network. The
cluster-head selection algorithm is completely decentralized.
Simulation results show that energy consumption is significantly
reduced compared with a previous clustering-based routing algorithm
for sensor networks.
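The authors' exact modified election probability is not given in the abstract. The sketch below shows the standard LEACH threshold T(n) = p / (1 - p (r mod 1/p)) and one hypothetical way to bias it by distance to the sink, as the abstract describes; the distance scaling rule, topology and node count are assumptions, not the paper's formula.

```python
import random

def leach_threshold(p, r):
    """Standard LEACH threshold T(n) = p / (1 - p * (r mod 1/p)) for
    nodes that have not yet served as cluster-head this epoch."""
    return p / (1 - p * (r % round(1 / p)))

def elect_cluster_heads(nodes, p, r, dist_to_sink, seed=0):
    """Hypothetical bias: scale each node's election probability down
    with its distance to the sink, so nodes that need less transmit
    energy become heads more often. Each node decides locally, which
    keeps the election decentralized."""
    rng = random.Random(seed)
    d_max = max(dist_to_sink.values())
    heads = []
    for n in nodes:
        t = leach_threshold(p, r) * (1 - dist_to_sink[n] / (2 * d_max))
        if rng.random() < t:
            heads.append(n)
    return heads

nodes = list(range(10))
dist = {n: 10 + 5 * n for n in nodes}  # metres, assumed topology
print(elect_cluster_heads(nodes, p=0.2, r=0, dist_to_sink=dist))
```

A fuller version would also weight the threshold by residual energy, the paper's second criterion.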
Abstract: Case-Based Reasoning (CBR) is a machine learning
approach to problem solving and learning that has attracted
considerable attention over the last few years. In general, CBR is
composed of four main phases: retrieve the most similar case or
cases, reuse the case to solve the problem, revise or adapt the
proposed solution, and retain the learned case in the case base
for future learning. Unfortunately, in many cases this retention
step causes uncontrolled case-base growth, which affects the
competence and performance of CBR systems. This paper proposes a
competence-based maintenance method based on a deletion-policy
strategy for CBR. There are three main steps in this method:
Step 1, formulate the problem; Step 2, determine the coverage and
reachability sets based on coverage values; Step 3, reduce the
case-base size. The results obtained show that the proposed method
performs better than existing methods currently discussed in the
literature.
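The paper's three steps are only named in the abstract, so the following is a loose sketch of competence-based deletion using the standard coverage and reachability notions: a case is deletable when every problem it covers is also covered by the remaining cases. The adaptability predicate `solves`, the toy domain, and the deletion order are illustrative assumptions.

```python
def coverage(case, case_base, solves):
    """Problems in the case base that `case` can be adapted to solve."""
    return {c for c in case_base if solves(case, c)}

def reachability(case, case_base, solves):
    """Cases in the case base that can be adapted to solve `case`."""
    return {c for c in case_base if solves(c, case)}

def reduce_case_base(case_base, solves):
    """Delete a case when everything it covers is still covered by the
    remaining cases, visiting low-coverage cases first."""
    kept = set(case_base)
    for case in sorted(case_base, key=lambda c: len(coverage(c, case_base, solves))):
        others = kept - {case}
        covered = set().union(*[coverage(o, case_base, solves) for o in others])
        if coverage(case, case_base, solves) <= covered:
            kept = others
    return kept

# Toy domain: a case solves another when their descriptions differ by <= 1.
cases = [0, 1, 2, 10]
solves = lambda a, b: abs(a - b) <= 1
print(sorted(reduce_case_base(cases, solves)))  # [1, 10]
```

The reduced base keeps the competence of the original: every original problem is still reachable from a retained case, while the base shrinks by half.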
Abstract: Observed images are often degraded, mainly by noise. Recently, image denoising using the wavelet transform has been attracting much attention. The wavelet-based approach is particularly useful for image denoising when preserving the edges in the scene is important, because its local adaptivity is based explicitly on the values of the wavelet detail coefficients. In this paper, we propose several methods for removing Gaussian noise from degraded images using adaptive wavelet thresholds (BayesShrink, Modified BayesShrink and NormalShrink). The proposed thresholds are simple and adaptive to each subband, because the parameters required to estimate the threshold depend on the subband data. Experimental results show that the proposed thresholds remove noise significantly while preserving the edges in the scene.
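Of the thresholds the abstract names, BayesShrink has a well-known closed form, T = sigma_n^2 / sigma_x, with the signal standard deviation estimated per subband. The sketch below computes it on a hand-made coefficient list and applies soft thresholding; a real pipeline would first run a 2D wavelet transform (e.g. with PyWavelets) and estimate sigma_n from the diagonal detail subband, which is omitted here.

```python
import math

def bayes_shrink_threshold(subband, noise_sigma):
    """BayesShrink: T = sigma_n^2 / sigma_x, where sigma_x is estimated
    per subband as sqrt(max(var(Y) - sigma_n^2, 0))."""
    var_y = sum(c * c for c in subband) / len(subband)
    sigma_x = math.sqrt(max(var_y - noise_sigma ** 2, 0.0))
    return float("inf") if sigma_x == 0.0 else noise_sigma ** 2 / sigma_x

def soft_threshold(c, t):
    """Shrink a coefficient towards zero by t."""
    return math.copysign(max(abs(c) - t, 0.0), c)

# Hand-made detail coefficients; sigma_n would normally be estimated
# from the diagonal subband as median(|c|) / 0.6745.
band = [4.0, -3.0, 0.5, -0.2, 2.5, -0.1]
t = bayes_shrink_threshold(band, noise_sigma=1.0)
print(round(t, 4), [round(soft_threshold(c, t), 4) for c in band])
```

Because the threshold is recomputed from each subband's own variance, edge-carrying subbands with large coefficients receive a small threshold, which is the edge-preserving behaviour the abstract highlights.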
Abstract: The study of the interaction between humans and
computers has been growing over the last few years. This
interaction will be more powerful if computers are able to perceive
and respond to human nonverbal communication, such as emotions. In
this study, we present an image-based approach to emotion
classification through lower facial expressions. We employ a set of
feature points in the lower face image, according to the particular
face model used, and consider their motion across each emotive
expression. The vector of displacements of all feature points is
input to an Adaptive Support Vector Machines (A-SVM) classifier,
which classifies it into a scheme of seven basic emotions: neutral,
angry, disgust, fear, happy, sad and surprise. The system was
tested on the Japanese Female Facial Expression (JAFFE) dataset of
frontal-view facial expressions [7]. Our experiments on emotion
classification through lower facial expressions demonstrate the
robustness of the adaptive SVM classifier and verify the high
efficiency of our approach.
Abstract: We present a discussion of three adaptive filtering
algorithms well known for their one-step termination property, in
terms of their relationship with the minimal residual method. These
algorithms are the normalized least mean square (NLMS) algorithm,
the affine projection algorithm (APA) and the recursive least
squares (RLS) algorithm. The NLMS is shown to result from the
orthogonality condition imposed on the instantaneous approximation
of the Wiener equation, while the APA and RLS algorithms result
from the orthogonality condition in a multi-dimensional minimal
residual formulation. Further analysis of the minimal residual
formulation for the RLS leads to a triangular system which also
possesses the one-step termination property (in exact arithmetic).
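Of the three algorithms discussed, the NLMS update has a compact standard form, w <- w + mu e u / (eps + ||u||^2). With mu = 1 the a-posteriori error on the current regressor is driven to zero, the one-step property the paper relates to the minimal residual method. The sketch below identifies a known 2-tap filter from noiseless data; the toy system and step size are illustrative.

```python
import random

def nlms_identify(x, d, order, mu=1.0, eps=1e-8):
    """NLMS: w <- w + mu * e * u / (eps + ||u||^2). With mu = 1 the
    a-posteriori error on the current regressor is zero."""
    w = [0.0] * order
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]                 # regressor, newest first
        e = d[n] - sum(wi * ui for wi, ui in zip(w, u))  # a-priori error
        norm = eps + sum(ui * ui for ui in u)
        w = [wi + mu * e * ui / norm for wi, ui in zip(w, u)]
    return w

# Identify the (assumed) 2-tap system h = [0.5, -0.25] from noiseless data.
rng = random.Random(0)
x = [rng.uniform(-1, 1) for _ in range(500)]
d = [0.0] + [0.5 * x[n] - 0.25 * x[n - 1] for n in range(1, len(x))]
print([round(wi, 4) for wi in nlms_identify(x, d, order=2)])  # ~ [0.5, -0.25]
```

Each update projects the weight error onto the complement of the current regressor direction, which is where the connection to minimal residual methods comes from.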
Abstract: The phytotoxicity of Daphne gnidium L. was evaluated
through the effect of incorporating leaf, stem and root biomass
into soil (at 12.5, 25 and 50 g/kg) and of irrigation with their
aqueous extracts (50 g/L) on the growth of two crops (Lactuca
sativa L. and Raphanus sativus L.) and two weeds (Peganum harmala
L. and Scolymus maculatus L.). The results revealed a perceptible
phytotoxic effect which increased with dose and concentration. At
the highest dose, root and leaf residues were the most toxic,
causing total inhibition of lettuce and thistle seedling growth,
respectively. Irrigation with aqueous extracts of the different
D. gnidium organs also decreased the seedling length of all test
species. The stem extract was more inhibitory to thistle than to
peganum seedling growth, inducing significant reductions of 80%
and 67% for roots and shoots, respectively. The results of the
present study suggest that the different organs of D. gnidium
could be exploited in the management of agro-ecosystems.
Abstract: In this paper, we propose a robust disease detection
method, called adaptive orientation code matching (Adaptive OCM),
which is developed from a robust image registration algorithm,
orientation code matching (OCM), to achieve continuous and
site-specific detection of changes in plant disease. We use a
two-stage framework. In the first stage, adaptive OCM is employed;
it not only realizes continuous and site-specific observation of
disease development, but also shows excellent robustness when
searching for non-rigid plant objects under changes in scene
illumination, translation, small rotation and occlusion. In the
second stage, a machine learning method, a support vector machine
(SVM) based on a two-dimensional (2D) xy-color histogram feature,
is used for pixel-wise disease classification and quantification.
Indoor experimental results demonstrate the feasibility and
potential of the proposed algorithm, which could be implemented in
real field situations for better observation of plant disease
development.
Abstract: The increasing demand for sufficient and clean
energy forces industrial and service companies to align their
strategies towards efficient consumption. This trend also applies
to the residential building sector, where large amounts of energy
are consumed by house and facility heating. Many of the hot water
heating systems in operation lack hydraulically balanced working
conditions for heat distribution and transmission, and therefore
heat inefficiently. Through hydraulic balancing of heating
systems, significant savings of primary and secondary energy can
be achieved. This paper addresses the use of KNX technology (Smart
Buildings) in residential buildings to ensure dynamic adaptation
of the hydraulic system's performance, in order to increase the
heating system's efficiency. The procedure of segmenting the
heating system into hydraulically independent units (meshes) is
presented. Within these meshes, the heating valves are addressed
and controlled by a central facility server. Feasibility criteria
for such drivers are named. The dynamic hydraulic balance is
achieved by positioning these valves according to heating loads,
which are generated from the temperature settings in the
corresponding rooms. The energetic advantages of single-room
heating control procedures, based on the application
FacilityManager, are presented.
Abstract: Covering-based rough sets are an extension of rough
sets, based on a covering instead of a partition of the universe,
and are therefore more powerful in describing some practical
problems. However, this extension can increase the roughness of
each model in recognizing objects, so how to obtain better
approximations from covering-based rough set models is an
important issue. In this paper, two concepts, determinate elements
and indeterminate elements of a universe, are proposed and given
precise definitions. This research makes a reasonable refinement
of the covering elements from a new viewpoint, and the refinement
may generate better approximations in covering-based rough set
models. To verify the theory, it is applied to eight major
covering-based rough set models adapted from the literature. In
all of these models the lower approximation increases effectively,
and correspondingly the upper approximation decreases, with the
exception of two models in some special situations. The roughness
of recognizing objects is therefore reduced. This research
provides a new approach to the study and application of
covering-based rough sets.
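The paper's definition of determinate elements is not reproduced in the abstract, so the sketch below only illustrates the claimed effect on a toy covering: refining the covering elements (here, by splitting each element along an assumed determinate set) enlarges the lower approximation and shrinks the upper one. The splitting rule is a hypothetical stand-in for the authors' refinement.

```python
def lower_approx(cover, X):
    """Union of covering elements entirely contained in X."""
    return set().union(*[K for K in cover if K <= X])

def upper_approx(cover, X):
    """Union of covering elements that intersect X."""
    return set().union(*[K for K in cover if K & X])

def refine(cover, determinate):
    """Hypothetical refinement: split each covering element along a set
    of 'determinate' elements, producing a finer covering."""
    out = set()
    for K in cover:
        for part in (K & determinate, K - determinate):
            if part:
                out.add(frozenset(part))
    return out

cover = {frozenset({0, 1, 2}), frozenset({2, 3}), frozenset({3, 4, 5})}
X = {3, 4}
refined = refine(cover, frozenset(X))
print(sorted(lower_approx(cover, X)), "->", sorted(lower_approx(refined, X)))
print(sorted(upper_approx(cover, X)), "->", sorted(upper_approx(refined, X)))
```

On this toy covering the lower approximation grows from the empty set to {3, 4} and the upper approximation shrinks from {2, 3, 4, 5} to {3, 4}, i.e. the roughness of recognizing X is reduced, which is the behaviour the abstract reports across its eight models.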