Abstract: Terminal localization for indoor Wireless Local Area
Networks (WLANs) is critical for the deployment of location-aware
computing inside buildings. A major challenge is obtaining high
localization accuracy in the presence of fluctuations in the received
signal strength (RSS) measurements caused by multipath fading. This paper
focuses on reducing the effect of the distance-varying noise by spatial
filtering of the measured RSS. Two different survey point geometries
are tested with the noise reduction technique: survey points arranged
in sets of clusters and survey points uniformly distributed over the
network area. The results show that the location accuracy improves
by 16% when the filter is used and by 18% when the filter is applied
to a clustered survey set as opposed to a straight-line survey set.
The estimated locations are within 2 m of the true location, which
indicates that clustering the survey points provides better localization
accuracy due to superior noise removal.
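As an illustration of the spatial filtering idea, the following minimal sketch averages each survey point's RSS over its neighbours within a fixed radius, so that multipath-induced fluctuations cancel out. The function name, the radius parameter, and the plain averaging rule are assumptions for illustration, not the paper's exact filter.

```python
import numpy as np

def spatial_filter_rss(positions, rss, radius=1.0):
    """Smooth RSS measurements by averaging over nearby survey points.

    positions : (N, 2) array of survey point coordinates in metres.
    rss       : (N,) array of RSS values (dBm), one per survey point.
    radius    : neighbourhood radius (hypothetical parameter; the
                paper's filter geometry may differ).
    """
    positions = np.asarray(positions, dtype=float)
    rss = np.asarray(rss, dtype=float)
    filtered = np.empty_like(rss)
    for i, p in enumerate(positions):
        # All survey points within `radius` of point i, including itself.
        mask = np.linalg.norm(positions - p, axis=1) <= radius
        filtered[i] = rss[mask].mean()
    return filtered

# Example: a clustered survey set, where each cluster supplies the
# neighbours used to average out the distance-varying noise.
pts = np.array([[0, 0], [0.3, 0.2], [0.1, 0.4], [5, 5], [5.2, 4.8]])
noisy = np.array([-60.0, -63.0, -58.0, -75.0, -77.0])
print(spatial_filter_rss(pts, noisy, radius=1.0))
```

With a clustered survey set, each point has several close neighbours, so the average is taken over more samples of the same local channel, which is consistent with the reported accuracy gain over sparser layouts.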
Abstract: In this paper we present a general formalism for
establishing the family of selective regressor affine projection
algorithms (SR-APA). The SR-APA, the SR regularized APA (SR-RAPA),
the SR partial rank algorithm (SR-PRA), the SR binormalized
data reusing least mean squares (SR-BNDR-LMS), and the SR normalized
LMS with orthogonal correction factors (SR-NLMS-OCF)
algorithms are established by this general formalism. We demonstrate
the performance of the presented algorithms through simulations in
an acoustic echo cancellation scenario.
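The following sketch shows one selective-regressor affine projection update of the kind the formalism covers. The selection criterion used here (keep the regressors with the largest a priori errors) is an assumption for illustration; the paper's formalism defines the selection more generally, and the diagonal regularization makes this the SR-RAPA form of the update.

```python
import numpy as np

def sr_apa_update(w, X, d, mu=0.5, delta=1e-3, K=2):
    """One selective-regressor affine projection update (illustrative).

    w : (L,) current adaptive filter weights.
    X : (L, P) matrix whose columns are the P most recent input vectors.
    d : (P,) corresponding desired samples.
    K : number of regressors (columns) kept by the selection step.
    """
    e = d - X.T @ w                    # a priori errors for all P regressors
    sel = np.argsort(np.abs(e))[-K:]   # keep the K largest-error regressors
    Xs, es = X[:, sel], e[sel]
    # Regularized APA update restricted to the selected regressors.
    w = w + mu * Xs @ np.linalg.solve(Xs.T @ Xs + delta * np.eye(K), es)
    return w
```

Setting K = P recovers the ordinary (regularized) APA, while K = 1 collapses toward an NLMS-like update, which is how the named family members relate to one another.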
Abstract: This paper presents the convergence analysis
of a prediction-based blind equalizer for IIR channels.
Predictor parameters are estimated by using the recursive
least squares algorithm. It is shown that the prediction
error converges almost surely (a.s.) toward a scalar
multiple of the unknown input symbol sequence. It is
also proved that the convergence rate of the parameter
estimation error is of the same order as that given by the
law of the iterated logarithm.
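For concreteness, a minimal recursive least squares predictor of the kind the analysis assumes is sketched below; the predictor order, forgetting factor, and synthetic channel are illustrative choices, not the paper's setup. The prediction-error sequence it returns is the quantity shown to converge a.s. to a scalar multiple of the input symbols.

```python
import numpy as np

def rls_predictor(y, order=4, lam=0.99):
    """Recursive least squares estimation of linear predictor parameters.

    y     : received channel output sequence.
    order : predictor order (assumed here; the paper treats IIR channels).
    lam   : forgetting factor.
    Returns the prediction-error sequence and the final coefficients.
    """
    theta = np.zeros(order)            # predictor coefficients
    P = 1e3 * np.eye(order)            # inverse correlation estimate
    err = np.zeros(len(y))
    for t in range(order, len(y)):
        phi = y[t - order:t][::-1]     # regressor of past channel outputs
        e = y[t] - phi @ theta         # a priori prediction error
        k = P @ phi / (lam + phi @ P @ phi)
        theta += k * e                 # RLS parameter update
        P = (P - np.outer(k, phi @ P)) / lam
        err[t] = e
    return err, theta

# Demo: BPSK symbols through a toy channel (illustrative only).
symbols = np.sign(np.random.default_rng(1).normal(size=500))
y = np.convolve(symbols, [1.0, 0.5], mode="same")
pred_err, _ = rls_predictor(y)
```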
Abstract: The ever-increasing use of the World Wide Web results in
poor network performance. Several techniques have been developed to
reduce web traffic: compressing files, caching web pages at the client
side, smoothing the bursty nature of the traffic into a constant rate,
and so on. No single method has proved adequate for accessing
documents instantly over the Internet. In this paper, adaptive hybrid
algorithms are developed for
Internet. In this paper, adaptive hybrid algorithms are developed for
reducing web traffic. Intelligent agents are used for monitoring the
web traffic. Depending upon the bandwidth usage, the user's preferences,
server and browser capabilities, intelligent agents use the best
techniques to achieve maximum traffic reduction. Web caching,
compression, filtering, optimization of HTML tags, and traffic
dispersion are incorporated into this adaptive selection. Using this
new hybrid technique, latency is reduced by 20–60% and the cache hit
ratio increases by 40–82%.
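A minimal sketch of the agents' adaptive selection step is given below. The rule set, thresholds, and parameter names are hypothetical stand-ins; the paper's intelligent agents base the choice on monitored bandwidth usage, user preferences, and server/browser capabilities.

```python
def select_techniques(bandwidth_kbps, browser_accepts_gzip,
                      cache_hit_likely, prefers_images):
    """Pick a set of traffic-reduction techniques (illustrative sketch)."""
    chosen = []
    if cache_hit_likely:
        chosen.append("web caching")          # serve from client-side cache
    if browser_accepts_gzip:
        chosen.append("compression")          # gzip the response body
    if bandwidth_kbps < 256:
        chosen.append("filtering")            # drop heavy embedded objects
        if not prefers_images:
            chosen.append("HTML tag optimization")
    if bandwidth_kbps < 64:
        chosen.append("traffic dispersion")   # spread load across paths
    return chosen

print(select_techniques(128, True, False, False))
# -> ['compression', 'filtering', 'HTML tag optimization']
```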
Abstract: Current image-based individual human recognition
methods, such as fingerprint, face, or iris biometric modalities,
generally require a cooperative subject, views from certain aspects,
and physical contact or close proximity. These methods cannot
reliably recognize non-cooperating individuals at a distance in the
real world under changing environmental conditions. Gait, which
concerns recognizing individuals by the way they walk, is a relatively
new biometric without these disadvantages. The inherent gait
characteristic of an individual makes it irreplaceable and useful in
visual surveillance.
In this paper, an efficient gait recognition system for human
identification is proposed, based on two features: the width vector of
the binary silhouette and MPEG-7 region-based shape descriptors. In the
proposed method, foreground objects (i.e., humans and other moving
objects) are extracted by estimating the background with a Gaussian
Mixture Model (GMM), and a median filtering operation is subsequently
performed to remove noise from the background-subtracted image. A
moving target classification algorithm, based on shape and boundary
information, is then used to separate human beings (i.e., pedestrians)
from other foreground objects (e.g., vehicles). Subsequently, the width
vector of the outer contour of the binary silhouette and the MPEG-7
Angular Radial Transform coefficients are taken as the feature vector.
Next, Principal Component Analysis (PCA) is applied to the feature
vector to reduce its dimensionality. The extracted feature vectors are
used to train a Hidden Markov Model (HMM) for identifying individuals.
The proposed system is evaluated on a set of gait sequences, and the
experimental results show the efficacy of the proposed algorithm.
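The front end of such a pipeline can be sketched as follows. OpenCV's MOG2 subtractor stands in here for the paper's GMM background estimation, the video file name and parameters are placeholders, and the per-row width measure is one plausible reading of the silhouette width vector.

```python
import cv2
import numpy as np

# GMM background estimation, median filtering of the subtracted image,
# and extraction of the silhouette width vector (illustrative sketch).
cap = cv2.VideoCapture("gait_sequence.avi")   # placeholder file name
bg = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = bg.apply(frame)                    # GMM background subtraction
    mask = cv2.medianBlur(mask, 5)            # remove salt-and-pepper noise
    _, silhouette = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)
    # Width vector: per-row horizontal extent of the binary silhouette.
    width_vector = np.array([
        (cols.max() - cols.min() + 1) if cols.size else 0
        for cols in (np.flatnonzero(row) for row in silhouette)
    ])
    # width_vector would next be combined with MPEG-7 ART coefficients,
    # reduced by PCA, and fed to the HMM classifier.
cap.release()
```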
Abstract: A mobile agent is software that performs actions
autonomously and independently on behalf of a person or an
organization. Mobile agents are used for information search and
retrieval, filtering, intruder detection in networks, and so on. One of
the important issues with mobile agents is their security: several
security concerns must be addressed for mobile agents to be used
effectively and safely. One of these concerns is protecting the
integrity of mobile agents.
In this paper, the existing methods are reviewed and the advantages
and disadvantages of each are examined. Given that each method has its
own strengths and weaknesses, combining these methods promises a
better way of protecting the integrity of mobile agents. Such a
combined method is therefore proposed in this paper and evaluated
against the existing methods. Finally, the method is simulated, and the
results indicate an improved ability to protect the integrity of
mobile agents.
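One classical building block that such combined methods draw on is hash chaining of the agent's partial results, sketched below. The initial nonce, data layout, and function name are hypothetical; the paper's combined method may compose this with other mechanisms such as signatures or encryption.

```python
import hashlib
import json

def chain_digest(prev_digest, partial_result):
    """Extend a chained hash over the agent's partial results.

    Each visited host extends the chain with its own result, so a later
    host cannot alter an earlier entry without breaking every subsequent
    digest (illustrative sketch of one integrity-protection idea).
    """
    payload = json.dumps(partial_result, sort_keys=True).encode()
    return hashlib.sha256(prev_digest + payload).hexdigest().encode()

# The agent's owner verifies the chain after the agent returns home.
digest = b"agent-owner-secret-nonce"          # hypothetical initial value
itinerary_results = [{"host": "A", "offer": 10}, {"host": "B", "offer": 8}]
for result in itinerary_results:
    digest = chain_digest(digest, result)
print(digest.decode())
```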
Abstract: It is well known that, amid developments in the economic
sector and financial crises occurring throughout the world, volatility
measurement is one of the most important concepts in financial time
series analysis. In this paper we therefore discuss the volatility of
the Amman stock market (Jordan) over a certain period of time. Since
the wavelet transform is one of the best-known filtering methods and
has developed rapidly over the last decade, we compare it with the
traditional technique, the fast Fourier transform, to determine which
method is better suited for analyzing volatility. The comparison is
carried out on a number of statistical properties using Matlab.
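The shape of such a comparison is sketched below in Python rather than Matlab, on synthetic data rather than the Amman market series; the wavelet family, decomposition level, FFT cut-off, and squared-return volatility proxy are all illustrative choices.

```python
import numpy as np
import pywt

# Synthetic return series with slowly varying volatility.
rng = np.random.default_rng(0)
returns = rng.normal(0, 0.01, 512) * (1 + 0.5 * np.sin(np.linspace(0, 8, 512)))
vol = returns ** 2                       # squared returns as volatility proxy

# Wavelet filtering: zero the two finest detail levels, keep the rest.
coeffs = pywt.wavedec(vol, "db4", level=4)
coeffs[-1][:] = 0
coeffs[-2][:] = 0
vol_wavelet = pywt.waverec(coeffs, "db4")[: len(vol)]

# FFT filtering: zero all but the lowest frequency bins.
spec = np.fft.rfft(vol)
spec[16:] = 0                            # cut-off chosen for illustration
vol_fft = np.fft.irfft(spec, n=len(vol))

# Compare simple statistical properties of the two smoothed series.
for name, v in [("wavelet", vol_wavelet), ("fft", vol_fft)]:
    print(name, v.mean(), v.std())
```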
Abstract: In this paper, we propose a new approach to query-by-humming, focusing on MP3 song databases. Since melody representation is much more difficult for MP3 songs than for symbolic performance data, we choose to extract feature descriptors from the vocal part of the songs. Our approach is based on signal filtering, sub-band spectral processing, MDCT coefficient analysis, and peak energy detection that ignores the background music as much as possible. Finally, we apply a dual dynamic programming algorithm for feature similarity matching. Experiments show its online performance in terms of precision and efficiency.
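To make the matching step concrete, the sketch below runs a single standard dynamic time warping pass as a simplified stand-in for the paper's dual dynamic programming matcher; the contour values and song names are invented, and in the real system the features would come from the MDCT/peak-energy front end described above.

```python
import numpy as np

def dtw_distance(query, candidate):
    """Dynamic time warping distance between two feature sequences."""
    n, m = len(query), len(candidate)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(query[i - 1] - candidate[j - 1])
            # Extend the cheapest of the three admissible warping moves.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hummed query matched against two candidate melody contours.
hum = [2, 4, 5, 4, 2]
songs = {"song_a": [2, 4, 4, 5, 4, 2], "song_b": [7, 1, 6, 2, 8]}
best = min(songs, key=lambda s: dtw_distance(hum, songs[s]))
print(best)  # song_a: the closer contour under DTW
```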