Abstract: Overloading is a technique to accommodate more users
than the spreading factor N; it is a bandwidth-efficient scheme
for increasing the number of users within a fixed bandwidth. One
efficient way to overload a CDMA system is to use two sets of
orthogonal signal waveforms (O/O): the first set is assigned to
the N users and the second set to the M additional users. An
iterative interference cancellation technique is used to cancel
the interference between the two sets of users. In this paper, we
evaluate the performance of an overloading scheme in which the
first N users are assigned Walsh-Hadamard (WH) orthogonal codes
and the extra users are assigned the same WH codes overlaid by a
fixed (quasi) bent sequence [11]. This scheme is called the
Quasi-Orthogonal Sequence (QOS) O/O scheme and is part of the
cdma2000 standard [12], where it provides overloading in the
downlink with a single-user detector. The QOS scheme is a
balanced O/O scheme, in which the correlations between any set-1
and set-2 users are equalized. The allowable overload of this
scheme is investigated in the uplink on AWGN and Rayleigh fading
channels, such that the uncoded performance with an iterative
multistage interference cancellation detector remains close to
the single-user bound. It is shown that this scheme provides 19%
and 11% overloading with the SDIC technique for N = 16 and 64,
respectively, with an SNR degradation of less than 0.35 dB
compared to the single-user bound at a BER of 0.00001. On a
Rayleigh fading channel, however, the channel overloading reaches
45% (29 extra users) at a BER of 0.0005, with an SNR degradation
of about 1 dB compared to the single-user performance for N = 64.
This is a significant amount of channel overloading on a Rayleigh
fading channel.
Abstract: A learning content management system (LCMS) is an
environment supporting web-based learning content development.
The primary function of the system is to manage the learning
process and to generate content customized to the unique
requirements of each learner. Among the supporting tools offered
by several vendors, we propose to enhance LCMS functionality by
individualizing the presented content through an induction
capability. Our induction technique is based on rough set theory.
The induced rules are intended to serve as supportive knowledge
for guiding content-flow planning; they can also be used as
decision rules to help content developers manage the content
delivered to each individual learner.
Abstract: In this paper, the application of the Sliding Mode Control (SMC) technique to an Active Magnetic Bearing (AMB) system with varying rotor speed is considered. The gyroscopic effect and mass imbalance inherent in the system are proportional to the rotor speed, and this nonlinearity causes severe system instability as the rotor speed increases. Transformation of the AMB dynamic model into regular form shows that the gyroscopic effect and imbalance lie in the mismatched part of the system. An H2-based sliding surface that bounds the mismatched part is designed, and the surface parameter is obtained by solving a Linear Matrix Inequality (LMI). The performance of the controller applied to the AMB model is demonstrated through simulations under various system conditions.
Abstract: In this paper, channel estimation techniques are
considered as support methods for OFDM transmission systems
based on Non-Binary LDPC (Low-Density Parity-Check) codes.
Standard frequency-domain pilot-aided LS (Least Squares) and
LMMSE (Linear Minimum Mean Square Error) estimators are
investigated. Furthermore, an iterative algorithm that exploits
the NB-LDPC channel decoder is proposed to improve the
performance of the LMMSE estimator. Simulation results for
signals transmitted through fading mobile channels are presented
to compare the performance of the proposed channel estimators.
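The LS and LMMSE pilot-aided estimators mentioned above have standard textbook forms, which can be sketched as follows. This is a minimal illustration, not the paper's implementation: the channel correlation matrix, SNR and pilot values are assumed for the example, and the LMMSE expression is the simplified form valid for constant-modulus pilots.

```python
import numpy as np

def ls_estimate(y_pilot, x_pilot):
    """Least-squares channel estimate at pilot subcarriers: H_LS = Y / X."""
    return y_pilot / x_pilot

def lmmse_estimate(h_ls, r_hh, snr_linear):
    """LMMSE smoothing of the LS estimate (constant-modulus pilots):
    H_LMMSE = R_HH (R_HH + (1/SNR) I)^{-1} H_LS."""
    n = r_hh.shape[0]
    w = r_hh @ np.linalg.inv(r_hh + np.eye(n) / snr_linear)
    return w @ h_ls

# Toy example: 8 pilot subcarriers with an assumed exponentially
# decaying frequency-domain channel correlation.
rng = np.random.default_rng(0)
n_p = 8
r_hh = np.array([[0.9 ** abs(i - j) for j in range(n_p)]
                 for i in range(n_p)], dtype=complex)
h_true = rng.standard_normal(n_p) + 1j * rng.standard_normal(n_p)
x_pilot = np.ones(n_p, dtype=complex)            # unit-power pilots
noise = 0.1 * (rng.standard_normal(n_p) + 1j * rng.standard_normal(n_p))
y_pilot = h_true * x_pilot + noise

h_ls = ls_estimate(y_pilot, x_pilot)
h_lmmse = lmmse_estimate(h_ls, r_hh, snr_linear=100.0)
```

As the assumed SNR grows, the LMMSE weighting matrix approaches the identity and the LMMSE estimate collapses to the LS estimate, which is the expected limiting behaviour.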
Abstract: TTV is a non-enveloped, circular, single-stranded DNA
virus with a diameter of 30-32 nm that was first described in
Japan in 1997. TTV has been detected, without proven pathology,
in various populations, including blood donors and patients with
chronic HBV and HCV hepatitis. The aim of this study was to
determine the prevalence of TTV DNA in Iranian patients with
chronic hepatitis B and C. TTV DNA was studied in 442 samples
(202 with HBV, 138 with HCV and 102 controls) collected from the
southwest of Iran. All DNA extracted from serum was amplified
with TTV ORF1 gene-specific primers using the semi-nested PCR
technique. TTV DNA was detected in the serum of 8.9% and 10.8%
of patients with chronic hepatitis B and C, respectively. The
prevalence of TTV DNA in the serum of the 102 controls was 2.9%.
The results showed a significant association of TTV with HBV and
HCV in patients using a t-test (P
Abstract: In the framework of image compression by wavelet
transforms, we propose a perceptual method that incorporates
Human Visual System (HVS) characteristics in the quantization
stage. Indeed, the human eye does not have equal sensitivity
across the frequency bandwidth. Therefore, the clarity of the
reconstructed images can be improved by weighting the
quantization according to the Contrast Sensitivity Function
(CSF); visual artifacts at low bit rates are thereby minimized.
To evaluate our method, we use the Peak Signal-to-Noise Ratio
(PSNR) and a new evaluation criterion that takes visual factors
into account. The experimental results show that our technique
improves image quality at the same compression ratio.
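The PSNR used for evaluation above is a standard metric; a minimal sketch (the toy image and peak value are assumptions for the example, not data from the paper):

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB: 10 * log10(peak^2 / MSE)."""
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

# Toy example: a flat image with a constant error of 5 grey levels,
# so MSE = 25 and PSNR = 10 * log10(255^2 / 25) ≈ 34.15 dB.
img = np.full((4, 4), 100.0)
noisy = img + 5.0
print(round(psnr(img, noisy), 2))  # → 34.15
```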
Abstract: Enterprise Architecture (EA) is a framework for the description, coordination and alignment of all activities across an organization in order to achieve strategic goals using ICT enablers. A number of EA-compatible frameworks have been developed. In this paper, we mainly focus on the Federal Enterprise Architecture Framework (FEAF), since its reference models are plentiful; among these models, we are interested in its Business Reference Model (BRM). The test process is an important part of an EA project that is somewhat overlooked, and this lack of attention may cause drawbacks or even the failure of an enterprise architecture project. To address this issue, we use the International Software Testing Qualifications Board (ISTQB) framework and standard test suites to present a method for improving the EA testing process. The main challenge is how to relate the concepts of EA and ISTQB; in this paper, we propose a method for integrating them.
Abstract: A circular knitting machine produces fabric with more than two knitting tools. Variation of yarn tension between the different knitting tools causes different loop lengths of stitches during the knitting process. In this research, a new intelligent method is applied to control the loop length of stitches across the tools, based on the ideal shape of the stitches and the real angle of the stitch direction, since differing loop lengths cause stitch deformation and deviation of that angle. To measure the deviation of the stitch direction against the tension variation, an image processing technique was applied to pictures of different fabrics taken under constant front lighting. The measured deformation is then translated into the required compensation of the loop-length cam degree to correct the stitch deformation. A fuzzy control algorithm was applied to modify the loop length in the knitting tools. The presented method was tested on knitted fabrics of various structures and yarns. The results show that the method is usable for controlling the loop-length variation between different knitting tools, based on stitch deformation, for knitted fabrics with different structures, densities and yarn types.
Abstract: This paper presents a new study on the application of
optimization and regression analysis techniques to the optimal
calculation of the partial ratios of four-step helical gearboxes
for minimal gearbox length. Based on the moment equilibrium
condition of a mechanical system comprising four gear units and
their regular resistance condition, models for determining the
partial ratios of the gearboxes are proposed. In particular,
explicit models for calculating the partial ratios are derived
using regression analysis. With these models, the determination
of the partial ratios is accurate and simple.
Abstract: In this research paper, a slotted-coaxial-line-fed
cross-dipole excitation structure for a short backfire antenna
is proposed and developed to achieve reconfigurable circular
polarization. The cross dipole, which is fed by the slotted
coaxial line, consists of two orthogonal dipoles mounted on the
outer conductor of the coaxial line. A unique technique is
developed to generate reconfigurable circular polarization using
the cross-dipole configuration. The sub-reflector is supported
by the feed line and thus requires no extra support. The antenna
is developed on an elliptical ground plane with a dielectric
rim, making the antenna compact. It is demonstrated that the
cross-dipole-excited short backfire antenna can achieve a
voltage standing wave ratio (VSWR) bandwidth of 14.28% for 2:1
VSWR, an axial ratio of 0.2 dB with an axial-ratio (≤ 3 dB)
bandwidth of 2.14%, and a gain of more than 12 dBi. The
experimental results for the designed antenna structure are in
close agreement with computer simulations.
Abstract: The problem of classifying objects into one of several
predefined groups when the measured variables are a mixture of
different variable types has interested statisticians for many
years. Several methods for dealing with this situation have been
introduced, including parametric, semi-parametric and
nonparametric approaches. This paper discusses the problem of
classifying data when the number of measured mixed variables is
larger than the sample size. We propose an idea that integrates
a dimensionality reduction technique via principal component
analysis with a discriminant function based on the location
model. The study aims to offer practitioners another potential
tool for classification problems in which the observed variables
are mixed and too numerous.
Abstract: Combined therapy with Interferon and Ribavirin is the standard treatment for patients with chronic hepatitis C. However, the number of responders to this treatment is low, whereas its cost and side effects are high. There is therefore a clear need to predict a patient's response to the treatment from clinical information, in order to spare patients its drawbacks, intolerable side effects and wasted cost. Different machine learning techniques have been developed for this purpose, among them Associative Classification (AC) and Decision Trees (DT). The aim of this research is to compare the performance of these two techniques in predicting the virological response to the standard HCV treatment from clinical information. Data from 200 patients treated with Interferon and Ribavirin were analyzed using AC and DT; 150 cases were used to train the classifiers and 50 cases to test them. The experimental results showed that both techniques gave acceptable results; however, the best accuracy reached 92% for AC and 80% for DT.
Abstract: This paper highlights some interesting facts about South Africa's waste situation and management strategies, in particular Integrated Waste Management. South Africa supports a waste hierarchy by promoting cleaner production, waste minimisation, reuse, recycling and waste treatment, with disposal and remediation as the last preferred options in waste management. The drivers for waste management techniques are identified as increased demand for waste service provision; increased demand for waste minimisation, recycling and recovery; land-use, physical and environmental limitations; and socio-economic and demographic factors. The South African government recognizes the importance of scientific research, as outlined in the White Paper on Integrated Pollution and Waste Management (IP&WM) (DEAT, 2000).
Abstract: Estimates of temperature values at a specific time of day, derived from daytime and daily profiles, are needed for a number of environmental, ecological, agricultural and technical applications, ranging from natural hazard assessment and crop growth forecasting to the design of solar energy systems. The scope of this research is to investigate the efficiency of data mining techniques in estimating minimum, maximum and mean temperature values. To this end, a number of experiments were conducted with well-known regression algorithms, using temperature data from the city of Patras in Greece. The performance of these algorithms was evaluated using standard statistical indicators, such as the Correlation Coefficient and the Root Mean Squared Error.
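The two statistical indicators named above have standard definitions; a minimal sketch (the sample temperature values are invented for illustration and are not the Patras data):

```python
import math

def rmse(actual, predicted):
    """Root Mean Squared Error."""
    n = len(actual)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)

def correlation(actual, predicted):
    """Pearson correlation coefficient between observed and predicted values."""
    n = len(actual)
    ma = sum(actual) / n
    mp = sum(predicted) / n
    cov = sum((a - ma) * (p - mp) for a, p in zip(actual, predicted))
    sa = math.sqrt(sum((a - ma) ** 2 for a in actual))
    sp = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    return cov / (sa * sp)

# Hypothetical daily mean temperatures (°C) and model predictions.
t_actual = [12.0, 15.0, 18.0, 21.0]
t_pred = [11.0, 16.0, 17.0, 22.0]
print(rmse(t_actual, t_pred))          # each error is ±1 °C → RMSE = 1.0
print(correlation(t_actual, t_pred))   # close to 1 for a good fit
```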
Abstract: A novel feature selection strategy to improve recognition accuracy on faces affected by non-uniform illumination, partial occlusions and varying expressions is proposed in this paper. This technique is especially applicable in scenarios where the possibility of obtaining a reliable intra-class probability distribution is minimal due to a small number of training samples. Phase congruency features in an image are defined as the points where the Fourier components of that image are maximally in phase. These features are invariant to the brightness and contrast of the image under consideration, a property that makes lighting-invariant face recognition achievable. Phase congruency maps of the training samples are generated and a novel modular feature selection strategy is implemented. Smaller sub-regions from a predefined neighborhood within the phase congruency images of the training samples are merged to obtain a large set of features, arranged in order of increasing distance between the sub-regions involved in merging. The assumption behind the proposed region merging and arrangement strategy is that local dependencies among the pixels are more important than global dependencies. The obtained feature sets are then arranged in decreasing order of discriminating capability using a criterion function, the ratio of the between-class variance to the within-class variance of the sample set, in the PCA domain. The results indicate a large improvement in classification performance compared to baseline algorithms.
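The criterion function described above, the ratio of between-class to within-class variance, can be sketched per feature as follows. This is an illustrative implementation with toy data, not the paper's code; the phase congruency and PCA stages are omitted.

```python
import numpy as np

def fisher_ratio(features, labels):
    """Per-feature ratio of between-class variance to within-class variance.
    Higher values indicate better discriminating capability."""
    classes = np.unique(labels)
    overall_mean = features.mean(axis=0)
    between = np.zeros(features.shape[1])
    within = np.zeros(features.shape[1])
    for c in classes:
        x = features[labels == c]
        class_mean = x.mean(axis=0)
        between += len(x) * (class_mean - overall_mean) ** 2
        within += ((x - class_mean) ** 2).sum(axis=0)
    return between / within

# Toy data: feature 0 separates the two classes well; feature 1 is noise.
x = np.array([[0.0, 5.1], [0.2, 4.9], [5.0, 5.0], [5.2, 5.2]])
y = np.array([0, 0, 1, 1])
scores = fisher_ratio(x, y)
order = np.argsort(scores)[::-1]  # feature indices ranked by discriminability
```

Ranking feature sets by this score and keeping the top-ranked ones is the selection step the abstract refers to.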
Abstract: In Blind Source Separation (BSS) processing, taking
advantage of the scaling-factor indetermination and relying on
the floating-point representation, we propose a scaling
technique applied to the separation matrix that avoids
saturation or weakness in the recovered source signals. This
technique performs Automatic Gain Control (AGC) in an on-line
BSS environment. We demonstrate its effectiveness using an
implementation of a division-free BSS algorithm with two inputs
and two outputs. The technique is computationally cheap and
efficient for hardware implementation.
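One way to picture such an AGC on the separation matrix is to rescale each of its rows so the corresponding output neither saturates nor fades. The sketch below is a plausible illustration only, not the paper's algorithm: the power-of-two gain (which in a floating-point representation amounts to an exponent adjustment rather than a division) and the block-wise update are assumptions.

```python
import numpy as np

def agc_rescale(w, y_block, target_power=1.0):
    """Rescale each row of the separation matrix W so that the corresponding
    recovered source approaches unit power. Gains are restricted to powers
    of two, so the update is an exponent shift (no division needed)."""
    w = w.copy()
    for i, row_out in enumerate(y_block):
        power = np.mean(row_out ** 2)
        if power > 0:
            # nearest power-of-two gain bringing this output toward target power
            gain = 2.0 ** round(0.5 * np.log2(target_power / power))
            w[i] *= gain
    return w

# 2-input / 2-output example: one recovered source near saturation,
# one too weak.
w = np.eye(2)
y_block = np.array([[8.0, -8.0, 8.0, -8.0],       # power 64  -> scale down
                    [0.25, -0.25, 0.25, -0.25]])  # power 1/16 -> scale up
w_new = agc_rescale(w, y_block)
```

Applying the rescaled matrix to subsequent input blocks keeps the recovered sources within a usable dynamic range, which is the stated goal of the AGC step.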
Abstract: P2P networks are highly dynamic structures, since
their nodes (peer users) keep joining and leaving continuously.
In this paper, we study the effects of the network change rate
on query routing efficiency. First, we describe some background
and an abstract system model. The chosen routing technique makes
use of cached metadata from previous answer messages and also
employs a mechanism for broken-path detection and metadata
maintenance. Several metrics are used to show that the protocol
behaves quite well even with a high rate of node departures, but
above a certain threshold it literally breaks down and exhibits
considerable efficiency degradation.
Abstract: Bio-electrical responses obtained from freshwater
sediments using microbial fuel cell (MFC) technology were
investigated in this experimental study. During electricity
generation, organic matter in the sediment was microbially
oxidized under anaerobic conditions, with an electrode serving
as the terminal electron acceptor. It was found that the
sediment organic matter (SOM) associated with
electrochemically-active electrodes became more humified,
aromatic and polydispersed, and had a higher average molecular
weight, together with a decrease in the quantity of SOM. The
alteration of the SOM characteristics was analogous to that
commonly observed in the early stage of the SOM diagenetic
process (i.e., humification). These findings, together with an
elevation of the sediment redox potential, point to MFC
technology as a potential new soil/sediment remediation
technique, given its benefits of non-destructive electricity
generation and bioremediation.
Abstract: Skin color can provide a useful and robust cue
for human-related image analysis, such as face detection,
pornographic image filtering, hand detection and tracking,
and people retrieval in databases and on the Internet. The
major problem with such skin color detection algorithms is
that they are time-consuming and hence cannot be applied in
a real-time system. To overcome this problem, we introduce a
new fast skin detection technique that can be applied in
real time. In this technique, instead of testing every image
pixel to label it as skin or non-skin (as in classic
techniques), we skip a set of pixels. The rationale for the
skipping is the high probability that the neighbors of a
skin-colored pixel are also skin pixels, especially in adult
images, and vice versa. The proposed method can rapidly
classify skin and non-skin pixels, which in turn dramatically
reduces the CPU time required for the protection process.
Since many fast detection techniques are based on image
resizing, we also combine the proposed pixel-skipping
technique with image resizing to obtain better results. The
performance of the proposed skipping and hybrid techniques
is evaluated in terms of measured CPU time. Experimental
results demonstrate that the proposed methods achieve better
results than the relevant classic method.
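The pixel-skipping idea can be sketched as below. This is an illustrative reading of the abstract, not the authors' implementation: the RGB thresholds in the toy classifier and the strategy of copying a label to skipped neighbors are assumptions made for the example.

```python
import numpy as np

def is_skin(pixel):
    """Toy RGB skin classifier (illustrative thresholds, not the paper's model)."""
    r, g, b = int(pixel[0]), int(pixel[1]), int(pixel[2])
    return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

def detect_skin_skipping(image, step=2):
    """Classify only every `step`-th pixel and copy the label to the skipped
    neighbors, exploiting the spatial coherence of skin regions. Roughly
    1/step^2 of the pixels are actually tested."""
    h, w, _ = image.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(0, h, step):
        for x in range(0, w, step):
            label = is_skin(image[y, x])
            mask[y:y + step, x:x + step] = label  # propagate to skipped pixels
    return mask

# Synthetic test image: left half skin-like, right half blue background.
img = np.zeros((4, 8, 3), dtype=np.uint8)
img[:, :4] = (200, 120, 90)   # skin-like color
img[:, 4:] = (30, 60, 200)    # non-skin
mask = detect_skin_skipping(img, step=2)
```

With `step=2`, only a quarter of the pixels are classified, which is where the CPU-time saving comes from; larger steps trade accuracy at region boundaries for further speed.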
Abstract: The amount of information on the Web is increasing
tremendously. A number of search engines have been developed
for searching Web information and retrieving documents relevant
to an inquirer's needs. However, search engines often return
irrelevant documents among their results, since the search is
text-based rather than semantic-based. Information retrieval
research has introduced a number of approaches and
methodologies, such as profiling, feedback, query modification
and human-computer interaction, for improving search results.
Moreover, information retrieval has employed artificial
intelligence techniques and strategies, such as machine learning
heuristics, tuning mechanisms, user and system vocabularies and
logical theory, to capture users' preferences and use them to
guide the search by semantic rather than syntactic analysis.
Although valuable improvements in search results have been
recorded, surveys show that search engine users are still not
really satisfied with their results. Using ontologies for
semantic-based searching appears to be the key solution.
Adopting a profiling approach and using ontology-base
characteristics, this work proposes a strategy for finding the
exact meaning of query terms in order to retrieve information
relevant to user needs. The evaluation of the conducted
experiments shows the effectiveness of the suggested
methodology, and conclusions are presented.