Abstract: This paper describes a new supervised fusion (hybrid)
electrocardiogram (ECG) classification solution consisting of a new
QRS complex geometrical feature extraction method as well as a new version
of the learning vector quantization (LVQ) classification algorithm
aimed at overcoming the stability-plasticity dilemma. Toward this
objective, after detection and delineation of the major events of the ECG
signal via an appropriate algorithm, each QRS region and its
corresponding discrete wavelet transform (DWT) are treated as
virtual images, and each of them is divided into eight polar sectors.
Then, the curve length of each extracted segment is calculated
and used as an element of the feature space. To increase the
robustness of the proposed classification algorithm against noise,
artifacts and arrhythmic outliers, a fusion structure consisting of
five different classifiers, namely a Support Vector Machine (SVM), a
Modified Learning Vector Quantization (MLVQ) network and three Multi
Layer Perceptron-Back Propagation (MLP-BP) neural networks with
different topologies, was designed and implemented. The proposed
algorithm was applied to all 48 MIT-BIH Arrhythmia Database
records (within-record analysis), and the power of the classifier
to discriminate the different beat types of each record was
assessed; as a result, an average accuracy of Acc=98.51%
was obtained. The proposed method was also applied to six
arrhythmia classes (Normal, LBBB, RBBB, PVC, APB, PB) belonging
to 20 different records of the aforementioned database (between-
record analysis), and an average value of Acc=95.6% was achieved.
To evaluate the performance of the new proposed hybrid learning
machine, the obtained results were compared with similar peer-
reviewed studies in this area.
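As an illustration of the geometrical feature extraction described above, the following is a minimal sketch that computes curve-length features over eight polar sectors of a QRS excerpt treated as a virtual image; the sampling rate, the centring on the R peak and the toy waveform are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def polar_sector_curve_lengths(qrs, fs=360.0, n_sectors=8):
    """Curve length of a QRS excerpt split into eight polar sectors.

    The excerpt is viewed as a planar curve (time vs. amplitude) centred
    on its R peak; the length of the curve falling inside each angular
    sector is one element of the feature vector.
    """
    t = np.arange(len(qrs)) / fs            # time axis (s)
    r_idx = int(np.argmax(np.abs(qrs)))     # crude R-peak location (assumption)
    x, y = t - t[r_idx], qrs - qrs[r_idx]   # centre the virtual image
    theta = np.mod(np.arctan2(y[:-1], x[:-1]), 2 * np.pi)
    sector = (theta / (2 * np.pi / n_sectors)).astype(int)
    seg_len = np.hypot(np.diff(x), np.diff(y))   # per-sample curve length
    feats = np.zeros(n_sectors)
    for s, L in zip(sector, seg_len):
        feats[s] += L
    return feats

# Toy QRS-like pulse, for illustration only
qrs = np.exp(-np.linspace(-3, 3, 90) ** 2) - 0.3 * np.exp(-np.linspace(-6, 0, 90) ** 2)
print(polar_sector_curve_lengths(qrs))
```

The same computation would be repeated on the DWT of the QRS region, since the abstract states that both the QRS region and its DWT are divided into eight polar sectors.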
Abstract: As networking has become widespread, web-based learning
has become a trend in tool design. Moreover, five-axis
machining has recently been widely used in industry; however, it has
potential axis-table collision problems. This paper therefore aims at
proposing an efficient web-based collision detection tool for
five-axis machining. Because collision detection consumes heavy
resources that few devices can support, this research uses a
systematic, web-based approach to detect collisions. The
methodology includes kinematics analyses for five-axis motions, the
separating-axis method for collision detection, and computer
simulation for verification. The machine structure is modeled in STL
format in CAD software. The input to the detection system is the
g-code part program, which describes the tool motions that produce the
part surface. This research produced a simulation program in the C
programming language and demonstrated a five-axis machining
example with collision detection on a web site. The system simulates the
five-axis CNC motion for the tool trajectory and detects any collisions
according to the input g-codes, and it also supports a high-performance
web service benefiting from C. The results show that our method
improves computational efficiency by a factor of 4.5 compared with the
conventional detection method.
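The separating-axis method mentioned in the methodology can be illustrated with a minimal two-dimensional sketch; the real system works on 3D STL geometry and is written in C, so the planar shapes and the Python language used here are simplifying assumptions.

```python
import numpy as np

def project(poly, axis):
    """Project polygon vertices onto an axis; return the (min, max) interval."""
    dots = poly @ axis
    return dots.min(), dots.max()

def polygons_collide(poly_a, poly_b):
    """2D separating-axis test for two convex polygons (N x 2 vertex arrays).

    If the projections onto every edge normal overlap, no separating
    axis exists and the polygons intersect.
    """
    for poly in (poly_a, poly_b):
        for i in range(len(poly)):
            edge = poly[(i + 1) % len(poly)] - poly[i]
            axis = np.array([-edge[1], edge[0]])      # edge normal
            a_min, a_max = project(poly_a, axis)
            b_min, b_max = project(poly_b, axis)
            if a_max < b_min or b_max < a_min:        # gap found
                return False                          # separating axis exists
    return True

# Toy example: a machine-table face and a tool-holder face, as 2D slices
table = np.array([[0, 0], [4, 0], [4, 2], [0, 2]], dtype=float)
tool  = np.array([[3, 1], [5, 1], [5, 3], [3, 3]], dtype=float)
print(polygons_collide(table, tool))   # True: the two slices overlap
```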
Abstract: This research contribution presents the orbit design,
orbit propagator and geomagnetic field estimator for
nanosatellites, specifically for the upcoming CubeSat ICUBE-1 of
the Institute of Space Technology (IST), Islamabad, Pakistan. The
ICUBE mission is designed for a low Earth orbit at an approximate
altitude of 700 km. The presented work designs the
Keplerian elements for the ICUBE-1 orbit while incorporating the
mission requirements, and propagates the orbit using J2 perturbations.
The attitude determination system of ICUBE-1 consists of
attitude determination sensors such as a magnetometer and a sun sensor. The
geomagnetic field estimator is developed according to the
International Geomagnetic Reference Field (IGRF) model for comparison with the
magnetic field measurements made by the magnetometer for attitude
determination. The outputs of the propagator, namely the Keplerian elements
and the position and velocity vectors, together with the magnetic field vectors, are
compared and verified against the same scenario generated in the
Satellite Tool Kit (STK).
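A minimal sketch of the J2 secular perturbation rates that such a propagator accounts for is given below; the physical constants and the 700 km near-sun-synchronous example orbit are assumptions for illustration, not the actual ICUBE-1 elements.

```python
import numpy as np

MU = 3.986004418e14   # Earth's gravitational parameter (m^3/s^2)
RE = 6378137.0        # Earth's equatorial radius (m)
J2 = 1.08262668e-3    # second zonal harmonic

def j2_secular_rates(a, e, i):
    """Secular drift rates of RAAN and argument of perigee under J2.

    a: semi-major axis (m), e: eccentricity, i: inclination (rad).
    Returns (dRAAN/dt, dargp/dt) in rad/s.
    """
    n = np.sqrt(MU / a**3)                 # mean motion
    p = a * (1.0 - e**2)                   # semi-latus rectum
    factor = 1.5 * J2 * (RE / p) ** 2 * n
    draan = -factor * np.cos(i)                           # nodal regression
    dargp = 0.5 * factor * (5.0 * np.cos(i) ** 2 - 1.0)   # apsidal rotation
    return draan, dargp

# Illustrative 700 km circular orbit at a typical sun-synchronous inclination
a = RE + 700e3
draan, dargp = j2_secular_rates(a, 0.001, np.radians(98.2))
print(np.degrees(draan) * 86400, "deg/day RAAN drift")
```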
Abstract: The paper presents an overview of environmental
issues that may be expected with nuclear desalination. The analysis
of coupling nuclear power with desalination plants indicates that
adverse marine impacts can be mitigated with alternative intake
designs or cooling systems. The atmospheric impact of desalination
may be greatly reduced through the coupling with nuclear power,
while maximizing the socio-economic benefit for both processes. The
potential for tritium contamination of the desalinated water was
reviewed. Experience with the systems and practices related to the
radiological quality of the product water shows no examples of
cross-contamination. Furthermore, the indicators for the public
acceptance of nuclear desalination, as one of the most important
sustainability aspects of any such large project, show a positive trend.
From the data collected, a conclusion is made that nuclear
desalination should be supported by decision-makers.
Abstract: Fixed-point simulation results are used to measure the
performance of matrix inversion by Cholesky
decomposition. The fixed-point Cholesky decomposition algorithm
is implemented using a fixed-point reconfigurable processing
element. The reconfigurable processing element provides all
mathematical operations required by Cholesky decomposition. The
fixed-point word length analysis is based on simulations using
different condition numbers and different matrix sizes. Simulation
results show that a 16-bit word length gives sufficient performance
for small matrices with a low condition number. Larger matrices and
higher condition numbers require more dynamic range for a fixed-point
implementation.
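A minimal sketch of such a word-length experiment is shown below, assuming a simple Q5.10 fixed-point format and rounding of every intermediate result; the actual reconfigurable processing element and its arithmetic are not reproduced.

```python
import numpy as np

def quantize(x, word_len=16, frac_bits=10):
    """Round to a signed fixed-point grid with the given word length."""
    scale = 2 ** frac_bits
    max_int = 2 ** (word_len - 1) - 1
    q = np.clip(np.round(x * scale), -max_int - 1, max_int)
    return q / scale

def cholesky_fixed_point(A, word_len=16, frac_bits=10):
    """Cholesky factor L (A = L L^T) with every intermediate result quantized."""
    n = A.shape[0]
    L = np.zeros_like(A, dtype=float)
    for j in range(n):
        s = A[j, j] - np.sum(L[j, :j] ** 2)
        L[j, j] = quantize(np.sqrt(s), word_len, frac_bits)
        for i in range(j + 1, n):
            s = A[i, j] - np.sum(L[i, :j] * L[j, :j])
            L[i, j] = quantize(s / L[j, j], word_len, frac_bits)
    return L

# Compare against double precision on a small, well-conditioned matrix
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4 * np.eye(4)          # symmetric positive definite
L = cholesky_fixed_point(A)
print(np.max(np.abs(L @ L.T - A)))   # reconstruction error at a 16-bit word length
```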
Abstract: This paper investigates a possible
optimization of some linear algebra problems which can be
solved by parallel processing using special arrays called
systolic arrays. Special types of transformations are used in
the design of these arrays, and the characteristics of the
arrays are shown. The main focus is on the advantages of
these arrays in the parallel computation of matrix products,
with a special approach to the design of a systolic array for
matrix multiplication. Multiplication of large matrices requires
a lot of computational time, and its complexity is O(n³). Many
algorithms (both sequential and parallel) have been developed with
the purpose of minimizing the calculation time, and systolic
arrays are well suited for this purpose. In this paper we show
that using an appropriate transformation leads to
more optimal arrays for carrying out calculations of this type.
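The following is a minimal, cycle-level sketch of an n x n systolic array computing C = A B with skewed operand streams; it is a software illustration of the dataflow only, not one of the transformed designs derived in the paper.

```python
import numpy as np

def systolic_matmul(A, B):
    """Toy cycle-accurate simulation of an n x n systolic array for C = A B.

    A's rows are skewed and fed from the left, B's columns are skewed and
    fed from the top; each processing element multiplies the operands it
    receives, accumulates locally, and forwards them right / down.
    The result is ready after 3n - 2 cycles instead of n^3 sequential steps.
    """
    n = A.shape[0]
    a_reg = np.zeros((n, n))       # operand latched in each PE (from the left)
    b_reg = np.zeros((n, n))       # operand latched in each PE (from the top)
    C = np.zeros((n, n))
    for t in range(3 * n - 2):                      # total pipeline latency
        a_reg[:, 1:] = a_reg[:, :-1]                # shift operands one PE right
        b_reg[1:, :] = b_reg[:-1, :]                # shift operands one PE down
        for i in range(n):                          # inject skewed inputs
            k = t - i
            a_reg[i, 0] = A[i, k] if 0 <= k < n else 0.0
            b_reg[0, i] = B[k, i] if 0 <= k < n else 0.0
        C += a_reg * b_reg                          # every PE does one MAC per cycle
    return C

A = np.arange(9.0).reshape(3, 3)
B = np.eye(3) * 2
print(np.allclose(systolic_matmul(A, B), A @ B))   # True
```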
Abstract: In this paper the reference current for the Voltage Source
Converter (VSC) of the Shunt Active Power Filter (SAPF) is
generated using the Synchronous Reference Frame method,
incorporating a PI controller with an anti-windup scheme. The
proposed method improves the harmonic filtering by compensating
for the windup phenomenon caused by the integral term of the PI
controller.
Using the reference frame transformation, the current is transformed
from the a-b-c stationary frame to the rotating 0-d-q frame. Using
the PI controller, the current in the 0-d-q frame is controlled to
get the desired reference signal. A controller with integral action
combined with an actuator that becomes saturated can give some
undesirable effects. If the control error is so large that the integrator
saturates the actuator, the feedback path becomes ineffective because
the actuator will remain saturated even if the process output changes.
The integrator, being an unstable system, may then integrate to a very
large value, a phenomenon known as integrator windup.
Implementing the integrator anti-windup circuit turns off the
integrator action when the actuator saturates, hence improving the
performance of the SAPF and dynamically compensating harmonics
in the power network. In this paper the system performance is
examined with a Shunt Active Power Filter simulation model.
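A minimal sketch of the clamping-style anti-windup behaviour described above (integral action switched off while the actuator is saturated) is shown below; the gains, limits and sampling step are illustrative assumptions, not the SAPF tuning used in the paper.

```python
def pi_clamping_step(error, state, kp=0.5, ki=50.0,
                     u_min=-1.0, u_max=1.0, dt=1e-4):
    """One step of a PI controller with clamping (conditional integration).

    The integral action is switched off whenever the actuator is saturated
    and integrating further would drive it deeper into saturation, so the
    integral state cannot wind up to a very large value.
    """
    integ = state["integ"]
    v = kp * error + ki * integ                  # unsaturated PI output
    u = min(max(v, u_min), u_max)                # actuator saturation
    saturated = (v != u)
    pushing_deeper = (v > u_max and error > 0) or (v < u_min and error < 0)
    if not (saturated and pushing_deeper):
        integ += error * dt                      # integrate only when safe
    state["integ"] = integ
    return u

# With a large sustained error the integral state stays bounded
state = {"integ": 0.0}
for _ in range(1000):
    u = pi_clamping_step(error=5.0, state=state)
print(state["integ"], u)
```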
Abstract: Adhesively bonded joints are preferred over the
conventional methods of joining such as riveting, welding, bolting
and soldering. Some of the main advantages of adhesive joints
compared to conventional joints are the ability to join dissimilar
materials and damage-sensitive materials, better stress distribution,
weight reduction, fabrication of complicated shapes, excellent
thermal and insulation properties, vibration response and enhanced
damping control, smoother aerodynamic surfaces and an
improvement in corrosion and fatigue resistance. This paper presents
the behavior of adhesively bonded joints subjected to combined
thermal loadings, using numerical methods. The joint
configuration considers aluminum as the central adherend with six
different outer adherends, including aluminum, steel, titanium, boron-epoxy,
unidirectional graphite-epoxy and cross-ply graphite-epoxy,
and epoxy-based adhesives. Free expansion of the joint in the x
direction was permitted, and the stresses in the adhesive layer and at the
interfaces were calculated for the different adherends.
Abstract: A procedure for the preparation of clarified Pawpaw
juice was developed. About 750 ml of pawpaw pulp was measured into
each of two 1-litre measuring cylinders, A and B, heated to 40 °C and
cooled to 20 °C. 30 ml of pectinase was added to cylinder A, while
30 ml of distilled water was added to cylinder B. The enzyme-treated
sample (A) was allowed to digest for 5 hours, after which it was heated
to 90 °C for 15 minutes to inactivate the enzyme. The heated sample
was cooled, and with the aid of a muslin cloth the pulp was filtered
to obtain the clarified pawpaw juice. The juice was filled into 100 ml
plastic bottles, pasteurized at 95 °C for 45 minutes, cooled and stored
at room temperature. The sample treated with 30 ml of distilled water
underwent the same process. The freshly pasteurized sample was
analyzed for specific gravity, titratable acidity, pH, sugars and
ascorbic acid. The remaining sample was then stored for 2 weeks and
the above analyses repeated. There were differences between the
freshly pasteurized and stored samples in pH and ascorbic
acid levels, and the sample treated with pectinase yielded a higher
volume of juice than that treated with distilled water.
Abstract: Phishing, or stealing of sensitive information on the
web, has dealt a major blow to Internet Security in recent times. Most
of the existing anti-phishing solutions fail to handle the fuzziness
involved in phish detection, thus leading to a large number of false
positives. This fuzziness is attributed to the use of highly flexible and
at the same time, highly ambiguous HTML language. We introduce a
new perspective on phishing that tries to systematically prove
whether a given page is phished or not, using the corresponding
original page as the basis of comparison. It analyzes the layout of
the pages under consideration to determine the percentage distortion
between them, which is indicative of any form of malicious alteration. The
system design represents an intelligent system employing dynamic
assessment, which accurately identifies brand-new phishing attacks
and will prove effective in reducing the number of false positives.
This framework could potentially also be used as a knowledge base for
educating Internet users about phishing.
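The layout-distortion idea can be illustrated with a very small sketch that compares the tag sequences of the original and suspect pages; this is only an assumed, simplified stand-in for the paper's layout analysis, not the proposed system itself.

```python
from difflib import SequenceMatcher
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    """Collect the sequence of opening tags as a crude layout signature."""
    def __init__(self):
        super().__init__()
        self.tags = []
    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

def layout_distortion(original_html, suspect_html):
    """Percentage distortion between two pages' tag sequences (0 = identical)."""
    signatures = []
    for html in (original_html, suspect_html):
        parser = TagCollector()
        parser.feed(html)
        signatures.append(parser.tags)
    similarity = SequenceMatcher(None, signatures[0], signatures[1]).ratio()
    return 100.0 * (1.0 - similarity)

# Hypothetical pages: the suspect page injects an extra script element
original = "<html><body><div><form><input><input></form></div></body></html>"
suspect  = "<html><body><div><form><input><input><script></script></form></div></body></html>"
print(layout_distortion(original, suspect))   # small distortion: structurally close
```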
Abstract: Prediction of bacterial virulent protein sequences can
assist in the identification and characterization of novel
virulence-associated factors and in discovering drug/vaccine targets against
proteins indispensable to pathogenicity. Gene Ontology (GO)
annotation, which describes the functions of genes and gene products as a
controlled vocabulary of terms, has been shown to be effective for a
variety of tasks such as gene expression studies, GO annotation
prediction, protein subcellular localization, etc. In this study, we
propose a sequence-based method Virulent-GO by mining informative
GO terms as features for predicting bacterial virulent proteins.
Each protein in the datasets used by the existing method
VirulentPred is annotated by using BLAST to obtain its homologies
with known accession numbers for retrieving GO terms. After
investigating various popular classifiers using the same five-fold
cross-validation scheme, Virulent-GO using the single kind of GO
term features with an accuracy of 82.5% is slightly better than
VirulentPred with 81.8% using five kinds of sequence-based features.
For the evaluation of independent test, Virulent-GO also yields better
results (82.0%) than VirulentPred (80.7%). When evaluating a single
kind of feature with SVM, the GO term feature performs much better
than each of the five kinds of sequence-based features.
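A minimal sketch of the GO-term feature representation and five-fold cross-validation with an SVM is given below; the GO identifiers, labels and toy data are hypothetical, and the BLAST annotation step is replaced by hard-coded term lists.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Toy replacement for the BLAST/GO annotation step: each protein is mapped
# to the GO terms of its homologues (hard-coded, hypothetical examples).
go_vocab = ["GO:0009405", "GO:0005515", "GO:0016020", "GO:0003677"]
proteins = [
    (["GO:0009405", "GO:0016020"], 1),    # virulent
    (["GO:0005515"], 0),                  # non-virulent
    (["GO:0009405", "GO:0003677"], 1),
    (["GO:0005515", "GO:0016020"], 0),
] * 10  # replicated so five-fold cross-validation has enough samples

# Binary GO-term feature vectors and class labels
X = np.array([[1 if t in terms else 0 for t in go_vocab] for terms, _ in proteins])
y = np.array([label for _, label in proteins])

# Five-fold cross-validation of an SVM on the GO-term features
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
print(cross_val_score(clf, X, y, cv=5).mean())
```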
Abstract: In cellular networks, the limited availability of resources
has to be tapped to its fullest potential. In view of this, a
sophisticated averaging and voting technique is discussed in
this paper, wherein the available radio resources are utilized to the
fullest by taking into consideration several network and radio
parameters that decide when a handover has to be made,
thereby reducing the load on the base station. The increase in the load
on the base station may be due to several unnecessary handovers
taking place, which can be eliminated by making judicious use of the
radio and network parameters.
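A toy sketch of an averaging-and-voting handover trigger is given below; the parameters, thresholds and window length are purely hypothetical and only illustrate the idea of smoothing measurements and requiring agreement among several parameters before a handover is made.

```python
import numpy as np

def handover_decision(measurements, thresholds, votes_needed=2, window=8):
    """Averaging-and-voting handover trigger (illustrative only).

    Each radio/network parameter is averaged over a sliding window to
    smooth out fluctuations; a handover is requested only when enough
    averaged parameters vote that their threshold has been crossed,
    which avoids unnecessary handovers and load on the base station.
    """
    votes = 0
    for name, samples in measurements.items():
        avg = float(np.mean(samples[-window:]))      # sliding-window average
        limit, low_is_bad = thresholds[name]
        if (avg < limit) if low_is_bad else (avg > limit):
            votes += 1
    return votes >= votes_needed

# Hypothetical measurements from the serving cell
meas = {"rssi_dbm":  np.array([-88, -90, -93, -95, -97, -99, -101, -102]),
        "sinr_db":   np.array([9, 8, 7, 6, 5, 4, 3, 2]),
        "cell_load": np.array([0.5, 0.55, 0.6, 0.62, 0.65, 0.7, 0.72, 0.75])}
thr = {"rssi_dbm": (-95, True), "sinr_db": (6, True), "cell_load": (0.8, False)}
print(handover_decision(meas, thr))   # True: averaged RSSI and SINR both vote
```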
Abstract: Recent trends in building construction in Libya are
more toward tall (high-rise) building projects. As a consequence, a
better estimation of the lateral loading in the design process is
becoming the focus of a safe and cost-effective building industry. By
and large, Libya is not considered a potential earthquake-prone zone,
making wind the dominant lateral design load. Current design
practice in the country estimates wind speeds on a merely random
basis by applying a certain factor of safety to the chosen wind
speed. Therefore, the need for a more accurate estimation of wind
speeds in Libya was the motivation behind this study. Records of
wind speed data were collected from 22 meteorological stations in
Libya and were statistically analysed. The analysis of more than four
decades of wind speed records suggests that the country can be
divided into four zones of distinct wind speeds. A computer "survey"
program was used to draw a design wind speed contour map
for the state of Libya.
The paper presents the statistical analysis of Libya's recorded
wind speed data and proposes design wind speed values for a 50-year
return period that cover the entire country.
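As an illustration of how a design wind speed for a 50-year return period can be obtained from station records, the sketch below fits a Gumbel (Type I extreme value) distribution by the method of moments; the choice of distribution and the sample data are assumptions, not the paper's exact procedure or data.

```python
import numpy as np

def gumbel_return_wind_speed(annual_maxima, return_period=50.0):
    """Design wind speed for a given return period from annual maxima,
    assuming a Gumbel (Type I extreme value) distribution fitted by the
    method of moments."""
    x = np.asarray(annual_maxima, dtype=float)
    scale = np.sqrt(6.0) * x.std(ddof=1) / np.pi        # Gumbel scale parameter
    loc = x.mean() - 0.5772 * scale                     # Gumbel location parameter
    p = 1.0 - 1.0 / return_period                       # non-exceedance probability
    return loc - scale * np.log(-np.log(p))             # Gumbel quantile

# Hypothetical series of annual maximum wind speeds (m/s) at one station
speeds = [22.1, 25.4, 19.8, 27.0, 23.5, 21.2, 26.3, 24.8, 20.9, 28.1]
print(round(gumbel_return_wind_speed(speeds), 1), "m/s (50-year design speed)")
```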
Abstract: Fermented cassava flours (lafun) sold in Ogun and Oyo
States of Nigeria were collected from 10 markets for a period of two
months and analysed to determine their safety status. The presence of
trace metals was attributed to the high vehicular movement around the drying
sites and markets. Cyanide and moisture contents of the samples were
also determined to assess the adequacy of fermentation and drying.
The results showed that sample OWO had the highest cyanide
content of 16.02±0.12 mg/kg, while the lowest was found in
sample OJO with 10.51±0.10 mg/kg. The results also indicated that
sample TVE had the highest moisture content of 18.50±0.20%, while
sample OWO had the lowest at 12.46±0.47%. Copper and
lead levels were found to be highest in TVE, with values of 28.10 mg/kg
and 1.1 mg/kg respectively, while sample BTS had the lowest values
of 20.6 mg/kg and 0.05 mg/kg respectively. The high cyanide values
indicated inadequate fermentation.
Abstract: In this paper, we propose a face recognition algorithm
using AAM and Gabor features. Gabor feature vectors, which are well
known to be robust with respect to small variations of shape, scaling,
rotation, distortion, illumination and pose in images, are popularly
employed as feature vectors in many object detection and
recognition algorithms. EBGM, which is prominent among face
recognition algorithms employing Gabor feature vectors, requires
localization of facial feature points where Gabor feature vectors are
extracted. However, localization method employed in EBGM is based
on Gabor jet similarity and is sensitive to initial values. Wrong
localization of facial feature points affects face recognition rate. AAM
is known to be successfully applied to localization of facial feature
points. In this paper, we devise a facial feature point localization
method which first roughly estimate facial feature points using AAM
and refine facial feature points using Gabor jet similarity-based facial
feature localization method with initial points set by the rough facial
feature points obtained from AAM, and propose a face recognition
algorithm using the devised localization method for facial feature
localization and Gabor feature vectors. It is observed through
experiments that such a cascaded localization method based on both
AAM and Gabor jet similarity is more robust than the localization
method based on only Gabor jet similarity. It is also shown that the
proposed face recognition algorithm using this devised localization
method and Gabor feature vectors performs better than
conventional face recognition algorithms, such as EBGM, that use a
Gabor jet similarity-based localization method and Gabor feature
vectors.
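A minimal sketch of the Gabor jet similarity and the local refinement step described above is given below; the jet construction, window size and search radius are simplified assumptions rather than the EBGM/AAM implementation used in the paper.

```python
import numpy as np

def gabor_jet(patch, n_scales=5, n_orients=8):
    """Small stand-in for a Gabor jet: magnitudes of Gabor filter responses
    at the patch centre over several scales and orientations."""
    h, w = patch.shape
    ys, xs = np.mgrid[-(h // 2):h - h // 2, -(w // 2):w - w // 2]
    jet = []
    for s in range(n_scales):
        k = (np.pi / 2) / (np.sqrt(2) ** s)            # wavenumber per scale
        for o in range(n_orients):
            th = o * np.pi / n_orients
            kx, ky = k * np.cos(th), k * np.sin(th)
            carrier = np.exp(1j * (kx * xs + ky * ys))
            envelope = np.exp(-(k ** 2) * (xs ** 2 + ys ** 2) / 8.0)
            jet.append(abs(np.sum(patch * envelope * carrier)))
    return np.array(jet)

def jet_similarity(j1, j2):
    """Magnitude-only Gabor jet similarity (normalized dot product)."""
    return float(j1 @ j2 / (np.linalg.norm(j1) * np.linalg.norm(j2) + 1e-12))

def refine_point(image, model_jet, x0, y0, radius=3, win=16):
    """Refine an AAM-estimated feature point (x0, y0) by moving to the
    neighbouring pixel whose jet is most similar to the model jet."""
    best, best_xy = -np.inf, (x0, y0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            x, y = x0 + dx, y0 + dy
            patch = image[y - win // 2:y + win // 2, x - win // 2:x + win // 2]
            if patch.shape != (win, win):              # skip image borders
                continue
            s = jet_similarity(gabor_jet(patch), model_jet)
            if s > best:
                best, best_xy = s, (x, y)
    return best_xy

# Usage on a synthetic image; the "model jet" comes from the true landmark
rng = np.random.default_rng(0)
img = rng.random((64, 64))
model = gabor_jet(img[24:40, 24:40])            # jet at the true point (32, 32)
print(refine_point(img, model, x0=30, y0=34))   # recovers the true point (32, 32)
```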
Abstract: This paper presents a computational methodology
based on matrix operations for a computer based solution to the
problem of performance analysis of software reliability models
(SRMs). A set of seven comparison criteria has been formulated to
rank various non-homogeneous Poisson process software reliability
models proposed during the past 30 years to estimate software
reliability measures such as the number of remaining faults, software
failure rate, and software reliability. Selection of optimal SRM for
use in a particular case has been an area of interest for researchers in
the field of software reliability. Tools and techniques for software
reliability model selection found in the literature cannot be used with
a high level of confidence, as they use a limited number of model
selection criteria. A real data set of a medium-sized software project from
published papers has been used to demonstrate the matrix method.
The result of this study is a ranking of the SRMs based on the
permanent value of the criteria matrix formed for each model from
the comparison criteria. The software reliability model with the
highest value of the permanent is ranked first, and so on.
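A minimal sketch of ranking models by the permanent of a criteria matrix is shown below; the matrices and scores are hypothetical (and 3 x 3 rather than 7 x 7) and serve only to illustrate the matrix operation, not the paper's actual criteria values.

```python
import numpy as np
from itertools import permutations

def permanent(M):
    """Permanent of a square matrix by direct expansion over permutations
    (adequate for the small criteria matrices considered here)."""
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

# Hypothetical criteria matrices for three SRMs (illustrative values only)
models = {
    "SRM-A": np.array([[0.9, 0.2, 0.4], [0.3, 0.8, 0.5], [0.6, 0.1, 0.7]]),
    "SRM-B": np.array([[0.7, 0.4, 0.3], [0.2, 0.6, 0.4], [0.5, 0.3, 0.6]]),
    "SRM-C": np.array([[0.8, 0.3, 0.5], [0.4, 0.7, 0.2], [0.3, 0.2, 0.9]]),
}

# Rank models by the permanent of their criteria matrix, highest first
ranking = sorted(models, key=lambda m: permanent(models[m]), reverse=True)
print(ranking)
```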
Abstract: In this paper we study the fuzzy c-means clustering algorithm
combined with the principal components method. Demonstrative
analysis indicates that the new clustering method performs better than
some existing clustering algorithms. We also consider the validity of the
clustering method.
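A minimal sketch of the combined method, assuming projection onto the first two principal components followed by a plain fuzzy c-means loop on the scores, is given below; the data and parameters are illustrative, not those of the paper's experiments.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means: returns cluster centres and the membership matrix."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                              # memberships sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centres = (Um @ X) / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centres[:, None, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))              # standard FCM membership update
        U /= U.sum(axis=0)
    return centres, U

# PCA step first (as in the combined method above), then cluster the scores
X = np.vstack([np.random.default_rng(1).normal(0, 1, (30, 5)),
               np.random.default_rng(2).normal(4, 1, (30, 5))])
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                              # first two principal components
centres, U = fuzzy_c_means(scores, c=2)
print(centres)
```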
Abstract: The purpose of this study is to present a non-invasive
method for the evaluation of marginal adaptation in class V composite
restorations. Standardized class V cavities, prepared in extracted
human teeth, were filled with Premise (Kerr) composite. The
specimens were thermocycled. The interfaces were examined by
optical coherence tomography (OCT) combined with
confocal microscopy and fluorescence. The optical configuration
uses two single-mode directional couplers with a superluminescent
diode as the source at 1300 nm. The scanning procedure is similar to
that used in any confocal microscope, where the fast scanning is en-face
(line rate) and the depth scanning is much slower (at the frame
rate). Gaps at the interfaces as well as inside the composite resin
materials were identified. OCT has numerous advantages which
justify its use in vivo as well as in vitro in comparison with
conventional techniques.
Abstract: The output beam quality of a laser operating on multiple
transverse modes is relatively poor. In order to obtain better beam quality, one
may use an aperture inside the laser resonator, so that various
transverse modes can be selected. We have selected various
transverse modes both by simulation and by experiment. By
inserting a circular aperture inside the diode end-pumped pulsed Nd:YAG
laser resonator, we have obtained the TEM00, TEM01
and TEM20 modes and have studied which parameters can change the mode
shape. Then, we have determined the beam quality factor of the TEM00
Gaussian mode.
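For reference, the transverse intensity pattern of a Hermite-Gaussian TEM_mn mode can be sketched as below; the beam radius and sampling grid are arbitrary illustrative values, not the parameters of the Nd:YAG resonator studied here.

```python
import numpy as np
from scipy.special import eval_hermite

def tem_intensity(m, n, x, y, w=1.0):
    """Transverse intensity profile of a Hermite-Gaussian TEM_mn mode
    with beam radius w (arbitrary units, unnormalized)."""
    u = np.sqrt(2) * x / w
    v = np.sqrt(2) * y / w
    field = eval_hermite(m, u) * eval_hermite(n, v) * np.exp(-(x**2 + y**2) / w**2)
    return field ** 2                      # intensity is the squared field amplitude

# Sample the TEM00, TEM01 and TEM20 profiles selected by the intracavity aperture
xs = np.linspace(-2, 2, 5)
X, Y = np.meshgrid(xs, xs)
for mode in [(0, 0), (0, 1), (2, 0)]:
    print(mode, np.round(tem_intensity(*mode, X, Y), 3).max())
```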
Abstract: This paper presents a Geographic Information System (GIS) approach for qualifying and monitoring broadband lines in an efficient way. The methodology used for interpolation is the Delaunay Triangular Irregular Network (TIN). This method is applied to a case study of an ISP in Greece monitoring 120,000 broadband lines.
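A minimal sketch of Delaunay TIN interpolation over measured line-quality points is given below; the coordinates, the quality metric and the sample values are hypothetical, not data from the monitored ISP.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

# Hypothetical sample: (longitude, latitude) of measured broadband lines
# and a quality metric (e.g. attainable sync rate in Mbps) at each point.
rng = np.random.default_rng(0)
points = rng.uniform([23.5, 37.8], [24.0, 38.2], size=(200, 2))
quality = 24.0 - 30.0 * np.hypot(*(points - points.mean(axis=0)).T)

# Build the TIN (Delaunay triangulation) and interpolate linearly inside it
tin = Delaunay(points)
interp = LinearNDInterpolator(tin, quality)
print(interp(23.75, 38.0))     # estimated quality at an unmeasured location
```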