Abstract: This work presents a new phonetic transcription system based on a tree of hierarchical pronunciation rules expressed as context-specific grapheme-phoneme correspondences. The tree is automatically inferred from a phonetic dictionary by incrementally analyzing deeper context levels, eventually representing a minimal set of exhaustive rules that pronounce all the words in the training dictionary without errors and that can be applied to out-of-vocabulary words. The proposed approach improves upon existing rule-tree-based techniques in that it uses graphemes, rather than letters, as elementary orthographic units. A new linear algorithm for the segmentation of a word into graphemes is introduced to enable out-of-vocabulary grapheme-based phonetic transcription. Exhaustive rule trees provide a canonical representation of the pronunciation rules of a language that can be used not only to pronounce out-of-vocabulary words, but also to analyze and compare the pronunciation rules inferred from different dictionaries. The proposed approach has been implemented in C and tested on Oxford British English and Basic English. Experimental results show that grapheme-based rule trees represent phonetically sound rules and provide better performance than letter-based rule trees.
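As a rough illustration of word-to-grapheme segmentation, the sketch below uses a greedy longest-match scan over a tiny, hypothetical grapheme inventory (the paper's own linear algorithm and inferred inventory are not reproduced here):

```python
# Greedy longest-match segmentation of a word into graphemes.
# The grapheme inventory is a hypothetical toy set, not the one
# inferred by the paper's algorithm.
GRAPHEMES = {"sh", "ch", "th", "ph", "ea", "ou", "gh"} | set("abcdefghijklmnopqrstuvwxyz")
MAX_LEN = max(len(g) for g in GRAPHEMES)

def segment(word):
    """Split `word` into graphemes, preferring the longest match at each position."""
    out, i = [], 0
    while i < len(word):
        for length in range(min(MAX_LEN, len(word) - i), 0, -1):
            if word[i:i + length] in GRAPHEMES:
                out.append(word[i:i + length])
                i += length
                break
        else:
            out.append(word[i])  # unknown symbol: fall back to a single letter
            i += 1
    return out

print(segment("though"))  # ['th', 'ou', 'gh']
```

Each scan position advances by at least one character, so the pass is linear in the word length for a bounded maximum grapheme length.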
Abstract: Snake bite cases in Malaysia most often involve the species Naja naja and Calloselasma rhodostoma. To meet the need for a rapid snake venom detection kit in a clinical setting, plate- and dot-ELISA tests for the venoms of Naja naja sumatrana, Calloselasma rhodostoma and the cobra venom fraction V antigen were developed. Polyclonal antibodies were raised and used to prepare the reagents for the dot-ELISA test kit, which was tested in mouse, rabbit and virtual human models. The newly developed dot-ELISA kit was able to detect a minimum venom concentration of 244 ng/ml, with cross-reactivity of one antibody type. The dot-ELISA system was sensitive and specific for all three snake venom types in all tested animal models. The lowest minimum detectable venom concentration was in the rabbit model: 244 ng/ml of the cobra venom fraction V antigen. The highest minimum venom concentration was in mice: 1953 ng/ml across the venoms. The developed dot-ELISA system for the detection of the three snake venom types was successful, with a sensitivity of 95.8% and a specificity of 97.9%.
Abstract: This paper presents a modified, efficient inductive powering link based on an ASK modulator and a proposed efficient class-E power amplifier. The design covers the external part, located outside the body, which transfers power and data to implanted devices such as implanted microsystems for stimulating and monitoring nerves and muscles. The system operates at a low-band frequency of 10 MHz, within the industrial-scientific-medical (ISM) band, to avoid tissue heating. For the external part, the modulation index is 11.1% and the modulation rate 7.2%, with a data rate of 1 Mbit/s assuming Tbit = 1 µs. The system has been designed in a 0.35-µm CMOS fabrication technology. The mathematical model is given; the design is simulated using the OrCAD PSpice 16.2 software tool, and for real-time simulation the electronic workbench MULTISIM 11 has been used.
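For reference, the ASK modulation index is commonly computed as m = (A_max - A_min) / (A_max + A_min). The amplitudes below are illustrative assumptions, chosen so that the index matches the 11.1% reported for the external part:

```python
# ASK modulation index: m = (A_max - A_min) / (A_max + A_min).
# A_min/A_max = 0.8 is an assumed amplitude ratio, not a value from the paper.
def modulation_index(a_max, a_min):
    return (a_max - a_min) / (a_max + a_min)

m = modulation_index(1.0, 0.8)
print(f"{m:.1%}")  # 11.1%
```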
Abstract: The Least Significant Bit (LSB) technique is the earliest technique developed for watermarking, and it is also the simplest, most direct and most common one. It essentially embeds the watermark by replacing the least significant bit of the image data with a bit of the watermark data. The disadvantage of LSB is that it is not robust against attacks. In this study, the intermediate significant bit (ISB) has been used in order to improve the robustness of the watermarking system. The aim of this model is to replace the watermarked image pixels with new pixels that protect the watermark data against attacks while keeping the new pixels very close to the original pixels, in order to preserve the quality of the watermarked image. The technique is based on testing the value of the watermark pixel against the range of each bit-plane.
Abstract: This paper presents a new circuit arrangement for a current-mode Wheatstone bridge that is suitable for low-voltage integrated circuit implementation. Compared to other proposed circuits, this circuit features a substantial reduction in the number of elements, a low supply voltage (1 V) and low power consumption (
Abstract: Bagging and boosting are among the most popular resampling ensemble methods, generating and combining a diversity of classifiers that use the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data; however, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we built an ensemble using a voting methodology over bagging and boosting ensembles with 10 sub-classifiers each. We performed a comparison with simple bagging and boosting ensembles with 25 sub-classifiers, as well as with other well-known combining methods, on standard benchmark datasets, and the proposed technique was the most accurate.
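The combining step can be sketched as a plurality vote over the predictions of the constituent models. The predictions below are made-up placeholders standing in for the outputs of the bagging and boosting sub-ensembles:

```python
from collections import Counter

def majority_vote(predictions_per_model):
    """Combine label predictions from several models (rows = models,
    columns = samples) by plurality vote per sample."""
    n_samples = len(predictions_per_model[0])
    combined = []
    for j in range(n_samples):
        votes = Counter(row[j] for row in predictions_per_model)
        combined.append(votes.most_common(1)[0][0])
    return combined

# hypothetical per-sample predictions from three ensemble members
bagging  = ["a", "b", "a"]
boosting = ["a", "a", "b"]
third    = ["b", "a", "a"]
print(majority_vote([bagging, boosting, third]))  # ['a', 'a', 'a']
```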
Abstract: In the present work, we propose a new technique to enhance the learning capabilities and reduce the computational cost of a competitive-learning multi-layered neural network using the K-means clustering algorithm. The proposed model uses a multi-layered network architecture with a backpropagation learning mechanism. The K-means algorithm is first applied to the training dataset to reduce the number of samples presented to the neural network, by automatically selecting an optimal set of samples. The obtained results demonstrate that the proposed technique performs exceptionally well, in terms of both accuracy and computation time, when applied to the KDD99 dataset, compared to a standard learning scheme that uses the full dataset.
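The data-reduction step can be illustrated with a plain K-means pass whose centroids serve as the reduced training set. This is a sketch of the general idea; the paper's sample-selection criterion and distance setup are not reproduced:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means on tuples of coordinates.  The returned centroids
    stand in for the reduced sample set presented to the network."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest centroid (squared distance)
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # recompute centroid as the cluster mean
                centroids[i] = tuple(sum(x) / len(cl) for x in zip(*cl))
    return centroids

points = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centers = kmeans(points, 2)
print(centers)  # one centroid per cluster of nearby points
```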
Abstract: Decreases in hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. One of the biggest issues in such systems is the development of effective techniques/algorithms for distributing the processes/load of a parallel program over multiple hosts so as to achieve goals such as minimizing execution time, minimizing communication delays, maximizing resource utilization and maximizing throughput. Substantive research using queuing analysis, assuming job arrivals that follow a Poisson pattern, has shown that in a multi-host system the probability of one host being idle while another host has multiple jobs queued up can be very high. Such imbalances in system load suggest that performance can be improved either by transferring jobs from the currently heavily loaded hosts to the lightly loaded ones or by distributing load evenly/fairly among the hosts. The algorithms that achieve these goals are known as load-balancing algorithms. They come in two basic categories, static and dynamic: whereas static load balancing (SLB) algorithms decide the assignment of tasks to processors at compile time, based on average estimated values of process execution times and communication delays, dynamic load balancing (DLB) algorithms are adaptive to changing situations and take decisions at run time.
The objective of this paper is to identify qualitative parameters for the comparison of the above algorithms. In future, this work can be extended to develop an experimental environment for studying these load-balancing algorithms quantitatively, based on comparative parameters.
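A minimal example of the dynamic flavor is least-loaded dispatch: each arriving job goes to the host with the smallest current load. This is a generic illustration of run-time decision making, not a specific algorithm from the survey, and the job costs below are made up:

```python
import heapq

def assign_jobs(job_costs, n_hosts):
    """Dispatch each arriving job to the currently least-loaded host,
    using a min-heap of (load, host) pairs.  Returns the host chosen
    for each job in arrival order."""
    heap = [(0, h) for h in range(n_hosts)]
    heapq.heapify(heap)
    placement = []
    for cost in job_costs:
        load, host = heapq.heappop(heap)   # least-loaded host
        placement.append(host)
        heapq.heappush(heap, (load + cost, host))
    return placement

print(assign_jobs([4, 3, 2, 5, 1], 2))  # [0, 1, 1, 0, 1]
```

A static scheme would instead fix this mapping at compile time from estimated costs; the dynamic version reacts to the loads actually accumulated so far.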
Abstract: This paper simulates ad-hoc mesh networks in rural areas, where such networks receive great attention due to their low cost, since installing the infrastructure of regular networks in these areas is prohibitively expensive. The distance between communicating nodes is the main obstacle an ad-hoc mesh network faces. For example, in Terranet technology, two nodes can communicate only if they are within one kilometer of each other; if the distance between them is greater, each node in the ad-hoc mesh network has to act as a router that forwards the data it receives to other nodes. In this paper, we try to find the critical number of nodes that makes the network fully connected in a particular area, and then propose a method to encourage intermediate nodes to accept acting as routers that forward data from sender to receiver. Much work has been done on technological changes in peer-to-peer networks, but the focus of this paper is another aspect: finding the minimum number of nodes needed for a particular area to be fully connected, and then encouraging users to switch on their phones and accept working as routers for other nodes. Our method raises the rate of successful calls to 81.5% of attempted calls.
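The connectivity question can be sketched as follows: place nodes at random in a square area, link any pair within the 1 km direct range, and estimate the smallest node count for which the resulting geometric graph is usually fully connected. The 90% success threshold and trial count are illustrative assumptions:

```python
import random

RANGE_KM = 1.0  # Terranet-style direct-link range

def fully_connected(nodes, radius=RANGE_KM):
    """True if every node can reach every other via hops of at most
    `radius` (flood-fill over the geometric graph)."""
    if not nodes:
        return True
    seen, frontier = {0}, [0]
    while frontier:
        i = frontier.pop()
        for j in range(len(nodes)):
            if j not in seen:
                dx = nodes[i][0] - nodes[j][0]
                dy = nodes[i][1] - nodes[j][1]
                if dx * dx + dy * dy <= radius * radius:
                    seen.add(j)
                    frontier.append(j)
    return len(seen) == len(nodes)

def critical_nodes(area_km, trials=200, seed=1):
    """Estimate the smallest node count for which random placements in
    an area_km x area_km square are fully connected in >= 90% of trials."""
    rng = random.Random(seed)
    for n in range(2, 200):
        ok = sum(
            fully_connected([(rng.uniform(0, area_km), rng.uniform(0, area_km))
                             for _ in range(n)])
            for _ in range(trials)
        )
        if ok / trials >= 0.9:
            return n
    return None
```

In a 0.5 km square every pair of points is within 1 km, so two nodes already suffice; larger areas need more nodes before the 90% threshold is met.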
Abstract: The element of justice or al-‘adl in the context of
Islamic critical thinking deals with the notion of justice in a thinking
process which critically rationalizes the truth in a fair and objective
manner with no irrelevant interference that can jeopardize a sound
judgment. This Islamic axiological element is vital in technological
decision making as it addresses the issues of religious values and
ethics that are primarily set to fulfill the purpose of human life on
earth. The main objective of this study was to examine and analyze
the perception of Muslim engineering students in Malaysian higher
education institutions towards the concept of al-‘adl as an essential
element of Islamic critical thinking. The study employed mixed
methods approach that comprises data collection from the
questionnaire survey and the interview responses. A total of 557
Muslim engineering undergraduates from six Malaysian universities
participated in the study. The study generally indicated that Muslim engineering undergraduates in these higher institutions have a rather good comprehension and consciousness of al-‘adl, though with only slight awareness of the importance of objective thinking. Nonetheless, a few items on the concept implied a comparatively low perception of rational justice in Islam as the means to grasp the ultimate truth.
Abstract: Ultrathin (UTD) and Nanoscale (NSD) SOI-MOSFET devices, sharing a similar W/L but with channel thicknesses of 46 nm and 1.6 nm respectively, were fabricated using a selective “gate recessed” process on the same silicon wafer. Electrical transport characterization at room temperature has shown a large difference between the two kinds of devices, which has been interpreted in terms of a huge, unexpected series resistance. The electrical characteristics of the Nanoscale device, taken in the linear region, can be analytically derived from those of the ultrathin device. A comparison of the structure and composition of the layers, using advanced techniques such as Focused Ion Beam (FIB) and High Resolution TEM (HRTEM) coupled with Energy Dispersive X-ray Spectroscopy (EDS), helps explain the difference in transport between the devices.
Abstract: A method and apparatus for noninvasive measurement of blood glucose concentration, based on a laser beam transilluminated through the index finger, are reported in this paper. The method relies on an atomic gas (He-Ne) laser operating at a wavelength of 632.8 nm. During measurement, the index finger is inserted into the glucose sensing unit; the transilluminated optical signal is converted into an electrical signal, compared with a reference electrical signal, and the resulting difference signal is processed by a signal processing unit, which presents the results as a blood glucose concentration. This method would enable monitoring the blood glucose level of diabetic patients continuously, safely and noninvasively.
Abstract: This work deals with unsupervised image deblurring. We present a new deblurring procedure for images provided by low-resolution synthetic aperture radar (SAR) or by multimedia devices, in the presence of multiplicative (speckle) or additive noise, respectively. The method we propose is a two-step process. First, we use an original technique for noise reduction in the wavelet domain. Then, a Kohonen self-organizing map (SOM) is trained directly on the denoised image to remove the blur. The technique has been successfully applied to real SAR images, and simulation results are presented to demonstrate the effectiveness of the proposed algorithms.
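The wavelet-domain denoising step can be illustrated in one dimension with a single-level Haar transform and soft thresholding of the detail coefficients. This is a minimal sketch of the general principle; the paper's own technique is more elaborate and operates on 2-D images:

```python
def haar_denoise(signal, thresh):
    """One-level Haar transform, soft-threshold the detail coefficients,
    then invert.  Small details (noise-dominated) are shrunk to zero."""
    n = len(signal) // 2 * 2  # even length
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, n, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, n, 2)]
    # soft thresholding: shrink each detail toward zero by `thresh`
    detail = [max(abs(d) - thresh, 0) * (1 if d >= 0 else -1) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])  # inverse Haar step
    return out

print(haar_denoise([1, 1.2, 5, 4.9], 0.2))  # small jitter removed, edge kept
```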
Abstract: In our modern world, more and more physical transactions are being replaced by electronic transactions (e.g. banking, shopping and payments), and many businesses and companies perform most of their operations through the internet. Instead of physical commerce, internet visitors are now adapting to electronic commerce (e-Commerce). The ability of web users to reach products worldwide can be greatly enhanced by creating friendly, personalized online business portals: internet visitors return to a particular website when they can easily find the information they need or want. Dealing with this human conceptualization motivates the incorporation of Artificial/Computational Intelligence techniques into the creation of customized portals. Among these techniques, fuzzy-set technologies can make many useful contributions to the development of such a human-centered endeavor as e-Commerce. The main objective of this paper is the implementation of a paradigm for the intelligent design and operation of human-computer interfaces. In particular, the paradigm is well suited to the intelligent design and operation of software modules that display information (such as Web pages, graphical user interfaces (GUIs) and multimedia modules) on a computer screen. The human conceptualization of the user's personal information is analyzed through a cascaded fuzzy inference (decision-making) system to generate the User Ascribe Qualities, which identify the user and can be used to customize portals with appropriate Web links.
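A cascaded fuzzy inference can be sketched as one stage whose fuzzy output feeds the next. Everything below (membership functions, rule, thresholds and the browsing inputs) is a hypothetical toy, not the paper's system:

```python
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b] and falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def user_interest(pages_visited, minutes_on_site):
    """Tiny two-stage (cascaded) fuzzy sketch: stage 1 grades
    'engagement' from raw browsing figures; stage 2 maps that degree to
    a portal-customization decision."""
    # stage 1: fuzzify inputs and fire a min-rule for "engaged"
    many_pages = tri(pages_visited, 5, 15, 25)
    long_visit = tri(minutes_on_site, 2, 10, 18)
    engaged = min(many_pages, long_visit)
    # stage 2: the stage-1 membership degree drives the next decision
    return "show rich portal" if engaged > 0.5 else "show plain portal"

print(user_interest(15, 10))  # show rich portal
```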
Abstract: Diagnosis can be achieved by building a model of a certain organ under surveillance and comparing it with real-time physiological measurements taken from the patient. This paper presents the benefits of using Data Mining techniques in computer-aided diagnosis (CAD), focusing on cancer detection, in order to help doctors make optimal decisions quickly and accurately. In the field of noninvasive diagnosis techniques, endoscopic ultrasound elastography (EUSE) is a recent elasticity imaging technique that allows characterizing the difference between malignant and benign tumors. The main features of the EUSE sample movies are digitized and summarized in vector form using exploratory data analysis (EDA). Neural networks are then trained on the corresponding EUSE sample movie input vectors in such a way that these intelligent systems are able to offer a very precise and objective diagnosis, discriminating between benign and malignant tumors. A concrete application of these Data Mining techniques illustrates the suitability and reliability of this methodology in CAD.
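The classification stage (feature vector in, benign/malignant label out) can be illustrated with a minimal perceptron on made-up, linearly separable toy vectors; the paper's actual network architecture and EUSE features are not reproduced:

```python
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Minimal perceptron trained on feature vectors; labels are
    0 (benign) or 1 (malignant).  A stand-in for the paper's network."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # perceptron update rule
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# toy feature vectors where the first feature alone separates the classes
w, b = train_perceptron([[0, 0], [0, 1], [1, 0], [1, 1]], [0, 0, 1, 1])
predict = lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
print([predict(s) for s in [[0, 0], [0, 1], [1, 0], [1, 1]]])  # [0, 0, 1, 1]
```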
Abstract: The need in cognitive radio systems for a simple, fast and independent technique to sense spectrum occupancy has led to the energy detection approach. The energy detector is known for its dependence on noise variation in the system, which is one of its major drawbacks. In this paper, we aim to improve its performance by utilizing weighted collaborative spectrum sensing, similar to the collaborative spectrum sensing methods introduced previously in the literature. These weighting methods improve collaborative spectrum sensing compared to the unweighted case. Two methods are proposed in this paper: the first depends on the channel status between each sensor and the primary user, while the second depends on the value of the energy measured at each sensor.
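The fusion step can be sketched as a weighted combination of per-sensor energy measurements compared against a detection threshold. The energies, weights and threshold below are illustrative, not values from the paper:

```python
def weighted_decision(energies, weights, threshold):
    """Fuse per-sensor energy measurements with weights (e.g. derived
    from each sensor's channel quality) and compare the normalized
    weighted sum to a detection threshold.
    Returns True when the primary user is declared present."""
    fused = sum(w * e for w, e in zip(weights, energies)) / sum(weights)
    return fused > threshold

# sensor 2 sees a poor channel, so its (unreliable) reading is down-weighted
energies = [3.1, 2.9, 9.0]
weights = [1.0, 1.0, 0.1]
print(weighted_decision(energies, weights, 4.0))  # False
```

With equal weights the outlying reading would dominate and flip the decision, which is exactly what channel-aware weighting is meant to prevent.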
Abstract: In this paper, the problem of stability analysis for a class of impulsive stochastic fuzzy neural networks with time-varying delays and reaction-diffusion is considered. By utilizing a suitable Lyapunov-Krasovskii functional, the inequality technique and stochastic analysis techniques, some sufficient conditions ensuring global exponential stability of the equilibrium point for impulsive stochastic fuzzy cellular neural networks with time-varying delays and diffusion are obtained. In particular, an estimate of the exponential convergence rate is also provided, which depends on the system parameters, the diffusion effect and the impulsive disturbance intensity. It is believed that these results are significant and useful for the design and application of fuzzy neural networks. An example is given to show the effectiveness of the obtained results.
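For orientation, global exponential stability conclusions of this kind typically take the following generic form (a textbook statement, not the paper's exact theorem; $M$, $\lambda$ and the norm are placeholders):

```latex
\[
  \| u(t,x) - u^{*} \| \;\le\; M \, \| \phi - u^{*} \| \, e^{-\lambda t},
  \qquad t \ge 0,
\]
```

where $u^{*}$ is the equilibrium point, $\phi$ the initial function, and $\lambda > 0$ the exponential convergence rate whose estimate the paper provides.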
Abstract: The aim of every software product is to achieve an appropriate level of software quality. Developers and designers try to produce readable, reliable, maintainable, reusable and testable code. Several approaches have been utilized to help achieve these goals. In this paper, a refactoring technique is used to evaluate software quality via a quality index, composed of different metric sets that describe various quality aspects.
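One simple way to compose metric sets into a single index is a weighted mean of normalized scores. The metric names, weights and scores below are hypothetical, not the paper's metric sets:

```python
# Hypothetical quality aspects and weights (must sum to 1.0).
WEIGHTS = {"readability": 0.3, "maintainability": 0.3,
           "reusability": 0.2, "testability": 0.2}

def quality_index(scores):
    """Weighted sum of per-aspect scores, each normalized to [0, 1]."""
    return sum(WEIGHTS[m] * scores[m] for m in WEIGHTS)

before = {"readability": 0.5, "maintainability": 0.4, "reusability": 0.6, "testability": 0.5}
after  = {"readability": 0.7, "maintainability": 0.6, "reusability": 0.6, "testability": 0.7}
print(quality_index(after) > quality_index(before))  # True: refactoring raised the index
```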
Abstract: This paper proposes new optimization techniques for a gas processing plant with uncertain feed and product flows. The problem is first formulated using a continuous linear deterministic approach. Subsequently, single and joint chance-constrained models for a steady-state process with time-dependent uncertainties are developed. The solution approach is based on converting the probabilistic problems into their equivalent deterministic form and solving them at different confidence levels. A case study of a real plant operation has been used to effectively implement the proposed model. The optimization results indicate that, for a plant operating under uncertain feed and product flows, decisions have to be made in advance while satisfying all the constraints, at a 95% confidence level for the single chance-constrained case and an 85% confidence level for the joint chance-constrained case.
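For a single chance constraint with a normally distributed right-hand side, the deterministic equivalent tightens the bound by a quantile of the distribution: P(a·x ≤ b) ≥ α with b ~ Normal(μ, σ) becomes a·x ≤ μ + σ·Φ⁻¹(1 − α). The sketch below illustrates this standard conversion (the distribution parameters are made up; the paper's joint case is more involved):

```python
from statistics import NormalDist

def deterministic_rhs(mean, std, confidence):
    """Deterministic-equivalent right-hand side of a single chance
    constraint P(a.x <= b) >= confidence, with b ~ Normal(mean, std):
    a.x <= mean + std * inv_cdf(1 - confidence)."""
    z = NormalDist().inv_cdf(1 - confidence)
    return mean + std * z

# at 95% confidence the constraint is tightened well below the mean flow
print(round(deterministic_rhs(100, 10, 0.95), 2))  # 83.55
```

Raising the confidence level tightens the bound further, which is why the joint chance-constrained case settles for 85% where the single case achieves 95%.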
Abstract: Over the past decade, mobile technology has experienced a revolution that will ultimately change the way we communicate. All these technologies share a common denominator, the exploitation of computer information systems, but their operation can be tedious because of problems with heterogeneous data sources. To overcome these problems, we propose adding an extra layer that interfaces management or supervision applications with the different data sources. This layer is materialized by the implementation of a mediator between the various host applications and the frequently used hierarchical and relational information systems, such that the heterogeneity is completely transparent to the VoIP platform.
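The mediator layer can be sketched as a uniform facade over heterogeneous backends, so callers never see which source answered. All the class names, lookup APIs and data below are hypothetical stand-ins for real hierarchical and relational systems:

```python
class HierarchicalSource:
    """Stand-in for a hierarchical data source (hypothetical path API)."""
    def lookup(self, path):
        return {"users/alice": "1001"}.get(path)

class RelationalSource:
    """Stand-in for a relational data source (hypothetical table/key API)."""
    def query(self, table, key):
        return {("users", "bob"): "1002"}.get((table, key))

class Mediator:
    """Uniform facade over heterogeneous sources: the calling platform
    asks one question and the mediator dispatches to each backend's
    native interface until one answers."""
    def __init__(self, sources):
        self.sources = sources

    def find_user(self, name):
        for s in self.sources:
            hit = (s.lookup(f"users/{name}") if hasattr(s, "lookup")
                   else s.query("users", name))
            if hit:
                return hit
        return None

m = Mediator([HierarchicalSource(), RelationalSource()])
print(m.find_user("alice"), m.find_user("bob"))  # 1001 1002
```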