Abstract: This paper addresses the problem of building a unified
structure to describe a peer-to-peer system. Our approach uses
well-known notations from the P2P area and provides a global
architecture that separates platform-specific characteristics from
logical ones. To enable a peer to roam across platforms, a roaming
layer is added. This layer defines a unique identification for each
peer and ensures the mapping between this identification and those
used in each platform; the mapping task is carried out by a special
wrapper. In addition, an ontology is proposed to give a clear
presentation of the structure of the P2P system without considering
the content and the resources managed by the peers. The ontology is
created according to the Semantic Web paradigm using the OWL
language, so the structure of the system is treated as a web
resource.
Abstract: This paper describes a novel projection algorithm, the Projection Onto Span Algorithm (POSA), for wavelet-based super-resolution and for removing speckle (in the wavelet domain) of unknown variance from Synthetic Aperture Radar (SAR) images. Although POSA is valuable as a new super-resolution algorithm for image enhancement, image metrology and biometric identification, here it is used as a despeckling tool; this is the first time a super-resolution algorithm has been applied to despeckling SAR images. Specifically, the speckled SAR image is decomposed into wavelet subbands, POSA is applied to the high subbands, and a SAR image is reconstructed from the modified detail coefficients. Experimental results demonstrate that the new method compares favorably to several other despeckling methods on test SAR images.
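The decompose-modify-reconstruct pipeline described above can be sketched as follows. This is only an illustrative stand-in: a one-level 2D Haar transform and soft-thresholding of the detail subbands take the place of the wavelet family and the POSA projection used in the paper, and the threshold value is an arbitrary assumption.

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar transform: returns (LL, (LH, HL, HH))."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 2.0   # approximation subband
    lh = (a - b + c - d) / 2.0   # horizontal detail
    hl = (a + b - c - d) / 2.0   # vertical detail
    hh = (a - b - c + d) / 2.0   # diagonal detail
    return ll, (lh, hl, hh)

def ihaar2d(ll, details):
    """Inverse of haar2d: exact reconstruction from the four subbands."""
    lh, hl, hh = details
    h, w = ll.shape
    out = np.empty((2 * h, 2 * w))
    out[0::2, 0::2] = (ll + lh + hl + hh) / 2.0
    out[0::2, 1::2] = (ll - lh + hl - hh) / 2.0
    out[1::2, 0::2] = (ll + lh - hl - hh) / 2.0
    out[1::2, 1::2] = (ll - lh - hl + hh) / 2.0
    return out

def soft(x, t):
    """Soft-thresholding (stand-in for the POSA modification step)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def despeckle(img, t=1.0):
    """Decompose, modify the detail (high) subbands, reconstruct."""
    ll, (lh, hl, hh) = haar2d(img)
    return ihaar2d(ll, (soft(lh, t), soft(hl, t), soft(hh, t)))
```

With t=0 the pipeline reduces to a perfect round trip, which is a convenient sanity check that only the detail-coefficient modification alters the image.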
Abstract: Ontology matching is a task needed in various applications, for example for comparison or merging purposes. In the literature, many algorithms solving the matching problem can be found, but most of them do not consider instances at all. Mappings are determined by calculating the string similarity of labels, by recognizing linguistic word relations (synonyms, subsumptions, etc.) or by analyzing the (graph) structure. Because instances are often modeled within the ontology, and because the set of instances describes the meaning of the concepts better than their meta information, instances should definitely be incorporated into the matching process. In this paper several novel instance-based matching algorithms are presented which enhance the quality of matching results obtained with common concept-based methods. Different kinds of formalisms are used to classify concepts on account of their instances and finally to compare the concepts directly.
Keywords: Instances, Ontology Matching, Semantic Web
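One simple matcher of the instance-based kind described compares the instance sets of two concepts directly, for example by Jaccard overlap. The concept names, instance sets and threshold below are illustrative, not taken from the paper.

```python
def jaccard(a, b):
    """Similarity of two instance sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def match_concepts(onto1, onto2, threshold=0.5):
    """Propose mappings between concepts whose instance sets overlap.

    Each ontology is given as {concept_name: set_of_instances}."""
    return [(c1, c2, jaccard(i1, i2))
            for c1, i1 in onto1.items()
            for c2, i2 in onto2.items()
            if jaccard(i1, i2) >= threshold]

# Hypothetical example: the shared instances link "Car" to "Automobile"
# even though the labels share no string similarity.
o1 = {"Car": {"golf", "fiesta"}, "Plane": {"a380"}}
o2 = {"Automobile": {"golf", "fiesta", "punto"}, "Aircraft": {"a380", "b747"}}
mappings = match_concepts(o1, o2, threshold=0.5)
```

This illustrates the abstract's central point: labels with no lexical similarity can still be matched through their shared instances.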
Abstract: Trihalomethanes (THMs) were among the first
disinfection byproducts to be discovered in chlorinated water. The
substances form during a reaction between chlorine and organic
matter in the water. Trihalomethanes are suspected to have negative
effects on birth outcomes, such as low birth weight, intrauterine
growth retardation in term births, gestational age and preterm
delivery. There is also some evidence that these by-products are
mutagenic and carcinogenic, with the strongest evidence relating to
bladder cancer. However, findings on such effects of THMs are
inconsistent, as different studies have produced different results.
The aim of the present study is to review the research on the
above-mentioned health effects of THMs.
Abstract: This work presents a new phonetic transcription system based on a tree of hierarchical pronunciation rules expressed as context-specific grapheme-phoneme correspondences. The tree is automatically inferred from a phonetic dictionary by incrementally analyzing deeper context levels, eventually representing a minimum set of exhaustive rules that pronounce without errors all the words in the training dictionary and that can be applied to out-of-vocabulary words. The proposed approach improves upon existing rule-tree-based techniques in that it makes use of graphemes, rather than letters, as elementary orthographic units. A new linear algorithm for the segmentation of a word in graphemes is introduced to enable out-of-vocabulary grapheme-based phonetic transcription. Exhaustive rule trees provide a canonical representation of the pronunciation rules of a language that can be used not only to pronounce out-of-vocabulary words, but also to analyze and compare the pronunciation rules inferred from different dictionaries. The proposed approach has been implemented in C and tested on Oxford British English and Basic English. Experimental results show that grapheme-based rule trees represent phonetically sound rules and provide better performance than letter-based rule trees.
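The paper's linear segmentation algorithm is not specified in the abstract; one plausible linear approach of the same flavor is a greedy longest-match scan over a grapheme inventory, sketched below (the inventory and examples are illustrative assumptions, and the paper's implementation is in C).

```python
def segment(word, graphemes):
    """Greedy longest-match segmentation of a word into graphemes.

    Scans left to right, taking the longest inventory entry at each
    position and falling back to a single letter. Runs in linear time
    in the word length for a bounded maximum grapheme length."""
    max_len = max(map(len, graphemes))
    out, i = [], 0
    while i < len(word):
        for n in range(min(max_len, len(word) - i), 0, -1):
            if n == 1 or word[i:i + n] in graphemes:
                out.append(word[i:i + n])
                i += n
                break
    return out
```

For example, with the (hypothetical) inventory {"th", "ough", "gh", "ou"}, "thought" segments as ["th", "ough", "t"]: a grapheme-level unit like "ough" becomes the key for a pronunciation rule, rather than four separate letters.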
Abstract: Snake bite cases in Malaysia most often involve the
species Naja naja and Calloselasma rhodostoma. In keeping with the
need for a rapid snake venom detection kit in a clinical setting,
plate and dot-ELISA tests for the venoms of Naja naja sumatrana,
Calloselasma rhodostoma and the cobra venom fraction V antigen
were developed. Polyclonal antibodies were raised and used to
prepare the reagents for the dot-ELISA test kit, which was tested in
mice, rabbit and virtual human models. The newly developed dot-
ELISA kit was able to detect a minimum venom concentration of
244 ng/ml with cross-reactivity of one antibody type. The dot-ELISA
system was sensitive and specific for all three snake venom types in
all tested animal models. The lowest minimum venom concentration
detectable was in the rabbit model, 244 ng/ml of the cobra venom
fraction V antigen. The highest minimum venom concentration was
in mice, 1953 ng/ml against a multitude of venoms. The developed
dot-ELISA system for the detection of three snake venom types was
successful with a sensitivity of 95.8% and specificity of 97.9%.
Abstract: The objective of the presented work is to implement the Kalman Filter in an application that reduces the influence of environmental changes on a robot expected to navigate over terrain of varying friction properties. The Discrete Kalman Filter is used to estimate the robot position: it projects the estimated current state ahead in time through the time update, and adjusts the projected estimate by an actual measurement at that time via the measurement update, using data from the infrared sensors, ultrasonic sensors and the visual sensor respectively. The navigation test has been performed in a real-world environment and the system has been found to be robust.
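The time-update / measurement-update cycle described is the standard discrete Kalman filter; a minimal scalar sketch follows. The constant-position model and the noise values are illustrative assumptions, not the robot's actual parameters.

```python
def kalman_step(x, p, z, q=1e-3, r=0.1):
    """One discrete Kalman filter cycle for a scalar state.

    Time update: project the state estimate and its covariance ahead
    (identity state-transition model here). Measurement update: blend
    the prediction with the measurement z via the Kalman gain."""
    # Time update (predict)
    x_pred = x           # constant-position model
    p_pred = p + q       # process noise q inflates the uncertainty
    # Measurement update (correct)
    k = p_pred / (p_pred + r)        # Kalman gain, r = measurement noise
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

# Fuse a stream of noisy (hypothetical) range readings
x, p = 0.0, 1.0
for z in [0.9, 1.1, 1.0, 0.95, 1.05]:
    x, p = kalman_step(x, p, z)
```

After a few measurements the estimate converges near the true value and the covariance p shrinks, which is the behavior the paper relies on to smooth the infrared, ultrasonic and visual readings.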
Abstract: The prevalence of non-organic constipation differs
from country to country and the reliability of the estimated rates is
uncertain. Moreover, the clinical relevance of subdividing the
heterogeneous functional constipation disorders into pre-defined
subgroups is largely unknown. Aim: to estimate the prevalence of
constipation in a population-based sample and determine whether
clinical subgroups can be identified. An age and gender stratified
sample population from 5 Italian cities was evaluated using a
previously validated questionnaire. Data mining by cluster analysis
was used to determine constipation subgroups. Results: 1,500
complete interviews were obtained from 2,083 contacted households
(72%). Self-reported constipation correlated poorly with symptom-based
constipation found in 496 subjects (33.1%). Cluster analysis
identified four constipation subgroups which correlated to subgroups
identified according to pre-defined symptom criteria. Significant
differences in socio-demographics and lifestyle were observed
among subgroups.
Abstract: Scheduling of diversified service requests in
distributed computing is a critical design issue. Cloud is a type of
parallel and distributed system consisting of a collection of
interconnected virtual computers. It comprises not only clusters and
grids but also next-generation data centers. The paper proposes an
initial heuristic algorithm that applies a modified ant colony
optimization approach to diversified service allocation and
scheduling in the cloud paradigm. The proposed optimization method
aims to minimize the overall scheduling time needed to service all
the diversified requests across the different resource allocators
available in the cloud computing environment.
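A minimal sketch of the ant-colony idea applied to request-to-host assignment follows. The pheromone rules, parameters and cost matrix are illustrative assumptions, not the paper's modified algorithm.

```python
import random

def aco_schedule(times, n_ants=20, n_iter=50, rho=0.5, q=1.0, seed=1):
    """Minimal ant colony optimization for task-to-host assignment.

    times[t][h] = execution time of task t on host h. Each ant builds a
    full assignment, choosing hosts with probability proportional to
    pheromone * heuristic (1/time). All trails evaporate by factor rho,
    then the best-so-far assignment deposits pheromone. The objective
    minimized is the makespan (maximum host load)."""
    rng = random.Random(seed)
    n_tasks, n_hosts = len(times), len(times[0])
    tau = [[1.0] * n_hosts for _ in range(n_tasks)]   # pheromone trails
    best, best_cost = None, float("inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            assign = []
            for t in range(n_tasks):
                w = [tau[t][h] / times[t][h] for h in range(n_hosts)]
                assign.append(rng.choices(range(n_hosts), weights=w)[0])
            load = [0.0] * n_hosts
            for t, h in enumerate(assign):
                load[h] += times[t][h]
            cost = max(load)                  # makespan
            if cost < best_cost:
                best, best_cost = assign, cost
        # Evaporate, then reinforce the best-so-far assignment
        tau = [[(1 - rho) * v for v in row] for row in tau]
        for t, h in enumerate(best):
            tau[t][h] += q / best_cost
    return best, best_cost

# Hypothetical 3-task, 2-host instance
times = [[3.0, 1.0], [1.0, 3.0], [2.0, 2.0]]
best, cost = aco_schedule(times)
```

On this toy instance the optimal makespan is 3.0 (task 0 on host 1, task 1 on host 0), which the colony finds quickly because the 1/time heuristic already biases ants toward it.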
Abstract: The Least Significant Bit (LSB) technique is the earliest
developed watermarking technique and also the simplest, most direct
and most common one. It essentially involves embedding the
watermark by replacing the least significant bit of the image data with
a bit of the watermark data. The disadvantage of LSB is that it is not
robust against attacks. In this study intermediate significant bit (ISB)
has been used in order to improve the robustness of the watermarking
system. The aim of this model is to replace the watermarked image
pixels with new pixels that protect the watermark data against
attacks while keeping the new pixels very close to the original
pixels in order to preserve the quality of the watermarked image.
The technique is based on testing the value of the watermark pixel
according to the range of each bit-plane.
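The bit-plane substitution at the heart of both variants can be sketched as follows: plane 0 is the classic LSB embedding the abstract describes, while choosing a middle plane gives the ISB idea of trading a little image quality for robustness. (The pixel-adjustment logic based on bit-plane ranges described in the paper is not reproduced here.)

```python
def embed_bit(pixel, bit, plane=0):
    """Replace the given bit-plane of an 8-bit pixel with a watermark bit.

    plane=0 is classic LSB embedding; a middle plane (e.g. plane=3)
    is the intermediate-significant-bit (ISB) variant."""
    return (pixel & ~(1 << plane)) | (bit << plane)

def extract_bit(pixel, plane=0):
    """Recover the watermark bit from the chosen bit-plane."""
    return (pixel >> plane) & 1
```

For example, embedding a 0 into bit-plane 3 of pixel value 200 changes it to 192, a distortion of 8 gray levels, whereas the same operation on plane 0 changes the pixel by at most 1; this is exactly the quality/robustness trade-off the abstract discusses.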
Abstract: An experimental campaign of measurements for a
Darrieus vertical-axis wind turbine (VAWT) is presented for open
field conditions. The turbine is characterized by a twisted-blade
design, each blade being placed at a fixed distance from the rotational
shaft. The experimental setup to perform the acquisitions is described.
The results are lower than expected, due to the high influence of the
wind shear.
Abstract: This paper suggests a rethinking of the existing
research about Genetically Modified (GM) food. Since the first batch
of GM food was commercialised in the UK market, GM food rapidly
received and lost media attention in the UK. Disagreement on GM
food policy between the US and the EU has also drawn scholarly
attention to this issue. Much research has been carried out to
understand people's views about GM food and how these views are
shaped. This paper is based on data collected in twenty-nine
semi-structured interviews, which were examined through Erving
Goffman's idea of self-presentation in interactions to suggest that
the existing studies investigating "consumer attitudes" towards GM
food have only considered the "front stage" in the dramaturgic
metaphor. This paper suggests that the ways in which people choose
to present themselves when participating in these studies should be
taken into account during the data analysis.
Abstract: Decrease in hardware costs and advances in computer
networking technologies have led to increased interest in the use of
large-scale parallel and distributed computing systems. One of the
biggest issues in such systems is the development of effective
techniques/algorithms for the distribution of the processes/load of a
parallel program on multiple hosts to achieve goal(s) such as
minimizing execution time, minimizing communication delays,
maximizing resource utilization and maximizing throughput.
Substantive research using queuing analysis, and assuming job
arrivals following a Poisson pattern, has shown that in a multi-host
system the probability of one host being idle while another host
has multiple jobs queued up can be very high. Such imbalances in
system load suggest that performance can be improved either by
transferring jobs from the currently heavily loaded hosts to the
lightly loaded ones or by distributing load evenly/fairly among the
hosts. Algorithms known as load balancing algorithms help to achieve
the above-said goal(s). These algorithms fall into two basic
categories: static and dynamic. Whereas static load balancing
algorithms (SLB) take decisions regarding the assignment of tasks to
processors based on average estimated values of process execution
times and communication delays at compile time, dynamic load
balancing algorithms (DLB) are adaptive to changing situations and
take decisions at run time.
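The SLB/DLB distinction can be illustrated by two tiny dispatchers: one fixes the assignment in advance and ignores runtime state, the other consults current host loads for every job. Both are illustrative sketches, not algorithms from the paper.

```python
def static_assign(jobs, n_hosts):
    """SLB sketch: assignment fixed in advance (round-robin),
    independent of what actually happens at run time."""
    return {job: i % n_hosts for i, job in enumerate(jobs)}

def dynamic_assign(jobs, loads):
    """DLB sketch: each arriving job goes to the currently
    least-loaded host; loads[h] is the running load of host h."""
    placement = {}
    for job, cost in jobs:
        h = min(range(len(loads)), key=loads.__getitem__)
        placement[job] = h
        loads[h] += cost
    return placement
```

With one expensive job followed by cheap ones, the static scheme can pile work onto an already-busy host, while the dynamic scheme routes subsequent jobs away from it, which is precisely the imbalance argument made above.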
The objective of this paper is to identify qualitative
parameters for the comparison of the above-said algorithms. In
future, this work can be extended to develop an experimental
environment to study these load balancing algorithms quantitatively
based on comparative parameters.
Abstract: The element of justice or al-‘adl in the context of
Islamic critical thinking deals with the notion of justice in a thinking
process which critically rationalizes the truth in a fair and objective
manner with no irrelevant interference that can jeopardize a sound
judgment. This Islamic axiological element is vital in technological
decision making as it addresses the issues of religious values and
ethics that are primarily set to fulfill the purpose of human life on
earth. The main objective of this study was to examine and analyze
the perception of Muslim engineering students in Malaysian higher
education institutions towards the concept of al-‘adl as an essential
element of Islamic critical thinking. The study employed mixed
methods approach that comprises data collection from the
questionnaire survey and the interview responses. A total of 557
Muslim engineering undergraduates from six Malaysian universities
participated in the study. The study generally indicated that Muslim
engineering undergraduates in these higher institutions have a
rather good comprehension of and consciousness about al-‘adl, with
slight awareness of the importance of objective thinking.
Nonetheless, a few items on the concept implied a comparatively low
perception of rational justice in Islam as the means to grasp the
ultimate truth.
Abstract: This paper attempts to explore the phenomenon of metaphorization in English newspaper headlines from the perspective of pragmatic investigation. With relevance theory as the guideline, this paper explains the processing of metaphor with a pragmatic approach and points out that metaphor is the stimulus adopted by journalists to achieve optimal relevance in this ostensive communication, as well as the strategy to fulfill their writing purpose.
Abstract: Support Vector Domain Description (SVDD) is one of the best-known one-class support vector learning methods, in which one uses balls defined in the feature space to distinguish a set of normal data from all other possible abnormal objects. As with all kernel-based learning algorithms, its performance depends heavily on the proper choice of the kernel parameter. This paper proposes a new approach to selecting the kernel parameter based on maximizing the distance between the gravity centers of the normal and abnormal classes while minimizing the variance within each class. The performance of the proposed algorithm is evaluated on several benchmarks. The experimental results demonstrate the feasibility and effectiveness of the presented method.
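A criterion of the kind described (between-center distance minus within-class variances) can be computed entirely through the kernel trick, since for centers m+ and m- one has ||m+ - m-||² = mean K(A,A) - 2 mean K(A,B) + mean K(B,B). The sketch below assumes an RBF kernel (for which k(x,x)=1); the exact objective, data and parameter grid are illustrative, not the paper's.

```python
import numpy as np

def rbf(X, Y, sigma):
    """Gaussian RBF kernel matrix between row-vector sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def criterion(A, B, sigma):
    """Distance between feature-space gravity centers minus the
    within-class variances, all computed via the kernel trick."""
    kaa, kbb, kab = rbf(A, A, sigma), rbf(B, B, sigma), rbf(A, B, sigma)
    dist2 = kaa.mean() - 2 * kab.mean() + kbb.mean()
    var_a = 1.0 - kaa.mean()   # mean k(x,x) - mean K(A,A); k(x,x)=1 for RBF
    var_b = 1.0 - kbb.mean()
    return dist2 - (var_a + var_b)

def select_sigma(A, B, grid):
    """Pick the kernel width that maximizes the criterion on a grid."""
    return max(grid, key=lambda s: criterion(A, B, s))

# Hypothetical 1-D data: two well-separated tight clusters
A = np.array([[0.0], [0.1]])
B = np.array([[5.0], [5.1]])
sigma = select_sigma(A, B, [0.1, 1.0, 10.0])
```

Very small widths shrink the within-class similarity (inflating variance) and very large widths blur the two centers together, so an intermediate width wins, matching the trade-off the abstract describes.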
Abstract: Soil stabilization has been widely used to improve
soil strength and durability or to prevent erosion and dust generation.
Generally, to reduce the problems of clayey soils in engineering
work and to stabilize these soils, additional materials are used.
The most common materials are lime, fly ash and cement. Although
these materials improve soil properties, their use is in some cases
limited by cost and by the need for special equipment. One of the
best methods for stabilizing clayey soils is neutralizing the clay
particles, for which ion exchange materials can be used. An ion
exchange solution such as CBR plus can be used for soil
stabilization. One of the most important issues in using CBR plus is
determining the amount of this solution for various soils with
different properties. In this study a laboratory experiment was
conducted to evaluate the ion exchange capacity of three soils with
various plasticity indices (PI) in order to determine the amount of
CBR plus solution required for soil stabilization.
Abstract: Wind farms (WFs) with high level of penetration are
being established in power systems worldwide more rapidly than
other renewable resources. The Independent System Operator (ISO),
as a policy maker, should propose appropriate places for WF
installation in order to maximize the benefits for the investors. There
is also a possibility of congestion relief using the new installation of
WFs which should be taken into account by the ISO when proposing
the locations for WF installation. In this context, an efficient
wind farm (WF) placement method is proposed in order to reduce the
burden on congested lines. Since wind speed is a random variable and
load forecasts also contain uncertainties, probabilistic approaches
are used for this type of study. An AC probabilistic optimal power
flow (P-OPF) is formulated and solved using Monte Carlo Simulation
(MCS). In order to reduce computation time, point estimate methods
(PEM) are introduced as an efficient alternative to the
time-consuming MCS. Subsequently, optimal WF placement is determined
using generation shift distribution factors (GSDF), considering a
new parameter, the wind availability factor (WAF). In order to
obtain more realistic results, N-1 contingency analysis is employed
to find the optimal WF size by means of line outage distribution
factors (LODF). The IEEE 30-bus test system is used to demonstrate
and compare the accuracy of the proposed methodology.
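The Monte Carlo side of such a study treats wind speed as a random variable; a minimal sketch samples a Weibull-distributed wind speed through a simplified turbine power curve to estimate the expected WF output. All distribution and turbine parameters here are illustrative assumptions, and the full P-OPF formulation is far beyond this fragment.

```python
import random

def turbine_power(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2.0):
    """Simplified turbine power curve (MW): zero below cut-in and above
    cut-out, cubic ramp between cut-in and rated, flat at rated."""
    if v < v_in or v > v_out:
        return 0.0
    if v >= v_rated:
        return p_rated
    return p_rated * ((v - v_in) / (v_rated - v_in)) ** 3

def mcs_power(n=10000, k=2.0, lam=8.0, seed=7):
    """Monte Carlo estimate of expected output under Weibull wind
    (shape k, scale lam), illustrating the sampling loop of MCS."""
    rng = random.Random(seed)
    samples = [turbine_power(rng.weibullvariate(lam, k)) for _ in range(n)]
    return sum(samples) / n
```

In the actual P-OPF each such sample would drive a full AC power-flow solution, which is exactly why the abstract introduces point estimate methods as a cheaper alternative to repeating this loop thousands of times.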
Abstract: Australian government agencies have a natural desire
to provide migrants with a wide range of opportunities. Consequently,
government online services should be equally available to migrants
with a non-English speaking background (NESB). Despite the
commendable efforts of governments and local agencies in Australia
to provide such services, in reality, many NESB communities are not
taking advantage of these services. This article, based on an
extensive case study regarding the use of online government services
by the Arabic NESB community in Australia, reports on the
possible reasons for this issue, as well as suggestions for
improvement. The conclusion is that Australia should implement
ICT-based or e-government policies, programmes, and services that
more accurately reflect migrant cultures and languages so that
migrant integration can be more fully accomplished. Specifically,
this article presents an NESB Model that adopts the value of
user-centricity, a more individual-focused approach to government
online services in Australia.
Abstract: In this paper, the problem of stability analysis for a class of impulsive stochastic fuzzy neural networks with time-varying delays and reaction-diffusion is considered. By utilizing a suitable Lyapunov-Krasovskii functional, the inequality technique and the stochastic analysis technique, some sufficient conditions ensuring global exponential stability of the equilibrium point for impulsive stochastic fuzzy cellular neural networks with time-varying delays and diffusion are obtained. In particular, an estimate of the exponential convergence rate is also provided, which depends on the system parameters, the diffusion effect and the impulsive disturbance intensity. It is believed that these results are significant and useful for the design and applications of fuzzy neural networks. An example is given to show the effectiveness of the obtained results.
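The global exponential stability claimed here can be stated in a generic form (a standard definition; the paper's exact norms and constants may differ):

```latex
\| u(t;\varphi) - u^{*} \| \;\le\; M\,\|\varphi - u^{*}\|\, e^{-\lambda t},
\qquad t \ge 0,\quad M \ge 1,\quad \lambda > 0,
```

where \(\varphi\) is the initial function, \(u^{*}\) the equilibrium point, and \(\lambda\) the exponential convergence rate, whose estimate the paper expresses in terms of the system parameters, the diffusion effect and the impulsive disturbances.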