Abstract: Ontology matching is a task needed in various applications, for example for comparison or merging purposes. In the literature, many algorithms solving the matching problem can be found, but most of them do not consider instances at all. Mappings are determined by calculating the string similarity of labels, by recognizing linguistic word relations (synonyms, subsumptions etc.) or by analyzing the (graph) structure. Because instances are often modeled within the ontology, and because the set of instances describes the meaning of the concepts better than their meta-information, instances should definitely be incorporated into the matching process. In this paper several novel instance-based matching algorithms are presented which enhance the quality of matching results obtained with common concept-based methods. Different kinds of formalisms are used to classify concepts on account of their instances and finally to compare the concepts directly.
Keywords: Instances, Ontology Matching, Semantic Web
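The instance-based idea described above can be made concrete with one common similarity measure (a hypothetical illustration, since the abstract does not spell out its formalisms): the Jaccard coefficient over the instance sets of two concepts.

```python
# Sketch of instance-based concept comparison via the Jaccard coefficient.
# The concepts and instances below are illustrative assumptions, not taken
# from the paper.

def jaccard(instances_a, instances_b):
    """Similarity of two concepts based on their shared instances."""
    a, b = set(instances_a), set(instances_b)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical concepts from two ontologies with partially shared instances.
car_o1 = {"vw_golf", "ford_focus", "bmw_3"}
auto_o2 = {"vw_golf", "bmw_3", "audi_a4"}
print(jaccard(car_o1, auto_o2))  # 0.5
```

A high overlap of instance sets suggests the two concepts should be mapped to each other even when their labels differ.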
Abstract: Trihalomethanes (THMs) were among the first disinfection byproducts to be discovered in chlorinated water. These substances form during a reaction between chlorine and organic matter in the water. Trihalomethanes are suspected to have negative effects on birth outcomes such as low birth weight, intrauterine growth retardation in term births, as well as reduced gestational age and preterm delivery. There is also some evidence showing these byproducts to be mutagenic and carcinogenic, with the greatest amount of evidence related to bladder cancer. However, inconsistencies exist regarding such effects of THMs, as different studies have provided different results in this regard. The aim of the present study is to provide a review of the research on the above-mentioned health effects of THMs.
Abstract: This work presents a new phonetic transcription system based on a tree of hierarchical pronunciation rules expressed as context-specific grapheme-phoneme correspondences. The tree is automatically inferred from a phonetic dictionary by incrementally analyzing deeper context levels, eventually representing a minimum set of exhaustive rules that pronounce without errors all the words in the training dictionary and that can be applied to out-of-vocabulary words. The proposed approach improves upon existing rule-tree-based techniques in that it makes use of graphemes, rather than letters, as elementary orthographic units. A new linear algorithm for the segmentation of a word in graphemes is introduced to enable out-of-vocabulary grapheme-based phonetic transcription. Exhaustive rule trees provide a canonical representation of the pronunciation rules of a language that can be used not only to pronounce out-of-vocabulary words, but also to analyze and compare the pronunciation rules inferred from different dictionaries. The proposed approach has been implemented in C and tested on Oxford British English and Basic English. Experimental results show that grapheme-based rule trees represent phonetically sound rules and provide better performance than letter-based rule trees.
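The grapheme segmentation step mentioned above can be sketched in linear time with a greedy longest-match pass over a known grapheme inventory. The toy inventory below is an assumption for illustration; the paper's inventory and exact algorithm may differ.

```python
# Minimal sketch of linear-time grapheme segmentation by greedy longest
# match. Only two-letter graphemes are modeled; this is a simplification.

GRAPHEMES = {"sh", "ch", "th", "ph", "ea", "oo"}  # hypothetical inventory

def segment(word):
    """Split a word into graphemes, preferring longer matches."""
    out, i = [], 0
    while i < len(word):
        pair = word[i:i + 2]
        if pair in GRAPHEMES:
            out.append(pair)   # multi-letter grapheme consumed as one unit
            i += 2
        else:
            out.append(word[i])  # fall back to a single letter
            i += 1
    return out

print(segment("sheath"))  # ['sh', 'ea', 'th']
```

Each position is visited once, so the scan is linear in the word length, which is what makes the approach practical for out-of-vocabulary transcription.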
Abstract: Snake bite cases in Malaysia most often involve the species Naja naja and Calloselasma rhodostoma. In keeping with the need for a rapid snake venom detection kit in a clinical setting, plate and dot-ELISA tests for the venoms of Naja naja sumatrana, Calloselasma rhodostoma and the cobra venom fraction V antigen were developed. Polyclonal antibodies were raised and further used to prepare the reagents for the dot-ELISA test kit, which was tested in mouse, rabbit and virtual human models. The newly developed dot-ELISA kit was able to detect a minimum venom concentration of 244 ng/ml, with cross-reactivity of one antibody type. The dot-ELISA system was sensitive and specific for all three snake venom types in all tested animal models. The lowest minimum detectable venom concentration was in the rabbit model: 244 ng/ml of the cobra venom fraction V antigen. The highest minimum venom concentration was in mice: 1953 ng/ml against a multitude of venoms. The developed dot-ELISA system for the detection of three snake venom types was successful, with a sensitivity of 95.8% and a specificity of 97.9%.
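The sensitivity and specificity figures quoted above follow the standard confusion-matrix definitions. The counts below are hypothetical, chosen only so the arithmetic reproduces the quoted percentages.

```python
# Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
# Counts are illustrative, not the study's actual data.

def sensitivity(tp, fn):
    """Fraction of true positives correctly detected."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of true negatives correctly rejected."""
    return tn / (tn + fp)

print(round(sensitivity(46, 2) * 100, 1))   # 95.8
print(round(specificity(47, 1) * 100, 1))   # 97.9
```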
Abstract: This research deals with a flexible flowshop scheduling problem in which jobs arrive and are delivered in groups but are processed individually. Due to the special characteristics of each job, only a subset of machines in each stage is eligible to process that job. The objective function minimizes, on the one hand, the sum of the completion times of the groups and, on the other hand, the sum of the differences between the completion time of each job and the delivery time of the group containing it (the waiting period). The problem can be stated as FFc / rj, Mj / irreg and has many applications in production and service industries. A mathematical model is proposed, the problem is proved to be NP-complete, and an effective heuristic method is presented to schedule the jobs efficiently. This algorithm can then be used within the body of any metaheuristic algorithm for solving the problem.
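The objective described above can be sketched as follows, under the assumption (not fixed by the abstract) that a group is delivered when its last job completes, so each job's waiting period is the gap between its own completion and its group's delivery.

```python
# Sketch of the two-part objective: sum of group completion times plus the
# waiting periods of the jobs. Completion times below are illustrative.

def objective(completion_times):
    """completion_times: {group name: list of job completion times}."""
    total = 0.0
    for jobs in completion_times.values():
        delivery = max(jobs)                       # group delivered at its last job
        total += delivery                          # completion time of the group
        total += sum(delivery - c for c in jobs)   # waiting periods of its jobs
    return total

# Two hypothetical groups: G1's first job waits 2, G2's first job waits 3.
print(objective({"G1": [3, 5], "G2": [4, 7, 7]}))  # 12 + 2 + 3 = 17.0
```

A heuristic for this problem would search over schedules so as to minimize this quantity.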
Abstract: The objective of the presented work is to implement the Kalman filter in an application that reduces the influence of environmental changes on a robot expected to navigate over terrain of varying friction properties. The discrete Kalman filter is used to estimate the robot's position: it projects the estimated current state ahead in time through the time update, and then adjusts the projected estimate with an actual measurement at that time via the measurement update, using data coming from the infrared sensors, the ultrasonic sensors and the visual sensor respectively. The navigation test has been performed in a real-world environment and the system has been found to be robust.
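The two phases mentioned above, the time update and the measurement update, can be sketched with a minimal one-dimensional discrete Kalman filter. The noise parameters and measurements below are illustrative assumptions, not the paper's values.

```python
# 1-D discrete Kalman filter sketch: predict (time update), then correct
# (measurement update). q is process noise, r is measurement noise.

def kalman_1d(measurements, q=1e-3, r=0.1, x0=0.0, p0=1.0):
    """Estimate a scalar position from a sequence of noisy measurements."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Time update: project the state ahead (constant-position model);
        # uncertainty grows by the process noise q.
        p = p + q
        # Measurement update: blend prediction and measurement via gain k.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

est = kalman_1d([1.2, 0.9, 1.1, 1.0])
print(est[-1])  # the estimate settles near the true value of about 1.0
```

In the robot, the same predict/correct cycle runs with the infrared, ultrasonic and visual readings as the measurements.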
Abstract: The prevalence of non-organic constipation differs from country to country, and the reliability of the estimated rates is uncertain. Moreover, the clinical relevance of subdividing the heterogeneous functional constipation disorders into pre-defined subgroups is largely unknown. Aim: to estimate the prevalence of constipation in a population-based sample and to determine whether clinical subgroups can be identified. An age- and gender-stratified sample population from 5 Italian cities was evaluated using a previously validated questionnaire. Data mining by cluster analysis was used to determine constipation subgroups. Results: 1,500 complete interviews were obtained from 2,083 contacted households (72%). Self-reported constipation correlated poorly with symptom-based constipation, which was found in 496 subjects (33.1%). Cluster analysis identified four constipation subgroups, which correlated with subgroups identified according to pre-defined symptom criteria. Significant differences in socio-demographics and lifestyle were observed among the subgroups.
Abstract: This article explores the sociological perspectives on social problems and the role of the media, which has a delicate line to tread in balancing its duty to the public with its duty to the victim. Whilst social problems have objective conditions, it is the subjective definition of such problems that determines which social problem comes to the fore and which doesn't. Further, it explores the roles and functions of policymakers when addressing social problems and the impact of the inception of media profiling, as well as the advantages and disadvantages of media profiling with respect to social problems. Due to its length, this article focuses on the inception of media profiling; a follow-up article will explore how media profiling of social problems has evolved since its inception.
Abstract: This paper presents a modified efficient inductive powering link based on an ASK modulator and a proposed efficient class-E power amplifier. The design presents the external part, located outside the body, which transfers power and data to implanted devices such as implanted microsystems for stimulating and monitoring nerves and muscles. The system operates at a low-band frequency of 10 MHz, within the industrial-scientific-medical (ISM) band, to avoid tissue heating. For the external part, the modulation index is 11.1% and the modulation rate is 7.2%, with a data rate of 1 Mbit/s assuming Tbit = 1 us. The system has been designed using 0.35-μm CMOS technology. The mathematical model is given, the design is simulated using the OrCAD PSpice 16.2 software tool and, for real-time simulation, the electronic workbench Multisim 11 has been used.
Abstract: TUSAT is a prospective Turkish communication satellite designed mainly to provide data communication and broadcasting services through Ku-band and C-band channels. Thermal control is a vital issue in the satellite design process. Therefore, all satellite subsystems and equipment should be maintained in the desired temperature range from launch to the end of maneuvering life. The main function of thermal control is to keep the equipment and the satellite structures in a given temperature range for the various phases and operating modes of the spacecraft during its lifetime. This paper describes a thermal control design which uses both passive and active thermal control concepts. The active thermal control is based on heaters regulated by software via thermistors. The passive thermal control consists of heat pipes, multilayer insulation (MLI) blankets, radiators, paints and surface finishes maintaining the temperature level of the overall carrier components within an acceptable range. The thermal control design is supported by thermal analysis using thermal mathematical models (TMM).
Abstract: This paper suggests a rethinking of the existing research about genetically modified (GM) food. Since the first batch of GM food was commercialised in the UK market, GM food has rapidly received and then lost media attention in the UK. Disagreement on GM food policy between the US and the EU has also drawn scholarly attention to this issue. Much research has been carried out intending to understand people's views about GM food and the shaping of these views. This paper is based on data collected in twenty-nine semi-structured interviews, which were examined through Erving Goffman's idea of self-presentation in interactions to suggest that the existing studies investigating “consumer attitudes” towards GM food have only considered the “front stage” in the dramaturgic metaphor. This paper suggests that the ways in which people choose to present themselves when participating in these studies should be taken into account during the data analysis.
Abstract: In the present work, we propose a new technique to enhance the learning capabilities and reduce the computational intensity of a competitive-learning multi-layered neural network using the K-means clustering algorithm. The proposed model uses a multi-layered network architecture with a back-propagation learning mechanism. The K-means algorithm is first applied to the training dataset to reduce the number of samples to be presented to the neural network, by automatically selecting an optimal set of samples. The obtained results demonstrate that the proposed technique performs exceptionally well in terms of both accuracy and computation time when applied to the KDD99 dataset, compared to a standard learning scheme that uses the full dataset.
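The sample-reduction step described above can be sketched as follows: cluster the training set with K-means and present the cluster centroids to the network instead of all samples. The tiny K-means implementation and the synthetic data below are illustrative assumptions; the paper works with the KDD99 dataset.

```python
import numpy as np

# Sketch of K-means-based training-set reduction. The two-blob dataset is
# synthetic; in the paper's setting X would be the KDD99 training samples.

def kmeans_reduce(X, k, iters=20, seed=0):
    """Return k centroids summarizing the dataset X (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each sample to its nearest centroid.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each centroid to the mean of its assigned samples.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

# Two well-separated synthetic clusters of 50 samples each.
X = np.vstack([np.random.default_rng(1).normal(m, 0.1, size=(50, 2))
               for m in (0.0, 5.0)])
reduced = kmeans_reduce(X, k=2)
print(reduced.shape)  # (2, 2): 100 samples summarized by 2 centroids
```

Training then proceeds on the reduced set, which is what cuts the computation time reported above.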
Abstract: Decreases in hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. One of the biggest issues in such systems is the development of effective techniques/algorithms for distributing the processes/load of a parallel program over multiple hosts to achieve goals such as minimizing execution time, minimizing communication delays, maximizing resource utilization and maximizing throughput. Substantial research using queuing analysis, assuming job arrivals follow a Poisson pattern, has shown that in a multi-host system the probability of one host being idle while another host has multiple jobs queued up can be very high. Such imbalances in system load suggest that performance can be improved either by transferring jobs from the currently heavily loaded hosts to the lightly loaded ones or by distributing load evenly/fairly among the hosts. The algorithms known as load balancing algorithms help to achieve these goals. These algorithms fall into two basic categories: static and dynamic. Whereas static load balancing (SLB) algorithms take decisions regarding the assignment of tasks to processors at compile time, based on average estimated values of process execution times and communication delays, dynamic load balancing (DLB) algorithms are adaptive to changing situations and take decisions at run time.
The objective of this paper is to identify qualitative parameters for the comparison of the above algorithms. In future, this work can be extended to develop an experimental environment to study these load balancing algorithms quantitatively, based on comparative parameters.
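The static/dynamic distinction above can be sketched with two toy policies: a static round-robin assignment fixed in advance versus a dynamic policy that picks the least-loaded host as each job arrives. The job sizes are illustrative.

```python
# Toy comparison of static vs. dynamic load balancing on the makespan
# (maximum host load). Job sizes below are chosen for illustration only.

def static_rr(jobs, n_hosts):
    """Static policy: host chosen in advance by round robin."""
    load = [0.0] * n_hosts
    for i, j in enumerate(jobs):
        load[i % n_hosts] += j
    return max(load)

def dynamic_least_loaded(jobs, n_hosts):
    """Dynamic policy: host chosen at run time by current load."""
    load = [0.0] * n_hosts
    for j in jobs:
        load[load.index(min(load))] += j
    return max(load)

jobs = [8, 1, 8, 1, 8, 1]
print(static_rr(jobs, 2), dynamic_least_loaded(jobs, 2))  # 24.0 17.0
```

Round robin piles all the large jobs onto one host, exactly the imbalance the queuing analyses describe, while the run-time policy spreads them out.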
Abstract: This paper simulates ad-hoc mesh networks in rural areas, where such networks receive great attention due to their low cost, since installing the infrastructure for regular networks in these areas is not feasible owing to the high cost. The distance between the communicating nodes is the main obstacle that an ad-hoc mesh network faces. For example, in Terranet technology, two nodes can communicate directly only if they are within one kilometer of each other; if the distance between them is greater, each node in the ad-hoc mesh network has to act as a router that forwards the data it receives to other nodes. In this paper, we try to find the critical number of nodes which makes the network fully connected in a particular area, and then propose a method to encourage intermediate nodes to accept acting as routers that forward data from the sender to the receiver. Much work has been done on technological changes in peer-to-peer networks, but the focus of this paper is on another aspect: finding the minimum number of nodes needed for a particular area to be fully connected, and then encouraging users to switch on their phones and accept working as routers for other nodes. Our method raises the rate of successful calls to 81.5% of attempted calls.
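The "critical number of nodes" experiment described above can be sketched as a random geometric graph simulation: drop n nodes uniformly in an L x L km area, connect any pair within radius r km (1 km, as in the Terranet example), and measure how often the graph is fully connected. The area size and trial count are illustrative assumptions.

```python
import random

# Sketch of estimating the connectivity rate of a random geometric graph.

def connected(points, r):
    """BFS over the graph linking points within range r of each other."""
    if not points:
        return False
    seen, stack = {0}, [0]
    while stack:
        i = stack.pop()
        for j in range(len(points)):
            if j not in seen:
                dx = points[i][0] - points[j][0]
                dy = points[i][1] - points[j][1]
                if dx * dx + dy * dy <= r * r:
                    seen.add(j)
                    stack.append(j)
    return len(seen) == len(points)

def connectivity_rate(n, L=3.0, r=1.0, trials=200, seed=42):
    """Fraction of random placements of n nodes that are fully connected."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        pts = [(rng.uniform(0, L), rng.uniform(0, L)) for _ in range(n)]
        ok += connected(pts, r)
    return ok / trials

print(connectivity_rate(5), connectivity_rate(40))  # sparse vs. dense
```

Sweeping n until the rate approaches 1 gives an estimate of the critical number of nodes for the chosen area.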
Abstract: The element of justice, or al-‘adl, in the context of Islamic critical thinking deals with the notion of justice in a thinking process which critically rationalizes the truth in a fair and objective manner, with no irrelevant interference that can jeopardize a sound judgment. This Islamic axiological element is vital in technological decision making, as it addresses the issues of religious values and ethics that are primarily set to fulfill the purpose of human life on earth. The main objective of this study was to examine and analyze the perception of Muslim engineering students in Malaysian higher education institutions towards the concept of al-‘adl as an essential element of Islamic critical thinking. The study employed a mixed-methods approach comprising data collection from a questionnaire survey and interview responses. A total of 557 Muslim engineering undergraduates from six Malaysian universities participated in the study. The study generally indicated that Muslim engineering undergraduates in the higher institutions have a rather good comprehension and consciousness of al-‘adl, with a slight awareness of the importance of objective thinking. Nonetheless, a few items on the concept implied a comparatively low perception of rational justice in Islam as the means to grasp the ultimate truth.
Abstract: This paper attempts to explore the phenomenon of metaphorization in English newspaper headlines from the perspective of pragmatic investigation. With relevance theory as the guideline, this paper explains the processing of metaphor with a pragmatic approach and points out that metaphor is the stimulus adopted by journalists to achieve optimal relevance in this ostensive communication, as well as a strategy to fulfill their writing purpose.
Abstract: In very narrow pathways, the speed of sound propagation and the phase of sound waves change due to the air viscosity. We have developed a new finite element method (FEM) that includes the effects of air viscosity for modeling a narrow sound pathway. This method is developed as an extension of the existing FEM for porous sound-absorbing materials. The numerical calculation results for several three-dimensional slit models using the proposed FEM are validated against existing calculation methods.
Abstract: Soil stabilization has been widely used to improve soil strength and durability or to prevent erosion and dust generation. Generally, to reduce the problems of clayey soils in engineering work and to stabilize these soils, additional materials are used. The most common materials are lime, fly ash and cement. Although these materials improve soil properties, in some cases their use is limited by cost and by the need for special equipment. One of the best methods for stabilizing clayey soils is neutralizing the clay particles; for this purpose, ion exchange materials can be used. An ion exchange solution such as CBR plus can be used for soil stabilization. One of the most important issues in using CBR plus is determining the amount of the solution required for various soils with different properties. In this study, a laboratory experiment was conducted to evaluate the ion exchange capacity of three soils with various plasticity indices (PI) in order to determine the amount of CBR plus solution required for soil stabilization.
Abstract: The knowledge base of welding defect recognition is essentially incomplete. This characteristic means that the recognition results do not reflect the actual situation, and it further influences the classification of welding quality. This paper is concerned with a rough set based method to reduce this influence and improve the classification accuracy. First, a rough set model of intelligent welding quality classification is built, and both condition and decision attributes are specified. Then, groups of representative multiple compound defects are chosen from the defect library and classified correctly to form the decision table. Finally, the redundant information of the decision table is reduced and the optimal decision rules are reached. By this method, we are able to reclassify the misclassified defects to the right quality level. Compared with ordinary methods, this method has higher accuracy and better robustness.
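The rough set machinery behind the method above can be sketched on a tiny decision table: group objects by their condition-attribute values (indiscernibility classes), then compute the lower and upper approximations of a decision class. The table below is a hypothetical stand-in for the welding-defect decision table.

```python
from collections import defaultdict

# Each row: (condition attributes, decision). Rows 4 and 5 are deliberately
# inconsistent, the kind of incompleteness the abstract describes.
table = [
    (("porosity", "small"), "pass"),
    (("porosity", "small"), "pass"),
    (("crack", "small"), "fail"),
    (("crack", "large"), "fail"),
    (("porosity", "large"), "pass"),
    (("porosity", "large"), "fail"),
]

def approximations(table, decision):
    """Lower/upper approximation of a decision class in a decision table."""
    blocks = defaultdict(list)          # indiscernibility classes
    for i, (cond, _) in enumerate(table):
        blocks[cond].append(i)
    lower, upper = set(), set()
    for ids in blocks.values():
        decisions = {table[i][1] for i in ids}
        if decisions == {decision}:
            lower.update(ids)           # certainly in the class
        if decision in decisions:
            upper.update(ids)           # possibly in the class
    return lower, upper

low, up = approximations(table, "pass")
print(sorted(low), sorted(up))  # [0, 1] [0, 1, 4, 5]
```

The gap between the two approximations (rows 4 and 5 here) marks the boundary region where classification is uncertain; attribute reduction and rule extraction operate on top of these sets.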
Abstract: Finding minimal logical functions has important applications in the design of logical circuits. This task is solved by many different methods, but frequently they are not suitable for computer implementation. We briefly summarise the well-known Quine-McCluskey method, which gives a unique computing procedure and thus can be simply implemented but, even for simple examples, does not guarantee an optimal solution. Since the Petrick extension of the Quine-McCluskey method does not give a generally usable method for finding an optimum for logical functions with a high number of values, we focus on the interpretation of the result of the Quine-McCluskey method and show that it represents a set covering problem which, unfortunately, is an NP-hard combinatorial problem. Therefore it must be solved by heuristic or approximation methods. We propose an approach based on genetic algorithms and show suitable parameter settings.
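The set covering interpretation above can be made concrete: each prime implicant produced by Quine-McCluskey covers a set of minterms, and a minimal function must choose implicants covering every minterm. The paper proposes a genetic algorithm; as a simpler illustration of the same NP-hard covering problem, here is the classic greedy heuristic on a hypothetical prime-implicant chart.

```python
# Greedy set covering over a prime-implicant chart. The chart below is a
# made-up example, not taken from the paper.

def greedy_cover(minterms, implicants):
    """implicants: {name: set of minterms it covers}. Returns chosen names."""
    uncovered, chosen = set(minterms), []
    while uncovered:
        # Pick the implicant covering the most still-uncovered minterms.
        best = max(implicants, key=lambda p: len(implicants[p] & uncovered))
        if not implicants[best] & uncovered:
            break                        # remaining minterms are uncoverable
        chosen.append(best)
        uncovered -= implicants[best]
    return chosen

chart = {"P1": {0, 1}, "P2": {1, 2, 3}, "P3": {3, 4}, "P4": {0, 4}}
print(greedy_cover({0, 1, 2, 3, 4}, chart))  # ['P2', 'P4']
```

The greedy heuristic gives only an approximation guarantee; a genetic algorithm, as proposed in the paper, searches the same solution space with a population of candidate covers instead.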