Abstract: Ontology matching is a task needed in various applications, for example for comparison or merging purposes. In the literature, many algorithms solving the matching problem can be found, but most of them do not consider instances at all. Mappings are determined by calculating the string similarity of labels, by recognizing linguistic word relations (synonyms, subsumptions, etc.) or by analyzing the (graph) structure. Because instances are often modeled within the ontology, and because the set of instances describes the meaning of the concepts better than their meta-information, instances should definitely be incorporated into the matching process. In this paper several novel instance-based matching algorithms are presented which enhance the quality of matching results obtained with common concept-based methods. Different kinds of formalisms are used to classify concepts on account of their instances and finally to compare the concepts directly.
Keywords: Instances, Ontology Matching, Semantic Web
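The instance-based comparison described above can be illustrated with a minimal sketch (the concepts and instance sets below are hypothetical, not taken from the paper): two concepts from different ontologies are compared by the overlap of their instance sets, for example with the Jaccard coefficient.

```python
def jaccard(instances_a, instances_b):
    """Jaccard similarity of two concepts' instance sets."""
    a, b = set(instances_a), set(instances_b)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical concepts from two ontologies, described by their instances.
car = {"vw golf", "ford focus", "fiat 500"}
automobile = {"vw golf", "ford focus", "tesla model 3"}
print(jaccard(car, automobile))  # 0.5: 2 shared out of 4 distinct instances
```

A matcher could then map two concepts whenever this instance overlap exceeds a threshold, complementing label- and structure-based scores.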
Abstract: This research deals with a flexible flowshop
scheduling problem with arrival and delivery of jobs in groups and
processing them individually. Due to the special characteristics of
each job, only a subset of machines in each stage is eligible to
process that job. The objective function deals with minimizing, on one hand, the sum of the completion times of the groups and, on the other hand, the sum of the differences between the completion time of each job and the delivery time of the group containing that job (waiting period). The problem can be stated as FFc / rj, Mj / irreg, which has many applications in production and service industries. A mathematical model is proposed, the problem is proved to be NP-complete, and an effective heuristic method is presented to schedule
the jobs efficiently. This algorithm can then be used within the body
of any metaheuristic algorithm for solving the problem.
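The paper's heuristic is not reproduced here, but the setting it addresses - jobs released in groups, each job eligible only on a subset of machines - can be sketched with a simple greedy rule on hypothetical data: assign each released job to the eligible machine that would finish it earliest.

```python
def greedy_stage(jobs, n_machines):
    """Greedy earliest-completion assignment for one stage.

    jobs: list of (release_time, processing_time, eligible_machines).
    Returns the sum of job completion times; each machine processes
    one job at a time. A sketch, not the paper's heuristic.
    """
    free_at = [0.0] * n_machines           # when each machine becomes idle
    total_completion = 0.0
    for release, proc, eligible in sorted(jobs):   # by release time
        # pick the eligible machine that completes this job earliest
        m = min(eligible, key=lambda i: max(free_at[i], release) + proc)
        finish = max(free_at[m], release) + proc
        free_at[m] = finish
        total_completion += finish
    return total_completion

# Hypothetical group of three jobs on two machines; job 0 may only
# use machine 0, job 2 only machine 1, job 1 either.
jobs = [(0, 3, [0]), (0, 2, [0, 1]), (1, 2, [1])]
print(greedy_stage(jobs, 2))  # 10.0
```

Such a dispatching rule is cheap enough to be embedded, as the abstract suggests, inside the evaluation loop of a metaheuristic.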
Abstract: The prevalence of non-organic constipation differs from country to country, and the reliability of the estimated rates is uncertain. Moreover, the clinical relevance of subdividing the heterogeneous functional constipation disorders into pre-defined subgroups is largely unknown. Aim: to estimate the prevalence of
constipation in a population-based sample and determine whether
clinical subgroups can be identified. An age and gender stratified
sample population from 5 Italian cities was evaluated using a
previously validated questionnaire. Data mining by cluster analysis
was used to determine constipation subgroups. Results: 1,500
complete interviews were obtained from 2,083 contacted households
(72%). Self-reported constipation correlated poorly with symptom-based constipation, which was found in 496 subjects (33.1%). Cluster analysis
identified four constipation subgroups which correlated to subgroups
identified according to pre-defined symptom criteria. Significant
differences in socio-demographics and lifestyle were observed
among subgroups.
Abstract: This paper explores the university course timetabling
problem. There are several characteristics that make scheduling and
timetabling problems particularly difficult to solve: they have huge
search spaces, they are often highly constrained, they require
sophisticated solution representation schemes, and they usually
require very time-consuming fitness evaluation routines. Thus standard evolutionary algorithms lack the efficiency to deal with them. In this paper we propose a memetic algorithm that incorporates problem-specific knowledge such that most of the chromosomes generated are decoded into feasible solutions. Generating a vast number of feasible chromosomes makes the progress of the search process possible in a time-efficient manner. Experimental results exhibit the advantages of the developed Hybrid Genetic Algorithm over the standard Genetic Algorithm.
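The key idea - decoding chromosomes so that almost all of them yield feasible timetables - can be sketched with a toy model (the events, clash constraints, and encoding below are hypothetical, not the paper's): the chromosome is an ordering of events, and a greedy decoder places each event into the first timeslot where it conflicts with nothing already placed.

```python
def decode(order, clashes, n_slots):
    """Decode an event permutation into a feasible timetable.

    order: permutation of event ids 0..n-1 (the chromosome).
    clashes: set of frozensets {e1, e2} that must not share a slot.
    Returns {event: slot}, or None if some event cannot be placed.
    """
    slot_of = {}
    for e in order:
        for s in range(n_slots):
            if all(slot_of.get(o) != s
                   for o in range(len(order))
                   if frozenset((e, o)) in clashes):
                slot_of[e] = s
                break
        else:
            return None  # infeasible under this ordering
    return slot_of

# Hypothetical instance: events 0-2, event 1 clashes with both others.
clashes = {frozenset((0, 1)), frozenset((1, 2))}
tt = decode([0, 1, 2], clashes, n_slots=2)
print(tt)  # {0: 0, 1: 1, 2: 0}
```

Because search operates on permutations and the decoder enforces the constraints, crossover and mutation never produce clashing timetables directly, which is the feasibility property the abstract emphasizes.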
Abstract: This paper presents a modified efficient inductive powering link based on an ASK modulator and a proposed efficient class-E power amplifier. The design presents the external part, located outside the body, which transfers power and data to implanted devices such as implanted microsystems for stimulating and monitoring nerves and muscles. The system operates at a low-band frequency of 10 MHz, in accordance with the industrial-scientific-medical (ISM) band, to avoid tissue heating. For the external part, the modulation index is 11.1% and the modulation rate 7.2%, with a data rate of 1 Mbit/s assuming Tbit = 1 µs. The system has been designed using a 0.35-μm CMOS fabrication technology. The mathematical model is given, the design is simulated using the OrCAD PSpice 16.2 software tool, and for real-time simulation the electronic workbench Multisim 11 has been used.
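The quoted modulation index follows the usual ASK definition, m = (Amax − Amin) / (Amax + Amin). As a quick check, hypothetical envelope amplitudes (not taken from the paper) of 1.0 V and 0.8 V reproduce the 11.1% figure:

```python
def modulation_index(a_max, a_min):
    """ASK modulation index m = (Amax - Amin) / (Amax + Amin)."""
    return (a_max - a_min) / (a_max + a_min)

# Hypothetical envelope levels (not from the paper): 1.0 V and 0.8 V.
m = modulation_index(1.0, 0.8)
print(f"{m:.1%}")  # 11.1%
```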
Abstract: TUSAT is a prospective Turkish
Communication Satellite designed for providing mainly data
communication and broadcasting services through Ku-Band
and C-Band channels. Thermal control is a vital issue in
satellite design process. Therefore, all satellite subsystems and equipment should be maintained in the desired temperature range from launch to the end of maneuvering life. The main function of thermal control is to keep the equipment and the satellite structures in a given temperature range over the various phases and operating modes of the spacecraft during its lifetime. This paper describes a thermal control design which uses both passive and active thermal control concepts. The active thermal control is based on heaters regulated by software via thermistors. The passive thermal control, in turn, consists of heat pipes, multilayer insulation (MLI) blankets, radiators, paints and surface finishes that maintain the temperature of the overall carrier components within an acceptable range.
Thermal control design is supported by thermal analysis using
thermal mathematical models (TMM).
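The active control described - heaters switched by software from thermistor readings - is commonly implemented as a simple bang-bang loop with hysteresis. A minimal sketch with hypothetical temperature limits (the actual setpoints are not given in the abstract):

```python
def heater_command(temp_c, heater_on, low=10.0, high=15.0):
    """Bang-bang heater control with hysteresis.

    Turn the heater on below `low`, off above `high`, and otherwise
    keep the previous state to avoid rapid switching.
    `low`/`high` are hypothetical limits, not from the paper.
    """
    if temp_c < low:
        return True
    if temp_c > high:
        return False
    return heater_on

# Sweep a falling then rising temperature profile.
state = False
for t in [16, 12, 9, 12, 16]:
    state = heater_command(t, state)
    print(t, state)
```

The hysteresis band is what keeps the heater from chattering when the thermistor reading hovers near a single threshold.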
Abstract: Scheduling of diversified service requests in
distributed computing is a critical design issue. A cloud is a type of parallel and distributed system consisting of a collection of interconnected and virtual computers. It comprises not only clusters and grids but also next-generation data centers. The paper proposes an initial heuristic algorithm that applies a modified ant colony optimization approach to the diversified service allocation and scheduling mechanism in the cloud paradigm. The proposed optimization method aims to minimize the scheduling throughput needed to service all the diversified requests according to the different resource allocators available in the cloud computing environment.
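The ant colony idea for assigning requests to virtual machines can be sketched as follows (the cost model, parameters, and update rule below are a generic textbook ACO, hypothetical and not the paper's modified algorithm): ants build assignments guided by pheromone trails, and the pheromone on the best assignment found so far is reinforced.

```python
import random

def aco_assign(costs, n_ants=20, n_iter=50, rho=0.1, seed=0):
    """Assign each task to one VM, minimizing total cost.

    costs[t][v]: cost of running task t on VM v (hypothetical model).
    """
    rng = random.Random(seed)
    n_tasks, n_vms = len(costs), len(costs[0])
    pher = [[1.0] * n_vms for _ in range(n_tasks)]
    best, best_cost = None, float("inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            # each ant picks a VM per task: pheromone x heuristic (1/cost)
            sol = [rng.choices(range(n_vms),
                               weights=[pher[t][v] / costs[t][v]
                                        for v in range(n_vms)])[0]
                   for t in range(n_tasks)]
            c = sum(costs[t][sol[t]] for t in range(n_tasks))
            if c < best_cost:
                best, best_cost = sol, c
        # evaporate pheromone, then reinforce the best-so-far assignment
        for t in range(n_tasks):
            for v in range(n_vms):
                pher[t][v] *= (1 - rho)
            pher[t][best[t]] += 1.0 / best_cost
    return best, best_cost

costs = [[4, 1], [2, 3], [5, 2]]   # 3 tasks, 2 VMs
print(aco_assign(costs))           # ([1, 0, 1], 5): the optimal assignment
```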
Abstract: Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we built an ensemble using a voting methodology over bagging and boosting ensembles with 10 sub-classifiers in each one. We performed a comparison with simple bagging and boosting ensembles with 25 sub-classifiers, as well as other well-known combining methods, on standard benchmark datasets, and the proposed technique was the most accurate.
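The combination step described - voting over the sub-classifiers of a bagging and a boosting ensemble - reduces, in its simplest hard-voting form, to a majority vote over all 20 predictions. A minimal sketch (the per-instance predictions below are hypothetical; the base learners themselves are omitted):

```python
from collections import Counter

def vote(predictions):
    """Hard majority vote over class labels from sub-classifiers."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical predictions for one instance from 10 bagging and
# 10 boosting sub-classifiers.
bagging_preds = ["spam"] * 6 + ["ham"] * 4
boosting_preds = ["spam"] * 3 + ["ham"] * 7
print(vote(bagging_preds + boosting_preds))  # ham (wins 11 to 9)
```

Pooling both ensembles lets the noise-robust bagging votes temper the boosting votes, which is the motivation the abstract gives for the hybrid.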
Abstract: This paper suggests a rethinking of the existing
research about Genetically Modified (GM) food. Since the first batch
of GM food was commercialised in the UK market, GM food rapidly
received and lost media attention in the UK. Disagreement on GM
food policy between the US and the EU has also drawn scholarly
attention to this issue. Much research has been carried out intending to understand people's views about GM food and the shaping of these views. This paper is based on the data collected in twenty-nine semi-structured interviews, which were examined through Erving Goffman's idea of self-presentation in interactions to suggest that the existing studies investigating "consumer attitudes" towards GM food have only considered the "front stage" in the dramaturgic metaphor. This paper suggests that the ways in which people choose to present themselves when participating in these studies should be taken into account during the data analysis.
Abstract: This paper deals with the helical flow of a Newtonian
fluid in an infinite circular cylinder, due to both longitudinal and
rotational shear stress. The velocity field and the resulting shear
stress are determined by means of the Laplace and finite Hankel
transforms and satisfy all imposed initial and boundary conditions.
For large times, these solutions reduce to the well-known steady-state
solutions.
Abstract: In the present work, we propose a new technique to
enhance the learning capabilities and reduce the computation
intensity of a competitive learning multi-layered neural network
using the K-means clustering algorithm. The proposed model uses a multi-layered network architecture with a back-propagation learning
mechanism. The K-means algorithm is first applied to the training
dataset to reduce the amount of samples to be presented to the neural
network, by automatically selecting an optimal set of samples. The
obtained results demonstrate that the proposed technique performs exceptionally well in terms of both accuracy and computation time when applied to the KDD99 dataset, compared to a standard learning scheme that uses the full dataset.
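The sample-reduction step can be sketched on toy one-dimensional data (hypothetical values; the paper applies K-means to KDD99 feature vectors): cluster the training set and keep only the cluster centroids as the reduced sample set presented to the network.

```python
def kmeans_1d(data, k, iters=20):
    """Plain 1-D K-means; returns the k centroids (the reduced sample set)."""
    # initialize with evenly spaced points from the sorted data
    centroids = sorted(data)[:: max(1, len(data) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for x in data:
            nearest = min(range(k), key=lambda i: abs(x - centroids[i]))
            clusters[nearest].append(x)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

# 8 raw samples reduced to 2 representatives for network training.
data = [0.1, 0.2, 0.15, 0.05, 5.0, 5.2, 4.9, 5.1]
print(kmeans_1d(data, 2))  # ~[0.125, 5.05]
```

Training then runs on k representatives instead of the full dataset, which is where the reported computation-time gain comes from.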
Abstract: Decrease in hardware costs and advances in computer
networking technologies have led to increased interest in the use of
large-scale parallel and distributed computing systems. One of the
biggest issues in such systems is the development of effective
techniques/algorithms for the distribution of the processes/load of a
parallel program on multiple hosts to achieve goal(s) such as
minimizing execution time, minimizing communication delays,
maximizing resource utilization and maximizing throughput.
Substantive research using queuing analysis, and assuming job arrivals following a Poisson pattern, has shown that in a multi-host system the probability of one host being idle while another host has multiple jobs queued up can be very high. Such imbalances in system load suggest that performance can be improved either by transferring jobs from the currently heavily loaded hosts to the lightly loaded ones or by distributing load evenly/fairly among the hosts. The algorithms known as load balancing algorithms help to achieve the above-said goal(s). These algorithms fall into two basic categories - static and dynamic. Whereas static load balancing (SLB) algorithms take decisions regarding the assignment of tasks to processors based on the average estimated values of process execution times and communication delays at compile time, dynamic load balancing (DLB) algorithms are adaptive to changing situations and take decisions at run time.
The objective of this work is to identify qualitative parameters for the comparison of the above-said algorithms. In the future this work can be extended to develop an experimental environment to study these load balancing algorithms quantitatively, based on comparative parameters.
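The static/dynamic distinction can be illustrated with a minimal sketch on a hypothetical job stream: static assignment fixes a round-robin mapping up front, while dynamic assignment sends each arriving job to the currently least-loaded host.

```python
def static_round_robin(job_times, n_hosts):
    """Compile-time assignment: job i always goes to host i % n_hosts."""
    load = [0] * n_hosts
    for i, t in enumerate(job_times):
        load[i % n_hosts] += t
    return load

def dynamic_least_loaded(job_times, n_hosts):
    """Run-time assignment: each job goes to the least-loaded host."""
    load = [0] * n_hosts
    for t in job_times:
        load[load.index(min(load))] += t
    return load

jobs = [9, 1, 9, 1]                   # hypothetical job execution times
print(static_round_robin(jobs, 2))    # [18, 2]  - one host heavily loaded
print(dynamic_least_loaded(jobs, 2))  # [10, 10] - balanced
```

The same alternating job stream that defeats the compile-time mapping is balanced perfectly by the run-time rule, which is exactly the adaptivity argument made above.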
Abstract: This paper simulates ad-hoc mesh networks in rural areas, where such networks receive great attention due to their low cost, since installing the infrastructure for regular networks in these areas is not feasible owing to the high cost. The distance between communicating nodes is the biggest obstacle that an ad-hoc mesh network faces. For example, in Terranet technology, two nodes can communicate directly only if they are at most one kilometer apart. If the distance between them is more than one kilometer, then each node in the ad-hoc mesh network has to act as a router that forwards the data it receives to other nodes. In this paper, we try to find the critical number of nodes that makes the network fully connected in a particular area, and then propose a method to encourage intermediate nodes to act as routers that forward data from the sender to the receiver. Much work has been done on technological changes in peer-to-peer networks, but the focus of this paper is on another aspect: finding the minimum number of nodes needed for a particular area to be fully connected, and then encouraging users to switch on their phones and accept working as routers for other nodes. Our method raises the rate of successful calls to 81.5% of attempted calls.
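The "critical number of nodes" question can be explored with a small Monte Carlo sketch (the 5 km x 5 km area is a hypothetical choice; the 1 km radio range follows the Terranet example): place n nodes uniformly at random and test whether the resulting unit-disk graph is connected by multi-hop links.

```python
import random

def is_connected(nodes, radio_range=1.0):
    """True if every node is reachable via multi-hop links of <= radio_range."""
    if not nodes:
        return True
    seen, stack = {0}, [0]
    while stack:
        i = stack.pop()
        for j in range(len(nodes)):
            if j not in seen:
                dx = nodes[i][0] - nodes[j][0]
                dy = nodes[i][1] - nodes[j][1]
                if dx * dx + dy * dy <= radio_range ** 2:
                    seen.add(j)
                    stack.append(j)
    return len(seen) == len(nodes)

def connectivity_rate(n, side=5.0, trials=200, seed=1):
    """Fraction of random placements of n nodes that are fully connected."""
    rng = random.Random(seed)
    hits = sum(
        is_connected([(rng.uniform(0, side), rng.uniform(0, side))
                      for _ in range(n)])
        for _ in range(trials))
    return hits / trials

# Connectivity probability rises sharply with node density; sweeping n
# locates the critical node count for the chosen area.
print(connectivity_rate(10), connectivity_rate(80))
```

Sweeping n until the connectivity rate crosses a target probability gives the critical node count for the area, in the spirit of the abstract.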
Abstract: Ultrathin (UTD) and nanoscale (NSD) SOI-MOSFET devices, sharing a similar W/L but with channel thicknesses of 46 nm and 1.6 nm respectively, were fabricated using a selective "gate recessed" process on the same silicon wafer. The electrical transport characterization at room temperature has shown a large difference between the two kinds of devices, which has been interpreted in terms of a huge unexpected series resistance. The electrical characteristics of the nanoscale device, taken in the linear region, can be analytically derived from those of the ultrathin device. A comparison of the structure and composition of the layers, using advanced techniques such as Focused Ion Beam (FIB) and High Resolution TEM (HRTEM) coupled with Energy Dispersive X-ray Spectroscopy (EDS), contributes an explanation for the difference in transport between the devices.
Abstract: This paper attempts to explore the phenomenon of metaphorization in English newspaper headlines from the perspective of pragmatic investigation. With relevance theory as the guideline, this paper explains the processing of metaphor with a pragmatic approach and points out that metaphor is the stimulus adopted by journalists to achieve optimal relevance in this ostensive communication, as well as the strategy to fulfill their writing purpose.
Abstract: Support Vector Domain Description (SVDD) is one of the best-known one-class support vector learning methods, in which one tries the strategy of using balls defined on the feature space in order to distinguish a set of normal data from all other possible abnormal objects. As with all kernel-based learning algorithms, its performance depends heavily on the proper choice of the kernel parameter. This paper proposes a new approach to selecting the kernel parameter based on maximizing the distance between the gravity centers of the normal and abnormal classes while at the same time minimizing the variance within each class. The performance of the proposed algorithm is evaluated on several benchmarks. The experimental results demonstrate the feasibility and the effectiveness of the presented method.
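The selection rule described - well-separated class centers, compact classes - can be sketched in feature space using only kernel evaluations, since the squared distance between the feature-space centers and the within-class variances are expressible through mean kernel values. The criterion below is a plausible reading of the abstract, not the paper's exact formula, and the data and candidate parameters are hypothetical.

```python
import math

def rbf(x, y, sigma):
    """Gaussian (RBF) kernel between two vectors."""
    return math.exp(-sum((a - b) ** 2 for a, b in zip(x, y))
                    / (2 * sigma ** 2))

def kmean(A, B, sigma):
    """Mean kernel value between two sample sets
    (= inner product of the feature-space class centers)."""
    return sum(rbf(a, b, sigma) for a in A for b in B) / (len(A) * len(B))

def criterion(normal, abnormal, sigma):
    """Between-center squared distance minus within-class variances.

    Larger is better: separated, compact classes in feature space.
    """
    d2 = (kmean(normal, normal, sigma) + kmean(abnormal, abnormal, sigma)
          - 2 * kmean(normal, abnormal, sigma))
    var = lambda A: 1.0 - kmean(A, A, sigma)   # K(x, x) = 1 for the RBF
    return d2 - var(normal) - var(abnormal)

normal = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3)]
abnormal = [(3.0, 3.0), (3.2, 2.9)]
best = max([0.1, 0.5, 1.0, 2.0, 5.0],
           key=lambda s: criterion(normal, abnormal, s))
print(best)  # 1.0 on this toy data
```

The selected sigma would then be passed to the SVDD training itself.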
Abstract: Soil stabilization has been widely used to improve
soil strength and durability or to prevent erosion and dust generation.
Generally, to reduce the problems of clayey soils in engineering work and to stabilize these soils, additional materials are used. The most common materials are lime, fly ash and cement. Although these materials improve soil properties, in some cases their use is limited by financial constraints and the need for special equipment. One of the best methods for stabilizing clayey soils is neutralization of the clay particles. For this purpose ion exchange materials can be used. An ion exchange solution such as CBR Plus can be used for soil stabilization. One of the most important issues in using CBR Plus is determining the amount of this solution for various soils with different properties. In this study a laboratory experiment is conducted to evaluate the ion exchange capacity of three soils with various plasticity indices (PI) in order to determine the amount of CBR Plus solution required for soil stabilization.
Abstract: This work deals with unsupervised image deblurring.
We present a new deblurring procedure for images provided by low-resolution synthetic aperture radar (SAR) or simply by multimedia sources, in the presence of multiplicative (speckle) or additive noise, respectively. The method we propose is defined as a two-step process. First, we use an original technique for noise reduction in the wavelet domain. Then, the learning of a Kohonen self-organizing map (SOM) is performed directly on the denoised image to remove the blur from it. This technique has been successfully applied to real SAR images, and simulation results are presented to demonstrate the effectiveness of the proposed algorithms.
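The SOM learning step mentioned above relies on Kohonen's competitive update rule. A minimal one-dimensional sketch (hypothetical scalar samples standing in for image data; the paper's image-domain training is far richer): the winning unit and its neighbors move toward each input with a decaying learning rate.

```python
import random

def train_som(data, n_units=4, iters=500, seed=0):
    """1-D Kohonen SOM on scalar samples.

    Each step, the unit closest to the input (the winner) and its
    immediate neighbors are pulled toward the input; the learning
    rate decays linearly to zero. Returns the sorted unit weights.
    """
    rng = random.Random(seed)
    w = [rng.random() for _ in range(n_units)]
    for t in range(iters):
        x = rng.choice(data)
        lr = 0.5 * (1 - t / iters)                # decaying learning rate
        win = min(range(n_units), key=lambda i: abs(w[i] - x))
        for i in range(n_units):
            # neighborhood: full pull for the winner, half for neighbors
            h = 1.0 if i == win else (0.5 if abs(i - win) == 1 else 0.0)
            w[i] += lr * h * (x - w[i])
    return sorted(w)

# Hypothetical gray levels; the units organize along the data range.
print(train_som([0.1, 0.12, 0.9, 0.88]))
```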
Abstract: Wind farms (WFs) with a high level of penetration are being established in power systems worldwide more rapidly than
other renewable resources. The Independent System Operator (ISO),
as a policy maker, should propose appropriate places for WF
installation in order to maximize the benefits for the investors. There
is also a possibility of congestion relief using the new installation of
WFs which should be taken into account by the ISO when proposing
the locations for WF installation. In this context, an efficient wind farm (WF) placement method is proposed in order to reduce the burden on congested lines. Since the wind speed is a random variable and load forecasts also contain uncertainties, probabilistic approaches are used for this type of study. An AC probabilistic optimal power flow (P-OPF) is formulated and solved using Monte Carlo Simulation (MCS). In order to reduce computation time, point estimate methods (PEM) are introduced as an efficient alternative to the time-demanding MCS. Subsequently, the WF optimal placement is determined using generation shift distribution factors (GSDF), considering a new parameter termed the wind availability factor (WAF). In order to obtain more realistic results, N-1 contingency analysis is employed to find the optimal size of the WF by means of line outage distribution factors (LODF). The IEEE 30-bus test system is used to show and compare the accuracy of the proposed methodology.
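The PEM idea - replacing thousands of Monte Carlo samples with a handful of deterministic model evaluations - can be sketched for a single random input. The quadratic wind-to-power curve below is hypothetical; the two-point scheme (evaluate at mu ± sigma with equal weights, a special case of Hong's point estimate method for a symmetric input) is a sketch, not the paper's multi-variable formulation.

```python
import random

def pem_two_point(f, mu, sigma):
    """Two-point estimate of mean/variance of f(X) for X ~ (mu, sigma)."""
    ys = [f(mu - sigma), f(mu + sigma)]          # just 2 model evaluations
    mean = 0.5 * ys[0] + 0.5 * ys[1]
    var = 0.5 * (ys[0] - mean) ** 2 + 0.5 * (ys[1] - mean) ** 2
    return mean, var

def mcs(f, mu, sigma, n=100_000, seed=0):
    """Reference Monte Carlo estimate with n Gaussian samples."""
    rng = random.Random(seed)
    ys = [f(rng.gauss(mu, sigma)) for _ in range(n)]
    m = sum(ys) / n
    return m, sum((y - m) ** 2 for y in ys) / n

power = lambda v: 0.5 * v ** 2        # hypothetical quadratic power curve
print(pem_two_point(power, 8.0, 1.0)) # (32.5, 64.0) from 2 evaluations
print(mcs(power, 8.0, 1.0))           # similar moments from 100,000 samples
```

For this input the PEM mean is exact and the variance is close to the Monte Carlo value, at a tiny fraction of the cost, which is the trade-off motivating PEM in the P-OPF study.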
Abstract: Diagnosis can be achieved by building a model of a
certain organ under surveillance and comparing it with the real time
physiological measurements taken from the patient. This paper deals
with the presentation of the benefits of using Data Mining techniques
in the computer-aided diagnosis (CAD), focusing on the cancer
detection, in order to help doctors to make optimal decisions quickly
and accurately. In the field of noninvasive diagnosis techniques, endoscopic ultrasound elastography (EUSE) is a recent elasticity imaging technique that allows characterizing the difference between malignant and benign tumors. Digitizing and summarizing the main features of the EUSE sample movies in vector form involves the use of exploratory data analysis (EDA). Neural networks are then trained on the corresponding EUSE sample-movie vectors in such a way that these intelligent systems are able to offer a very precise and objective diagnosis, discriminating between benign and malignant tumors. A concrete application of these Data Mining techniques illustrates the suitability and the reliability of this methodology in CAD.