Abstract: In this paper, we present a recently implemented approach that allows dynamic systems to plan their actions, taking into account changes in the perceived environment, and to control their execution when uncertainty and incomplete knowledge are the major characteristics of the situated environment [1],[2],[3],[4]. The distributed control architecture has three modules, and the approach is related to hierarchical planning: the plan produced by the planner is further refined at the control layer, which in turn supervises its execution by a functional level. We propose a new intelligent distributed architecture constituted by multi-agent subsystems for sensing, for the interpretation and representation of the environment [9], for dynamic localization, and for action. We tested this distributed architecture with a dynamic system in a known environment. The autonomous task of the Four-Rotor Mini Rotorcraft is described by primitive actions. The distributed control based on a multi-agent system is in charge of achieving each task in the best possible way, taking into account the context and sensory feedback.
Abstract: The purpose of this research is to disentangle and
validate the underlying factorial-structure of Ecotourism Experiential
Value (EEV) measurement scale and subsequently investigate its
psychometric properties. The analysis was based on a sample of 225
eco-tourists, collected in the vicinity of Taman Negara National Park
(TNNP) via an interviewer-administered questionnaire. Exploratory
factor analysis (EFA) was performed to determine the factorial
structure of EEV. Subsequently, to confirm and validate the factorial
structure and assess the psychometric properties of EEV,
confirmatory factor analysis (CFA) was executed. In addition, to
establish the nomological validity of EEV, a structural model was
developed to examine the effect of EEV on Total Eco-tourist
Experience Quality (TEEQ). It is unveiled that EEV is a second-order,
six-factor construct and that its scale has adequately met
the psychometric criteria, thus permitting confident interpretation of
the results. The findings have important implications for future
research directions and the management of ecotourism destinations.
Abstract: Probabilistic techniques in computer programs are becoming
increasingly widely used. Consequently, there is considerable
interest in the formal specification, verification, and development
of probabilistic programs. In our work-in-progress project, we are
attempting to make a constructive framework for developing probabilistic
programs formally. The main contribution of this paper
is to introduce an intermediate artifact of our work, a Z-based
formalism called PZ, by which one can build set theoretical models of
probabilistic programs. We propose to use a constructive set theory,
called CZ set theory, to interpret the specifications written in PZ.
Since CZ has an interpretation in Martin-Löf's theory of types, this
idea enables us to derive probabilistic programs from correctness
proofs of their PZ specifications.
Abstract: Knowledge is indispensable, but voluminous knowledge becomes a bottleneck for efficient processing. A great challenge for the data mining activity is the generation of a large number of potential rules as a result of the mining process. In fact, the result size is sometimes comparable to the original data. Traditional data mining pruning activities, such as support, do not sufficiently reduce the huge rule space. Moreover, many practical applications are characterized by continual change of data and knowledge, making knowledge more voluminous with each change. The most predominant representation of the discovered knowledge is the standard Production Rule (PR) in the form If P Then D. Michalski & Winston proposed Censored Production Rules (CPRs) as an extension of production rules that exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form If P Then D Unless C, where C (the censor) is an exception to the rule. Such rules are employed in situations in which the conditional statement 'If P Then D' holds frequently and the assertion C holds rarely. By using a rule of this type we are free to ignore the exception conditions when the resources needed to establish their presence are tight, or when there is simply no information available as to whether they hold or not. Thus the 'If P Then D' part of the CPR expresses important information, while the Unless C part acts only as a switch that changes the polarity of D to ~D. In this paper a scheme based on the Dempster-Shafer Theory (DST) interpretation of a CPR is suggested for discovering CPRs from discovered flat PRs. The discovery of CPRs from flat rules would result in a considerable reduction of the already discovered rules. The proposed scheme incrementally incorporates new knowledge and also reduces the size of the knowledge base considerably with each episode. Examples are given to demonstrate the behaviour of the proposed scheme.
The suggested cumulative learning scheme would be useful in mining data streams.
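The switch-like behaviour of a CPR under resource constraints can be sketched in a few lines of Python (the function name and the bird example are illustrative only; this sketch does not implement the proposed DST-based discovery scheme):

```python
# Minimal sketch of evaluating a Censored Production Rule
# "If P Then D Unless C" under resource constraints.
# All names here are illustrative, not from the proposed scheme.

def evaluate_cpr(p_holds, censor_check=None):
    """Return the conclusion's polarity for a CPR 'If P Then D Unless C'.

    censor_check is a zero-argument callable probing whether the censor C
    holds, or None when resources are tight / no information is available.
    """
    if not p_holds:
        return None          # rule does not fire
    if censor_check is None:
        return "D"           # ignore the exception: answer quickly with D
    # The censor acts as a switch: it flips D to ~D when C holds.
    return "~D" if censor_check() else "D"

# Example: "If bird Then flies Unless penguin"
print(evaluate_cpr(True))                   # 'D'  (no resources to check censor)
print(evaluate_cpr(True, lambda: True))     # '~D' (censor holds)
print(evaluate_cpr(True, lambda: False))    # 'D'
```

This mirrors the variable-precision behaviour described above: the censor is probed only when it is affordable to do so.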
Abstract: Imaging is defined as the process of obtaining
geometric images, either two-dimensional or three-dimensional, by scanning or digitizing existing objects or products. In this research, it is applied to retrieve 3D information of the human skin
surface in a medical application. This research focuses on analyzing
and determining volume of leg ulcers using imaging devices. Volume
determination is one of the important criteria in clinical assessment of leg ulcer. The volume and size of the leg ulcer wound will give the
indication on responding to treatment whether healing or worsening.
Different imaging techniques are expected to give different results (and accuracies) in generating data and images. A midpoint projection
algorithm was used to reconstruct the cavity into a solid model and compute the volume. Misinterpretation of the results can affect the
treatment efficacy. The objective of this paper is to compare the
accuracy of two 3D data acquisition methods: laser
triangulation and structured light. Using models with known volume, it was shown that the structured-light-based 3D technique
produces better accuracy than the laser triangulation
method for leg ulcer volume determination.
Abstract: There is an ongoing controversy in the literature related
to the biological effects of weak, low frequency electromagnetic
fields. The physical arguments and interpretation of the experimental
evidence are inconsistent, where some physical arguments and
experimental demonstrations tend to reject the likelihood of any
effect of the fields at extremely low levels. The problem arises, in
theoretical studies, of explaining how the low-energy influences of
weak magnetic fields can compete with the thermal and electrical
noise of cells at normal temperature. Magnetoreception in animals
involves the radical pair mechanism. The same mechanism has
been shown to be involved in the circadian rhythm synchronization in
mammals. These reactions can be influenced by weak magnetic
fields. Hence, it is postulated that the biological clock can be affected
by weak magnetic fields and these disruptions to the rhythm can
cause adverse biological effects. In this paper, the likelihood of
altering the biological clock via the radical pair mechanism is
analyzed in order to clarify this controversy.
Abstract: This study applies Geo-Informatic technology to land
tenure and land use in an economic crop area, in order to create
sustainable land use, access to the area, and a sustainable food
supply for the people of the community. The research objectives are to 1)
apply Geo-Informatic Technology on land ownership and agricultural
land use (cash crops) in the research area, 2) create GIS database on
land ownership and land use, and 3) create a database for an online
Geo-information system on land tenure and land use. The results of this
study reveal that, first; the study area is on high slope, mountains and
valleys. The land is mainly in the forest zone which was included in
the Forest Act 1941 and National Conserved Forest 1964. Residents
gained the rights to exploit the land passed down from their
ancestors. The practice was recognized by communities. The land
was suitable for cultivating a wide variety of economic crops that was
the main income of the family. At present the local residents keep
expanding the land to grow cash crops. Second; creating a database
of the geographic information system consisted of the area range,
announcement from the Interior Ministry, interpretation of satellite
images, transportation routes, waterways, plots of land with a title
deed available at the provincial land office. Most pieces of land
without a title deed are located in the forest and national reserve
areas. Data were created from a field study and a land zone
determined by a GPS. Last; an online Geo-Informatic System can
show the information of land tenure and land use of each economic
crop. High-resolution satellite data could be updated and
checked on the online Geo-Informatic System simultaneously.
Abstract: This paper attempts to identify the significance of
Information and Communications Technology (ICT) and
competitiveness to the profit efficiency of commercial banks in
Malaysia. The profit efficiency of commercial banks in Malaysia, the
dependent variable, was estimated using the Stochastic Frontier
Approach (SFA) on a sample of unbalanced panel data covering 23
commercial banks from 1995 to 2007. Based on the empirical
results, ICT was not found to exert a significant impact on profit
efficiency, whereas competitiveness, non-ICT stock expenditure and
ownership were significant contributors. On the other hand, the size
of banks was found to have significantly reduced profit efficiency,
opening up for various interpretations of the interrelated role of ICT
and competition.
Abstract: The basis of this paper is the assumption that the graviton
is a measurable entity of molecular gravitational acceleration rather
than a hypothetical one. The adoption of this assumption as an
axiom is tantamount to fully opening the previously locked door to
the boundary theory between laminar and turbulent flows. It leads to
the theorem that the division of flows of Newtonian (viscous) fluids
into laminar and turbulent is true only if the fluid is influenced by a
powerful, external force field. The mathematical interpretation of this
theorem, presented in this paper, shows that the boundary between
laminar and turbulent flow can be determined theoretically. This is a
novelty, because thus far this boundary had been determined only
empirically and the reasons for its existence were unknown.
Abstract: Cognitive Science appeared about 40 years ago,
following the challenge of Artificial Intelligence, as common
territory for several scientific disciplines such as: IT, mathematics,
psychology, neurology, philosophy, sociology, and linguistics. The
newborn science was justified by the complexity of the problems
related to human knowledge on the one hand, and on the other by the
fact that none of the above-mentioned sciences could explain alone
the mental phenomena. Based on the data supplied by the
experimental sciences such as psychology or neurology, models of
the operation of the human mind are built in cognitive science. These
models are implemented in computer programs and/or electronic
circuits (specific to artificial intelligence) – cognitive systems –
whose competences and performances are compared to the human
ones, leading to the reinterpretation of psychology and neurology
data and to the construction of new models. In these processes,
psychology provides the experimental basis, while philosophy and
mathematics provide the level of abstraction necessary for the
interplay of the mentioned sciences.
The general problematic of the cognitive approach
comprises two important types of approach: the computational one,
starting from the idea that mental phenomena can be reduced to
calculus operations on 1s and 0s, and the connectionist one, which
considers the products of thinking to be the result of the interaction
between all the composing (included) systems. In the field of
psychology, measurements in the computational register use classical
inquiries and psychometric tests, generally based on calculus
methods. Considering both sides of cognitive science, we can
notice a gap in the possibilities for measuring psychological products
from the connectionist perspective, which requires a unitary
understanding of the quality-quantity whole. In such an approach,
measurement by calculus proves to be inefficient. Our research,
carried out for more than 20 years, leads to the conclusion that
measuring by forms properly fits the laws and principles of
connectionism.
Abstract: The ever-growing usage of aspect-oriented
development methodology in the field of software engineering
requires tool support for both research environments and industry. So
far, tool support for many activities in aspect-oriented software
development has been proposed to automate and facilitate
development. For instance, AJaTS provides a transformation
system to support aspect-oriented development and refactoring. In
particular, it is well established that the abstract interpretation of
programs, in any paradigm, pursued in static analysis is best served
by a high-level program representation such as the Control Flow Graph
(CFG). With such a representation, an analysis can more easily locate
common programming idioms for which helpful transformations are
already known, and the association between the input program and the
intermediate representation can be maintained more closely.
However, although current research defines, to some extent, sound
concepts and foundations for control flow analysis of aspect-oriented
programs, it does not provide a concrete tool that can directly
construct the CFG of these programs. Furthermore, most of
these works focus on addressing other issues regarding Aspect-
Oriented Software Development (AOSD) such as testing or data flow
analysis rather than the CFG itself. Therefore, this study is dedicated
to building an aspect-oriented control flow graph construction tool
called AJcFgraph Builder. The tool can be applied in many software
engineering tasks in the context of AOSD, such as software testing,
software metrics, and so forth.
Abstract: Interpretation of aerial images is an important task in
various applications. Image segmentation can be viewed as the essential
step for extracting information from aerial images. Among many
developed segmentation methods, the technique of clustering has been
extensively investigated and used. However, determining the number
of clusters in an image is inherently a difficult problem, especially
when a priori information on the aerial image is unavailable. This
study proposes a support vector machine approach for clustering
aerial images. Three cluster validity indices, a distance-based index,
the Davies-Bouldin index, and the Xie-Beni index, are utilized as quantitative
measures of the quality of clustering results. Comparisons of the
effectiveness of these indices and of various parameter settings for the
proposed method are conducted. Experimental results are provided
to illustrate the feasibility of the proposed approach.
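Of the three validity indices named above, the Davies-Bouldin index is perhaps the most widely implemented; a minimal NumPy sketch (not the authors' implementation, and with invented sample points) is:

```python
import numpy as np

def davies_bouldin(X, labels):
    """Davies-Bouldin index: average over clusters of the worst-case
    ratio (S_i + S_j) / d(c_i, c_j), where S_k is the mean distance of
    cluster k's points to its centroid. Lower values mean better clustering."""
    ks = np.unique(labels)
    centroids = np.array([X[labels == k].mean(axis=0) for k in ks])
    # S_k: mean distance of each cluster's points to its centroid
    scatter = np.array([
        np.linalg.norm(X[labels == k] - centroids[i], axis=1).mean()
        for i, k in enumerate(ks)
    ])
    total = 0.0
    for i in range(len(ks)):
        total += max(
            (scatter[i] + scatter[j]) / np.linalg.norm(centroids[i] - centroids[j])
            for j in range(len(ks)) if j != i
        )
    return total / len(ks)

# Two tight, well-separated clusters yield a low index.
X = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 0.0], [10.0, 1.0]])
labels = np.array([0, 0, 1, 1])
print(davies_bouldin(X, labels))  # 0.1
```

The distance-based and Xie-Beni indices would be computed analogously from the same centroids and scatter values.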
Abstract: Ramadan requires individuals to abstain from food and fluid intake between sunrise and sunset; physiological considerations predict that poorer mood, physical performance and mental performance will result. In addition, any difficulties will be worsened because preparations for fasting and recovery from it often mean that nocturnal sleep is decreased in length, and this independently affects mood and performance.
A difficulty of interpretation in many studies is that the observed changes could be due to fasting but also to the decreased length of sleep and altered food and fluid intakes before and after the daytime fasting. These factors were separated in this study, which took place over three separate days and compared the effects of different durations of fasting (4, 8 or 16h) upon a wide variety of measures (including subjective and objective assessments of performance, body composition, dehydration and responses to a short bout of exercise) - but with an unchanged amount of nocturnal sleep, controlled supper the previous evening, controlled intakes at breakfast and daytime naps not being allowed. Many of the negative effects of fasting observed in previous studies were present in this experiment also. These findings indicate that fasting was responsible for many of the changes previously observed, though some effect of sleep loss, particularly if occurring on successive days (as would occur in Ramadan) cannot be excluded.
Abstract: When reconstructing a scenario, it is necessary to
know the structure of the elements present in the scene in order to
interpret it. In this work we link 3D scene reconstruction to
evolutionary algorithms through stereo vision theory. We
consider stereo vision as a method that provides the reconstruction of
a scene using only a pair of images of the scene and performing
some computation. Through several images of a scene, captured from
different positions, stereo vision can give us an idea of the
three-dimensional characteristics of the world. Stereo vision usually
requires two cameras, in analogy to the mammalian visual
system. In this work we employ only one camera, which is translated
along a path, capturing images at regular intervals. As we cannot
perform all the computations required for an exhaustive reconstruction,
we employ an evolutionary algorithm to partially reconstruct the
scene in real time. The algorithm employed is the fly algorithm,
which employs "flies" to reconstruct the principal characteristics of
the world following certain evolutionary rules.
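The core of the fly algorithm can be illustrated with a toy sketch: 3D points ("flies") are evolved so that their projections into two images, taken from camera positions translated along a path, look alike. The pinhole cameras, list-of-lists images and fitness function below are simplified assumptions, not the authors' real-time implementation.

```python
import random

W, H, F, BASELINE = 64, 64, 50.0, 1.0  # image size, focal length, camera shift

def project(fly, cam_x):
    """Pinhole projection of a 3D fly into the image of a camera at (cam_x, 0, 0)."""
    x, y, z = fly
    return int(F * (x - cam_x) / z) + W // 2, int(F * y / z) + H // 2

def fitness(fly, img_left, img_right):
    """Photo-consistency: negative intensity difference between the two
    projections (higher is better); flies projecting outside either image
    score minus infinity."""
    (ul, vl), (ur, vr) = project(fly, 0.0), project(fly, BASELINE)
    if not (0 <= ul < W and 0 <= vl < H and 0 <= ur < W and 0 <= vr < H):
        return float("-inf")
    return -abs(img_left[vl][ul] - img_right[vr][ur])

def evolve(img_left, img_right, n=200, generations=50):
    """Keep the best half of the flies each generation, refill by mutation."""
    pop = [(random.uniform(-2, 2), random.uniform(-2, 2), random.uniform(2, 8))
           for _ in range(n)]
    for _ in range(generations):
        pop.sort(key=lambda f: fitness(f, img_left, img_right), reverse=True)
        survivors = pop[: n // 2]
        pop = survivors + [
            tuple(c + random.gauss(0, 0.1) for c in random.choice(survivors))
            for _ in range(n - len(survivors))
        ]
    return pop
```

After evolution, the surviving flies concentrate on photo-consistent surface points, giving a partial reconstruction without exhaustive matching.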
Abstract: In this paper we present a method for gene ranking
from DNA microarray data. More precisely, we calculate the correlation
networks, which are unweighted and undirected graphs, from
microarray data of cervical cancer, where each network represents
a tissue of a certain tumor stage and each node in the network
represents a gene. From these networks we extract one tree for
each gene by a local decomposition of the correlation network. The
interpretation of a tree is that it represents the n-nearest neighbor
genes on the n-th level of a tree, measured by the Dijkstra distance,
and, hence, gives the local embedding of a gene within the correlation
network. For the obtained trees we measure the pairwise similarity
between trees rooted by the same gene from normal to cancerous
tissues. This evaluates the modification of the tree topology due to
progression of the tumor. Finally, we rank the obtained similarity
values from all tissue comparisons and select the top ranked genes.
For these genes the local neighborhood in the correlation networks
changes most between normal and cancerous tissues. As a result,
we find that the top-ranked genes are candidates suspected to be
involved in tumor growth; hence, this indicates that our method
captures essential information from the underlying DNA microarray
data of cervical cancer.
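Since the correlation networks are unweighted, the Dijkstra distance reduces to breadth-first shortest-path distance, and the local tree of a gene can be read off level by level. A minimal sketch (the gene graph below is hypothetical, not taken from the cervical cancer data):

```python
from collections import deque

def neighbor_levels(graph, root, max_level=3):
    """Group nodes by shortest-path distance from `root` (BFS equals
    Dijkstra on unweighted graphs): level n holds the n-nearest neighbor
    genes, i.e. the n-th level of the gene's local tree."""
    dist = {root: 0}
    queue = deque([root])
    levels = {0: [root]}
    while queue:
        u = queue.popleft()
        if dist[u] == max_level:
            continue  # do not expand beyond the tree's deepest level
        for v in graph.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                levels.setdefault(dist[v], []).append(v)
                queue.append(v)
    return levels

# Hypothetical correlation network as an adjacency list of gene symbols.
net = {"TP53": ["MDM2", "CDKN1A"], "MDM2": ["TP53", "CDK4"],
       "CDKN1A": ["TP53"], "CDK4": ["MDM2"]}
print(neighbor_levels(net, "TP53"))
# {0: ['TP53'], 1: ['MDM2', 'CDKN1A'], 2: ['CDK4']}
```

Comparing the level sets of the same root gene in normal versus cancerous networks (for instance by set overlap per level) then quantifies how much its local embedding changes with tumor progression.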
Abstract: Parsing is important in Linguistics and Natural
Language Processing to understand the syntax and semantics of a
natural language grammar. Parsing natural language text is
challenging because of problems such as ambiguity and inefficiency.
Moreover, the interpretation of natural language text depends on
context-based techniques. A probabilistic component is essential to resolve
ambiguity in both syntax and semantics thereby increasing accuracy
and efficiency of the parser. The Tamil language has some inherent
features which make it even more challenging. To obtain solutions, a
lexicalized and statistical approach is to be applied in parsing
with the aid of a language model. Statistical models mainly focus on
the semantics of the language and are suitable for large-vocabulary
tasks, whereas structural methods focus on syntax and model
small-vocabulary tasks. A statistical language model based on trigrams
for the Tamil language, with a medium vocabulary of 5,000 words, has
been built. Though statistical parsing gives better performance
through trigram probabilities and a large vocabulary size, it has some
disadvantages, such as a focus on semantics rather than syntax and a
lack of support for free word order and long-term relationships. To
overcome these disadvantages, a structural component is to be
incorporated into statistical language models, which leads to the
implementation of hybrid language models. This paper attempts
to build a phrase-structured hybrid language model which resolves
the above-mentioned disadvantages. In the development of the hybrid
language model, a new part-of-speech tag set for the Tamil language
has been developed, with more than 500 tags providing wide
coverage. A phrase-structured treebank has been developed with 326
Tamil sentences covering more than 5,000 words. A hybrid
language model has been trained with the phrase-structured treebank
using the immediate-head parsing technique. A lexicalized and statistical
parser which employs this hybrid language model and the immediate-head
parsing technique gives better results than pure grammar-based and
trigram-based models.
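A maximum-likelihood trigram model of the kind described can be sketched as follows (toy English tokens stand in for Tamil words; a practical model would add smoothing for unseen trigrams):

```python
from collections import defaultdict

def train_trigrams(sentences):
    """Count trigrams and their context bigrams over padded sentences."""
    tri, bi = defaultdict(int), defaultdict(int)
    for s in sentences:
        tokens = ["<s>", "<s>"] + s + ["</s>"]
        for i in range(len(tokens) - 2):
            bi[(tokens[i], tokens[i + 1])] += 1
            tri[(tokens[i], tokens[i + 1], tokens[i + 2])] += 1
    return tri, bi

def trigram_prob(tri, bi, w1, w2, w3):
    """P(w3 | w1, w2) by maximum likelihood; 0 for unseen contexts."""
    if bi[(w1, w2)] == 0:
        return 0.0
    return tri[(w1, w2, w3)] / bi[(w1, w2)]

corpus = [["the", "boy", "ran"], ["the", "boy", "ate"]]
tri, bi = train_trigrams(corpus)
print(trigram_prob(tri, bi, "the", "boy", "ran"))  # 0.5
```

A hybrid model as proposed above would combine such trigram probabilities with scores from the phrase-structure component rather than rely on the n-gram statistics alone.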
Abstract: The most common result of the analysis of high-throughput
data in molecular biology is a global list of
genes, ranked according to a certain score. The score can be a
measure of differential expression. Recent work proposed a new
method for selecting a number of genes in a ranked gene list from
microarray gene expression data such that this set forms the
Optimally Functionally Enriched Network (OFTEN), formed by
known physical interactions between genes or their products. Here
we present calculated values of the relative connectivity of genes from
the META-OFTEN network and a tentative biological interpretation of the
most reproducible signal. The relative connectivity and
betweenness values of genes from the META-OFTEN network were
estimated.
Abstract: In seismic surveying, information regarding the
velocity of the compression wave (Vp) as well as the shear wave (Vs) is
very useful, especially during seismic interpretation. Previous
studies showed that the Vp and Vs determined by different methods
differ from each other but offer good approximations. In this study,
the Vp and Vs of consolidated granite rock were studied using the
ultrasonic testing method and the seismic refraction method. In
ultrasonic testing, two different rock conditions are used: dry and
wet. The differences between the Vp and Vs obtained using
ultrasonic testing and those obtained using seismic refraction were
investigated. The effect of the water content of the granite rock on
the values of Vp and Vs during ultrasonic testing is also measured.
Within this work, the tolerance of the differences between the
seismic wave velocity obtained from ultrasonic testing and that
obtained from seismic refraction is also measured and investigated.
Abstract: This paper is mainly concerned with the application of a novel data interpretation technique to the characterization and classification of measurements of plasma columns in Tokamak reactors for nuclear fusion applications. The proposed method exploits several concepts derived from soft computing theory. In particular, Artificial Neural Networks have been exploited to classify magnetic variables useful for determining the shape and position of the plasma with reduced computational complexity. The proposed technique is used to analyze simulated databases of plasma equilibria based on the ITER geometry configuration. As well as demonstrating the successful recovery of scalar equilibrium parameters, we show that the technique can yield practical advantages compared with earlier methods.