Abstract: Field mapping of an active volcano, particularly in the Torrid Zone, is usually hampered by problems such as steep terrain and poor atmospheric conditions. In this paper we present a simple solution to such problems through a combination of Synthetic Aperture Radar (SAR) and geostatistical methods. By this combination, we
could reduce the speckle effect from the SAR data and then estimate
roughness distribution of the pyroclastic flow deposits. The main
purpose of this study is to accurately detect the spatial distribution of new pyroclastic flow deposits, termed P-zones, using the β° data from two RADARSAT-1 SAR level-0 scenes. A single scene of Hyperion data and
field observation were used for cross-validation of the SAR results.
Mt. Merapi in central Java, Indonesia, was chosen as a study site and
the eruptions in May-June 2006 were examined. The P-zones were
found in the western and southern flanks. The area size and the longest
flow distance were calculated as 2.3 km2 and 6.8 km, respectively. The
grain size variation of the P-zones was mapped in detail, from fine to coarse deposits, relative to the C-band wavelength of 5.6 cm.
Abstract: In this paper, the noise maps for the area encircled by
the Second Ring Road in Riyadh city are developed based on real
measured data. Sound level meters, GPS receivers to determine
measurement position, a database program to manage the measured
data, and a program to develop the maps are used. A baseline noise
level has been established at each short-term site so subsequent
monitoring may be conducted to describe changes in Riyadh's noise
environment. Short-term sites are used to show typical daytime and
nighttime noise levels at specific locations by short duration grab
sampling.
Abstract: This paper addresses the problem of building a unified
structure to describe a peer-to-peer system. Our approach uses the
well-known notations in the P2P area and provides a global architecture that separates platform-specific characteristics from logical ones. In order to enable the navigation of a peer across platforms, a roaming layer is added. The latter provides the capability to define a unique identification for each peer and assures the mapping between this identification and those used in each platform. The mapping task is handled by a special wrapper. In addition, an ontology is proposed to give a clear presentation of the structure of the P2P system without regard to the content and the resources managed by the peer. The ontology is created according to the Semantic Web paradigm using the OWL language; thus, the structure of the system is considered as a web resource.
Abstract: Ontology Matching is a task needed in various applications, for example for comparison or merging purposes. In the literature, many algorithms solving the matching problem can be found, but most of them do not consider instances at all. Mappings are determined by calculating the string similarity of labels, by recognizing linguistic word relations (synonyms, subsumptions, etc.) or by analyzing the (graph) structure. Due to the facts that instances are often modeled within the ontology and that the set of instances describes the meaning of the concepts better than their meta information, instances should definitely be incorporated into the matching process. In this paper several novel instance-based matching algorithms are presented which enhance the quality of matching results obtained with common concept-based methods. Different kinds of formalisms are used to classify concepts on account of their instances and finally to compare the concepts directly.
Keywords: Instances, Ontology Matching, Semantic Web
Abstract: The problem of mapping tasks onto a computational grid with the aim to minimize the power consumption and the makespan subject to the constraints of deadlines and architectural requirements is considered in this paper. To solve this problem, we propose a solution from cooperative game theory based on the concept of Nash Bargaining Solution. The proposed game theoretical technique is compared against several traditional techniques. The experimental results show that when the deadline constraints are tight, the proposed technique achieves superior performance and reports competitive performance relative to the optimal solution.
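As a concrete illustration of the cooperative-game idea, the sketch below brute-forces a Nash Bargaining Solution over all task-to-machine mappings for a toy instance. The cost tables, disagreement points, and deadline handling are illustrative assumptions, not the paper's actual formulation.

```python
from itertools import product

def nash_bargaining_mapping(energy, time, d_energy, d_time, deadline):
    """Toy Nash Bargaining Solution for task mapping.

    energy[t][m], time[t][m]: cost of running task t on machine m.
    d_energy, d_time: disagreement (worst-case) utilities of the two
    'players' (the energy objective and the makespan objective).
    Among mappings meeting the deadline, pick the one maximizing the
    Nash product (d_energy - E) * (d_time - T); None if infeasible.
    """
    n_tasks, n_machines = len(energy), len(energy[0])
    best, best_product = None, float("-inf")
    for mapping in product(range(n_machines), repeat=n_tasks):
        total_energy = sum(energy[t][m] for t, m in enumerate(mapping))
        loads = [0.0] * n_machines
        for t, m in enumerate(mapping):
            loads[m] += time[t][m]
        makespan = max(loads)  # latest-finishing machine
        if makespan > deadline:
            continue  # violates the deadline constraint
        prod_val = (d_energy - total_energy) * (d_time - makespan)
        if prod_val > best_product:
            best_product, best = prod_val, mapping
    return best
```

For realistic problem sizes the paper's game-theoretic technique would replace this exhaustive search, whose cost grows as machines^tasks.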
Abstract: CIM is the standard formalism for modeling management
information developed by the Distributed Management Task
Force (DMTF) in the context of its WBEM proposal, designed to
provide a conceptual view of the managed environment. In this
paper, we propose the inclusion of formal knowledge representation
techniques, based on Description Logics (DLs) and the Web Ontology
Language (OWL), in CIM-based conceptual modeling, and then we
examine the benefits of such a decision. The proposal is specified
as a CIM metamodel level mapping to a highly expressive subset
of DLs capable of capturing all the semantics of the models. The
paper shows how the proposed mapping provides CIM diagrams with
precise semantics and can be used for automatic reasoning about the
management information models, as a design aid, by means of new-generation CASE tools, thanks to the use of state-of-the-art automatic
reasoning systems that support the proposed logic and use algorithms
that are sound and complete with respect to the semantics. Such a
CASE tool framework has been developed by the authors and its
architecture is also introduced. The proposed formalization is not
only useful at design time, but also at run time through the use of
rational autonomous agents, in response to a need recently recognized
by the DMTF.
Abstract: Since the 1940s, many promising telepresence
research results have been obtained. However, telepresence
technology still has not reached industrial usage. Although human intelligence is necessary for the successful execution of most manual assembly tasks, human ability is limited in some cases, such as the assembly of heavy parts in small/medium lots or prototypes. In such cases of manual assembly, the help of industrial robots is mandatory. Telepresence technology can be considered
as a solution for performing assembly tasks, where the human
intelligence and haptic sense are needed to identify and minimize the
errors during an assembly process and a robot is needed to carry
heavy parts. In this paper, preliminary steps to integrate the
telepresence technology into industrial robot systems are introduced.
The system described here combines both, the human haptic sense
and the industrial robot capability to perform a manual assembly task
remotely using a force feedback joystick. The mapping between the joystick's Degrees of Freedom (DOF) and the robot's is introduced. Simulation and experimental results are shown and future
work is discussed.
Abstract: In the present era of aviation technology, autonomous navigation and control have emerged as a prime area of active research. Owing to the tremendous developments in the field, autonomous controls have led today's engineers to claim that the future of aerospace vehicles is unmanned. Development of guidance and navigation algorithms for an unmanned aerial vehicle (UAV) is an extremely challenging task, which requires efforts to meet strict, and at times conflicting, goals of guidance and control. In this paper, aircraft altitude and heading controllers and an efficient algorithm for self-governing navigation using the MATLAB® mapping toolbox are presented, which also enables loitering of a fixed-wing UAV over a specified area. For this purpose, a nonlinear mathematical model of a UAV is used. The nonlinear model is linearized around a stable trim point and decoupled for controller design. The linear controllers are tested on the nonlinear aircraft model and a navigation algorithm is subsequently developed for autonomous flight of the UAV. Results are presented for trajectory controllers and waypoint-based navigation. Our investigation reveals that the MATLAB® mapping toolbox can be exploited to deliver an efficient algorithm for autonomous aerial navigation of a UAV.
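The linearization step described above can be sketched numerically: a finite-difference Jacobian of the nonlinear state derivatives, evaluated at the trim point, yields the linear model used for controller design. The toy two-state dynamics and its coefficients below are illustrative assumptions, not the paper's UAV model.

```python
def jacobian(f, x, eps=1e-6):
    """Numerically linearize f around x: A[i][j] = df_i/dx_j."""
    n = len(x)
    fx = f(x)
    A = []
    for i in range(len(fx)):
        row = []
        for j in range(n):
            xp = list(x)
            xp[j] += eps  # perturb one state at a time
            row.append((f(xp)[i] - fx[i]) / eps)
        A.append(row)
    return A

# Toy longitudinal dynamics: states [altitude_error, climb_rate]
def dyn(x):
    h_err, h_dot = x
    return [h_dot, -0.5 * h_dot - 0.2 * h_err]  # illustrative coefficients

A = jacobian(dyn, [0.0, 0.0])  # linear model at the trim point
```

The resulting matrix A is what a linear controller design (pole placement, LQR, etc.) would operate on.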
Abstract: Most scientific programs have large input and output
data sets that require out-of-core programming or use virtual memory
management (VMM). Out-of-core programming is very error-prone
and tedious; as a result, it is generally avoided. However, in many instances, VMM is not an effective approach because it often results in substantial performance reduction. In contrast, compiler-driven I/O management will allow a program's data sets to be retrieved in parts,
called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a
compiler combined with a user level runtime system that can be used
to replace standard VMM for out-of-core programs. We describe
Comanche and demonstrate on a number of representative problems
that it substantially out-performs VMM. Significantly our system
does not require any special services from the operating system and
does not require modification of the operating system kernel.
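The block/tile idea can be illustrated with a minimal sketch (this is not the Comanche system itself): a large data set is streamed through a fixed-size buffer, so peak memory never depends on file size. The file layout (contiguous little-endian float64 values) is an assumption made for the example.

```python
import struct

def sum_out_of_core(path, tile_bytes=4096):
    """Sum a file of little-endian float64 values one tile at a time,
    so memory use stays bounded by tile_bytes regardless of file size."""
    assert tile_bytes % 8 == 0  # tiles must hold whole float64 values
    total = 0.0
    with open(path, "rb") as f:
        while True:
            tile = f.read(tile_bytes)  # fetch one block/tile
            if not tile:
                break
            total += sum(struct.unpack(f"<{len(tile) // 8}d", tile))
    return total
```

A compiler-managed scheme would additionally choose tile sizes and prefetch order from the program's access pattern; this sketch only shows the bounded-memory retrieval itself.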
Abstract: Unified Modeling Language (UML) extensions for real time embedded systems (RTES) co-design are attracting growing interest from a great number of industrial and research communities. The extension mechanism is provided by UML profiles for RTES. It aims at providing an easily understood method of system design for non-experts. On the other hand, one of the key items of co-design methods is Hardware/Software partitioning and task scheduling. Indeed, it is mandatory to define where and when tasks are implemented and run. Unfortunately, the main goals of co-design are not included in the usual practice of UML profiles. So, there exists a need for mapping the models used to an execution platform for both schedulability testing and HW/SW partitioning. In the present work, schedulability testing and design space exploration are performed at an early stage. The proposed approach adopts Model Driven Engineering (MDE). It starts from a UML specification annotated with the recent profile for the Modeling and Analysis of Real Time Embedded systems (MARTE). Following a refinement strategy, transformation rules make it possible to find a feasible schedule that satisfies timing constraints and to define where tasks will be implemented. The overall approach is experimented on the design of a football player robot application.
Abstract: Information is power. Geographical information is an
emerging science that is advancing the development of knowledge to
further help in the understanding of the relationship of "place" with other disciplines such as crime. The researchers used crime data for
the years 2004 to 2007 from the Baguio City Police Office to
determine the incidence and actual locations of crime hotspots.
Combined qualitative and quantitative research methodology was
employed through extensive fieldwork and observation, geographic
visualization with Geographic Information Systems (GIS) and Global
Positioning Systems (GPS), and data mining. The paper discusses
emerging geographic visualization and data mining tools and
methodologies that can be used to generate baseline data for
environmental initiatives such as urban renewal and rejuvenation.
The study was able to demonstrate that crime hotspots can be computed and were seen to occur in some select places in the Central Business District (CBD) of Baguio City. It was observed that some characteristics of the hotspot places' physical design and milieu may play an important role in creating opportunities for crime. A list
of these environmental attributes was generated. This derived
information may be used to guide the design or redesign of the urban
environment of the City to be able to reduce crime and at the same
time improve it physically.
Abstract: Speedups from mapping four real-life DSP applications on an embedded system-on-chip that couples coarse-grained reconfigurable logic with an instruction-set processor are
presented. The reconfigurable logic is realized by a 2-Dimensional
Array of Processing Elements. A design flow for improving an application's performance is proposed. Critical software parts, called kernels, are accelerated on the Coarse-Grained Reconfigurable Array. The kernels are detected by profiling the source code. For mapping the detected kernels on the reconfigurable logic, a priority-based mapping algorithm has been developed. Two 4x4 array
architectures, which differ in their interconnection structure among
the Processing Elements, are considered. Experiments for eight different instances of a generic system show that significant overall application speedups are achieved for the four applications. The performance improvements range from 1.86 to 3.67, with an average value of 2.53, compared with an all-software execution. These speedups are quite close to the maximum theoretical speedups imposed by Amdahl's law.
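The Amdahl's-law bound mentioned above can be made explicit: if a fraction f of the execution time is accelerated by a kernel speedup s, the overall speedup is 1/((1-f) + f/s), approaching 1/(1-f) as s grows. The sketch below is a generic illustration; the example fractions are not the paper's measured values.

```python
def amdahl_speedup(accelerated_fraction, kernel_speedup):
    """Overall speedup when a fraction of runtime is accelerated."""
    f, s = accelerated_fraction, kernel_speedup
    return 1.0 / ((1.0 - f) + f / s)

def amdahl_limit(accelerated_fraction):
    """Upper bound as the kernel speedup tends to infinity."""
    return 1.0 / (1.0 - accelerated_fraction)

# e.g. kernels covering 70% of runtime, accelerated 10x on the array,
# give an overall speedup of about 2.7x, bounded above by 3.33x.
overall = amdahl_speedup(0.7, 10.0)
bound = amdahl_limit(0.7)
```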
Abstract: The challenge for software development houses in Bangladesh is to find a path that uses a minimal process rather than gigantic CMMI- or ISO-type practices and process areas. Small and medium size organizations in Bangladesh want to ensure minimum basic Software Process Improvement (SPI) in day-to-day operational activities. These basic practices should help them realize their companies' improvement goals. This paper focuses on the key issues in basic software practices for small and medium size software organizations that are unable to afford CMMI, ISO, ITIL, etc. compliance certifications. This research also suggests a basic software process practice model for Bangladesh and shows the mapping of our suggestions to international best practice. In this competitive IT world, small and medium size software companies require collaboration and strengthening to transform their current perspective into the inseparable global IT scenario. This research performed investigations and analysis of several projects' life cycles, current good practice, effective approaches, reality, and the pain areas of practitioners. We did some reasoning, root cause analysis, and comparative analysis of various approaches, methods and practices, with justifications of CMMI against real life. We avoided reinventing the wheel; our focus is on a minimal practice that will ensure dignified satisfaction between organizations and software customers.
Abstract: Schema matching plays a key role in many different
applications, such as schema integration, data integration, data
warehousing, data transformation, E-commerce, peer-to-peer data
management, ontology matching and integration, semantic Web,
semantic query processing, etc. Manual matching is expensive and error-prone, so it is important to develop techniques to automate the schema matching process. In this paper, we present a solution for the automated XML schema matching problem which produces semantic mappings between corresponding schema elements of given source and target schemas. This solution contributes to solving the automated XML schema matching problem more comprehensively and efficiently. Our solution is based on combining linguistic similarity, data type compatibility and structural similarity of XML schema elements. After describing our solution,
we present experimental results that demonstrate the effectiveness of
this approach.
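The combination of the three similarity measures can be sketched as a weighted sum per element pair, with the best-scoring target element above a threshold chosen as the mapping. The weights and threshold below are illustrative assumptions, not the paper's tuned values.

```python
def combined_similarity(linguistic, datatype, structural,
                        weights=(0.4, 0.2, 0.4)):
    """Weighted combination of element-level similarities, each in [0, 1].
    With weights summing to 1, the result also stays in [0, 1]."""
    w_l, w_d, w_s = weights
    return w_l * linguistic + w_d * datatype + w_s * structural

def best_match(target_elems, sims, threshold=0.5):
    """Map to the highest-scoring target element, or None if no score
    reaches the threshold. sims[t] = (linguistic, datatype, structural)."""
    score, target = max((combined_similarity(*sims[t]), t)
                        for t in target_elems)
    return target if score >= threshold else None
```

In a full matcher this per-pair score would feed a global assignment step (e.g. stable marriage or maximum-weight matching) rather than an independent per-element argmax.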
Abstract: Hazardous Material transportation by road is coupled
with inherent risk of accidents causing loss of lives, grievous injuries,
property losses and environmental damage. The most common type of hazmat road accident is the release (78%) of hazardous substances, followed by fires (28%), explosions (14%) and vapour/gas clouds (6%). The paper initially discusses the probable 'Impact Zones' likely to be caused by one flammable (LPG) and one toxic (ethylene oxide) chemical being transported through a sizable segment of a State Highway connecting three notified Industrial zones in Surat district in Western India, housing 26 MAH industrial units. Three
'hotspots' were identified along the highway segment depending on
the particular chemical traffic and the population distribution within 500 meters on either side. The thermal radiation and explosion
overpressure have been calculated for LPG / Ethylene Oxide BLEVE
scenarios along with toxic release scenario for ethylene oxide.
Besides, the dispersion calculations for ethylene oxide toxic release
have been made for each 'hotspot' location and the impact zones
have been mapped for the LOC concentrations. Subsequently, the
maximum Initial Isolation and the protective zones were calculated based on the ERPG-3 and ERPG-2 values of ethylene oxide respectively, estimated taking the worst-case scenario under the worst weather conditions. The data analysis will be helpful to the local
administration in capacity building with respect to rescue /
evacuation and medical preparedness and quantitative inputs to
augment the District Offsite Emergency Plan document.
Abstract: To evaluate the genetic variation of wheat (Triticum aestivum) affected by heat and drought stress in eight Australian wheat genotypes that are parents of Doubled Haploid (DH) mapping populations at the vegetative stage, a water stress experiment was conducted at 65% field capacity in a growth room. The heat stress experiment was conducted in the research field under irrigation over summer. Results show that water stress decreased dry shoot weight
and RWC but increased osmolarity and means of Fv/Fm values in all
varieties except for Krichauff. Krichauff and Kukri had the
maximum RWC under drought stress. The Trident variety showed the maximum WUE, osmolarity (610 mM/kg), dry matter, quantum yield and Fv/Fm (0.815) under water stress conditions. However, the recovery of quantum yield was apparent between 4 and 7 days after stress in all varieties. Nevertheless, an increase in water stress after that led to a strong decrease in quantum yield. There was a genetic
variation for leaf pigments content among varieties under heat stress.
Heat stress significantly decreased the total chlorophyll content as measured by SPAD. Krichauff had the maximum value of anthocyanin
content (2.978 A/g FW), chlorophyll a+b (2.001 mg/g FW) and
chlorophyll a (1.502 mg/g FW). Maximum value of chlorophyll b
(0.515 mg/g FW) and Carotenoids (0.234 mg/g FW) content
belonged to Kukri. The quantum yield of all varieties decreased significantly when the temperature increased from 28 °C to 36 °C over the 6 days. However, the recovery of quantum yield was apparent after the 8th day in all varieties. The maximum decrease and recovery in quantum yield was observed in Krichauff. Drought- and heat-tolerant and moderately tolerant wheat genotypes included Trident, Krichauff, Kukri and RAC875. Molineux, Berkut and Excalibur were clustered into the most sensitive and moderately sensitive genotypes. Finally, the results show that there was significant genetic variation among the eight varieties that were
studied under heat and water stress.
Abstract: Gasoline octane number is the standard measure of the anti-knock properties of a motor fuel. Platforming is one of the important unit operations in oil refineries, and the octane number can be determined by online measurement or by using CFR (Cooperative Fuel Research) engines. Online measurement of the octane number can be done with direct octane number analyzers, but these are too expensive, so we have to find a feasible alternative, such as ANFIS estimators. ANFIS is a system in which a neural network is incorporated into a fuzzy system, exploiting data automatically through the learning algorithms of NNs. ANFIS constructs an input-output mapping based both on human knowledge and on generated input-output data pairs.
In this research, 31 industrial data sets are used (21 for training and the rest for generalization). Results show that, according to this simulation, the hybrid training algorithm in ANFIS yields good agreement between industrial data and simulated results.
Abstract: Intelligent systems based on machine learning techniques, such as classification and clustering, are gaining widespread popularity in real-world applications. This paper presents work on
developing a software system for predicting crop yield, for example
oil-palm yield, from climate and plantation data. At the core of our
system is a method for unsupervised partitioning of data for finding
spatio-temporal patterns in climate data using kernel methods which
offer strength to deal with complex data. This work gets inspiration
from the notion that a non-linear data transformation into some high
dimensional feature space increases the possibility of linear
separability of the patterns in the transformed space. Therefore, it
simplifies exploration of the associated structure in the data. Kernel
methods implicitly perform a non-linear mapping of the input data
into a high dimensional feature space by replacing the inner products
with an appropriate positive definite function. In this paper we
present a robust weighted kernel k-means algorithm incorporating
spatial constraints for clustering the data. The proposed algorithm
can effectively handle noise, outliers and auto-correlation in the
spatial data, for effective and efficient data analysis by exploring
patterns and structures in the data, and thus can be used for
predicting oil-palm yield by analyzing various factors affecting the
yield.
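The core of kernel k-means can be sketched as follows, under the assumptions of an RBF kernel and no spatial-constraint term (the paper's robust, spatially constrained variant adds further machinery): each point is assigned to the cluster minimizing its feature-space distance, computed purely from kernel evaluations, never from explicit centroids.

```python
import math

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian (RBF) kernel between two points."""
    return math.exp(-gamma * sum((x - y) ** 2 for x, y in zip(a, b)))

def weighted_kernel_kmeans(points, weights, k, gamma=1.0, iters=50):
    """Minimal weighted kernel k-means (no spatial constraints)."""
    n = len(points)
    K = [[rbf_kernel(points[i], points[j], gamma) for j in range(n)]
         for i in range(n)]
    labels = [i % k for i in range(n)]  # simple deterministic init
    for _ in range(iters):
        members = [[j for j in range(n) if labels[j] == c] for c in range(k)]
        W = [sum(weights[j] for j in members[c]) for c in range(k)]
        # Cluster self-term: sum_{j,l in c} w_j w_l K[j][l] / W_c^2
        self_term = [
            sum(weights[j] * weights[l] * K[j][l]
                for j in members[c] for l in members[c]) / (W[c] ** 2)
            if W[c] > 0 else 0.0
            for c in range(k)
        ]
        changed = False
        for i in range(n):
            best_c, best_d = labels[i], float("inf")
            for c in range(k):
                if W[c] == 0:
                    continue  # never assign to an empty cluster
                cross = sum(weights[j] * K[i][j] for j in members[c]) / W[c]
                # Squared feature-space distance to the cluster mean
                d = K[i][i] - 2.0 * cross + self_term[c]
                if d < best_d:
                    best_d, best_c = d, c
            if best_c != labels[i]:
                labels[i], changed = best_c, True
        if not changed:
            break
    return labels
```

The weights are where robustness enters: downweighting noisy or outlying points shrinks their pull on the implicit cluster means.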
Abstract: A self tuning PID control strategy using reinforcement
learning is proposed in this paper to deal with the control of wind
energy conversion systems (WECS). Actor-Critic learning is used to
tune PID parameters in an adaptive way by effectively taking advantage of the model-free and on-line learning properties of reinforcement learning. In order to reduce the demand for storage space and to
improve the learning efficiency, a single RBF neural network is used
to approximate the policy function of Actor and the value function of
Critic simultaneously. The inputs of RBF network are the system
error, as well as the first and the second-order differences of error.
The Actor can realize the mapping from the system state to PID
parameters, while the Critic evaluates the outputs of the Actor and
produces TD error. Based on TD error performance index and
gradient descent method, the updating rules of the RBF kernel function and network weights are given. Simulation results show that the proposed controller is efficient for WECS and is highly adaptable and strongly robust, performing better than a conventional PID controller.
Abstract: The photonic component industry is a highly
innovative industry with a large value chain. In order to ensure the growth of the industry, much effort must be devoted to roadmapping activities. In such activities, demand and price evolution forecasting tools can prove quite useful in helping the roadmap refinement and update process. This paper attempts to provide useful guidelines for roadmapping of optical components and considers two
models based on diffusion theory and the extended learning curve for
demand and price evolution forecasting.
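The two model families named above can be sketched as follows: the Bass diffusion model gives cumulative demand over time, and the experience (learning) curve ties price decline to cumulative production volume. The parameter values in the example are illustrative assumptions, not fitted market data.

```python
import math

def bass_cumulative(t, m, p, q):
    """Bass diffusion model: cumulative adoptions by time t.
    m: market potential, p: innovation coeff., q: imitation coeff."""
    e = math.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

def learning_curve_price(initial_price, cumulative_volume, learning_rate=0.8):
    """Experience curve: price falls by (1 - learning_rate) for every
    doubling of cumulative volume (volume >= 1)."""
    b = -math.log2(learning_rate)  # elasticity of the power law
    return initial_price * cumulative_volume ** (-b)
```

Per-period demand follows as bass_cumulative(t, ...) - bass_cumulative(t - 1, ...), and feeding that cumulative volume into the learning curve couples the two forecasts, which is the combination a roadmap update would exploit.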