Abstract: This paper presents a new approach to intelligent agent communication based on a community ontology. The DARPA Agent Markup Language (DAML) is used to build the community ontology. This paper extends the agent management specification of the Foundation for Intelligent Physical Agents (FIPA) to develop an agent role called community facilitator (CF), which manages the community directory and community ontology. The CF helps build the agent community, so that agent services in the community can be described precisely, which facilitates agent communication. Furthermore, through ontology updates, agents with different ontologies are able to communicate with each other. An example of an advanced traveler information system is included to illustrate the practicality of this approach.
Abstract: Quality of Service (QoS) routing aims to find paths between source and destination that satisfy the QoS requirements while using the network resources and the underlying routing algorithm efficiently, and to find low-cost paths that satisfy given QoS constraints. One of the key issues in providing end-to-end QoS guarantees in packet networks is determining a feasible path that satisfies a number of QoS constraints. We present an Optimized Multi-Constrained Routing (OMCR) algorithm for the computation of constrained paths for QoS routing in computer networks. OMCR applies distance-vector routing to construct a shortest path to each destination with reference to a given optimization metric, from which a set of feasible paths is derived at each node. OMCR is able to find feasible paths as well as optimize the utilization of network resources. OMCR operates with the hop-by-hop, connectionless routing model of the IP Internet and does not create any loops while finding the feasible paths. Nodes running OMCR need not maintain a global view of the network state (such as topology and resource information), and routing updates are sent only to neighboring nodes, whereas the counterpart link-state routing method depends on complete network state for constrained path computation, which incurs excessive communication overhead.
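The distance-vector construction with a feasibility check can be sketched as follows. This is an illustrative toy, not the authors' OMCR: the graph, the cost/delay metrics and the delay bound are all assumptions.

```python
# Toy distance-vector style computation: Bellman-Ford relaxation on a
# cost metric while accumulating an additive delay, then a feasibility
# check of the resulting min-cost path against a delay bound.

def bellman_ford(edges, nodes, src):
    """Relax every edge |V|-1 times; track (cost, delay, predecessor)."""
    dist = {v: (float("inf"), float("inf"), None) for v in nodes}
    dist[src] = (0.0, 0.0, None)
    for _ in range(len(nodes) - 1):
        for u, v, cost, delay in edges:
            c, d, _ = dist[u]
            if c + cost < dist[v][0]:
                dist[v] = (c + cost, d + delay, u)
    return dist

def feasible_path(dist, dst, max_delay):
    """Return the min-cost path to dst if its delay meets the bound."""
    if dist[dst][1] > max_delay:
        return None
    path, v = [], dst
    while v is not None:
        path.append(v)
        v = dist[v][2]
    return path[::-1]

# Toy 4-node network, edges as (u, v, cost, delay):
edges = [("A", "B", 1, 5), ("B", "D", 1, 5), ("A", "C", 2, 2), ("C", "D", 2, 2)]
dist = bellman_ford(edges, {"A", "B", "C", "D"}, "A")
print(feasible_path(dist, "D", 20))   # min-cost route A-B-D, delay 10 <= 20
print(feasible_path(dist, "D", 5))    # bound too tight -> None
```

Note that, as in distance-vector protocols, each relaxation step uses only per-neighbor information; no global link-state database is consulted.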
Abstract: Aerial and satellite images are information rich. They are also complex to analyze. For GIS systems, many features require fast and reliable extraction of roads and intersections. In this paper, we study efficient and reliable automatic extraction algorithms to address some difficult issues that are commonly seen in high-resolution aerial and satellite images yet not well addressed by existing solutions, such as blurring, broken or missing road boundaries, lack of road profiles, heavy shadows, and interfering surrounding objects. The new scheme is based on a new method, namely the reference circle, to properly identify the pixels that belong to the same road and use this information to recover the whole road network. This feature is invariant to the shape and direction of roads and tolerates heavy noise and disturbances. Road extraction based on reference circles is much more noise-tolerant and flexible than previous edge-detection-based algorithms. The scheme is able to extract roads reliably from images with complex contents and heavy obstructions, such as the high-resolution aerial/satellite images available from Google Maps.
Abstract: A complex statistical analysis of stresses in the concrete slab of a real rigid pavement is performed. The computational model of the pavement is designed as a spatial (3D) model based on a nonlinear variant of the finite element method that respects the structural nonlinearity, enables modeling of different arrangements of joints, and allows the entire model to be loaded thermally. The interaction of adjacent slabs in the joints and the contact between the slab and the underlying layer are modeled with the help of special contact elements. Four concrete slabs separated by transverse and longitudinal joints, together with the additional subgrade layers and soil to a depth of about 3 m, are modeled. The thicknesses of the individual layers, the physical and mechanical properties of the materials, the characteristics of the joints, and the temperatures of the upper and lower surfaces of the slabs are assumed to be random variables. The modern simulation technique Updated Latin Hypercube Sampling with 20 simulations is used for the statistical analysis. As results, estimates of the basic statistics of the principal stresses σ1 and σ3 at 53 points on the upper and lower surfaces of the slabs are obtained.
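The stratified sampling idea behind Latin Hypercube designs can be sketched as follows. This shows only plain LHS on the unit cube; the "Updated" variant and the mapping to the pavement model's physical random variables are beyond this illustration.

```python
import random

# Plain Latin Hypercube Sampling sketch: each column gets exactly one
# sample in each of the n_samples equiprobable strata of [0, 1), with
# the order of strata shuffled independently per column.

def latin_hypercube(n_samples, n_vars, rng=random):
    cols = []
    for _ in range(n_vars):
        # one jittered sample per stratum [k/n, (k+1)/n)
        col = [(k + rng.random()) / n_samples for k in range(n_samples)]
        rng.shuffle(col)              # decouple this column from the others
        cols.append(col)
    return [list(row) for row in zip(*cols)]   # n_samples x n_vars design

# 20 simulations of 4 random input variables (e.g. layer thickness,
# material properties, joint characteristics, surface temperature):
design = latin_hypercube(20, 4)
print(sorted(int(x * 20) for x in (row[0] for row in design)))
# every stratum index 0..19 appears exactly once in each column
```

In a real study each unit-cube column would then be mapped through the inverse CDF of the corresponding input distribution.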
Abstract: This paper presents a design method for self-tuning Quantitative Feedback Theory (QFT) using an improved deadbeat control algorithm. QFT is a technique for achieving robust control with pre-defined specifications, whereas deadbeat is an algorithm that brings the output to steady state in a minimum number of steps. Nevertheless, there are usually large peaks in the deadbeat response. By integrating QFT specifications into the deadbeat algorithm, the large peaks can be kept within tolerable bounds. On the other hand, merging QFT with an adaptive element produces a robust controller with wider coverage of uncertainty. By combining the QFT-based deadbeat algorithm with an adaptive element, a superior controller called the self-tuning QFT-based deadbeat controller can be achieved, with an output response that is expected to be fast, robust and adaptive. Using a grain dryer plant model as a pilot case study, the performance of the proposed method has been evaluated and analyzed. The grain drying process is very complex, with highly nonlinear behaviour, long delay, and sensitivity to environmental changes and disturbances. Performance comparisons have been made between the proposed self-tuning QFT-based deadbeat controller and the standard QFT and standard deadbeat controllers. The efficiency of the self-tuning QFT-based deadbeat controller has been demonstrated by the test results in terms of online updating of the controller's parameters, lower percentage overshoot, and shorter settling time, especially when there are variations in the plant.
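The deadbeat idea itself can be illustrated on a first-order discrete plant. The plant, its parameters and the setpoint below are assumptions for illustration; the grain dryer model and the QFT shaping are far more involved.

```python
# Minimal deadbeat sketch for a first-order discrete plant
#     y[k+1] = a*y[k] + b*u[k]
# Deadbeat chooses u so that the output reaches the setpoint in one step.

a, b = 0.8, 0.5          # assumed plant parameters
setpoint = 1.0

def deadbeat_control(y, r, a=a, b=b):
    """Solve r = a*y + b*u for u: the output hits r at the next sample."""
    return (r - a * y) / b

y = 0.0
history = [y]
for _ in range(5):
    u = deadbeat_control(y, setpoint)
    y = a * y + b * u     # plant update
    history.append(y)
print(history)            # output settles at the setpoint after one step
```

The large control effort implied by `u = (r - a*y)/b` is exactly the kind of aggressive action whose peaks the QFT specifications are meant to bound.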
Abstract: Traffic density, an indicator of traffic conditions, is one of the most critical characteristics for Intelligent Transport Systems (ITS). This paper investigates
recursive traffic density estimation using the information
provided from inductive loop detectors. On the basis of the
phenomenological relationship between speed and density, the
existing studies incorporate a state space model and update the
density estimate using vehicular speed observations via the
extended Kalman filter, where an approximation is made
because of the linearization of the nonlinear observation
equation. In practice, this may lead to substantial estimation
errors. This paper incorporates a suitable transformation to
deal with the nonlinear observation equation so that the
approximation is avoided when using the Kalman filter to estimate the traffic density. A numerical study is conducted. It
is shown that the developed method outperforms the existing
methods for traffic density estimation.
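The transformation idea can be sketched for one common speed-density law. Underwood's exponential model, and all numbers below, are assumptions for illustration; the paper's actual model and parameters may differ. Taking z = -km*ln(v/vf) turns the nonlinear speed observation into a linear observation of density, so the plain Kalman filter applies without any linearization.

```python
import math

vf, km = 100.0, 50.0   # assumed free-flow speed (km/h) and critical density
Q, R = 1.0, 4.0        # process and (transformed) observation noise variances

def kf_step(k_est, P, v_obs):
    """One Kalman predict/update cycle with random-walk density dynamics.
    The speed observation is transformed into a linear density observation,
    so no extended Kalman filter (and no linearization error) is needed."""
    k_pred, P_pred = k_est, P + Q        # predict: identity dynamics
    z = -km * math.log(v_obs / vf)       # invert v = vf*exp(-k/km)
    K = P_pred / (P_pred + R)            # Kalman gain with H = 1
    return k_pred + K * (z - k_pred), (1.0 - K) * P_pred

k_est, P = 20.0, 10.0                    # prior density (veh/km) and variance
for v in [60.0, 55.0, 50.0]:             # speed observations from the detector
    k_est, P = kf_step(k_est, P, v)
print(round(k_est, 1), round(P, 2))      # posterior density estimate
```

Because the observation becomes exactly linear after the transformation, the filter is optimal for this model rather than a first-order approximation.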
Abstract: This paper presents the fundamentals of origami engineering and its applications in present-day as well as future industry. Several core mathematical approaches, such as the Huzita–Hatori axioms and Maekawa's and Kawasaki's theorems, are introduced briefly. Meanwhile, flap and circle packing by Robert Lang is explained to convey the underlying principles of designing a crease pattern. Rigid origami and its corrugation patterns, which are potentially applicable to creating transformable or temporary spaces, are discussed to show the transition of origami from paper to thick materials. Moreover, some innovative applications of origami, such as eyeglasses, the origami stent, and high-tech origami based on the aforementioned theories and principles, are showcased in Section III, while some up-to-date origami technologies such as Vacuumatics, self-folding of polymer sheets, and programmable matter folding, which could greatly enhance origami structures, are demonstrated in Section IV to offer more insight into the future of origami.
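Maekawa's and Kawasaki's theorems can be checked mechanically at a single crease-pattern vertex. The crease assignments and angles below are illustrative examples, not from any particular design.

```python
# Flat-foldability checks at one interior vertex of a crease pattern.

def maekawa_ok(creases):
    """Maekawa's theorem: #mountain - #valley = +-2 at a flat-foldable vertex."""
    m = creases.count("M")
    v = creases.count("V")
    return abs(m - v) == 2

def kawasaki_ok(angles, tol=1e-9):
    """Kawasaki's theorem: the two alternating angle sums around a
    flat-foldable vertex are equal (each 180 degrees)."""
    return abs(sum(angles[0::2]) - sum(angles[1::2])) < tol

# A classic flat-foldable degree-4 vertex: three mountains, one valley,
# with sector angles 100 + 80 = 80 + 100 = 180 degrees.
creases = ["M", "M", "M", "V"]
angles = [100.0, 80.0, 80.0, 100.0]
print(maekawa_ok(creases) and kawasaki_ok(angles))   # True
```

Both conditions are necessary (not sufficient) for flat foldability, which is why crease-pattern design tools apply them as quick local filters.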
Abstract: This study applies Geo-Informatic technology to land tenure and land use in an economic crop area, in order to create sustainable land use, access to the area, and a sustainable food supply meeting the demand of the people in the community. The research objectives are to 1)
apply Geo-Informatic Technology on land ownership and agricultural
land use (cash crops) in the research area, 2) create a GIS database on land ownership and land use, 3) create a database for an online Geo-information system on land tenure and land use. The results of this
study reveal that, first, the study area is on steep slopes, mountains and valleys. The land is mainly in the forest zone, which was included in
the Forest Act 1941 and National Conserved Forest 1964. Residents
gained the rights to exploit the land passed down from their
ancestors. The practice was recognized by communities. The land
was suitable for cultivating a wide variety of economic crops that was
the main income of the family. At present the local residents keep
expanding the land to grow cash crops. Second, the geographic information system database created consisted of the area range,
announcement from the Interior Ministry, interpretation of satellite
images, transportation routes, waterways, plots of land with a title
deed available at the provincial land office. Most pieces of land
without a title deed are located in the forest and national reserve
areas. Data were created from a field study, with land zones determined by GPS. Last, an online Geo-Informatic System can
show the information of land tenure and land use of each economic
crop. High-resolution satellite data can be updated and checked on the online Geo-Informatic System simultaneously.
Abstract: Hierarchical Mobile IPv6 (HMIPv6) was designed to
support IP micro-mobility management in the Next Generation
Networks (NGN) framework. The main design principle behind this protocol is the use of a Mobility Anchor Point (MAP), which can be located at any router level in the network, to support hierarchical mobility management. However, the distance-based MAP selection in HMIPv6 causes MAP overload and increasingly frequent binding updates as the network grows. Therefore, to address this issue in designing a MAP selection scheme, we propose a dynamic load control mechanism integrated with a speed detection mechanism (DMS-DLC). The experimental results show that the proposed scheme gives a better distribution of MAP load and increases handover speed.
Abstract: The majority of existing predictors for time series are
model-dependent and therefore require some prior knowledge for the
identification of complex systems, usually involving system
identification, extensive training, or online adaptation in the case of
time-varying systems. Additionally, since a time series is usually
generated by complex processes such as the stock market or other
chaotic systems, identification, modeling or the online updating of
parameters can be problematic. In this paper a model-free predictor
(MFP) for a time series produced by an unknown nonlinear system or
process is derived using tracking theory. An identical derivation of the
MFP using the property of the Newton form of the interpolating
polynomial is also presented. The MFP is able to accurately predict
future values of a time series, is stable, has few tuning parameters and
is desirable for engineering applications due to its simplicity, fast
prediction speed and extremely low computational load. The
performance of the proposed MFP is demonstrated using the
prediction of the Dow Jones Industrial Average stock index.
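A minimal one-step predictor built from the Newton form of the interpolating polynomial can be sketched as follows. This is plain polynomial extrapolation under the assumption that the n-th forward difference vanishes, an illustration of the underlying idea rather than the authors' full MFP, and the series used is synthetic.

```python
from math import comb

def predict_next(window):
    """Extrapolate one step ahead from the last n equally spaced samples
    by setting the n-th forward difference to zero:
        x[k+1] = sum_{i=1..n} (-1)^(i+1) * C(n, i) * x[k+1-i]
    (e.g. n = 2 gives linear extrapolation x[k+1] = 2*x[k] - x[k-1])."""
    n = len(window)
    return sum((-1) ** (i + 1) * comb(n, i) * window[-i] for i in range(1, n + 1))

# A quadratic series is predicted exactly from a window of 3 points:
series = [t * t for t in range(6)]          # 0, 1, 4, 9, 16, 25
print(predict_next(series[2:5]))            # window [4, 9, 16] -> 25
```

The predictor needs no model identification or training: only the last n samples, which is consistent with the low computational load the abstract emphasizes.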
Abstract: This research investigates the design of a low-cost 3D
spatial interaction approach using the Wii Remote for immersive
Head-Mounted Display (HMD) virtual reality. Current virtual reality
applications that incorporate the Wii Remote are either desktop
virtual reality applications or systems that use large screen displays.
However, the requirements of an HMD virtual reality system differ from those of such systems. This is mainly because in HMD virtual reality,
the display screen does not remain at a fixed location. The user views
the virtual environment through display screens that are in front of
the user's eyes, and when the user moves his/her head, these screens move as well. This means that the display has to be updated in real time based on where the user is currently looking. Normal usage of
the Wii Remote requires the controller to be pointed in a certain
direction, typically towards the display. This is too restrictive for
HMD virtual reality systems that ideally require the user to be able to
turn around in the virtual environment. Previous work proposed a design to achieve this; however, it suffered from a number of drawbacks. The aim of this study is to investigate a suitable method of
using the Wii Remote for 3D interaction in a space around the user
for HMD virtual reality. This paper presents an overview of issues
that had to be considered, the system design as well as experimental
results.
Abstract: The main aim of this paper is to analyse how corporate web pages can become an essential tool for detecting the strategic trends of firms or sectors, and even a primary source for benchmarking. This technique has made it possible to identify the key
issues in the strategic management of the most excellent large Spanish
firms and also to describe trends in their long-range planning, a way of
working that can be generalised to any country or firm group. More
precisely, two objectives were sought. The first one consisted in showing
the way in which corporate websites make it possible to obtain direct
information about the strategic variables which can define firms. This
tool is dynamic (since web pages are constantly updated) as well as
direct and reliable, since the information comes from the firm itself, not
from comments of third parties (such as journalists, academicians,
consultants...). When this information is analysed for a group of firms,
one can observe their characteristics in terms of both managerial tasks
and business management. As for the second objective, the methodology
proposed served to describe the corporate profile of the large Spanish
enterprises included in the Ibex35 (the Ibex35 or Iberia Index is the
reference index in the Spanish Stock Exchange and gathers periodically
the 35 most outstanding Spanish firms). An attempt is therefore made to
define the long-range planning that would be characteristic of the largest
Spanish firms.
Abstract: As the world changes ever more rapidly, the demand for updated information for resource management, environment monitoring and planning is increasing exponentially. The integration of remote sensing with GIS technology will significantly improve the ability to address these concerns. This paper presents an alternative way of updating GIS applications using image processing and high-resolution images. We show a method of high-resolution image segmentation using graphs and morphological operations, where a preprocessing step (the watershed operation) is required. A morphological process is then applied using the opening and closing operations. After this segmentation we can extract significant cartographic elements such as urban areas, streets or green areas. The result of this segmentation and extraction is then used to update GIS applications. Some examples are shown using aerial photography.
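The morphological opening and closing operations can be sketched on a tiny binary image. This is pure Python with a 3x3 cross structuring element and zero padding, an illustration only; a real pipeline would use an image-processing library, and the watershed preprocessing is omitted.

```python
def get(img, i, j):
    """Pixel value with zero padding outside the image."""
    return img[i][j] if 0 <= i < len(img) and 0 <= j < len(img[0]) else 0

CROSS = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]  # 3x3 cross element

def dilate(img):
    return [[max(get(img, i + di, j + dj) for di, dj in CROSS)
             for j in range(len(img[0]))] for i in range(len(img))]

def erode(img):
    return [[min(get(img, i + di, j + dj) for di, dj in CROSS)
             for j in range(len(img[0]))] for i in range(len(img))]

def opening(img):   # erosion then dilation: removes small speckles
    return dilate(erode(img))

def closing(img):   # dilation then erosion: fills small holes and gaps
    return erode(dilate(img))

# 7x7 test image: a solid block with a one-pixel hole, plus one speckle.
img = [[0] * 7 for _ in range(7)]
for i in range(1, 6):
    for j in range(1, 6):
        img[i][j] = 1
img[3][3] = 0        # hole inside the block
img[0][6] = 1        # isolated speckle

print("hole filled by closing:", closing(img)[3][3])     # 1
print("speckle removed by opening:", opening(img)[0][6]) # 0
```

Chaining closing then opening in this way suppresses both kinds of noise before cartographic elements are extracted.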
Abstract: The choice of which finite element to use in order to predict the nonlinear static or dynamic response of complex structures is an important factor. The main goal of this research work is therefore to study the effect of the in-plane rotational degrees of freedom in linear and geometrically nonlinear static and dynamic analysis of thin shell structures by flat shell finite elements. To this end: first, simple triangular and quadrilateral flat shell finite elements are implemented in an incremental formulation based on the updated Lagrangian corotational description for geometrically nonlinear analysis. The triangular element is a combination of the DKT and CST elements, while the quadrilateral is a combination of the DKQ and the bilinear quadrilateral membrane elements. In both elements, the sixth degree of freedom is handled by introducing a fictitious stiffness. Secondly, in the same code, the sixth degree of freedom in these elements is handled differently, with the in-plane rotational d.o.f. treated as an effective d.o.f. in the in-plane field interpolation. Our goal is to compare the resulting shell elements. Third, the analysis is extended to linear dynamic analysis by direct integration using Newmark's implicit method. Finally, the linear dynamic analysis is extended to geometrically nonlinear dynamic analysis, where Newmark's method is used to integrate the equations of motion and the Newton-Raphson method is employed to iterate within each time step increment until equilibrium is achieved. The obtained results demonstrate the effectiveness and robustness of the interpolation of the in-plane rotational d.o.f. and reveal deficiencies of using a fictitious stiffness in linear and nonlinear dynamic analysis.
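Newmark's implicit method can be sketched for a single degree of freedom (average acceleration variant, beta = 1/4, gamma = 1/2). The oscillator and load below are illustrative assumptions; the paper applies the same scheme to the assembled shell system matrices.

```python
import math

def newmark_sdof(m, c, k, f, u0, v0, dt, n, beta=0.25, gamma=0.5):
    """Newmark implicit integration of m*u'' + c*u' + k*u = f(t) for one
    degree of freedom, using the standard effective-stiffness form."""
    u, v = u0, v0
    a = (f(0.0) - c * v - k * u) / m                  # initial acceleration
    keff = m / (beta * dt ** 2) + gamma * c / (beta * dt) + k
    out = [u]
    for i in range(1, n + 1):
        feff = (f(i * dt)
                + m * (u / (beta * dt ** 2) + v / (beta * dt)
                       + (1.0 / (2.0 * beta) - 1.0) * a)
                + c * (gamma * u / (beta * dt) + (gamma / beta - 1.0) * v
                       + dt * (gamma / (2.0 * beta) - 1.0) * a))
        u_new = feff / keff                            # implicit solve
        v_new = ((gamma / (beta * dt)) * (u_new - u)
                 + (1.0 - gamma / beta) * v
                 + dt * (1.0 - gamma / (2.0 * beta)) * a)
        a = ((u_new - u) / (beta * dt ** 2) - v / (beta * dt)
             - (1.0 / (2.0 * beta) - 1.0) * a)
        u, v = u_new, v_new
        out.append(u)
    return out

# Free vibration of an undamped oscillator with natural period T = 1 s;
# the exact solution is u(t) = cos(2*pi*t), so u(1) should be close to 1.
m, k = 1.0, (2.0 * math.pi) ** 2
u = newmark_sdof(m, 0.0, k, lambda t: 0.0, 1.0, 0.0, 0.01, 100)
print(round(u[-1], 3))
```

With beta = 1/4 and gamma = 1/2 the scheme is unconditionally stable and introduces no numerical damping, only a small period elongation, which is why the amplitude is preserved over the cycle.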
Abstract: Information retrieval has the objective of studying models and building systems that allow a user to find the documents relevant to his or her information need. Information search remains a difficult problem because of the difficulty of representing and processing natural-language phenomena such as polysemy. Intentional structures promise to be a new paradigm for extending existing document structures and enhancing the different phases of document processing, such as creation, editing, search and retrieval. Recognizing the intentions of the authors of texts can reduce the scale of this problem. In this article, we present an intention recognition system based on a semi-automatic method for extracting intentional information from a corpus of texts. This system is also able to update the ontology of intentions in order to enrich the knowledge base containing all possible intentions of a domain. This approach relies on the construction of a semi-formal ontology, which is considered as the conceptualization of the intentional information contained in a text. Experiments on scientific publications in the field of computer science were conducted to validate this approach.
Abstract: Automatic methods of detecting changes through
satellite imaging are the object of growing interest, especially
because of numerous applications linked to analysis of the Earth’s surface or the environment (monitoring vegetation, updating maps, risk management, etc.). This work implemented spatial analysis
techniques by using images with different spatial and spectral
resolutions on different dates. The work was based on the principle
of control charts in order to set the upper and lower limits beyond
which a change would be noted. Later, the a contrario approach was
used. This was done by testing different thresholds for which the
difference calculated between two pixels was significant. Finally,
labeled images were considered, giving a particularly low difference
which meant that the number of “false changes” could be estimated
according to a given limit.
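The control-chart step can be sketched on a one-dimensional toy "image" pair: the pixelwise difference is computed, and change is flagged wherever it falls outside mean ± 3σ limits. All data below are synthetic illustrations.

```python
import math

def change_mask(img1, img2, n_sigma=3.0):
    """Flag pixels whose difference lies outside the control limits
    mean +- n_sigma * std of the difference image."""
    diff = [b - a for a, b in zip(img1, img2)]
    mu = sum(diff) / len(diff)
    sigma = math.sqrt(sum((d - mu) ** 2 for d in diff) / len(diff))
    upper, lower = mu + n_sigma * sigma, mu - n_sigma * sigma
    return [not (lower <= d <= upper) for d in diff]

# 20 stable pixels with small radiometric noise, one genuinely changed:
img1 = [100.0] * 20
img2 = [100.0 + (-1) ** i * 0.5 for i in range(20)]    # noise only
img2[7] = 160.0                                         # a real change
print([i for i, c in enumerate(change_mask(img1, img2)) if c])   # [7]
```

In practice the limits would be estimated from a stable reference area (so that real changes do not inflate sigma), and the subsequent a contrario step would replace the fixed 3σ threshold.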
Abstract: If an organization like Mellat Bank wants to understand its customer market completely in order to reach its specified goals, it can segment the market so as to offer the right product package to the right segment. Our objective is to offer a segmentation model for the Iranian banking market from Mellat Bank's point of view. The methodology of this project combines "segmentation on the basis of four quality variables" and "segmentation on the basis of difference in means". The required data are gathered from e-systems and the researcher's personal observation. Finally, the research recommends that the organization first form a four-dimensional matrix with 756 segments using four variables, namely value-based, behavioral, activity style, and activity level, and at the second step calculate the mean profit for every cell of the matrix at two distinct working levels (level α1: normal conditions and level α2: high-pressure conditions) and compare the segments by checking two conditions, namely 1) homogeneity of every segment with its sub-segments and 2) heterogeneity with other segments, so that the necessary segmentation process can be carried out. The last recommendation (further explained through an operational example and a feedback algorithm) is to test and update the model, given the dynamic environment, technology, and banking system.
Abstract: Wireless mobile communications have experienced phenomenal growth over the last decades. The advances in wireless mobile technologies have brought about a demand for high-quality multimedia applications and services. For such applications and services to work, a signaling protocol is required for establishing, maintaining and tearing down multimedia sessions. The Session Initiation Protocol (SIP) is an application-layer signaling protocol based on a request/response transaction model. This paper considers the SIP INVITE transaction over an unreliable medium, since it has recently been modified in Request for Comments (RFC) 6026. In order to help assure that the functional correctness of this modification is achieved, the SIP INVITE transaction is modeled and analyzed using Colored Petri Nets (CPNs). Based on the model analysis, it is concluded that the SIP INVITE transaction is free of livelocks and dead code, while at the same time it has both desirable and undesirable deadlocks. Therefore, the SIP INVITE transaction should be subjected to additional updates in order to eliminate the undesirable deadlocks. In order to reduce the cost of implementing and maintaining SIP, additional remodeling of the SIP INVITE transaction is recommended.
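On a much smaller scale than a CPN tool, the deadlock classification can be illustrated by reachability analysis of a toy transition system. The states and transitions below are hypothetical simplifications for illustration only, not the RFC 6026 INVITE state machine.

```python
from collections import deque

# Hypothetical, much-simplified transaction states and transitions:
TRANSITIONS = {
    "calling":    ["proceeding", "timeout"],
    "proceeding": ["completed", "timeout"],
    "completed":  ["terminated"],
    "terminated": [],                # intended final state
    "timeout":    [],                # dead state reached on timer expiry
}

def terminal_states(transitions, start):
    """BFS over the reachable state space; a state with no successor is a
    deadlock. Whether it is desirable depends on whether it is intended."""
    seen, queue, dead = {start}, deque([start]), []
    while queue:
        s = queue.popleft()
        succ = transitions.get(s, [])
        if not succ:
            dead.append(s)
        for t in succ:
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return sorted(dead)

desired = {"terminated"}
dead = terminal_states(TRANSITIONS, "calling")
print("deadlocks:", dead)
print("undesirable:", sorted(set(dead) - desired))
```

A CPN analyzer performs the same kind of reachability search over markings of the net, with the state space given by the token distributions rather than an explicit dictionary.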
Abstract: In order to optimize annual IT spending and to reduce the complexity of entire system architectures, SOA trials have been started. It is common knowledge that to design an SOA system we have to adopt a top-down approach, but in reality silo systems are being built, so these companies cannot reuse newly designed services and cannot enjoy SOA's economic benefits. To prevent this situation, we designed a generic SOA development process referred to as the architecture of "mass customization."
To define the generic detailed development processes, we carried out a case study on an imaginary company. Through the case study, we were able to define practical development processes and found that this could vastly reduce update and development costs.
Abstract: A number of routing algorithms based on the learning automata technique have been proposed for communication networks. However, there has been little work on the effects of variation in graph sparsity on the performance of these algorithms. In this paper, a comprehensive study is launched to investigate the performance of LASPA, the first learning-automata-based solution to dynamic shortest path routing, across different graph structures with varying sparsity. The sensitivity of three main performance parameters of the algorithm, namely the average number of processed nodes, the number of scanned edges and the average time per update, to variation in graph sparsity is reported. Simulation results indicate that the LASPA algorithm adapts well to sparsity variation in the graph structure and gives much better results than the existing dynamic and fixed algorithms in terms of the performance criteria.