Abstract: Wireless Sensor Networks (WSNs) have a wide variety
of applications and offer vast future potential. Nodes in
WSNs are prone to failure due to energy depletion, hardware failure,
communication link errors, malicious attacks, and so on. Therefore,
fault tolerance is one of the critical issues in WSNs. We study how
fault tolerance is addressed in different applications of WSNs. Fault
tolerant routing is a critical task for sensor networks operating in
dynamic environments. Many routing, power management, and data
dissemination protocols have been specifically designed for WSNs
where energy awareness is an essential design issue. The focus,
however, has been on routing protocols, which may differ
depending on the application and network architecture.
Abstract: Flash Floods, together with landslides, are a common
natural threat for people living in mountainous regions and foothills.
One way to deal with this constant menace is the use of Early
Warning Systems, which have become a very important mitigation
strategy for natural disasters.
In this work we present our proposal for a pilot Flash Flood Early
Warning System for Santiago, Chile, the first stage of a more
ambitious project that, at a later stage, will also include early
warning of landslides.
To give a context for our approach, we first analyze three existing
Flash Flood Early Warning Systems, focusing on their general
architectures. We then present our proposed system, with main focus
on the decision support system, a system that integrates empirical
models and fuzzy expert systems to achieve reliable risk estimations.
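The abstract does not give the rule base or membership functions of the proposed decision support system, so the following is only a minimal, assumed sketch of how a fuzzy expert system can turn empirical measurements into a risk estimate. The inputs (rainfall intensity and soil saturation), the rules, and all thresholds are illustrative assumptions, not the authors' design.

```python
# Minimal fuzzy risk-estimation sketch (illustrative assumptions only).

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def flood_risk(rain_mm_h, soil_saturation):
    """Mamdani-style inference with two inputs and three risk levels."""
    # Fuzzify the inputs (assumed ranges).
    rain_low  = tri(rain_mm_h, -1, 0, 10)
    rain_high = tri(rain_mm_h, 5, 20, 40)
    sat_low   = tri(soil_saturation, -0.1, 0.0, 0.6)
    sat_high  = tri(soil_saturation, 0.4, 1.0, 1.1)

    # Rules: min for AND, max to aggregate per output level.
    low    = min(rain_low, sat_low)
    medium = max(min(rain_high, sat_low), min(rain_low, sat_high))
    high   = min(rain_high, sat_high)

    # Defuzzify by a weighted average of representative risk scores.
    num = 0.1 * low + 0.5 * medium + 0.9 * high
    den = low + medium + high
    return num / den if den else 0.0
```

The appeal of this structure for early warning is that the rule base stays readable to domain experts while still producing a graded, continuous risk score.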
Abstract: Green architecture has recently become a significant
path toward a sustainable future. Green building design involves
finding the balance between comfortable home building and a
sustainable environment. Moreover, new technologies such as
artificial intelligence techniques are used to complement current
practices in creating greener structures and keeping the built
environment more sustainable. Green buildings should be designed
to minimize the overall impact of the built environment on
ecosystems in general, and on human health and the natural
environment in particular. This leads to protecting occupant health,
improving employee productivity, reducing pollution, and
sustaining the environment. Green building design involves
multiple parameters which may be interrelated, contradictory,
vague, and of a qualitative or quantitative nature. This paper
presents a comprehensive, critical state-of-the-art review of current
practices based on fuzzy techniques and their combinations. It also
presents how green architecture and buildings can be improved
using these analysis technologies to seek optimal green solution
strategies and models that assist in making the best possible
decision among different alternatives.
Abstract: The present work describes the implementation of the
Enhanced Collaborative Optimization (ECO) multilevel architecture
with a gradient-based optimization algorithm with the aim of
performing a multidisciplinary design optimization of a generic
unmanned aerial vehicle with morphing technologies. The concepts
of weighting coefficient and dynamic compatibility parameter are
presented for the ECO architecture. A routine that calculates the
aircraft performance for the user-defined mission profile and the
vehicle’s performance requirements has been implemented using
low-fidelity models for the aerodynamics, stability, propulsion, weight, balance
and flight performance. A benchmarking case study for evaluating
the advantage of using a variable-span wing within the developed
optimization methodology is presented.
Abstract: In this paper, the problem of fault detection and
isolation in the attitude control subsystem of spacecraft formation
flying is considered. To design the fault detection method, an
extended Kalman filter, a nonlinear stochastic state estimation
method, is utilized. Three fault detection architectures, namely
centralized, decentralized, and semi-decentralized, are designed
based on extended Kalman filters. Moreover, the residual generation
and threshold selection techniques are proposed for these
architectures.
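The abstract does not give the spacecraft attitude model or the authors' exact residual test, so the sketch below substitutes a scalar nonlinear system and one common detection recipe: generate the residual (innovation) from an extended Kalman filter, normalize it by the innovation variance, and compare against a chi-square-style threshold with a short persistence check. All models and numbers are assumptions.

```python
import math
import random

# Illustrative EKF-based fault detection sketch; the scalar system below
# stands in for the (unspecified) attitude dynamics.

random.seed(0)
dt, Q, R = 0.1, 1e-4, 0.01

def f(x):          # assumed nonlinear process model
    return x + dt * (-0.5 * math.sin(x))

def F(x):          # Jacobian of f
    return 1.0 + dt * (-0.5 * math.cos(x))

x_true, x_hat, P = 1.0, 1.0, 0.1
threshold, persist, detected = 9.0, 0, None   # chi-square(1)-style cut-off

for k in range(200):
    # Simulate truth and measurement; inject a sensor bias fault at k = 100.
    x_true = f(x_true) + random.gauss(0.0, math.sqrt(Q))
    z = x_true + random.gauss(0.0, math.sqrt(R)) + (0.5 if k >= 100 else 0.0)

    # EKF prediction.
    x_pred = f(x_hat)
    P_pred = F(x_hat) * P * F(x_hat) + Q

    # Residual generation and normalized innovation test (H = 1).
    r = z - x_pred                 # residual (innovation)
    S = P_pred + R                 # innovation variance
    persist = persist + 1 if r * r / S > threshold else 0
    if persist >= 3 and detected is None:
        detected = k               # declare a fault after 3 hits in a row

    # EKF update.
    K = P_pred / S
    x_hat = x_pred + K * r
    P = (1.0 - K) * P_pred
```

In the centralized, decentralized, and semi-decentralized variants the same residual test would run, respectively, on one filter over the whole formation, on one filter per spacecraft, or on filters over overlapping subsets.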
Abstract: A Disaster Management System (DMS) is very important for countries exposed to multiple disasters, such as Chile. Around the world (and also in Chile), different disasters (earthquakes, tsunamis, volcanic eruptions, fires, and other natural or man-made disasters) occur and affect the population. It is also possible that two or more disasters occur at the same time, which means that a multi-risk situation must be mastered. To handle such a situation, a Decision Support System (DSS) based on multi-agents is a suitable architecture. The best-known DMSs are concerned with only a single disaster (sometimes the combination of earthquake and tsunami) and often with one particular disaster. Nevertheless, a DSS enables a better real-time response. Our proposal is to analyze the existing systems in the literature and extend them to multi-risk disasters in order to construct a well-organized system. The work presented here is an approach to a multi-risk system, which needs an architecture and well-defined aims. At this moment, our study is a kind of case study that analyzes the path we have to follow to create our proposed system in the future.
Abstract: The Cone Penetration Test (CPT) is a common in-situ
test which generally investigates a much greater volume of soil more
quickly than possible from sampling and laboratory tests. Therefore,
it has the potential to realize both cost savings and rapid,
continuous assessment of soil properties. The principal objective of
this paper is to demonstrate the feasibility and efficiency of using
artificial neural networks (ANNs) to predict the soil angle of internal
friction (Φ) and the soil modulus of elasticity (E) from CPT results
considering the uncertainties and non-linearities of the soil. In
addition, ANNs are used to study the influence of different
parameters and recommend which parameters should be included as
input parameters to improve the prediction. Neural networks discover
relationships in the input data sets through the iterative presentation
of the data and intrinsic mapping characteristics of neural topologies.
General Regression Neural Network (GRNN) is one of the powerful
neural network architectures which is utilized in this study. A large
amount of field and experimental data including CPT results, plate
load tests, direct shear box, grain size distribution and calculated data
of overburden pressure was obtained from a large project in the
United Arab Emirates. This data was used for the training and the
validation of the neural network. A comparison was made between
the results obtained from the ANN approach and some common
traditional correlations that predict Φ and E from CPT results, with
respect to the actual results of the collected data. The results show
that the ANN is a very powerful tool. Very good agreement was
obtained between estimated results from ANN and actual measured
results with comparison to other correlations available in the
literature. The study recommends some easily available parameters
that should be included in the estimation of the soil properties to
improve the prediction models. It is shown that the use of the friction
ratio in the estimation of Φ and the use of fines content in the
estimation of E considerably improve the prediction models.
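A GRNN, as used in the study above, is in essence a kernel-weighted average of the training targets. The CPT data set and the smoothing factor from the paper are not available here, so the toy samples and sigma below are assumptions; only the GRNN mechanism itself is standard.

```python
import math

# Minimal General Regression Neural Network (GRNN) sketch in pure Python.

def grnn_predict(x, samples, targets, sigma=0.5):
    """Predict a target (e.g. friction angle) for the input vector x."""
    weights = []
    for s in samples:
        d2 = sum((xi - si) ** 2 for xi, si in zip(x, s))
        weights.append(math.exp(-d2 / (2.0 * sigma ** 2)))
    total = sum(weights)
    # Output is the kernel-weighted average of all training targets.
    return sum(w * t for w, t in zip(weights, targets)) / total

# Hypothetical training set:
# (cone resistance qc [MPa], friction ratio [%]) -> phi [deg]
samples = [(2.0, 1.0), (5.0, 0.8), (10.0, 0.5), (20.0, 0.4)]
targets = [28.0, 32.0, 36.0, 40.0]

phi = grnn_predict((10.0, 0.5), samples, targets)
```

Because prediction is a smooth interpolation over stored samples, a GRNN needs no iterative training, which suits field data sets where new CPT soundings are added incrementally.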
Abstract: The practice of freeing monuments from subsequent
additions spans the entire history of conservation and is
traditionally connected to the aim of valorisation, both for cultural
and educational purposes and, more recently, for touristic
exploitation. Defence heritage has been widely affected by these
cultural and technical trends, from philological restoration to
critical innovations. A renewed critical analysis of Italian episodes,
and in particular the Sardinian case of the area of San Pancrazio in
Cagliari, constitutes an important lesson about the limits of this
practice and the uncertainty of its results, towards the definition of
a sustainable good practice in the restoration of military
architectures.
Abstract: A model reference adaptive control and a fixed gain
LQR control were implemented in the height controller of a quadrotor
that has parametric uncertainties due to the act of picking up an
object of unknown dimension and mass. It is shown that the adaptive
controller, unlike the fixed-gain controller, is capable of ensuring
stable tracking performance under such conditions, although adaptive
control suffers from several limitations. The combination of both
adaptive and fixed-gain control in the controller architecture can
result in enhanced tracking performance in the presence of parametric
uncertainties.
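The abstract's quadrotor height dynamics are not given, so the sketch below replaces them with a first-order plant whose gain k is unknown (standing in for the mass change after pick-up) and shows the core of model reference adaptive control with the classic MIT-rule update. All numbers are illustrative assumptions, not the paper's controller.

```python
# MRAC sketch via the MIT rule on an assumed first-order surrogate plant.

dt, gamma = 0.01, 0.5
k_true = 2.0        # unknown plant gain (the nominal design value was 1.0)
theta = 1.0         # adaptive feedforward gain, initialized at the nominal value
y = y_m = 0.0
r = 1.0             # constant height reference

for _ in range(2000):                 # 20 s of simulated time
    u = theta * r                     # adaptive control law
    y += dt * (-y + k_true * u)       # plant: y' = -y + k*u
    y_m += dt * (-y_m + r)            # reference model: y_m' = -y_m + r
    e = y - y_m                       # tracking error
    theta += dt * (-gamma * e * y_m)  # MIT rule: theta' = -gamma * e * y_m

# theta should approach 1/k_true = 0.5, driving the tracking error to zero.
```

A fixed-gain controller would keep theta at 1.0 and track with a persistent offset, which is exactly the gap the abstract attributes to the non-adaptive design; the MIT rule's known limitation is that too large a gamma can destabilize the loop.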
Abstract: A Reverse Logistics (RL) network is a complex and
dynamic network that involves many stakeholders, such as
suppliers, manufacturers, warehouses, retailers, and customers.
This complexity is inherent in such processes due to the lack of
perfect knowledge or to conflicting information. Ontologies, on the
other hand, can be considered an approach to overcoming the
problems of knowledge sharing and communication among the
various reverse logistics partners. In this paper we propose a
semantic representation based on a hybrid architecture for building
the ontologies in a bottom-up way. This method facilitates semantic
reconciliation between the heterogeneous information systems that
support reverse logistics processes and product data.
Abstract: Wireless mesh networking is rapidly gaining in
popularity with a variety of users: from municipalities to enterprises,
from telecom service providers to public safety and military
organizations. This increasing popularity is based on two basic facts:
ease of deployment and increased network capacity expressed in
bandwidth per unit of coverage area; WMNs do not rely on any fixed
infrastructure. Many efforts have been devoted to maximizing the
throughput of multi-channel multi-radio wireless mesh networks.
Current approaches are based purely on either static or dynamic
channel allocation. In this paper, we use a hybrid multi-channel
multi-radio wireless mesh networking architecture, where both
static and dynamic interfaces are built into the nodes. The Dynamic
Adaptive Channel Allocation (DACA) protocol considers
optimization of both throughput and delay in the channel allocation.
The channel assignment is made co-dependent with the routing
problem in the wireless mesh network and is based on the traffic
flow on every link. Temporal and spatial relationships require the
channel assignment to be recomputed every time the traffic pattern
in the mesh network changes. In this paper, a path-computation
method which captures the available path bandwidth is proposed,
along with an efficient routing protocol based on the new paths,
which uses both static and dynamic links. The consistency property
guarantees that each node makes an appropriate packet-forwarding
decision while balancing the control overhead of the network, so
that a data packet will traverse the right path.
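The abstract does not specify DACA's assignment procedure, so the following is only a generic greedy sketch of the underlying idea: give each mesh link a channel that conflicts with as few already-assigned interfering links as possible. The interference model (two links interfere if they share a node) and the topology are assumptions.

```python
# Greedy channel assignment sketch for mesh links (illustrative only).

def assign_channels(links, channels):
    """links: list of (u, v) node pairs; two links interfere if they share a node."""
    assignment = {}
    for link in links:
        def conflicts(ch):
            # Count already-assigned interfering links using channel ch.
            return sum(
                1 for other, och in assignment.items()
                if och == ch and set(link) & set(other)
            )
        # Pick the channel with the fewest conflicts (ties: lowest channel).
        assignment[link] = min(channels, key=conflicts)
    return assignment

# Hypothetical 4-node ring topology with three available channels.
links = [("a", "b"), ("b", "c"), ("c", "d"), ("a", "d")]
result = assign_channels(links, [1, 2, 3])
```

A traffic-aware variant, closer to what the abstract describes, would process links in decreasing order of flow so that the busiest links get the least-contended channels first.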
Abstract: A simulation-based VLSI implementation of the
FELICS (Fast Efficient Lossless Image Compression System)
algorithm is proposed to provide lossless image compression and is
implemented in simulation-oriented VLSI (Very Large Scale
Integration). The goal is to analyze the performance of lossless
image compression, to reduce the image size without losing image
quality, and to implement the FELICS algorithm in VLSI. The
FELICS algorithm uses a simplified adjusted binary code for image
compression; the compressed image is converted into pixels and
then implemented in the VLSI domain. These choices are used to
achieve high processing speed and to minimize area and power.
The simplified adjusted binary code reduces the number of
arithmetic operations and achieves high processing speed. A color
difference preprocessing step is also proposed to improve coding
efficiency with simple arithmetic operations. The VLSI-based
FELICS algorithm provides an effective solution for hardware
architecture design, with a regular pipelined data flow and
four-stage parallelism. With two-level parallelism, consecutive
pixels can be classified into even and odd samples, with an
individual hardware engine dedicated to each. This method can be
further enhanced by multilevel parallelism.
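For readers unfamiliar with the adjusted binary code mentioned above: FELICS codes an in-range pixel value x in [0, N-1] (where N = H - L + 1 for the two context pixels L and H) with a phased-in binary code, giving the shorter codewords to values near the middle of the range, where in-range pixels cluster. The centering used in the sketch below (ranking values by distance from the middle) is one plausible ordering; the exact mapping used in the paper's hardware may differ.

```python
# Sketch of FELICS-style adjusted binary coding (codewords as bit strings).

def adjusted_binary(x, n):
    """Return the codeword (as a bit string) for value x in [0, n-1]."""
    k = n.bit_length() - 1          # floor(log2 n)
    short = 2 ** (k + 1) - n        # how many k-bit codewords exist
    # Rank values by distance from the centre so central values come first.
    order = sorted(range(n), key=lambda v: (abs(v - (n - 1) / 2), v))
    rank = order.index(x)
    if rank < short:                # central values get a short k-bit codeword
        return format(rank, "b").zfill(k)
    return format(rank + short, "b").zfill(k + 1)   # edge values get k+1 bits

codes = [adjusted_binary(x, 5) for x in range(5)]
```

For n = 5 this yields three 2-bit codewords for the central values and two 3-bit codewords for the edges, a prefix-free set computable with shifts and adds, which is why the simplified code maps well to hardware.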
Abstract: Securing the confidential data transferred via wireless
network remains a challenging problem. It is paramount to ensure
that data are accessible only by the legitimate users rather than by the
attackers. One of the most serious threats to an organization is
jamming, which disrupts the communication between any pair of nodes.
Therefore, designing an attack-defending scheme without any packet
loss in data transmission is an important challenge. In this paper, a
Dependence-based Malicious Route Defending (DMRD) scheme is
proposed in a multi-path routing environment to prevent jamming
attacks. The key idea is to defend against malicious routes to ensure
clear transmission. The scheme develops a two-layered
architecture and operates in two steps. In the first step,
possible routes are captured and their agent dependence values are
marked using triple agents. In the second step, the dependence values
are compared by performing comparator filtering to detect malicious
route as well as to identify a reliable route for secured data
transmission. Simulation studies show that the proposed
scheme reliably identifies malicious routes while attaining lower
delay and route discovery times; it also achieves higher
throughput.
Abstract: The focal aim of e-Government (eGovt) is to offer
citizen-centered service delivery. Accordingly, citizens consume
services from multiple government agencies through a national
portal. Thus, eGovt is an enterprise whose primary business motive
is transparent, efficient, and effective public services for its
citizenry, and its logical structure is the eGovernment Enterprise
Architecture (eGEA). Since eGovt is an IT-oriented, multifaceted,
service-centric system, EA does not do much for an automated
enterprise beyond the business artifacts. The emergence of
Service-Oriented Architecture (SOA) led some governments to
apply it in their eGovts, but it limits the sources of business
artifacts. The concurrent use of EA and SOA in eGovt enables
interoperability and integration and leads to a Service-Oriented
e-Government Enterprise (SOeGE). Consequently, an agile eGovt
system becomes a reality. From an IT perspective, eGovt comprises
centralized public service artifacts together with existing
application logic belonging to various departments at the central,
state, and local levels. eGovt is transformed into SOeGE by
applying Service-Orientation (SO) principles across the entire
system. This paper explores the IT perspective of SOeGE in India,
encompassing the public service models, and illustrates it with a
case study of the Passport service of India.
Abstract: Software architecture is the basic structure of
software that shapes the development and evolution of a software
system. Software architecture is also considered a significant tool
for the construction of high-quality software systems. A clean
design leads to the control, value, and beauty of software, resulting
in a longer life, while a bad design is the cause of architectural
erosion, in which software evolution completely fails. This paper
discusses the occurrence of software architecture erosion and
presents a set of methods for the detection, declaration, and
prevention of architecture erosion. The causes and symptoms of
architecture erosion are examined through examples of prescriptive
and descriptive architectures, and the practices used to stop this
erosion are also discussed by considering different types of
software erosion and their effects. Finally, the most suitable
approaches for fighting software architecture erosion and for
reducing its effects are identified, evaluated, and tested on different
scenarios.
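The contrast between prescriptive (intended) and descriptive (as-built) architectures suggests one standard detection technique, sketched below: a reflexion-model style comparison in which any dependency present in the code but absent from the intended architecture is flagged as a violation. The module names are hypothetical; the paper's own methods may differ.

```python
# Reflexion-model style erosion check (illustrative sketch).

def erosion_report(intended, actual):
    """Both arguments are sets of (source_module, target_module) edges."""
    violations = actual - intended   # divergences: symptoms of erosion
    unused = intended - actual       # absences: rules never exercised
    return {"violations": sorted(violations), "unused": sorted(unused)}

# Hypothetical layered design: the UI must reach storage only via the service.
intended = {("ui", "service"), ("service", "storage")}
actual = {("ui", "service"), ("service", "storage"), ("ui", "storage")}
report = erosion_report(intended, actual)
```

Run continuously, such a check turns erosion from a gradual, silent decay into a visible, reviewable event, which supports the prevention methods the paper advocates.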
Abstract: The final step to complete the “Analytical Systems
Engineering Process” is the “Allocated Architecture” in which all
Functional Requirements (FRs) of an engineering system must be
allocated into their corresponding Physical Components (PCs). At
this step, any design for developing the system’s allocated
architecture in which no clear pattern of assigning the exclusive
“responsibility” of each PC for fulfilling the allocated FR(s) can be
found is considered a poor design that may cause difficulties in
determining the specific PC(s) which has (have) failed to satisfy a
given FR successfully. The present study utilizes the Axiomatic
Design method principles to mathematically address this problem and
establishes an “Axiomatic Model” as a solution for reaching good
alternatives for developing the allocated architecture. This study
proposes a “Loss Function” as a quantitative criterion to monetarily
compare non-ideal designs for developing the allocated architecture
and to choose the one which imposes a relatively lower cost on the
system’s stakeholders. As a case study, we use the existing design
of the U.S. electricity marketing subsystem, based on data provided by
the U.S. Energy Information Administration (EIA). The result for
2012 shows the symptoms of a poor design and ineffectiveness due to
coupling among the FRs of this subsystem.
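The coupling diagnosis above follows the Independence Axiom of Axiomatic Design: the FR-to-PC design matrix is uncoupled if it is diagonal, decoupled if it is triangular, and coupled otherwise. The electricity-subsystem matrix itself is not given in the abstract, so the matrices below are small illustrative examples of the classification only.

```python
# Classify an FR-to-PC design matrix per the Independence Axiom.

def classify_design(matrix):
    """matrix[i][j] is truthy if FR_i depends on PC_j (square matrix)."""
    n = len(matrix)
    off = [(i, j) for i in range(n) for j in range(n) if i != j and matrix[i][j]]
    if not off:
        return "uncoupled"   # diagonal: each FR satisfied by exactly one PC
    if all(i > j for i, j in off) or all(i < j for i, j in off):
        return "decoupled"   # triangular: solvable in a fixed order
    return "coupled"         # off-diagonal terms on both sides: poor design

uncoupled = [[1, 0], [0, 1]]
decoupled = [[1, 0], [1, 1]]
coupled   = [[1, 1], [1, 1]]
```

A coupled matrix is precisely the failure symptom the abstract describes: when an FR is unmet, no single PC can be held responsible.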
Abstract: Enterprise Architecture (EA) is employed by
enterprises to provide integrated Information Systems (ISs) that
support the alignment of their business and Information
Technology (IT). Evaluation of EA implementation can help an
enterprise reach its intended goals. There are some problems in
current evaluation methods of EA implementation that lead to
ineffective implementation of EA. This paper presents current
issues in the evaluation of EA implementation. In this regard, we
set out a framework to represent evaluation issues based on their
functionality and structure. The results of this research not only
increase the knowledge of evaluation, but could also be useful for
both academics and practitioners in understanding the current
situation of evaluations.
Abstract: Enterprise Architecture (EA) is a strategy that is
employed by enterprises in order to align their business and
Information Technology (IT). EA is managed, developed, and
maintained through Enterprise Architecture Implementation
Methodology (EAIM). The effectiveness of EA implementation is the
degree to which EA helps to achieve the collective goals of the
organization. This paper analyzes the results of a survey that aims to
explore the factors that affect the effectiveness of EAIM and
specifically the relationship between factors and effectiveness of the
output and functionality of EA project. The exploratory factor
analysis highlights a specific set of five factors: alignment,
adaptiveness, support, binding, and innovation. The regression
analysis shows that there is a statistically significant and positive
relationship between each of the five factors and the effectiveness of
EAIM. Consistent with theory and practice, the most prominent
factor for developing an effective EAIM is innovation. The findings
contribute to measuring the effectiveness of EA implementation
projects by providing an indication of the measurement and
implementation approaches used by enterprise architects, and to
developing an effective EAIM.
Abstract: Enterprise Architecture (EA) Implementation
Methodologies have become an important part of EA projects.
Several implementation methodologies have been proposed, as a
theoretical and practical approach, to facilitate and support the
development of EA within an enterprise. A significant question when
facing the starting of EA implementation is deciding which
methodology to utilize. In order to answer this question, a framework
with several criteria is applied in this paper for the comparative
analysis of existing EA implementation methodologies. Five EA
implementation methodologies including: EAP, TOGAF, DODAF,
Gartner, and FEA are selected for comparison using the proposed
framework. The results of the comparison indicate that these
methodologies have not reached sufficient maturity as a whole, due
to a lack of consideration of requirements management,
maintenance, continuum, and complexity in their processes. The
framework is also able to evaluate any other kind of EA
implementation methodology.
Abstract: In the cloud computing hierarchy, IaaS is the lowest
layer; all other layers are built over it. Thus it is the most important
layer of the cloud and requires more attention. Along with its
advantages, IaaS faces some serious security-related issues.
Security mainly focuses on integrity, confidentiality, and
availability. Cloud computing facilitates the sharing of resources
inside as well as outside of the cloud. On the other hand, the cloud
is still not in a state to guarantee 100% data security. The cloud
provider must ensure that the end user/client gets a quality of
service. In this report we describe possible aspects of cloud-related
security.