Abstract: Organizations of European and Czech critical infrastructure have a specific position, mission, characteristics and behaviour within European Union and Czech state/business environments, owing to specific requirements of regional and global security environments. They must respect national security policy and global rules, requirements and standards in all internal and external processes of their supplier-customer chains and networks. Controlling is a generalized capability to exert control over situational policy. The aim of this paper is to introduce controlling as a new and necessary process attribute that gives the critical infrastructure environment the capability to meet its commitments regarding the effectiveness of the quality management system in satisfying customer/user requirements, the continual improvement of the overall performance and efficiency of critical infrastructure organizations' processes, and their societal security, through continual planning improvement using DYVELOP modelling.
Abstract: Higher-order modulations combined with different coding schemes allow more bits to be sent per symbol, thus achieving higher throughputs and better spectral efficiencies. However, when using a modulation technique such as 64-QAM with fewer overhead bits, better signal-to-noise ratios (SNRs) are needed to overcome inter-symbol interference (ISI) and maintain a given bit error ratio (BER). Adaptive modulation allows wireless technologies to yield higher throughputs while also covering long distances. The aim of this paper is to implement the Adaptive Modulation and Coding (AMC) features of the WiMAX PHY in MATLAB and to analyze the performance of the system under different channel conditions (AWGN, Rayleigh and Rician fading channels) with channel estimation and blind equalization. Simulation results demonstrate that increasing the modulation order increases both throughput and BER. These results reveal a trade-off among modulation order, FFT length, throughput, BER and spectral efficiency. The BER changes gradually for the AWGN channel and erratically for the Rayleigh and Rician fading channels.
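The trade-off between modulation order and BER described above can be illustrated with the standard closed-form approximation for Gray-coded square M-QAM over AWGN. This is a generic textbook sketch, not the paper's MATLAB implementation; the function names are my own.

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x), via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def qam_ber_awgn(m, ebno_db):
    """Approximate BER of Gray-coded square M-QAM (M = 4, 16, 64, ...) over AWGN."""
    k = math.log2(m)                       # bits per symbol
    ebno = 10 ** (ebno_db / 10)            # linear Eb/N0
    arg = math.sqrt(3 * k * ebno / (m - 1))
    return (4 / k) * (1 - 1 / math.sqrt(m)) * q_func(arg)

# At a fixed Eb/N0, raising the modulation order raises the BER:
for m in (4, 16, 64):
    print(f"{m:>2}-QAM @ 10 dB Eb/N0: BER ~ {qam_ber_awgn(m, 10):.2e}")
```

For M = 4 the formula reduces to the exact QPSK result Q(sqrt(2 Eb/N0)), which is a quick sanity check on the expression.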
Abstract: Recent advances in wireless networking technologies have introduced several energy-aware routing protocols for sensor networks. Such protocols aim to extend the lifetime of the network by reducing the energy consumption of nodes. Many researchers are addressing the challenges that dominate the area of energy consumption. One family of protocols that addresses this energy consumption issue is cluster-based hierarchical routing. In this paper, we discuss some of the major hierarchical routing protocols for sensor networks. Furthermore, we examine and compare several aspects and characteristics of a few widely explored hierarchical clustering protocols and their operation in wireless sensor networks (WSNs). The paper also presents a discussion of future research topics and the challenges of hierarchical clustering in WSNs.
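A representative example of the cluster-based protocols surveyed here is LEACH, whose randomized cluster-head election can be sketched in a few lines. The threshold formula below is the standard LEACH rule; the surrounding scaffolding (node ids, RNG seeding) is illustrative only.

```python
import random

def leach_threshold(p, r):
    """LEACH election threshold T(n) in round r for a node that has not been
    a cluster head in the last 1/p rounds (p = desired fraction of heads)."""
    return p / (1 - p * (r % round(1 / p)))

def elect_cluster_heads(node_ids, p, r, rng):
    """Each eligible node independently draws a random number and becomes a
    cluster head for this round if the draw falls below the threshold."""
    t = leach_threshold(p, r)
    return [n for n in node_ids if rng.random() < t]

rng = random.Random(42)
heads = elect_cluster_heads(range(100), p=0.05, r=0, rng=rng)
```

Note that the threshold rises toward 1 as the epoch of 1/p rounds ends, which guarantees every node serves as head once per epoch and so balances energy drain.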
Abstract: Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be feasible and convenient for small images and little data, it becomes difficult and error-prone when large databases of images must be processed, since the patterns may differ across the image area depending on many characteristics (drilling strategy, rock components, rock strength, etc.). In this work we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. After a period of use in which parts of borehole images corresponding to tension regions and breakout areas are manually marked, these classifiers allow the system to automatically indicate and suggest new candidate regions with higher accuracy. We suggest the use of different classifier methods in order to obtain different knowledge-dataset configurations.
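The core idea, learning from manually labeled regions and then suggesting labels for new ones, can be sketched with the simplest classifier of this kind, a nearest-neighbour rule. The feature vectors and labels below are hypothetical; the paper does not fix a specific classifier or feature set.

```python
import math

def classify_1nn(sample, labeled):
    """Return the label of the manually labeled region whose feature
    vector is closest (Euclidean distance) to the new sample."""
    return min(labeled, key=lambda region: math.dist(sample, region[0]))[1]

# Hypothetical 2-D features per segmented curve: (mean curvature, mean intensity)
labeled_regions = [
    ((0.9, 0.2), "tension"),
    ((0.1, 0.8), "breakout"),
    ((0.8, 0.3), "tension"),
]
suggestion = classify_1nn((0.85, 0.25), labeled_regions)
```

As more regions are marked by the analyst, the labeled set grows and the automatic suggestions improve, which is the accumulation effect the abstract describes.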
Abstract: A relationship between face and signature biometrics is established in this paper. A new approach is developed to predict faces from signatures using artificial intelligence. A multilayer perceptron (MLP) neural network is used to generate face details from features extracted from signatures; here the face is the physical biometric and the signature is the behavioural biometric. The new method establishes a relationship between the two biometrics and regenerates a visible face image from the signature features. Furthermore, the performance of our new technique is demonstrated in terms of minimum error rates compared to published work.
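The mapping the paper learns, signature feature vector in, face parameters out, has the shape of a standard one-hidden-layer MLP forward pass, sketched below. The layer sizes and weights are illustrative placeholders (an untrained toy network), not the paper's trained model.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mlp_forward(x, w1, b1, w2, b2):
    """One-hidden-layer MLP: signature features in, face parameters out."""
    hidden = [sigmoid(sum(wij * xi for wij, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    return [sigmoid(sum(wj * h for wj, h in zip(row, hidden)) + b)
            for row, b in zip(w2, b2)]

# Toy sizes: 3 signature features -> 2 hidden units -> 2 face parameters
w1 = [[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]]
b1 = [0.0, 0.1]
w2 = [[1.0, -1.0], [0.5, 0.5]]
b2 = [0.0, 0.0]
face_params = mlp_forward([0.2, 0.7, 0.1], w1, b1, w2, b2)
```

In the paper's setting, the weights would be learned from paired signature/face training data, and the outputs would parameterize a reconstructed face image.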
Abstract: Human movement in the real world provides
important information for developing human behaviour models and
simulations. However, it is difficult to assess ‘real’ human behaviour
since there is no established method available. As part of the AUNTSUE
(Accessibility and User Needs in Transport – Sustainable Urban
Environments) project, this research aimed to propose a method to
assess human movement and behaviour in crowded areas. The
method is based on the three major steps of video recording,
conceptual behaviour modelling and video analysis. The focus is on
individual human movement and behaviour in normal situations
(panic situations are not considered) and the interactions between
individuals in localized areas. Emphasis is placed on gaining
knowledge of characteristics of human movement and behaviour in
the real world that can be modelled in the virtual environment.
Abstract: This paper presents a real-time technique for visualizing and filtering classified LiDAR point clouds. The visualization is capable of displaying filtered information organized in layers according to the classification attribute saved within LiDAR datasets. We explain the data structure and data management used, which enable real-time presentation of layered LiDAR data. Real-time visualization is achieved with LOD optimization based on the distance from the observer, without loss of quality. The filtering process is done in two steps, is executed entirely on the GPU, and is implemented using programmable shaders.
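The distance-based LOD selection mentioned above can be sketched as a simple mapping from viewer distance to a detail level, e.g. one level coarser per doubling of distance. The thresholds and level count here are illustrative assumptions, not the paper's actual scheme.

```python
import math

def lod_level(distance, base=10.0, max_level=5):
    """Map viewer distance to a level of detail: level 0 (full point density)
    within `base` units, then one level coarser per doubling of distance."""
    if distance <= base:
        return 0
    return min(max_level, int(math.log2(distance / base)) + 1)

# Nearby chunks render at full density, far chunks at the coarsest level:
levels = [lod_level(d) for d in (5, 15, 30, 10_000)]
```

In a real renderer this function would be evaluated per spatial chunk of the point cloud, and the chosen level would select which decimated point subset is streamed to the GPU.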
Abstract: Natural gas, one of the most important sources of energy for industrial and domestic users all over the world, has a huge, complex supply chain that requires heavy investment in all the phases of exploration, extraction, production, transportation, storage and distribution. The main purpose of a supply chain is to meet customers' needs efficiently and at minimum cost. In this study, with the aim of minimizing economic costs, the different levels of the natural gas supply chain have been modeled as a multi-echelon, multi-period fuzzy linear program. In this model, various constraints have been defined, including demand satisfaction, capacity, input/output balance, and the presence/absence of a path. The results suggest the efficiency of the proposed model in optimal allocation and the reduction of supply chain costs.
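As a much-simplified illustration of the allocation problem behind such a model, consider a single-period, crisp (non-fuzzy) instance with one demand node and several capacitated sources. For this one-resource structure, filling demand from the cheapest source first is the linear-programming optimum; the source names and numbers are invented for illustration.

```python
def allocate(demand, sources):
    """Meet `demand` from (name, capacity, unit_cost) sources, cheapest first.
    For this single-resource structure the greedy fill equals the LP optimum."""
    plan, total_cost = {}, 0.0
    for name, cap, cost in sorted(sources, key=lambda s: s[2]):
        take = min(cap, demand)
        if take > 0:
            plan[name] = take
            total_cost += take * cost
            demand -= take
    if demand > 0:
        raise ValueError("total capacity cannot satisfy demand")
    return plan, total_cost

plan, cost = allocate(120, [("field_A", 80, 2.0),
                            ("storage", 60, 3.5),
                            ("import", 100, 5.0)])
```

The paper's model generalizes this in three directions: multiple echelons (production, transport, storage, distribution), multiple periods, and fuzzy coefficients in place of the crisp demands and costs used here.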
Abstract: The aim of this paper is to propose a novel technique to guarantee Quality of Service (QoS) in a highly dynamic environment. A MANET changes its topology dynamically as nodes move frequently, which causes link failures between mobile nodes; a MANET cannot ensure reliability without delay. The relay node is selected based on the QoS achieved in previous transmissions, and one additional factor, the Connection Existence Period (CEP), is considered to ensure reliability. The CEP measures the period during which a connection exists between nodes; the node with the highest CEP becomes the next relay node. The relay node is selected dynamically to avoid frequent failures, and the bandwidth of each link changes dynamically based on the service rate and request rate. This paper proposes an active bandwidth setting-up algorithm to guarantee QoS. A series of results obtained using the Network Simulator (NS-2) demonstrates the viability of the proposed techniques.
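The CEP-based relay choice described above reduces to picking, among the current neighbours, the one whose connection has persisted longest. A minimal sketch, with invented node ids and CEP values:

```python
def next_relay(candidates):
    """Pick the neighbour with the highest Connection Existence Period (CEP),
    i.e. the most stable link, as the next relay node."""
    return max(candidates, key=lambda c: c[1])[0]

# (node_id, observed CEP in seconds) for the current neighbours -- illustrative
neighbours = [("n3", 12.4), ("n7", 30.1), ("n9", 8.6)]
relay = next_relay(neighbours)
```

In the full scheme this selection would be re-run whenever topology changes, and the candidate set would first be filtered by the QoS achieved in previous transmissions.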
Abstract: There are several methods for monitoring software projects, and the objective of monitoring is to ensure that software projects are developed and delivered successfully. Performance measurement is a method closely associated with monitoring; it can be examined through two important attributes, efficiency and effectiveness, both of which are important factors for the success of a software project. Consequently, successful steering is achieved by monitoring and controlling a software project via performance measurement criteria and metrics. Hence, this paper aims to identify the performance measurement criteria and metrics for monitoring the performance of a software project using the Goal Question Metric (GQM) approach. The GQM approach is utilized to ensure that the identified metrics are reliable and useful. These identified metrics serve as useful guidelines for project managers in monitoring the performance of their software projects.
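The GQM approach derives each metric from a question, and each question from a measurement goal. A minimal sketch of one such tree, with an invented goal and metrics (the paper's actual criteria and metrics are its contribution and are not reproduced here):

```python
# One hypothetical GQM tree: goal -> questions -> metrics
gqm = {
    "goal": "Monitor the schedule performance of the project",
    "questions": [
        {"q": "Is the project on schedule?",
         "metrics": ["schedule variance", "milestone slip rate"]},
        {"q": "How productive is the team?",
         "metrics": ["tasks completed per week", "effort variance"]},
    ],
}

def all_metrics(tree):
    """Flatten the metrics derived from every question under a goal."""
    return [m for q in tree["questions"] for m in q["metrics"]]
```

The value of the structure is traceability: every collected metric can be justified by pointing back to the question, and goal, it answers.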
Abstract: Most quality models define usability as a significant factor that improves product acceptability, increases user satisfaction, improves product reliability, and also benefits companies financially. Usability is also the factor that best balances the technical and human aspects of a software product, an important consideration when defining quality during the software development process. A usability risk consists of risk factors that could impact the usability of a software product, thereby contributing to negative user experiences and possible product failure. Hence, it is important to mitigate and reduce usability risks within the software development process itself. By managing possible usability risks during software development, failures of the software product can be reduced. Therefore, this research uses the Delphi method to identify mitigation plans for reducing potential usability risks. The Delphi method is conducted with seven experts from the fields of risk management and software development.
Abstract: Cloud service brokering is a new service paradigm that provides interoperability and portability of applications across multiple Cloud providers. In this paper, we design a Cloud service brokerage system, anyBroker, supporting integrated service provisioning and SLA-based service lifecycle management. For the system design, we introduce the system concept and overall architecture, the details of the main components, and use cases of the primary operations of the system. These features ease the concerns of Cloud service providers and customers, and support a new open Cloud service market that increases Cloud service profit and promotes the Cloud service ecosystem in Cloud-computing-related areas.
Abstract: In this paper, we propose a parallel IDS- and honeypot-based approach to detect and analyze known and unknown attack taxonomies, improving IDS performance and protecting the network from intruders. The main theme of our approach is to record and analyze intruder activities using both low- and high-interaction honeypots. Our architecture achieves the required goals by combining a signature-based IDS with honeypots and generating new signatures. The paper describes the basic components, design and implementation of this approach and also demonstrates its effectiveness in reducing the probability of network attacks.
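The signature-based IDS half of such an architecture reduces, at its simplest, to matching packet payloads against a table of known-attack patterns; traffic that matches nothing is the candidate input for the honeypots, from which new signatures are later derived. The signatures and payload below are invented for illustration.

```python
def match_signatures(payload, signatures):
    """Return the names of known-attack signatures found in a packet payload.
    An empty result means the traffic is unknown to the IDS and is a
    candidate for honeypot analysis and new-signature generation."""
    return [name for name, pattern in signatures.items() if pattern in payload]

signatures = {
    "sql_injection": "' OR '1'='1",
    "path_traversal": "../../",
}
hits = match_signatures("GET /app?id=' OR '1'='1 HTTP/1.1", signatures)
```

Real IDSs such as Snort use far richer rules (protocol fields, offsets, regular expressions), but the detect-or-divert decision sketched here is the point where the IDS and honeypot components connect.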
Abstract: Due to the large amount of information in the World Wide Web (WWW, web) and the lengthy, usually linearly ordered result lists of web search engines, which do not indicate semantic relationships between their entries, the search for topically similar and related documents can become a tedious task. In particular, the process of formulating queries with proper terms representing specific information needs requires much effort from the user. This problem becomes even bigger when the user's knowledge of a subject and its technical terms is insufficient to do so. This article presents the new, interactive search application DocAnalyser, which addresses this problem by enabling users to find similar and related web documents based on automatic query formulation and state-of-the-art search word extraction. Additionally, this tool can be used to track topics across semantically connected web documents.
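Automatic query formulation from a source document can be illustrated with the simplest search-word extraction scheme: take the most frequent non-stopword terms as query candidates. DocAnalyser's actual extraction method is more sophisticated; the stopword list and thresholds below are illustrative assumptions.

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "for", "on"}

def extract_search_words(text, k=3):
    """Pick the k most frequent non-stopword terms of a document
    as candidate query terms."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return [w for w, _ in counts.most_common(k)]

query = extract_search_words(
    "the ranking of web documents and web search and ranking")
```

The extracted terms would then be submitted as a query to a web search engine, so the user never has to formulate the query manually.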
Abstract: This paper deals with the problem of automatic rule generation for fuzzy systems design. The proposed approach is based on hybrid artificial bee colony (ABC) optimization and a weighted least squares (LS) method, and aims to find the structure and parameters of fuzzy systems simultaneously. More precisely, two ABC-based fuzzy modeling strategies are presented and compared. The first strategy uses global optimization to learn fuzzy models; the second hybridizes ABC with the weighted least squares estimation method. The performance of the proposed ABC and ABC-LS fuzzy modeling strategies is evaluated on complex modeling problems and compared with other advanced modeling methods.
Abstract: Handwriting is a physical demonstration of a complex cognitive process learnt since childhood. People with disabilities or suffering from various neurological diseases face many difficulties resulting from problems in the muscle stimuli (EMG) or brain signals (EEG) that manifest at the writing stage. The handwriting velocity of the same writer, or of different writers, varies according to different criteria: age, attitude, mood, writing surface, etc. It is therefore interesting to build an experimental database of recordings, taking the writing speed of different writers as the primary reference, which would allow the overall system to be studied during the handwriting process. This paper deals with a new approach to modeling the handwriting system based on the velocity criterion using artificial neural networks, specifically Radial Basis Function (RBF) neural networks. The simulation results show satisfactory agreement between the responses of the developed neural model and the experimental data for various letters and forms, demonstrating the efficiency of the proposed approaches.
Abstract: This paper presents a method for the automatic generation of an ontological model from any data source using Model Driven Architecture (MDA); this generation supports cooperation between knowledge engineering and software engineering. Reverse engineering of a data source generates a software model (the data schema) that then undergoes transformations to produce the ontological model. The method uses meta-models to validate both the software and the ontological models.
Abstract: In this paper, we propose a new packing strategy for finding free resources for the run-time mapping of application tasks to NoC-based heterogeneous MPSoCs. The proposed strategy minimizes the task mapping time while also placing communicating tasks close to each other. To evaluate our approach, a comparative study is carried out on a platform whose PEs each support a single task. Experiments show that our strategy provides better results than the latest dynamic mapping strategies reported in the literature.
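The "place communicating tasks close to each other" objective can be sketched as choosing, among the free PEs of the NoC mesh, the one minimizing the total Manhattan (hop) distance to the already-mapped tasks the new task communicates with. The coordinates below are invented; the paper's strategy additionally optimizes the search time of this step.

```python
def manhattan(a, b):
    """Hop distance between two PE coordinates in a 2-D NoC mesh."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def place_task(free_pes, placed_neighbours):
    """Choose the free PE with the smallest total hop distance to the
    already-mapped tasks this task communicates with."""
    return min(free_pes,
               key=lambda pe: sum(manhattan(pe, p) for p in placed_neighbours))

free = [(0, 0), (2, 1), (3, 3)]
spot = place_task(free, placed_neighbours=[(2, 2)])
```

Keeping communicating tasks adjacent shortens NoC routes, which reduces both communication latency and link contention, the usual motivation for this placement objective.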
Abstract: Medical imaging produces pictures of the human body in digital form. Since these imaging techniques produce prohibitive amounts of data, compression is necessary for storage and communication purposes. Many current compression schemes provide a very high compression rate but with considerable loss of quality. On the other hand, in some areas of medicine, it may be sufficient to maintain high image quality only in a region of interest (ROI). This paper presents a contribution to lossless compression in the region of interest of scintigraphic images, based on the SPIHT algorithm and global transform thresholding using Huffman coding.
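Of the pipeline named above, the Huffman entropy-coding stage is the easiest to illustrate in isolation: build a prefix-free code table in which frequent symbols get short codewords. This is the generic textbook algorithm, not the paper's SPIHT/thresholding pipeline.

```python
import heapq
from collections import Counter

def huffman_codes(data):
    """Build a Huffman code table (symbol -> bitstring) for `data`."""
    freq = Counter(data)
    if len(freq) == 1:                       # degenerate single-symbol input
        return {next(iter(freq)): "0"}
    # Heap entries: (count, tie-breaker id, {symbol: code-so-far})
    heap = [(n, i, {s: ""}) for i, (s, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        n1, _, t1 = heapq.heappop(heap)      # two least frequent subtrees
        n2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (n1 + n2, next_id, merged))
        next_id += 1
    return heap[0][2]

codes = huffman_codes(b"aaaabbc")           # 'a' is most frequent
```

For the sample input, `a` (4 occurrences) receives a 1-bit code while `b` and `c` receive 2-bit codes, so the 7 symbols compress to 10 bits of payload, matching the entropy-coding role the stage plays after SPIHT and thresholding.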
Abstract: Assertion-based software testing has been shown to be a promising tool for generating test cases that reveal program faults. Because the number of assertions may be very large for industry-size programs, one of the main concerns regarding the applicability of assertion-based testing is the amount of search time required to explore a large number of assertions. This paper presents a new approach for exploring assertions during the process of assertion-based software testing. Our initial experiments with the proposed approach show that the performance of assertion-based testing may be improved, thereby making the approach more efficient when applied to programs with a large number of assertions.