Abstract: Wireless sensor networks can be applied in both civilian
and military environments. A primary goal in the design of
wireless sensor networks is lifetime maximization, constrained by
the energy capacity of batteries. One well-known method for reducing
energy consumption in such networks is data aggregation. Providing
efficient data aggregation while preserving data privacy is a challenging
problem in wireless sensor network research. In this paper,
we present a privacy-preserving data aggregation scheme for additive
aggregation functions. The Cluster-based Private Data Aggregation
(CPDA) scheme leverages a clustering protocol and the algebraic
properties of polynomials, and has the advantage of incurring less
communication overhead. The goal of our work is to bridge the gap between
collaborative data collection by wireless sensor networks and data
privacy. We present simulation results for our schemes and compare
their performance to TAG, a typical data aggregation scheme that
provides no data privacy protection. The results show the efficacy and
efficiency of our schemes.
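The additive-aggregation idea behind schemes of this kind can be illustrated with a toy additive secret-sharing sketch. This is a simplification for intuition only, not the authors' polynomial-based CPDA protocol; the function names `share_value` and `private_sum` are hypothetical.

```python
import random

MODULUS = 10**9 + 7  # arbitrary large modulus for the illustration

def share_value(value, n_peers, modulus=MODULUS):
    """Split a private value into n_peers additive shares (mod modulus).
    The shares sum to the value; any proper subset looks random."""
    shares = [random.randrange(modulus) for _ in range(n_peers - 1)]
    last = (value - sum(shares)) % modulus
    return shares + [last]

def private_sum(values, modulus=MODULUS):
    """Each node splits its reading into shares, one per node; each node
    adds the shares it receives; the aggregator adds the partial sums.
    No single node ever sees another node's raw reading."""
    n = len(values)
    all_shares = [share_value(v, n, modulus) for v in values]
    partial = [sum(all_shares[i][j] for i in range(n)) % modulus
               for j in range(n)]
    return sum(partial) % modulus

readings = [23, 17, 42, 8]          # invented sensor readings
total = private_sum(readings)       # equals sum(readings) for small values
```

The aggregate is exact because the random shares cancel modulo the modulus, which is the algebraic property such schemes exploit.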
Abstract: In neural networks, when new patterns are learned, the new information radically interferes with previously stored patterns. This drawback is called catastrophic forgetting or catastrophic interference. In this paper, we propose a biologically inspired neural network model which overcomes this problem. The proposed model consists of two distinct networks: one is a Hopfield-type chaotic associative memory and the other is a multilayer neural network. We consider that these networks correspond to the hippocampus and the neocortex of the brain, respectively. Incoming information is first stored in the hippocampal network with a fast learning algorithm. The stored information is then recalled by the chaotic behavior of each neuron in the hippocampal network. Finally, it is consolidated in the neocortical network by using pseudopatterns. Computer simulation results show that the proposed model has a much better ability to avoid catastrophic forgetting than conventional models.
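The pseudopattern mechanism described above can be sketched in miniature: random inputs are paired with a trained "hippocampal" network's responses, and the resulting pairs can later be rehearsed by the "neocortical" network. This is a hedged toy sketch, not the authors' chaotic-memory model; the single-neuron teacher below is purely illustrative.

```python
import random

def make_pseudopatterns(teacher, n_inputs, n_patterns, rng):
    """Pseudopatterns: random inputs paired with the teacher network's
    responses. Rehearsing them lets a second network absorb the
    teacher's knowledge without access to the original training data."""
    pseudo = []
    for _ in range(n_patterns):
        x = [rng.choice([-1, 1]) for _ in range(n_inputs)]
        pseudo.append((x, teacher(x)))
    return pseudo

# Toy stand-in for the trained hippocampal network: sign of a fixed
# weighted sum (illustrative weights, not from the paper)
weights = [0.5, -1.0, 2.0, 0.25]
teacher = lambda x: 1 if sum(w * xi for w, xi in zip(weights, x)) > 0 else -1

rng = random.Random(3)
pseudo = make_pseudopatterns(teacher, n_inputs=4, n_patterns=10, rng=rng)
```

Training the consolidating network on `pseudo` interleaved with new patterns is what protects the old knowledge from being overwritten.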
Abstract: A SCADA (Supervisory Control And Data
Acquisition) system is an industrial control and monitoring system for
national infrastructures. In the past, SCADA systems were used in
closed environments without consideration of security functionality.
As communication technology has developed, there have been efforts
to connect SCADA systems to open networks, so the security of
SCADA systems has become an issue. Key management for SCADA
systems has also been studied. However, existing key
management schemes for SCADA systems, such as SKE (Key
Establishment for SCADA Systems) and SKMA (Key Management
Scheme for SCADA Systems), cannot support broadcast
communication. To solve this problem, an Advanced Key
Management Architecture for Secure SCADA Communication has
been proposed by Choi et al. Choi et al.'s scheme, however, requires
a large computational cost for multicast communication. In this
paper, we propose an enhanced scheme which improves the
computational cost of multicast communication while considering the
number of keys to be stored in a low-power communication device (RTU).
Abstract: In this study, an analysis has been performed of the
conjugate heat and mass transfer of a steady laminar boundary-layer
mixed convection magnetohydrodynamic (MHD) flow of a
second-grade fluid with radiation effect, subject to suction, past a
stretching sheet. The parameters E, Nr, Gr, Gc, Ec, and Sc, which
appear in the governing equations, represent the dominance of the
viscoelastic fluid heat and mass transfer effects, respectively. A
similarity transformation and the finite-difference method have been
used to analyze the problem. The conjugate heat and mass transfer
results show that the non-Newtonian viscoelastic fluid has a better
heat transfer effect than the Newtonian fluid. Free convection with a
larger Gr or Gc has a better heat transfer effect than with a smaller
Gr or Gc, and radiative convection has a better heat transfer
effect than non-radiative convection.
Abstract: Estimating the reliability of a computer network has been a subject of great interest. It is a well-known fact that this problem is NP-hard. In this paper we present a very efficient combinatorial approach for Monte Carlo reliability estimation of a network with unreliable nodes and unreliable edges. Its core is the computation of certain network combinatorial invariants. These invariants, once computed, directly provide a simple framework for computing network reliability. As a specific case of this approach we obtain tight lower and upper bounds for distributed network reliability (the so-called residual connectedness reliability). We also present some simulation results.
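As a baseline for intuition, a crude Monte Carlo estimator of residual connectedness reliability (direct sampling of node and edge failures, without the combinatorial invariants the abstract describes) might look like the sketch below; the graph and survival probabilities are invented for illustration.

```python
import random

def is_connected(nodes, edges):
    """Depth-first connectivity check over the given node set."""
    adj = {v: [] for v in nodes}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            stack.extend(adj[v])
    return seen == nodes

def mc_reliability(nodes, edges, p_node, p_edge, trials=20000, seed=1):
    """Crude Monte Carlo: sample independent node/edge survivals and
    count the trials in which the surviving subgraph is connected."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        up_nodes = {v for v in nodes if rng.random() < p_node}
        up_edges = [(u, v) for u, v in edges
                    if u in up_nodes and v in up_nodes
                    and rng.random() < p_edge]
        if up_nodes and is_connected(up_nodes, up_edges):
            hits += 1
    return hits / trials

# 4-cycle with one chord; each node survives w.p. 0.95, each edge w.p. 0.9
nodes = {0, 1, 2, 3}
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
rel = mc_reliability(nodes, edges, p_node=0.95, p_edge=0.9)
```

The combinatorial approach in the paper replaces this brute-force sampling with precomputed invariants, which is what makes it efficient.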
Abstract: Expectations of network performance have changed
significantly since the early days of ARPANET. Every day, new
advances in technological infrastructure open the door to better
quality of service, and accordingly the perceived quality of network
services has increased over time. Nowadays, for many applications,
late information has no value or may even result in financial or
catastrophic loss; on the other hand, demands for some level of
guarantee in providing and maintaining quality of service are ever
increasing. Against this background, a QoS-aware routing system
which is able to provide today's required level of quality of service
in the network and to adapt effectively to future needs seems a key
requirement for the future Internet. In this work we have extended
the traditional AntNet routing system to support QoS with multiple
metrics, such as bandwidth and delay; the extended system is named
Q-Net. This novel scalable QoS routing system aims to provide
different types of services in the network simultaneously. Each type
of service can be provided for a period of time in the network, and
network nodes do not need any previous knowledge about it. When a
type of quality of service is requested, Q-Net allocates the required
resources for the service and guarantees its QoS requirements, based
on target objectives.
Abstract: This paper describes a possible use of virtualization
technology in teaching computer networks. Virtualization can be
used as a suitable tool for creating virtual network laboratories,
supplementing real laboratories and network simulation software in
teaching networking concepts. A short description is given of
characteristic projects in the area of virtualization technology usage
in network simulation, network experiments, and engineering
education. A method for implementing the laboratory is also
explained, together with possible laboratory usage and the design of
laboratory exercises. Finally, test results from the virtual laboratory
are presented.
Abstract: In this paper we present a GP-based method for automatically evolving projections, so that data can be more easily classified in the projected spaces. At the same time, our approach can reduce dimensionality by constructing more relevant attributes. The fitness of each projection measures how easy it is to classify the dataset after applying the projection; this is quickly computed by a simple linear perceptron. We have tested our approach in three domains. The experiments show that it obtains good results, compared to other machine learning approaches, while reducing dimensionality in many cases.
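The fitness idea, scoring a candidate projection by how well a simple linear perceptron classifies the projected data, can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' GP implementation, and the toy dataset is invented.

```python
def project(X, w):
    """Apply a 1-D linear projection (one candidate 'individual')."""
    return [sum(wi * xi for wi, xi in zip(w, x)) for x in X]

def perceptron_fitness(z, y, epochs=20):
    """Fitness of a projection: training accuracy of a simple linear
    perceptron on the projected data (higher is better)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for zi, yi in zip(z, y):
            pred = 1 if w * zi + b > 0 else -1
            if pred != yi:            # classic perceptron update
                w += yi * zi
                b += yi
    correct = sum(1 for zi, yi in zip(z, y)
                  if (1 if w * zi + b > 0 else -1) == yi)
    return correct / len(y)

# Toy 2-D data, separable along the first axis only
X = [(0.5, 3.0), (1.0, -2.0), (1.5, 0.5),
     (-0.5, 2.0), (-1.0, -1.0), (-1.5, 0.0)]
y = [1, 1, 1, -1, -1, -1]

good = perceptron_fitness(project(X, (1.0, 0.0)), y)  # projects onto x1
bad  = perceptron_fitness(project(X, (0.0, 1.0)), y)  # projects onto x2
```

A GP search over projection expressions would simply prefer individuals like the first, whose projected space is linearly separable.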
Abstract: This paper presents an efficient approach to feeder
reconfiguration for power loss reduction and voltage profile
improvement in unbalanced radial distribution systems (URDS). A
Genetic Algorithm (GA) is used to obtain solutions for the
reconfiguration of radial distribution systems so as to minimize the
losses. A forward and backward algorithm is used to calculate load
flows in unbalanced distribution systems. By simulating the survival
of the fittest among strings, the optimum string is searched for
through randomized information exchange between strings,
performed by crossover and mutation. Results show that the proposed
algorithm has advantages over previous algorithms. The proposed
method is effectively tested on 19-node and 25-node unbalanced
radial distribution systems.
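The GA loop described above, selection of the fittest strings plus randomized information exchange via crossover and mutation, can be sketched generically. Here a ones-counting toy fitness stands in for the negative power loss that the load-flow solver would supply; all parameter values are illustrative assumptions.

```python
import random

def evolve(fitness, n_bits=16, pop_size=30, generations=60,
           p_cross=0.9, p_mut=0.02, seed=7):
    """Minimal generational GA over bit strings: tournament selection,
    one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]

    def tournament():
        a, b = rng.choice(pop), rng.choice(pop)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament()[:], tournament()[:]
            if rng.random() < p_cross:          # one-point crossover
                cut = rng.randrange(1, n_bits)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                for i in range(n_bits):         # bit-flip mutation
                    if rng.random() < p_mut:
                        child[i] ^= 1
                nxt.append(child)
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

# Stand-in fitness: number of ones. In the paper's setting the string
# would encode switch states and fitness would come from the
# forward/backward load-flow computation of power loss.
best = evolve(lambda s: sum(s))
```

For feeder reconfiguration the only substantive change is the fitness function, plus a feasibility check that the encoded switch configuration keeps the network radial.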
Abstract: Load balancing in distributed computer systems is the
process of redistributing the workload among processors in the
system to improve system performance. Most previous research on
using fuzzy logic for load balancing has concentrated only on
applying fuzzy logic concepts to describe processor load and task
execution length. The responsibility for the fuzzy-based load
balancing process itself, however, has not been discussed, and in
most reported work it is assumed to be performed in a distributed
fashion by all nodes in the network. This paper proposes a new fuzzy
dynamic load balancing algorithm for homogeneous distributed
systems. The proposed algorithm utilizes fuzzy logic in dealing with
inaccurate load information, making load distribution decisions, and
maintaining overall system stability. In terms of control, we propose
a new approach that specifies how, when, and by which node load
balancing is performed. Our approach is called
Centralized-But-Distributed (CBD).
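The fuzzification step for inaccurate load information might, for example, use overlapping triangular membership functions. The class names and breakpoints below are illustrative assumptions, not the paper's actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function rising over [a, b] and falling
    over [b, c]; zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def load_memberships(utilization):
    """Fuzzify a node's utilization (0..1) into overlapping load
    classes. Downstream balancing rules would reason with these
    membership degrees rather than crisp thresholds."""
    return {
        "light":    tri(utilization, -0.4, 0.0, 0.5),
        "moderate": tri(utilization, 0.2, 0.5, 0.8),
        "heavy":    tri(utilization, 0.5, 1.0, 1.4),
    }

m = load_memberships(0.6)  # partly "moderate", partly "heavy"
```

Because a 0.6-utilized node belongs partly to two classes, small measurement errors shift degrees gradually instead of flipping a crisp decision, which is exactly the robustness to inaccurate load information that fuzzy approaches target.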
Abstract: This article deals with the popularity of candidates for the presidency of the United States of America, assessed according to public comments on the Web 2.0. Social networking, blogging, and online forums (collectively, Web 2.0) are the easiest way for ordinary Internet users to share their personal opinions, thoughts, and ideas with the entire world. However, the diversity of web content, the variety of technologies, and differences in website structure make the Web 2.0 a network of heterogeneous data in which things are difficult for ordinary users to find. The introductory part of the article describes a methodology for gathering and processing data from the Web 2.0. The next part focuses on the evaluation and content analysis of the obtained information concerning the presidential candidates.
Abstract: For the future Broadband ISDN, Asynchronous Transfer
Mode (ATM) is designed not only to support a wide range of traffic
classes with diverse flow characteristics, but also to guarantee
different quality of service (QoS) requirements. QoS may be
measured in terms of cell loss probability and maximum cell delay.
In this paper, ATM networks implementing the virtual path (VP)
concept are considered. By applying the Markov deterministic
process method, an efficient algorithm is derived to compute the
minimum capacity required to satisfy the QoS requirements when
multiple classes of on-off sources are multiplexed onto a single VP.
Using this result, we then propose a simple algorithm to determine
different combinations of VPs that achieve the optimum total
capacity required to satisfy the individual QoS (loss and delay)
requirements.
Abstract: Early and accurate detection of Alzheimer's disease (AD) is an important stage in the treatment of individuals suffering from AD. We present an approach based on the use of structural magnetic resonance imaging (sMRI) phase images to distinguish between normal controls (NC), mild cognitive impairment (MCI), and AD patients with a clinical dementia rating (CDR) of 1. The independent component analysis (ICA) technique is used to extract useful features, which form the inputs to support vector machine (SVM), K-nearest-neighbour (kNN), and multilayer artificial neural network (ANN) classifiers that discriminate between the three classes. The obtained results are encouraging in terms of classification accuracy and effectively ascertain the usefulness of phase images for the classification of different stages of Alzheimer's disease.
Abstract: The growth of open networks has created commercial
interest in their use. The establishment of an electronic business
mechanism must be accompanied by a digital (electronic) payment
system to transfer the value of transactions. Financial organizations
are requested to offer a secure e-payment synthesis with a level of
security equivalent to that of conventional paper-based payment
transactions. PKI, which functions as a chain of trust in a security
architecture, can bring the security services of cryptography to
e-payments, in order to take advantage of a wider base of customers
and trading partners and of the reduction in transaction cost
achieved by the use of Internet channels. The paper addresses the
possibilities and implementation suggestions of PKI in relation to
electronic payments by proposing a framework to be followed.
Abstract: One of the major parts of a jet engine is the air intake,
which provides the proper and required amount of air for the engine
to operate. Several aerodynamic parameters should be considered in
its design, such as distortion, pressure recovery, etc. In this research,
the effects of lip ice accretion on pitot intake performance are
investigated. For the ice accretion phenomenon, two supervised
multilayer neural networks (ANNs) are designed, one for ice shape
prediction and another for ice roughness estimation, based on
experimental data. The Fourier coefficients of the transformed ice
shape, together with parameters including velocity, liquid water
content (LWC), median volumetric diameter (MVD), spray time, and
temperature, are used in neural network training. The subsonic
intake flow field is then simulated numerically using the 2D
Navier-Stokes equations and a finite-volume approach with a hybrid
mesh comprising structured and unstructured regions. Results are
obtained at different angles of attack, and the variations of the
intake aerodynamic parameters due to the icing phenomenon are
discussed. The results show noticeable effects of the ice accretion
phenomenon on intake behavior.
Abstract: This paper describes the use of artificial neural
networks (ANNs) for predicting the non-linear layer moduli of
flexible airfield pavements subjected to new generation aircraft
(NGA) loading, based on deflection profiles obtained from Heavy
Weight Deflectometer (HWD) test data. The HWD test is one of the
most widely used tests for routinely assessing the structural integrity
of airport pavements in a non-destructive manner. The elastic moduli
of the individual pavement layers backcalculated from the HWD
deflection profiles are effective indicators of layer condition and are
used for estimating the remaining pavement life. HWD tests were
periodically conducted at the Federal Aviation Administration's
(FAA's) National Airport Pavement Test Facility (NAPTF) to
monitor the effect of Boeing 777 (B777) and Boeing 747 (B747) test
gear trafficking on the structural condition of flexible pavement
sections. In this study, a multi-layer, feed-forward network using an
error-backpropagation algorithm was trained to approximate the
HWD backcalculation function. A synthetic database generated with
an advanced non-linear pavement finite-element program was used
to train the ANN, overcoming the limitations associated with
conventional pavement moduli backcalculation. The changes in
ANN-based backcalculated pavement moduli with trafficking were
used to compare the relative severity effects of the aircraft landing
gears on the NAPTF test pavements.
Abstract: For a spatiotemporal database management system,
the I/O cost of queries and other operations is an important
performance criterion. In order to optimize this cost, intense
research on designing robust index structures has been carried out
in the past decade. Beyond these major considerations, there are
still other design issues that deserve attention due to their direct
impact on I/O cost; in particular, an efficient buffer management
strategy plays a key role in reducing redundant disk accesses. In
this paper, we propose an efficient buffer strategy for a
spatiotemporal database index structure, specifically one indexing
objects moving over a road network. The proposed strategy, named
MONPAR, is based on the data type (i.e., spatiotemporal data) and
the structure of the index. For an experimental evaluation, we set
up a simulation environment that counts the number of disk
accesses while executing a number of spatiotemporal range queries
over the index. We repeated the simulations with query sets of
different distributions, such as uniform and skewed query
distributions. Based on a comparison of our strategy with
well-known page-replacement techniques, such as LRU-based and
priority-based buffers, we conclude that MONPAR behaves better
than its competitors for small and medium-size buffers under all
tested query distributions.
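The LRU baseline against which such a strategy is compared can be sketched directly, with a disk-access counter mirroring the metric used in the simulations. This is a generic textbook LRU buffer, not the paper's code.

```python
from collections import OrderedDict

class LRUBuffer:
    """LRU page buffer: on a hit the page moves to the most-recently-
    used end; on a miss the least-recently-used page is evicted and a
    disk access is counted (the evaluation metric in the simulations)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()   # keys ordered from LRU to MRU
        self.disk_accesses = 0

    def access(self, page_id):
        if page_id in self.pages:
            self.pages.move_to_end(page_id)      # hit: refresh recency
        else:
            self.disk_accesses += 1              # miss: fetch from disk
            if len(self.pages) >= self.capacity:
                self.pages.popitem(last=False)   # evict LRU page
            self.pages[page_id] = True

buf = LRUBuffer(capacity=3)
for page in [1, 2, 3, 1, 4, 2]:   # illustrative page-request trace
    buf.access(page)
# buf.disk_accesses is 5: four cold misses plus one capacity miss
```

A specialized strategy such as MONPAR would replace the pure recency rule with eviction decisions informed by the spatiotemporal index structure, aiming to lower this counter for the same query trace.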
Abstract: This paper presents an IP traffic analysis. The traffic
was collected from the network of Suranaree University of
Technology using software based on the Simple Network
Management Protocol (SNMP). In particular, we analyze the
distribution of the aggregated traffic during the hours of peak load
and light load. Traffic profiles, including the parameters describing
the traffic distributions, were derived. From a statistical analysis
applying three different methods, the Kolmogorov-Smirnov test, the
Anderson-Darling test, and the Chi-squared test, we found that the
IP traffic distribution is non-normal and that the distributions during
peak load and light load differ. The experimental study and analysis
show the high uncertainty of IP traffic.
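The Kolmogorov-Smirnov comparison mentioned above can be illustrated with a self-contained sketch of the one-sample KS statistic. Here an exponential sample stands in for clearly non-normal traffic; the data are synthetic, not the measured traffic.

```python
import math
import random

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of the normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic: the largest vertical
    gap between the empirical CDF and the hypothesized CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        fx = cdf(x)
        # check the gap just before and just after each sample point
        d = max(d, abs((i + 1) / n - fx), abs(i / n - fx))
    return d

rng = random.Random(0)
gauss = [rng.gauss(0, 1) for _ in range(2000)]        # normal data
heavy = [rng.expovariate(1.0) for _ in range(2000)]   # non-normal stand-in

d_gauss = ks_statistic(gauss, normal_cdf)   # small: consistent with N(0,1)
d_heavy = ks_statistic(heavy, normal_cdf)   # large: clearly non-normal
```

Comparing the statistic against the KS critical value (about 1.36/sqrt(n) at the 5% level) is what turns these distances into the accept/reject decisions reported in the paper.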
Abstract: This paper presents an innovative computer system
that contributes to the sustainable development of the enterprise. The
research involves a rethinking of the traditional systems of
collaboration and risk assessment, present in any organization,
leading to a sustainable enterprise. This concept integrates emerging
tools that allow the implementation and exploitation of the collective
intelligence of the enterprise, allowing the exchange of contextual,
agile, and simplified information, and collaboration with networks of
customers and partners in an environment where risks are controlled.
Risk assessment is done in a systemic way: the enterprise as a
system compared with its departments, and the enterprise as a
subsystem compared with families of international standards and
sustainability responsibilities. In this systemic vision, the enterprise
responds to the requirement that any system must operate
continuously into an indefinite future without reaching key resource
depletion. The research integrates collaborative science, engineering,
management, and psychology, thus obtaining a cornerstone of the
sustainable development of the enterprise.
Abstract: The paper presents the multi-element synthetic
transmit aperture (MSTA) method, in which a small number of
elements transmit and all elements receive, for medical ultrasound
imaging. Compared to other methods, MSTA increases the system
frame rate and provides the best compromise between penetration
depth and lateral resolution.
In the experiments, a 128-element linear transducer array with
0.3 mm pitch, excited by a burst pulse of 125 ns duration, was used.
A comparison of 2D ultrasound images of a tissue-mimicking
phantom obtained using the STA and MSTA methods is presented
to demonstrate the benefits of the latter approach. The results were
obtained using a synthetic aperture (SA) algorithm with transmit and
receive signal correction based on a single-element directivity
function.