Abstract: Solving the facility location problem simultaneously for every location in a country is computationally demanding. This paper describes how the problem can be solved using cluster computing: a parallel algorithm based on local search with the single-swap method is designed to solve the problem on clusters. The parallel implementation uses the portable Message Passing Interface (MPI) on a Microsoft Windows Compute Cluster. This paper presents the algorithm, which uses local search with single swap, and the implementation of the system that decides which facilities to open, using MPI on a cluster. For large datasets, the process of calculating a reasonable cost for a facility becomes time consuming. The results show that the parallel computation of the facility location problem on a cluster achieves good speedup and scales well as the problem size increases.
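The single-swap local search at the core of the approach can be sketched sequentially as follows; the data model (distance table, opening costs) and the first-improvement acceptance rule are illustrative assumptions, not the paper's MPI implementation:

```python
# Sequential sketch of local search with single swap for facility location.
# Cost = opening costs of the open facilities + each client's distance to
# its nearest open facility (an illustrative cost model).

def total_cost(open_set, clients, dist, open_cost):
    cost = sum(open_cost[f] for f in open_set)
    for c in clients:
        cost += min(dist[c][f] for f in open_set)
    return cost

def single_swap_search(clients, facilities, dist, open_cost, initial):
    open_set = set(initial)
    best = total_cost(open_set, clients, dist, open_cost)
    improved = True
    while improved:
        improved = False
        for f_out in list(open_set):          # try closing one facility...
            for f_in in facilities:           # ...and opening another
                if f_in in open_set:
                    continue
                candidate = (open_set - {f_out}) | {f_in}
                cost = total_cost(candidate, clients, dist, open_cost)
                if cost < best:               # accept the first improving swap
                    open_set, best = candidate, cost
                    improved = True
                    break
            if improved:
                break
    return open_set, best
```

In the parallel version described in the abstract, the evaluation of candidate swaps would be distributed across MPI processes; the sketch above shows only the sequential search logic.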
Abstract: In recent years, with the rapid development of the Internet and the Web, more and more web applications have been deployed in many fields and organizations, such as finance, the military, and government. At the same time, hackers have found ever more subtle ways to attack web applications. According to international statistics, SQL Injection is one of the most common vulnerabilities of web applications. The consequences of this type of attack are quite dangerous: sensitive information can be stolen or authentication systems can be bypassed. To mitigate the situation, several techniques have been adopted. In this research, a security solution is proposed that uses an Artificial Neural Network to protect web applications against this type of attack. The solution has been tested on sample datasets and has given promising results. It has also been developed into a prototype web application firewall called ANNbWAF.
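As an illustration of the classification idea only (not the paper's ANNbWAF model), the following sketch trains a single perceptron, a minimal stand-in for an ANN, on hand-picked lexical features of query strings; the feature list and training data are assumptions:

```python
# Toy sketch: classify query strings as injection-like or benign from a few
# lexical features, using one perceptron as a simplified stand-in for an ANN.

SUSPICIOUS_TOKENS = ["'", "--", " or ", " union ", "1=1", ";"]

def features(query):
    q = query.lower()
    return [q.count(tok) for tok in SUSPICIOUS_TOKENS] + [1.0]  # bias term

def train_perceptron(samples, epochs=50, lr=0.1):
    w = [0.0] * (len(SUSPICIOUS_TOKENS) + 1)
    for _ in range(epochs):
        for query, label in samples:        # label: 1 = injection, 0 = benign
            x = features(query)
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            err = label - pred              # perceptron learning rule
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    return w

def predict(w, query):
    x = features(query)
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
```

A real web application firewall would of course use far richer features and a multi-layer network; this only shows the supervised-learning shape of the approach.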
Abstract: Information Retrieval has the objective of studying models and building systems that allow a user to find the documents relevant to his or her information need. Information search remains a difficult problem because of the difficulty of representing and processing natural language phenomena such as polysemy. Intentional structures promise to be a new paradigm for extending existing document structures and enhancing the different phases of document processing, such as creation, editing, search, and retrieval. Recognizing the intentions of the authors of texts can reduce the scale of this problem. In this article, we present an intention recognition system based on a semi-automatic method for extracting intentional information from a corpus of texts. This system is also able to update the ontology of intentions to enrich the knowledge base containing all possible intentions of a domain. This approach relies on the construction of a semi-formal ontology, which is considered the conceptualization of the intentional information contained in a text. An experiment on scientific publications in the field of computer science was conducted to validate this approach.
Abstract: In this paper, we present a novel objective nonreference
performance assessment algorithm for image fusion. It takes
into account local measurements to estimate how well the important
information in the source images is represented by the fused image.
The metric is based on the Universal Image Quality Index and uses
the similarity between blocks of pixels in the input images and the
fused image as the weighting factors for the metric. Experimental
results confirm that the values of the proposed metric correlate well
with the subjective quality of the fused images, giving a significant
improvement over standard measures based on mean squared error
and mutual information.
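The Universal Image Quality Index underlying the metric is, for two blocks x and y with means x̄, ȳ, variances σx², σy², and covariance σxy: Q = 4·σxy·x̄·ȳ / ((σx² + σy²)·(x̄² + ȳ²)). A minimal sketch over flattened pixel blocks (degenerate zero-variance or zero-mean blocks are not handled):

```python
def uiqi(x, y):
    """Universal Image Quality Index (Wang-Bovik) between two equal-size
    pixel blocks given as flat lists. Q lies in [-1, 1]; identical blocks
    score 1.0."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / (n - 1)      # sample variance of x
    vy = sum((b - my) ** 2 for b in y) / (n - 1)      # sample variance of y
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    return (4 * cov * mx * my) / ((vx + vy) * (mx ** 2 + my ** 2))
```

In the proposed fusion metric, such per-block Q values between each source image and the fused image would be combined with block-similarity weights; that weighting scheme is specific to the paper and not reproduced here.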
Abstract: Computer worm detection is commonly performed by
antivirus software tools that rely on prior explicit knowledge of the
worm's code (signature-based detection). We present an
approach for detection of the presence of computer worms based on
Artificial Neural Networks (ANN) using the computer's behavioral
measures. Identification of significant features, which describe the
activity of a worm within a host, is commonly acquired from security
experts. We suggest acquiring these features by applying feature
selection methods. We compare three different feature selection
techniques for dimensionality reduction and identification of the
most prominent features that efficiently capture computer behavior
in the context of worm activity. Additionally, we explore three
different temporal representation techniques for the most prominent
features. In order to evaluate the different techniques, several
computers were infected with five different worms and 323 different
features of the infected computers were measured. We evaluated
each technique by preprocessing the dataset according to each one
and training the ANN model with the preprocessed data. We then
evaluated the ability of the model to detect the presence of a new
computer worm, in particular, during heavy user activity on the
infected computers.
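As one illustrative example of a filter-style feature selection technique (an assumption; the abstract does not name the three techniques compared), features can be ranked by the absolute value of their correlation with the class label:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient; 0.0 for a constant column."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / ((vx * vy) ** 0.5) if vx > 0 and vy > 0 else 0.0

def top_k_features(samples, labels, k):
    """Rank features by |correlation with the label| and keep the k best.
    `samples` is a list of equal-length feature vectors."""
    n_features = len(samples[0])
    scores = []
    for j in range(n_features):
        column = [s[j] for s in samples]
        scores.append((abs(pearson(column, labels)), j))
    scores.sort(reverse=True)
    return [j for _, j in scores[:k]]
```

Reducing 323 measured features to a small top-k subset in this spirit is what makes training the ANN detector tractable.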
Abstract: With the widespread growth of applications of
Wireless Sensor Networks (WSNs), the need for reliable security
mechanisms in these networks has increased manifold. Many security
solutions have been proposed in the domain of WSN so far. These
solutions are usually based on well-known cryptographic
algorithms.
In this paper, we have made an effort to survey well known
security issues in WSNs and study the behavior of WSN nodes that
perform public key cryptographic operations. We evaluate, by
simulation, the time and power consumption of public key
cryptography algorithms for signatures and key management.
Abstract: In this paper we present an extension to Vision Based
LRTA* (VLRTA*) known as Vision Based Moving Target Search
(VMTS) for capturing unknown moving target in unknown territory
with randomly generated obstacles. Target position is unknown to the
agents and they cannot predict its position using any probability
method. Agents have omnidirectional vision but can see in only one
direction at any point in time. An agent's vision is blocked by
obstacles in the search space, so agents cannot see through
obstacles. The proposed algorithm is evaluated on a large number of
scenarios. Scenarios include grids of sizes from 10x10 to 100x100.
Grids had obstacles randomly placed, occupying 0% to 50%, in
increments of 10%, of the search space. Experiments used 2 to 9
agents for each randomly generated maze with same obstacle ratio.
The observed results suggest that VMTS is effective with respect to
target-locating time, solution quality, and virtual targets. In addition,
VMTS becomes more efficient if the number of agents is increased in
proportion to the obstacle ratio.
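VMTS extends Vision Based LRTA*; the heuristic-learning step that LRTA*-style algorithms share can be sketched as follows, with a dictionary heuristic, unit move costs, and a caller-supplied neighbor function as illustrative assumptions:

```python
def lrta_star_step(pos, h, neighbors_fn, cost=1):
    """One LRTA*-style step: update the heuristic estimate h(pos) from the
    neighbors' estimates, then return the most promising neighbor to move to.
    `h` maps positions to heuristic values and is updated in place."""
    options = [(cost + h.get(n, 0), n) for n in neighbors_fn(pos)]
    best_f, best_n = min(options)
    h[pos] = max(h.get(pos, 0), best_f)   # learning rule: raise the estimate
    return best_n
```

In the multi-agent, limited-vision setting of VMTS, `neighbors_fn` would expose only cells the agent can currently see, and agents share what they learn; those extensions are not shown here.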
Abstract: Many metrics were proposed to evaluate the
characteristics of the analysis and design model of a given product
which in turn help to assess the quality of the product. The function
point metric is a measure of the 'functionality' delivered by the software.
This paper presents an analysis of a set of programs of a project
developed in Cµ through Function Points metric. Function points
are measured for a Data Flow Diagram (DFD) of the case developed
at an initial stage. Lines of Code (LOC) and possible errors are
calculated with the help of the measured Function Points (FPs). The
calculations are performed using suitable established functions.
The calculated LOC and errors are compared with the actual LOC and
errors found during analysis and design review, implementation,
and testing. It has been observed that the errors actually found exceed
the calculated errors. On the basis of this analysis and these observations,
the authors conclude that function points provide useful insight and help
to analyze the drawbacks in the development process.
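The FP-to-LOC and FP-to-error arithmetic can be sketched as below; the conversion factors are common published ballpark figures (e.g., roughly 128 LOC per FP for C), not the "suitable established functions" used in the paper, and the defect rate is a placeholder parameter:

```python
# Illustrative backfiring arithmetic: estimate LOC and defects from Function
# Points. Factors here are ballpark industry figures, not the paper's values.

LOC_PER_FP = {"C": 128, "C++": 64, "Java": 53}

def estimate_loc(fp, language):
    """Backfire a Function Point count into an approximate LOC figure."""
    return fp * LOC_PER_FP[language]

def estimate_defects(fp, defects_per_fp=1.2):
    """Estimate the number of potential errors from the FP count.
    `defects_per_fp` is an assumed, project-dependent rate."""
    return fp * defects_per_fp
```

For example, a 50-FP design would backfire to about 6,400 LOC of C under these assumed factors; the paper's observation is that errors actually found tend to exceed such estimates.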
Abstract: The Cluster Dimension of a network is defined as the minimum cardinality of a subset S of the set of nodes having the property that for any two distinct nodes x and y, there exist nodes s1, s2 (need not be distinct) in S such that |d(x, s1) - d(y, s1)| >= 1 and d(x, s2) <= d(x, s) for all s in S - {s2}. In this paper, strictly non-overlapping clusters are constructed. The concept of the LandMarks for Unique Addressing and Clustering (LMUAC) routing scheme is developed. With the help of the LMUAC routing scheme, it is shown that the path length satisfies the upper bound PL_(N,d) <= PL_D, the maximum memory space requirement for the network satisfies MS_LMUAC(λ) <= MS_LMUAC <= MS_H3L <= MS_HRC, and the maximum link utilization factor satisfies ML_LMUAC(λ=3) <= ML_LMUAC(λ≠3) <= ML_C.
Abstract: As the world changes ever more rapidly, the demand for updated information for resource management, environment monitoring, and planning is increasing exponentially. Integrating Remote Sensing with GIS technology will significantly improve our ability to address these concerns. This paper presents an alternative way of updating GIS applications using image processing and high-resolution images. We show a method of high-resolution image segmentation using graphs and morphological operations, where a preprocessing step (a watershed operation) is required. A morphological process is then applied using the opening and closing operations. After this segmentation we can extract significant cartographic elements such as urban areas, streets, or green areas. The result of this segmentation and extraction is then used to update GIS applications. Some examples are shown using aerial photography.
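The opening and closing operations used after the watershed step can be illustrated on a binary 0/1 grid with a 3x3 square structuring element; this minimal sketch (out-of-bounds neighbors are simply ignored, so shapes do not shrink at the image border) is not the paper's full segmentation pipeline:

```python
# Minimal binary morphology on a 0/1 grid with a 3x3 structuring element.
# Boundary convention: out-of-bounds neighbors are ignored.

def _neighbors(img, i, j):
    h, w = len(img), len(img[0])
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w:
                yield img[ni][nj]

def erode(img):
    return [[1 if all(v == 1 for v in _neighbors(img, i, j)) else 0
             for j in range(len(img[0]))] for i in range(len(img))]

def dilate(img):
    return [[1 if any(v == 1 for v in _neighbors(img, i, j)) else 0
             for j in range(len(img[0]))] for i in range(len(img))]

def opening(img):   # erosion then dilation: removes small foreground specks
    return dilate(erode(img))

def closing(img):   # dilation then erosion: fills small holes
    return erode(dilate(img))
```

In the segmentation described above, opening suppresses small spurious regions left by the watershed, while closing fills small gaps inside extracted elements such as urban areas or streets.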
Abstract: A new and highly efficient architecture for elliptic curve scalar point multiplication which is optimized for a binary field recommended by NIST and is well-suited for elliptic curve cryptographic (ECC) applications is presented. To achieve the maximum architectural and timing improvements we have reorganized and reordered the critical path of the Lopez-Dahab scalar point multiplication architecture such that logic structures are implemented in parallel and operations in the critical path are diverted to noncritical paths. With G=41, the proposed design is capable of performing a field multiplication over the extension field with degree 163 in 11.92 s with the maximum achievable frequency of 251 MHz on Xilinx Virtex-4 (XC4VLX200) while 22% of the chip area is occupied, where G is the digit size of the underlying digit-serial finite field multiplier.
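The scalar-multiplication loop that the architecture accelerates can be illustrated with a generic left-to-right double-and-add over a toy prime-field curve; note this is only the loop structure, since the Lopez-Dahab method in the paper works over the binary field GF(2^163) with projective coordinates:

```python
# Generic double-and-add scalar point multiplication on the toy curve
# y^2 = x^3 + A*x + B (mod P). Parameters are illustrative assumptions.

P, A, B = 97, 2, 3
O = None                      # point at infinity (group identity)

def point_add(p1, p2):
    """Affine group law covering identity, inverse, doubling, and addition."""
    if p1 is O:
        return p2
    if p2 is O:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return O
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def scalar_mult(k, point):
    result = O
    for bit in bin(k)[2:]:                     # scan bits MSB -> LSB
        result = point_add(result, result)     # double every iteration
        if bit == "1":
            result = point_add(result, point)  # conditional add
    return result
```

The paper's speedup comes from parallelizing and reordering the field operations inside each double/add of exactly this kind of loop.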
Abstract: This work develops a novel intelligent "model of dynamic decision-making" using a cell assembly network architecture for robot movement. The "model of dynamic decision-making" simulates human decision-making and follows commands to make correct decisions. The cell assembly approach, consisting of fLIF neurons, was used to implement the tasks of finding targets and avoiding obstacles. Experimental results show that the cell assembly approach can be employed to efficiently complete target-finding and obstacle-avoidance tasks and can simulate human thinking and the mode of information transactions.
Abstract: Recently, content delivery services have grown rapidly
over the Internet. For ASPs (Application Service Provider) providing
content delivery services, P2P architecture is beneficial to reduce
outgoing traffic from content servers. On the other hand, ISPs are
suffering from the increase in P2P traffic. The P2P traffic is
unnecessarily redundant because the same content or the same
fractions of content are transferred through an inter-ISP link several
times. Subscriber ISPs have to pay a transit fee to upstream ISPs based
on the volume of inter-ISP traffic. In order to solve such problems,
several works have been done for the purpose of P2P traffic reduction.
However, these existing works cannot control the traffic volume of a
certain link. In order to meet such an ISP's operational requirement,
we propose a method to keep the traffic volume of a link within a
preconfigured upper bound. We evaluated the proposed method by
conducting a simulation on a 1,000-user scale and confirmed that the
traffic volume could be kept below the upper bound under all evaluated
conditions. Moreover, our method could control the traffic volume to
98.95% link usage against the target value.
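The core idea of bounding inter-ISP traffic can be sketched as a peer selector that serves a request from an external (inter-ISP) peer only while the metered link remains under its configured cap, and otherwise falls back to an internal peer; the names and the byte-accounting model are illustrative assumptions, not the paper's mechanism:

```python
# Sketch: cap inter-ISP P2P traffic by accounting bytes on the metered link
# and falling back to intra-ISP peers once the cap would be exceeded.

class CappedLink:
    def __init__(self, upper_bound_bytes):
        self.upper_bound = upper_bound_bytes
        self.used = 0

    def can_carry(self, nbytes):
        return self.used + nbytes <= self.upper_bound

    def account(self, nbytes):
        self.used += nbytes

def choose_peer(internal_peers, external_peers, link, chunk_bytes):
    """Use an external peer only while the inter-ISP link stays under its cap
    (a modeling choice for this sketch); otherwise use an internal peer."""
    if external_peers and link.can_carry(chunk_bytes):
        link.account(chunk_bytes)
        return external_peers[0]
    return internal_peers[0] if internal_peers else None
```

This captures why the method can guarantee a hard upper bound per link, which pure traffic-reduction schemes cannot.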
Abstract: Jayanti's algorithm is one of the best known abortable mutual exclusion algorithms. This work is an attempt to overcome an already known limitation of the algorithm while preserving all its important properties and elegance. The limitation is that the token number used to assign a process identification number to each new incoming process is unbounded. We have used a suitably adapted alternative data structure in order to completely eliminate the use of the token number from the algorithm.
Abstract: The present paper addresses the classification and application of agent techniques in the simulation of anticipatory systems, namely those that use simulation models to aid anticipation. The main ideas are rooted in the facts that the best way to describe a computer simulation model is the technique used to describe the simulated system itself (with the translation into computer code performed automatically), and that anticipation itself is often nested.
Abstract: The localization of software products is essential for reaching the users of the international market. An important task in this is the translation of the user interface into the local national languages. As graphical interfaces are usually optimized for the size of the texts in the original language, after translation certain user controls (e.g. text labels and buttons in dialogs) may grow in such a way that they overlap one another. This not only causes an unpleasant appearance but also makes the use of the program more difficult (or even impossible), which means that the arrangement of the controls must be corrected afterwards. The correction should preserve the original structure of the interface (e.g. the relation of logically coherent controls); furthermore, it is important to keep the nicely proportioned design: the formation of large empty areas should be avoided. This paper describes an algorithm that automatically rearranges the controls of a graphical user interface based on the principles above. The algorithm has been implemented and integrated into a translation support system and produced results pleasant to the human eye in most test cases.
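One elementary step of such a rearrangement, resolving overlaps within a single row while preserving control order, can be sketched as follows; the data model and minimum gap are illustrative assumptions, and the paper's algorithm additionally handles structure preservation and empty-area balancing:

```python
# Sketch: push controls in a row to the right just enough to remove overlaps
# introduced by longer translated texts, preserving their left-to-right order.

def resolve_row_overlaps(controls, min_gap=4):
    """controls: list of dicts with 'x' and 'width', assumed sorted by x.
    Returns new positions with at least `min_gap` pixels between controls."""
    placed = []
    cursor = None                           # first free x coordinate
    for c in controls:
        x = c["x"]
        if cursor is not None and x < cursor:
            x = cursor                      # shift right to the next free spot
        placed.append({"x": x, "width": c["width"]})
        cursor = x + c["width"] + min_gap
    return placed
```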
Abstract: Databases have become ubiquitous. Almost all IT applications store information into and retrieve it from databases. Retrieving information from a database requires knowledge of technical languages such as Structured Query Language (SQL). However, the majority of users who interact with databases do not have a technical background and are intimidated by the idea of using languages such as SQL. This has led to the development of Natural Language Database Interfaces (NLDBIs). An NLDBI allows the user to query the database in a natural language. This paper presents the architecture of a new NLDBI system, describes its implementation, and discusses the results obtained. In most typical NLDBI systems, the natural language statement is converted into an internal representation based on the syntactic and semantic knowledge of the natural language. This representation is then converted into queries using a representation converter. A natural language query is thus translated into an equivalent SQL query after processing through various stages. The work has been tested on primitive database queries with certain constraints.
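The final mapping step of such a pipeline can be illustrated with a single toy rule for one restricted sentence pattern; real NLDBI systems, including the one described above, go through syntactic and semantic analysis and an internal representation rather than direct pattern matching:

```python
import re

# Toy sketch of the last stage of an NLDBI: mapping one restricted
# natural-language pattern onto an SQL template. Purely illustrative.

RULE = re.compile(r"show all (\w+) where (\w+) is (\w+)", re.IGNORECASE)

def to_sql(sentence):
    m = RULE.fullmatch(sentence.strip())
    if not m:
        return None                       # sentence outside the covered pattern
    table, column, value = m.groups()
    return f"SELECT * FROM {table} WHERE {column} = '{value}'"
```

Even this tiny example shows why constraints on the supported query forms are needed: anything outside the covered patterns cannot be translated.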
Abstract: Over the last few years, providing Services at Home has
become a very dynamic and promising technological domain. It is
likely to enable the wide dissemination of secure and automated living
environments. We propose a methodology for identifying threats to
Services at Home Delivery systems, as well as a threat analysis
of a multi-provider Home Gateway architecture. This methodology
is based on a dichotomous positive/preventive study of the target
system: it aims at identifying both what the system must do, and
what it must not do. This approach completes existing methods with
a synthetic view of potential security flaws, thus enabling suitable
measures to be taken into account. Security implications of the
evolution of a given system become easier to deal with. A prototype
is built based on the conclusions of this analysis.
Abstract: Congestion control is one of the fundamental issues in computer networks. Without proper congestion control mechanisms, resources may be utilized inefficiently, ultimately leading to network collapse. Hence congestion control is an effort to adapt the performance of a network to changes in the traffic load without adversely affecting users' perceived utilities. AIMD (Additive Increase Multiplicative Decrease) is the best algorithm among the set of linear algorithms because it achieves good efficiency as well as good fairness. Our control model is based on the assumptions of the original AIMD algorithm; we show that both the efficiency and the fairness of AIMD can be improved. We call our approach New AIMD. We present experimental results with TCP that match the expectations of our theoretical analysis.
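The classic AIMD dynamics that the paper builds on can be sketched with a toy round-based simulation: each round every flow adds alpha to its window, and whenever the aggregate exceeds the link capacity every flow multiplies its window by beta. The parameters and the synchronized-loss model are illustrative assumptions:

```python
# Toy AIMD simulation: flows sharing one link converge toward fair shares
# because additive increase preserves window differences while multiplicative
# decrease shrinks them.

def aimd(w, capacity=100.0, rounds=200, alpha=1.0, beta=0.5):
    w = list(w)
    for _ in range(rounds):
        if sum(w) > capacity:
            w = [wi * beta for wi in w]       # multiplicative decrease
        else:
            w = [wi + alpha for wi in w]      # additive increase
    return w
```

Starting two flows from very unequal windows, the gap between them halves at every congestion event, which is the fairness property the linear-algorithm comparison refers to.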
Abstract: The current speech interfaces in many military
applications may be adequate for native speakers. However,
the recognition rate drops considerably for non-native speakers
(people with foreign accents). This is mainly because non-native
speakers exhibit large temporal and intra-phoneme
variations when they pronounce the same words. This
problem is also complicated by the presence of large
environmental noise such as tank noise, helicopter noise, etc.
In this paper, we propose a novel continuous acoustic feature
adaptation algorithm for on-line accent and environmental
adaptation. Implemented by incremental singular value
decomposition (SVD), the algorithm captures local acoustic
variation and runs in real-time. This feature-based adaptation
method is then integrated with the conventional model-based
maximum likelihood linear regression (MLLR) algorithm.
Extensive experiments have been performed on the NATO
non-native speech corpus with baseline acoustic model trained
on native American English. The proposed feature-based
adaptation algorithm improved the average recognition
accuracy by 15%, while the MLLR model based adaptation
achieved an 11% improvement. The corresponding word error
rate (WER) reductions were 25.8% and 2.73%, respectively,
compared to the system without adaptation. The combined
adaptation achieved an overall recognition accuracy improvement
of 29.5% and a WER reduction of 31.8% compared to the system
without adaptation.
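The paper tracks local acoustic variation online with an incremental SVD; as a much simpler illustration of the idea of online feature-space adaptation (a crude stand-in, not the paper's algorithm), the sketch below keeps an exponentially decaying running mean of the feature stream and subtracts it from each incoming frame:

```python
# Crude online feature adaptation sketch: exponential running-mean
# normalization of a feature stream, frame by frame.

class OnlineMeanAdapter:
    def __init__(self, dim, decay=0.99):
        self.mean = [0.0] * dim
        self.decay = decay        # forgetting factor for non-stationary input

    def adapt(self, frame):
        """Update the running mean and return the mean-removed frame."""
        self.mean = [self.decay * m + (1 - self.decay) * x
                     for m, x in zip(self.mean, frame)]
        return [x - m for x, m in zip(frame, self.mean)]
```

Like the incremental-SVD method, this runs in real time over the feature stream; the SVD version additionally adapts the directions of variation rather than just the mean.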