Abstract: In this paper we present a novel approach to face image coding. The proposed method makes use of video-encoder features such as motion prediction. First, the encoder selects an appropriate prototype from the database and warps it according to the features of the face being encoded. The warped prototype is placed as the first frame (an I-frame), and the face being encoded is placed as the second frame (a P-frame). Information about feature positions, color change, the selected prototype, and the P-frame data stream is sent to the decoder; the precondition is that both encoder and decoder hold the same database of prototypes. We ran an experiment with an H.264 video encoder and compared the results with those achieved by JPEG and JPEG2000. The results show that our approach achieves a three times lower bitrate and a two times higher PSNR than JPEG. Compared with JPEG2000, the bitrate was very similar, but the subjective quality achieved by the proposed method is better.
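The PSNR figure quoted in the comparison above is the standard peak signal-to-noise ratio; a minimal sketch of how it is computed (pure Python, assuming 8-bit images flattened to lists; the pixel values here are purely illustrative):

```python
import math

def psnr(original, reconstructed, max_value=255):
    """Peak signal-to-noise ratio in dB between two equally sized images
    given as flat lists of pixel intensities."""
    mse = sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * math.log10(max_value ** 2 / mse)

# A tiny 4-pixel "image" reconstructed with small errors.
print(psnr([100, 120, 130, 140], [101, 119, 130, 142]))  # ≈ 46.37 dB
```

Higher PSNR means lower mean squared distortion, which is why it pairs naturally with bitrate in codec comparisons such as the one above.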
Abstract: Intrusion Detection Systems (IDS) play a significant role in network security. An IDS detects and identifies intrusion behavior or intrusion attempts in a computer system by monitoring and analyzing network packets in real time. In recent years, intelligent algorithms applied in intrusion detection systems have attracted increasing attention with the rapid growth of network security concerns. An IDS deals with a huge amount of data containing irrelevant and redundant features, which causes slow training and testing, higher resource consumption, and poor detection rates. Since the amount of audit data that an IDS needs to examine is very large even for a small network, classification by hand is impossible. Hence, the primary objective of this review is to survey the techniques applied to IDS data prior to the classification process.
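The pre-classification feature reduction discussed above can be illustrated with a minimal filter sketch. This is a generic variance-threshold filter, not a technique from the reviewed literature, and the record values are invented:

```python
def variance_filter(samples, threshold=0.0):
    """Drop near-constant (irrelevant) feature columns before classification.
    `samples` is a list of equal-length feature vectors; returns the indices
    of the columns whose variance exceeds `threshold`."""
    n = len(samples)
    keep = []
    for j in range(len(samples[0])):
        col = [row[j] for row in samples]
        mean = sum(col) / n
        var = sum((x - mean) ** 2 for x in col) / n
        if var > threshold:
            keep.append(j)
    return keep

# Column 1 is constant across all records, so it carries no information
# for the classifier and is dropped.
records = [[0.1, 7, 3], [0.9, 7, 5], [0.4, 7, 4]]
print(variance_filter(records))  # → [0, 2]
```

Removing such columns before training shrinks the input dimension, which is exactly what addresses the slow training and high resource consumption noted above.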
Abstract: Clustering is the process of subdividing an input data set into a desired number of subgroups so that members of the same subgroup are similar and members of different subgroups have diverse properties. Many heuristic algorithms have been applied to the clustering problem, which is known to be NP-hard. Genetic algorithms have been used in a wide variety of fields to perform clustering, although the technique typically has a long running time that grows with the input set size. This paper proposes an efficient genetic algorithm for clustering very large data sets, especially image data sets. The genetic algorithm combines time-efficient operators with preprocessing of the input data set. We test our algorithm on both artificial and real image data sets, both of large size. The experimental results show that our algorithm outperforms the k-means algorithm in terms of running time as well as clustering quality.
Abstract: Various overlay structures provide efficient and scalable solutions for point and range queries in a peer-to-peer network. The overlay structure based on an m-Binary Search Tree (BST) is one such popular technique. It divides the tree into different key intervals and then assigns the key intervals to a BST. The popularity of the BST makes this overlay structure vulnerable to different kinds of attacks. Here we present four such possible attacks, namely the index poisoning attack, eclipse attack, pollution attack, and SYN flooding attack, all of which affect the functionality of the BST. We also provide different security techniques that can be applied against these attacks.
Abstract: Software project effort estimation is frequently seen as complex and expensive for individual software engineers. Software production is in a crisis: it suffers from excessive costs and is often out of control. It has been suggested that software production is out of control because we do not measure; you cannot control what you cannot measure. During the last decade, a number of studies on cost estimation have been conducted. Metric-set selection has a vital role in software cost estimation studies, yet its importance has been ignored, especially in neural-network-based studies. In this study we explore the reasons for those disappointing results and implement different neural network models using an augmented set of new metrics. The results obtained are compared with previous studies that used traditional metrics. To enable comparison, two types of data have been used. The first part of the data is taken from the Constructive Cost Model (COCOMO'81), which is commonly used in previous studies, and the second part is collected according to the new metrics at a leading international company in Turkey. The accuracy of the selected metrics and the data samples is verified using statistical techniques. The model presented here is based on a Multi-Layer Perceptron (MLP). Another difficulty associated with cost estimation studies is that data collection requires time and care. To make more thorough use of the samples collected, the k-fold cross-validation method is also implemented. It is concluded that, as long as an accurate and quantifiable set of metrics is defined and measured correctly, neural networks can be applied successfully in software cost estimation studies.
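The k-fold cross-validation scheme mentioned above can be sketched as follows; the index-splitting logic is generic and not tied to the paper's data set:

```python
def k_fold_splits(n_samples, k):
    """Yield (train_indices, test_indices) pairs so that every sample is
    used for testing exactly once -- the scheme used to stretch a small
    cost data set further."""
    indices = list(range(n_samples))
    fold_size = n_samples // k
    for i in range(k):
        start = i * fold_size
        # the last fold absorbs any remainder
        stop = start + fold_size if i < k - 1 else n_samples
        test = indices[start:stop]
        train = indices[:start] + indices[stop:]
        yield train, test

# 10 samples, 5 folds: prints [0, 1], [2, 3], [4, 5], [6, 7], [8, 9]
for train, test in k_fold_splits(10, 5):
    print(test)
```

Each of the k models is trained on k-1 folds and evaluated on the held-out fold, so every scarce data sample contributes to both training and validation.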
Abstract: In the world of Peer-to-Peer (P2P) networking, different protocols have been developed to make resource sharing and information retrieval more efficient. The SemPeer protocol is a new layer on Gnutella that transforms the connections of the nodes based on semantic information to make information retrieval more efficient. However, this transformation causes high clustering in the network, which decreases the number of nodes reached and therefore the probability of finding a document. In this paper we describe a mathematical model for the Gnutella and SemPeer protocols that captures clustering-related issues, followed by a proposed modification of the SemPeer protocol to achieve moderate clustering. This modification is a form of link management for the individual nodes that makes the SemPeer protocol more efficient, because the probability of a successful query in the P2P network is appreciably increased. To validate the models, we ran a series of simulations, which supported our results.
Abstract: In this paper we present an offline system for the recognition of handwritten numeric chains. Our work is divided into two main parts. The first part is the realization of a recognition system for isolated handwritten digits. Here the study is based mainly on evaluating the performance of a neural network trained with the gradient back-propagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary images of the digits by several methods: the distribution sequence, the Barr features, and the centred moments of the different projections and profiles. The second part extends our system to the reading of handwritten numeric chains made up of a variable number of digits. The vertical projection is used to segment the numeric chain into isolated digits, and every digit (or segment) is presented separately to the input of the system developed in the first part (the recognition system for isolated handwritten digits). The recognition result for the numeric chain is displayed at the output of the global system.
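The vertical-projection segmentation step can be sketched as follows; the binary image below is a toy example, not data from the paper:

```python
def segment_digits(image):
    """Split a binary image (list of rows of 0/1 pixels) into isolated
    digits by cutting at columns whose vertical projection is zero."""
    width = len(image[0])
    projection = [sum(row[c] for row in image) for c in range(width)]
    segments, start = [], None
    for c, count in enumerate(projection):
        if count > 0 and start is None:
            start = c                      # a digit begins here
        elif count == 0 and start is not None:
            segments.append((start, c))    # digit ends at this blank column
            start = None
    if start is not None:
        segments.append((start, width))
    return segments

# Two "digits" separated by one blank column.
img = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 0, 1],
]
print(segment_digits(img))  # → [(0, 2), (3, 4)]
```

Each returned column range is then cropped out and fed to the isolated-digit recognizer, as the abstract describes.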
Abstract: In this paper, an ultrasonic technique is proposed to predict the oil content of fresh palm fruit. This is accomplished by measuring the attenuation in ultrasonic transmission mode. Several palm fruit samples whose oil content was known from Soxhlet extraction (ISO 9001:2008) were tested with our ultrasonic measurement, and amplitude attenuation data were collected for all samples. Feedforward Neural Networks (FNNs) are applied to predict the oil content of the samples. The Root Mean Square Error (RMSE) and Mean Absolute Error (MAE) of the FNN model for predicting the oil content percentage are 7.6186 and 5.2287, with a correlation coefficient (R) of 0.9193.
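The RMSE and MAE figures above are standard regression error metrics; a minimal sketch of both (the oil-content values here are invented, not the paper's measurements):

```python
import math

def rmse(actual, predicted):
    """Root mean square error: penalizes large deviations quadratically."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def mae(actual, predicted):
    """Mean absolute error: average magnitude of the prediction errors."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

oil_actual    = [52.0, 47.5, 60.1, 55.3]   # hypothetical oil content (%)
oil_predicted = [49.8, 50.0, 58.7, 57.0]
print(rmse(oil_actual, oil_predicted), mae(oil_actual, oil_predicted))
# RMSE ≈ 2.0, MAE = 1.95 for these toy values
```

RMSE is always at least as large as MAE; a wide gap between the two (as in the paper's 7.62 vs 5.23) indicates that a few samples carry disproportionately large errors.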
Abstract: The current research paper is an implementation of eigenfaces and the Karhunen-Loeve algorithm for face recognition. The designed program works by assigning a unique identification number to each face under trial. These faces are kept in a database, from which any particular face can be matched and retrieved among the available test faces. The Karhunen-Loeve algorithm is implemented to find the face with matching features for a given input image, used as a test data image with its unique identification number. The procedure uses eigenfaces for the recognition of faces.
Abstract: This paper describes an authorization system architecture for pervasive grid environments. It discusses the characteristics of classical authorization systems as well as the requirements of an authorization system in a pervasive grid environment. Based on our analysis of current systems, and taking into account the main requirements of such a pervasive environment, we propose a new authorization system architecture as an extension of existing grid authorization mechanisms. This architecture supports not only user attributes but also context attributes, which act as the key concept for context awareness. The architecture allows users to be authorized dynamically when there are changes in the pervasive grid environment. For this, we opt for a hybrid authorization method that integrates push and pull mechanisms to combine the existing grid authorization attributes with dynamic context assertions. We investigate the proposed architecture using a real testing environment that includes heterogeneous pervasive grid infrastructures mapped over multiple virtual organizations. Various scenarios are described in the last section of the article to strengthen the proposed mechanism with different facilities for the authorization procedure.
Abstract: We suggest a novel method to incorporate long-term redundancy (LTR) in time-domain signal compression methods. The proposed method is based on block-sorting and curve simplification, and is illustrated on the ECG signal as a post-processor for the FAN method. Tests of the resulting FAN+ method on the MIT-BIH database show a substantial improvement in compression ratio-distortion behavior, with a higher-quality reconstructed signal.
Abstract: Software security testing is an important means to ensure software security and trustworthiness. This paper first discusses the definition and classification of software security testing and broadly surveys its methods and tools. It then analyzes the advantages, disadvantages, and scope of application of the various methods, and presents a taxonomy of security testing tools. Finally, the paper points out the future focus and development directions of software security testing technology.
Abstract: Nowadays the population increasingly makes use of Information Technology (IT). In recent years the Portuguese government has accordingly increased its focus on using IT to improve people's lives and has begun to develop a set of measures to enable the modernization of the Public Administration, thereby reducing the gap between the Public Administration and citizens. Thus the Portuguese government launched the Simplex Program. However, these SIMPLEX eGov measures, which have been implemented over the years, present a serious challenge: how to forecast their impact on the existing Information Systems Architecture (ISA). This research therefore focuses on automating the evaluation of the actual impact of implementing eGov simplification and modernization measures on the Information Systems Architecture. To perform the evaluation we propose a framework supported by several key concepts: quality factors, ISA modeling, a multicriteria approach, polarity profiles, and quality metrics.
Abstract: To achieve better road utilization and traffic efficiency, there is an urgent need for a travel information delivery mechanism that helps drivers make better decisions in emerging intelligent transportation system applications. In this paper, we propose a relayed multicast scheme over heterogeneous networks for this purpose. In the proposed system, travel information consisting of summarized traffic conditions, important events, real-time traffic videos, and local information service contents is formed into layers and multicast through an integration of WiMAX infrastructure and Vehicular Ad hoc Networks (VANET). With the support of adaptive modulation and coding in WiMAX, the radio resources can be optimally allocated during multicast so as to dynamically adjust the number of data layers received by the users. In addition to the multicast supported by WiMAX, a knowledge propagation and information relay scheme based on VANET is designed. The experimental results validate the feasibility and effectiveness of the proposed scheme.
Abstract: This paper describes the concept of a stereotype student model in an adaptive knowledge-acquisition e-learning system. The defined knowledge stereotypes are based on the student's proficiency level and on Bloom's knowledge taxonomy. The teacher module is responsible for the whole adaptivity process: the automatic generation of courseware elements, their dynamic selection and sorting, and their adaptive presentation using templates for statements and questions. The adaptation of the courseware is realized according to the student's knowledge stereotype.
Abstract: Biclustering is a very useful data mining technique for identifying patterns in which different genes are correlated under a subset of conditions in gene expression analysis. Association rule mining is an efficient approach to biclustering, as in the BIMODULE algorithm, but it is sensitive to the values given to its input parameters and to the discretization procedure used in the preprocessing step. Moreover, when noise is present, classical association rule miners discover multiple small fragments of the true bicluster but miss the true bicluster itself. This paper formally presents a generalized noise-tolerant bicluster model, termed μBicluster. An iterative algorithm termed BIDENS, based on the proposed model, is introduced; it can discover a set of k possibly overlapping biclusters simultaneously. Our model uses a more flexible method of partitioning the dimensions in order to preserve meaningful and significant biclusters. The proposed algorithm allows the discovery of biclusters that are hard to discover with BIMODULE. An experimental study on yeast and human gene expression data and several artificial data sets shows that our algorithm offers substantial improvements over several previously proposed biclustering algorithms.
Abstract: This paper presents a comparison of two metaheuristic algorithms, the Genetic Algorithm (GA) and Ant Colony Optimization (ACO), for producing Freeman chain code (FCC). The main problem in representing characters using FCC is that the length of the FCC depends on the starting point. Isolated characters, especially upper-case characters, usually have branches that make the traversal process difficult. FCC construction using one continuous route has not been widely explored, which motivated our use of population-based metaheuristics. The experimental results show that the route length obtained with GA is better than with ACO; however, ACO has a better computation time than GA.
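The Freeman chain code itself can be sketched as follows; the 8-direction convention and the toy pixel path are assumptions for illustration only:

```python
# 8-directional offsets in (row, col) form: 0=E, 1=NE, 2=N, ...,
# numbered counter-clockwise as in the usual Freeman convention.
DIRS = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]

def chain_code(path):
    """Freeman chain code of a pixel path given as (row, col) coordinates
    of successive 8-connected pixels. The resulting code depends on where
    the traversal starts -- the core difficulty the GA/ACO search addresses."""
    code = []
    for (r0, c0), (r1, c1) in zip(path, path[1:]):
        code.append(DIRS.index((r1 - r0, c1 - c0)))
    return code

# An L-shaped stroke: two steps down, then two steps right.
path = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
print(chain_code(path))  # → [6, 6, 0, 0]
```

Starting the same stroke from the other end would yield a different code, which is why the choice of starting point and of one continuous route through branching characters becomes a search problem.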
Abstract: In this paper, we propose a new robust and secure watermarking system based on the combination of two different transforms, the Discrete Wavelet Transform (DWT) and the Contourlet Transform (CT). The combined transforms compensate for the drawbacks of using each transform separately. The proposed algorithm has been designed, implemented, and tested successfully. The experimental results show that selecting the best sub-band for embedding from both transforms improves the imperceptibility and robustness of the combined algorithm. The combined DWT-CT algorithm achieved a PSNR value of 88.11 for imperceptibility and improved robustness, showing better resistance against Gaussian noise attacks. In addition, the implemented system showed a successful extraction method that recovers the watermark efficiently.
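As a rough illustration of sub-band embedding, here is a one-level 1-D Haar wavelet with additive embedding in the approximation band. This is a far simpler scheme than the combined DWT-CT approach above; the host signal, watermark bits, and strength alpha are all invented:

```python
def haar_dwt(signal):
    """One-level 1-D Haar transform: (approximation, detail) coefficients."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Invert the one-level Haar transform."""
    signal = []
    for a, d in zip(approx, detail):
        signal += [a + d, a - d]
    return signal

def embed(signal, bits, alpha=2.0):
    """Additively embed watermark bits (+/-1) into the approximation
    sub-band, then invert the transform to get the marked signal."""
    approx, detail = haar_dwt(signal)
    marked = [a + alpha * b for a, b in zip(approx, bits)]
    return haar_idwt(marked, detail)

host = [10.0, 12.0, 9.0, 9.0, 14.0, 10.0, 8.0, 6.0]
marked = embed(host, [1, -1, 1, -1])
print(marked)  # → [12.0, 14.0, 7.0, 7.0, 16.0, 12.0, 6.0, 4.0]
```

Choosing which sub-band receives the bits trades imperceptibility against robustness, which is the selection problem the combined DWT-CT scheme addresses.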
Abstract: E-services have significantly changed the way of doing business in recent years. We can, however, observe poor use of these services: there is a large gap between supply and actual e-services usage. This is why we started a project to provide an environment that will encourage the use of e-services. We believe that merely providing an e-service does not automatically mean consumers will use it. This paper presents the origins of our project and its current position. We discuss the decision to use semantic web technologies and their potential to improve e-services usage. We also present the current knowledge base and its real-world classification, and discuss further work to be done in the project. The current state of the project is promising.
Abstract: Nowadays viruses use polymorphic techniques to mutate their code on each replication, thus evading detection by antivirus software. However, detection by emulation can defeat simple polymorphism; thus metamorphic techniques are used, which thoroughly change the viral code even after decryption. We briefly detail this evolution of virus protection techniques against detection and then study the METAPHOR virus, today's most advanced metamorphic virus.