Abstract: Software reusability is an essential characteristic of Component-Based Software (CBS). Component reusability is an important measure of how effectively components can be reused in CBS. The reusability attributes proposed by various researchers are studied, and four of them are identified as potential factors affecting reusability. This paper proposes a metric for estimating the reusability of black-box software components, along with metrics for Interface Complexity, Understandability, Customizability and Reliability. An experiment estimating reusability is performed through a case study on a sample web application using a real-world component.
Abstract: Nowadays, the Web has become one of the most pervasive platforms for information exchange and retrieval, gathering from websites the suitable, well-fitting information that one requires. Data mining is the practice of extracting useful data available on the Internet. Web mining is one of the branches of data mining, relating to various research communities such as information retrieval, database management systems and artificial intelligence. In this paper we discuss the concepts of Web mining. We focus mainly on one of the categories of Web mining, namely Web Content Mining, and its various tasks. Mining tools are essential for scanning the many images, texts and HTML documents, and their results are used by the various search engines. We conclude by presenting a comparative table of these tools based on some pertinent criteria.
Abstract: In the cloud computing hierarchy, IaaS is the lowest layer; all other layers are built over it. It is therefore the most important layer of the cloud and requires the most attention. Along with its advantages, IaaS faces some serious security issues. Security mainly focuses on integrity, confidentiality and availability. Cloud computing makes it possible to share resources inside as well as outside the cloud. On the other hand, the cloud is still not able to guarantee 100% data security. The cloud provider must ensure that end users/clients receive an adequate quality of service. In this report we describe possible aspects of cloud-related security.
Abstract: Existing data mining methods cannot be applied directly to spatial data, because such data require the consideration of spatial specificities such as spatial relationships.
This paper focuses on classification with decision trees, one of the main data mining techniques. We propose an extension of the C4.5 algorithm to spatial data, based on two different approaches: join materialization and on-the-fly querying of the different tables. Similar works have been based on these two main approaches: the first, join materialization, favors processing time at the expense of memory space, whereas the second, on-the-fly querying of the different tables, saves memory space at the expense of processing time.
The modified C4.5 algorithm requires three input tables: a target table, a neighbor table, and a spatial join index that contains the possible spatial relationships between the objects in the target table and those in the neighbor table. The proposed algorithms are applied to a spatial dataset in the accidentology domain.
A comparative study of our approach against other works on classification by spatial decision trees is also detailed.
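The split-selection step of such a spatial C4.5 extension can be sketched as follows. The table contents, relation names and class labels below are hypothetical illustrations of how a spatial join index linking a target table to a neighbor table can feed C4.5's information-gain computation:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# Hypothetical spatial join index: (target_id, neighbor_id, relation)
join_index = [
    (1, 10, "intersects"), (2, 10, "intersects"),
    (3, 11, "disjoint"), (4, 11, "disjoint"), (5, 10, "intersects"),
]
# Hypothetical target table: target_id -> class label (accident severity)
target = {1: "severe", 2: "severe", 3: "light", 4: "light", 5: "severe"}

def info_gain(join_index, target):
    """Information gain of splitting on the spatial relation, C4.5-style."""
    labels = [target[t] for t, _, _ in join_index]
    base = entropy(labels)
    groups = {}
    for t, _, rel in join_index:
        groups.setdefault(rel, []).append(target[t])
    remainder = sum(len(g) / len(labels) * entropy(g)
                    for g in groups.values())
    return base - remainder

print(round(info_gain(join_index, target), 3))
```

Because the join index already stores the spatial relationships, the gain computation itself reduces to ordinary attribute-based C4.5 bookkeeping, which is what makes join materialization fast at the cost of storing the index.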
Abstract: Many studies have revealed the complexity of the ontology building process. There is therefore a need for a new approach, one that addresses the socio-technical aspects of collaboration in order to reach a consensus. The meta-design approach is considered applicable as a method in the methodological model of socio-technical ontology engineering. Principles of the meta-design framework are applied in the construction phases of the ontology. A web portal is developed to support the requirements of the meta-design principles. To validate the methodological model, semantic web applications were developed, integrated into the portal, and used to demonstrate the usefulness of the ontology. The knowledge-based system is populated with data on Indonesian medicinal plants. By showing the usefulness of the developed ontology in a semantic web application, we motivate all stakeholders to participate in the development of a knowledge-based system of medicinal plants in Indonesia.
Abstract: Wireless sensor networks are vulnerable to a wide range of attacks. To recover secrecy after compromise, we need techniques that can detect intrusions and build resilient networks that isolate the point(s) of intrusion while maintaining network connectivity for other legitimate users. We define new security metrics to evaluate a collaborative intrusion-resilience protocol that leverages sensor mobility to allow compromised sensors to recover a secure state after compromise. This is achieved with very low overhead and in a fully distributed fashion; extensive simulations support our findings.
Abstract: Artificial Immune Systems (AIS), inspired by the
human immune system, are algorithms and mechanisms which are
self-adaptive and self-learning classifiers capable of recognizing and
classifying through learning, long-term memory and association. Unlike other techniques inspired by human systems, such as genetic algorithms and neural networks, AIS encompasses a range of algorithms modeling different immune mechanisms of the body. In this paper, a mechanism
of a human immune system based on apoptosis is adopted to build an
Intrusion Detection System (IDS) to protect computer networks.
Features are selected from network traffic using Fisher Score. Based
on the selected features, the record/connection is classified as either
an attack or normal traffic by the proposed methodology. Simulation results demonstrate that the proposed apoptosis-based AIS performs better than existing AIS approaches for intrusion detection.
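As a rough illustration of the feature-selection step, the Fisher score of a feature is its between-class scatter divided by its within-class scatter; the feature values and labels below are toy data, not the paper's traffic records:

```python
def fisher_score(feature, labels):
    """Fisher score of one feature: between-class variance over
    within-class variance. Higher scores indicate features that
    separate attack traffic from normal traffic more cleanly."""
    classes = set(labels)
    n = len(feature)
    mean = sum(feature) / n
    num = den = 0.0
    for c in classes:
        vals = [x for x, y in zip(feature, labels) if y == c]
        m = sum(vals) / len(vals)
        var = sum((x - m) ** 2 for x in vals) / len(vals)
        num += len(vals) * (m - mean) ** 2
        den += len(vals) * var
    return num / den if den else float("inf")

# Hypothetical toy data: one feature over six labelled connections
feature = [0.1, 0.2, 0.15, 0.9, 0.8, 0.85]
labels = ["normal"] * 3 + ["attack"] * 3
print(fisher_score(feature, labels))
```

Features would be ranked by this score and only the top-scoring ones retained before classification.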
Abstract: In this work, a Multi-Level Artificial Bee Colony
(called MLABC) for optimizing numerical test functions is presented.
In MLABC, two species are used. The first species employs n colonies, each of which optimizes the complete solution vector.
The cooperation between these colonies is carried out by exchanging
information through a leader colony, which contains a set of elite
bees. The second species uses a cooperative approach in which the
complete solution vector is divided into k sub-vectors, and each of
these sub-vectors is optimized by a colony. The cooperation between
these colonies is carried out by compiling sub-vectors into the
complete solution vector. Finally, the cooperation between two
species is obtained by exchanging information. The proposed
algorithm is tested on a set of well-known test functions. The results show that the MLABC algorithm is efficient and robust in solving numerical test functions.
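The cooperative decomposition used by the second species can be sketched in a simplified form. Plain greedy random perturbation stands in for the actual employed/onlooker/scout bee phases, and all names, parameters and the benchmark function are illustrative, not the paper's experimental setup:

```python
import random

def sphere(x):
    """Sphere benchmark function: global minimum 0 at the origin."""
    return sum(v * v for v in x)

def cooperative_optimize(f, dim=6, k=3, iters=200, seed=1):
    """Simplified cooperative scheme in the spirit of MLABC's second
    species: the solution vector is split into k sub-vectors, each
    refined by its own 'colony' (here plain random perturbation), and
    cooperation compiles the sub-vectors back into one solution."""
    rng = random.Random(seed)
    best = [rng.uniform(-5, 5) for _ in range(dim)]
    size = dim // k
    for _ in range(iters):
        for c in range(k):            # each colony owns one sub-vector
            lo, hi = c * size, (c + 1) * size
            trial = best[:]
            for i in range(lo, hi):   # perturb only this colony's part
                trial[i] = best[i] + rng.gauss(0, 0.5)
            if f(trial) < f(best):    # keep the improved compiled vector
                best = trial
    return best

best = cooperative_optimize(sphere)
print(sphere(best))
```

The divide-and-cooperate structure lets each colony search a low-dimensional subspace, which is the motivation for the sub-vector species in MLABC.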
Abstract: Maturity models, used descriptively to explain
changes in reality or normatively to guide managers to make
interventions to make organizations more effective and efficient, are
based on the principles of statistical quality control and PDCA
continuous improvement (Plan, Do, Check, Act). Some frameworks
developed over the concept of maturity models include COBIT,
CMM, and ITIL.
This paper presents some limitations of traditional maturity
models, most of them related to the mechanistic and reductionist
principles over which those models are built. As systems theory helps
the understanding of the dynamics of organizations and
organizational change, the development of a systemic maturity model
can help to overcome some of those limitations.
This document proposes a systemic maturity model, based on a systemic conceptualization of organizations, focused on the study of the functioning of the parts, the relationships among them, and their behavior as a whole. Maturity, from the systems theory perspective, is conceptually defined as an emergent property of the organization, which arises as a result of the degree of alignment and integration of its processes. This concept is operationalized through a systemic function that measures the maturity of organizations, and finally validated by measuring maturity in several organizations.
For its operationalization and validation, the model was applied to
measure the maturity of organizational Governance, Risk and
Compliance (GRC) processes.
Abstract: Key frame extraction methods select the most
representative frames of a video, which can be used in different areas
of video processing such as video retrieval, video summary, and video
indexing. In this paper we present a novel approach for extracting key
frames from video sequences. A frame is characterized uniquely by its contours, which are represented by the dominant blocks. These dominant blocks are located on the contours and their nearby textures. When the video frames exhibit a noticeable change, their dominant blocks change, and we can then extract a key frame. The dominant blocks of every frame are computed, feature vectors are extracted from the dominant-block image of each frame, and these vectors are arranged in a feature matrix. Singular Value Decomposition is used to calculate the ranks of sliding windows over those matrices. Finally, the computed ranks are traced and the key frames of the video are extracted.
Experimental results show that the proposed approach is robust
against a large range of digital effects used during shot transition.
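The rank-based detection step can be illustrated in miniature. Here Gaussian elimination stands in for SVD when computing window ranks, and the per-frame feature vectors are toy values, not real dominant-block descriptors:

```python
def matrix_rank(rows, eps=1e-9):
    """Matrix rank via Gaussian elimination (a stand-in for
    SVD-based rank on the feature matrix)."""
    m = [list(r) for r in rows]
    rank = 0
    cols = len(m[0]) if m else 0
    for col in range(cols):
        pivot = next((r for r in range(rank, len(m))
                      if abs(m[r][col]) > eps), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        pv = m[rank][col]
        for r in range(len(m)):
            if r != rank and abs(m[r][col]) > eps:
                f = m[r][col] / pv
                m[r] = [a - f * b for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

# Hypothetical feature vectors: three near-identical frames, then a cut
frames = [[1, 0, 0], [1, 0, 0], [1, 0, 0], [0, 1, 0], [0, 1, 0]]

def key_frame_indices(frames, window=2):
    """Flag a frame as key when adding it raises the rank of the
    sliding window of feature vectors (i.e. new visual content)."""
    keys = [0]                       # first frame is always a key frame
    for i in range(1, len(frames)):
        w = frames[max(0, i - window):i]
        if matrix_rank(w + [frames[i]]) > matrix_rank(w):
            keys.append(i)
    return keys

print(key_frame_indices(frames))
```

A rank increase inside the window means the new frame's features are not a linear combination of its recent neighbors, which is the intuition behind tracing the window ranks.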
Abstract: Skin detection is an important task for computer
vision systems. A good method of skin detection means a good and
successful result of the system.
Colour is a good descriptor for image segmentation and classification; it allows skin colour to be detected in images. Lighting changes and objects whose colour is similar to skin colour make skin detection difficult.
In this paper, we propose a method using the YCbCr colour space for skin detection and the elimination of lighting effects; we then use texture information to eliminate the false regions detected by the YCbCr skin model.
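A minimal sketch of YCbCr-based skin detection follows. The Cb/Cr thresholds are the ones commonly cited in the skin-detection literature, not necessarily those used in this paper:

```python
def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 full-range RGB -> YCbCr conversion."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b, cb_range=(77, 127), cr_range=(133, 173)):
    """Skin test in the Cb/Cr plane only, so it is largely insensitive
    to lighting (the luminance Y is ignored). The threshold ranges are
    the widely used literature values, given here as an assumption."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return cb_range[0] <= cb <= cb_range[1] and \
           cr_range[0] <= cr <= cr_range[1]

print(is_skin(224, 172, 138))   # a typical skin tone
print(is_skin(30, 120, 200))    # sky blue
```

Discarding Y is exactly what gives the YCbCr model its robustness to lighting changes; the remaining false positives (skin-like colours) are what the paper's texture step is meant to remove.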
Abstract: This paper describes a logical method to enhance
security on the grid computing to restrict the misuse of the grid
resources. This method is economical and efficient, avoiding the use of special devices. The security issues, techniques and solutions needed to provide a secure grid computing environment are described. A well-defined process for security management of resource accesses, together with a key-holding algorithm, is also proposed. In this method, identity management, access control, authorization and authentication are handled effectively.
Abstract: Augmented Reality is a technology that involves the
overlay of virtual content, which is context or environment sensitive,
on images of the physical world in real time. This paper presents the
development of a catalog system that facilitates and allows the
creation, publishing, management and exploitation of augmented
multimedia contents and Augmented Reality applications, creating a personal space for anyone who wants to attach information to real objects and then edit and share it online with others. These spaces can be built for different domains without initially requiring expert users. Its operation focuses on the context of Web 2.0, or the Social Web, with its various applications, developing contents that enrich the real context in which human beings act and allowing the catalog's contents to evolve in an emergent way.
Abstract: The fuzzy composition of objects depicted in images
acquired through MR imaging or the use of bio-scanners has often
been a point of controversy for field experts attempting to effectively
delineate between the visualized objects. Modern approaches in
medical image segmentation tend to consider fuzziness as a
characteristic and inherent feature of the depicted object, instead of
an undesirable trait. In this paper, a novel technique for efficient
image retrieval in the context of images in which segmented objects
are either crisp or fuzzily bounded is presented. Moreover, the
proposed method is applied in the case of multiple, even conflicting,
segmentations from field experts. Experimental results demonstrate
the efficiency of the suggested method in retrieving similar objects
from the aforementioned categories while taking into account the
fuzzy nature of the depicted data.
Abstract: Underwater acoustic network is one of the rapidly
growing areas of research and finds different applications for
monitoring and collecting various data for environmental studies. Communication among dynamic nodes and the high error probability of the acoustic medium force Underwater Wireless Sensor Networks (UWSNs) to consume more energy than traditional sensor networks. Developing an energy-efficient routing protocol is a fundamental and key challenge, because all the sensor nodes are battery-powered and their batteries cannot easily be replaced in UWSNs. This paper surveys recent routing techniques that mainly focus on energy efficiency.
Abstract: This paper aims to analyze the role of natural language processing (NLP), discussing it in the context of automated data retrieval, automated question answering, and text structuring. NLP techniques are gaining wider acceptance in real-life applications and industrial settings. Various complexities are involved in processing natural-language text so that it satisfies the needs of decision makers. This paper begins with the
description of the qualities of NLP practices. The paper then focuses
on the challenges in natural language processing. The paper also
discusses major techniques of NLP. The last section describes
opportunities and challenges for future research.
Abstract: An algorithm is a well-defined procedure that takes some input values, processes them and produces the desired output. Sorting forms the basis of many other algorithms, such as searching, pattern matching and digital filters, and finds further applications in database systems, statistics and data processing, and data communications. This paper introduces the “Enhanced Bidirectional Selection” sort algorithm, which is bidirectional and stable. It is said to be bidirectional because each pass selects two values, the smallest from the front and the largest from the rear, and assigns them to their appropriate locations, thus halving the number of passes compared to selection sort.
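The double-ended selection idea can be sketched as follows; this is a simplified illustration of the bidirectional principle, not the authors' exact algorithm:

```python
def bidirectional_selection_sort(a):
    """Each pass selects both the smallest and the largest element of
    the unsorted range and places them at its two ends, so roughly n/2
    passes suffice instead of selection sort's n - 1."""
    a = list(a)
    lo, hi = 0, len(a) - 1
    while lo < hi:
        imin = imax = lo
        for i in range(lo, hi + 1):
            if a[i] < a[imin]:
                imin = i
            if a[i] > a[imax]:
                imax = i
        a[lo], a[imin] = a[imin], a[lo]
        if imax == lo:        # the max was just displaced to imin
            imax = imin
        a[hi], a[imax] = a[imax], a[hi]
        lo += 1
        hi -= 1
    return a

print(bidirectional_selection_sort([5, 1, 4, 2, 8, 0]))
```

Note the `imax == lo` correction: when the maximum starts at the front, the first swap moves it, so the index must be updated before the second swap.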
Abstract: Search engines play an important role on the Internet, retrieving relevant documents from among the huge number of web pages. However, they return a large number of documents, not all of which are relevant to the search topic. To retrieve the most meaningful documents for a search topic, ranking algorithms are used in information retrieval. Ranking the retrieved documents is one of the practical problems of data mining and information retrieval. This paper reviews and compares various page ranking and page segmentation algorithms used for information retrieval. Diverse Page Rank
based algorithms like Page Rank (PR), Weighted Page Rank (WPR),
Weight Page Content Rank (WPCR), Hyperlink Induced Topic
Selection (HITS), Distance Rank, Eigen Rumor, Distance Rank Time
Rank, Tag Rank, Relational Based Page Rank and Query Dependent
Ranking algorithms are discussed and compared.
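As background for the comparison, the basic Page Rank (PR) computation by power iteration can be sketched as follows; the three-page link graph is a toy example:

```python
def pagerank(links, d=0.85, iters=50):
    """Basic PageRank by power iteration: each page shares its rank
    equally among the pages it links to, damped by factor d."""
    pages = sorted(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            share = rank[p] / len(outs) if outs else 0
            for q in outs:
                new[q] += d * share
        rank = new
    return rank

# Hypothetical 3-page web: A and C link to B, B links to C
links = {"A": ["B"], "B": ["C"], "C": ["B"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))   # B receives the most in-links
```

Variants such as WPR weight each link by the popularity of its endpoints instead of splitting rank equally, and HITS instead computes separate hub and authority scores.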
Abstract: Creating a database schema is essentially a manual
process. From a requirement specification the information contained
within has to be analyzed and reduced into a set of tables, attributes
and relationships. This is a time consuming process that has to go
through several stages before an acceptable database schema is
achieved. The purpose of this paper is to implement a Natural
Language Processing (NLP) based tool to produce a relational
database from a requirement specification. The Stanford CoreNLP
version 3.3.1 and the Java programming language were used to implement the
proposed model. The outcome of this study indicates that a first draft
of a relational database schema can be extracted from a requirement
specification by using NLP tools and techniques with minimum user
intervention. Therefore this method is a step forward in finding a
solution that requires little or no user intervention.
Abstract: Structured Query Language (SQL) is the de facto standard language for accessing and manipulating data in a relational database. Although SQL is simple and powerful, most novice users have trouble with its syntax. We therefore present an SQL generator tool capable of translating user actions and displaying the resulting SQL commands and data sets simultaneously. The tool was developed based on the Model-View-Controller (MVC) pattern, a widely used software design pattern that enforces the separation between the input, processing, and output of an application. Developers take full advantage of it to reduce the complexity of architectural design and to increase flexibility and code reuse. In addition, we use white-box testing for code verification in the Model module.
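A minimal sketch of how such an MVC separation might look for an SQL generator; all class, method and table names are illustrative assumptions, not the tool's actual design:

```python
class Model:
    """Holds table metadata; a real tool would also execute the query
    and return the data set (white-box tests would target this module)."""
    def __init__(self, tables):
        self.tables = tables

class Controller:
    """Translates user actions (processing) into SQL commands."""
    def __init__(self, model):
        self.model = model
    def select(self, table, columns=("*",), where=None):
        assert table in self.model.tables, "unknown table"
        sql = f"SELECT {', '.join(columns)} FROM {table}"
        if where:
            sql += f" WHERE {where}"
        return sql + ";"

class View:
    """Displays the generated SQL (output) alongside the data set."""
    @staticmethod
    def show(sql):
        print(sql)

model = Model(tables={"students"})
sql = Controller(model).select("students", ("name", "gpa"), "gpa > 3.5")
View.show(sql)   # SELECT name, gpa FROM students WHERE gpa > 3.5;
```

Keeping the translation logic in the controller is what allows the SQL text and the result set to be rendered simultaneously by independent views.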