Abstract: These days, customer satisfaction plays a vital role in
any business. When a customer searches for a product, a significant
amount of irrelevant information is often returned, leading to customer
dissatisfaction. To provide exactly the relevant information on the
searched product, we propose a model of KaaS (Knowledge as
a Service), which pre-processes the information using a
multi-agent decision-making paradigm.
Information obtained from various sources is used to derive
knowledge, and it is linked to the cloud to capture new ideas. The
main focus of this work is to acquire relevant information
(knowledge) related to a product, convert this knowledge into a
service for customer satisfaction, and deploy it on the cloud.
To achieve these objectives we have opted to use multi-agents.
The agents communicate and interact with each other to
manipulate information, provide knowledge, and make decisions. The
paper discusses KaaS as an intelligent approach for knowledge
acquisition.
Abstract: The aim of software maintenance is to keep
the software system in accordance with advances in software
and hardware technology. An early task in software
maintenance is to extract information at a higher level of abstraction. In
this paper, we present the process of designing an information
extraction tool for software maintenance. The tool can extract
basic information from legacy programs, such as variables, base
classes, derived classes, class objects, and functions. The tool
has two main parts: the lexical analyzer module, which reads the
input file character by character, and the searching module, through
which users can retrieve the basic information from existing programs. We
implemented this tool for a patterned sub-C++ language as the input.
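The two-part structure described above (a lexical analyzer that reads character by character, feeding a searching module over the token stream) might be sketched as follows; the token rules and the `class Derived : public Base` pattern are simplifying assumptions for illustration, not the paper's actual sub-C++ grammar.

```python
def tokens(text):
    """Minimal lexical analyzer: reads the input character by character
    and emits identifier tokens and single-character symbols."""
    tok = ""
    for ch in text:
        if ch.isalnum() or ch == "_":
            tok += ch
        else:
            if tok:
                yield tok
                tok = ""
            if not ch.isspace():
                yield ch
    if tok:
        yield tok

def class_info(text):
    """Searching module: collect class names and their base classes from
    'class Derived : public Base' style declarations in the token stream."""
    toks = list(tokens(text))
    info = {}
    for i, t in enumerate(toks):
        if t == "class" and i + 1 < len(toks):
            name = toks[i + 1]
            bases = []
            j = i + 2
            if j < len(toks) and toks[j] == ":":
                j += 1
                while j < len(toks) and toks[j] != "{":
                    if toks[j] not in ("public", "private", "protected", ","):
                        bases.append(toks[j])
                    j += 1
            info[name] = bases
    return info
```

For example, `class_info("class Shape { }; class Circle : public Shape { };")` yields `{"Shape": [], "Circle": ["Shape"]}`.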
Abstract: An artificial neural network is a mathematical model
inspired by biological neural networks. There are several kinds of
neural networks, and they are widely used in many areas, such as
prediction, detection, and classification. Meanwhile, in day-to-day life,
people always have to make many difficult decisions. For example,
the coach of a soccer club has to decide which offensive player
to select to play in a certain game. This work describes a
novel neural network using a combination of the General Regression
Neural Network and the Probabilistic Neural Network to help a
soccer coach make an informed decision.
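For readers unfamiliar with the two building blocks, a minimal sketch of both networks follows: a GRNN predicts a kernel-weighted average of training targets, while a PNN picks the class with the largest summed kernel density. The player features, targets, and smoothing parameter `sigma` are hypothetical, and the paper's actual combination scheme is not reproduced here.

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.5):
    """General Regression Neural Network: kernel-weighted average of targets."""
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2 * sigma ** 2))          # Gaussian kernel weights
    return np.dot(w, y_train) / np.sum(w)

def pnn_classify(X_train, labels, x, sigma=0.5):
    """Probabilistic Neural Network: class with largest summed kernel density."""
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    classes = np.unique(labels)
    scores = [w[labels == c].sum() for c in classes]
    return classes[int(np.argmax(scores))]
```

A coach-style use would feed past per-player feature vectors (e.g. speed, shots on target) with past match ratings into `grnn_predict`, and suitability labels into `pnn_classify`.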
Abstract: Wireless Sensor Networks (WSNs) have a wide variety
of applications and provide limitless future potential. Nodes in
WSNs are prone to failure due to energy depletion, hardware failure,
communication link errors, malicious attacks, and so on. Therefore,
fault tolerance is one of the critical issues in WSNs. We study how
fault tolerance is addressed in different applications of WSNs. Fault-tolerant
routing is a critical task for sensor networks operating in
dynamic environments. Many routing, power management, and data
dissemination protocols have been specifically designed for WSNs,
where energy awareness is an essential design issue. The focus,
however, has been on routing protocols, which may differ
depending on the application and network architecture.
Abstract: Cloud computing is a new technology in industry and
academia. The technology has grown and matured in the last half decade
and proven its significant role in the changing environment of IT
infrastructure, where cloud services and resources are offered over the
network. Cloud technology enables users to use services and
resources without being concerned about the technical implications of
the technology. Substantial research work has been performed
on the usage of cloud computing in educational institutes, and the
majority of it provides cloud services over high-end blade servers
or other high-end CPUs. This paper, however, proposes a new stack
called “CiCKAStack”, which provides cloud services over underutilized
computing resources, namely commodity computers.
“CiCKAStack” provides IaaS and PaaS using the underlying commodity
computers. This not only increases the utilization of existing
computing resources but also provides an organized file system, on-demand
computing resources, and a design and development
environment.
Abstract: Today’s VLSI networks demand high speed. In
this work, a compact mathematical model for current-mode
signalling in VLSI interconnects is presented. The RLC interconnect line
is modelled using the characteristic impedance of the transmission line and
the inductive effect. The on-chip inductance effect, which is dominant at lower
technology nodes, is emulated as an equivalent resistance. A first-order
transfer function is derived using finite difference equations, the Laplace
transform, and the boundary conditions at the source and
load terminations. It has been observed that the dominant pole
determines the system response and delay in the proposed model. The
proposed current-mode model shows superior performance
compared to voltage-mode signalling: analysis shows that current-mode
signalling in VLSI interconnects provides 2.8 times better
delay performance than voltage mode. Secondly, the damping factor
of a lumped RLC circuit is shown to be a useful figure of merit.
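As a minimal sketch of the quantities mentioned above, the damping factor of a lumped series RLC line and the 50% delay implied by a dominant first-order pole can be computed directly; the formulas below are the standard textbook ones, and the component values used for illustration are not taken from the paper.

```python
import math

def damping_factor(R, L, C):
    """Damping factor of a lumped series RLC circuit: zeta = (R/2) * sqrt(C/L)."""
    return (R / 2.0) * math.sqrt(C / L)

def dominant_pole_delay(tau):
    """50% delay of a first-order system H(s) = 1/(1 + s*tau): t50 = ln(2) * tau."""
    return math.log(2) * tau
```

With illustrative values R = 100 ohm, L = 1 nH, C = 1 pF, the damping factor is about 1.58 (overdamped), and a 1 ns dominant time constant gives a 50% delay of about 0.69 ns.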
Abstract: An innovative concept called “Flexy-Energy” is being developed at 2iE. This concept aims to produce electricity at lower cost by smartly mixing the different available energy sources in accordance with the load profile of the region. Given the high solar irradiation, and because diesel generators are massively used in sub-Saharan rural areas, PV/Diesel hybrid systems could be a good application of this concept and a good solution to electrify this region, provided they are reliable, cost effective, and economically attractive to investors. Presenting the developed approach is the aim of this paper. The PV/Diesel hybrid system designed produces electricity and/or heat by coupling diesel generators and PV panels without battery storage, while ensuring the substitution of gasoil by bio-fuels available in the area where the system will be installed. The optimal design of this system is based on its technical performance; the Life Cycle Cost (LCC) and Levelized Cost of Energy are developed and used as economic criteria. The Net Present Value (NPV), the Internal Rate of Return (IRR), and the Discounted Payback (DPB) are also evaluated according to dual electricity pricing (in sunny and non-sunny hours). The PV/Diesel hybrid system obtained is compared to the standalone diesel generators. The approach carried out in this paper has been applied to Siby village in Mali (latitude 12°23'N, 8°20'W) with 295 kWh daily demand. This approach provides the optimal physical characteristics (size and number of components) and real-time dynamic characteristics (number of diesel generators running, their load rate, fuel specific consumption, and PV penetration rate) of the system. The system obtained is slightly cost effective, but could be improved with optimized tariffing strategies.
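The economic criteria named above (NPV, IRR, and discounted payback) follow standard definitions; a minimal sketch with illustrative cash flows, not the paper's actual system data:

```python
def npv(rate, cashflows):
    """Net Present Value of a cash-flow series; cashflows[0] is the year-0 flow."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-8):
    """Internal Rate of Return by bisection (assumes NPV changes sign on [lo, hi])."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid        # root lies in [lo, mid]
        else:
            lo = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

def discounted_payback(rate, cashflows):
    """First year at which cumulative discounted cash flow turns non-negative
    (None if the investment never pays back over the series)."""
    cum = 0.0
    for t, cf in enumerate(cashflows):
        cum += cf / (1 + rate) ** t
        if cum >= 0:
            return t
    return None
```

For example, an initial outlay of 1000 followed by three annual inflows of 500 has an IRR of roughly 23% and, at a 10% discount rate, pays back in year 3.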
Abstract: Random number generation is mainly used to create
secret keys or random sequences, and can be carried out by various
techniques. In this paper we present a very simple and efficient
pseudo-random number generator (PRNG) based on chaotic maps
and S-Box tables. The technique adopts two main operations: one
generates chaotic values using two logistic maps, and the second
transforms them into binary words using random S-Box tables.
The simulation analysis indicates that our PRNG possesses
excellent statistical and cryptographic properties.
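A minimal sketch of the two operations described above: the exact way the paper combines the two logistic maps and builds its random S-Box tables is not specified here, so the combination rule and the chaotic S-Box construction below are illustrative assumptions.

```python
def logistic(x, r=3.99):
    """One iteration of the logistic map x -> r*x*(1-x), chaotic for r near 4."""
    return r * x * (1.0 - x)

def make_sbox(seed=0.7):
    """Build a hypothetical 256-entry S-Box: a permutation of 0..255 obtained
    by sorting indices with chaotic sort keys."""
    keys, x = [], seed
    for _ in range(256):
        x = logistic(x)
        keys.append(x)
    return sorted(range(256), key=lambda i: keys[i])

def prng_bytes(n, x1=0.123, x2=0.543):
    """Combine two logistic maps and pass each quantized byte through the S-Box."""
    sbox = make_sbox()
    out = []
    for _ in range(n):
        x1, x2 = logistic(x1), logistic(x2)
        b = int((x1 + x2) / 2 * 256) % 256   # quantize the combined chaotic value
        out.append(sbox[b])
    return out
```

Note this sketch illustrates structure only; a cryptographically usable PRNG would need the statistical testing the abstract refers to.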
Abstract: Data mining is growing rapidly in recognition
and popularity. The foremost aim of data mining
methods is to extract data from a huge data set into forms that
can be comprehended for further use. Data mining is a
technology with rich potential that can support industries and
businesses seeking to collect the necessary information from their
data to discover their customers’ behaviour. Several methods are
available for extracting data, such as classification, clustering,
association, discovery, and visualization, each with its own
algorithms for fitting an appropriate model to the data.
STATISTICA mostly deals with very large groups of data,
which impose rigorous computational constraints; these
challenges caused the emergence of powerful STATISTICA Data
Mining technologies. In this survey, an overview of the STATISTICA
software is given along with its significant features.
Abstract: Moodle is an open source learning management
system that enables the creation of a powerful and flexible learning
environment. Many organizations, especially learning institutions,
have customized the Moodle open source LMS for their own use. In
general, open source LMSs are of great interest due to the many
advantages they offer in terms of cost, usage, and freedom to
customize to fit a particular context. The Tanzania Secondary School e-Learning
(TanSSe-L) system is the learning management system for
Tanzania secondary schools. The TanSSe-L system was developed using
a number of methods, one of them being customization of the Moodle
open source LMS. This paper presents a few areas in which Moodle
OS LMS was customized to produce a functional TanSSe-L system
fitted to the requirements and specifications of the Tanzania secondary
schools’ context.
Abstract: This paper describes an analysis of international
trends in yacht simulators and also explains the yacht simulator itself.
The results are summarized as follows. Attached to the cockpit are
sensors that feed back information on rudder angle, boat heel angle,
and mainsheet tension to the computer. The sailor’s energy
expenditure is measured indirectly using expired gas analysis for the
measurement of VO2 and VCO2. At sea, course configurations and
wind conditions can be preset to suit any level of sailor, from
complete beginner to advanced.
Abstract: In this paper, we propose an intelligent system for
monitoring the health conditions of patients. Monitoring the
health condition of patients is a complex problem that involves
different medical units and requires continuous monitoring, especially
in rural areas, because of the inadequate number of available specialized
physicians. The proposed system will improve patient care and drive
costs down compared to the existing system in Jordan. The proposed
system will be the starting point for faster and improved
communication between different units in the health system in
Jordan. Connecting patients and their physicians beyond hospital
doors, regardless of their geographical area, is an important issue in
developing the health system in Jordan. The ability to make
medical decisions and the quality of medical care are expected to improve.
Abstract: The Great East Japan Earthquake occurred at 14:46 on Friday, March 11, 2011. It was the most powerful known earthquake to have hit Japan. The earthquake triggered extremely destructive tsunami waves of up to 40.5 meters in height. We focus on ship evacuation from the tsunami, which we analyze using multi-agent simulation in order to prepare for a coming earthquake. We developed a simulation model of ships that set sail from the port in order to evacuate from the tsunami, taking into account ships carrying dangerous goods.
Abstract: Verification and Validation of a simulated process
model is the most important phase of the simulator life cycle.
Evaluation of simulated process models based on Verification and
Validation techniques checks the closeness of each component model
(in a simulated network) to the real system/process with respect to
dynamic behaviour under steady state and transient conditions. The
process of Verification and Validation helps in qualifying the process
simulator for the intended purpose, whether that is providing
comprehensive training or design verification. In general, model
verification is carried out by comparing simulated component
characteristics with the original requirements to ensure that each step
in the model development process completely incorporates all the
design requirements. Validation testing is performed by comparing
the simulated process parameters to the actual plant process
parameters, either in standalone mode or in integrated mode.
A Full Scope Replica Operator Training Simulator for PFBR
(Prototype Fast Breeder Reactor), named KALBR-SIM (Kalpakkam
Breeder Reactor Simulator), has been developed at IGCAR,
Kalpakkam, India, wherein the main participants are
engineers/experts belonging to the Modeling Team, Process Design,
and Instrumentation & Control design teams. This paper discusses
the Verification and Validation process in general, the evaluation
procedure adopted for the PFBR operator training simulator, the
methodology followed for verifying the models, and the reference
documents and standards used. It details the importance of
internal validation by design experts, subsequent validation by an
external agency consisting of experts from various fields, model
improvement by tuning based on experts’ comments, final
qualification of the simulator for the intended purpose, and the
difficulties faced while coordinating the various activities.
Abstract: Fast-changing knowledge systems on the Internet can
be accessed more efficiently with the help of automatic document
summarization and updating techniques. The aim of multi-document
update summary generation is to construct a summary unfolding the
mainstream of data from a collection of documents, based on the
hypothesis that the user has already read a set of previous documents.
In order to extract rich semantic information from the documents,
deeper linguistic or semantic analysis of the source documents is
used instead of relying only on document word frequencies to select
important concepts. In order to produce a responsive summary,
meaning-oriented structural analysis is needed. To address this issue,
the proposed system presents a document summarization approach
based on sentence annotation with aspects, prepositions, and named
entities. A semantic element extraction strategy is used to select
important concepts from the documents, which are then used to
generate an enhanced semantic summary.
Abstract: Clustering involves the partitioning of n objects into k
clusters. Many clustering algorithms use hard-partitioning techniques,
where each object is assigned to exactly one cluster. In this paper we
propose an overlapping algorithm, MCOKE, which allows objects to
belong to one or more clusters. The algorithm differs from fuzzy
clustering techniques in that objects that overlap are assigned a
membership value of 1 (one), as opposed to a fuzzy membership
degree. It also differs from other overlapping algorithms that require
a similarity threshold to be defined a priori, which can be difficult
for novice users to determine.
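A sketch in the spirit of the description above, under the assumption that the overlap threshold is derived from the largest point-to-centroid distance found by k-means (one way to avoid a user-supplied a priori threshold); overlapping points receive a crisp membership of 1 in every cluster they join.

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain k-means with a simple deterministic init (first k points)."""
    X = np.asarray(X, dtype=float)
    centers = X[:k].copy()
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

def overlapping_clusters(X, k):
    """After k-means, the maximum distance of any point to its own centroid
    (maxdist) becomes the threshold; a point joins every cluster whose
    centroid lies within maxdist, with membership value 1."""
    X = np.asarray(X, dtype=float)
    centers, labels = kmeans(X, k)
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    maxdist = d[np.arange(len(X)), labels].max()
    return [set(np.nonzero(d[i] <= maxdist)[0]) for i in range(len(X))]
```

Each returned set lists the clusters an object belongs to; singleton sets recover the hard partition, while larger sets mark overlapping objects.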
Abstract: In medical imaging, segmentation of different areas of
the human body, such as bones, organs, and tissues, is an important issue.
Image segmentation allows isolating the object of interest for further
processing, which can lead, for example, to 3D model reconstruction of
whole organs. The difficulty of this procedure varies from trivial for
bones to quite difficult for organs like the liver. The liver is
considered one of the most difficult human organs to segment,
mainly because of its complexity, shape variability, and the proximity of
other organs and tissues. Due to these facts, substantial user
effort usually has to be applied to obtain satisfactory segmentation
results, and the process then deteriorates from
automatic or semi-automatic to a fairly manual one. In this paper, an
overview of selected available software applications that can handle
semi-automatic image segmentation with subsequent 3D volume
reconstruction of the human liver is presented. The applications are
evaluated based on the segmentation results of several consecutive
DICOM images covering the abdominal area of the human body.
Abstract: Digital image correlation (DIC) is a contactless full-field
displacement and strain reconstruction technique commonly
used in the field of experimental mechanics. Compared with
physical measuring devices, such as strain gauges, which provide
only very restricted coverage and are expensive to deploy widely,
the DIC technique provides results with full-field coverage and
relatively high accuracy using an inexpensive and simple experimental
setup. It is very important to study the effect of natural patterns on the
DIC technique, because the preparation of artificial patterns is a
time-consuming and hectic process. The objective of this research is
to study the effect of using images with natural patterns on the
performance of DIC. A systematic simulation method is used to
build the simulated deformed images used in DIC. A parameter (subset
size) used in DIC can affect the processing and accuracy
of DIC and can even cause DIC to fail. Regarding the picture
parameters (correlation coefficient), high similarity between two
subsets can lead the DIC process to fail and make the result
inaccurate. Pictures of good and bad quality for DIC methods
are presented and, more importantly, a systematic way is given to
evaluate the quality of pictures with natural patterns before the
measurement devices are installed.
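DIC locates a reference subset inside the deformed image by maximizing a correlation coefficient between subsets; a sketch using the common zero-mean normalized cross-correlation (ZNCC) over integer displacements, with synthetic images rather than the paper's simulation pipeline:

```python
import numpy as np

def zncc(f, g):
    """Zero-mean normalized cross-correlation of two equally sized subsets;
    1.0 means a perfect match, values near 0 mean the subsets are unrelated."""
    f = f - f.mean()
    g = g - g.mean()
    denom = np.sqrt((f ** 2).sum() * (g ** 2).sum())
    return float((f * g).sum() / denom) if denom > 0 else 0.0

def match_subset(ref, deformed, y, x, size, search=5):
    """Locate the subset of `ref` centered at (y, x) inside `deformed` by
    exhaustively maximizing ZNCC over integer displacements up to `search`."""
    h = size // 2
    template = ref[y - h:y + h + 1, x - h:x + h + 1]
    best, best_uv = -2.0, (0, 0)
    for v in range(-search, search + 1):
        for u in range(-search, search + 1):
            cand = deformed[y + v - h:y + v + h + 1, x + u - h:x + u + h + 1]
            if cand.shape != template.shape:
                continue  # candidate fell off the image border
            c = zncc(template, cand)
            if c > best:
                best, best_uv = c, (v, u)
    return best_uv, best
```

The subset size (`size`) is the parameter the abstract flags: subsets that are too small or too self-similar yield ambiguous correlation peaks and can make the match fail.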
Abstract: This paper presents an evolutionary algorithm for
solving multi-objective optimization problems based on an artificial
neural network (ANN). The multi-objective evolutionary algorithm
used in this study is a genetic algorithm, while the ANN used is a
radial basis function network (RBFN). The proposed algorithm is
named the memetic elitist Pareto non-dominated sorting genetic
algorithm-based RBFN (MEPGAN). The proposed algorithm is applied
to medical disease problems. The experimental results indicate that the
proposed algorithm is viable and provides an effective means to
design multi-objective RBFNs with good generalization capability
and a compact network structure. This study shows that MEPGAN
generates RBFNs with an appropriate balance between
accuracy and simplicity compared to the other algorithms found in
the literature.
Abstract: Motion tracking and stereo vision are complicated,
albeit well-understood, problems in computer vision. Existing
software packages that combine the two approaches to perform stereo
motion tracking typically employ complicated and computationally
expensive procedures. The purpose of this study is to create a simple
and effective solution capable of combining the two approaches. The
study aims to explore a strategy for combining two techniques:
two-dimensional motion tracking using a Kalman filter, and depth
detection of the object using stereo vision. In conventional approaches,
objects in the scene of interest are observed using a single camera.
For stereo motion tracking, however, the scene of interest is
observed using video feeds from two calibrated cameras. Using two
simultaneous measurements from the two cameras, the depth of the
object from the plane containing the cameras is calculated.
The approach attempts to capture the entire three-dimensional spatial
information of each object in the scene and represent it through a
software estimator object. In discrete intervals, the estimator tracks
object motion in the plane parallel to the plane containing the cameras
and updates the perpendicular distance of the object from that plane
as depth. The ability to efficiently track
the motion of objects in three-dimensional space using a simplified
approach could prove to be an indispensable tool in a variety of
surveillance scenarios. The approach may find application in settings
ranging from high-security surveillance scenes, such as the premises of
bank vaults, prisons, or other detention facilities, to low-cost
applications in supermarkets and car parking lots.
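A minimal sketch of the two ingredients, assuming a rectified pinhole stereo pair (depth from disparity, Z = fB/d) and a constant-velocity Kalman filter for the image-plane motion; the noise parameters and geometry below are illustrative, not the study's actual estimator.

```python
import numpy as np

def depth_from_disparity(f, baseline, xl, xr):
    """Pinhole stereo: depth Z = f * B / (xl - xr) for a rectified camera pair
    (assumes nonzero disparity, i.e. the object is not at infinity)."""
    return f * baseline / (xl - xr)

class Kalman2D:
    """Constant-velocity Kalman filter tracking (x, y) image-plane motion."""
    def __init__(self, dt=1.0, q=1e-3, r=1e-1):
        self.x = np.zeros(4)                       # state: [px, py, vx, vy]
        self.P = np.eye(4)                         # state covariance
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt   # motion model
        self.H = np.zeros((2, 4)); self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = q * np.eye(4)                     # process noise
        self.R = r * np.eye(2)                     # measurement noise

    def step(self, z):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # update with measurement z = (px, py)
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

In the combined scheme described above, the filter would track the object in the image plane each frame, while the disparity between the two camera views updates the perpendicular depth.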