Abstract: Chemical Reaction Optimization (CRO) is an
optimization metaheuristic inspired by the nature of chemical
reactions as a natural process of transforming the substances from
unstable to stable states. Starting with some unstable molecules with
excessive energy, a sequence of interactions takes the set to a state of
minimum energy. Researchers reported successful application of the
algorithm in solving some engineering problems, like the quadratic
assignment problem, with superior performance when compared with
other optimization algorithms. We adapted this optimization
algorithm to the Printed Circuit Board Drilling Problem (PCBDP)
towards reducing the drilling time and hence improving the PCB
manufacturing throughput. Although the PCBDP can be viewed as an
instance of the popular Traveling Salesman Problem (TSP), it has
some characteristics that require special attention to the
operators that explore the solution landscape. Experimental
results using the standard CROToolBox are not promising for
practically sized problems, although it could find optimal solutions for
artificial problems and small benchmarks as a proof of concept.
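The energy-based acceptance that drives CRO can be illustrated on a toy tour-improvement problem. The sketch below is not the CROToolBox implementation: it applies a single CRO-style operator (an on-wall ineffective collision, realized as a 2-opt reversal) to a four-city tour, and the energy-loss rate, iteration count, and initial kinetic energy are hypothetical values chosen for illustration.

```python
import math, random

def tour_length(tour, pts):
    # total closed-tour length (the "potential energy" of the molecule)
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def on_wall_collision(tour):
    # neighbour move: reverse a random segment (2-opt style perturbation)
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def cro_like_search(pts, iters=20000, ke0=10.0, loss=0.999):
    random.seed(1)
    tour = list(range(len(pts)))
    pe, ke = tour_length(tour, pts), ke0      # potential + kinetic energy
    best, best_pe = tour[:], pe
    for _ in range(iters):
        cand = on_wall_collision(tour)
        cand_pe = tour_length(cand, pts)
        if pe + ke >= cand_pe:                # accept if total energy suffices
            ke = (pe + ke - cand_pe) * loss   # surplus becomes KE, part is lost
            tour, pe = cand, cand_pe
            if pe < best_pe:
                best, best_pe = tour[:], pe
    return best, best_pe

# four corners of a unit square, listed so the initial tour self-crosses;
# the optimal (perimeter) tour has length 4
pts = [(0, 0), (1, 1), (0, 1), (1, 0)]
tour, length = cro_like_search(pts)
```

The kinetic-energy buffer is what distinguishes this from plain hill climbing: worse tours are accepted while KE remains, which is how CRO escapes local minima.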
Abstract: In this paper, we propose a method for three-dimensional
(3-D) model indexing based on defining a new
descriptor using spherical harmonics.
The aim of the method is to minimize the processing time on the
database of object models and the time needed to search for objects
similar to a query object.
First, we define the new descriptor using a new
division of the 3-D object within a sphere. Then we define a new
distance to be used when searching for similar objects in the database.
Abstract: Evaluating the performance of a simulator in the
CAVE has to be confirmed by encouraging people to live the
virtual reality experience. In this paper, a detailed procedure for
recording video is presented. Limitations of the experimental device
are first exposed; then, solutions for improving the setup are
described.
Abstract: In this paper, a new algorithm to generate random
simple polygons from a given set of points in a two-dimensional
plane is designed. The proposed algorithm uses a genetic algorithm to
generate polygons with few vertices. A new merge algorithm is
presented which converts any two polygons into a simple polygon.
This algorithm at first changes two polygons into a polygonal chain
and then the polygonal chain is converted into a simple polygon. The
process of converting a polygonal chain into a simple polygon is
based on the removal of intersecting edges. The experimental results
show that the proposed algorithm can generate a large
number of different simple polygons and performs better than
well-known algorithms such as space partitioning and
steady growth.
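The final step the abstract describes, converting a polygonal chain into a simple polygon by removing intersecting edges, can be sketched as repeated 2-opt untangling. This is a hedged illustration of that idea, not the authors' merge algorithm; the bow-tie input is a made-up example.

```python
def ccw(a, b, c):
    # twice the signed area of triangle abc
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1, p2, p3, p4):
    # proper intersection test for segments p1p2 and p3p4
    d1, d2 = ccw(p3, p4, p1), ccw(p3, p4, p2)
    d3, d4 = ccw(p1, p2, p3), ccw(p1, p2, p4)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def untangle(poly):
    # repeatedly remove a pair of crossing edges by reversing the
    # sub-chain between them (a 2-opt move); this terminates because
    # each move strictly shortens the polygon's perimeter
    n = len(poly)
    changed = True
    while changed:
        changed = False
        for i in range(n):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue  # these edges share a vertex
                a, b = poly[i], poly[(i + 1) % n]
                c, d = poly[j], poly[(j + 1) % n]
                if segments_cross(a, b, c, d):
                    poly[i + 1:j + 1] = reversed(poly[i + 1:j + 1])
                    changed = True
    return poly

# a self-crossing "bow-tie" becomes the simple unit square
poly = untangle([(0, 0), (1, 1), (1, 0), (0, 1)])
```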
Abstract: The main aim of a communication system is to
achieve maximum performance. In Cognitive Radio, any user or
transceiver has the ability to sense the most suitable channel while the
channel is not in use, meaning an unlicensed user can share the spectrum of a
licensed user without any interference. However, spectrum sensing
consumes a large amount of energy, which can be reduced by applying
various artificial intelligence methods for determining proper spectrum
holes; this also increases the efficiency of the Cognitive Radio Network
(CRN). In this survey paper, we discuss the use of different learning
models and the implementation of Artificial Neural Networks (ANN) to
increase the learning and decision-making capacity of CRN without
affecting bandwidth, cost and signal rate.
Abstract: Spam is any unwanted electronic message or material
in any form posted to many people. As the world becomes a
global village, social networking sites play an important role in
it, providing people from different parts of the
world a platform to meet and express their views. Among the different
social networking sites, Facebook has become the leading one. With
the increase in usage, some users have begun abusing Facebook by
posting spam or creating ways to post it. This paper highlights the
potential spam types Facebook users face nowadays. It
also explains how users become victims of spam attacks. A
methodology proposed at the end discusses how to handle different
types of spam.
Abstract: In this study, the data loss tolerance of a Support Vector Machine (SVM) based activity recognition model and its multi-activity classification performance when data are received over a lossy wireless sensor network are examined. Initially, the classification algorithm we use is evaluated in terms of resilience to random data loss with 3D acceleration sensor data for sitting, lying, walking and standing actions. The results show that the proposed classification method can recognize these activities successfully despite high data loss. Secondly, the effect of differentiated quality of service performance on activity recognition success is measured with activity data acquired from a multi-hop wireless sensor network, which introduces high data loss. The effect of the number of nodes on reliability and multi-activity classification success is demonstrated in a simulation environment. To the best of our knowledge, the effect of data loss in a wireless sensor network on the activity detection success rate of an SVM based classification algorithm has not been studied before.
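The random-data-loss experiment can be mimicked in a few lines. The sketch below substitutes a nearest-centroid rule for the paper's SVM (to stay dependency-free), and the two-activity synthetic acceleration data, noise levels, window size, and 60% loss rate are all assumptions for illustration, not the paper's settings.

```python
import random, statistics

random.seed(7)

def window(activity, n=50):
    # synthetic 3-axis acceleration: "walking" oscillates, "standing" is flat
    noise = 0.6 if activity == "walking" else 0.05
    return [tuple(random.gauss(0.0, noise) for _ in range(3)) for _ in range(n)]

def drop(samples, loss_rate):
    # simulate random packet loss in the sensor network
    kept = [s for s in samples if random.random() > loss_rate]
    return kept if kept else samples[:1]  # keep at least one sample

def features(samples):
    # per-axis standard deviation; robust to losing individual samples
    return tuple(statistics.pstdev(ax) for ax in zip(*samples))

def dist(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v))

# "train" centroids on loss-free windows
train = {act: features([s for _ in range(20) for s in window(act)])
         for act in ("walking", "standing")}

# classify windows that suffered 60% random data loss
correct, trials = 0, 100
for _ in range(trials):
    act = random.choice(("walking", "standing"))
    f = features(drop(window(act), 0.60))
    guess = min(train, key=lambda a: dist(train[a], f))
    correct += (guess == act)
accuracy = correct / trials
```

Variance-style features survive heavy random loss because they depend on the distribution of the remaining samples, not on any individual reading, which is consistent with the resilience the study reports.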
Abstract: In-memory database systems are becoming popular
due to the availability and affordability of sufficiently large RAM and
processors in modern high-end servers with the capacity to manage
large in-memory database transactions. While fast and reliable
in-memory systems are still being developed to overcome cache misses,
CPU/IO bottlenecks and distributed transaction costs, disk-based data
stores still serve as the primary persistence. In addition, with the
recent growth in multi-tenancy cloud applications and associated
security concerns, many organisations consider the trade-offs and
continue to require fast and reliable transaction processing of
disk-based database systems as an available choice. For these
organizations, the only way of increasing throughput is by improving
the performance of disk-based concurrency control. This warrants a
hybrid database system with the ability to selectively apply an
enhanced disk-based data management within the context of
in-memory systems that would help improve overall throughput.
The general view is that in-memory systems substantially
outperform disk-based systems. We question this assumption and
examine how a modified variation of access invariance that we call
enhanced memory access (EMA) can be used to allow very high
levels of concurrency in the pre-fetching of data in disk-based
systems. We demonstrate how this prefetching in disk-based systems
can yield close to in-memory performance, which paves the way for
improved hybrid database systems. This paper proposes a novel EMA
technique and presents a comparative study between disk-based EMA
systems and in-memory systems running on hardware configurations
of equivalent power in terms of the number of processors and their
speeds. The results of the experiments conducted clearly substantiate
that when used in conjunction with all concurrency control
mechanisms, EMA can increase the throughput of disk-based systems
to levels quite close to those achieved by in-memory systems. The
promising results of this work show that enhanced disk-based
systems facilitate improved hybrid data management within the
broader context of in-memory systems.
Abstract: In this study, we propose a novel technique for acoustic
echo suppression (AES) during speech recognition under barge-in
conditions. Conventional AES methods based on spectral subtraction
apply fixed weights to the estimated echo path transfer function
(EPTF) at the current signal segment and to the EPTF estimated until
the previous time interval. However, the effects of echo path changes
should be considered for eliminating the undesired echoes. We
describe a new approach that adaptively updates weight parameters in
response to abrupt changes in the acoustic environment due to
background noises or double-talk. Furthermore, we devised a voice
activity detector and an initial time-delay estimator for barge-in speech
recognition in communication networks. The initial time delay is
estimated using a log-spectral distance measure, as well as
cross-correlation coefficients. The experimental results show that the
developed techniques can be successfully applied in barge-in speech
recognition systems.
Abstract: In a previous study, a technique to estimate self-location using a lunar image was proposed. In this paper, we consider improvements to the conventional method with FPGA implementation in mind. Specifically, we introduce the Artificial Bee Colony algorithm to reduce search time. In addition, we use fixed-point arithmetic to enable high-speed operation on the FPGA.
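Fixed-point arithmetic of the kind used for such an FPGA port can be modeled in software. The sketch below assumes a Q16.16 format (16 fractional bits); the word lengths actually chosen for the FPGA are not stated in the abstract.

```python
Q = 16  # hypothetical Q16.16 format: 16 fractional bits

def to_fixed(x):
    # convert a real number to its fixed-point integer representation
    return int(round(x * (1 << Q)))

def from_fixed(x):
    # convert back to a float for inspection
    return x / (1 << Q)

def fmul(a, b):
    # fixed-point multiply: the raw product carries 2*Q fractional
    # bits, so shift right by Q to restore the format
    return (a * b) >> Q

a, b = to_fixed(1.5), to_fixed(-2.25)
prod = from_fixed(fmul(a, b))  # 1.5 * -2.25
```

On an FPGA the same shift-and-multiply pattern maps to plain integer DSP slices, which is why fixed-point enables the high-speed operation the abstract mentions.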
Abstract: Operations research (OR) is a science that has had good
success in developing and applying scientific methods for problem
solving and decision-making. Moreover, by using OR techniques, we
can enhance the use of computer decision support systems to achieve
optimal management for institutions. OR applies comprehensive
analysis, including all the factors that affect the problem, and builds
mathematical models to solve business or organizational problems. In
addition, it improves decision-making and uses available resources
efficiently. The adoption of OR by universities would definitely
contribute to the development and enhancement of the performance of
OR techniques. This paper provides an understanding of the structures,
approaches and models of OR in problem solving and decision-making.
Abstract: Artificial Neural Network (ANN) can be trained using
back propagation (BP). It is the most widely used algorithm for
supervised learning with multi-layered feed-forward networks.
Efficient learning by the BP algorithm is required for many practical
applications. The BP algorithm calculates the weight changes of
artificial neural networks, and a common approach is to use a
two-term algorithm consisting of a learning rate (LR) and a momentum
factor (MF). The major drawbacks of the two-term BP learning
algorithm are the problems of local minima and slow convergence
speeds, which limit the scope for real-time applications. Recently the
addition of an extra term, called a proportional factor (PF), to the
two-term BP algorithm was proposed. The third term increases the speed
of the BP algorithm. However, the PF term also affects the
convergence of the BP algorithm, and criteria for evaluating
convergence are required to facilitate the application of the
three-term BP algorithm. Although these two issues seem to be closely
related, as described later, we summarize various improvements to
overcome the drawbacks. Here we compare the convergence of the
different variants of the new three-term BP algorithm.
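A minimal sketch of a three-term update, on a single linear unit fitting y = 2x. The exact form of the proportional term varies in the literature; here it is taken proportional to the current output error, and the LR/MF/PF values are illustrative, not tuned settings from any particular paper.

```python
# training pairs for the target mapping y = 2x
data = [(x / 10.0, 2 * x / 10.0) for x in range(1, 6)]

def train(lr=0.1, mf=0.5, pf=0.05, epochs=200):
    w, dw_prev = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            err = y - w * x    # output error e
            grad = -err * x    # dE/dw for E = e^2 / 2
            # three-term update:
            #   learning-rate term + momentum term + proportional term
            dw = -lr * grad + mf * dw_prev + pf * err
            w, dw_prev = w + dw, dw
    return w

w = train()  # converges toward the true weight w = 2
```

With pf = 0 this reduces to the classical two-term rule; the extra term injects a correction proportional to the raw error, which is what speeds early training but also complicates the convergence analysis the abstract refers to.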
Abstract: Many wireless sensor network applications require
K-coverage of the monitored area. In this paper, we propose a
harmony search based algorithm, scalable in terms of execution
time, called the K-Coverage Enhancement Algorithm (KCEA), which
attempts to enhance initial coverage and achieve the required
K-coverage degree for a specific application efficiently. Simulation
results show that
the proposed algorithm achieves coverage improvement of 5.34%
compared to K-Coverage Rate Deployment (K-CRD), which achieves
1.31% when deploying one additional sensor. Moreover, the proposed
algorithm is more time efficient.
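KCEA builds on harmony search, and the underlying metaheuristic can be sketched on a toy objective. The code below minimizes a sphere function standing in for a coverage cost; the HMCR, PAR, and bandwidth values are typical textbook settings, not those of KCEA.

```python
import random

random.seed(3)

def sphere(x):
    # toy objective standing in for a coverage cost to be minimized
    return sum(v * v for v in x)

def harmony_search(dim=2, hms=10, hmcr=0.9, par=0.3, bw=0.1,
                   iters=2000, lo=-5.0, hi=5.0):
    # harmony memory: hms random solutions, kept sorted by fitness
    hm = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    hm.sort(key=sphere)
    for _ in range(iters):
        new = []
        for d in range(dim):
            if random.random() < hmcr:        # memory consideration
                v = random.choice(hm)[d]
                if random.random() < par:     # pitch adjustment
                    v += random.uniform(-bw, bw)
            else:                             # random selection
                v = random.uniform(lo, hi)
            new.append(min(hi, max(lo, v)))
        if sphere(new) < sphere(hm[-1]):      # replace the worst harmony
            hm[-1] = new
            hm.sort(key=sphere)
    return hm[0]

best = harmony_search()
```

In a K-coverage setting the decision variables would be sensor positions and the objective a coverage-degree measure, but the improvise/replace loop is the same.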
Abstract: Nowadays social media information, such as news,
links, images, or videos, is shared extensively. However,
information disseminated through social media
often lacks quality: less fact-checking, more bias, and many rumors.
Many researchers have investigated credibility on Twitter, but
there are no research reports about the credibility of information on
Facebook. This paper proposes features for measuring the credibility
of Facebook information, and we developed a system around them.
First, we built an FB credibility evaluator for
measuring the credibility of each post via manual human labelling. We
then collected the training data for creating a model using a Support
Vector Machine (SVM). Second, we developed a Chrome extension
of FB credibility for Facebook users to evaluate the credibility of
each post. Based on the usage analysis of our FB credibility Chrome
extension, about 81% of users' responses agree with the suggested
credibility automatically computed by the proposed system.
Abstract: Container handling problems at container terminals
are NP-hard problems. This paper presents an approach using
discrete-event simulation modeling to optimize the solution of the
storage space allocation problem, taking into account the various
interrelated container terminal handling activities. The proposed
approach is applied to real case-study data from the container terminal
at Alexandria port. The computational results show the effectiveness of
the proposed model for optimizing storage space allocation in a
container terminal, where a 54% reduction in container handling time
in port is achieved.
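The discrete-event simulation style the paper uses can be illustrated with a minimal single-crane model driven by a priority queue of timestamped events. The arrival times and service time below are invented, and the model omits the interrelated terminal activities the full study accounts for.

```python
import heapq

def simulate(arrivals, service_time):
    # event-driven loop: (time, event, container) tuples in a min-heap
    events = [(t, "arrive", i) for i, t in enumerate(arrivals)]
    heapq.heapify(events)
    crane_free, makespan = 0.0, 0.0
    while events:
        t, kind, i = heapq.heappop(events)
        if kind == "arrive":
            # one crane serves containers in arrival order
            start = max(t, crane_free)
            crane_free = start + service_time
            heapq.heappush(events, (crane_free, "depart", i))
        else:
            makespan = t  # time the last container left
    return makespan

# three containers arriving at t = 0, 1, 2; each needs 5 time units
makespan = simulate([0.0, 1.0, 2.0], 5.0)
```

A storage-allocation study would add events for yard moves and truck transfers and compare allocation policies by the resulting handling times, but the heap-driven clock is the core of any discrete-event model.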
Abstract: In this paper, a new design of a broadband microwave
power limiter is presented and validated in simulation using the
ADS software (Advanced Design System) from Agilent Technologies.
The final circuit is built on microstrip lines using identical Zero
Bias Schottky diodes. The power limiter is designed by associating
three Schottky-diode stages. The simulation results obtained
validate this circuit, with a threshold input power level of 0 dBm up to
a maximum input power of 30 dBm.
Abstract: In this paper, we present a robust algorithm to recognize extracted text from grocery product images captured by mobile phone cameras. Recognition of such text is challenging since text in grocery product images varies in its size, orientation,
style, illumination, and can suffer from perspective distortion.
Pre-processing is performed to make the characters scale and
rotation invariant. Since text degradations cannot be appropriately
defined using well-known geometric transformations such
as translation, rotation, affine transformation and shearing, we
use the whole character black pixels as our feature vector.
Classification is performed with a minimum distance classifier
using the maximum likelihood criterion, which delivers a very
promising Character Recognition Rate (CRR) of 89%. We
achieve a considerably higher Word Recognition Rate (WRR) of
99% when using lower-level linguistic knowledge about product
words during the recognition process.
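A minimum distance classifier over raw pixel features, as the abstract describes, reduces to a nearest-template search. The 3x5 glyphs below are toy stand-ins for the full character bitmaps used as feature vectors in the paper.

```python
# 3x5 binary glyph templates (hypothetical; the real system uses the
# whole character's black pixels as the feature vector)
TEMPLATES = {
    "1": "010"
         "110"
         "010"
         "010"
         "111",
    "0": "111"
         "101"
         "101"
         "101"
         "111",
}

def distance(a, b):
    # Hamming distance between two binary pixel strings
    return sum(x != y for x, y in zip(a, b))

def classify(glyph):
    # minimum distance classifier: pick the nearest template
    return min(TEMPLATES, key=lambda c: distance(TEMPLATES[c], glyph))

# a noisy "0" with one flipped pixel is still recognised
noisy_zero = "111" "101" "111" "101" "111"
label = classify(noisy_zero)
```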
Abstract: The focal aim of e-Government (eGovt) is to offer
citizen-centered service delivery. Accordingly, the citizenry
consumes services from multiple government agencies through a
national portal. Thus, eGovt is an enterprise whose primary
business motive is transparent, efficient and effective public services
for its citizenry, and its logical structure is the eGovernment Enterprise
Architecture (eGEA). Since eGovt is an IT-oriented, multifaceted,
service-centric system, EA does not do much for an automated
enterprise beyond the business artifacts. The advent of Service-Oriented
Architecture (SOA) led some governments to apply
it in their eGovts, but it limits the source of business artifacts. The
concurrent use of EA and SOA in eGovt delivers interoperability and
integration and leads to the Service-Oriented e-Government Enterprise
(SOeGE). Consequently, an agile eGovt system becomes a reality. From an
IT perspective, eGovt comprises centralized public service artifacts,
with the existing application logic belonging to various departments at
the central, state and local levels. The eGovt is being renovated to
SOeGE by applying Service-Orientation (SO) principles across the
entire system. This paper explores the IT perspective of SOeGE in
India, which encompasses the public service models, and illustrates it
with a case study of the Passport service of India.
Abstract: Real time image and video processing is a demand in
many computer vision applications, e.g. video surveillance, traffic
management and medical imaging. The processing of those video
applications requires high computational power. Thus, the optimal
solution is the collaboration of CPU and hardware accelerators. In
this paper, a Canny edge detection hardware accelerator is proposed.
Edge detection is one of the basic building blocks of video and image
processing applications. It is a common block in the pre-processing
phase of the image and video processing pipeline. Our presented
approach targets offloading the Canny edge detection algorithm from
the processing system (PS) to programmable logic (PL), taking
advantage of the High Level Synthesis (HLS) tool flow to accelerate the
implementation on the Zynq platform. The resulting implementation
enables up to a 100x performance improvement through hardware
acceleration. The CPU utilization drops and the frame rate
reaches 60 fps for a 1080p full-HD input video stream.
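The kind of per-pixel kernel that benefits from PS-to-PL offloading can be seen in the gradient stage of Canny. The plain-Python reference model below computes only the Sobel gradient magnitude (one stage of the full Canny pipeline) on a tiny synthetic image; it is a behavioral sketch, not the HLS implementation.

```python
# 5x5 test image with a vertical edge down the middle
IMG = [[0, 0, 255, 255, 255] for _ in range(5)]

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def convolve_at(img, kernel, r, c):
    # 3x3 convolution centred on interior pixel (r, c)
    return sum(kernel[i][j] * img[r - 1 + i][c - 1 + j]
               for i in range(3) for j in range(3))

def gradient_magnitude(img):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            gx = convolve_at(img, SOBEL_X, r, c)
            gy = convolve_at(img, SOBEL_Y, r, c)
            out[r][c] = (gx * gx + gy * gy) ** 0.5
    return out

mag = gradient_magnitude(IMG)  # strong response along the edge columns
```

Because every output pixel depends only on a fixed 3x3 neighbourhood, the loop nest pipelines and unrolls well under HLS, which is the property the reported 100x speedup exploits.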
Abstract: Learning through creation of contextual games is a
very promising approach when undertaking interdisciplinary and
international group projects. During 2013 and 2014 the authors
organized two intensive student projects. The two projects took place in
different countries and under different conditions. Between them, the two
projects involved 68 students and 12 mentors from five EU countries
and from various academic disciplines. In this paper we share our
experience of these two projects and we suggest approaches that can
be utilized to strengthen the chances of succeeding in short (12-15
days long) intensive student projects.