Abstract: Supply chains are the backbone of trade and
commerce. Their logistics rely on different transport corridors on a
regular basis for operational purposes. The international supply chain
transport corridors include different infrastructure elements (e.g.
weighbridges, package handling equipment, border clearance
authorities, and so on). This paper presents the use of multi-agent
systems (MAS) to model and simulate some aspects of transportation
corridors, and in particular the area of weighbridge resource
optimization for operational profit. An underlying multi-agent model
provides a means of modeling the relationships among stakeholders
in order to enable coordination in a transport corridor environment.
Simulations of the costs of container unloading, reloading, and
waiting time for queued trucks have been carried out using data
sets. Results of the simulation provide potential guidance for
making decisions about optimal service resource allocation in a trade
corridor.
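The trade-off the simulation explores, truck waiting time against the number of weighbridge servers, can be illustrated with a minimal queue sketch. All arrival rates, service rates, and counts below are hypothetical, not the paper's data:

```python
import random

def simulate_weighbridge(num_bridges, num_trucks=200, seed=1):
    """Toy FIFO simulation: trucks arrive at random intervals and are
    weighed by the first free weighbridge; returns mean waiting time."""
    random.seed(seed)
    free_at = [0.0] * num_bridges              # next free time of each bridge
    clock, total_wait = 0.0, 0.0
    for _ in range(num_trucks):
        clock += random.expovariate(1 / 4.0)   # mean inter-arrival: 4 min
        bridge = min(range(num_bridges), key=lambda b: free_at[b])
        start = max(clock, free_at[bridge])
        total_wait += start - clock
        free_at[bridge] = start + random.expovariate(1 / 6.0)  # mean service: 6 min
    return total_wait / num_trucks

# More bridges should cut average waiting time in this toy setting.
wait_one = simulate_weighbridge(1)
wait_three = simulate_weighbridge(3)
```

Comparing waiting times for different server counts against the cost of operating extra weighbridges is the kind of resource-allocation decision the simulation informs.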
Abstract: Nowadays, cloud environments are becoming a necessity for companies; this technology offers the opportunity to access data anywhere and anytime. It also provides optimized and secure access to resources and adds security for the data stored on the platform. However, some companies do not trust cloud providers, believing that providers can access and modify confidential data such as bank accounts. Much work has been done in this context, concluding that encryption performed by providers ensures confidentiality, but overlooking the fact that cloud providers themselves can decrypt the confidential resources. The best solution is therefore to apply some operations to the data before sending it to the cloud provider, with the objective of making it unreadable to the provider. The principal idea is to allow users to protect their data with their own methods. In this paper, we demonstrate our approach and show that it is more efficient in terms of execution time than some existing methods. This work aims at enhancing the quality of service of providers and ensuring the trust of customers.
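The idea of making data unreadable before it leaves the client can be sketched with a simple keystream cipher built from a standard hash. This is an illustration of the principle only, not the paper's method, and not a substitute for a vetted cipher:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream by hashing key||counter (sketch only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def protect(data: bytes, key: bytes) -> bytes:
    """XOR data with the keystream before uploading; the same call reverses it."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

secret = b"bank account: 1234"
blob = protect(secret, b"client-side-key")            # what the provider stores
recovered = protect(blob, b"client-side-key")         # what the client gets back
```

Because the key never leaves the client, the provider stores only the unreadable blob.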
Abstract: STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance, and by comparison with conventional methods. However, scope for future development remains before STRIM can be applied to the analysis of real-world data sets. The first requirement is to determine the size of the dataset needed for inducing true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity for rule induction from datasets whose attribute values are contaminated by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived in the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data sets.
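The core step described above, deciding whether a candidate if-then rule is statistically significant in a decision table, can be sketched as a one-proportion z-test. The counts and threshold below are illustrative, not STRIM's exact procedure:

```python
import math

def rule_z_score(n_match, n_correct, base_rate):
    """z-score testing whether a rule's accuracy n_correct/n_match exceeds
    the base rate of its decision class under the null hypothesis."""
    p_hat = n_correct / n_match
    se = math.sqrt(base_rate * (1 - base_rate) / n_match)
    return (p_hat - base_rate) / se

# A hypothetical rule covering 120 rows, 90 of them in a class whose
# overall base rate in the table is 0.5:
z = rule_z_score(120, 90, 0.5)
significant = z > 1.96        # 5% significance level (normal approximation)
```

A larger dataset shrinks the standard error, which is why a minimum dataset size is needed before true rules become statistically detectable.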
Abstract: In this paper, we introduce an NLG application for the automatic creation of ready-to-publish texts from big data. The resulting fully automatically generated news stories closely resemble the style in which a human writer would draw up such a story. Topics include soccer games, stock exchange market reports, and weather forecasts. Each generated text is unique. Ready-to-publish stories written by a computer application can help humans to quickly grasp the outcomes of big data analyses, save time-consuming pre-formulations for journalists, and cater to rather small audiences by offering stories that would otherwise not exist.
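A minimal sketch of the template-based idea behind such data-to-text generation; the match data and wording are invented for illustration and are not the application's actual templates:

```python
def soccer_report(home, away, home_goals, away_goals):
    """Fill a hand-written template from structured match data."""
    if home_goals > away_goals:
        outcome = f"{home} beat {away} {home_goals}-{away_goals}"
    elif home_goals < away_goals:
        outcome = f"{home} lost to {away} {home_goals}-{away_goals}"
    else:
        outcome = f"{home} and {away} drew {home_goals}-{away_goals}"
    return f"{outcome} on Saturday."

story = soccer_report("Ajax", "Feyenoord", 2, 1)
```

Varying the templates and the selected data facts is what keeps each generated text unique.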
Abstract: Information generated from various computerization processes is a potentially rich source of knowledge for its designated community. To pass this information from generation to generation without modifying its meaning is a challenging activity. To preserve and archive the data for future generations, it is essential to prove the authenticity of the data. This can be achieved by extracting metadata from the data that can prove its authenticity and create trust in the archived data. A subsequent challenge is technology obsolescence. Metadata extraction and standardization can be used effectively to tackle this problem. Metadata can broadly be categorized at two levels, i.e., technical and domain level. Technical metadata provides information that can be used to understand and interpret the data record, but this level of metadata alone is not sufficient to create trustworthiness. We have developed a tool that extracts and standardizes both technical and domain-level metadata. This paper describes the different features of the tool and how it was developed.
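The technical-level metadata such a tool extracts (size, timestamps, a fixity checksum) can be sketched with the standard library. The field names here are illustrative, not the tool's actual schema:

```python
import hashlib
import os
import tempfile

def technical_metadata(path):
    """Extract basic technical metadata plus a SHA-256 fixity checksum."""
    st = os.stat(path)
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "size_bytes": st.st_size,
        "modified": st.st_mtime,
        "sha256": digest,     # lets future readers verify authenticity
    }

# Demonstrate on a throwaway file:
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"archived record")
meta = technical_metadata(tmp.name)
os.unlink(tmp.name)
```

Recomputing the checksum decades later and comparing it to the archived value is one way to demonstrate that the record was not modified.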
Abstract: In addition to environmental parameters such as rain and
temperature, crop disease is a major factor affecting the quality
and quantity of crop yield. Hence, disease management is a key issue
in agriculture. For effective management, a disease needs to be
detected at an early stage so that it can be treated properly and
its spread controlled. Nowadays, it is possible to use images of a
diseased leaf to detect the type of disease using image processing
techniques. This can be achieved by extracting features from the
images, which can then be used with classification algorithms or
content-based image retrieval systems. In this paper, color images
are used to extract features such as mean and standard deviation
after a region-cropping step. The selected features are taken from
the cropped image at different image size samples. Then, the
extracted features are taken into account for classification using a
Fuzzy Inference System (FIS).
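The two features named above, mean and standard deviation of a cropped region, can be sketched directly. The pixel values below are an invented toy region, not the paper's leaf images:

```python
import statistics

def region_features(region):
    """Mean and standard deviation of one color channel of a cropped region."""
    pixels = [p for row in region for p in row]    # flatten rows to one list
    return statistics.mean(pixels), statistics.pstdev(pixels)

# A hypothetical 3x3 crop from one channel of a diseased-leaf image:
crop = [[120, 130, 125],
        [128, 122, 126],
        [124, 129, 121]]
mean, std = region_features(crop)
```

Computing these per channel yields the small feature vector fed to the classifier.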
Abstract: In this paper, we propose a new method for
three-dimensional object indexing based on the D.A.M.C-S.H.C
descriptor (Direct and Analytical Method for Calculating the
Spherical Harmonics Coefficients). To this end, we propose a direct
calculation of the coefficients of spherical harmonics with perfect
precision. The aims of the method are to minimize the processing
time on the 3D object database and the time needed to search for
objects similar to a query object.
We first define the new descriptor using a new division of the
3D object within a sphere. We then define a new distance, which is
tested to prove its efficiency in the search for similar objects in
a database containing objects of widely varying and large sizes.
Abstract: This paper focuses on I/O optimizations of N-hybrid
(New-Form of hybrid), which provides a hybrid file system space
constructed on SSD and HDD. Although the promising potential of
SSDs, such as the absence of mechanical moving overhead and high
random I/O throughput, has drawn a lot of attention from IT
enterprises, their high cost/capacity ratio makes it less desirable to
build a large-scale data storage subsystem composed only of SSDs. In
this paper, we present N-hybrid that attempts to integrate the strengths
of SSD and HDD, to offer a single, large hybrid file system space.
Several experiments were conducted to verify the performance of
N-hybrid.
Abstract: The legality of some countries' or agencies' acts of
spying on the public's personal phone calls has become a hot topic
for many social groups. It is believed that this act constitutes an
invasion of privacy. Such an act may be justified if it singles out
specific cases, but spying without limits is unacceptable. This
paper discusses the need for a technique to secure mobile voice
calls that is not only simple and lightweight but also independent
of any encryption standard or library. It then presents and tests an
encryption algorithm based on a frequency-scrambling technique,
demonstrating a fair and delay-free process that can protect phone
calls from such spying acts.
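The frequency-scrambling principle, permute spectral bands before transmission and invert the permutation at the receiver, can be sketched with a naive DFT. The band choice and signal are illustrative; the paper's actual algorithm may differ:

```python
import cmath
import math

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

def scramble(x, swap=(2, 6)):
    """Swap two frequency bins (and their conjugate mirrors) to garble speech;
    applying the same swap again restores the signal."""
    X = dft(x)
    N, (a, b) = len(X), swap
    X[a], X[b] = X[b], X[a]
    X[N - a], X[N - b] = X[N - b], X[N - a]   # keep conjugate symmetry: output stays real
    return idft(X)

signal = [math.sin(2 * math.pi * 2 * n / 32) for n in range(32)]  # toy "voice" tone
garbled = scramble(signal)          # what an eavesdropper would hear
restored = scramble(garbled)        # the receiver undoes the swap
```

Because only a band permutation is involved, no block-cipher library is needed and the process adds essentially no delay, which matches the lightweight, standard-independent goal stated above.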
Abstract: This paper describes the main features of a knowledge-based system evaluation method. System evaluation is placed in the context of a hybrid legal decision-support system, Advisory Support for Home Settlement in Divorce (ASHSD). Legal knowledge for ASHSD is represented in two forms, as rules and previously decided cases. Besides distinguishing the two different forms of knowledge representation, the paper outlines the actual use of these forms in a computational framework that is designed to generate a plausible solution for a given case by using rule-based reasoning (RBR) and case-based reasoning (CBR) in an integrated environment. The suitability assessment of a solution has been treated as a multiple-criteria decision-making process in the ASHSD evaluation. The evaluation was performed through a combination of discussions and questionnaires with different user groups. The answers to the questionnaires used in this evaluation method have been measured as fuzzy linguistic terms. The findings suggest that fuzzy linguistic evaluation is practical and meaningful for knowledge-based system development.
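Measuring questionnaire answers as fuzzy linguistic terms can be sketched by mapping each term to a triangular fuzzy number and averaging. The five-term scale below is a common convention, not necessarily the one used for ASHSD:

```python
# Each linguistic term maps to a triangular fuzzy number (low, mid, high).
TERMS = {
    "poor":      (0.0, 0.0, 0.25),
    "fair":      (0.0, 0.25, 0.5),
    "good":      (0.25, 0.5, 0.75),
    "very good": (0.5, 0.75, 1.0),
    "excellent": (0.75, 1.0, 1.0),
}

def aggregate(answers):
    """Average the fuzzy numbers, then defuzzify by the triangle centroid."""
    n = len(answers)
    lo, mid, hi = (sum(TERMS[a][i] for a in answers) / n for i in range(3))
    return (lo + mid + hi) / 3.0   # centroid of a triangular fuzzy number

# Hypothetical answers from four evaluators to one questionnaire item:
score = aggregate(["good", "very good", "good", "excellent"])
```

The defuzzified score gives one crisp value per criterion, which can then feed a multiple-criteria decision-making step.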
Abstract: In this paper, a new methodology for vendor selection
and supply quotas determination (VSSQD) is proposed. The problem
of VSSQD is solved by the model that combines revised weighting
method for determining the objective function coefficients, and a
multiple objective linear programming (MOLP) method based on the
cooperative game theory for VSSQD. The criteria used for VSSQD
are: (1) purchase costs and (2) product quality supplied by individual
vendors. The proposed methodology has been tested on the example
of flour purchase for a bakery with two decision makers.
Abstract: This paper presents development results on the use of
C-OTDR monitoring systems for rail traffic management. The C-OTDR
method is based on the vibrosensitive properties of optical fibers.
Analysis of changes in the parameters of Rayleigh backscattering
radiation, which take place due to microscopic seismoacoustic
impacts on the optical fiber, allows the positions of seismoacoustic
emission sources to be determined and their types identified. This
approach proved successful for rail traffic management (moving block
systems, weigh-in-motion systems, etc.).
Abstract: This paper presents the performance of Integrated
Bacterial Foraging Optimization and Particle Swarm Optimization
(IBFO_PSO) technique in MANET routing. The BFO is a bio-inspired
algorithm, which simulates the foraging behavior of bacteria.
It is effectively applied in improving the routing performance in
MANET. The results show that PSO integrated with BFO
reduces routing delay, energy consumption, and communication
overhead.
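The PSO half of the hybrid can be illustrated with a minimal particle swarm minimizing a toy cost function, a stand-in for a routing cost rather than the paper's IBFO_PSO formulation:

```python
import random

def pso(cost, dim=2, particles=20, iters=60, seed=3):
    """Minimal particle swarm: each particle is pulled toward its own best
    and the swarm's best position."""
    random.seed(seed)
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)[:]
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]                                   # inertia
                             + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
    return gbest

# Toy routing cost: squared distance from an ideal operating point at the origin.
best = pso(lambda p: sum(x * x for x in p))
```

In the hybrid scheme, BFO's chemotaxis-style local search would refine candidate solutions that PSO's global pull discovers.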
Abstract: Model predictive control is a kind of optimal feedback
control in which control performance over a finite future is optimized
with a performance index that has a moving initial time and a moving
terminal time. This paper examines the stability of model predictive
control for linear discrete-time systems with additive stochastic
disturbances. A sufficient condition for the stability of the closed-loop
system with model predictive control is derived by means of a linear
matrix inequality. The objective of this paper is to show the results
of computational simulations in order to verify the effectiveness of
the obtained stability condition.
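For illustration only (this is the standard Lyapunov-based condition, not necessarily the paper's exact result): with closed-loop dynamics x_{k+1} = (A + BK) x_k + w_k, a sufficient stability condition asks for a matrix P > 0 satisfying

  (A + BK)^T P (A + BK) - P < 0,

which by the Schur complement is equivalent to the linear matrix inequality

  [ P             P (A + BK) ]
  [ (A + BK)^T P  P          ]  >  0,

so feasibility of this LMI in the decision variable P certifies the Lyapunov decrease of the undisturbed closed loop, and hence boundedness under the additive stochastic disturbance.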
Abstract: The new era of digital communication has brought up
many challenges that network operators need to overcome. The high
demand for mobile data rates requires improved networks, which is a
challenge for the operators in terms of maintaining the quality of
experience (QoE) for their consumers. In live video transmission,
there is a sheer need for live surveillance of the videos in order to
maintain the quality of the network. For this purpose, objective
algorithms are employed to monitor the quality of the videos that are
transmitted over a network. In order to test these objective algorithms,
subjective quality assessment of the streamed videos is required, as the
human eye is the best source of perceptual assessment. In this paper we
have conducted subjective evaluation of videos with varying spatial
and temporal impairments. These videos were impaired with frame
freezing distortions so that the impact of frame freezing on the quality
of experience could be studied. We present subjective Mean Opinion
Score (MOS) for these videos that can be used for fine-tuning the
objective algorithms for video quality assessment.
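A Mean Opinion Score is the average of the subjects' ratings, usually reported with a confidence interval; a minimal sketch, where the ratings on a 1-5 scale are invented for illustration:

```python
import math
import statistics

def mos(ratings):
    """MOS and half-width of the 95% confidence interval (normal approximation)."""
    mean = statistics.mean(ratings)
    ci = 1.96 * statistics.stdev(ratings) / math.sqrt(len(ratings))
    return mean, ci

# Hypothetical ratings for one frame-freeze-impaired sequence:
scores = [4, 3, 4, 2, 3, 4, 3, 3, 4, 2]
mean_score, ci95 = mos(scores)
```

Objective algorithms are then tuned so their predicted quality tracks these subjective MOS values.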
Abstract: In recent years, multi-antenna techniques have been considered as a potential solution to increase the throughput of future wireless communication systems. The objective of this article is to study the MIMO (Multiple Input Multiple Output) transmission and reception system and to present the different reception decoding techniques. First, we present the least complex techniques, linear receivers such as the zero-forcing (ZF) equalizer and the minimum mean squared error (MMSE) equalizer. We then present a nonlinear technique called ordered successive interference cancellation (OSIC) and the optimal detector based on the maximum likelihood (ML) criterion. Finally, we simulate the associated decoding algorithms for the MIMO system (ZF, MMSE, OSIC, and ML) and compare the performance of these algorithms in the MIMO context.
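For a 2x2 toy channel, the ZF and MMSE equalizers mentioned above can be written out explicitly. This is a simplified real-valued sketch; actual MIMO detection operates on complex symbols, and the channel matrix here is invented:

```python
def mat2_inv(m):
    """Inverse of a 2x2 matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matvec(m, v):
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def zf_detect(H, y):
    """Zero forcing: invert the channel, x_hat = H^-1 y (2x2 case)."""
    return matvec(mat2_inv(H), y)

def mmse_detect(H, y, noise_var):
    """MMSE: x_hat = (H^T H + sigma^2 I)^-1 H^T y (2x2 case)."""
    Ht = [[H[0][0], H[1][0]], [H[0][1], H[1][1]]]
    HtH = [[sum(Ht[i][k] * H[k][j] for k in range(2)) for j in range(2)]
           for i in range(2)]
    A = [[HtH[0][0] + noise_var, HtH[0][1]],
         [HtH[1][0], HtH[1][1] + noise_var]]
    return matvec(mat2_inv(A), matvec(Ht, y))

H = [[1.0, 0.4], [0.3, 1.0]]          # hypothetical channel matrix
x = [1.0, -1.0]                       # transmitted symbols
y = matvec(H, x)                      # noiseless received vector
x_zf = zf_detect(H, y)
x_mmse = mmse_detect(H, y, 0.01)
```

ZF perfectly inverts a noiseless channel but amplifies noise on ill-conditioned channels; the sigma^2 regularization term is what gives MMSE its advantage at low SNR.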
Abstract: In this paper, we propose a moving object detection
method that helps a driver safely take his or her car out of a
parking lot. When moving objects such as motorbikes, pedestrians,
other cars, and obstacles are detected at the rear side of the host
vehicle, the proposed algorithm can provide a warning to the driver.
We assume that the host vehicle is just before departure. Gaussian
Mixture Model (GMM) based background subtraction is the basic
approach applied. Pre-processing such as smoothing and
post-processing such as morphological filtering are added. We
examine which color space performs better for the detection of
moving objects. Three color spaces, RGB, YCbCr, and Y, are applied
and compared in terms of detection rate. Through simulation, we show
that the RGB space is more suitable for moving object detection
based on background subtraction.
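A simplified single-Gaussian per-pixel background model conveys the subtraction principle behind the GMM approach (a real GMM keeps several Gaussians per pixel); the frames below are tiny invented grayscale grids:

```python
def update_background(bg, frame, alpha=0.1):
    """Running-average background model (single Gaussian mean per pixel)."""
    return [[(1 - alpha) * b + alpha * f for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

def foreground_mask(bg, frame, thresh=30):
    """Pixels far from the background model are flagged as moving objects."""
    return [[abs(f - b) > thresh for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

# Static background learned over identical frames, then an object appears:
bg = [[100.0] * 4 for _ in range(4)]
frame = [row[:] for row in bg]
frame[1][2] = 200.0                    # a bright "pedestrian" pixel at the rear side
mask = foreground_mask(bg, frame)
bg = update_background(bg, frame)      # background slowly absorbs the change
```

Applying this per channel is where the color-space choice (RGB vs. YCbCr vs. Y) enters: the more channels in which a moving pixel deviates, the easier it is to detect.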
Abstract: This paper presents the ‘Eye Ball Motion Controlled
Wheelchair using IR Sensors’ for the elderly and differently abled
people. In this eye tracking based technology, three Proximity
Infrared (IR) sensor modules are mounted on an eye frame to trace
the movement of the iris. Since IR sensors detect only white
objects, a unique sequence of digital bits is generated
corresponding to each eye movement. These signals are then processed
via a microcontroller IC (PIC18F452) to control the motors of the
wheelchair.
The potential and efficiency of previously developed rehabilitation
systems that use head motion, chin control, sip-n-puff control, voice
recognition, and EEG signals variedly have also been explored in
detail. They were found to be inconvenient, as they offered either
limited usability or poor affordability. After multiple regression
analyses, the proposed design was developed as a cost-effective,
flexible, and streamlined alternative for people who have trouble
adopting conventional assistive technologies.
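The mapping from the three IR sensors' digital bits to wheelchair motor commands can be sketched as a lookup table. The bit patterns and command names below are hypothetical, not the published design:

```python
# Hypothetical mapping: (left, centre, right) IR readings -> motor command.
# A sensor reads 1 over the white sclera and 0 over the dark iris.
COMMANDS = {
    (0, 1, 1): "turn_left",    # iris under the left sensor
    (1, 0, 1): "forward",      # iris centred
    (1, 1, 0): "turn_right",   # iris under the right sensor
    (1, 1, 1): "stop",         # eye closed / blink
}

def motor_command(bits):
    """Resolve a sensor pattern; unknown patterns stop the chair for safety."""
    return COMMANDS.get(tuple(bits), "stop")

cmd = motor_command([1, 0, 1])
```

Defaulting every unrecognised pattern to "stop" is the fail-safe choice for an assistive device.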
Abstract: Cloud computing has emerged as a promising
direction for cost efficient and reliable service delivery across data
communication networks. The dynamic location of service facilities
and the virtualization of hardware and software elements are stressing
the communication networks and protocols, especially when data
centres are interconnected through the internet. Although the
computing aspects of cloud technologies have been largely
investigated, less attention has been devoted to the networking
services. Cloud computing has enabled elastic and transparent access
to infrastructure services without involving IT operating overhead.
Virtualization has been a
key enabler for cloud computing. While resource virtualization and
service abstraction have been widely investigated, networking in
cloud remains a difficult puzzle. Even though the network has a
significant role in facilitating hybrid cloud scenarios, it has not
received much attention in the research community until recently. We
propose Network
as a Service (NaaS), which forms the basis of unifying public and
private clouds. In this paper, we identify various challenges in
adoption of hybrid cloud. We discuss the design and implementation
of a cloud platform.
Abstract: Several parameters are established in order to measure
biodiesel quality. One of them is the iodine value, which is an
important parameter that measures the total unsaturation within a
mixture of fatty acids. Limiting unsaturated fatty acids is
necessary, since heating larger quantities of them results in either
the formation of deposits inside the motor or damage to the
lubricant. Since determination of the iodine value by the official
procedure tends to be very laborious, with high costs and toxic
reagents, this study uses an artificial neural network (ANN) to
predict the iodine value as an alternative to these problems. The
network development methodology used 13 fatty acid esters as inputs,
and backpropagation-type convergence algorithms were optimized in
order to obtain an architecture for the prediction of the iodine
value. This study allowed us to demonstrate the neural networks'
ability to learn the correlation between biodiesel quality
properties, in this case the iodine value, and the molecular
structures that make it up. The model developed in the study reached
a correlation coefficient (R) of 0.99 for both network validation
and network simulation, with the Levenberg-Marquardt algorithm.
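A minimal backpropagation sketch of the kind of regression involved, fitting a single-hidden-layer network to a smooth toy property curve. Plain gradient descent is used here rather than the Levenberg-Marquardt training of the study, and the data are invented:

```python
import math
import random

def train(data, hidden=4, lr=0.1, epochs=3000, seed=0):
    """Tiny 1-input / hidden / 1-output network trained by backpropagation (MSE)."""
    random.seed(seed)
    w1 = [random.uniform(-1, 1) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [random.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, t in data:
            h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
            y = sum(w2[j] * h[j] for j in range(hidden)) + b2
            err = y - t
            for j in range(hidden):
                grad_h = err * w2[j] * (1 - h[j] ** 2)   # backpropagated error
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * grad_h * x
                b1[j] -= lr * grad_h
            b2 -= lr * err
    def predict(x):
        return sum(w2[j] * math.tanh(w1[j] * x + b1[j]) for j in range(hidden)) + b2
    return predict

# Invented (composition fraction, iodine-value-like) pairs on a smooth curve:
data = [(x / 10.0, 0.5 * math.sin(x / 10.0) + 0.3) for x in range(11)]
model = train(data)
mse = sum((model(x) - t) ** 2 for x, t in data) / len(data)
```

The study's network is larger (13 ester inputs) and uses Levenberg-Marquardt, which typically converges in far fewer iterations than plain gradient descent for this class of regression problems.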