Abstract: In this paper, a fuzzy algorithm and a fuzzy multicriteria
decision framework are developed and applied to the practical problem
of optimizing biofuel policy making. The methodological framework
shows how to incorporate fuzzy set theory into the process of selecting
a sustainable biofuel policy from among several policy options. Fuzzy
set theory is used here as a tool to deal with the uncertainties of the
decision environment, the vagueness and ambiguity of policy objectives,
the subjectivity of human assessments, and the imprecise and incomplete
information about the evaluated policy instruments.
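A minimal Python sketch of the kind of fuzzy multicriteria aggregation described above, assuming triangular fuzzy scores and weights, approximate arithmetic for their product and sum, and centroid defuzzification; the criteria, policy options, and numbers are illustrative and not taken from the paper:

# Illustrative fuzzy weighted-average ranking of policy options.
# Scores and weights are triangular fuzzy numbers (low, mode, high).

def tfn_mul(a, b):
    """Approximate product of two triangular fuzzy numbers."""
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

def tfn_add(a, b):
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def centroid(t):
    """Defuzzify a triangular fuzzy number by its centroid."""
    return sum(t) / 3.0

# Hypothetical fuzzy criteria weights elicited from experts.
weights = {"ghg_savings": (0.3, 0.4, 0.5),
           "cost":        (0.2, 0.3, 0.4),
           "land_use":    (0.2, 0.3, 0.4)}

# Hypothetical fuzzy performance scores of two policy options.
options = {
    "blending_mandate": {"ghg_savings": (0.5, 0.6, 0.8),
                         "cost":        (0.3, 0.5, 0.6),
                         "land_use":    (0.4, 0.5, 0.7)},
    "tax_incentive":    {"ghg_savings": (0.4, 0.6, 0.7),
                         "cost":        (0.5, 0.7, 0.8),
                         "land_use":    (0.3, 0.4, 0.6)},
}

for name, scores in options.items():
    agg = (0.0, 0.0, 0.0)
    for crit, w in weights.items():
        agg = tfn_add(agg, tfn_mul(w, scores[crit]))
    print(name, round(centroid(agg), 3))   # higher defuzzified score ranks first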
Abstract: Over the last two decades, owing to the hostile environment
of the Internet, concerns about the confidentiality of information have
increased at a phenomenal rate. To safeguard information from attacks,
a number of data/information hiding methods have therefore evolved,
mostly in the spatial and transform domains. In spatial-domain data
hiding techniques, the information is embedded directly in the image
plane itself. In transform-domain techniques, the image is first
converted from the spatial domain to some other domain and the secret
information is then embedded, so that it remains more secure against
attack. Information hiding algorithms in the time or spatial domain
offer high capacity but relatively low robustness. In contrast,
algorithms in transform domains such as the DCT and DWT exhibit a
certain robustness against some multimedia processing. In this work
the authors propose a novel steganographic method for hiding
information in the transform domain of a grayscale image. The proposed
approach converts the gray-level image to the transform domain using a
discrete integer wavelet transform realized through the lifting scheme.
It performs a 2-D lifting wavelet decomposition of the cover image with
the lifted Haar wavelet and computes the approximation coefficient
matrix CA and the detail coefficient matrices CH, CV, and CD. The pixel
mapping method (PMM) is then applied to these coefficients to form the
stego image. The aim of this paper is to propose a high-capacity image
steganography technique that uses the pixel mapping method in the
integer wavelet domain with acceptable levels of imperceptibility and
distortion in the cover image and a high level of overall security.
The solution is independent of the nature of the data to be hidden and
produces a stego image with minimal degradation.
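A minimal Python sketch of the decomposition step, assuming an 8-bit grayscale cover image held in a NumPy array with even dimensions; one level of the integer Haar lifting (S-transform) yields CA, CH, CV, and CD, while the PMM embedding itself is left as a placeholder because its mapping rules are not reproduced here:

import numpy as np

def haar_lift_1d(x):
    """One level of the integer Haar (S) transform along the last axis.
    Requires an even length along that axis."""
    a = x[..., 0::2].astype(np.int64)
    b = x[..., 1::2].astype(np.int64)
    d = b - a           # predict step: detail coefficients
    s = a + d // 2      # update step: integer approximation
    return s, d

def haar_lift_2d(img):
    """One 2-D lifting level: returns approximation CA and details CH, CV, CD."""
    lo, hi = haar_lift_1d(img)                    # transform along rows
    ca, ch = haar_lift_1d(lo.swapaxes(0, 1))      # then along columns of the low band
    cv, cd = haar_lift_1d(hi.swapaxes(0, 1))      # and of the high band
    return (ca.swapaxes(0, 1), ch.swapaxes(0, 1),
            cv.swapaxes(0, 1), cd.swapaxes(0, 1))

def pmm_embed(coeffs, bits):
    """Placeholder for the pixel mapping method (PMM) embedding step;
    the actual mapping rules of the paper are not reproduced here."""
    raise NotImplementedError

if __name__ == "__main__":
    cover = np.random.randint(0, 256, (256, 256), dtype=np.uint8)  # stand-in cover image
    CA, CH, CV, CD = haar_lift_2d(cover)
    print(CA.shape, CH.shape, CV.shape, CD.shape)                  # each 128 x 128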
Abstract: A new approach to the protection of power transformers is
presented using a time-frequency transform known as the wavelet
transform. Currents corresponding to different operating conditions,
such as inrush, normal load, external fault, and internal fault, are
sampled and processed to obtain wavelet coefficients, and the different
operating conditions produce distinct variations in these coefficients.
Features such as energy and standard deviation are calculated using
Parseval's theorem. These features are used as inputs to a probabilistic
neural network (PNN) for fault classification. The proposed algorithm
provides accurate results even in the presence of noisy inputs and
correctly distinguishes inrush currents from fault currents. The overall
classification accuracy of the proposed method is found to be 96.45%.
The faults (with and without noise) were simulated using MATLAB and
Simulink, taking a two-cycle data window (40 ms) containing 800 samples.
The algorithm was evaluated using 10% Gaussian white noise.
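A hedged Python sketch of the feature extraction and classification pipeline, assuming PyWavelets is available; the mother wavelet (db4), decomposition level, and kernel width are illustrative choices, and the PNN is written out as a plain Gaussian-kernel classifier:

import numpy as np
import pywt

def wavelet_features(current, wavelet="db4", level=3):
    """Per-band energy and standard deviation of the wavelet coefficients.
    By Parseval's theorem the coefficient energy reflects the signal energy."""
    coeffs = pywt.wavedec(current, wavelet, level=level)
    feats = []
    for c in coeffs:
        feats.append(np.sum(c ** 2))   # band energy
        feats.append(np.std(c))        # band standard deviation
    return np.array(feats)

class PNN:
    """A basic probabilistic neural network (Parzen window with Gaussian kernels)."""
    def __init__(self, sigma=0.5):
        self.sigma = sigma
    def fit(self, X, y):
        self.X, self.y = np.asarray(X), np.asarray(y)
        self.classes = np.unique(self.y)
        return self
    def predict(self, X):
        out = []
        for x in np.asarray(X):
            d2 = np.sum((self.X - x) ** 2, axis=1)
            k = np.exp(-d2 / (2 * self.sigma ** 2))
            scores = [k[self.y == c].mean() for c in self.classes]
            out.append(self.classes[int(np.argmax(scores))])
        return np.array(out)

# Hypothetical usage: rows of `signals` are 800-sample current windows and
# `labels` are condition codes (inrush, normal load, external/internal fault).
# X = np.array([wavelet_features(s) for s in signals])
# model = PNN().fit(X, labels)
# predictions = model.predict(X)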
Abstract: Automatic segmentation of skin lesions is the first step
towards the automated analysis of malignant melanoma. Although
numerous segmentation methods have been developed, few studies
have focused on determining the most effective color space for
melanoma applications. This paper proposes an automatic segmentation
algorithm based on color space analysis and clustering-based histogram
thresholding, a process which is able to determine the optimal
color channel for detecting the borders in dermoscopy images. The
algorithm is tested on a set of 30 high-resolution dermoscopy images.
A comprehensive evaluation of the results is provided, in which borders
manually drawn by four dermatologists are compared to the automated
borders detected by the proposed algorithm, applying three previously
used metrics of accuracy, sensitivity, and specificity and a new metric
of similarity. By performing ROC analysis and ranking the metrics,
it is demonstrated that the best results are obtained with the X and
XoYoR color channels, resulting in an accuracy of approximately
97%. The proposed method is also compared with two state-of-the-art
skin lesion segmentation methods.
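A rough Python sketch of the thresholding idea, assuming scikit-image; the choice of the X channel follows the abstract, while Otsu's method and the morphological clean-up stand in for the paper's clustering-based histogram thresholding and post-processing:

import numpy as np
from skimage import color, filters, morphology
from skimage.io import imread

def segment_lesion(path):
    """Threshold a dermoscopy image on a single color channel (here the X
    channel of CIE XYZ); lesions are typically darker than surrounding skin."""
    rgb = imread(path)[..., :3]
    xyz = color.rgb2xyz(rgb)
    chan = xyz[..., 0]                               # X channel
    t = filters.threshold_otsu(chan)                 # histogram-based threshold
    mask = chan < t
    mask = morphology.remove_small_objects(mask, min_size=500)
    mask = morphology.binary_closing(mask, morphology.disk(5))
    return mask

def accuracy_sensitivity_specificity(pred, truth):
    """Pixel-wise metrics against a manually drawn border mask."""
    tp = np.sum(pred & truth); tn = np.sum(~pred & ~truth)
    fp = np.sum(pred & ~truth); fn = np.sum(~pred & truth)
    return ((tp + tn) / truth.size, tp / (tp + fn), tn / (tn + fp))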
Abstract: In this paper a novel approach for generalized image
retrieval based on semantic contents is presented. It uses a combination
of three feature extraction methods, namely color, texture, and the edge
histogram descriptor, and there is provision to add new features in the
future for better retrieval efficiency. Any combination of these methods
that is most appropriate for the application can be used for retrieval;
this is selected through the user interface (UI) in the form of
relevance feedback. The image properties are analyzed using computer
vision and image processing algorithms. For color, the histograms of
the images are computed; for texture, co-occurrence-matrix-based
measures such as entropy and energy are calculated; and for edge
density, the edge histogram descriptor (EHD) is found. For the retrieval
of images, a novel idea based on a greedy strategy is developed to
reduce the computational complexity. The entire system was developed
using AForge.Imaging (an open-source product), MATLAB .NET Builder, C#,
and Oracle 10g. The system was tested with the Corel image database
containing 1000 natural images and achieved good results.
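The original system was built with AForge.Imaging, MATLAB .NET Builder, and C#; the following Python/scikit-image sketch only illustrates the color-histogram and co-occurrence (texture) features and a simple distance-based ranking, with the EHD and relevance feedback omitted:

import numpy as np
from skimage import color, img_as_ubyte
from skimage.feature import graycomatrix, graycoprops

def color_histogram(rgb, bins=8):
    """Joint RGB histogram, normalized to sum to 1."""
    h, _ = np.histogramdd(rgb.reshape(-1, 3), bins=(bins,) * 3,
                          range=((0, 256),) * 3)
    return (h / h.sum()).ravel()

def texture_features(rgb):
    """Co-occurrence-matrix energy, contrast, and entropy of the gray image."""
    gray = img_as_ubyte(color.rgb2gray(rgb))
    glcm = graycomatrix(gray, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    p = glcm[glcm > 0]
    entropy = -np.sum(p * np.log2(p))
    return np.array([graycoprops(glcm, "energy").mean(),
                     graycoprops(glcm, "contrast").mean(),
                     entropy])

def describe(rgb):
    """Concatenated color + texture descriptor (EHD omitted in this sketch)."""
    return np.concatenate([color_histogram(rgb), texture_features(rgb)])

def retrieve(query_desc, database_descs, k=10):
    """Rank database images by Euclidean distance to the query descriptor."""
    d = np.linalg.norm(np.asarray(database_descs) - query_desc, axis=1)
    return np.argsort(d)[:k]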
Abstract: In this paper two mathematical models for locating accidental gas escapes in gas pipelines are suggested. The first model was created for leak localization in a horizontal branched pipeline, and the second for leak detection in an inclined section of a main gas pipeline. The leak localization algorithm for the branched pipeline does not require knowledge of the corresponding initial hydraulic parameters at the entrance and end points of each pipeline section. To detect the damaged section and then localize the leak within it, special functions and equations have been constructed. Some calculation results for compound pipelines with two, four, and five sections are presented. A method and formula for leak localization in a simple inclined section of a main gas pipeline are also suggested, together with some numerical results locating the gas escape in the inclined pipeline.
Abstract: In this study, a loop-back algorithm for connected component
labeling for detecting objects in a digital image is presented. The
approach uses a loop-back connected component labeling algorithm that
allows the system to distinguish the detected objects according to
their labels. Different from a whole-window scanning technique, this
technique reduces the search time for locating an object by focusing
on suspected objects based on certain predefined features. In this
study, the approach was also applied to a face detection system. Face
detection is becoming an interesting research area, since many devices
and systems require face detection for various purposes. The input can
be a still image or video; therefore, the sub-processes of such a
system must be simple, efficient, and accurate to give good results.
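The loop-back variant itself is not specified in the abstract; the following Python sketch shows only the baseline connected component labeling (4-connected, breadth-first flood fill) on which such a detector would rest:

import numpy as np
from collections import deque

def label_components(binary):
    """Standard 4-connected component labeling by breadth-first flood fill.
    (The paper's loop-back variant and feature-based focusing are not
    reproduced here.)"""
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=np.int32)
    next_label = 0
    for i in range(h):
        for j in range(w):
            if binary[i, j] and labels[i, j] == 0:
                next_label += 1
                labels[i, j] = next_label
                q = deque([(i, j)])
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w and
                                binary[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = next_label
                            q.append((ny, nx))
    return labels, next_label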
Abstract: This paper describes a practical approach to the design
and development of a hybrid learning with acceleration feedback control
(HLC) scheme for input tracking and end-point vibration suppression
of flexible manipulator systems. Initially, a collocated
proportional-derivative (PD) control scheme using hub-angle and
hub-velocity feedback is developed for control of the rigid-body motion
of the system. This is then extended to a hybrid scheme combining the
collocated PD control with iterative learning control and acceleration
feedback, in which genetic algorithms (GAs) are used to optimize the
learning parameters. Experimental results for the response of the
manipulator under the control schemes are presented in the time and
frequency domains. The performance of the HLC is assessed in terms
of input tracking, the level of vibration reduction at the resonance
modes, and robustness to various payloads.
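A schematic Python sketch of the two control ingredients, assuming simple scalar hub-angle signals; the gains and learning parameters are placeholders for the values a GA would tune, and the flexible-manipulator plant model is not included:

import numpy as np

def pd_control(theta_ref, theta, theta_dot, kp=10.0, kd=0.5):
    """Collocated PD law on hub angle and hub velocity (gains illustrative)."""
    return kp * (theta_ref - theta) - kd * theta_dot

def ilc_update(u_prev, error, accel, gamma_e=0.3, gamma_a=0.05):
    """One iterative-learning update with acceleration feedback:
    u_{k+1}(t) = u_k(t) + gamma_e * e_k(t) - gamma_a * a_k(t).
    gamma_e and gamma_a are the learning parameters a GA would optimize."""
    return u_prev + gamma_e * error - gamma_a * accel

# Hypothetical trial loop (plant/experiment interface not shown):
# u = np.zeros(n_samples)
# for k in range(n_trials):
#     theta, theta_dot, accel = run_experiment(u)       # assumed interface
#     error = theta_ref - theta
#     u = ilc_update(u, error, accel)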
Abstract: The similarity comparison of RNA secondary structures is
important in studying the functions of RNAs. In recent years, most
existing tools have represented secondary structures by tree-based
representations and calculated similarity by tree alignment distance.
Different from previous approaches, we propose a new method based on a
maximum clique detection algorithm to extract the maximum common
structural elements in the compared RNA secondary structures. A new
graph-based similarity measurement and maximum common subgraph
detection procedure for comparing RNA secondary structures is
introduced. Given two RNA secondary structures, the proposed algorithm
determines a structural similarity score by comparing vertex labels,
labelled edges, and the exact degree of each vertex. It also extracts
the common structural elements between the compared secondary
structures based on a formulation of the problem as maximum clique
detection. This graph-based model can also work with the NC-IUB code
to perform pattern-based searching. Therefore, it can be used to
identify functional RNA motifs in databases or to extract common
substructures between complex RNA secondary structures. We have
demonstrated the performance of the proposed algorithm through
experimental results. It provides a new way of comparing RNA secondary
structures and should be helpful to those interested in structural
bioinformatics.
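A compact Python sketch of the maximum-common-subgraph idea using networkx, assuming the two secondary structures are given as labelled graphs (node attribute "label"); the paper's scoring of vertex labels, edge labels, and degrees is simplified here to label-compatible matching:

import networkx as nx

def association_graph(g1, g2):
    """Modular product of two labelled graphs: a maximum clique in this
    graph corresponds to a maximum common subgraph of g1 and g2."""
    prod = nx.Graph()
    for u in g1:
        for v in g2:
            if g1.nodes[u].get("label") == g2.nodes[v].get("label"):
                prod.add_node((u, v))
    for (u1, v1) in prod:
        for (u2, v2) in prod:
            if u1 != u2 and v1 != v2:
                # connect pairs that are consistently adjacent or non-adjacent
                if g1.has_edge(u1, u2) == g2.has_edge(v1, v2):
                    prod.add_edge((u1, v1), (u2, v2))
    return prod

def max_common_elements(g1, g2):
    prod = association_graph(g1, g2)
    best = max(nx.find_cliques(prod), key=len, default=[])
    return best   # list of matched (vertex-in-g1, vertex-in-g2) pairs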
Abstract: The travelling salesman problem (TSP) is a combinatorial
optimization problem, and its solution approaches have been applied to
many real-world problems. The pure TSP assumes that the cities to visit
are fixed in time, and solutions are therefore created to find the
shortest path through these points. In practice, however, some of the
points may be cancelled over time. If the problem is not time-critical,
it is not important to determine a new routing plan quickly; but if the
points change rapidly and a new route plan must be decided within a
limited time, a different approach is required. We developed a route
plan transfer method based on transfer learning and achieved high
performance compared with building a new model from scratch after every
change.
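The transfer-learning method itself is not detailed in the abstract; the following Python sketch only illustrates the underlying idea of reusing the previous tour after cancellations and repairing it locally (here with 2-opt) instead of re-solving from scratch:

import math

def tour_length(tour, coords):
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, coords):
    """Simple 2-opt local search on a tour given as a list of city indices."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(cand, coords) < tour_length(tour, coords):
                    tour, improved = cand, True
    return tour

def replan_after_cancellation(old_tour, cancelled, coords):
    """Transfer the previous plan: drop cancelled cities, keep the visiting
    order, then repair locally rather than rebuilding the route from scratch."""
    seed = [c for c in old_tour if c not in cancelled]
    return two_opt(seed, coords)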
Abstract: In this paper, we present two new ranking and unranking
algorithms for k-ary trees represented by x-sequences in Gray
code order. These algorithms are based on a Gray code generation
algorithm developed by Ahrabian et al. In that paper, a recursive
backtracking algorithm for generating the x-sequences corresponding to
k-ary trees in Gray code order was presented; it is based on
Vajnovszki's algorithm for generating binary trees in Gray code order.
To the best of our knowledge, no ranking and unranking algorithms have
previously been given for x-sequences in this ordering. We present
ranking and unranking algorithms with O(kn²) time complexity for
x-sequences in this Gray code ordering.
Abstract: X-ray technology has been used for non-destructive evaluation in the power system, providing a visual, non-destructive inspection method for electrical equipment. However, the images obtained from X-ray digital imaging equipment contain a great deal of noise, which makes automatic defect detection based on these images very difficult. An X-ray image de-noising algorithm based on the wavelet transform is therefore proposed in this paper. An edge detection algorithm is then applied so that the defect can be brought out. The experimental results show that the method used in this paper is very effective for de-noising X-ray images.
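A hedged Python sketch of the two stages, assuming PyWavelets and scikit-image; the wavelet (db2), decomposition level, soft universal threshold, and Canny parameters are illustrative choices rather than the paper's:

import numpy as np
import pywt
from skimage import feature

def wavelet_denoise(img, wavelet="db2", level=2):
    """Soft-threshold the detail coefficients (universal threshold estimated
    from the finest diagonal band), then reconstruct."""
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745   # noise estimate
    t = sigma * np.sqrt(2 * np.log(img.size))
    new_coeffs = [coeffs[0]]
    for (ch, cv, cd) in coeffs[1:]:
        new_coeffs.append(tuple(pywt.threshold(c, t, mode="soft")
                                for c in (ch, cv, cd)))
    return pywt.waverec2(new_coeffs, wavelet)

def detect_defect_edges(img):
    """De-noise first, then return a binary edge map of candidate defects."""
    den = wavelet_denoise(img)
    return feature.canny(den, sigma=2.0)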
Abstract: The electromagnetic spectrum is a natural resource, and
efficient usage of this limited resource is a necessity for better
communication. The present static frequency allocation schemes cannot
accommodate the demands of the rapidly increasing number of
higher-data-rate services. Therefore, dynamic usage of the spectrum
must be distinguished from static usage to increase the availability
of the frequency spectrum. Cognitive radio is not a single piece of
apparatus but a technology that can incorporate components spread
across a network. It offers great promise for improved system
efficiency, spectrum utilization, and application effectiveness, as
well as reduced interference and reduced complexity of use. A cognitive
radio is aware of its environment, internal state, and location, and
autonomously adjusts its operation to achieve designed objectives. It
first senses its spectral environment over a wide frequency band and
then adapts its parameters to maximize spectrum efficiency with high
performance. This paper focuses on the analysis of the bit error rate
(BER) in cognitive radio using the particle swarm optimization (PSO)
algorithm. The BER is analyzed and interpreted both theoretically and
practically, in terms of advantages and drawbacks and of how it affects
the efficiency and performance of the communication system.
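A generic Python sketch of the PSO loop, with the BER objective left as a placeholder (simulate_ber is a hypothetical function standing in for the paper's transmission model):

import numpy as np

def pso_minimize(objective, bounds, n_particles=30, n_iter=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic particle swarm optimization over box-constrained parameters."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    x = rng.uniform(lo, hi, size=(n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()

# Placeholder objective: BER as a function of tunable radio parameters
# (e.g. power- and modulation-related variables); the real expression
# from the paper is not reproduced here.
# best, ber = pso_minimize(lambda p: simulate_ber(p), [(0, 1), (0, 1)])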
Abstract: The vertex connectivity of a graph is the smallest number of vertices whose deletion disconnects the graph or makes it trivial. This work is devoted to the problem of testing the vertex connectivity of graphs in a distributed environment, based on a general and constructive approach. The contribution of this paper is threefold. First, using a pre-constructed spanning tree of the considered graph, we present a protocol to test whether a given graph is 2-connected using only local knowledge. Second, we present an encoding of this protocol using graph relabeling systems. The last contribution is the implementation of this protocol in the message passing model. For a given graph G, where M is the number of its edges, N the number of its nodes, and Δ its degree, our algorithms have the following requirements: the first uses O(Δ×N²) steps and O(Δ×logΔ) bits per node; the second uses O(Δ×N²) messages, O(N²) time, and O(Δ×logΔ) bits per node. Furthermore, the studied network is semi-anonymous: only the root of the pre-constructed spanning tree needs to be identified.
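The paper's protocol is distributed and uses only local knowledge over a pre-constructed spanning tree; the following centralized Python snippet merely shows the property being decided, using networkx's articulation-point test:

import networkx as nx

def is_two_connected(g):
    """A connected graph with at least 3 vertices is 2-connected iff it has
    no articulation point (a vertex whose removal disconnects it)."""
    if g.number_of_nodes() < 3 or not nx.is_connected(g):
        return False
    return next(nx.articulation_points(g), None) is None

# Example: a cycle is 2-connected, a path is not.
print(is_two_connected(nx.cycle_graph(5)))   # True
print(is_two_connected(nx.path_graph(5)))    # False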
Abstract: Prospective readers can quickly determine whether a document is relevant to their information need if the significant phrases (or keyphrases) in the document are provided. Although keyphrases are useful, not many documents have keyphrases assigned to them, and manually assigning keyphrases to existing documents is costly. Therefore, there is a need for automatic keyphrase extraction. This paper introduces a new domain-independent keyphrase extraction algorithm. The algorithm approaches keyphrase extraction as a classification task and uses a combination of statistical and computational linguistics techniques, a new set of attributes, and a new machine learning method to distinguish keyphrases from non-keyphrases. The experiments indicate that this algorithm performs better than other keyphrase extraction tools and that it significantly outperforms the AutoSummarize feature of Microsoft Word 2000. The domain independence of the algorithm has also been confirmed in our experiments.
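A small Python sketch of the classification framing, assuming simple candidate n-grams and two classical attributes (occurrence count and first-occurrence position); the paper's actual attribute set and machine learning method are not reproduced:

import re

STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "for", "on"}

def candidates(text, max_len=3):
    """Candidate phrases: contiguous word n-grams not starting or ending
    with a stopword."""
    words = re.findall(r"[a-z][a-z-]+", text.lower())
    cands = set()
    for n in range(1, max_len + 1):
        for i in range(len(words) - n + 1):
            gram = words[i:i + n]
            if gram[0] not in STOPWORDS and gram[-1] not in STOPWORDS:
                cands.add(" ".join(gram))
    return cands, words

def phrase_features(text):
    """Two attributes per candidate: occurrence count and relative position
    of first occurrence. A classifier would be trained on such vectors to
    separate keyphrases from non-keyphrases."""
    cands, words = candidates(text)
    joined = " ".join(words)
    feats = {}
    for c in cands:
        count = joined.count(c)
        first = joined.find(c) / max(len(joined), 1)
        feats[c] = (count, first)
    return feats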
Abstract: In this paper the authors propose a protocol that uses Elliptic Curve Cryptography (ECC) based on the ElGamal algorithm for sending small amounts of data via an authentication server. The innovation of this approach is that there is no need for a symmetric algorithm or a secure communication channel such as SSL. The reason ECC has been chosen instead of RSA is that it provides a methodology for obtaining high-speed implementations of authentication protocols and encrypted mail techniques while using fewer bits for the keys; ECC systems therefore require smaller chip sizes and consume less power. The proposed protocol has been implemented in Java to analyse its features and vulnerabilities in the real world.
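A toy Python sketch of ElGamal-style encryption over an elliptic curve, using deliberately tiny, insecure curve parameters and a hashed-ElGamal variant for the masking step; it illustrates the mechanics only and is not the paper's protocol:

import hashlib
import secrets

# Toy short-Weierstrass curve y^2 = x^3 + A*x + B over GF(P).
# These parameters are for illustration only and are NOT secure.
P, A, B = 97, 2, 3
G = (3, 6)                              # on the curve: 6^2 = 3^3 + 2*3 + 3 (mod 97)

def inv(x):
    return pow(x, P - 2, P)             # modular inverse (P prime)

def add(p1, p2):
    """Elliptic curve point addition (None is the point at infinity)."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * inv(2 * y1) % P
    else:
        lam = (y2 - y1) * inv((x2 - x1) % P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def mul(k, p):
    """Double-and-add scalar multiplication."""
    r = None
    while k:
        if k & 1:
            r = add(r, p)
        p = add(p, p)
        k >>= 1
    return r

def mask(point, n):
    """Derive a keystream from the shared point (hashed-ElGamal style)."""
    return hashlib.sha256(repr(point).encode()).digest()[:n]

def keygen():
    d = secrets.randbelow(P - 2) + 1    # private scalar
    return d, mul(d, G)                 # (private, public Q = d*G)

def encrypt(pub, msg):
    k = secrets.randbelow(P - 2) + 1
    c1 = mul(k, G)                      # ephemeral point sent in the clear
    shared = mul(k, pub)                # k*Q = k*d*G
    return c1, bytes(m ^ s for m, s in zip(msg, mask(shared, len(msg))))

def decrypt(priv, c1, ct):
    shared = mul(priv, c1)              # d*(k*G) equals the sender's k*Q
    return bytes(c ^ s for c, s in zip(ct, mask(shared, len(ct))))

priv, pub = keygen()
c1, ct = encrypt(pub, b"pin:1234")
print(decrypt(priv, c1, ct))            # b'pin:1234'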
Abstract: In the automatic manufacturing and assembly of mechanical, electrical, and electronic parts, one needs to reliably identify the position of components and to extract the information carried by these components. Data Matrix Codes (DMC) are nowadays established in many areas of industrial manufacturing thanks to the amount of information they concentrate in a small space. In today's largely order-driven industry, where increased traceability requirements prevail, they offer further advantages over other identification systems. This underlines impressively the necessity of a robust code reading system for detecting DMC on components in factories. This paper compares two methods for estimating the angle of orientation of Data Matrix Codes: one based on the Hough transform and the other based on the mean shift algorithm. We concentrate on Data Matrix Codes in industrial environments, punched, milled, lasered, or etched on different materials in arbitrary orientations.
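A hedged Python/OpenCV sketch of the Hough-based side of the comparison; the Canny and accumulator thresholds are illustrative, and the mean shift alternative is not shown:

import numpy as np
import cv2

def dmc_orientation_hough(gray):
    """Estimate the dominant orientation of a Data Matrix Code from the
    strongest Hough lines of its edge map (parameters are illustrative)."""
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 80)
    if lines is None:
        return None
    # Each line is (rho, theta); the code's 'L'-shaped finder pattern gives
    # two dominant orthogonal directions, so the angles cluster modulo 90 deg.
    thetas = np.degrees(lines[:, 0, 1]) % 90.0
    return float(np.median(thetas))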
Abstract: A direct search approach to determining optimal reservoir operation is proposed using ant colony optimization for continuous domains (ACOR). The model is applied to a single-reservoir system to determine the optimum releases over 42 years of monthly time steps. A disadvantage of ant-colony-based methods, and of ACOR in particular, is the great amount of computer run time they consume. In this study a highly effective procedure for decreasing the run time has been developed. The results are compared with those of a GA-based model.
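A compact Python sketch of the ACOR core (a solution archive with Gaussian kernel sampling); the reservoir release objective is only a placeholder, not the paper's simulation model:

import numpy as np

def acor(objective, dim, bounds, archive_size=10, n_ants=2, q=0.1, xi=0.85,
         n_iter=500, seed=0):
    """Ant colony optimization for continuous domains (ACOR): keep an archive
    of good solutions, sample new ones from Gaussian kernels centred on
    archive members, and retain the best."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(archive_size, dim))
    f = np.array([objective(x) for x in X])
    for _ in range(n_iter):
        order = np.argsort(f)
        X, f = X[order], f[order]
        ranks = np.arange(archive_size)
        w = np.exp(-ranks**2 / (2 * (q * archive_size)**2))  # rank-based weights
        w /= w.sum()
        for _ in range(n_ants):
            j = rng.choice(archive_size, p=w)                # pick a guiding solution
            sigma = xi * np.mean(np.abs(X - X[j]), axis=0)   # kernel widths
            x_new = np.clip(rng.normal(X[j], sigma + 1e-12), lo, hi)
            f_new = objective(x_new)
            if f_new < f[-1]:                                # replace the worst
                X[-1], f[-1] = x_new, f_new
    return X[np.argmin(f)], f.min()

# Placeholder reservoir objective over monthly releases (not the paper's model):
# best_releases, cost = acor(lambda r: np.sum((demand - r) ** 2),
#                            dim=12 * 42, bounds=(0.0, max_release))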
Abstract: This paper presents a simulated annealing based
approach to estimating solar cell model parameters. A single-diode solar
cell model is used in this study to validate the outcomes of the
proposed approach. The developed technique is used to estimate the
model parameters, such as the generated photocurrent, saturation
current, series resistance, shunt resistance, and ideality factor, that
govern the current-voltage relationship of a solar cell. A practical
case study is used to test and verify the consistency with which the
various parameters of the single-diode solar cell model are accurately
estimated. A comparative study among different parameter estimation
techniques is presented to show the effectiveness of the developed
approach.
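A hedged Python sketch of the estimation setup: the implicit single-diode equation as a residual, a least-squares cost over measured I-V points, and a generic simulated annealing loop; the thermal voltage, cooling schedule, step size, and parameter bounds are illustrative assumptions:

import numpy as np

VT = 0.0259   # thermal voltage at ~300 K (assumed)

def residual(params, v, i):
    """Implicit single-diode model: Iph - I0*(exp((V+I*Rs)/(n*Vt)) - 1)
    - (V+I*Rs)/Rsh - I should vanish at every measured (V, I) point."""
    iph, i0, rs, rsh, n = params
    x = (v + i * rs) / (n * VT)
    return iph - i0 * (np.exp(x) - 1.0) - (v + i * rs) / rsh - i

def cost(params, v, i):
    return np.sum(residual(params, v, i) ** 2)

def simulated_annealing(v, i, bounds, n_iter=20000, t0=1.0, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi)
    fx = cost(x, v, i)
    best, fbest = x.copy(), fx
    for k in range(n_iter):
        t = t0 * (1.0 - k / n_iter) + 1e-9                      # linear cooling
        step = rng.normal(0.0, 0.01, size=x.shape) * (hi - lo)  # scaled perturbation
        cand = np.clip(x + step, lo, hi)
        fc = cost(cand, v, i)
        if fc < fx or rng.random() < np.exp((fx - fc) / t):     # Metropolis rule
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x.copy(), fx
    return best, fbest

# Hypothetical parameter bounds in the order [Iph, I0, Rs, Rsh, n]:
# bounds = [(0, 10), (1e-12, 1e-5), (0, 2), (1, 1000), (1, 2)]
# params, err = simulated_annealing(v_measured, i_measured, bounds)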
Abstract: This paper introduces a measure of similarity between
two clusterings of the same dataset produced by two different
algorithms, or even by the same algorithm (K-means, for instance, with
different initializations usually produces different clusterings of
the same dataset). We then apply the measure to calculate the
similarity between pairs of clusterings, with special interest directed
at comparing the similarity between various machine clusterings and
human clusterings of datasets. The similarity measure can thus be used
to identify the clustering algorithm that is best (in the sense of most
similar to human) for a specific problem at hand. Experimental results
pertaining to the text categorization of a Portuguese corpus (in which
a translation-into-English approach is used) are presented, as well as
results on the well-known benchmark Iris dataset. The significance and
other potential applications of the proposed measure are discussed.
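The proposed measure itself is not reproduced here; as an illustration of comparing machine and human clusterings, the following Python snippet uses scikit-learn's adjusted Rand index on the Iris data as a stand-in similarity:

from sklearn.cluster import KMeans
from sklearn.datasets import load_iris
from sklearn.metrics import adjusted_rand_score

# Two clusterings of the same dataset: K-means with different initializations,
# compared to each other and to the human (species) labels of Iris.
iris = load_iris()
a = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(iris.data)
b = KMeans(n_clusters=3, n_init=10, random_state=7).fit_predict(iris.data)

print("machine vs machine:", adjusted_rand_score(a, b))
print("machine vs human:  ", adjusted_rand_score(a, iris.target))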