Abstract: The ability to detect and classify fault types plays a key role in power system protection, and the procedure must be both accurate and fast. In this paper, fault-type detection is implemented using wavelet analysis together with the wavelet entropy principle. The power system is simulated in PSCAD/EMTDC. Different fault types were studied, yielding various current waveforms. These waveforms were decomposed with wavelet analysis into approximations and details at different levels. The wavelet entropy of these decompositions is analyzed, leading to a successful methodology for fault classification. The suggested approach is tested on different fault types and is shown to identify the fault type successfully.
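The wavelet-entropy idea above can be sketched as follows. This is an illustrative reconstruction, not the authors' exact pipeline: it uses a simple Haar decomposition (the abstract does not name the mother wavelet) and computes the Shannon entropy of the relative wavelet energy per level.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar wavelet transform: returns approximation and detail."""
    x = np.asarray(x, dtype=float)
    if len(x) % 2:
        x = x[:-1]                           # drop the odd sample so pairs line up
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation (low-pass) coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail (high-pass) coefficients
    return a, d

def wavelet_entropy(signal, levels=4):
    """Shannon entropy of the relative wavelet energy across decomposition levels."""
    energies = []
    a = np.asarray(signal, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        energies.append(np.sum(d ** 2))      # energy of each detail level
    energies.append(np.sum(a ** 2))          # plus the final approximation
    p = np.array(energies) / np.sum(energies)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))
```

A smooth load current concentrates its energy in the approximation (low entropy), while a broadband fault transient spreads energy over the detail levels (higher entropy), which is what makes the measure usable for classification.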
Abstract: River flow forecasting is a crucial element of any management policy aimed at the proper use of water resources and at combining prevention and defense actions against environmental degradation. The difficulties encountered during field activities encourage the development and implementation of operational computation and measurement methods that reduce the time needed for data acquisition and processing while maintaining a good level of accuracy. The aim of the present work is therefore to test a new entropy-based expeditious methodology for the evaluation of rating curves on three gauged sections with different geometric and morphological characteristics. The methodology requires the choice of only three verticals along the measurement section and the sampling of only the maximum velocity. The results show that, under most conditions, the rating curves so drawn can replace those built with classical methodologies, thus simplifying the procedures of data monitoring and calculation.
Abstract: A study of the obtainable watermark data rate for information hiding algorithms is presented in this paper. Since the perceptual entropy of wideband monophonic audio signals is in the range of four to five bits per sample, a significant amount of additional information can be inserted into the signal without causing any perceptual distortion. Experimental results showed that transform-domain watermark embedding considerably outperforms watermark embedding in the time domain, and that signal decompositions with a high transform coding gain, such as the wavelet transform, are the most suitable for high-data-rate information hiding. Keywords: Digital watermarking, information hiding, audio watermarking, watermark data rate.
Abstract: The aim of this contribution is to present a new approach to modeling the electrical activity of the human heart. A recurrent artificial neural network is used to reproduce a subset of the dynamics of the electrical behavior of the human heart. The proposed model, when integrated, can also be used as a diagnostic tool for the human heart system. What makes this approach unique is that every model is developed from the physiological measurements of an individual. Such an approach is very difficult to apply successfully in many modeling problems because of the complexity and entropy of the free variables describing the complex system. Differences between the modeled variables and the variables of an individual, measured at specific moments, can be used for diagnostic purposes. The sensor fusion used to optimize the utilization of biomedical sensors is another focus of this paper. Sensor fusion is well known for its advantages in applications such as the control and diagnostics of mechanical and chemical processes.
Abstract: The objective of this paper is to apply the support vector machine (SVM) approach to the classification of cancerous and normal regions of prostate images. Three kinds of textural features are extracted and used for the analysis: parameters of the Gauss-Markov random field (GMRF), the correlation function, and relative entropy. Prostate images are acquired by a system consisting of a microscope, a video camera, and a digitizing board. Cross-validated classification over a database of 46 images is carried out to evaluate performance. In SVM classification, sensitivity and specificity of 96.2% and 97.0%, respectively, are achieved for 32x32-pixel blocks, with an overall accuracy of 96.6%. Classification performance is compared with artificial neural network and k-nearest neighbor classifiers. Experimental results demonstrate that the SVM approach gives the best performance.
Abstract: Clustering methods developed in data mining theory can be successfully applied to the investigation of various dependencies between environmental conditions and human activities. It is known that environmental parameters such as temperature, relative humidity, atmospheric pressure, and illumination have significant effects on human mental performance. To investigate the effects of these parameters, the data mining technique of clustering using entropy and the Information Gain Ratio (IGR) K(Y|X) = (H(Y) − H(Y|X))/H(Y) is used, where H(Y) = −Σ pᵢ ln pᵢ. This technique allows the boundaries of clusters to be adjusted. It is shown that the IGR grows monotonically with the degree of connectivity between two variables. This approach has advantages over, for example, correlation analysis because of its relatively lower sensitivity to the shape of the functional dependencies. A variant of an algorithm implementing the proposed method, together with some analysis of the above problem of environmental effects, is also presented. It is shown that the proposed method converges in a finite number of steps.
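A minimal sketch of the entropy and gain-ratio computation described above, assuming the normalized form K(Y|X) = (H(Y) − H(Y|X))/H(Y), consistent with the H(Y) denominator in the abstract; the function names are illustrative:

```python
import numpy as np
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) = -sum p_i ln p_i of a discrete variable."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

def cond_entropy(y, x):
    """Conditional entropy H(Y|X) = sum over x of p(x) * H(Y | X = x)."""
    y, x = np.asarray(y), np.asarray(x)
    h = 0.0
    for xv in set(x):
        mask = x == xv
        h += mask.mean() * entropy(y[mask])
    return h

def info_gain_ratio(y, x):
    """K(Y|X) = (H(Y) - H(Y|X)) / H(Y): the fraction of uncertainty in Y removed by X.
    Equals 1 when X determines Y, and 0 when X and Y are independent."""
    hy = entropy(y)
    return (hy - cond_entropy(y, x)) / hy
```

The [0, 1] range is what makes the measure usable for adjusting cluster boundaries: it grows monotonically with the strength of the dependence regardless of its functional shape.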
Abstract: Measures of complexity and entropy have not converged to a single quantitative description of the levels of organization of complex systems, yet such a measure is increasingly needed in every discipline that studies them. To address this problem, starting from the most fundamental principle in physics, a new measure of the quantity of organization and the rate of self-organization in complex systems, based on the principle of least (stationary) action, is applied here to a model system: the central processing unit (CPU) of computers. The quantity of organization for several generations of CPUs shows a double-exponential rate of change of organization with time. The exact functional dependence has a fine, S-shaped structure, revealing some of the mechanisms of self-organization. The principle of least action helps to explain the mechanism of the increase of organization through quantity accumulation and through constraint and curvature minimization, with an attractor: the least average sum of the actions of all elements over all motions. This approach can help describe, quantify, measure, manage, design, and predict the future behavior of complex systems so as to achieve the highest rates of self-organization and thereby improve their quality. It can be applied to other complex systems in physics, chemistry, biology, ecology, economics, cities, network theory, and other fields where complex systems are present.
Abstract: Biological sequences from different species are called orthologs if they evolved from a sequence of a common ancestor species and they have the same biological function. Approximations of the Kolmogorov complexity or entropy of biological sequences are already well known to be useful in extracting similarity information between such sequences, in the interest, for example, of ortholog detection. As is well known, the exact Kolmogorov complexity is not algorithmically computable; in practice one can approximate it by computable compression methods. However, such compression methods do not provide a good approximation to Kolmogorov complexity for short sequences. Herein a new approach is suggested to overcome the problem that compression approximations may not work well on short sequences. This approach is inspired by new, conditional computations of Kolmogorov entropy. A main contribution of the empirical work described is to show that the new set of entropy-based machine learning attributes provides good separation between positive (ortholog) and negative (non-ortholog) data, better than good, previously known alternatives (which do not employ any means of handling short sequences well). The new entropy-based attribute set is also compared empirically with a number of other, more standard similarity attribute sets commonly used in genomic analysis. The various similarity attributes are evaluated by cross-validation, through boosted decision tree induction with C5.0, and by Receiver Operating Characteristic (ROC) analysis. The results point to the conclusion that the new, entropy-based attribute set by itself is not the one giving the best prediction; however, it is the best attribute set for improving the other, standard attribute sets when conjoined with them.
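The compression-based approximation to Kolmogorov complexity mentioned above is commonly realized as the normalized compression distance (NCD); a minimal sketch using zlib (the paper's own compressor choice and attribute construction are not reproduced here):

```python
import zlib

def compressed_size(data: bytes) -> int:
    """Compressed length: a computable stand-in for the Kolmogorov complexity K(data)."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for very similar sequences,
    near 1 for unrelated ones."""
    cx, cy, cxy = compressed_size(x), compressed_size(y), compressed_size(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)
```

For very short sequences the compressor's fixed overhead dominates the compressed size, which is exactly the failure mode the conditional entropy attributes in the abstract are designed to work around.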
Abstract: This paper proposes to use ETM+ multispectral data and the panchromatic band, as well as texture features derived from the panchromatic band, for land cover classification. Four texture features, including one 'internal texture' and three GLCM-based textures, namely correlation, entropy, and inverse difference moment, were used in combination with ETM+ multispectral data. Two data sets, combining the multispectral data with the panchromatic band and its textures, were used, and the results were compared with those obtained using the multispectral data alone. A decision tree classifier, with and without boosting, was used to classify the different datasets. The results suggest that the dataset consisting of the panchromatic band, four of its texture features, and the multispectral data increased the classification accuracy by about 2%. In comparison, a boosted decision tree increased the classification accuracy by about 3% with the same dataset.
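Two of the GLCM textures named above, entropy and inverse difference moment, can be sketched from first principles (illustrative code, not the study's implementation; the image is assumed already quantized to a small number of grey levels):

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Grey-level co-occurrence matrix for one pixel offset (dx, dy), normalized."""
    img = np.asarray(img)
    m = np.zeros((levels, levels))
    h, w = img.shape
    for i in range(h - dy):
        for j in range(w - dx):
            m[img[i, j], img[i + dy, j + dx]] += 1   # count co-occurring grey pairs
    return m / m.sum()

def glcm_entropy(p):
    """GLCM entropy: high for disordered texture, 0 for a uniform patch."""
    q = p[p > 0]
    return float(-np.sum(q * np.log(q)))

def glcm_idm(p):
    """Inverse difference moment (homogeneity): weights near-diagonal entries."""
    i, j = np.indices(p.shape)
    return float(np.sum(p / (1.0 + (i - j) ** 2)))
```

A flat patch gives entropy 0 and IDM 1, while a checkerboard concentrates mass off the diagonal, raising entropy and lowering IDM; these are the contrasts the classifier exploits.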
Abstract: As the use of registration packages spreads, the number of aligned image pairs in image databases (aligned either by manual or by automatic methods) increases dramatically. These image pairs can serve as a set of training data, while the images that are to be registered serve as testing data. In this paper, a novel medical image registration method is proposed that is based on a priori knowledge of the expected joint intensity distribution estimated from pre-aligned training images. The goal of the registration is to find the optimal transformation such that the distance between the observed joint intensity distribution, obtained from the testing image pair, and the expected joint intensity distribution, obtained from the corresponding training image pair, is minimized. The distance is measured using a divergence measure based on Tsallis entropy. Experimental results show that, compared with the widely used Shannon mutual information as well as Tsallis mutual information, the proposed method is computationally more efficient without sacrificing registration accuracy.
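A sketch of the Tsallis quantities involved. Hedged: the abstract does not specify the entropic index q or the exact divergence form; shown is the standard Tsallis relative entropy, which reduces to the Kullback-Leibler divergence as q → 1:

```python
import numpy as np

def tsallis_entropy(p, q=2.0):
    """Tsallis entropy S_q(p) = (1 - sum p_i^q) / (q - 1); Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def tsallis_divergence(p, r, q=2.0):
    """Tsallis relative entropy D_q(p||r) = (sum p_i^q r_i^(1-q) - 1) / (q - 1);
    zero iff p == r, and the KL divergence in the limit q -> 1."""
    p = np.asarray(p, dtype=float)
    r = np.asarray(r, dtype=float)
    mask = p > 0
    return (np.sum(p[mask] ** q * r[mask] ** (1.0 - q)) - 1.0) / (q - 1.0)
```

In the registration setting, p would be the observed joint intensity histogram of the test pair and r the expected histogram from training; the optimizer searches for the transformation minimizing D_q(p||r).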
Abstract: Authentication plays a vital role in many secure systems. Most of these systems require the user to log in with a secret password or pass phrase before access is granted. This ensures that all valuable information is kept confidential while also guaranteeing its integrity and availability. To achieve this goal, however, users are required to memorize high-entropy passwords or pass phrases, and such meaningless strings of data are sometimes difficult to remember. This paper presents a new scheme that assigns a weight to each personal question given to the user for revealing the encrypted secrets or password. The scheme concentrates on offering fault tolerance to users by allowing them to forget the answers to a subset of the questions and still recover the secret and achieve successful authentication. A comparison of the security levels of the weight-based and weightless secret recovery schemes is also discussed. The paper concludes with a few areas of this research that require further investigation.
Abstract: This paper investigates the indices of a creative city in Isfahan. Its main aim is to evaluate the quantitative status of the creative city indices in Isfahan and to analyze the dispersion and distribution of these indices across the city. To this end, the study analyzes the creative city indices in the fifteen areas of Isfahan using secondary data, a questionnaire, the TOPSIS model, Shannon entropy, and SPSS. On this basis, the fifteen areas of Isfahan have been ranked with respect to 12 factors of the creative city indices. The results show that the fifteen areas of Isfahan do not benefit equally from the creative indices and that there are large differences between the areas of the city.
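The TOPSIS ranking step used above can be sketched as follows (an illustrative, generic implementation; the paper's 12 indices, their weights, and the SPSS workflow are not reproduced):

```python
import numpy as np

def topsis(X, weights, benefit):
    """Rank alternatives (rows of X) over criteria (columns) by relative closeness
    to the ideal solution. benefit[j] is True if higher is better for criterion j."""
    X = np.asarray(X, dtype=float)
    R = X / np.linalg.norm(X, axis=0)            # vector-normalize each criterion
    V = R * np.asarray(weights, dtype=float)     # weighted normalized matrix
    benefit = np.asarray(benefit)
    ideal = np.where(benefit, V.max(0), V.min(0))    # best value per criterion
    anti  = np.where(benefit, V.min(0), V.max(0))    # worst value per criterion
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)               # closeness coefficient in [0, 1]
```

The closeness coefficient orders the alternatives (here, the fifteen areas): an area scoring near 1 sits close to the ideal profile on the weighted indices, near 0 close to the worst profile.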
Abstract: It is well established that the instantaneous heart rate (HR) of healthy humans keeps changing. Analysis of heart rate variability (HRV) has become a popular non-invasive tool for assessing the activity of the autonomic nervous system. Depressed HRV has been found in several disorders characterized by autonomic nervous dysfunction, such as diabetes mellitus (DM) and coronary artery disease. A new technique, which searches for pattern repeatability in a time series, is proposed specifically for the analysis of heart rate data. The resulting indices, termed the pattern repeatability measure and the pattern repeatability ratio, are compared with approximate entropy and sample entropy. In our analysis, based on the method developed, it is observed that heart rate variability is significantly different for DM patients, particularly for patients with diabetic foot ulcers.
Abstract: Disposal of health-care waste (HCW) is an important environmental problem, especially in large cities. Multiple criteria decision making (MCDM) techniques are well suited to the quantitative and qualitative considerations of health-care waste management (HCWM) problems. This research proposes a fuzzy multi-criteria group decision making approach with a multilevel hierarchical structure, including qualitative as well as quantitative performance attributes, for evaluating HCW disposal alternatives for Istanbul. Using the entropy weighting method, objective weights as well as subjective weights are taken into account in determining the importance weights of the quantitative performance attributes. The results obtained using the proposed methodology are thoroughly analyzed.
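The entropy weighting step can be sketched generically as follows (illustrative: it assumes a decision matrix with positive entries, alternatives in rows and quantitative criteria in columns; the paper's actual attributes and fuzzy aggregation are not reproduced):

```python
import numpy as np

def entropy_weights(X):
    """Objective criteria weights from the Shannon entropy of the decision matrix:
    criteria whose values differ more across alternatives carry more information
    and therefore receive larger weights."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                        # column-wise proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)   # 0 * log 0 treated as 0
    E = -np.sum(P * logs, axis=0) / np.log(n)    # normalized entropy, in [0, 1]
    d = 1.0 - E                                  # degree of diversification
    return d / d.sum()                           # weights summing to 1
```

A criterion that is identical for every alternative has maximal entropy and zero weight, which is the intuition behind using entropy to derive objective weights alongside the subjective ones.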
Abstract: Among mechanical joining processes, welding is employed for its advantages in design flexibility, cost savings, reduced overall weight, and enhanced structural performance. However, for structures made of relatively thin components, welding can introduce significant buckling distortion, which causes loss of dimensional control and structural integrity and increases fabrication costs. Several parameters can affect the buckling behavior of welded thin structures, such as heat input, welding sequence, and the dimensions of the structure. In this work, a 3-D thermo-elastic-viscoplastic finite element analysis technique is applied to evaluate the effect of shell dimensions on the buckling behavior and entropy generation of welded thin shells. In addition, the approximate longitudinal transient stresses produced at each time step are applied to a 3-D eigenvalue analysis to verify the predicted buckling time and the corresponding eigenmode. The possibility of predicting buckling from the entropy generation at each time step is also investigated, and it is found that the time of buckling can be predicted by plotting entropy generation against out-of-plane deformation. The results of the finite element analysis show that the length, span, and thickness of welded thin shells affect the number of local buckles, the mode shape of global buckling, and the post-buckling behavior of the welded thin shells.
Abstract: Flexible macroblock ordering (FMO), adopted in the H.264 standard, allows all macroblocks (MBs) in a frame to be partitioned into separate groups of MBs called slice groups (SGs). FMO can not only support error resilience but also control the size of video packets for different network types. However, it is well known that adopting FMO increases the number of bits required to encode a frame. In this paper, we propose a novel algorithm that reduces the bitrate overhead caused by utilizing FMO. In the proposed algorithm, all MBs are grouped into SGs based on the similarity of their transform coefficients. Experimental results show that our algorithm reduces the bitrate compared with conventional FMO.
Abstract: CTMA-bentonite and BTEA-bentonite were prepared from Na-bentonite by cation exchange with cetyltrimethylammonium (CTMA) and benzyltriethylammonium (BTEA) ions. The products were characterized by XRD and IR techniques. At 100% cation exchange capacity, the d001 spacing values of CTMA-bentonite and BTEA-bentonite are 7.54 Å and 3.50 Å larger, respectively, than that of Na-bentonite. The IR spectra showed that the intensities of the OH stretching and bending vibrations of the two organoclays decreased greatly compared with untreated Na-bentonite. Batch experiments were carried out at 303 K, 318 K, and 333 K to obtain the sorption isotherms of crystal violet on the two organoclays. The results show that the sorption isotherm data are well described by the Freundlich model, and the kinetic data for both organoclays fit the pseudo-second-order kinetic model well. The adsorption capacity of CTMA-bentonite was found to be higher than that of BTEA-bentonite. Thermodynamic parameters such as the changes in free energy (ΔG°), enthalpy (ΔH°), and entropy (ΔS°) were also evaluated; the overall adsorption of crystal violet on the two organoclays is a spontaneous, endothermic physisorption process. CTMA-bentonite and BTEA-bentonite could thus be employed as low-cost alternatives to activated carbon for the removal of color from textile dyes in wastewater treatment.
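The thermodynamic evaluation follows the standard van't Hoff analysis; a sketch with hypothetical equilibrium constants (the paper's measured values are not reproduced) shows how ΔH°, ΔS°, and ΔG° are obtained from sorption data at several temperatures:

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol*K)

def vant_hoff(T, K):
    """Fit ln K = -dH/(R*T) + dS/R over temperatures T to recover the enthalpy
    change dH (J/mol) and entropy change dS (J/(mol*K)); the Gibbs free energy
    change dG = -R*T*ln K is evaluated at each temperature."""
    T = np.asarray(T, dtype=float)
    K = np.asarray(K, dtype=float)
    slope, intercept = np.polyfit(1.0 / T, np.log(K), 1)  # linear fit in 1/T
    dH = -R * slope
    dS = R * intercept
    dG = -R * T * np.log(K)
    return dH, dS, dG
```

Hypothetical constants that increase with temperature yield ΔH° > 0 (endothermic) and ΔG° < 0 at each temperature (spontaneous), the qualitative pattern reported in the abstract.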
Abstract: A novel file splitting technique for reducing the nth-order entropy of text files is proposed. The technique is based on mapping the original text file into a non-ASCII binary file using a new codeword assignment method; the resulting binary file is then split into several subfiles, each containing one or more bits from each codeword of the mapped binary file. The statistical properties of the subfiles are studied, and it is found that they reflect the statistical properties of the original text file, which is not the case when the ASCII code is used as the mapper. The nth-order entropies of these subfiles are determined, and it is found that the sum of their entropies is less than that of the original text file for the same extension orders. These statistical properties of the resulting subfiles can be exploited to achieve better compression ratios when conventional compression techniques are applied to the subfiles individually, on a bit-wise rather than a character-wise basis.
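The nth-order entropy the technique targets can be estimated from n-gram frequencies; a minimal sketch follows (per-symbol entropy in bits, estimated from counts; the paper's codeword assignment and splitting steps are not reproduced):

```python
import math
from collections import Counter

def nth_order_entropy(data: bytes, n: int = 1) -> float:
    """Estimate the nth-order entropy in bits per symbol: the Shannon entropy
    of the empirical n-gram distribution, divided by n."""
    grams = [data[i:i + n] for i in range(len(data) - n + 1)]
    total = len(grams)
    counts = Counter(grams)
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return h / n
```

Higher-order estimates capture inter-symbol dependence, so they fall below the first-order value for structured data; this is the quantity whose sum over subfiles the technique drives below the entropy of the original file.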
Abstract: This paper describes the design and results of FROID,
an outbound intrusion detection system built with agent technology
and supported by an attacker-centric ontology. The prototype
features a misuse-based detection mechanism that identifies remote
attack tools in execution. Misuse signatures composed of attributes
selected through entropy analysis of outgoing traffic streams and
process runtime data are derived from execution variants of attack
programs. The core of the architecture is a mesh of self-contained
detection cells organized non-hierarchically that group agents in a
functional fashion. The experiments show performance gains when
the ontology is enabled as well as an increase in accuracy achieved
when correlation cells combine detection evidence received from
independent detection cells.
Abstract: We investigated the statistical performance of Bayesian inference using maximum entropy and MAP estimation for several models that approximate wave-fronts in remote sensing with SAR interferometry. Using Monte Carlo simulation over a set of wave-fronts generated by an assumed true prior, we found that the maximum entropy method achieves optimal performance around the Bayes-optimal conditions when using the model of the true prior and a likelihood representing the optical measurement due to the interferometer. We also found that MAP estimation, regarded as a deterministic limit of maximum entropy, achieves almost the same performance as the Bayes-optimal solution on this set of wave-fronts. We then clarified that MAP estimation performs perfect phase unwrapping without using prior information, and that it achieves accurate phase unwrapping with the conjugate gradient (CG) method, provided the model of the true prior is chosen appropriately.