Abstract: The goal of this work is to improve the efficiency and reliability of automatic artifact rejection, in particular from electroencephalographic (EEG) recordings. Artifact rejection is a key topic in signal processing: artifacts are unwelcome signals that may occur during signal acquisition and that may corrupt the analysis of the signals themselves. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for artifact extraction and on higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we enhance this technique by introducing Rényi's entropy. The performance of our method was tested using the independent component scalp maps, compared with the performance of the method in the literature, and shown to outperform it.
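As a minimal sketch of this kind of marker-based rejection (not the authors' implementation), the snippet below scores each independent component by its excess kurtosis and a histogram-based Rényi entropy, then flags statistical outliers. The entropy order alpha, the bin count, and the z-score threshold are illustrative assumptions.

```python
import numpy as np

def excess_kurtosis(x):
    """Sample excess kurtosis: ~0 for Gaussian data, large for spiky artifacts."""
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4) - 3.0

def renyi_entropy(x, alpha=2.0, bins=64):
    """Rényi entropy of order alpha from a histogram estimate of x's density."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def flag_artifact_components(components, z_thresh=2.0):
    """Flag components whose kurtosis or Rényi entropy deviates from the
    ensemble mean by more than z_thresh standard deviations."""
    k = np.array([excess_kurtosis(c) for c in components])
    h = np.array([renyi_entropy(c) for c in components])
    is_outlier = lambda v: np.abs(v - v.mean()) > z_thresh * v.std()
    return is_outlier(k) | is_outlier(h)
```

In a full pipeline, the flagged components would be zeroed before inverting the ICA mixing matrix to reconstruct the cleaned EEG.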
Abstract: We study the spatial design of experiments: we want to select the most informative subset, of prespecified size, from a set of correlated random variables. The problem arises in many applied domains, such as meteorology, environmental statistics, and statistical geology, where observations can be collected at different locations and possibly at different times. In spatial design, when the design region and the set of interest are discrete, the covariance matrix completely determines any objective function, and our goal is to choose a feasible design that minimizes the resulting uncertainty. The problem can be recast as that of maximizing the determinant of the covariance matrix of the chosen subset, and it is NP-hard. When such designs are used in computer experiments, the design space is often very large and computing the exact optimal solution is not possible. Heuristic optimization methods can discover efficient experiment designs in situations where traditional designs cannot be applied, exchange methods are ineffective, and an exact solution is not attainable. We developed a genetic algorithm (GA) to take advantage of its exploratory power, and we demonstrate its successful application on the large design space of a real design-of-experiments problem.
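The determinant-maximization formulation can be sketched with a toy genetic algorithm. The selection, crossover, and mutation rules below are generic illustrations, not the authors' operators, and the population size, generation count, and mutation rate are arbitrary assumptions.

```python
import numpy as np

def logdet(C, idx):
    """log-determinant of the covariance submatrix for the design 'idx'."""
    sign, ld = np.linalg.slogdet(C[np.ix_(idx, idx)])
    return ld if sign > 0 else -np.inf

def ga_max_entropy_design(C, k, pop=30, gens=60, seed=0):
    """Toy GA for the maximum-entropy design problem: choose k of n sites
    maximizing the determinant of their covariance submatrix."""
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    population = [rng.choice(n, size=k, replace=False) for _ in range(pop)]
    for _ in range(gens):
        fitness = [logdet(C, ind) for ind in population]
        order = np.argsort(fitness)[::-1]
        elite = [population[i] for i in order[: pop // 2]]      # selection
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.choice(len(elite), size=2, replace=False)
            genes = np.union1d(elite[a], elite[b])              # crossover
            child = rng.choice(genes, size=k, replace=False)
            if rng.random() < 0.3:                              # mutation: swap one site
                child[rng.integers(k)] = rng.choice(
                    np.setdiff1d(np.arange(n), child))
            children.append(child)
        population = elite + children
    return np.sort(max(population, key=lambda ind: logdet(C, ind)))
```

Because the top half of each generation is carried over unchanged, the best design found so far is never lost (elitism).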
Abstract: This paper performs a second-law analysis of
thermodynamics on the laminar film condensation of pure saturated
vapor flowing in the direction of gravity over an ellipsoid with variable
wall temperature. The analysis provides an understanding of how the
geometric parameter (ellipticity) and the non-isothermal wall temperature
variation amplitude A affect entropy generation during the film-wise
condensation heat transfer process. To identify which
irreversibilities are involved in this condensation process, we derived an
expression for the entropy generation number in terms of ellipticity
and A. The results indicate that entropy generation increases with
ellipticity. Furthermore, the irreversibility due to finite-temperature-difference
heat transfer dominates over that due to condensate film
flow friction, and the local entropy generation rate decreases with
increasing A in the upper half of the ellipsoid, while it
increases with A around the rear lower half of the
ellipsoid.
Abstract: We consider the topological entropy of maps that, in
general, cannot be described by one-dimensional dynamics. In particular,
we show that for a multivalued map F generated by single-valued
maps, the topological entropy of each of the single-valued maps bounds the topological entropy of F from below.
Abstract: Distance protection of transmission lines that include advanced flexible AC transmission system (FACTS) devices is a very challenging task. The FACTS devices of interest in this paper are the static synchronous series compensator (SSSC) and the unified power flow controller (UPFC). A new algorithm is proposed to detect and classify faults and to identify the fault position in a transmission line with respect to a FACTS device placed at the midpoint of the line. Discrete wavelet transformation and wavelet entropy calculations are used to analyze the during-fault current and voltage signals of the compensated transmission line. The proposed algorithm is simple and accurate in fault detection and classification. A variety of fault cases and simulation results are presented to show the effectiveness of the algorithm.
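The wavelet-entropy feature can be illustrated with a small sketch. Here a Haar transform stands in for whatever mother wavelet the study actually uses, and the number of decomposition levels is an assumption; a fault transient redistributes energy across levels and changes this entropy sharply.

```python
import numpy as np

def haar_dwt_details(x, levels=4):
    """Multi-level Haar DWT; returns the detail coefficients of each level
    (a stand-in for the filters typically used in protection studies)."""
    details = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a = a[: len(a) - len(a) % 2]
        details.append((a[0::2] - a[1::2]) / np.sqrt(2))
        a = (a[0::2] + a[1::2]) / np.sqrt(2)
    return details

def wavelet_entropy(x, levels=4):
    """Shannon entropy of the relative wavelet energy across levels."""
    energies = np.array([np.sum(d ** 2) for d in haar_dwt_details(x, levels)])
    p = energies / energies.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))
```

A signal whose detail energy sits in a single band gives entropy 0; broadband fault transients push it toward the maximum log(levels).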
Abstract: Information theory and statistics play an important role in the biological sciences when information measures are used for the study of diversity and equitability. In this communication, we develop the link among the three disciplines and prove that sampling distributions can be used to develop new information measures. Our study is interdisciplinary and finds its applications in biological systems.
Abstract: It is sometimes difficult to differentiate between
innocent murmurs and pathological murmurs during auscultation. In
these difficult cases, an intelligent stethoscope with decision support
abilities would be of great value. In this study, using a dog model,
phonocardiographic recordings were obtained from 27 boxer dogs
with various degrees of aortic stenosis (AS) severity. As a reference
for severity assessment, continuous wave Doppler was used. The data
were analyzed with recurrence quantification analysis (RQA) with
the aim of finding features able to distinguish innocent murmurs from
murmurs caused by AS. Four out of eight investigated RQA features
showed significant differences between innocent murmurs and
pathological murmurs. Using a plain linear discriminant analysis
classifier, the best pair of features (recurrence rate and entropy)
resulted in a sensitivity of 90% and a specificity of 88%. In
conclusion, RQA provides valid features which can be used for
differentiation between innocent murmurs and murmurs caused by
AS.
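The two winning features can be sketched as follows. For simplicity the recurrence plot is built from the raw scalar signal rather than an embedded phase-space trajectory, and the threshold eps and minimum line length are illustrative assumptions.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot: R[i, j] = 1 when |x_i - x_j| < eps."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points in the plot."""
    return R.mean()

def diagonal_line_entropy(R, lmin=2):
    """Shannon entropy of the diagonal line-length distribution (the RQA
    'entropy' feature)."""
    n = R.shape[0]
    lengths = []
    for k in range(1, n):                    # diagonals above the main one
        run = 0
        for v in list(np.diagonal(R, k)) + [0]:  # sentinel closes a trailing run
            if v:
                run += 1
            else:
                if run >= lmin:
                    lengths.append(run)
                run = 0
    counts = np.bincount(np.array(lengths, dtype=int))[lmin:]
    counts = counts[counts > 0]
    if counts.size == 0:
        return 0.0
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))
```

Deterministic (periodic) signals produce long diagonal lines, noise-like murmurs short ones, which is what makes these features discriminative.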
Abstract: In the present communication, we have proposed
some new generalized measures of fuzzy entropy based upon real
parameters, discussed their desirable properties, and presented
these measures graphically. An important property of the proposed
measures, namely monotonicity, has also been studied.
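The paper's parametric measures are not reproduced here, but the classical De Luca–Termini fuzzy entropy that such measures generalize can be written compactly; it is maximal for the most ambiguous membership value 0.5 and decreases monotonically toward the crisp values 0 and 1.

```python
import numpy as np

def fuzzy_entropy(mu, eps=1e-12):
    """De Luca–Termini fuzzy entropy of a membership vector mu in [0, 1];
    a classical baseline, not one of the paper's new parametric measures."""
    mu = np.clip(mu, eps, 1 - eps)           # avoid log(0) at crisp values
    return -np.sum(mu * np.log(mu) + (1 - mu) * np.log(1 - mu))
```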
Abstract: Adsorption of Toluidine blue dye from aqueous solutions onto Neem Leaf Powder (NLP) has been investigated. The surface characterization of this natural material was examined by Particle size analysis, Scanning Electron Microscopy (SEM), Fourier Transform Infrared (FTIR) spectroscopy and X-Ray Diffraction (XRD). The effects of process parameters such as initial concentration, pH, temperature and contact duration on the adsorption capacities have been evaluated, in which pH has been found to be most effective parameter among all. The data were analyzed using the Langmuir and Freundlich for explaining the equilibrium characteristics of adsorption. And kinetic models like pseudo first- order, second-order model and Elovich equation were utilized to describe the kinetic data. The experimental data were well fitted with Langmuir adsorption isotherm model and pseudo second order kinetic model. The thermodynamic parameters, such as Free energy of adsorption (AG"), enthalpy change (AH') and entropy change (AS°) were also determined and evaluated.
Abstract: In this paper, a robust digital image watermarking
scheme for copyright protection applications using the singular value
decomposition (SVD) is proposed. In this scheme, an entropy
masking model has been applied on the host image for the texture
segmentation. Moreover, the local luminance and textures of the host
image are considered in the watermark embedding procedure to
increase the robustness of the watermarking scheme. In contrast to all
existing SVD-based watermarking systems that have been designed
to embed visual watermarks, our system uses a pseudo-random
sequence as a watermark. We have tested the performance of our
method using a wide variety of image processing attacks on different
test images. A comparison is made between the results of our
proposed algorithm with those of a wavelet-based method to
demonstrate the superior performance of our algorithm.
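One common SVD embedding rule, additively perturbing the singular values of an image block by a pseudo-random sequence (in the spirit of Liu and Tan's scheme, and not necessarily the authors' exact procedure), can be sketched as follows; alpha is an assumed embedding strength.

```python
import numpy as np

def svd_embed(block, w, alpha=0.1):
    """Embed a pseudo-random sequence w by additively perturbing the
    singular values of an image block. Valid while S + alpha*w remains
    positive and in descending order."""
    U, S, Vt = np.linalg.svd(block)
    return U @ np.diag(S + alpha * w) @ Vt

def svd_extract(marked, S_ref, alpha=0.1):
    """Recover the sequence, given the original singular values S_ref."""
    S_marked = np.linalg.svd(marked, compute_uv=False)
    return (S_marked - S_ref) / alpha
```

Singular values are stable under common image-processing attacks, which is the usual motivation for embedding in them.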
Abstract: We study in this paper the effect of the scene
changing on image sequences coding system using Embedded
Zerotree Wavelet (EZW). The scene change considered here is the
full-motion change that may occur. A special image sequence is generated
in which scene changes occur randomly. Two scenarios are
considered: In the first scenario, the system must provide the
reconstruction quality as best as possible by the management of the
bit rate (BR) while the scene changing occurs. In the second scenario,
the system must keep the bit rate as constant as possible by the
management of the reconstruction quality. The first scenario may be
motivated by the availability of a large band pass transmission
channel where an increase of the bit rate may be possible to keep the
reconstruction quality up to a given threshold. The second scenario
concerns the narrow-band transmission channel,
where an increase of the bit rate is not possible. In this case,
applications for which the reconstruction quality is not a constraint
may be considered. The simulations are performed with five scales
wavelet decomposition using the 9/7-tap filter bank biorthogonal
wavelet. The entropy coding is performed using a specific defined
binary code book and EZW algorithm. Experimental results are
presented and compared to LEAD H263 EVAL. It is shown that if
the reconstruction quality is the constraint, the system increases the
bit rate to obtain the required quality. In the case where the bit rate
must be constant, the system is unable to provide the required quality
when a scene change occurs; however, it is able to recover
the quality once the scene change has passed.
Abstract: In today's day and age, one of the important topics in
information security is authentication. There are several alternatives
to text-based authentication, among them the Graphical Password
(GP), also known as Graphical User Authentication (GUA). These methods stem
from the fact that humans recognize and remember images better
than alphanumeric text. This paper will focus on the
security aspect of GP algorithms and what most researchers have
been working on trying to define these security features and
attributes. The goal of this study is to develop a fuzzy decision model
that allows automatic selection of available GP algorithms by taking
into considerations the subjective judgments of the decision makers
who are more than 50 postgraduate students of computer science. The
approach that is being proposed is based on the Fuzzy Analytic
Hierarchy Process (FAHP) which determines the criteria weight as a
linear formula.
Abstract: In this work, we are interested in developing a speech denoising tool based on the discrete wavelet packet transform (DWPT). This tool will be employed in applications of recognition, coding, and synthesis. For noise reduction, instead of applying the classical thresholding technique, some wavelet packet nodes are set to zero and the others are thresholded. To estimate the non-stationary noise level, we employ the spectral entropy. Our proposed technique is compared to classical denoising methods based on thresholding and on spectral subtraction. The experimental implementation uses speech signals corrupted by two kinds of noise, white and Volvo noise. The results of listening tests show that our proposed technique outperforms spectral subtraction, and SNR computations show its superiority over the classical thresholding method using the modified hard thresholding function based on the μ-law algorithm.
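The spectral-entropy noise cue can be sketched per analysis frame as below; the frame length and FFT size are assumptions. A flat (noise-like) spectrum gives a normalized entropy near 1, while voiced speech, with a peaky spectrum, scores much lower, which is what lets the entropy track the non-stationary noise level.

```python
import numpy as np

def spectral_entropy(frame, nfft=256):
    """Normalized Shannon entropy of a frame's power spectrum:
    near 1 for noise-like (flat) spectra, low for tonal/voiced frames."""
    psd = np.abs(np.fft.rfft(frame, nfft)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(nfft // 2 + 1)
```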
Abstract: The linear methods of heart rate variability analysis
such as non-parametric (e.g. fast Fourier transform analysis) and
parametric methods (e.g. autoregressive modeling) have become an
established non-invasive tool for marking the cardiac health, but their
sensitivity and specificity were found to be lower than expected with
positive predictive value
Abstract: In this paper, a fast motion compensation algorithm is
proposed that improves coding efficiency for video sequences with
brightness variations. We also propose a cross entropy measure
between histograms of two frames to detect brightness variations. The
framewise brightness variation parameters, a multiplier and an offset
field for image intensity, are estimated and compensated. Simulation
results show that the proposed method yields a higher peak signal to
noise ratio (PSNR) compared with the conventional method, with a
greatly reduced computational load, when the video scene contains
illumination changes.
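The histogram-based detector can be sketched as below. Here the self-entropies are subtracted from the two cross entropies so that identical frames score exactly zero (which makes the measure a symmetric KL divergence); the bin count and decision threshold are ad hoc assumptions, not the paper's values.

```python
import numpy as np

def cross_entropy(hist_p, hist_q, eps=1e-12):
    """Cross entropy H(p, q) between two normalized intensity histograms."""
    p = hist_p / hist_p.sum()
    q = hist_q / hist_q.sum() + eps          # eps avoids log(0)
    return -np.sum(p * np.log(q))

def brightness_change(frame_a, frame_b, bins=64, thresh=0.5):
    """Flag a brightness variation when the symmetrized divergence between
    the two frames' intensity histograms exceeds a threshold."""
    ha, _ = np.histogram(frame_a, bins=bins, range=(0, 255))
    hb, _ = np.histogram(frame_b, bins=bins, range=(0, 255))
    d = 0.5 * (cross_entropy(ha, hb) + cross_entropy(hb, ha)) \
        - 0.5 * (cross_entropy(ha, ha) + cross_entropy(hb, hb))
    return d > thresh, d
```

Frames flagged this way would then trigger estimation of the multiplier/offset brightness model before motion compensation.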
Abstract: In this paper, a two factor scheme is proposed to
generate cryptographic keys directly from biometric data, which
unlike passwords, are strongly bound to the user. Hash value of the
reference iris code is used as a cryptographic key and its length
depends only on the hash function, being independent of any other
parameter. The entropy of such keys is 94 bits, which is much higher
than any other comparable system. The most important and distinct
feature of this scheme is that it regenerates the reference iris code by
providing a genuine iris sample and the correct user password. Since
iris codes obtained from two images of the same eye are not exactly
the same, error correcting codes (Hadamard code and Reed-Solomon
code) are used to deal with the variability. The scheme proposed here
can be used to provide keys for a cryptographic system and/or for
user authentication. The performance of this system is evaluated on
two publicly available databases for iris biometrics namely CBS and
ICE databases. The operating point of the system (values of False
Acceptance Rate (FAR) and False Rejection Rate (FRR)) can be set
by properly selecting the error correction capacity (ts) of the Reed-
Solomon codes, e.g., on the ICE database, at ts = 15, FAR is 0.096%
and FRR is 0.76%.
Abstract: Partial discharge (PD) detection is an important
method to evaluate the insulation condition of metal-clad apparatus.
Non-intrusive sensors which are easy to install and have no
interruptions on operation are preferred in onsite PD detection.
However, this approach often lacks accuracy due to interference in PD
signals. In this paper, a novel PD extraction method that uses frequency
analysis and entropy-based time-frequency (TF) analysis is introduced.
The repetitive pulses from convertor are first removed via frequency
analysis. Then, the relative entropy and relative peak-frequency of
each pulse (i.e. time-indexed vector TF spectrum) are calculated and
all pulses with similar parameters are grouped. According to the
characteristics of non-intrusive sensor and the frequency distribution
of PDs, the pulses of PD and interferences are separated. Finally the
PD signal and interferences are recovered via inverse TF transform.
The de-noised result of noisy PD data demonstrates that the
combination of frequency and time-frequency techniques can
discriminate PDs from interferences with various frequency
distributions.
Abstract: This paper examines the implementation of the RC5 block cipher for digital images, along with a detailed security analysis. A complete specification for applying the RC5 block cipher to digital images is given. The security of RC5 for digital images against entropy, brute-force, statistical, and differential attacks is explored from a strict cryptographic viewpoint. Thorough experimental tests with detailed analysis verify that the RC5 block cipher is highly secure for real-time image encryption.
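The entropy-attack criterion reduces to checking that ciphertext images have near-maximal histogram entropy: for 8-bit pixels the ideal value is 8 bits/pixel. A sketch of the measurement (not tied to RC5 itself):

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy of the 8-bit intensity histogram, in bits/pixel.
    A well-encrypted image approaches the maximum of 8."""
    hist = np.bincount(img.ravel().astype(np.uint8), minlength=256)
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log2(p))
```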
Abstract: In this paper, a method for decision making in a fuzzy environment is presented. A new integrated subjective-objective approach is introduced to assign attribute weights in fuzzy multiple attribute decision making (FMADM) problems, and the alternatives are finally ranked by the proposed method.
Abstract: Data compression is used operationally to reduce bandwidth and storage requirements. An efficient method for lossless weather radar data compression is presented. The proposed method takes the characteristics of the data into account and applies optimal linear prediction to the PPI images in the weather radar data. Successive PPI images are highly similar, so a dramatic reduction in source entropy is achieved by the prediction algorithm. Standard lossless compression methods are then used to compress the predicted residual data. Experimental results show that, for weather radar data, the proposed method outperforms the other methods.
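The prediction step can be sketched in its simplest form, predicting each PPI image by the previous scan and measuring the zeroth-order entropy of the residuals (the paper uses optimal linear prediction; this previous-frame predictor and the entropy estimate are simplifying assumptions, and a real codec would follow with an actual entropy coder).

```python
import numpy as np

def entropy_bits(a):
    """Zeroth-order entropy (bits/sample) of an integer-valued array."""
    _, counts = np.unique(a, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def previous_frame_residuals(frames):
    """Predict each PPI image by the previous one and return the residuals;
    for highly correlated successive scans, the residual entropy is far
    lower than that of the raw data."""
    frames = np.asarray(frames, dtype=np.int16)  # widen to avoid overflow
    return frames[1:] - frames[:-1]
```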