Abstract: Biclustering aims to identify biclusters that reveal
potential local patterns in a microarray matrix. A bicluster is
a sub-matrix of the microarray consisting of a subset of genes
that are co-regulated under a subset of conditions. In this study, we
extend the idea of subspace clustering to present a K-biclusters
clustering (KBC) algorithm for the microarray biclustering problem.
Besides minimizing the dissimilarities between genes and bicluster
centers within all biclusters, the objective function of the KBC
algorithm also minimizes the residues within all biclusters
based on the mean square residue model. In addition, the objective
function maximizes the entropy of conditions to encourage more
conditions to contribute to the identification of biclusters. The KBC
algorithm adopts a K-means-type clustering process to efficiently
optimize the partition into K biclusters. A set of experiments
on a practical microarray dataset demonstrates the
performance of the proposed KBC algorithm.
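As a sketch of the mean square residue model named above (the classical Cheng-Church formulation; helper names are illustrative, not taken from the authors' implementation):

```python
def mean_square_residue(submatrix):
    """Mean square residue H(I, J) of a bicluster (Cheng-Church model).

    H = (1/|I||J|) * sum_ij (a_ij - a_iJ - a_Ij + a_IJ)^2,
    where a_iJ, a_Ij and a_IJ are the row, column and overall means.
    A perfectly coherent (additive) bicluster has residue 0.
    """
    rows = len(submatrix)
    cols = len(submatrix[0])
    row_mean = [sum(r) / cols for r in submatrix]
    col_mean = [sum(submatrix[i][j] for i in range(rows)) / rows
                for j in range(cols)]
    overall = sum(row_mean) / rows
    h = 0.0
    for i in range(rows):
        for j in range(cols):
            r = submatrix[i][j] - row_mean[i] - col_mean[j] + overall
            h += r * r
    return h / (rows * cols)
```

A coherent additive pattern such as [[1, 2], [3, 4]] yields residue 0, while an anti-correlated pattern such as [[1, 0], [0, 1]] does not.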
Abstract: In this paper a novel approach for generalized image
retrieval based on semantic contents is presented. It combines
three feature extraction methods, namely color, texture, and edge
histogram descriptor, with provision to add new features in
future for better retrieval efficiency. Any combination of these
methods that is most appropriate for the application can be used
for retrieval; this choice is provided through a User Interface (UI)
in the form of relevance feedback. The image properties in this
work are analyzed using computer vision and image processing
algorithms. For color, histograms of the images are computed; for
texture, co-occurrence-matrix-based measures such as entropy and
energy are calculated; and for edge density, the Edge Histogram
Descriptor (EHD) is found. For retrieval of images, a novel idea
based on a greedy strategy is developed to reduce the computational
complexity. The entire system was developed using AForge.Imaging
(an open source product), MATLAB .NET Builder, C#, and Oracle 10g.
The system was tested with the Corel image database containing
1000 natural images and achieved good results.
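The co-occurrence-based texture measures mentioned above (entropy, energy) can be sketched as follows; the function name and parameters are illustrative, not from the paper's implementation:

```python
import math
from collections import Counter

def glcm_features(image, dx=1, dy=0):
    """Entropy and energy of a gray-level co-occurrence matrix (GLCM).

    image: 2-D list of quantized gray levels;
    (dx, dy): pixel offset defining the co-occurrence direction.
    """
    counts = Counter()
    h, w = len(image), len(image[0])
    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                counts[(image[y][x], image[y2][x2])] += 1
    total = sum(counts.values())
    probs = [c / total for c in counts.values()]
    entropy = -sum(p * math.log2(p) for p in probs)   # texture randomness
    energy = sum(p * p for p in probs)                # texture uniformity
    return entropy, energy
```

A flat image gives entropy 0 and energy 1; a checkerboard maximizes entropy for its two co-occurring pairs.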
Abstract: In the present communication, the existing measures of
fuzzy entropy are reviewed and a generalized parametric exponential
fuzzy entropy is defined. Our study of the four essential properties,
along with some other properties of the proposed measure, clearly
establishes its validity as an entropy.
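As background, the non-parametric exponential fuzzy entropy of Pal and Pal, which parametric generalizations of this kind typically extend, can be sketched as follows (illustrative code, not the measure proposed in the paper):

```python
import math

def exponential_fuzzy_entropy(mu):
    """Pal-Pal exponential fuzzy entropy, normalized to [0, 1].

    H(A) = (1/(n(sqrt(e)-1))) * sum_i [mu_i*e^(1-mu_i)
                                       + (1-mu_i)*e^(mu_i) - 1]
    It is 0 for crisp sets (all mu_i in {0, 1}) and reaches its
    maximum 1 when every mu_i = 0.5.
    """
    n = len(mu)
    s = sum(m * math.exp(1 - m) + (1 - m) * math.exp(m) - 1 for m in mu)
    return s / (n * (math.sqrt(math.e) - 1))
```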
Abstract: In this paper a novel algorithm is proposed to improve
the accuracy of finger vein recognition. The performances of
Principal Component Analysis (PCA), Kernel Principal Component
Analysis (KPCA), and Kernel Entropy Component Analysis (KECA)
in this algorithm are validated and compared with each other in order
to determine which is the most appropriate for finger vein
recognition.
Abstract: A number of competing methodologies have been developed
to identify genes and classify DNA sequences into coding
and non-coding sequences. This classification process is fundamental
in gene finding and gene annotation tools and is one of the most
challenging tasks in bioinformatics and computational biology. An
information theory measure based on mutual information has shown
good accuracy in classifying DNA sequences into coding and non-coding.
In this paper we describe a species independent iterative
approach that distinguishes coding from non-coding sequences using
the mutual information measure (MIM). A set of sixty prokaryotes is
used to extract universal training data. To facilitate comparisons with
the published results of other researchers, a test set of 51 bacterial
and archaeal genomes was used to evaluate MIM. The results
demonstrate that MIM achieves superior accuracy while remaining
species independent.
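A minimal sketch of a mutual information measure over adjacent nucleotides, to illustrate the general idea only (the paper's MIM uses its own training-based formulation):

```python
import math
from collections import Counter

def adjacent_mutual_information(seq):
    """Mutual information (bits) between nucleotides at adjacent positions.

    MI = sum_xy p(x,y) * log2(p(x,y) / (p(x) * p(y))),
    estimated from the overlapping adjacent pairs of the sequence.
    """
    pairs = list(zip(seq, seq[1:]))
    n = len(pairs)
    joint = Counter(pairs)
    first = Counter(a for a, _ in pairs)    # marginal of first position
    second = Counter(b for _, b in pairs)   # marginal of second position
    mi = 0.0
    for (a, b), c in joint.items():
        # p(x,y)/(p(x)p(y)) simplifies to c*n / (first[a]*second[b])
        mi += (c / n) * math.log2(c * n / (first[a] * second[b]))
    return mi
```

A homopolymer run has MI 0, while a strictly alternating sequence approaches 1 bit.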
Abstract: The aggregation behavior of sodium salicylate and sodium cumene sulfonate was studied in aqueous solution at different temperatures. Specific conductivity and relative viscosity were measured at different temperatures to find the minimum hydrotropic concentration. The thermodynamic parameters (free energy, enthalpy and entropy) were evaluated in the temperature range of 30°C-70°C. The free energy decreased with increasing temperature. The aggregation was found to be exothermic in nature and favored by a positive value of entropy.
Abstract: Droplet size distributions in the cold spray of a fuel
are important to the observed combustion behavior. Specification of
droplet size and velocity distributions in the immediate downstream
of injectors is also essential as boundary conditions for advanced
computational fluid dynamics (CFD) and two-phase spray transport
calculations. This paper describes the development of a new model to
be incorporated into the maximum entropy principle (MEP) formalism
for predicting the droplet size distribution in the droplet
formation region.
The MEP approach can predict the most likely droplet size and
velocity distributions under a set of constraints expressing the
available information related to the distribution.
In this article, by considering the mechanisms of turbulence
generation inside the nozzle and wave growth on jet surface, it is
attempted to provide a logical framework coupling the flow inside the
nozzle to the resulting atomization process. The purpose of this paper
is to describe the formulation of this new model and to incorporate it
into the MEP formalism by coupling sub-models together using source
terms of momentum and energy. Comparison between the model
predictions and experimental data for a gas turbine swirling nozzle
and an annular spray indicates good agreement between model and
experiment.
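The core of any MEP formalism is selecting the distribution of maximum Shannon entropy consistent with the available constraints. A self-contained sketch for a discrete size distribution with a prescribed mean diameter (illustrative only; the paper's constraints involve momentum and energy source terms, and the bracketing interval below is an assumption):

```python
import math

def maxent_distribution(values, target_mean, lo=-5.0, hi=5.0):
    """Maximum-entropy probabilities p_i over `values` subject to
    sum(p) = 1 and sum(p_i * v_i) = target_mean.

    The solution has the exponential form p_i ~ exp(-lam * v_i);
    lam is found by bisection (assumes the root lies in [lo, hi]).
    """
    def mean_for(lam):
        w = [math.exp(-lam * v) for v in values]
        z = sum(w)
        return sum(wi * v for wi, v in zip(w, values)) / z

    # mean_for is strictly decreasing in lam, so bisection applies.
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]
```

With the mean constraint equal to the unweighted average, the result is the uniform distribution, as expected from the entropy maximum.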
Abstract: The compression-absorption heat pump (C-A HP), one
of the promising heat recovery devices that produce process hot
water using the low-temperature heat of wastewater, was evaluated by
computer simulation. A simulation program was developed based on
the continuity and the first and second laws of thermodynamics. Both
the absorber and desorber were modeled using UA-LMTD method. In
order to prevent an unfeasible temperature profile and to reduce
calculation errors from the curved temperature profile of a mixture,
the heat loads were divided into many segments. A single-stage
compressor was considered, and the compressor cooling load was also
taken into account. The isentropic efficiency was computed from the
map data. Simulation conditions were given based on the system
consisting of ordinarily designed components. The simulation results
show that most of the total entropy generation occurs during the
compression and cooling process, thus suggesting the possibility that
system performance can be enhanced if a rectifier is introduced.
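A sketch of the UA-LMTD relation applied to each heat-exchanger segment (generic textbook form; not the authors' simulation program):

```python
import math

def lmtd(dt_in, dt_out):
    """Log-mean temperature difference of one heat-exchanger segment.

    dt_in, dt_out: temperature differences between the two streams at
    the segment boundaries (must be positive for a feasible profile).
    """
    if dt_in <= 0 or dt_out <= 0:
        raise ValueError("infeasible temperature profile")
    if abs(dt_in - dt_out) < 1e-9:
        return dt_in  # limiting case when both differences coincide
    return (dt_in - dt_out) / math.log(dt_in / dt_out)

def segment_heat_load(ua, dt_in, dt_out):
    """Heat transferred in a segment: Q = UA * LMTD."""
    return ua * lmtd(dt_in, dt_out)
```

Segmenting the absorber and desorber means applying this relation piecewise, which tracks the curved temperature profile of the mixture more accurately than a single overall LMTD.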
Abstract: In the literature of information theory, there is a
need to compare the different measures of fuzzy entropy, which
consequently gives rise to the need for normalized measures of
fuzzy entropy. In this paper, we have discussed this need and
accordingly developed some normalized measures of fuzzy entropy. It
is also desirable to maximize entropy and to minimize directed
divergence or distance. Keeping this idea in mind, we have explained
the method of optimizing different measures of fuzzy entropy.
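Normalization typically divides a fuzzy entropy by its maximum value so that it lies in [0, 1]. A sketch using the classical De Luca-Termini measure (illustrative; the paper develops its own normalized measures):

```python
import math

def normalized_fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy divided by its maximum n*ln(2).

    H(A) = -sum_i [mu_i*ln(mu_i) + (1-mu_i)*ln(1-mu_i)];
    the maximum n*ln(2) is attained when every mu_i = 0.5, so the
    normalized value lies in [0, 1].
    """
    def term(m):
        if m in (0.0, 1.0):
            return 0.0  # 0*ln(0) taken as 0 by convention
        return -(m * math.log(m) + (1 - m) * math.log(1 - m))
    return sum(term(m) for m in mu) / (len(mu) * math.log(2))
```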
Abstract: A nucleotide sequence can be expressed as a numerical sequence when each nucleotide is assigned its proton number. The resulting gene numerical sequence can be investigated for its fractal dimension in terms of evolution and chemical properties for comparative studies. We have investigated such nucleotide fluctuation in the 16S rRNA gene of archaea thermophiles. The studied archaea thermophiles were Archaeoglobus fulgidus, Methanothermobacter thermautotrophicus, Methanocaldococcus jannaschii, Pyrococcus horikoshii, and Thermoplasma acidophilum. These five archaea-euryarchaeota thermophiles have fractal dimension values ranging from 1.93 to 1.97. Computer simulation shows that random sequences would have an average fractal dimension of about 2 with a standard deviation of about 0.015. The fractal dimension was found to correlate (negative correlation) with the thermophiles' optimal growth temperature with an R2 value of 0.90 (N = 5). The inclusion of two archaea-crenarchaeota thermophiles reduces the R2 value to 0.66 (N = 7). Further inclusion of two bacterial thermophiles reduces the R2 value to 0.50 (N = 9). The fractal dimension is correlated (positively) with the sequence GC content, with an R2 value of 0.89 for the five archaea-euryarchaeota thermophiles (and 0.74 for the entire set of N = 9), although computer simulation shows little correlation. The highest (positive) correlation was found between the fractal dimension and the di-nucleotide Shannon entropy. However, Shannon entropy and sequence GC content were observed to correlate with optimal growth temperature with an R2 of 0.8 (negative) and 0.88 (positive), respectively, for the entire set of 9 thermophiles; thus the correlation lacks species specificity.
Together with another correlation study relating bacterial radiation dosage to the fractal dimension of RecA repair gene sequences, it is postulated that fractal dimension analysis is a sensitive tool for studying the relationship between genotype and phenotype among closely related sequences.
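A sketch of the di-nucleotide Shannon entropy used in the correlation above (standard definition; the helper name is illustrative):

```python
import math
from collections import Counter

def dinucleotide_entropy(seq):
    """Shannon entropy (bits) of overlapping di-nucleotide frequencies.

    Ranges from 0 (a single repeated di-nucleotide) up to 4 bits
    (all 16 di-nucleotides equally frequent).
    """
    pairs = Counter(seq[i:i + 2] for i in range(len(seq) - 1))
    n = sum(pairs.values())
    return -sum((c / n) * math.log2(c / n) for c in pairs.values())
```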
Abstract: We develop new nonlinear methods of
immunofluorescence analysis for a sensitive technology of the
respiratory burst reaction of DNA fluorescence due to oxidative
activity in peripheral blood neutrophils. Histograms in flow
cytometry experiments represent the fluorescence flash frequency as
a function of fluorescence intensity. We used the Shannon-Weaver
index to quantify neutrophil biodiversity and the Hurst index to
quantify fractal correlations in immunofluorescence for
different donors, as the basic quantitative criteria for medical
diagnostics of health status. We analyze frequencies of flashes,
information, Shannon entropies and their fractals in
immunofluorescence networks under reduction of the histogram range.
We found a number of simple universal correlations for
biodiversity, information and the Hurst index in the diagnostics and
classification of pathologies for a wide spectrum of diseases. In
addition, a clear criterion of common immunity and human health
status is determined in the form of yes/no answers. These answers
are based on peculiarities of information in immunofluorescence
networks and the biodiversity of neutrophils. Experimental data
analysis has shown the existence of homeostasis for information
entropy in the oxidative activity of DNA in neutrophil nuclei for
all donors.
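A minimal rescaled-range (R/S) sketch of the Hurst index named above (textbook estimator, not the authors' implementation; values near 0.5 indicate uncorrelated data, higher values persistent correlations):

```python
import math

def hurst_rs(series, min_n=8):
    """Estimate the Hurst exponent as the slope of log(R/S) vs log(n)."""
    N = len(series)
    xs, ys = [], []
    n = min_n
    while n <= N // 2:
        rs_list = []
        for start in range(0, N - n + 1, n):   # non-overlapping windows
            chunk = series[start:start + n]
            m = sum(chunk) / n
            dev = [v - m for v in chunk]
            cum, c = [], 0.0
            for d in dev:                       # cumulative deviations
                c += d
                cum.append(c)
            r = max(cum) - min(cum)             # range of cumulative sum
            s = math.sqrt(sum(d * d for d in dev) / n)
            if s > 0:
                rs_list.append(r / s)
        if rs_list:
            xs.append(math.log(n))
            ys.append(math.log(sum(rs_list) / len(rs_list)))
        n *= 2
    # least-squares slope of log(R/S) against log(n)
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```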
Abstract: The paper provides a numerical investigation of the
entropy generation analysis due to natural convection in an inclined
square porous cavity. The coupled equations of mass, momentum,
energy and species conservation are solved using the Control Volume
Finite-Element Method. Effect of medium permeability and
inclination angle on entropy generation is analysed. It was found
that, depending on the Darcy number and the porous thermal Rayleigh
number values, the entropy generation could be mainly due to heat
transfer or to fluid friction irreversibility, and that entropy
generation reaches extremum values for specific inclination angles.
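For reference, the local entropy generation in a Darcy porous medium is commonly split into a heat-transfer and a fluid-friction part (standard textbook form, assumed here rather than taken from the paper), with the Bejan number measuring their relative weight:

\[
S_{\mathrm{gen}} = \frac{k}{T_0^2}\left[\left(\frac{\partial T}{\partial x}\right)^2
+ \left(\frac{\partial T}{\partial y}\right)^2\right]
+ \frac{\mu}{K\,T_0}\left(u^2 + v^2\right),
\qquad
Be = \frac{S_{\mathrm{HT}}}{S_{\mathrm{HT}} + S_{\mathrm{FF}}}
\]

Be > 0.5 indicates that heat-transfer irreversibility dominates over fluid friction.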
Abstract: This paper describes an Independent Component Analysis (ICA) based fixed-point algorithm for the blind separation of convolutive mixtures of speech picked up by a linear microphone array. The proposed algorithm extracts independent sources by non-Gaussianizing the Time-Frequency Series of Speech (TFSS) in a deflationary way. The degree of non-Gaussianization is measured by negentropy. The relative performances of the algorithm under random initialization and Null-beamformer (NBF) based initialization are studied. It has been found that an NBF-based initial value gives speedy convergence as well as better separation performance.
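Negentropy is usually approximated from a nonquadratic contrast, e.g. G(u) = log cosh(u), as J(y) ≈ (E[G(y)] − E[G(ν)])² for standardized y and standard Gaussian ν. A sketch that estimates the Gaussian reference empirically (illustrative; not the paper's exact contrast function):

```python
import math
import random

def negentropy(y, ref_samples=50000, seed=0):
    """Approximate negentropy J(y) = (E[G(y)] - E[G(nu)])^2, G = log cosh.

    y is standardized to zero mean and unit variance; E[G(nu)] for a
    standard Gaussian nu is estimated by Monte Carlo (seeded for
    reproducibility). Larger J means stronger non-Gaussianity.
    """
    n = len(y)
    m = sum(y) / n
    sd = math.sqrt(sum((v - m) ** 2 for v in y) / n)
    z = [(v - m) / sd for v in y]
    g_y = sum(math.log(math.cosh(v)) for v in z) / n
    rng = random.Random(seed)
    g_ref = sum(math.log(math.cosh(rng.gauss(0, 1)))
                for _ in range(ref_samples)) / ref_samples
    return (g_y - g_ref) ** 2
```

A strongly sub- or super-Gaussian signal (e.g. a binary sequence) scores well above a Gaussian sample, which is the property the deflationary extraction maximizes.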
Abstract: The characteristics of ad hoc networks, and even their existence, depend on the nodes forming them. Thus, services and applications designed for ad hoc networks should adapt to this dynamic and distributed environment. In particular, multicast algorithms with reliability and scalability requirements should abstain from centralized approaches. We aspire to define a reliable and scalable multicast protocol for ad hoc networks, utilizing epidemic techniques for this purpose. In this paper, we present a brief survey of epidemic algorithms for reliable multicasting in ad hoc networks, and describe formulations and analytical results for simple epidemics. Then, the P2P anti-entropy algorithm for content distribution and our prototype simulation model are described, together with initial results demonstrating the behavior of the algorithm.
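A toy push-pull anti-entropy round structure, to illustrate how pairwise state reconciliation spreads content (a simplified simulation under our own assumptions, not the authors' prototype model):

```python
import random

def anti_entropy_rounds(num_nodes=64, seed=0, max_rounds=100):
    """Simulate push-pull anti-entropy: each round, every node
    reconciles its content set with one random peer. Returns the
    number of rounds until all nodes hold the update."""
    rng = random.Random(seed)
    content = [set() for _ in range(num_nodes)]
    content[0].add("update")              # one node starts with the content
    for round_no in range(1, max_rounds + 1):
        for i in range(num_nodes):
            j = rng.randrange(num_nodes - 1)
            if j >= i:
                j += 1                    # pick a peer other than i
            merged = content[i] | content[j]   # exchange missing items
            content[i] = set(merged)
            content[j] = set(merged)
        if all("update" in c for c in content):
            return round_no
    return max_rounds
```

Push-pull anti-entropy spreads an update to all n nodes in O(log n) rounds with high probability, which is the scalability property the survey builds on.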
Abstract: Travel demand forecasting, including the four travel choices, i.e., trip generation, trip distribution, modal split and traffic assignment, constitutes the core of transportation planning. In its current application, travel demand forecasting is associated with three important issues, i.e., interface inconsistencies among the four travel choices, inefficiency of commonly used solution algorithms, and undesirable multiple path solutions. In this paper, each of the three issues is extensively elaborated. An ideal unified framework for the combined model consisting of the four travel choices and variable demand functions is also suggested. Then, a few remarks are provided at the end of the paper.
Abstract: In this work a new method for low-complexity
image coding is presented that permits different settings and great
scalability in the generation of the final bit stream. This coding
scheme implements a continuous-tone still image compression system
that combines lossy and lossless compression, making use of
finite-arithmetic reversible transforms. Both the color-space
transformation and the wavelet transformation are reversible. The
transformed coefficients are coded by a system based on a
subdivision into smaller components (CFDS), similar to bit-importance
codification. The subcomponents so obtained are reordered by a
highly configurable alignment system, depending on the application,
which makes it possible to reconfigure the elements of the image and
to obtain different importance levels from which the bit stream will
be generated. The subcomponents of each importance level are coded
using a variable-length entropy coding system (VBLm) that permits
the generation of an embedded bit stream. This bit stream by itself
encodes a compressed still image; however, the use of a packing
system on the bit stream after the VBLm stage allows the production
of a final highly scalable bit stream from a basic image level and
one or several improvement levels.
Abstract: With the fast evolution of digital data exchange, the security of information becomes much more important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. In this paper, we analyze the Advanced Encryption Standard (AES) and add a key stream generator (A5/1, W7) to AES to improve the encryption performance, mainly for images characterised by reduced entropy. The implementation of both techniques has been realized for experimental purposes. Detailed results in terms of security analysis and implementation are given. A comparative study with traditional encryption algorithms shows the superiority of the modified algorithm.
Abstract: The wavelet transform is a very powerful tool for image compression. One of its advantages is the provision of both spatial and frequency localization of image energy. However, wavelet transform coefficients are defined by both a magnitude and a sign. While algorithms exist for efficiently coding the magnitude of the transform coefficients, they are not efficient for the coding of their sign. It is generally assumed that there is no compression gain to be obtained from the coding of the sign. Only recently have some authors begun to investigate the sign of wavelet coefficients in image coding. Some authors have assumed that the sign information bit of wavelet coefficients may be encoded with an estimated probability of 0.5; the same assumption concerns the refinement information bit. In this paper, we propose a new method for Separate Sign Coding (SSC) of wavelet image coefficients. The sign and the magnitude of wavelet image coefficients are examined to obtain their online probabilities. We use scalar quantization, in which whether the wavelet coefficient belongs to the lower or to the upper sub-interval of the uncertainty interval is also examined. We show that the sign information and the refinement information may be encoded with a probability of approximately 0.5 only after about five bit planes. Two maps are separately entropy encoded: the sign map and the magnitude map. The refinement information indicating whether the wavelet coefficient belongs to the lower or to the upper sub-interval of the uncertainty interval is also entropy encoded. An algorithm is developed and simulations are performed on three standard grey-scale images: Lena, Barbara and Cameraman. Five scales are computed using the biorthogonal 9/7 wavelet filter bank. The obtained results are compared to the JPEG2000 standard in terms of peak signal-to-noise ratio (PSNR) for the three images and in terms of subjective (visual) quality.
It is shown that the proposed method outperforms JPEG2000. The proposed method is also compared to other codecs in the literature and proves very successful in terms of PSNR.
Abstract: The design of weights is one of the important parts of
fuzzy decision making, as it has a deep effect on the evaluation
results. Entropy is one of the weight measures based on objective
evaluation. Non-probabilistic-type entropy measures for fuzzy sets
and interval type-2 fuzzy sets (IT2FS) have been developed and
applied to weight measurement. Since the entropy of IT2FS for
decision making is yet to be explored, this paper proposes a new
objective weight method using the entropy weight method for multiple
attribute decision making (MADM). This paper utilizes the nature of
the IT2FS concept in the evaluation process to assess attribute
weights based on the credibility of the data. An example is presented
to demonstrate the feasibility of the new method in decision making.
The entropy measure of interval type-2 fuzzy sets yields flexible
judgment and could be applied in decision-making environments.
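The classical (type-1, crisp) entropy weight method that this approach generalizes can be sketched as follows (textbook form; the IT2FS extension in the paper is more involved). It assumes a positive decision matrix in which at least one attribute varies across alternatives:

```python
import math

def entropy_weights(matrix):
    """Objective attribute weights from a positive decision matrix
    (rows = alternatives, columns = attributes):
      1. normalize each column to proportions p_ij,
      2. column entropy e_j = -sum_i p_ij ln(p_ij) / ln(m),
      3. weight w_j proportional to (1 - e_j).
    Attributes whose values barely differ carry almost no weight.
    """
    m = len(matrix)
    n = len(matrix[0])
    raw = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        total = sum(col)
        p = [x / total for x in col]
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        raw.append(1 - e)
    s = sum(raw)
    return [w / s for w in raw]
```

For example, an attribute with identical values for all alternatives has entropy 1 and thus weight 0, so all weight goes to the discriminating attribute.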
Abstract: Considering payload, reliability, security and operational lifetime as major constraints in the transmission of images, we put forward in this paper a steganographic technique implemented at the physical layer. We suggest the transmission of halftoned images (payload constraint) in wireless sensor networks to reduce the amount of transmitted data. For low-power and interference-limited applications, Turbo codes provide suitable reliability. Ensuring security is one of the highest priorities in many sensor networks, and the Turbo code structure, apart from providing forward error correction, can be utilized to provide encryption. We first consider the halftoned image; then the method of embedding a block of data (called the secret) in this halftoned image during the turbo encoding process is presented. The small modifications required at the turbo decoder end to extract the embedded data are presented next. The implementation complexity and the degradation of the BER (bit error rate) in the Turbo-based stego system are analyzed. Using some entropy-based cryptanalytic techniques, we show that the strength of our Turbo-based stego system approaches that found in OTPs (one-time pads).