Abstract: Probabilistic measures of uncertainty have been
obtained as functions of time and birth and death rates in a queuing
process. The variation of different entropy measures has been studied
in steady-state and non-steady-state queuing processes.
Abstract: In this paper a polymer electrolyte membrane (PEM)
fuel cell power system including burner, steam reformer, heat
exchanger and water heater has been considered to meet the
electrical, heating, cooling and domestic hot water loads of
a residential building located in Tehran. The system uses natural gas
as fuel and works in CHP mode. The design and operating conditions of
a PEM fuel cell system are considered in this study. The energy
requirements of the residential building and the number of fuel cell
stacks needed to meet them have been estimated. The method involves
exergy analysis and entropy generation throughout the months of the
year. Results show that all the energy needs of the building can be
met with 12 fuel cell stacks at a nominal capacity of 8.5 kW. Exergy
analysis of the CHP system shows that an increase in the ambient air
temperature from 1 °C to 40 °C raises entropy generation by 5.73%.
The maximum entropy generation, occurring at hour 15 on the 15th of
June and the 15th of July, is estimated at 12624 kW/K. The entropy
generation of this system over a year is estimated at
1004.54 GJ/K·year.
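The link between the exergy analysis and the reported entropy generation is the standard Gouy-Stodola relation (textbook form, not quoted from the paper; $T_0$ is the ambient, dead-state temperature):

```latex
\dot{S}_{\mathrm{gen}} = \frac{\dot{E}x_{\mathrm{dest}}}{T_0}
\qquad\Longleftrightarrow\qquad
\dot{E}x_{\mathrm{dest}} = T_0\,\dot{S}_{\mathrm{gen}}
```

so for a given rate of entropy generation, a higher ambient (dead-state) temperature directly scales the exergy destroyed.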
Abstract: Using entropy weight and TOPSIS method, a
comprehensive evaluation is done on the development level of
Chinese regional service industry in this paper. Firstly, based on
existing research results, an evaluation index system is constructed
from the scale of development, the industrial structure and the
economic benefits. An evaluation model is then built up based on
entropy weight and TOPSIS, and an empirical analysis is conducted on
the development level of service industries in 31 Chinese provinces
from 2006 to 2009 along the two dimensions of time series and
cross-section, which provides a new idea for assessing regional service
industries. Furthermore, the 31 provinces are classified into four
categories based on the evaluation results, and an in-depth analysis
of these results is carried out.
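The entropy-weight TOPSIS procedure described above can be sketched as follows (a minimal illustration assuming a decision matrix of alternatives × benefit-type criteria; the benefit-only simplification and function names are ours, not the paper's):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weights for a decision matrix X (rows: alternatives, cols: criteria).
    Criteria on which the alternatives barely differ get entropy near 1 and small weight."""
    P = X / X.sum(axis=0)                      # column-wise proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(n)    # normalized Shannon entropy per criterion
    d = 1.0 - e                                # degree of diversification
    return d / d.sum()

def topsis(X, w):
    """TOPSIS closeness-to-ideal scores (all criteria treated as benefit-type)."""
    V = X / np.sqrt((X ** 2).sum(axis=0)) * w  # weighted, vector-normalized matrix
    best, worst = V.max(axis=0), V.min(axis=0)
    d_pos = np.sqrt(((V - best) ** 2).sum(axis=1))   # distance to the ideal solution
    d_neg = np.sqrt(((V - worst) ** 2).sum(axis=1))  # distance to the anti-ideal
    return d_neg / (d_pos + d_neg)
```

Ranking the closeness scores orders the alternatives (here, provinces); cost-type criteria would first be inverted.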
Abstract: Because of the importance of energy, the optimization of
power generation systems is necessary. Gas turbine cycles are well
suited to fast power generation, but their efficiency is relatively
low. To achieve higher efficiencies, several measures are commonly
adopted, such as recovery of heat from exhaust gases in a regenerator,
use of an intercooler in a multistage compressor, and steam injection
into the combustion chamber. However, thermodynamic optimization of
the gas turbine cycle, even with the above components, remains
necessary. In this article, multi-objective
genetic algorithms are employed for Pareto approach optimization of
Regenerative-Intercooling-Gas Turbine (RIGT) cycle. In the multiobjective
optimization a number of conflicting objective functions
are to be optimized simultaneously. The objective functions considered
for optimization are the entropy generation of the RIGT cycle (Ns),
derived using exergy analysis and the Gouy-Stodola theorem, the
thermal efficiency, and the net output power of the cycle. These
objectives usually conflict with one another. The design variables
consist of thermodynamic parameters
such as compressor pressure ratio (Rp), excess air in combustion
(EA), turbine inlet temperature (TIT) and inlet air temperature (T0).
In the first stage, single-objective optimization is investigated, and
the Non-dominated Sorting Genetic Algorithm (NSGA-II) is then used for
multi-objective optimization. Optimization procedures are performed
for two and three objective functions, and the results for the RIGT
cycle are compared. To investigate the optimal thermodynamic behavior
of two objectives, different sets, each including two of the output
objectives, are considered individually, and for each set the Pareto
front is depicted. The sets of decision variables selected on this
Pareto front yield the best possible combinations of the corresponding
objective functions. No point on the Pareto front is superior to
another, but every point on the front is superior to any point off it.
For three-objective optimization, the results are given in tables.
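The dominance relation behind the Pareto fronts above can be sketched in a few lines (a minimal illustration for minimization objectives; full NSGA-II additionally sorts the population into successive fronts and applies crowding-distance selection):

```python
def dominates(a, b):
    """True if objective vector a dominates b: no worse in every objective
    and strictly better in at least one (minimization convention)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]
```

The points on the returned front are exactly those that no other point dominates, which is the sense in which no Pareto-optimal design is superior to another while all of them are superior to any dominated point.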
Abstract: We present in this paper a new approach to specific JPEG steganalysis and propose studying the statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve the statistics of the DCT and of the spatial domain, but they cannot preserve both while also controlling the alteration of the compressed data. We have noticed a deviation of the entropy of the compressed data after a first embedding. This deviation is greater when the image is a cover medium than when the image is a stego image. To observe this deviation, we introduced new statistical features and combined them with the Multiple Embedding Method. This approach is motivated by the avalanche criterion of the JPEG lossless compression step. This criterion makes it possible to design detectors whose detection rates are independent of the payload. Finally, we designed a Fisher-discriminant-based classifier for well-known steganographic algorithms: Outguess, F5, and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work with low embedding rates (< 10^-5), and according to the avalanche criterion of the RLE and Huffman compression step, its efficiency is independent of the quantity of hidden information.
Abstract: Several works on facial recognition have dealt with methods that identify isolated characteristics of the face or with templates that encompass several regions of it. In this paper a new technique is introduced which approaches the problem holistically, dispensing with the need to identify geometrical characteristics or regions of the face. The characterization of a face is achieved by randomly sampling selected attributes of the pixels of its image. From this information we construct a data set corresponding to the values of low frequencies, gradient, entropy and several other pixel characteristics of the image, generating a set of “p" variables. The multivariate data set is approximated with different polynomials minimizing the data fitness error in the minimax sense (L∞-norm). With the use of a Genetic Algorithm (GA) we are able to circumvent the problem of dimensionality inherent to higher-degree polynomial approximations. The GA yields the degree and the values of a set of coefficients of the polynomials approximating the image of a face. The system is trained by finding, through a resampling process, a family of characteristic polynomials in several variables (pixel characteristics) for each face (say Fi) in the database. A face (say F) is recognized by finding its characteristic polynomials and applying an AdaBoost classifier from F's polynomials to each of the Fi's polynomials. The winner is the polynomial family closest to F's, corresponding to the target face in the database.
Abstract: We report on a high-speed quantum cryptography
system that utilizes simultaneous entanglement in polarization and in
“time-bins". With multiple degrees of freedom contributing to the
secret key, we can achieve over ten bits of random entropy per
detected coincidence. In addition, we collect from multiple spots of
the downconversion cone to further amplify the data rate, allowing us
to achieve over 10 Mbits of secure key per second.
Abstract: In this paper we present a novel technique for data
hiding in binary document images. We use the concept of entropy in
order to identify document specific least distortive areas throughout
the binary document image. The document image is treated as any
other image and the proposed method utilizes the standard document
characteristics for the embedding process. The proposed method
minimizes perceptual distortion due to embedding and allows
watermark extraction without requiring any side information at the
decoder end.
Abstract: The mechanical properties of blends consisting of
plasticized poly(vinyl butyral) (PVB) and plasticized poly(vinyl
chloride) (PVC) are studied, in order to evaluate the possibility of
using recycled PVB waste derived from windshields. PVC was
plasticized with 38% of diisononyl phthalate (DINP), while PVB was
plasticized with 28% of triethylene glycol, bis(2-ethylhexanoate)
(3GO). The optimal process conditions for the PVB/PVC blend in 1:1
ratio were determined. Entropy was used to theoretically predict the
blends' miscibility. The PVB content of the blend compositions ranged
from 0 to 100%. Tensile strength and strain were tested. In addition,
a comparison between recycled and original PVB, used as constituents
of the blend, was performed.
Abstract: In this paper, we propose a novel approach for image
segmentation via fuzzification of the Rényi Entropy of Generalized
Distributions (REGD). The fuzzy REGD is used to precisely measure
the structural information of the image and to locate the optimal
threshold desired for segmentation. The proposed approach draws
upon the postulation that the optimal threshold concurs with
maximum information content of the distribution. The contributions of
the paper are as follows. First, the fuzzy REGD is introduced as a
measure of the spatial structure of an image. Then, we propose an
efficient entropic segmentation approach using the fuzzy REGD.
Although the proposed approach belongs to the class of entropic
segmentation approaches, which are commonly applied to grayscale
images, it is adapted to be viable for segmenting color images as
well. Lastly, diverse experiments on real images are carried out that
show the superior performance of the proposed method.
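A plain (non-fuzzy) version of entropic thresholding with the Rényi entropy can be sketched as follows; it illustrates only the underlying maximum-entropy-sum principle, not the paper's fuzzified REGD (the order α = 0.5 and the histogram interface are our assumptions):

```python
import numpy as np

def renyi_threshold(hist, alpha=0.5):
    """Return the gray level t maximizing the sum of the Rényi entropies of
    the below- and above-threshold class distributions of a histogram."""
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue  # skip degenerate splits with an empty class
        h0 = np.log(((p[:t] / w0) ** alpha).sum()) / (1.0 - alpha)
        h1 = np.log(((p[t:] / w1) ** alpha).sum()) / (1.0 - alpha)
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t
```

On a bimodal histogram the maximizer falls between the two modes, which is the "maximum information content" postulate the abstract refers to.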
Abstract: Chiu's method, which generates a Takagi-Sugeno Fuzzy Inference System (FIS), is a method of fuzzy rule extraction. The rule outputs are linear functions of the inputs; moreover, these rules are not explicit for the expert. In this paper, we develop a method which generates a Mamdani FIS, where the rule outputs are fuzzy. The method proceeds in two steps: first, it uses the subtractive clustering principle to estimate both the number of clusters and the initial locations of the cluster centers. Each obtained cluster corresponds to a Mamdani fuzzy rule. Then, it optimizes the fuzzy model parameters by applying a genetic algorithm. The method is illustrated on a traffic network management application. We also suggest a Mamdani fuzzy rule generation method for the case where the expert wants to classify the output variables into some predefined fuzzy classes.
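The subtractive clustering step can be sketched as follows (a minimal illustration of Chiu's potential-based center selection; the single acceptance ratio, the neighborhood radius `ra`, and the squash radius 1.5·`ra` are conventional defaults that simplify the full accept/reject logic):

```python
import numpy as np

def subtractive_centers(X, ra=0.5, accept=0.5):
    """Chiu-style subtractive clustering sketch: each point's potential is a
    sum of Gaussian kernels over all points; repeatedly pick the highest-
    potential point as a center and subtract its influence from the rest."""
    alpha = 4.0 / ra ** 2                # kernel width for the potential
    beta = 4.0 / (1.5 * ra) ** 2         # wider kernel for the subtraction
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    P = np.exp(-alpha * d2).sum(axis=1)  # initial potentials
    first = P.max()
    centers = []
    while True:
        k = int(P.argmax())
        if P[k] < accept * first:        # remaining potential too low: stop
            break
        centers.append(X[k])
        P = P - P[k] * np.exp(-beta * ((X - X[k]) ** 2).sum(-1))
    return np.array(centers)
```

Each returned center would seed one Mamdani rule, whose membership-function parameters are then tuned by the genetic algorithm.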
Abstract: A new method for low-complexity image coding is presented that permits different settings and great scalability in the generation of the final bit stream. This coding presents a continuous-tone still-image compression system that combines lossy and lossless compression, making use of finite-arithmetic reversible transforms. Both the color-space transformation and the wavelet transformation are reversible. The transformed coefficients are coded by means of a coding system based on a subdivision into smaller components (CFDS), similar to bit-importance codification. The subcomponents so obtained are reordered by means of a highly configurable alignment system, depending on the application, that makes it possible to reconfigure the elements of the image and to obtain different levels of importance from which the bit stream will be generated. The subcomponents of each level of importance are coded using a variable-length entropy coding system (VBLm) that permits the generation of an embedded bit stream. This bit stream itself constitutes a bit stream that codes a compressed still image. However, the use of a packing system on the bit stream after the VBLm stage allows the construction of a final, highly scalable bit stream from a basic image level and one or several enhancement levels.
Abstract: Polarization modulation infrared reflection absorption
spectroscopy (PM-IRRAS) in combination with electrochemistry,
was employed to study the influence of surface charge (potential) on
the kinetics of bovine serum albumin (BSA) adsorption on a
biomedical-grade 316LVM stainless steel surface. The BSA adsorption
kinetics was found to depend greatly on the surface potential. As the
surface potential shifted towards more negative values, both the BSA
initial adsorption rate and the equilibrium (saturated) surface
concentration increased. Both
effects were explained on the basis of replacement of well-ordered
water molecules at the 316LVM / solution interface, i.e. by the
increase in entropy of the system.
Abstract: The entropy of intuitionistic fuzzy sets is used to indicate the degree of fuzziness of an interval-valued intuitionistic fuzzy set (IvIFS). In this paper, we deal with the entropies of IvIFSs. Firstly, we propose a family of entropies on IvIFSs with a parameter λ ∈ [0, 1], which generalizes two entropy measures for IvIFSs defined independently by Zhang and Wei, and we prove that the new entropy is an increasing function with respect to the parameter λ. Furthermore, a new multiple attribute decision making (MADM) method using entropy-based attribute weights is proposed to deal with decision making situations where the alternatives on attributes are expressed by IvIFSs and the attribute weight information is unknown. Finally, a numerical example is given to illustrate the applications of the proposed method.
Abstract: In the literature of fuzzy measures, there exist many
well known parametric and non-parametric measures, each with its
own merits and limitations. But our main emphasis is on
applications of these measures to a variety of disciplines. To extend
the scope of applications of these fuzzy measures to geometry, we
need some special fuzzy measures. In this communication, we introduce
two new fuzzy measures involving trigonometric functions and provide
their applications to obtain basic results already existing in the
literature of geometry.
Abstract: Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to automatically remove the artifacts, in particular from Electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for the artifact extraction and on higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we try to enhance this technique by proposing a new method based on Rényi's entropy. The performance of our method was tested and compared to the performance of the method in the literature, and the former proved to outperform the latter.
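The Rényi entropy used to score the independent components can be estimated, for illustration, from a sample histogram (the default order α = 2, the bin count, and the estimator choice are our assumptions, not the paper's):

```python
import numpy as np

def renyi_entropy(x, alpha=2.0, bins=64):
    """Histogram-based estimate of the order-alpha Rényi entropy of a 1-D signal."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                               # drop empty bins: 0*log(0) := 0
    if alpha == 1.0:
        return float(-(p * np.log(p)).sum())   # Shannon limit of the Rényi family
    return float(np.log((p ** alpha).sum()) / (1.0 - alpha))
```

In an artifact-marking scheme of this kind, an ICA component whose entropy deviates strongly from that of the other components is flagged as an artifact candidate, analogously to the kurtosis-based marker.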
Abstract: This paper unifies power optimization approaches in
various energy converters, such as: thermal, solar, chemical, and
electrochemical engines, in particular fuel cells. Thermodynamics
leads to the converter's efficiency and limiting power. Efficiency
equations serve to solve problems of upgrading and downgrading of
resources. While optimization of steady systems applies the
differential calculus and Lagrange multipliers, dynamic optimization
involves variational calculus and dynamic programming. In reacting
systems chemical affinity constitutes a prevailing component of an
overall efficiency, thus the power is analyzed in terms of an active
part of chemical affinity. The main novelty of the present paper in the
energy yield context consists in showing that the generalized heat
flux Q (involving the traditional heat flux q plus the product of
temperature and the sum of products of partial entropies and fluxes of
species) plays, in complex cases (solar, chemical and
electrochemical), the same role as the traditional heat q in pure heat
engines.
The presented methodology is also applied to power limits in fuel
cells, treated as systems which are electrochemical flow engines
propelled by chemical reactions. The performance of fuel cells is
determined by the magnitudes and directions of the participating
streams and by the mechanism of
electric current generation. Voltage lowering below the reversible
voltage is a proper measure of the cells' imperfection. The voltage
losses, called polarization, include the contributions of three main
sources: activation, ohmic and concentration. Examples show power
maxima in fuel cells and prove the relevance of the extension of
thermal machine theory to chemical and electrochemical systems. The
main novelty of the present paper in the FC context consists in
introducing an effective or reduced Gibbs free energy change between
products p and reactants s which takes into account the decrease of
voltage and power caused by the incomplete conversion of the overall
reaction.
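Reading the parenthetical description of Q above literally, the generalized heat flux takes the form (our notation: $s_k$ the partial entropy and $J_k$ the flux of species $k$; a sketch of the stated definition, not the paper's exact symbols):

```latex
Q \;=\; q \;+\; T \sum_{k} s_k\, J_k
```

With Q in place of q, the power-limit structure of pure heat engines carries over to the solar, chemical and electrochemical converters.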
Abstract: In this paper we present an adaptive method for image
compression that is based on the complexity level of the image. The
basic compressor/de-compressor structure of this method is a multilayer
perceptron artificial neural network. In the adaptive approach,
different back-propagation artificial neural networks are used as
compressor and de-compressor; this is done by dividing the image into
blocks, computing the complexity of each block, and then selecting one
network for each block according to its complexity value. Three
complexity measures, called Entropy, Activity and Pattern-based, are
used to determine the level of complexity in image blocks, and their
ability in complexity estimation is evaluated and compared. In
training and evaluation, each image block is assigned to a network
based on its complexity value. Best-SNR is another alternative for
selecting the compressor network for image blocks in the evaluation
phase; it chooses the trained network that yields the best SNR in
compressing the input image block. In our evaluations, the best
results are obtained when overlapping of the blocks is allowed and the
compressor networks are chosen based on Best-SNR. In this case, the
results demonstrate the superiority of this method compared with
previous similar works and with JPEG standard coding.
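The entropy-based complexity routing can be sketched as follows (a minimal illustration; the 8×8 non-overlapping blocks and the two complexity cut points are our assumptions, standing in for the paper's trained network selection):

```python
import numpy as np

def block_entropy(block, levels=256):
    """Shannon entropy (bits) of a grayscale block, used as a complexity score."""
    hist, _ = np.histogram(block, bins=levels, range=(0, levels))
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

def assign_networks(image, block=8, cuts=(2.0, 5.0)):
    """Tile the image into blocks and map each block's entropy to a network
    index (0 = simplest compressor) via the complexity cut points."""
    h, w = image.shape
    ids = []
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            e = block_entropy(image[r:r + block, c:c + block])
            ids.append(sum(e > cut for cut in cuts))  # count of cuts exceeded
    return ids
```

Low-entropy (smooth) blocks are routed to the lightest network, while high-entropy (busy) blocks go to networks trained on complex content.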
Abstract: Within the framework of information theory, a
statistical-probabilistic model is offered for the determination of
cause-and-effect relations in coupled multicomponent subsystems. A
quantitative parameter, defined through conditional and unconditional
entropy functions, is introduced. The method is applied to the
analysis of experimental data on the dynamics of change in the
chemical element composition of plant organs (roots, reproductive
organs, leaves and stems). The experiment is directed at studying the
temporal processes of primary soil formation and their connection with
the redistribution dynamics of chemical elements in plant organs. This
statistical-probabilistic model also allows the directions of the
information streams among plant organs to be specified quantitatively
and unambiguously.
Abstract: In this work a novel approach for color image
segmentation is discussed, using higher-order entropy as a textural
feature for the determination of thresholds over a two-dimensional
image histogram. A similar approach is applied to achieve multi-level
thresholding in both grayscale and color images. The paper discusses
two methods of color image segmentation using RGB space as the
standard processing space. The threshold for segmentation is decided
by the maximization of conditional entropy in the two dimensional
histogram of the color image separated into three grayscale images of
R, G and B. The features are first developed independently for the
three (R, G, B) spaces and then combined to obtain the segmentation of
the different color components. Considering local maxima instead of
the global maximum of the conditional entropy yields multiple
thresholds for the same image, which forms the basis for multilevel
thresholding.
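A grayscale sketch of the two-dimensional entropic thresholding idea (each pixel value paired with its 3×3 local mean) might look like this; the quantization to 32 levels, the Shannon entropy, and the two-quadrant objective are our simplifications of the conditional-entropy criterion:

```python
import numpy as np

def local_mean(img):
    """3x3 neighborhood mean of a grayscale image (edge-padded)."""
    pad = np.pad(img.astype(float), 1, mode="edge")
    return sum(pad[r:r + img.shape[0], c:c + img.shape[1]]
               for r in range(3) for c in range(3)) / 9.0

def threshold_2d(img, levels=32):
    """Pick (s, t) maximizing the summed Shannon entropies of the background
    and foreground quadrants of the 2-D (gray, local-mean) histogram; the
    off-diagonal quadrants (mostly edge pixels) are ignored, as is standard."""
    g = (img.astype(float) * levels / 256).astype(int).clip(0, levels - 1)
    m = (local_mean(img) * levels / 256).astype(int).clip(0, levels - 1)
    P, _, _ = np.histogram2d(g.ravel(), m.ravel(), bins=levels,
                             range=[[0, levels], [0, levels]])
    P /= P.sum()

    def ent(Q):
        w = Q.sum()
        if w <= 0:
            return -np.inf                 # empty class: reject this split
        q = Q[Q > 0] / w
        return float(-(q * np.log(q)).sum())

    best, best_h = (0, 0), -np.inf
    for s in range(1, levels):
        for t in range(1, levels):
            h = ent(P[:s, :t]) + ent(P[s:, t:])
            if h > best_h:
                best, best_h = (s, t), h
    return best
```

Running this independently on the R, G and B planes and combining the resulting binary maps gives the kind of color-component segmentation described above.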