Abstract: Software entropy metrics for bug prediction have been validated on various software systems by different researchers. In our previous research, we validated that software entropy metrics calculated for Mozilla subsystems predict future bugs reasonably well. In this study, the same entropy metrics are calculated for a subsystem of Android, and we observe that they are not suitable for bug prediction there. The results are compared with those for a Mozilla subsystem, and the two software systems are contrasted to determine why software entropy metrics are not applicable to Android.
Abstract: Nowadays, robust and secure watermarking algorithms and their optimization have become the need of the hour. A watermarking algorithm is presented to achieve copyright protection for the owner based on visual cryptography, the histogram shape property, and entropy. Both the host image and the watermark are preprocessed: the host image with a Butterworth filter, and the watermark with visual cryptography. Applying visual cryptography to the watermark generates two shares; one share is used for embedding the watermark, and the other is used for resolving any dispute with the aid of a trusted authority. The use of the histogram shape makes the process more robust against geometric and signal-processing attacks. The combination of visual cryptography, the Butterworth filter, the histogram, and entropy makes the algorithm more robust and imperceptible while providing copyright protection for the owner.
Abstract: Three-dimensional incompressible turbulent fluid flow and heat transfer in pin-fin heat sinks using air as the cooling fluid are studied numerically. Two kinds of pin fins, with circular and square cross sections, are compared in thermal performance, each in both in-line and staggered arrangements. The turbulent governing equations are solved using a control-volume-based finite-difference method. Numerical computations are performed with the realizable k–ε turbulence model for the parameters studied: the fin height H, fin diameter D, and Reynolds number Re in the ranges 7 ≤ H ≤ 10, 0.75 ≤ D ≤ 2, and 2000 ≤ Re ≤ 126000, respectively. The numerical results are validated against available experimental data in the literature, with good agreement. The results indicate that circular pin fins are more streamlined than square pin fins: their pressure drop is smaller, but their heat transfer is not as good. The thermal performance of staggered pin fins is better than that of in-line pin fins because the staggered arrangement produces larger flow disturbance. Both arrangements show the same qualitative behavior for thermal resistance, pressure drop, and entropy generation.
Abstract: By means of the ultrafast X-ray tomography facility, data were obtained at different superficial gas velocities UG in a bubble column (0.1 m in ID) operated with an air-deionized water system at ambient conditions. Raw reconstructed images were treated by both the information entropy (IE) and the reconstruction entropy (RE) algorithms in order to identify the main transition velocities in a bubble column. The IE values exhibited two well-pronounced minima at UG=0.025 m/s and UG=0.085 m/s identifying the boundaries of the homogeneous, transition and heterogeneous regimes. The RE extracted from the central region of the column’s cross-section exhibited only one characteristic peak at UG=0.03 m/s, which was attributed to the transition from the homogeneous to the heterogeneous flow regime. This result implies that the transition regime is non-existent in the core of the column.
Abstract: Although reliability is an important attribute of quality, especially for mission-critical systems, no versatile model exists even today for the reliability assessment of component-based software. The existing black-box models make various assumptions that may not always be realistic and may be quite contrary to the actual behaviour of the software. They focus on observing how the system behaves without considering the structure of the system, the components composing it, their interconnections, dependencies, usage frequencies, etc. As a result, the entropy (uncertainty) in assessments using these models is quite high. Although some models are based on the operational profile, it is sometimes extremely difficult to obtain the exact operational profile for a given operation. This paper discusses the drawbacks, deficiencies and limitations of black-box approaches from the perspective of various authors and finally proposes a conceptual model for the reliability assessment of software.
Abstract: Although most digital cameras acquire images in a raw format, based on a Color Filter Array that arranges RGB color filters on a square grid of photosensors, most image compression techniques do not use the raw data; instead, they use the RGB result of interpolating the raw data. This approach is inefficient: by performing a lossless compression of the raw data, followed by pixel interpolation, digital cameras could be more power efficient and provide images with increased resolution, given that the interpolation step could be shifted to an external processing unit. In this paper, we conduct a survey on the use of lossless compression algorithms with raw Bayer images. Moreover, in order to reduce the effect of the transitions between colors, which increase the entropy of the raw Bayer image, we split the image into three new images corresponding to each channel (red, green and blue) and study the same compression algorithms applied to each one individually. This simple pre-processing stage allows an improvement of more than 15% in predictive-based methods.
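The channel-splitting pre-processing described above can be sketched as follows. The RGGB layout and the function name are illustrative assumptions, since the abstract does not fix a particular Bayer pattern:

```python
import numpy as np

def split_bayer_rggb(raw):
    """Split a Bayer mosaic (assumed RGGB pattern) into per-channel images.

    Grouping same-color samples removes the color-to-color transitions
    that inflate the entropy of the interleaved mosaic, which is the
    pre-processing idea described in the abstract.
    """
    r = raw[0::2, 0::2]                     # red samples
    g = np.stack((raw[0::2, 1::2],          # green samples (two per 2x2 cell)
                  raw[1::2, 0::2]))
    b = raw[1::2, 1::2]                     # blue samples
    return r, g, b
```

Each resulting channel image can then be fed to a lossless compressor individually.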
Abstract: This paper presents a multiscale information measure of the electroencephalogram (EEG) for analysis with a short data length. The multiscale extension of permutation entropy (MPE) is capable of fully reflecting the dynamical characteristics of EEG across different temporal scales. However, MPE yields an imprecise estimation due to the coarse-graining procedure at large scales. We present an improved MPE measure that estimates entropy more accurately from a short time series. By computing the entropies of all coarse-grained time series and averaging them at each scale, the modified MPE (MMPE) provides enhanced accuracy compared to MPE. Simulation and experimental studies confirm that MMPE outperforms MPE in terms of accuracy.
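A minimal sketch of the averaging idea behind MMPE, assuming ordinal-pattern permutation entropy as the base measure; the function names and the default embedding dimension m=3 are illustrative choices, not taken from the paper:

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, m=3):
    """Normalized Shannon entropy of the ordinal patterns of length m."""
    counts = {}
    for i in range(len(x) - m + 1):
        key = tuple(np.argsort(x[i:i + m]))
        counts[key] = counts.get(key, 0) + 1
    total = sum(counts.values())
    probs = [c / total for c in counts.values()]
    return -sum(p * log(p) for p in probs) / log(factorial(m))

def mmpe(x, scale, m=3):
    """Modified MPE: instead of one coarse-grained series per scale,
    average the permutation entropies of all `scale` series obtained by
    shifting the starting point -- the step the abstract credits with
    improved accuracy for short recordings."""
    x = np.asarray(x, float)
    ents = []
    for k in range(scale):
        n = (len(x) - k) // scale
        cg = x[k:k + n * scale].reshape(n, scale).mean(axis=1)
        ents.append(permutation_entropy(cg, m))
    return float(np.mean(ents))
```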
Abstract: The gas holdup fluctuations in a bubble column (0.15 m in ID) have been recorded by means of a conductivity wire-mesh sensor in order to extract information about the main transition velocities. These parameters are very important for bubble column design, operation and scale-up. For this purpose, the classical definition of the Shannon entropy was modified and used to identify both the onset (at UG=0.034 m/s) of the transition flow regime and the beginning (at UG=0.089 m/s) of the churn-turbulent flow regime. The results were compared with the Kolmogorov entropy (KE) results. A slight discrepancy was found, namely the transition velocities identified by means of the KE were shifted to somewhat higher (0.045 and 0.101 m/s) superficial gas velocities UG.
Abstract: Rough set theory handles uncertainty and incomplete information by means of two approximation sets, the lower approximation and the upper approximation. In this paper, rough clustering is improved by adopting similarity-, dissimilarity–similarity- and entropy-based initial centroid selection, yielding three clustering algorithms, namely Entropy based Rough K-Means (ERKM), Similarity based Rough K-Means (SRKM) and Dissimilarity-Similarity based Rough K-Means (DSRKM), which were developed and executed on the yeast dataset. The rough clustering algorithms are validated by cluster validity indexes, namely the Rand and Adjusted Rand indexes. Experimental results show that the ERKM clustering algorithm performs effectively and delivers better results than the other clustering methods. Outlier detection is an important task in data mining; outliers are very different from the rest of the objects in the clusters. The Entropy based Rough Outlier Factor (EROF) method is suitable for detecting outliers effectively in the yeast dataset. In the rough K-Means method, tuning the epsilon (ε) value from 0.8 to 1.08 can detect outliers in the boundary region, and the RKM algorithm delivers better results when the value of epsilon (ε) is chosen in this range. Experimental results show that the EROF method on the clustering algorithm performs very well and is suitable for detecting outliers effectively for all datasets. Further, experimental readings show that the ERKM clustering method outperforms the other methods.
Abstract: Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision tree paradigm that use the AUC as the metric to select an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the entropy-based patient-specific decision path (PSDP) and CART methods, the AUC-based patient-specific decision path models performed equivalently on area under the ROC curve (AUC). Our results provide support for patient-specific methods being a promising approach for making clinical predictions.
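AUC-based attribute selection can be sketched under simple assumptions (binary labels, numeric attributes, pairwise-comparison AUC); the names `auc_score` and `best_attribute` are hypothetical, not from the paper:

```python
import numpy as np

def auc_score(values, labels):
    """AUC of ranking binary labels by an attribute's values:
    the probability that a random positive outranks a random negative
    (ties count as half)."""
    pos = values[labels == 1]
    neg = values[labels == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def best_attribute(X, y):
    """Pick the attribute (column of X) with the highest AUC, the
    selection criterion the abstract substitutes for entropy."""
    scores = [auc_score(X[:, j], y) for j in range(X.shape[1])]
    return int(np.argmax(scores)), max(scores)
```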
Abstract: Multiple Sclerosis (MS) is a disease that affects the central nervous system and causes balance problems. In clinical practice, this disorder is usually evaluated using static posturography. Linear or nonlinear measures extracted from the posturographic data (i.e. the center of pressure, COP) recorded during a balance test have been used to analyze the postural control of MS patients. In this study, two nonlinear parameters, the trend (TREND) and the sample entropy (SampEn), were chosen to investigate their relationships with the expanded disability status scale (EDSS) score. Forty volunteers with different EDSS scores participated in our experiments with eyes open (EO) and eyes closed (EC). TREND and two types of SampEn (SampEn1 and SampEn2) were calculated for each combined COP position signal. The results show that TREND had a weak negative correlation with EDSS, while SampEn2 had a strong positive correlation with EDSS. Compared to TREND and SampEn1, SampEn2 showed a more significant correlation with EDSS and an ability to discriminate the MS patients in the EC case. In addition, the outcome of the study suggests that multi-dimensional nonlinear analysis could provide information about the impact of disability progression in MS on the dynamics of the COP data.
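Sample entropy, the regularity measure used above, can be computed as follows. This is a simplified textbook-style sketch (embedding dimension m=2, tolerance r = 0.2 × standard deviation), not the exact parameterization behind SampEn1 and SampEn2, which the abstract does not specify:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn: -log of the conditional probability that sequences close
    for m points (within tolerance r, Chebyshev distance) remain close
    for m+1 points.  Lower values indicate a more regular signal."""
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * x.std()
    def matches(dim):
        templ = np.array([x[i:i + dim] for i in range(len(x) - dim)])
        count = 0
        for i in range(len(templ)):
            d = np.max(np.abs(templ - templ[i]), axis=1)
            count += np.sum(d <= r) - 1      # exclude the self-match
        return count
    B, A = matches(m), matches(m + 1)
    return float(-np.log(A / B)) if A > 0 and B > 0 else float("inf")
```

A regular signal such as a sinusoid yields a lower SampEn than white noise of the same length, which is the property exploited when relating SampEn to postural control.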
Abstract: In this numerical work, mixed convection and entropy generation of a Cu–water nanofluid in a lid-driven square cavity are investigated using the Lattice Boltzmann Method. The horizontal walls of the cavity are adiabatic, and the vertical walls are held at constant but different temperatures. The top wall moves from left to right at a constant speed, U0. The effects of different parameters, such as nanoparticle volume concentration (0–0.05), Rayleigh number (10^4–10^6) and Reynolds number (1, 10 and 100), on the entropy generation, flow and temperature fields are studied. The results show that the addition of nanoparticles to the base fluid affects the entropy generation, flow pattern and thermal behavior, especially at higher Rayleigh and lower Reynolds numbers. For the pure fluid as well as the nanofluid, increasing the Reynolds number linearly increases the average Nusselt number and the total entropy generation. The maximum entropy generation occurs in the nanofluid at low Rayleigh number and high Reynolds number; the minimum occurs in the pure fluid at low Rayleigh and Reynolds numbers. Also, at higher Reynolds number the effect of the Cu nanoparticles on heat transfer enhancement decreases because the effect of the lid-driven cavity increases. The present results are validated by favorable comparisons with previously published results, and are presented in graphical and tabular forms and discussed.
Abstract: We introduce a new model, the Marshall-Olkin Rayleigh distribution, which extends the Rayleigh distribution using the Marshall-Olkin transformation and has both increasing and decreasing shapes for the hazard rate function. Various structural properties of the new distribution are derived, including explicit expressions for the moments, the generating and quantile functions, some entropy measures, and order statistics. The model parameters are estimated by the method of maximum likelihood, and the observed information matrix is determined. The potential of the new model is illustrated by means of a simulation study.
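A minimal sketch of the Marshall-Olkin transformation applied to the Rayleigh baseline; the parameterization (tilt parameter alpha, Rayleigh scale sigma) follows the standard textbook form of the transformation, and the function names are illustrative:

```python
import numpy as np

def mo_rayleigh_sf(x, alpha, sigma):
    """Survival function of the Marshall-Olkin Rayleigh distribution:
    S_MO(x) = alpha * S(x) / (1 - (1 - alpha) * S(x)),
    with baseline Rayleigh survival S(x) = exp(-x^2 / (2 sigma^2))."""
    s = np.exp(-np.asarray(x, float) ** 2 / (2 * sigma**2))
    return alpha * s / (1 - (1 - alpha) * s)

def mo_rayleigh_pdf(x, alpha, sigma):
    """Density obtained by differentiating -S_MO(x)."""
    x = np.asarray(x, float)
    s = np.exp(-x**2 / (2 * sigma**2))
    f = (x / sigma**2) * s                   # baseline Rayleigh pdf
    return alpha * f / (1 - (1 - alpha) * s) ** 2
```

Setting alpha = 1 recovers the plain Rayleigh distribution; varying alpha is what lets the hazard rate take both increasing and decreasing shapes.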
Abstract: This paper treats different aspects of the entropy measure in classical information theory and statistical quantum mechanics, and presents the possibility of extending the definition of Von Neumann entropy to image and array processing. In the first part, we generalize the quantum entropy using the singular values of arbitrary rectangular matrices to measure randomness and the quality of a denoising operation; this new definition of entropy can be used to compare the performance of filtering methods. In the second part, we apply the concept of a pure state in the quantum formalism to generalize the maximum entropy method for the narrowband, far-field source localization problem. Several computer simulation results are presented to demonstrate the effectiveness of the proposed techniques.
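A sketch of the singular-value generalization described above: the squared singular values of an arbitrary rectangular matrix, normalized to sum to one, play the role of density-matrix eigenvalues in the Von Neumann formula. This exact normalization is our assumption, since the abstract does not spell it out:

```python
import numpy as np

def singular_value_entropy(A):
    """Von Neumann-style entropy -sum(p * log p) of the normalized
    squared singular values of a (possibly rectangular) matrix A."""
    s = np.linalg.svd(np.asarray(A, float), compute_uv=False)
    p = s**2 / np.sum(s**2)
    p = p[p > 0]                 # 0 * log 0 is taken as 0
    return float(-np.sum(p * np.log(p)))
```

Under this definition a noisy image has a flatter singular-value spectrum and hence a higher entropy, which is how the measure can score the quality of a denoising operation.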
Abstract: The standard Gibbs energy of formation ΔG_for(298.15) of lanthanide-iron double oxides with the garnet-type crystal structure R₃Fe₅O₁₂ (RIG, where R is a rare earth ion) from the initial oxides is evaluated. The calculation is based on the standard entropies S_298.15 and standard enthalpies of formation ΔH_298.15 of the compounds involved in the garnet synthesis. The Gibbs energy of formation is presented as a function of temperature, ΔG_for(T), for the range 300–1600 K. The necessary starting thermodynamic data were obtained from calorimetric studies of the heat capacity as a function of temperature and by using a semi-empirical method for calculating the enthalpies of formation ΔH_298.15. The thermodynamic functions at the standard temperature (enthalpy, entropy and Gibbs energy) are recommended as reference data for technological evaluations. Across the structural series of rare earth-iron garnets, the correlation between the thermodynamic properties and the characteristics of the lanthanide ions is elucidated.
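The evaluation described above rests on the standard thermodynamic relation combining the tabulated enthalpies and entropies of formation, stated here for reference (taking ΔH_for and ΔS_for as temperature-independent is the simplest form; the paper refines it with measured heat-capacity functions):

```latex
\Delta G_{\mathrm{for}}(T) = \Delta H_{\mathrm{for}} - T\,\Delta S_{\mathrm{for}},
\qquad
\Delta S_{\mathrm{for}} = \sum_{\text{products}} S^{\circ}_{298.15}
  - \sum_{\text{reactants}} S^{\circ}_{298.15}.
```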
Abstract: The particle size distribution, the most important characteristic of aerosols, is obtained through electrical characterization techniques. The dynamics of charged nanoparticles under the influence of the electric field in an Electrical Mobility Spectrometer (EMS) reveals the size distribution of these particles. The accuracy of this measurement is influenced by the flow conditions, geometry, electric field and particle charging process, and therefore by the transfer function (transfer matrix) of the instrument. In this work, a wire-cylinder corona charger was designed, and the combined field-diffusion charging process of injected poly-disperse aerosol particles was numerically simulated as a prerequisite for the study of a multichannel EMS. The result, a cloud of particles with a non-uniform charge distribution, was introduced into the EMS. The flow pattern and electric field in the EMS were simulated using Computational Fluid Dynamics (CFD) to obtain the particle trajectories in the device and thus calculate the signal reported by each electrometer. Based on the output signals (resulting from the bombardment of particles and the transfer of their charges as currents), we proposed a modification to the size of the detecting rings (which are connected to the electrometers) in order to evaluate particle size distributions more accurately. Based on the capability of the system to transfer information about the size distribution of the injected particles, we proposed a benchmark for assessing the optimality of the design. This method applies the concept of Von Neumann entropy and borrows the definition of entropy from information theory (Shannon entropy) to measure optimality; in Shannon's sense, entropy is the "average amount of information contained in an event, sample or character extracted from a data stream". Among the various configurations of detecting rings whose responses (signals) were evaluated, the modified configuration gave the best predictions of the size distributions of the injected particles; it was also the one with the maximum amount of entropy. A reasonable consistency was observed between the accuracy of the predictions and the entropy content of each configuration. In this method, the entropy is extracted from the transfer matrix of the instrument for each configuration. Ultimately, various clouds of particles were introduced into the simulations, and the predicted size distributions were compared to the exact size distributions.
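The Shannon benchmark quoted above can be illustrated with a short sketch. Treating each ring's normalized signal as a probability is our simplification of the transfer-matrix-based computation in the text:

```python
import numpy as np

def signal_entropy(signals):
    """Shannon entropy (in bits) of the normalized electrometer signals.

    A ring configuration that spreads the incoming particles more evenly
    across its detectors carries more information about the size
    distribution, so a higher entropy indicates a better configuration
    under the benchmark described in the abstract."""
    p = np.asarray(signals, float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))
```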
Abstract: This study uses, as its research subjects, patients who had undergone total knee replacement surgery, drawn from the database of the National Health Insurance Administration. Through a review of the literature and interviews with physicians, important factors were selected after careful screening. Then, using the Cross Entropy Method, Genetic Algorithm Logistic Regression, and Particle Swarm Optimization, the weight of each factor was calculated. In addition, Excel VBA and Case Based Reasoning were combined and adopted to evaluate the system. Results show no significant difference between Genetic Algorithm Logistic Regression and Particle Swarm Optimization, with over 97% accuracy for both methods; both ROC areas are above 0.87. This study can provide a critical reference for medical personnel in clinical assessment, effectively enhance the quality and efficiency of medical care, prevent unnecessary waste, and provide practical advantages for resource allocation in medical institutions.
Abstract: We present a refined multiscale Shannon entropy for analyzing the electroencephalogram (EEG), which reflects the underlying dynamics of EEG over multiple scales. The rationale behind this method is that neurological signals such as EEG possess distinct dynamics over different spectral modes. To deal with the nonlinear and nonstationary nature of EEG, the recently developed empirical mode decomposition (EMD) is incorporated, allowing a decomposition of EEG into its inherent spectral components, referred to as intrinsic mode functions (IMFs). Calculating the Shannon entropy of the IMFs in a time-dependent manner and summing them over adaptive multiple scales yields an adaptive subscale entropy measure of EEG. Simulation and experimental results show that the proposed entropy properly reveals the dynamical changes over multiple scales.
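The time-dependent, per-IMF entropy summation can be sketched as follows. This assumes the IMFs have already been extracted by an EMD implementation (e.g. the PyEMD package), and the window and bin counts are illustrative defaults rather than values from the paper:

```python
import numpy as np

def subscale_entropy(imfs, win=128, bins=16):
    """Time-dependent Shannon entropy summed over IMF scales.

    `imfs` is an array of shape (n_imfs, n_samples) of intrinsic mode
    functions, assumed pre-computed by an EMD routine.  For each IMF,
    the entropy of the amplitude histogram is evaluated in sliding
    windows; summing across IMFs gives one subscale-entropy value per
    window, as in the adaptive measure described in the abstract."""
    imfs = np.asarray(imfs, float)
    n_imfs, n = imfs.shape
    n_wins = n // win
    out = np.zeros(n_wins)
    for k in range(n_imfs):
        for w in range(n_wins):
            seg = imfs[k, w * win:(w + 1) * win]
            hist, _ = np.histogram(seg, bins=bins)
            p = hist[hist > 0] / hist.sum()
            out[w] += -np.sum(p * np.log(p))
    return out
```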
Abstract: This work assesses the cortical and sub-cortical neural activity recorded from rodents using entropy- and mutual information-based approaches to study how hypothermia affects neural activity. By applying multiscale entropy and Shannon entropy, we quantify the degree of regularity embedded in cortical and sub-cortical neurons and characterize the dependency of the entropy of these regions on temperature. We also study the degree of mutual information along the thalamocortical pathway as a function of temperature. The latter is most likely an indicator of the coupling between these highly connected structures in response to the temperature manipulation leading to arousal after global cerebral ischemia.
Abstract: This work proposes data-driven multiscale quantitative measures to reveal the underlying complexity of the electroencephalogram (EEG), applied to a rodent model of hypoxic-ischemic brain injury and recovery. Because real EEG recordings are nonlinear and non-stationary over different frequencies or scales, a more suitable approach than conventional single-scale tools is needed for analyzing EEG data. Here, we present a new framework of complexity measures that considers changing dynamics over multiple oscillatory scales. The proposed multiscale complexity is obtained by calculating the entropies of the probability distributions of the intrinsic mode functions extracted by the empirical mode decomposition (EMD) of the EEG. To quantify EEG recordings from a rat model of hypoxic-ischemic brain injury following cardiac arrest, the multiscale version of the Tsallis entropy is examined. To validate the proposed complexity measure, actual EEG recordings from rats (n=9) experiencing 7 min of cardiac arrest followed by resuscitation were analyzed. Experimental results demonstrate that the use of the multiscale Tsallis entropy leads to better discrimination of the injury levels and improved correlations with the neurological deficit evaluation 72 hours after cardiac arrest, suggesting an effective metric as a prognostic tool.
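For reference, the Tsallis entropy applied per scale above can be sketched as follows; the entropic index q=2 is an illustrative choice, since the abstract does not report the index used:

```python
import numpy as np

def tsallis_entropy(p, q=2.0):
    """Tsallis entropy S_q = (1 - sum(p_i^q)) / (q - 1), a one-parameter
    generalization of Shannon entropy (recovered in the limit q -> 1).
    In the framework above it is applied to the probability distribution
    of each EMD-derived intrinsic mode function."""
    p = np.asarray(p, float)
    p = p / p.sum()
    return float((1.0 - np.sum(p**q)) / (q - 1.0))
```

As with Shannon entropy, a broader (more complex) distribution scores higher, which is what allows discrimination between injury levels.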