Abstract: This paper treats different aspects of entropy measures
in classical information theory and statistical quantum mechanics,
and presents the possibility of extending the definition of von Neumann
entropy to image and array processing. In the first part, we generalize
the quantum entropy using the singular values of arbitrary rectangular
matrices to measure randomness and the quality of a denoising
operation; this new definition of entropy can be used to
compare the performance of filtering methods. In the second
part, we apply the concept of pure state in quantum formalism
to generalize the maximum entropy method for the narrowband,
far-field source localization problem. Several computer simulation
results are illustrated to demonstrate the effectiveness of the proposed
techniques.
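The generalized entropy sketched in the first part can be illustrated as follows. This is a minimal sketch, assuming the singular values are already available (from any SVD routine) and normalizing by squared singular values, which is one common convention; the paper's exact definition may differ.

```python
import math

def svd_entropy(singular_values):
    """Shannon-style entropy of a matrix's normalized singular-value
    spectrum: p_i = s_i**2 / sum(s_j**2), H = -sum(p_i * log(p_i)).
    (Squared-value normalization is one common choice; the paper's
    exact definition may differ.)"""
    total = sum(s * s for s in singular_values)
    probs = [s * s / total for s in singular_values if s > 0]
    return -sum(p * math.log(p) for p in probs)

# A flat spectrum (maximal randomness) gives log(n); a single dominant
# singular value, as after strong denoising, gives entropy near 0.
flat = svd_entropy([1.0, 1.0, 1.0, 1.0])   # -> log(4)
peaked = svd_entropy([10.0, 0.01, 0.01])   # -> close to 0
```

Comparing this entropy before and after filtering gives a single scalar for ranking denoising methods, which is the use the abstract describes.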
Abstract: In this paper, the implementation of monitoring and control
of a transformer's tap changer mechanism in an Intelligent
Electronic Device (IED) is discussed. It has been the custom for
decades to provide a separate panel for on-load tap changer control
and for monitoring the tap position. However, this facility can neither
record the information nor transfer it to remote control centers. As there
is a technology shift towards smart grid protection and control
standards, the need for remote control and monitoring
has necessitated the implementation of this feature in numerical
relays. This paper deals with the programming, settings and logic
implementation, applicable to both IEC 61850-compatible
and non-compatible IEDs, thereby eliminating the need for separate
tap changer control equipment. The monitoring mechanism has been
implemented with a GE T60 IED on a 28 MVA, 110/6.9 kV transformer
with 16 tap positions at UltraTech Cement Limited,
Gulbarga, Karnataka, and is in successful service.
Abstract: Consumers are demanding novel beverages that are
healthier, convenient and have appealing consumer acceptance. The
objectives of this study were to investigate the effects of adding grape
polyphenols and the influence of presenting health claims on the
sensory acceptability of wines. Fresh red sorrel calyces were
fermented into wines. The total soluble solids of the pectinase-treated
sorrel puree were adjusted from 4°Brix to 23.8°Brix. Polyphenols in the form
of grape pomace extract were added (w/v) to the sorrel wines at levels
of 0, 25, 50 and 75 ppm. A focus group comprising 12
panelists was used to select the level of polyphenols to be added to
the sorrel wines for sensory preference. The sensory attributes of the
wines which were evaluated were colour, clarity, aroma, flavor,
mouth-feel, sweetness, astringency and overall preference. The sorrel
wine which was most preferred from focus group evaluation was
presented for hedonic rating. In the first stage of hedonic testing, the
sorrel wine was chilled at 7°C for 24 h prior to sensory
evaluation and served chilled. Each panelist was provided with a questionnaire and was
asked to rate the wines on colour, aroma, flavor, mouth-feel,
sweetness, astringency and overall acceptability using a 9-point
hedonic scale. In the second stage of hedonic testing, the panelists
were instructed to read an abstract on the health benefits of
polyphenolic compounds and to rate again the sorrel wine with 25
ppm added polyphenols. A paired t-test was used to analyze the
influence of presenting health information on polyphenols on hedonic
scoring of sorrel wines. The focus group found that the addition of
polyphenols had no significant effect on color and
aroma but affected clarity and flavor. The 25 ppm wine was liked
moderately in overall acceptability. The presentation of information
on the health benefit of polyphenols in sorrel wines to panelists had
no significant influence on the sensory acceptance of wine. More
than half of panelists would drink this wine now and then. This wine
had color L 19.86±0.68, chroma 2.10±0.12, hue° 16.90 ±3.10 and
alcohol content of 13.0%. The sorrel wine was liked moderately in
overall acceptability with the added polyphenols.
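The paired t-test used for the before/after hedonic scores can be sketched in a few lines; the 9-point scores below are hypothetical placeholders, for illustration only.

```python
import math

def paired_t(before, after):
    """Paired t statistic for two matched samples:
    t = mean(d) / (sd(d) / sqrt(n)), where d_i = after_i - before_i."""
    n = len(before)
    d = [a - b for a, b in zip(after, before)]
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Hypothetical hedonic scores from 12 panelists before and after
# reading the health information (NOT the study's actual data).
before = [6, 7, 5, 6, 8, 7, 6, 5, 7, 6, 7, 6]
after  = [6, 7, 6, 6, 8, 7, 7, 5, 7, 6, 7, 6]
t = paired_t(before, after)
```

A |t| below the critical value for n-1 degrees of freedom corresponds to the study's finding that the health information had no significant influence on acceptance.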
Abstract: This work is on decision tree-based classification for
the disbursement of scholarships. A tree-based data mining
classification technique is used in order to determine the generic rules
for disbursing the scholarship. Based on the rules derived
from the tree, the system is able to determine the class (status) to
which an applicant belongs: Granted or Not Granted.
Applicants that fall into the Granted class have successfully
acquired the scholarship, while those in the Not Granted class are
unsuccessful. An algorithm that classifies
applicants based on the rules from the tree-based classification was
also developed. The tree-based classification is adopted because of its
efficiency, effectiveness, and ease of comprehension. The
system was tested with the data of National Information Technology
Development Agency (NITDA) Abuja, a Parastatal of Federal
Ministry of Communication Technology that is mandated to develop
and regulate information technology in Nigeria. The system was
found to work according to the specification. It is therefore
recommended for all scholarship disbursement organizations.
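Classification from tree-derived rules can be sketched as follows; the attribute names and thresholds here are illustrative assumptions, not NITDA's actual criteria.

```python
def classify_applicant(app):
    """Toy rule set of the kind a decision tree induces; each if-branch
    corresponds to one root-to-leaf path. Thresholds and attributes
    ("cgpa", "income") are hypothetical, not the actual NITDA rules."""
    if app["cgpa"] >= 3.5 and app["income"] < 100_000:
        return "Granted"
    if app["cgpa"] >= 4.5:          # outstanding applicants regardless of income
        return "Granted"
    return "Not Granted"

applicants = [
    {"name": "A", "cgpa": 3.8, "income": 50_000},
    {"name": "B", "cgpa": 2.9, "income": 30_000},
]
statuses = [classify_applicant(a) for a in applicants]  # ['Granted', 'Not Granted']
```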
Abstract: The aim of this work was to characterize a potential
target group of people interested in participating in a training
program in organic farming in the context of mobile learning. The
information sought addressed in particular, but not exclusively,
possible contents, formats and forms of evaluation that will
contribute to define the course objectives and curriculum, as well as
to ensure that the course meets the needs of the learners and their
preferences. The sample was selected among different European
countries. The questionnaires were delivered electronically for
answering on-line and in the end 135 consented valid questionnaires
were obtained. The results allowed characterizing the target group
and identifying their training needs and preferences towards m-learning
formats, giving valuable tools to design the training offer.
Abstract: One of the most critical decision points in the design of a
face recognition system is the choice of an appropriate face representation.
Effective feature descriptors are expected to convey sufficient, invariant
and non-redundant facial information. In this work we propose a set of
Hahn moments as a new approach for feature description. Hahn moments
have been widely used in image analysis due to their invariance, non-redundancy
and ability to extract features both globally and locally.
To assess the applicability of Hahn moments to Face Recognition we
conduct two experiments on the Olivetti Research Laboratory (ORL)
database and University of Notre-Dame (UND) X1 biometric collection.
The fusion of global features with features from local facial
regions is used as input to a conventional k-NN classifier. The
method reaches an accuracy of 93% of correctly recognized subjects for
the ORL database and 94% for the UND database.
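The feature-level fusion and k-NN step can be sketched as follows; the moment values are hypothetical placeholders, and Euclidean-distance nearest-neighbor matching is assumed (the paper may use a different distance or k).

```python
import math

def fuse(global_feats, local_feats):
    """Concatenate the global moment vector with per-region local moment
    vectors into one descriptor (simple feature-level fusion; the
    paper's exact fusion rule may differ)."""
    v = list(global_feats)
    for region in local_feats:
        v.extend(region)
    return v

def knn_predict(train, labels, query, k=1):
    """Conventional k-NN with Euclidean distance and majority vote."""
    dists = sorted((math.dist(x, query), y) for x, y in zip(train, labels))
    votes = [y for _, y in dists[:k]]
    return max(set(votes), key=votes.count)

# Two enrolled subjects with hypothetical Hahn-moment values
s1 = fuse([0.1, 0.2], [[0.3, 0.1], [0.0, 0.4]])
s2 = fuse([0.9, 0.8], [[0.7, 0.9], [1.0, 0.6]])
probe = fuse([0.12, 0.21], [[0.28, 0.12], [0.02, 0.39]])
who = knn_predict([s1, s2], ["subject1", "subject2"], probe)  # 'subject1'
```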
Abstract: This research presents the main ideas to implement an
intelligent system composed of communicating wireless sensors
measuring environmental data linked to drought indicators (such as
air temperature, soil moisture, etc.). In addition, the setting
up of a spatio-temporal database communicating with a Web mapping
application for real-time monitoring, 24 hours a day, 7 days a week,
is proposed to allow the screening of the drought
parameters' time evolution and their extraction. This system thus
helps detect areas affected by drought.
Spatio-temporal conceptual models seek to answer users who
need to manage soil water content for irrigating, fertilizing or other
activities pursuing crop yield augmentation. Indeed, spatio-temporal
conceptual models give users a readable,
easy-to-apprehend data diagram. Combined with socio-economic
information, the system helps identify the people impacted by the
phenomenon and the corresponding severity, especially as this information is
accessible to farmers and stakeholders themselves. The study will be
applied to the Siliana watershed in northern Tunisia.
Abstract: DNA analysis has been widely accepted as providing
valuable evidence concerning the identity of the source of biological
traces. Our work has shown that DNA samples can survive on
cartridges even after firing. The study also raised the possibility of
determining other information, such as the age of the donor. Such
information may be invaluable in certain cases where spent cartridges
from automatic weapons are left behind at the scene of a crime. In
spite of the nature of touch evidence and exposure to high chamber
temperatures during shooting, we were still capable to retrieve
enough DNA for profile typing. In order to estimate age of
contributor, DNA methylation levels were analyzed using EpiTect
system for retrieved DNA. However, results were not conclusive, due
to low amount of input DNA.
Abstract: Geographical routing protocols require nodes' physical
location information to make forwarding decisions. Geographical
routing uses a location (position) service to obtain the position
of a node. The geographical information consists of geographic coordinates,
or can be obtained through reference points in some fixed coordinate
system. A link can be formed between two nodes, and link lifetime plays a
crucial role in MANETs: it represents how long the link remains
stable, without any failure, between the nodes. Link failures may occur
due to mobility, and they drain the energy of nodes.
This paper therefore presents a survey of link lifetime
prediction using geographical information.
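One common geometric link lifetime predictor from the surveyed literature, which assumes constant node velocities and a fixed radio range, can be sketched as:

```python
import math

def link_lifetime(p1, v1, p2, v2, radio_range):
    """Time until two nodes moving with constant velocity drift out of
    radio range: solve |(p1 - p2) + (v1 - v2) * t| = radio_range for t.
    A classic geometric estimate; real predictors must also cope with
    direction changes and localization error."""
    a = v1[0] - v2[0]          # relative velocity, x
    b = p1[0] - p2[0]          # relative position, x
    c = v1[1] - v2[1]          # relative velocity, y
    d = p1[1] - p2[1]          # relative position, y
    if a == 0 and c == 0:      # no relative motion: link never expires
        return math.inf
    disc = (a * a + c * c) * radio_range ** 2 - (a * d - c * b) ** 2
    if disc < 0:
        return 0.0             # nodes never come within range on this course
    return (-(a * b + c * d) + math.sqrt(disc)) / (a * a + c * c)

# Nodes 100 m apart, one receding at 5 m/s, 250 m radio range -> 30 s
t = link_lifetime((0, 0), (0, 0), (100, 0), (5, 0), 250)
```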
Abstract: Particle size distribution, the most important
characteristic of aerosols, is obtained through electrical
characterization techniques. The dynamics of charged nanoparticles
under the influence of electric field in Electrical Mobility
Spectrometer (EMS) reveals the size distribution of these particles.
The accuracy of this measurement is influenced by flow conditions,
geometry, electric field and particle charging process, therefore by
the transfer function (transfer matrix) of the instrument. In this work,
a wire-cylinder corona charger was designed and the combined
field-diffusion charging process of injected polydisperse aerosol particles
was numerically simulated as a prerequisite for the study of a
multichannel EMS. The result, a cloud of particles with a non-uniform
charge distribution, was introduced to the EMS. The flow pattern and
electric field in the EMS were simulated using Computational Fluid
Dynamics (CFD) to obtain particle trajectories in the device and
therefore to calculate the reported signal by each electrometer.
Based on the output signals (resulting from the bombardment of
particles and the transfer of their charges as currents), we proposed a
modification to the size of detecting rings (which are connected to
electrometers) in order to evaluate particle size distributions more
accurately. Based on the capability of the system to transfer
information contents about size distribution of the injected particles,
we proposed a benchmark for the assessment of optimality of the
design. This method applies the concept of Von Neumann entropy
and borrows the definition of entropy from information theory
Shannon entropy) to measure optimality. Shannon entropy is the
"average amount of information contained in an event, sample or
character extracted from a data stream".
Evaluating the responses (signals) obtained via various
configurations of detecting rings, the configuration that gave
the best predictions of the size distributions of the injected particles
was the modified configuration; it was also the one with the
maximum amount of entropy. A reasonable consistency was also
observed between the accuracy of the predictions and the entropy
content of each configuration. In this method, entropy is extracted
from the transfer matrix of the instrument for each configuration.
Ultimately, various clouds of particles were introduced to the
simulations and predicted size distributions were compared to the
exact size distributions.
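The entropy benchmark can be illustrated with a toy sketch; here the entropy is computed directly from a normalized electrometer signal vector, rather than extracted from the full transfer matrix as in the paper, so this shows only the flavor of the criterion.

```python
import math

def ring_signal_entropy(signals):
    """Shannon entropy (in bits) of the normalized electrometer signals.
    A ring configuration whose channels share the current more evenly
    carries more information about the size distribution; a configuration
    where all particles hit one ring carries none. (Sketch of the
    benchmark idea only; the paper derives entropy from the transfer
    matrix of each configuration.)"""
    total = sum(signals)
    probs = [s / total for s in signals if s > 0]
    return -sum(p * math.log2(p) for p in probs)

# All current on one ring -> 0 bits; spread evenly over 4 rings -> 2 bits
poor = ring_signal_entropy([1.0, 0.0, 0.0, 0.0])
good = ring_signal_entropy([0.25, 0.25, 0.25, 0.25])
```

Comparing such entropies across candidate ring-size configurations mirrors the paper's use of maximum entropy as an optimality criterion.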
Abstract: In this paper, the secure BioSemantic Scheme is
presented to bridge biological/biomedical research problems and
computational solutions via semantic computing. Due to the diversity
of problems in various research fields, the semantic capability
description language (SCDL) plays an important role as a common
language and generic form for problem formalization. SCDL is
expected to be essential for future semantic and logical computing in
the biosemantic field. We show several examples of biomedical problems
in this paper. Moreover, in the coming age of cloud computing, the
security problem is considered a crucial issue, and we present a
practical scheme to cope with it.
Abstract: This study identifies factors underlying the digital
divide that is faced by the disabled. The results of its analysis showed
that the digital divide in PC use is affected by age, number of years of
education, employment status, and household income of more than
KRW 3 million. The digital divide in smart device use is affected by
sex, age, number of years of education, time when disability struck,
and household income of more than KRW 3 million. Based on these
results, this study proposes methods for bridging the digital divide
faced by the disabled.
Abstract: This paper presents the local mesh co-occurrence
patterns (LMCoP) using HSV color space for image retrieval system.
HSV color space is used in this method to utilize color, intensity and
brightness of images. Local mesh patterns are applied to define the
local information of image and gray level co-occurrence is used to
obtain the co-occurrence of LMeP pixels. The local mesh co-occurrence
pattern extracts local directional information from the local mesh
pattern and converts it into a well-structured feature vector using the
gray-level co-occurrence matrix. The proposed method is tested on three
different databases called MIT VisTex, Corel, and STex. Also, this
algorithm is compared with existing methods, and results in terms of
precision and recall are shown in this paper.
Abstract: Phonocardiography is important in the appraisal of
congenital heart disease and pulmonary hypertension, as it reflects the
duration of right ventricular systole. The systolic murmur in patients
with intra-cardiac shunt decreases as pulmonary hypertension
develops and may eventually disappear completely as the pulmonary
pressure reaches systemic level. Phonocardiography and auscultation
are non-invasive, low-cost, and accurate methods to assess heart
disease. In this work, an objective signal processing tool to extract
information from the phonocardiography signal using wavelets is
proposed to classify murmurs as normal or abnormal. Since the
feature vector is large, Binary Particle Swarm Optimization (PSO)
with mutation is proposed for feature selection. The selected
features improve the classification accuracy and were tested across
various classifiers, including Naïve Bayes, kNN, C4.5, and SVM.
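A minimal binary PSO with bit-flip mutation for feature selection might look like the following; the acceleration constants, mutation rate, and the toy fitness function are illustrative assumptions, not the paper's tuned setup.

```python
import math, random

def bpso_select(fitness, n_feats, n_particles=10, iters=30, mut=0.02):
    """Minimal binary PSO with bit-flip mutation for feature selection.
    Positions are bit masks; velocities pass through a sigmoid to give
    the probability of a bit being 1. Generic sketch with illustrative
    constants (both acceleration coefficients = 2)."""
    rng = random.Random(0)                       # fixed seed: reproducible
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    pos = [[rng.randint(0, 1) for _ in range(n_feats)] for _ in range(n_particles)]
    vel = [[0.0] * n_feats for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pfit = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pfit[i])
    gbest, gfit = pbest[g][:], pfit[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_feats):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] += 2 * r1 * (pbest[i][d] - pos[i][d]) \
                           + 2 * r2 * (gbest[d] - pos[i][d])
                pos[i][d] = 1 if rng.random() < sig(vel[i][d]) else 0
                if rng.random() < mut:           # mutation: flip the bit
                    pos[i][d] ^= 1
            f = fitness(pos[i])
            if f > pfit[i]:
                pbest[i], pfit[i] = pos[i][:], f
                if f > gfit:
                    gbest, gfit = pos[i][:], f
    return gbest

# Toy fitness: features 0-2 are informative, the rest only add cost
def toy_fitness(mask):
    return sum(mask[:3]) - 0.1 * sum(mask[3:])

best = bpso_select(toy_fitness, n_feats=8)
```

In the paper's setting the fitness would instead be a classifier's accuracy on the wavelet feature subset encoded by the mask.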
Abstract: The enormous amount of information stored on the
web grows from one day to the next, confronting the web
with the inevitable difficulty of finding the pertinent information
that users really want. The problem today is not limited to expanding
the size of the information highways, but extends to designing systems for
intelligent search. The vast majority of this information is stored in
relational databases, which in turn represent a backend for managing
RDF data of the semantic web. This problem has motivated us to
write this paper in order to establish an effective
semantic transformation algorithm from SPARQL queries, more precisely
SPARQL SELECT queries, to SQL queries; by adopting this
method, a relational database can easily be queried with
SPARQL while maintaining the same performance.
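The flavor of such a SPARQL-to-SQL transformation can be sketched for the simplest case, assuming a single triples(subject, predicate, object) table; the paper's algorithm targets general relational schemas, which this toy translator does not handle.

```python
def sparql_select_to_sql(patterns, select_vars):
    """Translate a SPARQL SELECT over a basic graph pattern into SQL on
    one triples(subject, predicate, object) table: one self-join alias
    per triple pattern, shared variables becoming join conditions,
    constants becoming equality filters. Minimal illustrative sketch."""
    cols = ("subject", "predicate", "object")
    where, var_at = [], {}
    for i, pat in enumerate(patterns):
        for col, term in zip(cols, pat):
            ref = f"t{i}.{col}"
            if term.startswith("?"):            # variable
                if term in var_at:
                    where.append(f"{ref} = {var_at[term]}")  # join condition
                else:
                    var_at[term] = ref          # first binding site
            else:                               # constant IRI/literal
                where.append(f"{ref} = '{term}'")
    select = ", ".join(var_at[v] for v in select_vars)
    tables = ", ".join(f"triples t{i}" for i in range(len(patterns)))
    sql = f"SELECT {select} FROM {tables}"
    if where:
        sql += " WHERE " + " AND ".join(where)
    return sql

# SELECT ?name WHERE { ?p rdf:type :Person . ?p :name ?name }
sql = sparql_select_to_sql(
    [("?p", "rdf:type", ":Person"), ("?p", ":name", "?name")],
    ["?name"],
)
```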
Abstract: This paper shows the general perceptions of Spanish
university stakeholders in relation to the university’s annual reports
and the adequacy and potential of intellectual capital reporting. To
this end, a questionnaire was designed and sent to every member of
the Social Councils of Spanish public universities. It was thought that
these participants would provide a good example of the attitude of
university stakeholders since they represent the different social
groups connected with universities. From the results of this study, we
are in a position to confirm the need for universities to offer
information on intellectual capital in their accounting information
model.
Abstract: This study compares the intensity of game load among
player positions and between the 1st and the 2nd half of the games.
Two guards, three forwards, and three centers (female basketball
players) participated in this study. The heart rate (HR) and its
development were monitored during two competitive games.
Statistically insignificant differences in the intensity of game load
were recorded between guards, forwards, and centers below and
above 85% of the maximal heart rate (HRmax) and in the mean HR as
% of HRmax (87.81±3.79%, 87.02±4.37%, and 88.76±3.54%,
respectively). Moreover, when the 1st and the 2nd half of the games
were compared in the mean HR (87.89±4.18% vs. 88.14±3.63% of
HRmax), no statistical significance was recorded. This information can
be useful for coaching staff in managing and precisely planning the
training process.
Abstract: This research aims to identify traditional Mon cuisines
and to gather and classify the traditional cuisines of Mon
communities in Bangkok. The research used a quantitative
methodology: a questionnaire was used to collect information from a
sample of 450 persons, analyzed via frequency, percentage and mean
values. The results showed that the variety of traditional Mon cuisines
of Bangkok could be split into 6 categories of meat dishes with 54
items and 6 categories of desserts with 19 items.
Abstract: DNA barcodes provide a good source of the
information needed to classify living species. The classification problem has
to be supported with reliable methods and algorithms. To analyze
species regions or entire genomes, it becomes necessary to use
sequence similarity methods. A large set of sequences can be
simultaneously compared using Multiple Sequence Alignment, which
is known to be NP-complete. However, all the methods in use are still
computationally very expensive and require significant computational
infrastructure. Our goal is to build predictive models that are highly
accurate and interpretable. In fact, our method makes it possible to avoid the
complex problem of form and structure in different classes of
organisms. The empirical data and their classification performances
are compared with other methods. In this study, we present
our system, which consists of three phases. The first, called
transformation, is composed of three sub-steps: Electron-Ion
Interaction Pseudopotential (EIIP) codification of the DNA
barcodes, Fourier transform, and power spectrum signal processing.
The second phase is an approximation; it is
empowered by the use of Multi-Library Wavelet Neural Networks
(MLWNN). Finally, the third, the classification of DNA
barcodes, is realized by applying a hierarchical classification
algorithm.
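The first (transformation) phase can be sketched as follows, using the published EIIP values for the four nucleotides and a plain DFT for clarity (real code would use an FFT).

```python
import cmath

# Electron-Ion Interaction Pseudopotential values per nucleotide
EIIP = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

def power_spectrum(seq):
    """Codify a DNA barcode as its EIIP numerical sequence, take the
    discrete Fourier transform, and return the power spectrum |X_k|**2.
    Plain O(n^2) DFT for readability; use an FFT for real barcodes."""
    x = [EIIP[b] for b in seq]
    n = len(x)
    spectrum = []
    for k in range(n):
        Xk = sum(x[m] * cmath.exp(-2j * cmath.pi * k * m / n)
                 for m in range(n))
        spectrum.append(abs(Xk) ** 2)
    return spectrum

ps = power_spectrum("ATGCGATA")   # one power value per frequency bin
```

The resulting spectrum is what the approximation phase (MLWNN) would then model.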
Abstract: In this paper, we present a new segmentation approach
for liver lesions in regions of interest within MRI (Magnetic
Resonance Imaging). This approach, based on a two-cluster Fuzzy C-Means
methodology, considers a variable compactness parameter
to handle uncertainty. Fine boundaries are detected by a local
recursive merging of ambiguous pixels with a sequential forward
floating selection with Zernike moments. The method has been tested
on both synthetic and real images. When applied on synthetic images,
the proposed approach provides good performance, segmentations
obtained are accurate, their shape is consistent with the ground truth,
and the extracted information is reliable. The results obtained on MR
images confirm these observations. Even for difficult cases of MR
images, our approach extracts a segmentation with good
performance in terms of accuracy and shape, which implies that the
geometry of the tumor is preserved for further clinical activities (such
as automatic extraction of pharmacokinetic properties, lesion
characterization, etc.).
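The core two-cluster fuzzy c-means loop underlying the approach can be sketched on raw intensities; the variable-compactness term and the Zernike-moment boundary refinement are omitted, and the pixel values below are hypothetical.

```python
def fuzzy_cmeans_2(values, m=2.0, iters=50):
    """Textbook two-cluster fuzzy c-means on pixel intensities:
    alternate membership updates u_ik = 1 / sum_l (d_ik/d_il)^(2/(m-1))
    and weighted centroid updates. The paper adds a variable-compactness
    parameter and local boundary refinement on top of this core loop."""
    c = [min(values), max(values)]                 # initial centroids
    u = []
    for _ in range(iters):
        u = []
        for v in values:
            d = [abs(v - ck) or 1e-12 for ck in c]  # avoid division by zero
            row = [1.0 / sum((d[k] / d[l]) ** (2 / (m - 1)) for l in range(2))
                   for k in range(2)]
            u.append(row)
        for k in range(2):                          # centroid update
            num = sum((u[i][k] ** m) * values[i] for i in range(len(values)))
            den = sum(u[i][k] ** m for i in range(len(values)))
            c[k] = num / den
    return c, u

# Hypothetical background vs. lesion intensities in a region of interest
pixels = [0.1, 0.15, 0.12, 0.8, 0.85, 0.9]
centroids, memberships = fuzzy_cmeans_2(pixels)
```

Pixels with ambiguous memberships (both near 0.5) are exactly the ones the paper's recursive merging step would revisit for fine boundary detection.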