Abstract: This paper presents a customized deformable model
for the segmentation of abdominal and thoracic aortic aneurysms in
CTA datasets. An important challenge in reliably detecting aortic
aneurysms is the need to overcome problems associated with intensity
inhomogeneities and image noise. Level sets are part of an important
class of methods that utilize partial differential equations (PDEs) and
have been extensively applied in image segmentation. A Gaussian
kernel function in the level set formulation, which extracts the local
intensity information, aids the suppression of noise in the extracted
regions of interest and then guides the motion of the evolving contour
for the detection of weak boundaries. The speed of curve evolution
has been significantly improved with a resulting decrease in
segmentation time compared with previous implementations of level
sets. The results indicate the method is more effective than other
approaches in coping with intensity inhomogeneities.
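The role of the Gaussian kernel can be illustrated with a simplified region-based evolution step. This is a generic sketch of the kernel-based idea, not the authors' exact formulation; the function and parameter names are ours.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def level_set_step(phi, image, sigma=3.0, dt=0.1):
    """One explicit evolution step of a region-based level set.

    Local region means are computed with a Gaussian kernel, so noise is
    suppressed before it can drive the contour; a simplified sketch of
    the kernel-based formulation described in the abstract.
    """
    inside = phi > 0
    eps = 1e-8
    # Gaussian-weighted local intensity means inside / outside the contour
    mu_in = gaussian_filter(image * inside, sigma) / (
        gaussian_filter(inside.astype(float), sigma) + eps)
    mu_out = gaussian_filter(image * ~inside, sigma) / (
        gaussian_filter((~inside).astype(float), sigma) + eps)
    # Move the front toward the region whose local mean matches the pixel
    force = (image - mu_out) ** 2 - (image - mu_in) ** 2
    gy, gx = np.gradient(phi)
    return phi + dt * force * np.sqrt(gx ** 2 + gy ** 2 + eps)
```

Because the local means are Gaussian-smoothed, isolated noisy pixels barely perturb the force term, which is what lets the contour cross weak, noisy boundaries.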
Abstract: Thousands of masters athletes participate
quadrennially in the World Masters Games (WMG), yet this cohort
of athletes remains proportionately under-investigated. Given the
growing global obesity pandemic and the benefits of physical
activity across the lifespan, the prevalence of obesity in this unique
population was of particular interest. Data gathered on a sub-sample
of 535 football code athletes, aged 31-72 yrs (mean = 47.4, s = ±7.1),
competing at the Sydney World Masters Games (2009) demonstrated
a significantly (p
Abstract: The main goal of this article is to find efficient
methods for elemental and molecular analysis of living
microorganisms (algae) under defined environmental conditions and
cultivation processes. An overall picture of the chemical
composition is obtained using laser-based techniques: Laser-
Induced Breakdown Spectroscopy (LIBS) for acquiring elemental
composition and Raman Spectroscopy for gaining molecular
information. Algal cells were suspended in
liquid media and characterized using their spectra. Results obtained
employing LIBS and Raman Spectroscopy techniques will help to
elucidate algae biology (nutrition dynamics depending on cultivation
conditions) and to identify algal strains, which have the potential for
applications in metal-ion absorption (bioremediation) and biofuel
industry. Moreover, bioremediation can be readily combined with
production of 3rd generation biofuels. In order to use algae for
efficient fuel production, the optimal cultivation parameters have to
be determined, leading to high oil production in selected
cells without significant inhibition of the photosynthetic activity and
the culture growth rate; e.g., it is necessary to identify conditions
favouring algal strains containing high amounts of highly unsaturated fatty
acids. Measurements employing LIBS and Raman Spectroscopy were
utilized to give information about the alga Trachydiscus minutus
with emphasis on the amount of the lipid content inside the algal cell
and the ability of algae to withdraw nutrients from its environment
and bioremediation (elemental composition), respectively. This
article can serve as a reference for further efforts in describing
the complete chemical composition of algal samples employing laser-ablation
techniques.
Abstract: A case study of the generation scheduling optimization
of multiple hydroplants in the Yuan River Basin in China is reported
in this paper. Concerning the uncertainty of the inflows, the
long/mid-term generation scheduling (LMTGS) problem is solved by
a stochastic model in which the inflows are considered as stochastic
variables. For the short-term generation scheduling (STGS) problem, a
constraint violation priority is defined in case not all constraints are
satisfied. Given the stage-wise separability and low dimensionality
of the problem, the hydroplant-based operational region schedules
(HBORS) problem is solved by dynamic programming (DP). The
coordination of LMTGS and STGS is presented as well. The
feasibility and the effectiveness of the models and solution methods
are verified by the numerical results.
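The stage-wise DP solution of the HBORS subproblem can be sketched generically. The state grid, transition function and reward below are illustrative placeholders, not the paper's model.

```python
def dp_schedule(stages, volumes, energy):
    """Backward stage-wise DP over a discretized reservoir-volume grid.

    energy(t, v, v_next) returns the generation obtained at stage t when
    moving from volume v to v_next, or None if the transition violates a
    constraint. All names are illustrative placeholders.
    """
    best = {v: 0.0 for v in volumes}    # value-to-go after the last stage
    policy = []                         # policy[t][v] -> best next volume
    for t in reversed(range(stages)):
        new_best, choice = {}, {}
        for v in volumes:
            options = [(energy(t, v, v2) + best[v2], v2)
                       for v2 in volumes if energy(t, v, v2) is not None]
            new_best[v], choice[v] = max(options) if options else (float("-inf"), None)
        best = new_best
        policy.insert(0, choice)
    return best, policy
```

Stage-wise separability is exactly what allows the value-to-go `best` to summarize all future stages, and the low state dimension keeps the volume grid tractable.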
Abstract: In this research, STNEP is studied considering network adequacy and investment cost limits using a decimal codification genetic algorithm (DCGA). The goal is to obtain maximum network adequacy at the lowest expansion cost for a given investment. The proposed idea is applied to Garver's 6-bus network. The results show that, when network adequacy is considered in solving the STNEP problem, the GA-based method selects, from among the expansion plans for a given investment, the configuration with relatively lower expansion cost and higher adequacy. Finally, the curve of adequacy versus expansion cost indicates that more optimal network expansion configurations are obtained at lower investment costs.
Abstract: This paper presents an evolutionary method for designing
electronic circuits and numerical methods associated with
monitoring systems. The instruments described here have been used
in studies of weather and climate changes due to global warming, and
also in medical patient supervision. Genetic Programming systems
have been used both for designing circuits and sensors, and also for
determining sensor parameters. The authors advance the thesis that
the software side of such a system should be written in computer
languages with a strong mathematical and logic background in order
to prevent software obsolescence, and achieve program correctness.
Abstract: Short-Term Load Forecasting (STLF) plays an important role in the economic and secure operation of power systems. In this paper, a Continuous Genetic Algorithm (CGA) is employed to evolve the optimal structure and connection weights of large neural networks for the one-day-ahead electric load forecasting problem. This study describes the process of developing three-layer feed-forward large neural networks for load forecasting and then presents a heuristic search algorithm for an important task of this process, i.e. optimal network structure design. The proposed method is applied to STLF for a local utility. Data are clustered according to differences in their characteristics. Special days are extracted from the normal training sets and handled separately. In this way, a solution is provided for all load types, including working days, weekends and special days. We find good performance for the large neural networks: the proposed methodology consistently gives lower percentage errors. Thus, it can be applied to automatically design an optimal load forecaster based on historical data.
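The idea of evolving real-valued network weights with a continuous GA can be sketched on a toy scale. Everything below (network size, GA operators, parameters) is an illustrative assumption, not the paper's forecaster.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, x, hidden=4):
    """Tiny 1-4-1 feed-forward net; w is a flat real-valued chromosome
    (13 genes: W1, b1, W2, b2). Purely illustrative sizes."""
    W1 = w[:hidden].reshape(hidden, 1)
    b1 = w[hidden:2 * hidden]
    W2 = w[2 * hidden:3 * hidden].reshape(1, hidden)
    b2 = w[3 * hidden]
    h = np.tanh(W1 @ x + b1[:, None])
    return (W2 @ h + b2).ravel()

def cga(x, y, pop=40, gens=200, dim=13):
    """Continuous GA: real-valued chromosomes, arithmetic crossover,
    Gaussian mutation, elitist truncation selection."""
    P = rng.normal(0.0, 1.0, (pop, dim))
    for _ in range(gens):
        fit = np.array([-np.mean((forward(w, x) - y) ** 2) for w in P])
        P = P[np.argsort(fit)[::-1]]           # best chromosomes first
        elite = P[: pop // 2]
        pairs = rng.integers(0, pop // 2, (pop - pop // 2, 2))
        a = rng.random((pop - pop // 2, 1))
        kids = a * elite[pairs[:, 0]] + (1 - a) * elite[pairs[:, 1]]
        kids += rng.normal(0.0, 0.1, kids.shape)   # Gaussian mutation
        P = np.vstack([elite, kids])
    fit = np.array([-np.mean((forward(w, x) - y) ** 2) for w in P])
    return P[np.argmax(fit)]
```

Because chromosomes are real-valued, crossover and mutation act directly in weight space, which is the defining feature of the continuous (as opposed to binary-coded) GA.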
Abstract: Software evolution control requires a deep
understanding of changes and their impact on the system's different
heterogeneous artifacts, and an understanding of descriptive
knowledge of the developed software artifacts is a prerequisite
for the success of the evolutionary process.
Implementing an evolutionary process means making changes,
more or less important, to many heterogeneous software artifacts such
as source code, analysis and design models, unit tests, XML
deployment descriptors, user guides, and others. These changes can
degrade the modified software in functional, qualitative or behavioral
terms. Hence the need for a unified approach
for the extraction and representation of the different heterogeneous artifacts,
ensuring a unified and detailed description of heterogeneous
software artifacts that is exploitable by several software tools and that allows
those responsible for the evolution to reason about the changes
concerned.
Abstract: Heterogeneous repolarization causes dispersion of the T-wave and has been linked to arrhythmogenesis. Such heterogeneities appear due to differential expression of ionic currents in different regions of the heart, both in healthy and diseased animals and humans. Mice are important animals for the study of heart diseases because of the ability to create transgenic animals. We used our previously reported model of mouse ventricular myocytes to develop a 2D mouse ventricular tissue model consisting of 14,000 cells (apical or septal ventricular myocytes) and to study the stability of action potential propagation and Ca2+ dynamics. The 2D tissue model was implemented as a FORTRAN program for high-performance multiprocessor computers, running on 36 processors. Our tissue model is able to simulate heterogeneities not only in action potential repolarization, but also heterogeneities in intracellular Ca2+ transients. The multicellular model reproduced experimentally observed velocities of action potential propagation and demonstrated the importance of incorporation of realistic Ca2+ dynamics for action potential propagation. The simulations show that relatively sharp gradients of repolarization are predicted to exist in 2D mouse tissue models, and they are primarily determined by the cellular properties of ventricular myocytes. Abrupt local gradients of channel expression can cause alternans at longer pacing basic cycle lengths than gradual changes, and development of alternans depends on the site of stimulation.
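Action potential propagation in such a 2D tissue sheet reduces, in its simplest form, to a monodomain reaction-diffusion update. This is a toy sketch only; the paper's model couples the diffusion term to a full mouse myocyte ionic model, and the parameter values here are arbitrary.

```python
import numpy as np

def monodomain_step(V, I_ion=None, dt=0.01, dx=0.01, D=1e-3):
    """One explicit Euler step of the 2D monodomain equation
    dV/dt = D * Laplacian(V) - I_ion, with periodic boundaries.
    Illustrative sketch; not the full ionic model of the paper."""
    lap = (np.roll(V, 1, 0) + np.roll(V, -1, 0)
           + np.roll(V, 1, 1) + np.roll(V, -1, 1) - 4.0 * V) / dx ** 2
    ion = np.zeros_like(V) if I_ion is None else I_ion
    return V + dt * (D * lap - ion)
```

Spatial heterogeneity enters through `I_ion`, which can differ cell by cell (e.g. apical vs. septal kinetics), exactly the kind of regional variation the tissue model studies.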
Abstract: We explore entanglement in composite quantum systems
and how its peculiar properties are exploited in quantum
information and communication protocols by means of Diagrams
of States, a novel method to graphically represent and analyze how
quantum information is elaborated during computations performed
by quantum circuits.
We present quantum diagrams of states for Bell states generation,
measurements and projections, for dense coding and quantum teleportation,
for probabilistic quantum machines designed to perform
approximate quantum cloning and universal NOT and, finally, for
quantum privacy amplification based on entanglement purification.
Diagrams of states prove to be a useful approach to analyze quantum
computations, by offering an intuitive graphic representation of the
processing of quantum information. They also help in conceiving
novel quantum computations, from describing the desired information
processing to deriving the final implementation by quantum gate
arrays.
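The first circuit analyzed, Bell-state generation, is small enough to verify numerically. The sketch below is the standard textbook circuit, not a diagram-of-states construction.

```python
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def bell_state():
    """Prepare |Phi+> = (|00> + |11>)/sqrt(2) as CNOT . (H x I) |00>,
    the standard Bell-state generation circuit."""
    psi = np.zeros(4)
    psi[0] = 1.0                                  # |00>
    return CNOT @ np.kron(H, np.eye(2)) @ psi
```

The resulting amplitude vector cannot be written as a tensor product of two single-qubit states, which is precisely the entanglement property the diagrams of states visualize.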
Abstract: Nowadays, more engineering systems are using some
kind of Artificial Intelligence (AI) for the development of their
processes. Some well-known AI techniques include artificial neural
nets, fuzzy inference systems, and neuro-fuzzy inference systems
among others. Furthermore, many decision-making applications base
their intelligent processes on Fuzzy Logic, owing to the Fuzzy
Inference System's (FIS) capability to deal with problems that are
based on user knowledge and experience. Also, since users
vary widely in their characteristics and generally provide
uncertain data, this information can be used and properly processed
by a FIS. To properly consider uncertainty and inexact system input
values, FIS normally use Membership Functions (MF) that represent
a degree of user satisfaction on certain conditions and/or constraints.
In order to define the parameters of the MFs, the knowledge from
experts in the field is very important. This knowledge defines the MF
shape to process the user inputs and through fuzzy reasoning and
inference mechanisms, the FIS can provide an "appropriate" output.
However, an important issue immediately arises: How can it be
assured that the obtained output is the optimum solution? How can it
be guaranteed that each MF has an optimum shape? A viable solution
to these questions is MF parameter optimization. In this
paper a novel parameter optimization process is presented. The
process for FIS parameter optimization consists of five simple
steps that can be easily carried out off-line. The proposed process
of FIS parameter optimization is demonstrated by its
implementation on an Intelligent Interface section dealing with the
on-line customization / personalization of internet portals applied to
E-commerce.
Abstract: Most known methods for measuring the structural similarity of document structures are based on, e.g., tag measures, path metrics and tree measures in terms of their DOM-Trees. Other methods measure the similarity within the framework of the well-known vector space model. In contrast to these, we present a new approach to measuring the structural similarity of web-based documents represented by so-called generalized trees, which are more general than DOM-Trees, which represent only directed rooted trees. We design a new similarity measure for graphs representing web-based hypertext structures. Our similarity measure is mainly based on a novel representation of a graph as strings of integers, whose components represent structural properties of the graph. The similarity of two graphs is then defined as the optimal alignment of the underlying property strings. In this paper we apply the well-known technique of sequence alignment to solve a novel and challenging problem: measuring the structural similarity of generalized trees. More precisely, we first transform our graphs, considered as high-dimensional objects, into linear structures. Then we derive similarity values from the alignments of the property strings in order to measure the structural similarity of generalized trees. Hence, we transform a graph similarity problem into a string similarity problem. We demonstrate that our similarity measure captures important structural information by applying it to two different test sets consisting of graphs representing web-based documents.
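The core reduction, from graph similarity to alignment of integer property strings, can be sketched with a Needleman-Wunsch-style global alignment. The scoring scheme (match reward decaying with value difference, fixed gap penalty) is an illustrative choice, not the paper's.

```python
def align_score(a, b, gap=-1.0):
    """Global alignment of two integer property strings.

    The match reward decays with the absolute difference of the aligned
    values; the optimal score, normalized by the longer string, serves
    as a similarity in [<= 1]. Scoring is an illustrative assumption.
    """
    n, m = len(a), len(b)
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = i * gap
    for j in range(1, m + 1):
        D[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = 1.0 / (1.0 + abs(a[i - 1] - b[j - 1]))
            D[i][j] = max(D[i - 1][j - 1] + match,   # align the two values
                          D[i - 1][j] + gap,         # gap in b
                          D[i][j - 1] + gap)         # gap in a
    return D[n][m] / max(n, m)

def graph_similarity(props_a, props_b):
    """Average alignment score over corresponding property strings."""
    return sum(align_score(x, y) for x, y in zip(props_a, props_b)) / len(props_a)
```

Identical property strings align perfectly along the diagonal, so structurally identical graphs score 1.0, and the score degrades smoothly as the structural properties diverge.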
Abstract: It has been shown that nonlinear diffusion and bilateral
filtering (BF) are closely connected. Early efforts
focused on finding a generalized representation linking them via adaptive
filtering. In this paper a further relationship between nonlinear
diffusion and bilateral filtering is explored, paying particular attention
to numerical calculus. We present the fresh idea that bilateral filtering can
be accelerated by a multigrid (MG) scheme, as nonlinear
diffusion can, and show that a bilateral filtering process with a large kernel
size can be approximated by a nonlinear diffusion process based on
a full multigrid (FMG) scheme.
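For reference, the baseline bilateral filter that the MG/FMG schemes accelerate can be written directly in its brute-force form (no acceleration; parameter values are illustrative).

```python
import numpy as np

def bilateral(img, radius=2, sigma_s=1.5, sigma_r=0.1):
    """Brute-force bilateral filter on a float image.

    Each output pixel is a weighted mean over a (2*radius+1)^2 window,
    with weights = spatial Gaussian * range (intensity) Gaussian. This
    is the O(N * r^2) baseline the multigrid schemes accelerate.
    """
    H, W = img.shape
    out = np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys ** 2 + xs ** 2) / (2.0 * sigma_s ** 2))
    pad = np.pad(img, radius, mode="edge")
    for y in range(H):
        for x in range(W):
            patch = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            range_w = np.exp(-(patch - img[y, x]) ** 2 / (2.0 * sigma_r ** 2))
            w = spatial * range_w
            out[y, x] = (w * patch).sum() / w.sum()
    return out
```

The range weight is what makes the operator edge-preserving and nonlinear, and it is exactly this intensity-dependent term that ties the filter to nonlinear diffusion.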
Abstract: Today advertising is actively penetrating many spheres of our lives. We cannot imagine the existence of many economic activities without advertising; that mostly concerns trade and services. Every one of us should look more closely at everyday communication and carefully consider the amount and quality of the information we receive, as well as its influence on our behaviour. Special attention should be paid to the young generation. Theoretical and practical research has proved the ever-growing influence of information (especially that contained in advertising) on a society: on its economics, culture, religion, politics and even people's private lives and behaviour. Children have plenty of free time and, therefore, see a lot of different advertising. Though the education of children is in the hands of parents and schools, advertising makers and customers should think responsibly about the selection of time and transmission channels for child-targeted advertising. The purpose of the present paper is to investigate the influence of advertising upon the consumer views and behaviour of children in different age groups. The present investigation has clarified the influence of advertising as a means of information on the group that is the most vulnerable in the modern information society: children. In this paper we assess children's perception and understanding of advertising.
Abstract: Devices in a pervasive computing system (PCS) are characterized by their context-awareness, which permits them to proactively provide adapted services to the user and applications. To do so, context must be well understood and modeled in an appropriate form that enhances its sharing between devices and provides a high level of abstraction. The most interesting methods for modeling context are those based on ontology; however, the majority of the proposed methods fail to propose a generic ontology for context, which limits their usability and keeps them specific to a particular domain. The adaptation task must be done automatically and without explicit intervention by the user. Devices of a PCS must acquire some intelligence that permits them to sense the current context and trigger the appropriate service, or provide a service in a more suitable form. In this paper we propose a generic service ontology for context modeling and a context-aware service adaptation based on a service-oriented definition of context.
Abstract: In this paper the influence of heterogeneous traffic on
the temporal variation of ambient PM10, PM2.5 and PM1
concentrations at a busy arterial route (Sardar Patel Road) in
Chennai city has been analyzed. The hourly PM concentration, traffic
counts and average speed of the vehicles have been monitored at the
study site for one week (19th-25th January 2009). Results indicated
that the concentrations of coarse (PM10) and fine (PM2.5 and
PM1) particles at SP road show a similar trend during peak
and non-peak hours, irrespective of the day. The PM concentrations
showed daily two peaks corresponding to morning (8 to 10 am) and
evening (7 to 9 pm) peak hour traffic flow. The PM10 concentration is
dominated by fine particles (53% of PM2.5 and 45% of PM1). The
high PM2.5/PM10 ratio indicates that the majority of PM10 particles
originate from re-suspension of road dust. The analysis of traffic flow
at the study site showed that 2W, 3W and 4W vehicles have a
diurnal trend similar to the PM concentrations. This confirms that 2W, 3W
and 4W vehicles are the main emission sources contributing to ambient PM
concentration at SP road. Speed measurements at SP road showed
that the average speeds of 2W, 3W, 4W, LCV and HCV are 38, 40,
38, 40 and 38 km/hr and 43, 41, 42, 40 and 41 km/hr, respectively, for
weekdays and weekends.
Abstract: Creative design requires new approaches to assessment
in vocational and technological education. To date, there has been little
discussion on instruments used to evaluate dies produced by students
in vocational and technological education. Developing a generic
instrument has been very difficult due to the diversity of creative
domains, the specificity of content, and the subjectivity involved in
judgment. This paper presents an instrument for measuring the
creativity in the design of products by expanding the Consensual
Assessment Technique (CAT). The content-based scale was evaluated
for content validity by 5 experts. The scale comprises 5 criteria:
originality; practicability; precision; aesthetics; and exchangeability.
Nine experts were invited to evaluate the dies produced by 38 college
students who enrolled in a Product Design and Development course.
To further explore the degree of rater agreement, inter-rater reliability
was calculated for each dimension using Kendall's coefficient of
concordance test. The inter-judge reliability scores achieved
significance, with coefficients ranging from 0.53 to 0.71.
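The inter-rater statistic used above, Kendall's coefficient of concordance, is W = 12S / (m²(n³ − n)) for m raters ranking n items, where S is the sum of squared deviations of the item rank sums. A minimal implementation (ignoring tied ranks):

```python
import numpy as np

def kendalls_w(ratings):
    """Kendall's coefficient of concordance, W = 12*S / (m^2 * (n^3 - n)),
    for an (m raters x n items) score matrix. Ties are ignored for
    simplicity; W = 1 means perfect agreement, W = 0 none."""
    m, n = ratings.shape
    ranks = ratings.argsort(axis=1).argsort(axis=1) + 1   # within-rater ranks
    R = ranks.sum(axis=0)                                 # rank sum per item
    S = ((R - R.mean()) ** 2).sum()
    return 12.0 * S / (m ** 2 * (n ** 3 - n))
```

Converting raw scores to within-rater ranks makes the statistic invariant to each judge's personal scale, which is why it suits consensual assessment with nine independent experts.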
Abstract: Residential satisfaction has been vigorously
debated in the family-house setting. Nonetheless, little
attention has been given to surveys of student residential
satisfaction in the campus housing setting. This study tries to
fill that gap by focusing on the relationship between students'
socio-economic backgrounds and their residential satisfaction with
on-campus student housing facilities. A two-stage cluster
sampling method was employed to classify the respondents. Then,
self-administered questionnaires were distributed face-to-face to the
students. In general, it was confirmed that the students' socio-economic
backgrounds significantly influence their
satisfaction with on-campus student housing facilities. The main
influential factors were revealed to be economic status, sense of
sharing, and the ethnicity of roommates. This study can
also provide useful feedback for university administrations
seeking to improve their student housing facilities.
Abstract: An adaptive Chinese hand-talking system is presented
in this paper. By analyzing three data-collection strategies for new
users, an adaptation framework including supervised and unsupervised
adaptation methods is proposed. For supervised adaptation,
affinity propagation (AP) is used to extract exemplar subsets, and enhanced
maximum a posteriori / vector field smoothing (eMAP/VFS)
is proposed to pool the adaptation data among different models. For
unsupervised adaptation, polynomial segment models (PSMs) are
used to help hidden Markov models (HMMs) accurately label
the unlabeled data; the "labeled" data together with signer-independent
models are then input to the MAP algorithm to generate
signer-adapted models. Experimental results show that the proposed
framework can perform both supervised adaptation with a small amount
of labeled data and unsupervised adaptation with a large amount
of unlabeled data to tailor the original models, and both
improve the recognition rate.
Abstract: The current-voltage characteristics of a PtSi/p-Si
Schottky barrier diode were measured at a temperature of 85 K and,
from the forward-bias region of the I-V curve, the electrical
parameters of the diode were extracted by three methods. The results
obtained from the two methods that considered the series resistance
were in close agreement with each other, and from them the barrier height
(Φb), ideality factor (n) and series resistance (Rs) were found to be
0.2045 eV, 2.877 and 14.556 kΩ, respectively. By measuring the I-V
characteristics in the temperature range of 85-136 K the electrical
parameters were observed to have strong dependency on temperature.
The increase of barrier height and decrease of ideality factor with
increasing temperature are attributed to the existence of barrier height
inhomogeneities in the silicide-semiconductor structure.
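The standard thermionic-emission extraction behind such I-V analyses can be sketched as follows. The diode area A and Richardson constant A* below are assumed example values, not taken from the paper, and this simple ln I fit is only one of the three methods the abstract compares.

```python
import numpy as np

K_B = 8.617e-5   # Boltzmann constant in eV/K

def extract_diode_params(V, I, T, A=1e-3, A_star=32.0):
    """Ideality factor and barrier height from forward-bias I-V data.

    Thermionic emission gives ln I = ln I0 + V/(n*kT) for V >> kT
    (kT in eV, so q cancels), and I0 = A * A_star * T^2 * exp(-phi_b/kT).
    A (cm^2) and A_star (A cm^-2 K^-2) are assumed example values.
    """
    slope, ln_I0 = np.polyfit(V, np.log(I), 1)   # linear fit of ln I vs V
    n = 1.0 / (slope * K_B * T)                  # ideality factor
    I0 = np.exp(ln_I0)                           # saturation current
    phi_b = K_B * T * np.log(A * A_star * T ** 2 / I0)   # barrier height, eV
    return n, phi_b
```

Because n and phi_b both enter through kT, repeating this fit across the 85-136 K range directly exposes the temperature dependence the abstract reports.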