Abstract: A straightforward, intuitive combination of single simulations into an aggregated master simulation is not trivial. Many problems trigger specific difficulties during the modeling and execution of such a simulation. In this paper we identify these problems and aim to solve them by mapping the task to the field of multi-agent systems. The solution is a new meta-model named AGENTMAP, which mitigates most of the problems while still supporting intuitive modeling. This meta-model is introduced and explained on the basis of an example from the e-commerce domain.
Abstract: Variations in the growth rate constant of the bacterial species Listeria monocytogenes were determined at 37°C in irradiated environments and compared with a non-irradiated environment. The bacterial cells, suspended in a Brain Heart Infusion nutrient solution, were grown at different frequency (2.30–2.60 GHz) and power (0–400 mW) values in a plug flow reactor positioned in the irradiated environment. The reacting suspension then passed into a cylindrical cuvette, where its optical density was read every 2.5 minutes at a wavelength of 600 nm. The experimental data of optical density vs. time allowed the bacterial growth rate constant to be derived; this was found to be slightly influenced by microwave power, but not by microwave frequency. In particular, a minimum value was found for powers in the 50–150 mW range.
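The derivation of a growth rate constant from optical density readings is standard: during the exponential phase, ln(OD) is linear in time and its slope is the rate constant. A minimal sketch on synthetic readings (only the 2.5-minute sampling interval is taken from the study; the rate and starting OD are illustrative):

```python
import numpy as np

def growth_rate(t, od):
    # exponential phase: ln(OD) is linear in time; the slope of the
    # least-squares line is the growth rate constant
    slope, _intercept = np.polyfit(t, np.log(od), 1)
    return slope

t = np.arange(0, 30, 2.5)            # minutes, one reading every 2.5 min
od = 0.05 * np.exp(0.04 * t)         # synthetic exponential-phase readings
mu = growth_rate(t, od)              # recovers the 0.04 per-minute constant
```

Comparing the fitted slope across power and frequency settings is exactly the comparison the abstract describes.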
Abstract: This research examines possible effects of climate change, focusing on global warming and its impacts on world agricultural product markets, using a world food model developed to account for climate change. GDP and population figures for each scenario were constructed by the IPCC, and climate data for each scenario were reported by the Hadley Center; both are used in this research to consider results in different contexts. Production and consumption of the world's primary agricultural crops for each socio-economic scenario are obtained and investigated using the modified world food model. Simulation results show that crop production in some countries or regions will follow different trends depending on the context. These alternative contexts depend on the rate of GDP growth, population, temperature, and rainfall. Results suggest that the development of environmentally friendly technologies leads to more consumption of food in many developing countries. Relationships among environmental policy, clean energy development, and poverty elimination warrant further investigation.
Abstract: The subcellular organelles called oil bodies (OBs) are lipid-filled quasi-spherical droplets produced from the endoplasmic reticulum (ER) and then released into the cytoplasm during seed development. It is believed that an OB grows by coalescence with other OBs and that its stability depends on the composition of oleosins, the major proteins inserted in the hemi-membrane that covers OBs. In this study, we measured the OB-volume distribution from different genotypes of A. thaliana after 7, 8, 9, 10 and 11 days of seed development. In order to test the hypothesis of OB dynamics, we developed a simple mathematical model using non-linear differential equations inspired by the theory of coagulation. The model describes the evolution of the OB-volume distribution during the first steps of seed development by taking into consideration the production of OBs, the increase of triacylglycerol volume to be stored, and the growth of OBs by coalescence. Fitted parameter values show an increase in the OB production and coalescence rates in A. thaliana oleosin mutants compared to the wild type.
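A coagulation-theory model of the kind described can be illustrated with the discrete Smoluchowski equations plus a production (source) term for newly released OBs. The sketch below uses a constant coalescence kernel and explicit Euler time stepping; the kernel, rates, and discretization are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

def coagulation_step(n, K, s, dt):
    """One explicit-Euler step of discrete coagulation with production.

    n[k] is the number of oil bodies of volume (k+1) monomer units,
    K a constant coalescence rate, s the production rate of unit-size OBs.
    """
    kmax = len(n)
    total = n.sum()
    dn = np.zeros(kmax)
    for size in range(1, kmax + 1):
        # gain: two smaller OBs coalesce into one of this volume
        gain = 0.5 * sum(K * n[i - 1] * n[size - i - 1] for i in range(1, size))
        # loss: an OB of this volume coalesces with any other OB
        loss = K * n[size - 1] * total
        dn[size - 1] = gain - loss
    dn[0] += s  # production of new small OBs from the ER
    return n + dt * dn

n0 = np.zeros(10)
n0[0] = 100.0                                   # start from unit-size OBs only
n1 = coagulation_step(n0, K=1e-3, s=5.0, dt=0.01)
```

Coalescence conserves the total stored volume, so in this sketch the total volume grows only through the production term, matching the triacylglycerol accumulation described above.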
Abstract: On March 11, 2011, the East coast of Japan was hit by
one of the strongest earthquakes in history, followed by a devastating
tsunami. Although most lifelines, infrastructure, and public facilities
have been restored gradually, recovery efforts in terms of disposal of
disaster waste and revival of primary industry are lagging. This study
presents a summary of the damage inflicted by the earthquake and the
current status of reconstruction in the disaster area. Moreover, we
discuss the current trends and future perspectives on recently
implemented eco-friendly reconstruction projects and focus on the
pro-environmental behavior of disaster victims, which is emerging as a
result of the energy shortage after the earthquake. Finally, we offer
ideas for initiatives for the next stage of the reconstruction policies.
Abstract: In recent years, we have seen increased interest in efficient tracking systems for surveillance applications. Many of the proposed techniques are designed for static-camera environments. When the camera is moving, tracking moving objects becomes more difficult, and many techniques fail to detect and track the desired targets. The problem becomes more complex when we want to track a specific object in real time using a moving pan-and-tilt camera system to keep the target within the image. This type of tracking is of high importance in surveillance applications: when a target is detected in a certain zone, the ability to track it automatically and continuously, keeping it within the image until action is taken, is very important for security personnel working in very sensitive sites. This work presents a real-time tracking system permitting the detection and continuous tracking of targets using a pan-and-tilt camera platform. A novel and efficient approach for dealing with occlusions is presented. A new intelligent forget factor is also introduced in order to take into account target shape variations and avoid learning undesired objects. Tests conducted in outdoor operational scenarios show the efficiency and robustness of the proposed approach.
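The general idea of a forget factor that follows gradual shape changes while avoiding learning undesired objects can be sketched as a similarity-gated template update. The similarity measure, threshold, and learning rate below are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def update_template(template, patch, base_alpha=0.1, sim_threshold=0.7):
    # normalized cross-correlation between current template and new patch
    a = template.ravel() - template.mean()
    b = patch.ravel() - patch.mean()
    sim = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    # forget factor: learn only when the patch still resembles the target,
    # so occluders and background are not absorbed into the model
    alpha = base_alpha if sim >= sim_threshold else 0.0
    return (1 - alpha) * template + alpha * patch
```

When the target is occluded, similarity drops below the threshold and the template is frozen; when the target reappears with a slightly changed shape, updating resumes.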
Abstract: ERP systems are the largest software applications adopted by universities, along with quite significant investments in their implementation. However, unlike for other applications, little research has been conducted regarding these systems in a university environment. This paper aims to provide a critical review of previous research on ERP systems in higher education, with a special focus on higher education in Australia. The research not only forms the basis of an evaluation of previous research and research needs; it also makes inroads into identifying the payoff of ERPs in the sector from different perspectives, with particular reference to the user. The paper is divided into two parts: the first focuses on ERP literature in higher education at large, while the second focuses on ERP literature in higher education in Australia.
Abstract: In distributed resource allocation, a set of agents must assign their resources to a set of tasks. This problem arises in many real-world domains such as distributed sensor networks, disaster rescue, hospital scheduling and others. Despite the variety of approaches proposed for distributed resource allocation, a systematic formalization of the problem explaining the different sources of difficulty, and a formal explanation of the strengths and limitations of key approaches, is missing. We take a step towards this goal by using a formalization of distributed resource allocation that represents both the dynamic and the distributed aspects of the problem. In this paper we present a new idea for target tracking in sensor networks and compare it with previous approaches. The central contribution of the paper is a generalized mapping from distributed resource allocation to DDCSP. This mapping is proven to correctly solve resource allocation problems of specific difficulty. This theoretical result is verified in practice by a simulation on a real-world distributed sensor network.
Abstract: PCMs have always been viewed as suitable
candidates for off-peak thermal storage, particularly for refrigeration
systems, due to the high latent energy densities of these materials.
However, due to the need to have them encapsulated within a
container this density is reduced. Furthermore, PCMs have a low
thermal conductivity which reduces the useful amount of energy
which can be stored. To consider these factors, the true energy
storage density of a PCM system was proposed and optimised for
PCMs encapsulated in slabs. Using a validated numerical model of
the system, a parametric study was undertaken to investigate the
impact of the slab thickness, gap between slabs and the mass flow
rate. The study showed that, when optimised, a PCM system can
deliver a true energy storage density between 53% and 83% of the
latent energy density of the PCM.
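The notion of a "true" energy storage density can be read as the latent density discounted by the volume lost to encapsulation and by the fraction of latent energy actually usable given the PCM's low conductivity. A back-of-envelope sketch (the function and all numbers are illustrative, not the study's validated model):

```python
def true_energy_density(latent_density, pcm_volume_fraction, utilization):
    # latent_density: latent energy per unit PCM volume
    # pcm_volume_fraction: PCM volume / (PCM + container + gap) volume
    # utilization: fraction of latent energy charged/discharged in the
    #              cycle, limited by the PCM's low thermal conductivity
    return latent_density * pcm_volume_fraction * utilization

# e.g. 90% of the storage volume is PCM and 80% of the latent heat is usable
ratio = true_energy_density(1.0, 0.9, 0.8)   # 0.72 of the latent density
```

The parametric study's role is then to choose slab thickness, gap, and flow rate so that the product of these penalties is maximized.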
Abstract: Process-oriented software development is a new software development paradigm in which software design is modeled by a business process, which is in turn translated into a process execution language for execution. The building blocks of this paradigm are software units that are composed to work according to the flow of the business process. This new paradigm still exhibits the characteristics of applications built with traditional software component technology. This paper discusses an approach to applying a traditional technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, and they can be reused to design the business processes of other application domains. The decomposition considers five managerial goals, namely cost effectiveness, ease of assembly, customization, reusability, and maintainability. The paper presents how to design or decompose process components from a business process model and how to measure technical features of the design that would affect the managerial goals. A comparison between the measurement values of different designs can tell which process component design is more appropriate for the managerial goals that have been set. The proposed approach can be applied in a Web Services environment that accommodates process-oriented software development.
Abstract: This paper presents preliminary results of a
technology assessment analysis for the use of high pressure treatment
(HPT) on Halloumi cheese. In particular, it presents the importance
of this traditional Cyprus cheese to the island's economy, explains its
production process, and gives a brief introduction to HPT and its
application on cheese. More importantly, it offers preliminary results
of HPT of Halloumi samples and a preliminary economic feasibility
study on the financial implications of the introduction of such
technology.
Abstract: Residues are produced at all stages of human activity; their composition and volume vary according to consumption practices and production methods. Significant harm to the environment is associated with the volume of generated material as well as with the improper disposal of solid wastes, whose negative effects are noticed more frequently in the long term. The solution to this problem constitutes a challenge to government, industry and society, because it involves economic, social and environmental factors and, especially, the awareness of the population in general. The main concerns focus on the impact waste can have on human health and on the environment (soil, water, air and landscape). Hazardous wastes, produced mainly by industry, are particularly worrisome because, when improperly managed, they become a serious threat to the environment. In view of this issue, this study aimed to evaluate the solid waste management system of a company that co-processes industrial waste, and to propose improvements to the management of reject generation in a specific step of the Blending production process.
Abstract: SoftBoost is a recently presented boosting algorithm that trades off the size of the achieved classification margin against generalization performance. This paper presents a performance evaluation of the SoftBoost algorithm on the generic object recognition problem. An appearance-based generic object recognition model is used. The evaluation experiments are performed using a difficult object recognition benchmark. An assessment with respect to different degrees of label noise, as well as a comparison to the well-known AdaBoost algorithm, is performed. The obtained results suggest that SoftBoost is preferable when the training data are known to have a high degree of noise; otherwise, AdaBoost can achieve better performance.
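SoftBoost itself solves a margin-based optimization; the AdaBoost baseline used in the comparison can be sketched with decision stumps as weak learners (a textbook implementation, not the paper's experimental code):

```python
import numpy as np

def adaboost_train(X, y, rounds=10):
    """AdaBoost with decision stumps; labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)              # sample weights, updated each round
    ensemble = []
    for _ in range(rounds):
        best = None
        for j in range(d):               # exhaustive stump search
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] >= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        alpha = 0.5 * np.log((1 - err + 1e-12) / (err + 1e-12))
        pred = sign * np.where(X[:, j] >= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)   # misclassified points gain weight
        w /= w.sum()
        ensemble.append((alpha, j, thr, sign))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(a * s * np.where(X[:, j] >= t, 1, -1)
                for a, j, t, s in ensemble)
    return np.sign(score)
```

The exponential re-weighting step is precisely why AdaBoost is fragile under label noise: mislabeled points receive ever-larger weight, which is the behavior SoftBoost's margin trade-off mitigates.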
Abstract: Training neural networks to capture an intrinsic property of a large volume of high-dimensional data is a difficult task, as the training process is computationally expensive. Input attributes should be carefully selected to keep the dimensionality of input vectors relatively small. Technical indexes commonly used for stock market prediction with neural networks are investigated to determine their effectiveness as inputs. A feed-forward neural network trained with the Levenberg-Marquardt algorithm is applied to perform one-step-ahead forecasting of NASDAQ and Dow stock prices.
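The one-step-ahead framing is independent of the network itself: each input vector holds a short window of past values (or technical indexes), and the target is the next value. A sketch of the dataset construction, with an ordinary least-squares forecaster standing in for the Levenberg-Marquardt network (window length and data are illustrative):

```python
import numpy as np

def make_windows(series, lag):
    # keep lag small: input dimensionality drives the training cost
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

prices = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # toy closing prices
X, y = make_windows(prices, lag=3)
# linear least-squares stand-in for the Levenberg-Marquardt network
w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
next_price = np.r_[prices[-3:], 1.0] @ w            # one-step-ahead forecast
```

Replacing the linear fit with the trained network is the only change needed to reproduce the forecasting setup described above.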
Abstract: Bangalore City is facing an acute problem of atmospheric pollution due to the heavy increase in traffic and developmental activities in recent years. The present study is an attempt to assess the trend of the ambient air quality status at three stations, viz. AMCO Batteries Factory, Mysore Road; Graphite India Factory, KHB Industrial Area, Whitefield; and Ananda Rao Circle, Gandhinagar, with respect to some of the major criteria pollutants, namely suspended particulate matter (SPM), oxides of nitrogen (NOx), and oxides of sulphur (SO2). The sites are representative of the various kinds of growth, viz. commercial, residential and industrial, prevailing in Bangalore and contributing to air pollution. The concentration of sulphur dioxide (SO2) at all locations showed a falling trend due to the use of refined petrol and diesel in recent years. The concentration of oxides of nitrogen (NOx) showed an increasing trend but was within the permissible limits. The concentration of suspended particulate matter (SPM) showed a mixed trend. The correlation between model and observed values is found to vary from 0.4 to 0.7 for SO2, 0.45 to 0.65 for NOx and 0.4 to 0.6 for SPM. About 80% of the data fall within the error band of ±50%. Forecast tests for the best-fit models showed the same trend as actual values in most cases. However, the deviation observed in a few cases could be attributed to changes in the quality of petroleum products, increase in the volume of traffic, introduction of LPG as fuel in many types of automobiles, poor condition of roads, prevailing meteorological conditions, etc.
Abstract: An electrocardiogram (ECG) feature extraction system based on the calculation of complex resonance frequencies employing Prony's method is developed. Prony's method is applied to five different classes of ECG arrhythmia signals, modeling each as a finite sum of exponentials depending on the signal's poles and resonant complex frequencies. These poles and resonance frequencies are evaluated for a large number of examples of each arrhythmia. The lead-II (ML II) ECG signals were taken from the MIT-BIH database for five different types: ventricular couplet (VC), ventricular tachycardia (VT), ventricular bigeminy (VB), ventricular fibrillation (VF) and normal rhythm (NR). This novel method can be extended to any number of arrhythmias. Different classification techniques were tried, using neural networks (NN), K-nearest neighbor (KNN), linear discriminant analysis (LDA) and a multi-class support vector machine (MC-SVM).
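Prony's method fits the samples as a finite sum of exponentials by first solving a linear-prediction system whose characteristic roots are the signal poles; the resonant complex frequencies then follow from the logarithm of the poles. A minimal sketch on a synthetic two-pole signal (not ECG data):

```python
import numpy as np

def prony_poles(x, p):
    """Estimate the p poles of x[n] modeled as a sum of p exponentials."""
    N = len(x)
    # linear prediction: x[n] = a_1 x[n-1] + ... + a_p x[n-p]
    A = np.column_stack([x[p - k - 1:N - k - 1] for k in range(p)])
    a, *_ = np.linalg.lstsq(A, x[p:], rcond=None)
    # poles are the roots of z^p - a_1 z^(p-1) - ... - a_p
    return np.roots(np.r_[1.0, -a])

n = np.arange(40)
x = 0.9 ** n + 0.5 ** n          # synthetic sum of two real exponentials
poles = prony_poles(x, p=2)      # recovers poles near 0.9 and 0.5
```

In the arrhythmia setting, each lead-II segment yields such a pole set, and the resulting pole/resonance-frequency vectors feed the NN, KNN, LDA, or MC-SVM classifiers.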
Abstract: Multi-Layer Perceptron (MLP) neural networks have been very successful in a number of signal processing applications. In this work we study the possibilities of, and the difficulties met in, applying MLP neural networks to the prediction of daily solar radiation data. We use the Polak-Ribière algorithm for training the neural networks. A comparison, in terms of statistical indicators, with a linear model widely used in the literature is also performed; the obtained results show that the neural networks are more efficient and give the best results.
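The Polak-Ribière update differs from plain gradient descent in how each new search direction mixes in the previous one. A minimal sketch on a quadratic objective, where the line search can be done exactly (the objective and numbers are illustrative; the paper applies the method to MLP weights):

```python
import numpy as np

def pr_cg_quadratic(A, b, x0, iters=20):
    # minimize f(x) = 0.5 x^T A x - b^T x  (A symmetric positive definite)
    x = np.array(x0, float)
    g = A @ x - b            # gradient
    d = -g                   # first direction: steepest descent
    for _ in range(iters):
        alpha = -(g @ d) / (d @ (A @ d))   # exact line search on a quadratic
        x = x + alpha * d
        g_new = A @ x - b
        if np.linalg.norm(g_new) < 1e-12:
            break
        # Polak-Ribiere: mix in the old direction, restarting when beta < 0
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        g = g_new
    return x
```

For non-quadratic MLP losses the exact line search is replaced by an inexact one, but the direction update is the same formula.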
Abstract: The segmentation of endovascular tools in fluoroscopy images can be performed accurately, automatically or with minimal user intervention, using known modern techniques; this has been demonstrated in the literature, but no clinical implementation exists so far because the computational time requirements of such technology have not yet been met. A classical segmentation scheme is composed of edge enhancement filtering, line detection, and segmentation. A new method is presented that consists of a vector that propagates in the image to track an edge as it advances. The filtering is performed progressively along the projected path of the vector, whose orientation allows for oriented edge detection, and only a minimal image area is filtered overall. Such an algorithm is rapidly computed and can be implemented in real-time applications. It was tested on medical fluoroscopy images from an endovascular cerebral intervention. Experiments showed that the 2D tracking was limited to guidewires without intersection crosspoints, while the 3D implementation was able to cope with such planar difficulties.
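The propagating-vector idea can be sketched as follows: from the current point, only a small window ahead of the projected path is examined, the strongest gradient response among candidates perpendicular to the motion is chosen, and the vector's orientation is updated. The window size and gradient measure below are illustrative assumptions; the paper's oriented filtering is more elaborate:

```python
import numpy as np

def edge_strength(img, q):
    # central-difference gradient magnitude at a (row, col) point
    r, c = int(round(q[0])), int(round(q[1]))
    if not (1 <= r < img.shape[0] - 1 and 1 <= c < img.shape[1] - 1):
        return -np.inf
    gr = img[r + 1, c] - img[r - 1, c]
    gc = img[r, c + 1] - img[r, c - 1]
    return float(np.hypot(gr, gc))

def track_edge(img, start, direction, steps=10):
    pt = np.array(start, float)
    d = np.array(direction, float)
    d /= np.linalg.norm(d)
    path = [(int(round(pt[0])), int(round(pt[1])))]
    for _ in range(steps):
        ahead = pt + d                      # projected path of the vector
        perp = np.array([-d[1], d[0]])      # search a 3-pixel window only
        cands = [ahead + k * perp for k in (-1, 0, 1)]
        best = max(cands, key=lambda q: edge_strength(img, q))
        d = best - pt
        d /= np.linalg.norm(d)              # orientation follows the edge
        pt = best
        path.append((int(round(pt[0])), int(round(pt[1]))))
    return path
```

Because only the candidate window is filtered at each step, the per-frame cost is proportional to the tool length rather than the image area, which is what makes real-time operation feasible. At a wire crossing, the candidates can respond equally to both wires, the 2D limitation noted above.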
Abstract: Laboratory activities have produced benefits in student learning. With the current drive toward new technology resources and an evolving era of education methods, the renewal of learning and teaching in laboratory methods is in progress, for both learners and educators. To enhance learning outcomes in laboratory work, particularly in engineering practice and testing, hands-on learning by instruction alone may not be sufficient. This paper describes and compares the techniques and implementation of traditional (expository) and open-ended (problem-based) laboratories for two consecutive cohorts studying an environmental laboratory course in a civil engineering program. The effects of the transition from traditional to problem-based laboratories were investigated in terms of a course-assessment student feedback survey, course outcome learning measurement, and student performance grades. Students demonstrated better performance in their grades and a 12% increase in the course outcome (CO) with the problem-based open-ended laboratory style compared with the traditional method, although in their feedback students responded less favorably to it.
Abstract: The growing influence of service industries has prompted greater attention to service operations management. However, service managers often have difficulty articulating the veritable effects of their service innovations. In particular, the performance evaluation of service innovation generally involves uncertain and imprecise data. This paper presents a 2-tuple fuzzy linguistic computing approach to dealing with heterogeneous information and information-loss problems during the integration of subjective evaluations. The proposed method, based on a group decision-making scenario, assists business managers in measuring the performance of service innovation; it handles the integration of heterogeneous information and effectively avoids information loss.
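The 2-tuple linguistic representation keeps a label index together with a symbolic translation in [-0.5, 0.5), so aggregation over the numeric scale loses no information to intermediate rounding. A minimal sketch in the style of Herrera and Martínez's model (the label set and scores are illustrative):

```python
def to_2tuple(beta):
    # Delta: a value beta on the [0, g] scale -> (label index, translation)
    i = int(round(beta))
    return i, beta - i

def aggregate(two_tuples):
    # Delta of the arithmetic mean of Delta^-1: average in the beta domain,
    # then convert back, so no information is lost to rounding mid-way
    beta = sum(i + a for i, a in two_tuples) / len(two_tuples)
    return to_2tuple(beta)

labels = ["none", "very low", "low", "medium", "high", "very high", "perfect"]
scores = [to_2tuple(b) for b in (4.0, 5.0, 3.2)]   # assumed expert ratings
label_index, translation = aggregate(scores)
# result: labels[label_index] plus a residual 'translation' toward a neighbor
```

The retained translation is what lets heterogeneous assessments (different experts, different granularities mapped to a common scale) be combined without the information loss the abstract refers to.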