Abstract: This paper deals with the application of Principal Component Analysis (PCA) and the Hotelling's T² chart, using data collected from a drinking water treatment process. PCA is applied primarily for the dimensional reduction of the collected data, while the Hotelling's T² control chart is used for fault detection in the process. The data were taken from a United Utilities multistage water treatment works and downloaded from an Integrated Program Management (IPM) dashboard system. The analysis of the results shows that Multivariate Statistical Process Control (MSPC) techniques such as PCA, and control charts such as Hotelling's T², can be effectively applied for the early fault detection of continuous multivariable processes such as drinking water treatment. The software package SIMCA-P was used to develop the MSPC models and the Hotelling's T² chart from the collected data.
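The PCA score-based T² statistic described above can be sketched as follows. This is a generic illustration in Python (not the SIMCA-P implementation); the in-control and faulty data, the number of retained components, and the magnitude of the fault shift are all hypothetical:

```python
import numpy as np

def hotelling_t2(X_train, X_new, n_components=2):
    """Hotelling's T-squared statistics for new observations against
    a PCA model fitted on in-control training data."""
    mu = X_train.mean(axis=0)
    Xc = X_train - mu
    # PCA via SVD of the mean-centered data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                             # loadings (d x k)
    lam = (s[:n_components] ** 2) / (len(X_train) - 1)  # component variances
    T = (X_new - mu) @ P                                # scores of new data
    return np.sum(T ** 2 / lam, axis=1)                 # T^2 per observation

rng = np.random.default_rng(0)
normal = rng.normal(size=(200, 5))          # hypothetical in-control data
faulty = normal[:5] + 8.0                   # hypothetical shifted observations
t2_normal = hotelling_t2(normal, normal[:5])
t2_fault = hotelling_t2(normal, faulty)
```

In practice the T² values would be compared against an F-distribution-based control limit to flag faults.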
Abstract: Cloud computing is an approach that provides computation and storage services on demand to clients over the network, independent of device and location. In the last few years, cloud computing has become a trend in information technology, with many companies transferring their business processes and applications to the cloud. Cloud computing with service-oriented architecture has contributed to the rapid development of Geographic Information Systems (GIS). The Open Geospatial Consortium (OGC) standards provide the interfaces through which hosted spatial data and GIS functionality can be integrated into GIS applications. Furthermore, with their enormous processing power, clouds provide an efficient environment in which data-intensive applications can be performed with higher precision and greater reliability. This paper presents our work on geospatial data services within the cloud computing environment and its technology. The cloud computing environment is introduced together with the strengths and weaknesses of the geographic information system. The OGC standards that address our application's interoperability are highlighted. Finally, we outline our system architecture with utilities for requesting and invoking our developed data-intensive applications as a web service.
Abstract: Developing techniques for mobile robot navigation constitutes one of the major trends in current research on mobile robotics. This paper develops a local model network (LMN) for mobile robot navigation. The LMN represents the mobile robot by a set of locally valid submodels that are Multi-Layer Perceptrons (MLPs), trained with the Back Propagation (BP) algorithm. The paper proposes fuzzy C-means (FCM) clustering in this scheme to divide the input space into subregions, after which a submodel (MLP) is identified to represent each particular region. The submodels are then combined in a unified structure. In the run-time phase, Radial Basis Functions (RBFs) are employed as windows for the activated submodels. The proposed structure overcomes the problem of the changing operating regions of mobile robots. Real data are used in all experiments. Results for mobile robot navigation using the proposed LMN reflect the soundness of the proposed scheme.
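As a simplified illustration of this scheme (not the authors' implementation), the sketch below partitions the input space with a minimal fuzzy C-means and blends locally valid submodels with Gaussian RBF windows; simple linear functions stand in for the trained MLP submodels, and the data, cluster count, and RBF width are hypothetical:

```python
import numpy as np

def fcm_centers(X, c=2, m=2.0, iters=50, seed=0):
    """Minimal fuzzy C-means: returns the c cluster centers of X."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                # fuzzy memberships
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))           # inverse-distance update
        U /= U.sum(axis=1, keepdims=True)
    return centers

def lmn_predict(x, centers, submodels, width=1.0):
    """Blend locally valid submodels with Gaussian RBF windows."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * width ** 2))
    w /= w.sum()                                     # normalized validity weights
    return sum(wi * f(x) for wi, f in zip(w, submodels))

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (50, 1)), rng.normal(5.0, 0.1, (50, 1))])
centers = fcm_centers(X, c=2)
centers = centers[np.argsort(centers[:, 0])]         # order: near 0, then near 5
submodels = [lambda x: 2 * x, lambda x: -x + 10]     # stand-ins for trained MLPs
y = lmn_predict(np.array([0.1]), centers, submodels)
```

Near the first operating region the first submodel dominates the blend, so the prediction follows it almost exactly.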
Abstract: The aim of this study was to compare the effects of an altitude training camp on heart rate variability and performance in elite triathletes. Ten athletes completed 20 days of live-high, train-low training at 1650 m. Athletes underwent pre- and post-camp 800-m swim time trials at sea level, and two heart rate variability tests at 1650 m on the first and last days of the training camp. Based on their time trial results, athletes were divided into responders and non-responders. Relative to the non-responders, the responders' sympathetic-to-parasympathetic ratio decreased substantially after 20 days of altitude training (-0.68 ± 1.08 and -1.2 ± 0.96, mean ± 90% confidence interval, for supine and standing respectively). In addition, sympathetic activity while standing was substantially lower post-altitude in the responders compared to the non-responders (-1869 ± 4764 ms²). The results indicate that responders demonstrated a shift to greater vagal predominance compared to non-responders.
Abstract: Many digital signal processing techniques have been used to automatically distinguish protein coding regions (exons) from non-coding regions (introns) in DNA sequences. In this work, we have characterized these sequences according to their nonlinear dynamical features, such as moment invariants, correlation dimension, and largest Lyapunov exponent estimates. We have applied our model to a number of real sequences encoded into time series using EIIP sequence indicators. In order to discriminate between coding and non-coding DNA regions, the phase space trajectory was first reconstructed for coding and non-coding regions. Nonlinear dynamical features were then extracted from those regions and used to investigate the differences between them. Our results indicate that the nonlinear dynamical characteristics yield significant differences between coding regions (CR) and non-coding regions (NCR) in DNA sequences. Finally, the classifier is tested on real genes where the coding and non-coding regions are well known.
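The EIIP encoding and the time-delay embedding used for phase-space reconstruction can be sketched as follows. The EIIP values are the commonly used electron-ion interaction potentials for the four nucleotides; the embedding dimension and lag below are illustrative choices, not those of the paper:

```python
import numpy as np

# EIIP (electron-ion interaction potential) values commonly used to
# map nucleotides to numbers; assumed here as the sequence indicator
EIIP = {'A': 0.1260, 'C': 0.1340, 'G': 0.0806, 'T': 0.1335}

def eiip_series(seq):
    """Encode a DNA string as a numerical (EIIP) time series."""
    return [EIIP[b] for b in seq.upper()]

def delay_embed(x, dim=3, tau=1):
    """Reconstruct a phase-space trajectory by time-delay embedding."""
    n = len(x) - (dim - 1) * tau
    return np.array([x[i:i + n] for i in range(0, dim * tau, tau)]).T

series = eiip_series("atgcgt")
traj = delay_embed(series, dim=2, tau=1)   # points in a 2-D phase space
```

Nonlinear features such as the correlation dimension or largest Lyapunov exponent would then be estimated from the embedded trajectory.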
Abstract: The technology of thin film deposition is of interest in many engineering fields, from electronic manufacturing to corrosion-protective coating. A typical deposition process, like that developed at the University of Eindhoven, considers the deposition of a thin, amorphous film of C:H or of Si:H on the substrate, using the Expanding Thermal Arc Plasma technique. In this paper a computing procedure is proposed to simulate the flow field in a deposition chamber similar to that at the University of Eindhoven, and a sensitivity analysis is carried out in terms of: precursor mass flow rate, electrical power supplied to the torch, and fluid-dynamic characteristics of the plasma jet, using different nozzles. For this purpose a deposition chamber similar in shape, dimensions and operating parameters to the above-mentioned chamber is considered. Furthermore, a method is proposed for a very preliminary evaluation of the film thickness distribution on the substrate. The computing procedure relies on two codes working in tandem; the output from the first code is the input to the second one. The first code simulates the flow field in the torch, where argon is ionized according to Saha's equation, and in the nozzle. The second code simulates the flow field in the chamber; due to the high rarefaction level, this is a commercial Direct Simulation Monte Carlo (DSMC) code. The gas is a mixture of 21 chemical species, and 24 chemical reactions involving argon plasma and acetylene are implemented in both codes. The effects of the above-mentioned operating parameters are evaluated and discussed by means of 2-D maps and profiles of some important thermo-fluid-dynamic parameters, such as Mach number, velocity and temperature. The intensity, position and extension of the shock wave are evaluated, and the influence of the above-mentioned test conditions on the film thickness and uniformity of distribution is also evaluated.
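For reference, a standard form of Saha's equation for single ionization under local thermodynamic equilibrium (the abstract does not give the exact form used in the first code) is:

```latex
\frac{n_e\, n_i}{n_0} \;=\; \frac{2\, g_i}{g_0}
\left(\frac{2\pi m_e k_B T}{h^2}\right)^{3/2}
\exp\!\left(-\frac{E_i}{k_B T}\right)
```

where n_e, n_i and n_0 are the number densities of electrons, ions and neutrals, g_i/g_0 is the ratio of the statistical weights of the ionized and neutral states, and E_i is the ionization energy of argon.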
Abstract: In today's information age, many organizations are still debating how to capitalize on the values of Information Technology (IT) and Knowledge Management (KM), from which individuals can benefit and through which effective communication among individuals can be established. IT enables positive improvement in communication among knowledge workers (k-workers) through a number of social network technology domains at the workplace. The acceptance of digital discourse in sharing knowledge and facilitating knowledge and information flows at most organizations indeed promotes a culture of knowledge sharing in Digital Social Networks (DSN). Therefore, this study examines whether k-workers with an IT background confer an effect on three knowledge characteristics -- conceptual, contextual, and operational. Derived from these three knowledge characteristics, five potential factors are examined for their effects on knowledge exchange via the e-mail domain as the chosen query. It is expected that the results could provide a parameter for exploring how DSN contributes to supporting k-workers' virtues, performance and qualities, as well as revealing the mutual point between IT and KM.
Abstract: Over the last two decades, owing to the hostile environment of the internet, concerns about the confidentiality of information have increased at a phenomenal rate. Therefore, to safeguard information from attacks, a number of data/information hiding methods have evolved, mostly in the spatial and transform domains. In spatial domain data hiding techniques, the information is embedded directly in the image plane itself. In transform domain data hiding techniques, the image is first converted from the spatial domain to some other domain and the secret information is then embedded, so that it remains more secure from any attack. Information hiding algorithms in the time or spatial domain have high capacity but relatively low robustness. In contrast, algorithms in transform domains such as the DCT and DWT have a certain robustness against some multimedia processing. In this work the authors propose a novel steganographic method for hiding information in the transform domain of a grayscale image. The proposed approach works by converting the gray level image to the transform domain using a discrete integer wavelet transform implemented through the lifting scheme. The approach performs a 2-D lifting wavelet decomposition of the cover image with the lifted Haar wavelet and computes the approximation coefficients matrix CA and the detail coefficients matrices CH, CV, and CD. The next step is to apply the PMM (pixel mapping method) technique to those coefficients to form the stego image. The aim of this paper is to propose a high-capacity image steganography technique that uses the pixel mapping method in the integer wavelet domain with acceptable levels of imperceptibility and distortion in the cover image and a high level of overall security. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
Abstract: Recent advances in both the testing and verification of software based on formal specifications of the system to be built have reached a point where the ideas can be applied in a powerful way in the design of agent-based systems. Software engineering research has highlighted a number of important issues: the importance of the type of modeling technique used; the careful design of the model to enable powerful testing techniques to be used; the automated verification of the behavioural properties of the system; and the need to provide a mechanism for translating the formal models into executable software in a simple and transparent way. This paper introduces the use of the X-machine formalism as a tool for modeling biology-inspired agents, proposing the use of the techniques built around X-machine models for the construction of effective and reliable agent-based software systems.
Abstract: This study explores the perceptions of English as a Foreign Language (EFL) learners on using computer-mediated communication (CMC) technology in their learning of English. The data consist of observations of both synchronous and asynchronous communication that participants engaged in over a period of four months, which included online and offline communication protocols, open-ended interviews, and reflection papers composed by the participants. Content analysis of the interview data and the written documents listed above, as well as member checking and triangulation techniques, are the major data analysis strategies. The findings suggest that participants generally do not benefit from computer-mediated communication in terms of its effect on learning a foreign language. Participants regarded the nature of CMC as artificial, or pseudo-communication, that did not aid their authentic communication skills in English. The results of this study shed light on the insufficient and inconclusive findings that most previous quantitative CMC studies have generated.
Abstract: This paper proposes new enhancement models for nonlinear anisotropic diffusion methods to greatly reduce speckle and preserve image features in medical ultrasound images. By incorporating local physical characteristics of the image, in this case scatterer density, in addition to the gradient, into existing tensor-based image diffusion methods, we were able to greatly improve the performance of the existing filtering methods, namely edge-enhancing (EE) and coherence-enhancing (CE) diffusion. The new enhancement methods were tested using various ultrasound images, including phantom and some clinical images, to determine the amount of speckle reduction and edge and coherence enhancement. Scatterer density weighted nonlinear anisotropic diffusion (SDWNAD) for ultrasound images consistently outperformed its traditional tensor-based counterparts that use the gradient alone to weight the diffusivity function. SDWNAD is shown to greatly reduce speckle noise while preserving image features such as edges, orientation coherence, and scatterer density. SDWNAD's superior performance over nonlinear coherent diffusion (NCD), speckle reducing anisotropic diffusion (SRAD), the adaptive weighted median filter (AWMF), wavelet shrinkage (WS), and wavelet shrinkage with contrast enhancement (WSCE) makes it an ideal preprocessing step for automatic segmentation in ultrasound imaging.
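As a minimal illustration of gradient-weighted nonlinear diffusion (the scalar Perona-Malik baseline, not the tensor-based, scatterer-density-weighted SDWNAD itself), one explicit diffusion step could look like this; the kappa and dt values are arbitrary, and borders are handled periodically via np.roll for brevity:

```python
import numpy as np

def pm_step(img, kappa=0.1, dt=0.2):
    """One explicit Perona-Malik diffusion step: the diffusivity decays
    with gradient magnitude, so edges are smoothed less than speckle."""
    # differences toward the four neighbors (periodic borders for brevity)
    dN = np.roll(img, 1, 0) - img
    dS = np.roll(img, -1, 0) - img
    dE = np.roll(img, -1, 1) - img
    dW = np.roll(img, 1, 1) - img
    g = lambda d: np.exp(-(d / kappa) ** 2)          # edge-stopping function
    return img + dt * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)

rng = np.random.default_rng(0)
noisy = 1.0 + rng.normal(0.0, 0.05, (32, 32))        # hypothetical speckled patch
out = pm_step(noisy)                                  # one smoothing iteration
```

SDWNAD would additionally modulate the diffusion tensor by a local scatterer-density estimate rather than by the gradient alone.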
Abstract: To meet the demands of wireless sensor networks (WSNs), where data are usually aggregated at a single source prior to transmission to any distant user, there is a need to establish a tree structure inside any given event region. In this paper, a novel technique to create such a tree is proposed. The tree conserves energy and maximizes the lifetime of event sources while they are constantly transmitting for data aggregation. The term Decentralized Lifetime Maximizing Tree (DLMT) is used to denote this tree. In DLMT, nodes with higher energy tend to be chosen as data-aggregating parents, so that the time to detect the first broken tree link can be extended and less energy is involved in tree maintenance. By constructing the tree in such a way, the protocol is able to reduce the frequency of tree reconstruction, minimize the amount of data loss, minimize the delay during data collection, and conserve energy.
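A greedy sketch of the parent-selection rule described above (nodes with higher residual energy tend to become data-aggregating parents) might look like this; the graph and energy values are hypothetical, and the real DLMT protocol is decentralized rather than computed at a single point:

```python
def build_dlmt(neighbors, energy, root):
    """Greedy, centralized sketch of DLMT construction: each node
    adjacent to the tree joins via its already-joined neighbor with
    the highest residual energy, so energy-rich nodes become parents."""
    parent = {root: None}
    frontier = [root]
    while frontier:
        nxt = []
        for u in frontier:
            for v in neighbors[u]:
                if v not in parent:
                    joined = [w for w in neighbors[v] if w in parent]
                    parent[v] = max(joined, key=lambda w: energy[w])
                    nxt.append(v)
        frontier = nxt
    return parent

# hypothetical event region: 's' is the sink/root of the aggregation tree
neighbors = {'s': ['a', 'b'], 'a': ['s', 'c'], 'b': ['s', 'c'], 'c': ['a', 'b']}
energy = {'s': 10, 'a': 5, 'b': 9, 'c': 1}
parent = build_dlmt(neighbors, energy, 's')
```

Node c has two candidate parents (a and b) and attaches to b, the one with more residual energy, which is the behavior the abstract attributes to DLMT.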
Abstract: Telemedicine is brought to life by contemporary changes in our world and encompasses the entire range of services at the crossroads of traditional healthcare and information technology. It is believed that eHealth can help in solving the critical issues of rising costs, care for an ageing and housebound population, and staff shortages. It is a feasible tool to provide routine as well as specialized health services, as it has the potential to improve both the access to and the standard of care. eHealth is no longer an optional choice. It has already come quite a way, but it still remains a fantastic challenge for the future, requiring cooperation and coordination at all possible levels. The strategic objectives of this paper are: 1. to start with an attempt to clarify the mass of terms used nowadays; 2. to answer the question "Who needs eHealth?"; 3. to focus on the necessity of bridging telemedicine and medical (health) informatics, as well as on the dual relationship between them; and 4. to underline the need for networking in understanding, developing and implementing eHealth.
Abstract: Under the limitation of an investment budget, a utility company is required to maximize the utilization of its existing assets during their life cycle while satisfying both engineering and financial requirements. However, utilities often lack knowledge about the status of each asset in the portfolio, in terms of either technical or financial value. This paper presents a knowledge-based model that helps utility companies make optimal decisions on the utilization of power transformers. The CommonKADS methodology, a structured development approach for knowledge and expertise representation, is used for designing and developing the knowledge-based model. A case study of a one-MVA power transformer of the Nepal Electricity Authority is presented. The results show that reusable knowledge can be categorized, modeled and utilized within the utility company using the proposed methodologies. Moreover, the results show that the utility company can achieve both engineering and financial benefits from its utilization.
Abstract: This paper presents a generalized form of the mechanistic deconvolution technique (GMD) for modeling image sensors applicable to various pan–tilt planes of view. The mechanistic deconvolution technique (UMD) is modified with the given angles of a pan–tilt plane of view to formulate constraint parameters and characterize distortion effects, and thereby determine the corrected image data. As a result, this does not require an experimental setup or calibration. Due to the mechanistic nature of the sensor model, the necessity for the sensor image plane to be orthogonal to its z-axis is eliminated, and the dependency on image data is reduced. An experiment was constructed to evaluate the accuracy of a model created by GMD and its insensitivity to changes in sensor properties and in pan and tilt angles. This was compared with a pre-calibrated model and a model created by UMD, using two sensors with different specifications. It achieved similar accuracy with one-seventh the number of iterations and attained a mean error lower by a factor of 2.4, compared to the pre-calibrated and UMD models respectively. The model has also shown itself to be robust and, in comparison to the pre-calibrated and UMD models, improved the accuracy significantly.
Abstract: The upgrading of low-quality crude natural gas (NG) is attracting interest due to the high demand for pipeline-grade gas in recent years. Membrane processes are a commercially proven technology for the removal of impurities such as carbon dioxide from NG. In this work, a cross-flow mathematical model is incorporated into ASPEN HYSYS as a user-defined unit operation in order to design a membrane system for CO2/CH4 separation. The effect of operating conditions (such as feed composition and pressure) and membrane selectivity on the design parameters (methane recovery and total membrane area required for the separation) has been studied for different design configurations. These configurations include single-stage (with and without recycle) and double-stage membrane systems (with and without permeate or retentate recycle). It is shown that methane recovery can be improved by recycling the permeate or retentate stream, as well as by using double-stage membrane systems. The ASPEN HYSYS user-defined unit operation proposed in this study has the potential to be applied to complex membrane system design and optimization.
Abstract: Among the numerous economic evaluation techniques currently available, Multi-criteria Spatial Analysis lends itself to solving localization problems of property complexes and, in particular, production plants. The methodology involves the use of Geographical Information Systems (GIS) and the mapping overlay technique, which overlaps the different information layers of a territory in order to obtain an overview of the parameters that characterize it. This first phase is used to detect possible settlement surfaces of a new agglomeration, subsequently selected through Analytic Hierarchy Process (AHP), so as to choose the best alternative. The result ensures the synthesis of a multidimensional profile that expresses both the quantitative and qualitative effects. Each criterion can be given a different weight.
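The AHP selection step mentioned above can be sketched with the geometric-mean approximation of the priority vector (the principal-eigenvector method is the classical alternative); the pairwise comparison matrix below is hypothetical:

```python
import numpy as np

def ahp_weights(A):
    """Approximate AHP priority weights from a pairwise comparison
    matrix using the geometric-mean (logarithmic least squares) method."""
    gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])   # row geometric means
    return gm / gm.sum()                            # normalize to sum to 1

# hypothetical comparison of three candidate sites on one criterion:
# site 1 is 3x preferred to site 2 and 5x preferred to site 3
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = ahp_weights(A)
```

In a full AHP, such weights would be computed per criterion and combined across the hierarchy, with a consistency ratio check on each comparison matrix.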
Abstract: Motion capture technology has been in use for quite a while, and considerable research has been done in this area. Nevertheless, we discovered open issues within current motion capture environments. In this paper we provide a state-of-the-art overview of the addressed research areas and show the issues with current motion capture environments. Observations, interviews and questionnaires have been used to reveal the challenges actors currently face in a motion capture environment. Furthermore, the idea of creating a more immersive motion capture environment to improve acting performances and motion capture outcomes is introduced as a potential solution. The goal is to explain the identified open issues and the developed ideas, which shall serve as a basis for further research. Moreover, a methodology to address the interaction and systems design issues is proposed. A future outcome could be that motion capture actors are able to perform more naturally, especially if a non-body-worn solution is used.
Abstract: In this study, a loop back algorithm for connected component labeling for detecting objects in a digital image is presented. The approach uses a loop back connected component labeling algorithm that helps the system distinguish detected objects according to their labels. Unlike the whole-window scanning technique, this technique reduces the search time for locating an object by focusing on suspected objects based on certain defined features. In this study, the approach was also implemented for a face detection system. Face detection is becoming an interesting research area, since there are many devices or systems that require detecting faces for certain purposes. The input can be a still image or video; therefore, the sub-processes of this system have to be simple, efficient and accurate to give a good result.
Abstract: The rapid growth of distance learning has made it important to conduct research on students' satisfaction with distance learning, because differences in students' satisfaction might influence educational opportunities for learning in a relevant Web-based environment. In line with this, this paper deals with the satisfaction of students with the distance module at the Faculty of Organizational Sciences (FOS) in Serbia, as well as some factors affecting differences in their satisfaction. We conducted a study on a population of 68 first-year students of distance learning studies at FOS. Using statistical techniques, we found that there is no significant difference in students' satisfaction with the distance learning module between men and women. We also concluded that there is a difference in satisfaction with the distance learning module with regard to students' perception of the opportunity to gain knowledge comparable to that of classroom students.