Abstract: This paper deals with the application of Principal Component Analysis (PCA) and the Hotelling's T2 chart, using data collected from a drinking water treatment process. PCA is applied primarily for the dimensional reduction of the collected data. The Hotelling's T2 control chart was used for fault detection in the process. The data were taken from a United Utilities multistage water treatment works, downloaded from an Integrated Program Management (IPM) dashboard system. The analysis of the results shows that Multivariate Statistical Process Control (MSPC) techniques such as PCA, and control charts such as Hotelling's T2, can be effectively applied for the early fault detection of continuous multivariable processes such as drinking water treatment. The software package SIMCA-P was used to develop the MSPC models and the Hotelling's T2 chart from the collected data.
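The monitoring scheme the abstract describes can be sketched as follows. This is a minimal illustration on synthetic data, not the SIMCA-P implementation: PCA is computed by SVD, and the T2 statistic is evaluated in the retained subspace with an empirical control limit.

```python
# Minimal sketch of PCA-based monitoring with a Hotelling T2 statistic.
# Synthetic data; all dimensions and thresholds are illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))            # 200 samples, 6 process variables

# Centre and scale, then PCA via SVD of the standardized data
mu, sigma = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sigma
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)

k = 2                                    # retained principal components
P = Vt[:k].T                             # loadings (6 x k)
lam = (S[:k] ** 2) / (len(X) - 1)        # variances of the retained scores

def t2(x):
    """Hotelling T2 of one observation in the k-dimensional PCA subspace."""
    t = ((x - mu) / sigma) @ P
    return float(np.sum(t ** 2 / lam))

scores = np.array([t2(x) for x in X])
limit = np.percentile(scores, 99)        # empirical 99% control limit
fault = mu + 10 * sigma @ np.eye(6) @ P[:, 0]  # abnormal shift along PC 1
```

An observation shifted far along a principal direction produces a T2 value well above the in-control limit, which is the fault-detection signal the chart relies on.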
Abstract: Sensitive and predictive DILI (Drug-Induced Liver Injury) biomarkers are needed in drug R&D to improve early detection of hepatotoxicity. The discovery of DILI biomarkers that demonstrate the predictive power to identify individuals at risk of DILI would represent a major advance in the development of personalized healthcare approaches. In this healthy-volunteer acetaminophen study (4 g/day for 7 days, with 3 monitored non-treatment days before and 4 after), 450 serum samples from 32 subjects were analyzed using protein profiling by antibody suspension bead arrays. Multiparallel protein profiles were generated using a DILI target protein array with 300 antibodies, where the antibodies were selected based on previous literature findings of putative DILI biomarkers and a screening process using pre-dose samples from the same cohort. Of the 32 subjects, 16 were found to develop an elevated ALT value (2x baseline; responders). Using the protein profiling approach together with multivariate statistical analysis, some novel findings linked to lipid metabolism were made and, more importantly, endogenous protein profiles in baseline samples (prior to treatment) with predictive power for ALT elevations were identified.
Abstract: Cloud computing is an approach that provides computation and storage services on demand to clients over the network, independent of device and location. In the last few years, cloud computing has become a trend in information technology, with many companies transferring their business processes and applications to the cloud. Cloud computing with service-oriented architecture has contributed to the rapid development of Geographic Information Systems (GIS). The Open Geospatial Consortium (OGC), with its standards, provides the interfaces for hosted spatial data and GIS functionality to be integrated into GIS applications. Furthermore, with their enormous processing power, clouds provide an environment in which data-intensive applications can be executed efficiently, with higher precision and greater reliability. This paper presents our work on geospatial data services within the cloud computing environment and its technology. The cloud computing environment is introduced, along with the strengths and weaknesses of running a geographic information system within it. The OGC standards that address our applications' interoperability are highlighted. Finally, we outline our system architecture, with utilities for requesting and invoking our developed data-intensive applications as a web service.
Abstract: While service quality is arguably the most valued aspect of the tourism industry, the issue of safety and security plays a key role in sustaining the industry's success. Such an issue has been part of Thailand's tourism development and promotion for several years. Evidently, the Tourist Police Department was set up for this purpose. Its main responsibility is to deal with international tourists' safety and confidence in travelling within Thai territory. However, to strengthen the tourism safety of the country, it is important to better understand international tourists' safety concerns about Thailand. This article seeks to compare international tourists' safety needs with Thai tourist police's perception of those needs, to determine what measures should be taken to assure tourists of Thailand's secure environment. Through the employment of quantitative and qualitative methodological approaches, the tourism safety needs of international tourists from Europe, North America and Asia were examined; how Thai tourist police and local police perceived international tourists' safety concerns was investigated; and opinions and experiences about how the police deal with international tourists' problems in eight touristic areas were also explored. A comparative result reveals certain degrees of difference between international tourists' safety needs and Thai police's perception of those needs. Tourism safety prevention and protection measures and practices are also suggested.
Abstract: Developing techniques for mobile robot navigation constitutes one of the major trends in current research on mobile robotics. This paper develops a local model network (LMN) for mobile robot navigation. The LMN represents the mobile robot by a set of locally valid submodels that are Multi-Layer Perceptrons (MLPs). Training these submodels employs the Back-Propagation (BP) algorithm. The paper proposes fuzzy C-means (FCM) clustering in this scheme to divide the input space into subregions, and a submodel (MLP) is then identified to represent each particular region. The submodels are then combined in a unified structure. In the run-time phase, Radial Basis Functions (RBFs) are employed as windows for the activated submodels. The proposed structure overcomes the problem of the changing operating regions of mobile robots. Real data are used in all experiments. Results for mobile robot navigation using the proposed LMN reflect the soundness of the proposed scheme.
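The local-model-network idea described above can be sketched as follows. This is a minimal illustration on synthetic data in which membership-weighted linear fits stand in for the paper's BP-trained MLP submodels; FCM partitions the input space and Gaussian RBF windows blend the submodels at run time.

```python
# Sketch of an LMN: FCM partitions the input space, one submodel is fitted
# per region, and RBF windows blend the submodels at run time.
# Linear submodels replace the paper's MLPs; data and widths are invented.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(300, 1))
y = np.sin(3 * X[:, 0])                      # stand-in for navigation data

def fcm(X, c=5, m=2.0, iters=60):
    """Plain fuzzy C-means; returns cluster centres and memberships."""
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-9
        inv = d ** (-2.0 / (m - 1))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centres, U

def fit_submodels(X, y, U):
    """One membership-weighted linear submodel per region."""
    Xa = np.hstack([X, np.ones((len(X), 1))])
    models = []
    for j in range(U.shape[1]):
        sw = np.sqrt(U[:, j])[:, None]
        b, *_ = np.linalg.lstsq(Xa * sw, y * sw[:, 0], rcond=None)
        models.append(b)
    return models

def predict(x, centres, models, width=0.3):
    """RBF windows activate and blend the locally valid submodels."""
    phi = np.exp(-np.sum((x - centres) ** 2, axis=1) / (2 * width ** 2))
    phi /= phi.sum()
    xa = np.append(x, 1.0)
    return float(sum(p * (xa @ b) for p, b in zip(phi, models)))

centres, U = fcm(X)
models = fit_submodels(X, y, U)
pred = np.array([predict(x, centres, models) for x in X])
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
```

The soft blending is what lets the structure cope with changing operating regions: as the input drifts from one region to another, the RBF weights hand over smoothly between submodels.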
Abstract: The purpose of this study is to compare and analyse the financial characteristics of development methods for urban development projects in established areas, focusing on multi-level replotting. The analysis showed that the type with the lowest expenditure was the 'combination type of group-land and multi-level replotting', and the type with the highest profitability was the 'multi-level replotting type'. However, the 'multi-level replotting type' still carries the risk of the cost of additional architecture. In addition, we subdivided the standard amount for liquidation of replotting and analysed the income-expenditure flow. The analysis showed that both the 'multi-level replotting type' and the 'combination type of group-land and multi-level replotting' improved the profitability of the project and the property change ratio. However, when the standard was under a certain amount, the amount of original property for the replotting increased exponentially, and the profitability of the project deteriorated.
Abstract: In this paper, a model of self-organizing spiking neural networks is introduced and applied to the mobile robot environment representation and path planning problem. A network of spike-response-model neurons with a recurrent architecture is used to create the robot's internal representation of the surrounding environment. The overall activity of the network simulates a self-organizing system with unsupervised learning. A modified A* algorithm is used to find the best path between starting and goal points using this internal representation. This method performs well for both known and unknown environments.
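The path planning step rests on A* search. As a reference point for the abstract's modified variant (whose modifications operate on the network's internal representation and are not detailed here), a standard A* over a simple occupancy grid can be sketched as follows; the grid and heuristic are illustrative assumptions.

```python
# Standard A* on a 0/1 occupancy grid with a Manhattan-distance heuristic.
# A baseline sketch only; the paper uses a modified A* on the spiking
# network's internal map.
import heapq

def astar(grid, start, goal):
    """Return the shortest free path from start to goal, or None."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier = [(h(start), start)]
    came, g = {}, {start: 0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:                       # reconstruct the path
            path = [cur]
            while cur in came:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if 0 <= nx < len(grid) and 0 <= ny < len(grid[0]) and not grid[nx][ny]:
                ng = g[cur] + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt], came[nxt] = ng, cur
                    heapq.heappush(frontier, (ng + h(nxt), nxt))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],     # a wall with one opening on the right
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
```

The admissible Manhattan heuristic guarantees the returned path is optimal; here the search must detour through the single opening in the wall.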
Abstract: The interaction model plays an important role in the Model-based Intelligent Interface Agent Architecture for developing Intelligent User Interfaces. In this paper we present some improvements in the algorithms for developing the interaction model of an interface agent, including: the action segmentation algorithm, the action pair selection algorithm, the final action pair selection algorithm, the interaction graph construction algorithm and the probability calculation algorithm. An analysis of the algorithms is also presented. At the end of this paper, we introduce an experimental program called "Personal Transfer System".
Abstract: Many digital signal processing techniques have been used to automatically distinguish protein-coding regions (exons) from non-coding regions (introns) in DNA sequences. In this work, we have characterized these sequences according to their nonlinear dynamical features, such as moment invariants, correlation dimension, and largest Lyapunov exponent estimates. We have applied our model to a number of real sequences encoded into time series using EIIP sequence indicators. In order to discriminate between coding and non-coding DNA regions, the phase space trajectory was first reconstructed for coding and non-coding regions. Nonlinear dynamical features were extracted from those regions and used to investigate the difference between them. Our results indicate that the nonlinear dynamical characteristics yield significant differences between coding regions (CR) and non-coding regions (NCR) in DNA sequences. Finally, the classifier is tested on real genes where the coding and non-coding regions are well known.
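The first two steps of the pipeline above, EIIP encoding and phase-space reconstruction, can be sketched as follows. The indicator values are the commonly cited EIIP (electron-ion interaction potential) constants; the toy sequence and embedding parameters are illustrative.

```python
# Encode a DNA string into a numerical time series with EIIP indicators,
# then reconstruct a phase-space trajectory by delay embedding.
EIIP = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

def encode(seq):
    """Map each nucleotide to its EIIP indicator value."""
    return [EIIP[base] for base in seq.upper()]

def embed(x, dim=3, tau=1):
    """Delay embedding: each point is (x[i], x[i+tau], ..., x[i+(dim-1)tau])."""
    n = len(x) - (dim - 1) * tau
    return [[x[i + j * tau] for j in range(dim)] for i in range(n)]

series = encode("ATGGCA")
trajectory = embed(series, dim=3, tau=1)
```

Nonlinear features such as the correlation dimension and the largest Lyapunov exponent are then estimated from the embedded trajectory rather than the raw series.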
Abstract: The technology of thin film deposition is of interest in many engineering fields, from electronic manufacturing to corrosion-protective coating. A typical deposition process, like that developed at the University of Eindhoven, considers the deposition of a thin, amorphous film of C:H or of Si:H on the substrate, using the Expanding Thermal arc Plasma technique. In this paper a computing procedure is proposed to simulate the flow field in a deposition chamber similar to that at the University of Eindhoven, and a sensitivity analysis is carried out in terms of precursor mass flow rate, electrical power supplied to the torch, and the fluid-dynamic characteristics of the plasma jet, using different nozzles. To this purpose a deposition chamber similar in shape, dimensions and operating parameters to the above-mentioned chamber is considered. Furthermore, a method is proposed for a very preliminary evaluation of the film thickness distribution on the substrate. The computing procedure relies on two codes working in tandem; the output from the first code is the input to the second one. The first code simulates the flow field in the torch, where argon is ionized according to the Saha equation, and in the nozzle. The second code simulates the flow field in the chamber; due to the high rarefaction level, this is a (commercial) Direct Simulation Monte Carlo code. The gas is a mixture of 21 chemical species, and 24 chemical reactions involving argon plasma and acetylene are implemented in both codes. The effects of the above-mentioned operating parameters are evaluated and discussed by means of 2-D maps and profiles of some important thermo-fluid-dynamic parameters, such as Mach number, velocity and temperature. The intensity, position and extension of the shock wave are evaluated, and the influence of the above-mentioned test conditions on the film thickness and uniformity of distribution is also evaluated.
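The ionization balance mentioned above, argon ionized according to the Saha equation, can be illustrated with a short worked computation. This is not the authors' code: the statistical-weight ratio for argon and the sample temperatures and density are assumptions for illustration, while the physical constants and the 15.76 eV argon ionization energy are standard values.

```python
# Saha ionization balance for argon:
#   n_e * n_i / n_0 = 2 (g_i/g_0) (2 pi m_e k T / h^2)^(3/2) exp(-E_i / k T)
# Solving x^2/(1-x) = S/n for the ionization fraction x (n_e = n_i = x n).
import math

KB = 1.380649e-23       # Boltzmann constant, J/K
ME = 9.1093837015e-31   # electron mass, kg
H = 6.62607015e-34      # Planck constant, J s
EV = 1.602176634e-19    # J per eV

def saha_rhs(T, E_ion_eV=15.76, g_ratio=6.0):
    """Right-hand side S of the Saha equation [m^-3]; g_ratio is assumed."""
    return (2.0 * g_ratio * (2 * math.pi * ME * KB * T / H**2) ** 1.5
            * math.exp(-E_ion_eV * EV / (KB * T)))

def ionization_fraction(T, n_total):
    """Positive root of x^2 + s x - s = 0 with s = S / n_total."""
    s = saha_rhs(T) / n_total
    return (-s + math.sqrt(s * s + 4 * s)) / 2.0

x_cold = ionization_fraction(3000.0, 1e22)    # near the chamber walls
x_hot = ionization_fraction(15000.0, 1e22)    # inside the arc
```

The steep exponential makes the plasma essentially neutral at a few thousand kelvin and almost fully ionized in the arc, which is why the torch and chamber flows call for separate codes.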
Abstract: We propose an enhanced collaborative filtering method using Hofstede's cultural dimensions, calculated for 111 countries. We employ 4 of these dimensions, which are correlated with customers' buying behavior, in order to detect users' preferences for items. In addition, several advantages of this method are demonstrated for data sparseness and cold-start users, which are important challenges in collaborative filtering. We present experiments using a real dataset, the Book-Crossing dataset. Experimental results show that the proposed algorithm provides significant advantages in terms of improving recommendation quality.
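One way the above can work is to blend rating-based user similarity with similarity of the users' national cultural-dimension scores, so that cold-start users with few ratings still get meaningful neighbours. The sketch below is a hypothetical illustration, not the paper's algorithm: the dimension values, users, and the blending weight are all invented.

```python
# Hypothetical blend of rating similarity and Hofstede-dimension similarity.
# All numbers below are illustrative; real Hofstede scores are published
# per country.
import math

hofstede = {"US": [91, 40, 62, 46], "JP": [46, 54, 95, 92]}  # 4 dimensions
ratings = {"u1": {"b1": 5, "b2": 3}, "u2": {"b1": 4, "b2": 2}, "u3": {"b2": 5}}
country = {"u1": "US", "u2": "US", "u3": "JP"}

def cosine(a, b):
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den

def rating_sim(u, v):
    """Cosine similarity over co-rated items; 0 if none exist."""
    common = sorted(set(ratings[u]) & set(ratings[v]))
    if not common:
        return 0.0
    return cosine([ratings[u][i] for i in common],
                  [ratings[v][i] for i in common])

def cultural_sim(u, v):
    return cosine(hofstede[country[u]], hofstede[country[v]])

def blended_sim(u, v, alpha=0.5):
    """Cultural similarity cushions sparse-data and cold-start cases."""
    return alpha * rating_sim(u, v) + (1 - alpha) * cultural_sim(u, v)
```

When two users share no rated items, `rating_sim` collapses to zero and the cultural term alone carries the neighbourhood, which is the cold-start advantage the abstract claims.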
Abstract: This research focuses on the development of an intrusion detection system (IDS) that uses an artificial immune system (AIS) with population-based incremental learning (PBIL). The AIS has a powerful ability to eliminate antigens when they intrude into the human body. PBIL adjusts new learning based on past learning experience. We therefore propose an intrusion detection system called PBIL-AIS, which combines the two evolutionary computing approaches of PBIL and AIS. In the AIS part we design three mechanisms, namely clonal selection, negative selection and antibody level, to intensify AIS performance. Experimental results show that our PBIL-AIS IDS achieves high accuracy when intrusion connections attack.
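The PBIL half of the hybrid can be sketched on its own: a probability vector is sampled to produce a population, and is nudged toward the best individual each generation. The fitness function below is a toy one-max problem; the AIS mechanisms (clonal selection, negative selection, antibody levels) are not reproduced here.

```python
# Core PBIL loop: sample a population from a probability vector, then move
# the vector toward the best individual. Toy one-max fitness; parameters
# are illustrative.
import random

random.seed(0)
N_BITS, POP, LR = 16, 20, 0.1

def sample(p):
    """Draw one bit string from the probability vector."""
    return [1 if random.random() < pi else 0 for pi in p]

def fitness(x):
    """Toy objective: number of ones (stand-in for detection accuracy)."""
    return sum(x)

p = [0.5] * N_BITS                         # start fully undecided
for _ in range(100):
    pop = [sample(p) for _ in range(POP)]
    best = max(pop, key=fitness)
    # PBIL update: convex step toward the generation's best individual
    p = [(1 - LR) * pi + LR * bi for pi, bi in zip(p, best)]

best_fit = fitness(max((sample(p) for _ in range(10)), key=fitness))
```

Because the update is a convex combination, each probability stays in [0, 1] and gradually commits toward the bit values that keep winning, which is the "past learning experience" the abstract refers to.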
Abstract: In today's information age, many organizations are still debating how to capitalize on the values of Information Technology (IT) and Knowledge Management (KM), from which individuals can benefit and through which effective communication among individuals can be established. IT enables positive improvement in communication among knowledge workers (k-workers) across a number of social network technology domains in the workplace. The acceptance of digital discourse in the sharing of knowledge and in facilitating knowledge and information flows at most organizations indeed fosters a culture of knowledge sharing in Digital Social Networks (DSN). Therefore, this study examines whether k-workers with an IT background confer an effect on three knowledge characteristics: conceptual, contextual, and operational. Derived from these three knowledge characteristics, five potential factors will be examined for their effects on knowledge exchange via the e-mail domain as the chosen query. It is expected that the results could provide a parameter for exploring how DSN contributes to supporting k-workers' virtues, performance and qualities, as well as revealing the mutual point between IT and KM.
Abstract: Over the last two decades, due to the hostility of the internet environment, concerns about the confidentiality of information have increased at a phenomenal rate. Therefore, to safeguard information from attacks, a number of data/information hiding methods have evolved, mostly in the spatial and transform domains. In spatial-domain data hiding techniques, the information is embedded directly in the image plane itself. In transform-domain data hiding techniques, the image is first changed from the spatial domain to some other domain, and the secret information is then embedded so that it remains more secure against attack. Information hiding algorithms in the time or spatial domain have high capacity but relatively low robustness. In contrast, algorithms in transform domains such as the DCT and DWT have a certain robustness against some multimedia processing. In this work the authors propose a novel steganographic method for hiding information in the transform domain of a grayscale image. The proposed approach works by converting the gray-level image to the transform domain using a discrete integer wavelet transform through a lifting scheme. The approach performs a 2-D lifting wavelet decomposition of the cover image through the Haar lifted wavelet and computes the approximation coefficients matrix CA and the detail coefficients matrices CH, CV, and CD. The next step is to apply the PMM (pixel mapping method) technique to those coefficients to form the stego image. The aim of this paper is to propose a high-capacity image steganography technique that uses the pixel mapping method in the integer wavelet domain with acceptable levels of imperceptibility and distortion in the cover image and a high level of overall security. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
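One level of the integer Haar transform via lifting (the S-transform) can be sketched as follows. The predict/update steps use only integer arithmetic, so the inverse is exact, which is what makes the scheme suitable for lossless embedding; the PMM embedding step itself is not reproduced here, and the pixel values are illustrative.

```python
# One level of the integer Haar wavelet transform via lifting.
# Integer-only steps make the transform exactly invertible.
def haar_lift(x):
    """Forward lifting on an even-length list: returns (approx, detail)."""
    even, odd = x[0::2], x[1::2]
    d = [o - e for e, o in zip(even, odd)]           # predict step
    a = [e + (di >> 1) for e, di in zip(even, d)]    # update step (floor div)
    return a, d

def haar_unlift(a, d):
    """Exact inverse: undo the update, then the predict step."""
    even = [ai - (di >> 1) for ai, di in zip(a, d)]
    odd = [e + di for e, di in zip(even, d)]
    out = []
    for e, o in zip(even, odd):
        out += [e, o]
    return out

pixels = [52, 55, 61, 66, 70, 61, 64, 73]   # one illustrative image row
a, d = haar_lift(pixels)
restored = haar_unlift(a, d)
```

Applying these same steps first along the rows and then along the columns of an image yields the CA, CH, CV and CD sub-bands named in the abstract, and perfect invertibility guarantees the cover can be reconstructed exactly around the embedded payload.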
Abstract: A 3.5-bit stage of a CMOS pipelined ADC is proposed. In this report, the main parts of the 3.5-bit ADC stage are introduced, and the operation and design of the MDAC, comparator and encoder are shown in detail. In addition, an OTA used in the fully differential pipelined ADC is described. Using a gain-boost architecture with a differential amplifier, this OTA achieves high gain and high speed. The design uses a 0.18 um CMOS process and was simulated in Cadence. The simulation results show that the OTA has a gain of up to 80 dB and a unity-gain bandwidth of about 1.138 GHz with a 2 pF load.
Abstract: Recent advances in both the testing and verification of software based on formal specifications of the system to be built have reached a point where the ideas can be applied in a powerful way to the design of agent-based systems. Software engineering research has highlighted a number of important issues: the importance of the type of modeling technique used; the careful design of the model to enable powerful testing techniques to be used; the automated verification of the behavioural properties of the system; and the need to provide a mechanism for translating the formal models into executable software in a simple and transparent way. This paper introduces the use of the X-machine formalism as a tool for modeling biology-inspired agents, proposing the use of the techniques built around X-machine models for the construction of effective and reliable agent-based software systems.
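For readers unfamiliar with the formalism, a stream X-machine is essentially a finite state machine whose transitions are functions that read an input, update a memory value, and emit an output. The sketch below is an illustrative toy (a foraging agent), not one of the paper's agent models; all names are invented.

```python
# Minimal stream X-machine: states, a memory value, and transition
# functions that map (memory, input) -> (new memory, output).
class XMachine:
    def __init__(self, memory, initial):
        self.transitions = {}          # (state, input) -> (next_state, func)
        self.state, self.memory = initial, memory

    def add(self, state, symbol, next_state, func):
        self.transitions[(state, symbol)] = (next_state, func)

    def step(self, symbol):
        next_state, func = self.transitions[(self.state, symbol)]
        self.memory, out = func(self.memory, symbol)
        self.state = next_state
        return out

# Toy biology-inspired agent: counts "food" percepts while foraging.
m = XMachine(memory=0, initial="search")
m.add("search", "food", "search", lambda mem, s: (mem + 1, "eat"))
m.add("search", "nest", "home", lambda mem, s: (mem, "rest"))

outputs = [m.step(s) for s in ["food", "food", "nest"]]
```

Because both the control states and the memory-updating functions are explicit, the same model supports the testing and verification techniques the paper discusses.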
Abstract: This study explores the perceptions of English as a Foreign Language (EFL) learners on using computer-mediated communication (CMC) technology in their learning of English. The data consist of observations of both the synchronous and asynchronous communication that participants engaged in over a period of 4 months, which included online and offline communication protocols, open-ended interviews and reflection papers composed by participants. Content analysis of the interview data and the written documents listed above, as well as member-checking and triangulation techniques, are the major data analysis strategies. The findings suggest that participants generally do not benefit from computer-mediated communication in terms of its effect on learning a foreign language. Participants regarded the nature of CMC as artificial, or pseudo-communication, that did not aid their authentic communication skills in English. The results of this study shed light on the insufficient and inconclusive findings that most previous quantitative CMC studies have generated.
Abstract: A new approach to the protection of power transformers is presented using a time-frequency transform known as the wavelet transform. Different operating conditions, such as inrush, normal load, external fault and internal fault currents, are sampled and processed to obtain wavelet coefficients. Different operating conditions produce variations in the wavelet coefficients. Features such as energy and standard deviation are calculated using Parseval's theorem. These features are used as inputs to a PNN (probabilistic neural network) for fault classification. The proposed algorithm provides more accurate results even in the presence of noisy inputs and accurately identifies inrush and fault currents. The overall classification accuracy of the proposed method is found to be 96.45%. Simulation of the faults (with and without noise) was done using MATLAB and Simulink software, taking a 2-cycle data window (40 ms) containing 800 samples. The algorithm was evaluated using 10% Gaussian white noise.
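The feature-extraction step above can be sketched as follows. An orthonormal Haar DWT stands in for whichever mother wavelet the paper uses (an assumption), and per-band energy and standard deviation are computed; with an orthonormal transform, Parseval's theorem guarantees the coefficient energies sum to the signal energy. The 50 Hz current waveform is illustrative.

```python
# Energy and standard-deviation features from a multilevel Haar DWT of a
# current window; Parseval's theorem links coefficient and signal energy.
import numpy as np

def haar_dwt(x):
    """One level of an orthonormal Haar DWT: (approximation, detail)."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def features(x, levels=3):
    """(energy, std) of each detail band, plus the final approximation."""
    feats = []
    for _ in range(levels):
        x, d = haar_dwt(x)
        feats.append((float(np.sum(d ** 2)), float(np.std(d))))
    return feats, x

t = np.linspace(0, 0.04, 800, endpoint=False)   # 2 cycles of 50 Hz, 800 samples
current = np.sin(2 * np.pi * 50 * t)            # illustrative current waveform
feats, approx = features(current)

# Parseval check: total coefficient energy equals signal energy
energy_signal = float(np.sum(current ** 2))
energy_coeffs = sum(e for e, _ in feats) + float(np.sum(approx ** 2))
```

Feature vectors of this kind, one per operating condition, are what the PNN classifier is trained on.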
Abstract: To meet the demands of wireless sensor networks (WSNs), where data are usually aggregated at a single source prior to transmission to any distant user, there is a need to establish a tree structure inside any given event region. In this paper, a novel technique to create one such tree is proposed. This tree preserves energy and maximizes the lifetime of event sources while they are constantly transmitting for data aggregation. The term Decentralized Lifetime-Maximizing Tree (DLMT) is used to denote this tree. In DLMT, nodes with higher energy tend to be chosen as data-aggregating parents, so that the time to detect the first broken tree link can be extended and less energy is involved in tree maintenance. By constructing the tree in such a way, the protocol is able to reduce the frequency of tree reconstruction, minimize the amount of data loss, minimize the delay during data collection, and preserve energy.
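The parent-selection rule the abstract describes can be sketched as follows. This is a hypothetical reading, not the paper's protocol: among a node's neighbours that are closer to the sink, the one with the highest residual energy becomes the aggregating parent. Node names, energies and hop counts are invented.

```python
# Energy-aware parent choice: each node picks, among neighbours nearer the
# sink, the one with the most residual energy. All values illustrative.
nodes = {  # node -> (residual energy in J, hops to sink)
    "A": (2.0, 1), "B": (3.5, 1), "C": (1.2, 2), "D": (0.8, 2),
}
links = {"A": [], "B": [], "C": ["A", "B"], "D": ["B"]}  # neighbour lists

def choose_parent(node):
    """Highest-energy neighbour that is strictly closer to the sink."""
    _, hops = nodes[node]
    candidates = [n for n in links[node] if nodes[n][1] < hops]
    return max(candidates, key=lambda n: nodes[n][0]) if candidates else None

parents = {n: choose_parent(n) for n in nodes}
```

Favouring high-energy parents is what delays the first broken link: the parent role drains energy fastest, so assigning it to the best-provisioned neighbour stretches the tree's lifetime and reduces how often it must be rebuilt.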
Abstract: Telemedicine has been brought to life by contemporary changes in our world and encompasses the entire range of services at the crossroads of traditional healthcare and information technology. It is believed that eHealth can help solve the critical issues of rising costs, care for an ageing and housebound population, and staff shortages. It is a feasible tool for providing routine as well as specialized health services, as it has the potential to improve both access to and the standard of care. eHealth is no longer an optional choice. It has already come quite a way, but it still remains a formidable challenge for the future, requiring cooperation and coordination at all possible levels. The strategic objectives of this paper are: 1. to start with an attempt to clarify the mass of terms used nowadays; 2. to answer the question "Who needs eHealth?"; 3. to focus on the necessity of bridging telemedicine and medical (health) informatics, as well as on the dual relationship between them; and 4. to underline the need for networking in understanding, developing and implementing eHealth.