Abstract: Experience indicates that one of the most prominent reasons some ERP implementations fail is the selection of an improper ERP package. Among the factors leading to inappropriate ERP selections is the neglect of preliminary activities that should be carried out before ERP packages are evaluated. Another factor is that organizations often employ selection processes so prolonged and costly that the process is sometimes never finalized, or the evaluation team performs many key final activities incompletely or inaccurately due to exhaustion, lack of interest, or out-of-date data. This paper introduces a systematic approach for choosing an ERP package that recommends activities to be performed before and after the main selection phase. The proposed approach also incorporates ideas that accelerate the selection process while reducing the probability of an erroneous final selection.
Abstract: A new method for low-complexity image coding is presented that permits different settings and great scalability in the generation of the final bit stream. The coder is a continuous-tone still-image compression system that combines lossy and lossless compression using finite-arithmetic reversible transforms: both the color-space transformation and the wavelet transformation are reversible. The transformed coefficients are coded by a scheme based on a subdivision into smaller components (CFDS), similar to bit-significance coding. The resulting subcomponents are reordered by a highly configurable, application-dependent alignment system that makes it possible to rearrange the elements of the image and obtain different importance levels from which the bit stream is generated. The subcomponents of each importance level are coded with a variable-length entropy coder (VBLm) that produces an embedded bit stream. This bit stream is in itself a bit stream encoding a compressed still image; however, applying a packing system to the bit stream after the VBLm stage yields a final, highly scalable bit stream consisting of a basic image level and one or several enhancement levels.
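As a concrete illustration of a reversible (integer, lossless) color-space transform of the kind the abstract mentions, the JPEG 2000 reversible color transform (RCT) can be sketched as follows. This is a standard textbook example, not necessarily the exact transform used in the paper:

```python
def rct_forward(r, g, b):
    # JPEG 2000-style reversible color transform (integer arithmetic only):
    # a lossless approximation of an RGB -> luma/chroma decorrelation.
    y = (r + 2 * g + b) >> 2
    u = r - g
    v = b - g
    return y, u, v

def rct_inverse(y, u, v):
    # Exact inverse: floor((u + v)/4) cancels the rounding of the forward
    # step, so every integer RGB triple round-trips losslessly.
    g = y - ((u + v) >> 2)
    return u + g, g, v + g
```

Because (r + 2g + b) = (r + b - 2g) + 4g, the floor division by 4 commutes with the integer offset, which is what makes the transform exactly invertible despite the rounding.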
Abstract: We present a dextran-modified silicon microring resonator sensor for high-density antibody immobilization. An array of sensors consisting of three sensor rings and a reference ring was fabricated, and its surface sensitivity and limit of detection were obtained using polyelectrolyte multilayers. The mass sensitivity and limit of detection of the fabricated sensor ring are 0.35 nm/(ng/mm2) and 42.8 pg/mm2 in air, respectively. A dextran-modified sensor surface was successfully prepared by covalently grafting oxidized dextran onto a 3-aminopropyltriethoxysilane (APTES)-modified silicon sensor surface. Antibody immobilization on the hydrogel dextran matrix improves by 40% compared with the traditional immobilization method via APTES and glutaraldehyde linkage.
Abstract: This paper presents a vocoder that obtains high-quality synthetic speech at 600 bps. To reduce the bit rate, the algorithm is based on a sinusoidally excited linear prediction model that extracts a small number of coding parameters; three consecutive frames are grouped into a superframe and jointly vector-quantized to obtain high coding efficiency. Inter-frame redundancy is exploited with distinct quantization schemes for the different unvoiced/voiced frame combinations in the superframe. Experimental results show that the quality of the proposed coder is better than that of the 2.4 kbps LPC10e coder and approximately the same as that of the 2.4 kbps MELP coder, with high robustness.
Abstract: Clustering techniques have been used by many intelligent software agents to group similar access patterns of Web users into high-level themes that express users' intentions and interests. However, such techniques have mostly focused on one salient feature of the Web documents visited by the user, namely the extracted keywords, and their major aim is to find an optimal threshold for the number of keywords needed to produce more focused themes. In this paper we consider both keyword and similarity thresholds to generate more concentrated themes, and hence build a sounder model of user behavior. The purpose of this paper is twofold: to use distance-based clustering methods to recognize overall themes from the proxy log file, and to suggest efficient cut-off levels for the keyword and similarity thresholds that tend to produce clusters with better focus and efficient size.
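A minimal sketch of distance-based clustering that applies both a keyword threshold (top-N keywords per session) and a similarity threshold might look like this. The greedy grouping, the cosine measure, and all names are illustrative assumptions, not the paper's algorithm:

```python
import math

def cosine_sim(a, b):
    # Cosine similarity between two keyword-frequency dictionaries.
    dot = sum(a[k] * b[k] for k in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cluster_sessions(sessions, keyword_threshold, sim_threshold):
    # Keep only the top keyword_threshold keywords of each session, then
    # greedily attach each session to the first cluster whose seed is at
    # least sim_threshold similar; otherwise start a new cluster.
    trimmed = [
        dict(sorted(s.items(), key=lambda kv: -kv[1])[:keyword_threshold])
        for s in sessions]
    clusters = []
    for s in trimmed:
        for c in clusters:
            if cosine_sim(s, c[0]) >= sim_threshold:
                c.append(s)
                break
        else:
            clusters.append([s])
    return clusters
```

Sweeping `keyword_threshold` and `sim_threshold` over a grid and scoring the resulting clusters is one simple way to search for the cut-off levels the abstract refers to.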
Abstract: This work proposes an accurate crosstalk-noise estimation method in the presence of multiple RLC lines for use in design automation tools. The method correctly models the loading effects of non-switching aggressors and aggressor tree branches using the resistive shielding effect and realistic exponential input waveforms. Noise peak and width expressions are derived, and the results are in good agreement with SPICE. The average error is 4.7% for the noise peak and 6.15% for the width, while allowing very fast analysis.
Abstract: In this paper, collocation-based cubic B-spline and extended cubic uniform B-spline methods are considered for solving the one-dimensional heat equation with a nonlocal initial condition. A finite-difference scheme and a θ-weighted scheme are used for the time and space discretization, respectively. The stability of the method is analyzed by the von Neumann method, and the accuracy of the methods is illustrated with an example. The numerical results are obtained and compared with the analytical solutions.
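The paper's B-spline collocation machinery is beyond the scope of an abstract, but the θ-weighted time discretization it mentions can be illustrated with a plain finite-difference sketch for u_t = u_xx with zero Dirichlet boundaries. The grid sizes, time step, and Thomas tridiagonal solver below are illustrative choices, not the paper's method:

```python
import math

def theta_heat_step(u, r, theta):
    # One theta-weighted time step for u_t = u_xx on a uniform interior grid
    # with zero Dirichlet boundary values; r = dt / dx**2.
    n = len(u)
    rhs = [0.0] * n
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        rhs[i] = u[i] + (1 - theta) * r * (left - 2 * u[i] + right)
    # Solve (I - theta*r*A) v = rhs, A the second-difference matrix,
    # with the Thomas algorithm for tridiagonal systems.
    sub = [-theta * r] * n
    diag = [1 + 2 * theta * r] * n
    sup = [-theta * r] * n
    for i in range(1, n):
        m = sub[i] / diag[i - 1]
        diag[i] -= m * sup[i - 1]
        rhs[i] -= m * rhs[i - 1]
    v = [0.0] * n
    v[-1] = rhs[-1] / diag[-1]
    for i in range(n - 2, -1, -1):
        v[i] = (rhs[i] - sup[i] * v[i + 1]) / diag[i]
    return v

# Advance sin(pi*x) to t = 0.1; the exact solution is exp(-pi**2 t) sin(pi*x).
nx = 49                                       # interior points; dx = 1/(nx+1)
dx, dt, theta = 1.0 / (nx + 1), 1e-3, 0.5     # theta = 0.5 is Crank-Nicolson
u = [math.sin(math.pi * (i + 1) * dx) for i in range(nx)]
for _ in range(100):
    u = theta_heat_step(u, dt / dx ** 2, theta)
mid = u[nx // 2]                              # grid point at x = 0.5
exact = math.exp(-math.pi ** 2 * 0.1)
```

θ = 0 gives the explicit scheme, θ = 1 fully implicit, and θ = 0.5 the second-order Crank-Nicolson scheme whose von Neumann amplification factor has modulus at most one for any r.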
Abstract: Given constraints on data availability, for the study of power system stability it is adequate to model the synchronous generator with the field circuit and one equivalent damper on the q-axis, known as model 1.1. This paper presents a systematic procedure for modelling and simulating a single-machine infinite-bus power system installed with a thyristor controlled series compensator (TCSC), where the synchronous generator is represented by model 1.1, so that the impact of the TCSC on power system stability can be more reasonably evaluated. The model of the example power system is developed in MATLAB/SIMULINK and can be used for teaching power system stability phenomena, as well as for research, especially for developing generator controllers using advanced technologies. Further, the parameters of the TCSC controller are optimized using a genetic algorithm. Non-linear simulation results are presented to validate the effectiveness of the proposed approach.
Abstract: This paper presents the identification of an impact force acting on a simply supported beam. Force identification is an inverse problem in which the measured response of the structure is used to determine the applied force. The identification problem is formulated as an optimization problem, and a genetic algorithm is used to solve it. The objective function is based on the difference between the analytical and measured responses, and the decision variables are the location and magnitude of the applied force. Simulation results show the effectiveness of the approach and its robustness to measurement noise and sensor location.
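The inverse-problem setup can be sketched with a toy static analogue: recover the location and magnitude of a point load on a simply supported beam from a few deflection readings, using a simple genetic algorithm. The beam properties, sensor positions, GA operators, and the static (rather than impact) load model are all illustrative assumptions, not the paper's formulation:

```python
import random

E_I, L = 2.0e6, 10.0   # assumed flexural rigidity (N*m^2) and beam span (m)

def deflection(x, a, P):
    # Closed-form static deflection of a simply supported beam under a
    # point load P applied at distance a from the left support.
    b = L - a
    if x <= a:
        return P * b * x * (L * L - b * b - x * x) / (6 * L * E_I)
    return deflection(L - x, L - a, P)   # mirror for points right of the load

sensors = [2.0, 4.0, 6.0, 8.0]                            # assumed sensor positions
measured = [deflection(x, 3.0, 500.0) for x in sensors]   # "true" force: a=3, P=500

def fitness(ind):
    # Sum of squared differences between analytical and measured responses.
    a, P = ind
    return sum((deflection(x, a, P) - m) ** 2 for x, m in zip(sensors, measured))

random.seed(1)
pop = [(random.uniform(0.5, 9.5), random.uniform(0.0, 1000.0)) for _ in range(60)]
for _ in range(200):
    pop.sort(key=fitness)
    elite = pop[:20]
    children = []
    for _ in range(40):
        p1, p2 = random.sample(elite, 2)
        a = 0.5 * (p1[0] + p2[0]) + random.gauss(0.0, 0.1)   # blend crossover
        P = 0.5 * (p1[1] + p2[1]) + random.gauss(0.0, 5.0)   # plus Gaussian mutation
        children.append((min(max(a, 0.5), 9.5), min(max(P, 0.0), 1000.0)))
    pop = elite + children
best = min(pop, key=fitness)
```

The GA needs no gradient of the response with respect to the force location, which is the usual motivation for using it on this kind of inverse problem.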
Abstract: Despite the extensive use of eLearning systems, there is no consensus on a standard framework for evaluating the quality of such systems; hence, only a minimal set of tools exists to support this judgment and give information about the value of course content. This paper presents two kinds of quality evaluation indicators for eLearning courses based on the computation of three well-known metrics: the Euclidean, Hamming, and Levenshtein distances. The "distance" computation is applied to standard evaluation templates (i.e. the European Commission Programme procedures vs. the AFNOR Z 76-001 standard), determining a reference point for evaluating e-learning course quality against the optimal concept(s). A case study, based on the results of projects developed within the framework of the European "Leonardo da Vinci" Programme with Romanian contractors, attempts to demonstrate the benefits of the method.
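The three metrics the abstract names are standard and easy to state precisely. A self-contained sketch (the application to evaluation templates is the paper's contribution and is not reproduced here):

```python
import math

def euclidean(a, b):
    # Euclidean distance between two numeric score vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def hamming(a, b):
    # Number of positions at which two equal-length sequences differ.
    return sum(x != y for x, y in zip(a, b))

def levenshtein(s, t):
    # Classic dynamic-programming edit distance, row by row.
    prev = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        cur = [i]
        for j, ct in enumerate(t, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (cs != ct)))    # substitution
        prev = cur
    return prev[-1]
```

Euclidean distance suits numeric scores, Hamming suits equal-length categorical checklists, and Levenshtein additionally tolerates insertions and deletions between templates of different lengths.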
Abstract: A straightforward and intuitive combination of single simulations into an aggregated master simulation is not trivial: many problems trigger specific difficulties during the modeling and execution of such a simulation. In this paper we identify these problems and aim to solve them by mapping the task to the field of multi-agent systems. The solution is a new meta-model named AGENTMAP, which mitigates most of the problems while supporting intuitive modeling. The meta-model is introduced and explained on the basis of an example from the e-commerce domain.
Abstract: Over the last few decades, the academic debate on the effects of social geography on opportunities for socioeconomic integration has intensified. On one hand, it has been discussed how the urban structure and social geography affect not only the way people interact, but also their chances of social and economic integration. On the other hand, it has also been discussed how the urban structure is itself constrained and transformed by the action of social actors. Without questioning the powerful influence of structural factors related to the logic of the production system, labor markets, education, and training, research has shown the role played by place of residence in shaping individual outcomes such as unemployment. In the context of this debate, the importance of the territory of residence with respect to the problem of unemployment has been highlighted.
Although unemployment statistics have already demonstrated the unequal incidence of the phenomenon across social groups, its uneven territorial impact at the intra-urban level remains relatively unknown.
The purpose of this article is to show and interpret the spatial patterns of unemployment in the city of Porto using geographic information system (GIS) technology. Overlaying the spatial patterns of unemployment with the spatial distribution of social housing allows a discussion of the relationship between these patterns and the reasons that might explain the relative immutability of socioeconomic problems in some neighborhoods.
Abstract: Existing work in temporal logic on representing the execution of infinitely many transactions uses linear-time temporal logic (LTL) and models only two-step transactions. In this paper, we use the comparatively efficient branching-time computation tree logic CTL and extend the transaction model to a class of multi-step transactions, introducing distinguished propositional variables to represent the read and write steps of n multi-step transactions accessing m data items infinitely many times. We prove that the well-known correspondence between acyclicity of conflict graphs and serializability for finite schedules extends to infinite schedules. Furthermore, when transactions access the same set of data items in (possibly) different orders, serializability corresponds to the absence of cycles of length two. This result is used to give an efficient encoding of the serializability condition into CTL.
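For finite schedules, the conflict-graph test underlying the correspondence can be sketched directly; the schedule encoding below is an illustrative assumption, and the paper's CTL encoding for infinite schedules is of course not captured by it:

```python
def conflict_graph(schedule):
    # schedule: list of (txn, action, item) with action "r" or "w".
    # Edge Ti -> Tj when an operation of Ti precedes a conflicting
    # operation of Tj (same item, different txns, at least one write).
    edges = set()
    for i, (ti, ai, xi) in enumerate(schedule):
        for tj, aj, xj in schedule[i + 1:]:
            if ti != tj and xi == xj and "w" in (ai, aj):
                edges.add((ti, tj))
    return edges

def has_cycle(edges):
    # Depth-first search with gray/black coloring to detect a cycle;
    # the schedule is conflict-serializable iff no cycle exists.
    graph = {}
    for u, v in edges:
        graph.setdefault(u, []).append(v)
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {}

    def visit(u):
        color[u] = GRAY
        for v in graph.get(u, []):
            state = color.get(v, WHITE)
            if state == GRAY or (state == WHITE and visit(v)):
                return True
        color[u] = BLACK
        return False

    return any(color.get(u, WHITE) == WHITE and visit(u) for u in graph)
```

In the special case the abstract highlights (all transactions touching the same item set), any cycle can be shortened to length two, so checking pairs of transactions suffices.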
Abstract: In this paper, we consider the global exponential stability of the equilibrium point of Hopfield neural networks with delays and impulsive perturbations. New exponential stability criteria for the system are derived using the Lyapunov functional method and a linear matrix inequality approach for estimating the upper bound of the derivative of the Lyapunov functional. Finally, two numerical examples illustrate the effectiveness of the theoretical results.
Abstract: The objective of this paper is to present a research study of the convectors used for heating or cooling living rooms or industrial halls. The key points are experimental measurement and comprehensive numerical simulation of the flow through the parts of the convector, such as the heat exchanger, the fan inlet, etc. From the results obtained, the components of the convector are optimized to increase thermal power efficiency by improving heat convection or reducing air drag. Both optimizations lead to more effective service conditions and to energy savings. A significant part of the convector research is the design of a unique measurement laboratory and the adoption of measurement techniques. The new laboratory makes it possible to measure thermal power efficiency and other relevant parameters under the specific service conditions of the convectors.
Abstract: Their batch nature limits standard kernel principal component analysis (KPCA) methods in numerous applications, especially for dynamic or large-scale data. In this paper, an efficient adaptive approach is presented for the online extraction of kernel principal components (KPC). The contribution of this paper is twofold. First, the kernel covariance matrix is correctly updated to adapt to the changing characteristics of the data. Second, the KPC are recursively formulated to overcome the batch nature of standard KPCA. This formulation is derived from the recursive eigen-decomposition of the kernel covariance matrix and captures the KPC variation caused by new data. The proposed method not only alleviates the sub-optimality of KPCA for non-stationary data, but also maintains constant update speed and memory usage as the data size increases. Experiments on simulated data and real applications demonstrate that our approach yields improvements in both computational speed and approximation accuracy.
Abstract: WLAN positioning has been addressed by many approaches in the literature, using the characteristics of received signal strength (RSS), time of arrival (TOA) or time difference of arrival (TDOA), angle of arrival (AOA), and cell ID. Among these, the RSS approach is the simplest to implement because it requires no modification of either access points or client devices, but its accuracy is poor due to the physical environment. For TOA or TDOA approaches the accuracy is quite acceptable, but most work has to modify either software or hardware in the existing WLAN infrastructure, with modifications ranging from the access card alone up to changes in the WLAN protocol; this makes TOA or TDOA unattractive for positioning systems. In this paper, a new concept merging RSS and TOA positioning techniques is proposed, together with a method to obtain the TOA characteristic for positioning a WLAN user without any extra modification to the existing system. Measurement results confirm that the proposed technique, using both RSS and TOA characteristics, provides better accuracy than either the RSS or the TOA approach alone.
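How each measurement type yields a range, and how three ranges locate a user, can be sketched as follows. The path-loss parameters (`p0_dbm`, exponent `n`) and the linearized trilateration are textbook assumptions; the paper's actual RSS/TOA fusion method is not reproduced here:

```python
def toa_distance(t_seconds):
    # Range from a one-way time-of-arrival measurement (speed of light).
    return 3.0e8 * t_seconds

def rss_distance(rss_dbm, p0_dbm=-40.0, n=2.0):
    # Log-distance path-loss model: RSS = P0 - 10*n*log10(d), where P0 is
    # the received power at 1 m and n the path-loss exponent (both assumed).
    return 10 ** ((p0_dbm - rss_dbm) / (10 * n))

def trilaterate(anchors, dists):
    # Subtract the first circle equation from the other two to obtain a
    # linear 2x2 system in (x, y), then solve it by Cramer's rule.
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1 ** 2 - d2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    b2 = d1 ** 2 - d3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det
```

With noisy ranges from both RSS and TOA for the same access points, one simple fusion is to trilaterate with a weighted combination of the two distance estimates per anchor.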
Abstract: Collateralized debt obligations (CDOs) are not as widely used today as they were before the 2007 subprime crisis. Nonetheless, optimizing the cash flows associated with synthetic CDOs remains an enthralling challenge. A Gaussian-based model is used here, in which default correlation and unconditional probabilities of default are highlighted. Numerous simulations are then performed with this model for different scenarios in order to evaluate the associated cash flows, given a specific number of defaults at different periods of time. Cash flows are calculated not on a single bought or sold tranche but on a combination of bought and sold tranches. Under some assumptions, the simplex algorithm provides a way to find the maximum cash flow as a function of default correlation and maturities. The Gaussian model used is not realistic in crisis situations, and the present system handles only whole tranches, not portions of a tranche. Nevertheless, the work provides the investor with relevant elements for deciding what to buy and sell, and when.
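The Gaussian machinery underlying such models can be sketched with a one-factor copula default simulator and a simplified tranche-loss payoff. The parameters, the payoff formula, and the use of Python's `statistics.NormalDist` are illustrative assumptions; the paper's simplex optimization over bought and sold tranches would sit on top of a simulator like this:

```python
import random
import statistics

def simulate_defaults(n_names, p_default, rho, n_trials, seed=0):
    # One-factor Gaussian copula: name i defaults in a trial when
    #   sqrt(rho)*Z + sqrt(1 - rho)*eps_i < Phi^{-1}(p_default),
    # where Z is the common market factor and eps_i is idiosyncratic noise.
    random.seed(seed)
    threshold = statistics.NormalDist().inv_cdf(p_default)
    counts = []
    for _ in range(n_trials):
        z = random.gauss(0.0, 1.0)
        n_def = sum(
            rho ** 0.5 * z + (1 - rho) ** 0.5 * random.gauss(0.0, 1.0) < threshold
            for _ in range(n_names))
        counts.append(n_def)
    return counts

def tranche_loss(n_defaults, n_names, lgd, attach, detach):
    # Fraction of a tranche's notional wiped out, given a pool loss of
    # n_defaults/n_names * lgd and the attachment/detachment points.
    pool_loss = n_defaults / n_names * lgd
    return min(max(pool_loss - attach, 0.0), detach - attach) / (detach - attach)
```

The correlation parameter rho drives the fat tail of the default-count distribution, which is exactly what makes senior versus equity tranche cash flows behave so differently.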
Abstract: Semiconductor detector arrays are widely used in high-temperature plasma diagnostics. Their fast response allows the observation of many processes and instabilities in tokamaks. This paper reviews several diagnostics based on semiconductor arrays, such as cameras, AXUV photodiodes (often referred to as fast "bolometers"), and detectors of both soft X-rays and visible light, recently installed on the COMPASS tokamak. Fresh results from the 2012 spring and summer campaigns are presented. Examples of the utilization of the detectors are shown for plasma shape determination, fast calculation of the radiation center, two-dimensional plasma radiation tomography in different spectral ranges, observation of impurity inflow, and the investigation of MHD activity in COMPASS tokamak discharges.
Abstract: Thailand's health system is challenged by a rising number of patients and a decreasing ratio of medical practitioners to patients, especially in rural areas. This may tempt inexperienced GPs to rush through the process of anamnesis, with the risk of incorrect diagnosis. Patients have to travel far to the hospital and wait a long time to present their case, and many try to cure themselves with traditional Thai medicine. Many countries are making use of the Internet for gathering, distributing, and storing medical information. Telemedicine applications are a relatively new field of study in Thailand, where the ICT infrastructure has hampered widespread use of the Internet for medical information; with recent improvements, health and technology professionals can develop novel applications and systems to advance telemedicine for the benefit of the people. Here we explore the use of telemedicine for people with health problems in rural areas of Thailand and present a Telemedicine Diagnosis System for Rural Thailand (TEDIST) for diagnosing certain conditions, which people with Internet access can use to establish contact with Community Health Centers, e.g. by mobile phone. The system uses a Web-based input method for individual patients' symptoms, which are passed to an expert system for the analysis of conditions and likely diseases. The analysis harnesses a knowledge base and a backward-chaining component to determine which health professionals should be presented with the case. Doctors have the opportunity to exchange emails or chat with the patients they are responsible for, or with other specialists. Patients' data are then stored in a Personal Health Record.
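The backward-chaining component can be illustrated with a minimal inference engine. The triage rules and symptom names below are hypothetical examples, not taken from TEDIST's knowledge base:

```python
def backward_chain(goal, rules, facts, _seen=None):
    # Try to prove `goal`: it is either a known fact, or the conclusion of
    # some rule whose premises can all be proved recursively.
    _seen = _seen or set()
    if goal in facts:
        return True
    if goal in _seen:          # cycle guard against circular rules
        return False
    seen = _seen | {goal}
    return any(
        head == goal and all(backward_chain(p, rules, facts, seen) for p in body)
        for head, body in rules)

# Hypothetical triage rules: (conclusion, [premises]).
rules = [
    ("refer_to_cardiologist", ["possible_heart_issue"]),
    ("possible_heart_issue", ["chest_pain", "shortness_of_breath"]),
    ("refer_to_gp", ["fever", "cough"]),
]
facts = {"chest_pain", "shortness_of_breath"}
```

Starting from the goal and working back to the reported symptoms is what lets such a system ask only for the facts that are actually relevant to the referral decision.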