Abstract: In this research, a latent class vector model for pairwise data is formulated. Compared with the basic vector model, this model yields consistent parameter estimates, since the number of parameters to be estimated does not increase with the number of subjects. The results of the analysis reveal that the model is stable and can classify each subject into one of the latent classes representing the typical scales used by the subjects.
Abstract: The accuracy of stability and control derivatives of a light
aircraft estimated from flight test data was evaluated. The light
aircraft, named ChangGong-91, is the first aircraft certified by the
Korean government. The output error method, a maximum likelihood
estimation technique that considers measurement noise only, was used to
analyze the aircraft response measurements. Multi-step control inputs
were applied in order to excite the short-period mode for the
longitudinal motion and the Dutch-roll mode for the lateral-directional
motion. The estimated stability and control derivatives of ChangGong-91
were analyzed for the assessment of handling qualities by comparing
them with those of similar aircraft. The accuracy of the derivative
estimates obtained from the flight test measurements was examined using
engineering judgment, scatter, and the Cramer-Rao bound, and turned out
to be satisfactory with minor defects.
Abstract: Through the 1980s, management accounting researchers
described the increasing irrelevance of traditional control and
performance measurement systems. The Balanced Scorecard (BSC)
is a critical business tool for many organizations. It is a
performance measurement system that translates mission and
strategy into objectives. The strategy map approach is a development
of the BSC in which certain necessary causal relations must be
established. To recognize these relations, experts usually rely on
experience; it is also possible to use regression for the same
purpose. Structural Equation Modeling (SEM), one of the most powerful
methods of multivariate data analysis, obtains more appropriate
results than traditional methods such as regression. In the present
paper, we propose, for the first time, using SEM to identify the
relations between objectives in the strategy map, together with a test
to measure the importance of those relations. In SEM, factor analysis
and hypothesis testing are performed in a single analysis, and SEM is
known to be better than other techniques at supporting analysis and
reporting. Our approach provides a framework that permits experts to
design the strategy map by applying a comprehensive and scientific
method together with their experience. This scheme is therefore more
reliable than previously established methods.
Abstract: As a data structure for string processing problems, the
suffix array is widely known and extensively studied. However, if the
string access pattern follows the "90/10" rule, a suffix array cannot
take advantage of the fact that we often search for something we have
just found. Although the splay tree is an efficient data structure for
small documents when the access pattern follows the "90/10" rule, it
requires many auxiliary structures and an excessive amount of pointer
manipulation to process and search large documents efficiently. In
this paper, we propose a new and conceptually powerful data structure
for string search, called the splay suffix array (SSA). This data
structure combines the features of splay trees and suffix arrays into
a new approach suitable for implementation on both conventional and
clustered computers.
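To make the baseline concrete, a minimal plain suffix array with binary
search is sketched below; the splay layer that would exploit "90/10"
access locality, and the SSA itself, are not reproduced, so the
functions here are illustrative, not the paper's implementation.

```python
# Minimal suffix-array construction and pattern search (toy baseline;
# the O(n^2 log n) build is for illustration only).

def build_suffix_array(s):
    # Sort suffix start positions by the suffixes they denote.
    return sorted(range(len(s)), key=lambda i: s[i:])

def search(s, sa, pattern):
    """Return one index where pattern occurs in s, or -1."""
    lo, hi = 0, len(sa)
    while lo < hi:                       # find first suffix >= pattern
        mid = (lo + hi) // 2
        if s[sa[mid]:sa[mid] + len(pattern)] < pattern:
            lo = mid + 1
        else:
            hi = mid
    if lo < len(sa) and s[sa[lo]:].startswith(pattern):
        return sa[lo]
    return -1

s = "banana"
sa = build_suffix_array(s)               # [5, 3, 1, 0, 4, 2]
assert search(s, sa, "ana") in (1, 3)    # "ana" occurs at 1 and 3
assert search(s, sa, "zzz") == -1
```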
Abstract: The increasing popularity of wireless technologies
and mobile computing devices has enabled new application areas and
research. One of these new areas is pervasive systems in urban
environments, because urban environments are characterized by a high
concentration of these technologies and devices. In this paper we
show the process of designing a pervasive system for urban
environments, using a local zoo in Cali, Colombia, as a use case.
Based on an ethnographic study, we present the design of a pervasive
system for urban computing, based on a service-oriented architecture,
for the controlled environment of the Cali Zoo. The reader will find a
methodological approach for the design of similar systems, covering
data collection methods, conceptual frameworks for urban environments,
and considerations for the analysis and design of service-oriented
systems.
Abstract: Mobile laptop users need efficient access to their home
personal data or to the Internet from anywhere in the world,
regardless of their location or point of attachment, especially while
roaming outside the home subnet. An efficient interpretation of the
packet-loss problem encountered during such roaming is the central
concern of this work. The main previous works related to this problem,
such as BER-systems, Amigos, and the ns-2 implementation, are reviewed
and discussed. Their drawbacks and limitations, namely that they stop
at monitoring and do not provide an actual solution for eliminating or
even restricting these losses, are pointed out. In addition, we
present the framework around which we built a Triple-R sequence as a
cost-effective solution that eliminates packet losses and bridges the
gap between subnets, an area that until now has been largely
neglected. The results show that, in addition to the high bit error
rate of wireless mobile networks, the low efficiency of the mobile-IP
registration procedure is a direct cause of these packet losses.
Furthermore, the interpretation of packet losses yielded an
illustrative triangle of the registration process, which will be
further researched and analyzed in our future work.
Abstract: The objective of this paper is to design a pattern
classification model based on the back-propagation (BP) algorithm for
decision support systems. The standard BP model fully connects every
node in each layer from the input layer to the output layer. It
therefore requires a great deal of computing time and many training
iterations to achieve good performance and an acceptable error rate
when generating patterns or training the network.
The proposed model, in contrast, uses exclusive connections between
hidden-layer nodes and output nodes. Its advantages are fewer
iterations and better performance compared with the standard
back-propagation model. We simulated several classification cases with
different network settings (e.g. number of hidden layers and nodes,
number of classes, and number of iterations). In our simulations, we
found that in most cases the BP model with exclusive connections
outperformed the standard BP model. We expect this algorithm to be
applicable to user face identification, data analysis, and mapping
between environmental data and information.
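The exclusive hidden-to-output connectivity can be sketched by masking
a standard weight matrix; the interpretation that each hidden node
feeds exactly one output node is an assumption of this illustration,
not a detail given in the abstract.

```python
import numpy as np

# Contrast a fully connected hidden-to-output layer (standard BP) with
# an "exclusive" one where each hidden node connects to one output.
rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 4, 6, 2
W1 = rng.normal(size=(n_in, n_hidden))     # input -> hidden (full)
W2 = rng.normal(size=(n_hidden, n_out))    # hidden -> output (full)

# Assumed exclusive scheme: hidden node h feeds output h % n_out only.
mask = np.zeros_like(W2)
for h in range(n_hidden):
    mask[h, h % n_out] = 1.0
W2_exclusive = W2 * mask                   # fewer effective weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W2_layer):
    hidden = sigmoid(x @ W1)
    return sigmoid(hidden @ W2_layer)

x = rng.normal(size=(1, n_in))
y_full = forward(x, W2)             # standard BP connectivity
y_excl = forward(x, W2_exclusive)   # exclusive connectivity
```

With the mask the output layer trains 6 weights instead of 12, which
is where the reduction in per-iteration cost would come from.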
Abstract: A number of competing methodologies have been developed
to identify genes and classify DNA sequences into coding
and non-coding sequences. This classification process is fundamental
in gene finding and gene annotation tools and is one of the most
challenging tasks in bioinformatics and computational biology. An
information-theoretic measure based on mutual information has shown
good accuracy in classifying DNA sequences into coding and non-coding.
In this paper we describe a species independent iterative
approach that distinguishes coding from non-coding sequences using
the mutual information measure (MIM). A set of sixty prokaryotes is
used to extract universal training data. To facilitate comparisons with
the published results of other researchers, a test set of 51 bacterial
and archaeal genomes was used to evaluate MIM. These results
demonstrate that MIM produces superior results while remaining
species independent.
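A minimal sketch of a mutual-information feature of the kind MIM relies
on, under the assumption (not stated above) that it is computed between
nucleotides k positions apart; in coding DNA the codon structure tends
to elevate this statistic at k = 3.

```python
from collections import Counter
import math

# Plug-in estimate of mutual information I(k) between nucleotides
# separated by k positions (an illustrative feature, not the authors'
# exact MIM definition).

def mutual_information(seq, k):
    singles = Counter(seq)
    pairs = Counter((seq[i], seq[i + k]) for i in range(len(seq) - k))
    n, n_pairs = len(seq), len(seq) - k
    mi = 0.0
    for (a, b), c in pairs.items():
        p_ab = c / n_pairs
        p_a = singles[a] / n
        p_b = singles[b] / n
        mi += p_ab * math.log2(p_ab / (p_a * p_b))
    return mi

# Toy sequence with strong periodic (codon-like) structure.
coding_like = "ATGGCCATTGTAATG" * 20
print(mutual_information(coding_like, 3))   # clearly above zero
```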
Abstract: Atmospheric stability plays the most important role in
the transport and dispersion of air pollutants. Different methods are
used for stability determination with varying degrees of complexity.
Most of these methods are based on the relative magnitude of
convective and mechanical turbulence in atmospheric motions.
The Richardson number, Monin-Obukhov length, Pasquill-Gifford
stability classification and Pasquill-Turner stability classification
are the most common parameters and methods. The Pasquill-Turner
Method (PTM), which is employed in this study, uses observations of
wind speed, insolation and the time of day to classify atmospheric
stability into distinguishable indices. In this study, a model is
presented to determine atmospheric stability conditions using PTM. As
a case study, meteorological data from Mehrabad station in Tehran from
2000 to 2005 are applied to the model. Three different categories are
considered to deduce the pattern of stability conditions. First, the
overall pattern of stability classification is obtained; the results
show that the atmosphere is in stable, neutral and unstable conditions
38.77%, 27.26% and 33.97% of the time, respectively. It is also
observed that days are mostly unstable (66.50%) while nights are
mostly stable (72.55%). Second, monthly and seasonal patterns are
derived; the results indicate that the relative frequency of stable
conditions decreases from January to June and increases from June to
December, while the results for unstable conditions show exactly the
opposite pattern. Autumn is the most stable season, with a relative
frequency of stable conditions of 50.69%, compared with 42.79%, 34.38%
and 27.08% for winter, summer and spring, respectively. The hourly
stability pattern is the third category; it shows that unstable
conditions dominate from approximately 03-15 GMT in warm seasons and
04-12 GMT in cold seasons. Finally, the correlation between
atmospheric stability and CO concentration is obtained.
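A simplified daytime Pasquill-type lookup from wind speed and
insolation can be sketched as follows; the thresholds follow the
commonly reproduced Pasquill-Gifford table, while the full PTM used in
the study also accounts for time of day and night-time cloud
conditions, which are omitted here.

```python
# Simplified daytime Pasquill stability lookup (classes A = very
# unstable ... D = neutral). Where the published table gives split
# classes (e.g. A-B), this sketch picks one class for simplicity.

def pasquill_day(wind_ms, insolation):
    """wind_ms: surface wind speed in m/s;
    insolation: 'strong', 'moderate', or 'slight'."""
    table = {
        'strong':   [(2, 'A'), (3, 'A'), (5, 'B'), (6, 'C'), (99, 'C')],
        'moderate': [(2, 'A'), (3, 'B'), (5, 'B'), (6, 'C'), (99, 'D')],
        'slight':   [(2, 'B'), (3, 'C'), (5, 'C'), (6, 'D'), (99, 'D')],
    }
    for limit, cls in table[insolation]:
        if wind_ms < limit:
            return cls
    return 'D'

print(pasquill_day(1.5, 'strong'))   # light wind, strong sun -> 'A'
print(pasquill_day(7.0, 'moderate')) # strong wind -> near-neutral 'D'
```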
Abstract: In this study, a system of encryption based on chaotic
sequences is described. The system is used for encrypting digital
image data for the purpose of secure image transmission. An image
secure communication scheme based on Logistic map chaotic
sequences with a nonlinear function is proposed in this paper.
Encryption and decryption keys are obtained from a one-dimensional
Logistic map, which generates the secret key used as input to the
nonlinear function. The receiver can recover the information from the
received signal and the identical key sequence through the
inverse-system technique. The results of computer simulations indicate
that the transmitted source image can be correctly and reliably
recovered using the proposed scheme, even over a noisy channel. The
performance of the system is discussed by evaluating the quality of
the recovered image with and without channel noise.
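A minimal sketch of the general idea, with assumed details: a
Logistic-map keystream quantized to bytes and XORed with the image
data. The paper's actual nonlinear function and key schedule are not
reproduced here; quantizing the orbit to a byte stands in for them.

```python
# Logistic-map stream cipher sketch: x_{n+1} = r * x_n * (1 - x_n).
# The pair (x0, r) plays the role of the secret key.

def logistic_keystream(x0, r, n, burn_in=100):
    x = x0
    for _ in range(burn_in):            # discard the transient
        x = r * x * (1.0 - x)
    stream = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        stream.append(int(x * 256) % 256)  # quantize orbit to a byte
    return stream

def xor_cipher(data, x0=0.3, r=3.99):
    ks = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

plain = bytes(range(16))                # stand-in for image pixel bytes
cipher = xor_cipher(plain)
assert xor_cipher(cipher) == plain      # identical key sequence decrypts
```

Because XOR is its own inverse, the receiver recovers the image by
regenerating the same keystream from the shared (x0, r).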
Abstract: Optical character recognition of cursive scripts
presents a number of challenging problems in both segmentation and
recognition processes in different languages, including Persian. In
order to overcome these problems, we use a newly developed Persian
word segmentation method together with a recognition-based
segmentation technique. This method is robust as well as flexible, and
it increases the system's tolerance to font variations. The results of
implementing this method on a comprehensive database show a high
degree of accuracy that meets the requirements for commercial use.
Extended with suitable pre- and post-processing, the method offers a
simple and fast framework for developing a full OCR system.
Abstract: In this paper we present a photo mosaic smartphone
application for client-server based large-scale image databases. Photo
mosaic is not a new concept, but there are very few smartphone
applications, especially ones that handle a huge number of images in a
client-server environment. To support large-scale image databases,
we first propose an overall framework working as a client-server
model. We then present a concept of image-PAA features to efficiently
handle a huge number of images and discuss its lower bounding
property. We also present a best-match algorithm that exploits the
lower bounding property of image-PAA. We finally implement an
efficient Android-based application and demonstrate its feasibility.
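The lower-bounding property the best-match algorithm exploits can be
illustrated with generic one-dimensional PAA; image-PAA itself is not
reproduced, and the sequences below merely stand in for pixel data.

```python
import math

# PAA (Piecewise Aggregate Approximation): reduce a sequence to m
# segment means, then measure distance in the reduced space. The
# reduced-space distance never exceeds the true Euclidean distance,
# so it can safely prune candidates in a best-match search.

def paa(seq, m):
    """Reduce seq to m segment means (len(seq) divisible by m)."""
    w = len(seq) // m
    return [sum(seq[i * w:(i + 1) * w]) / w for i in range(m)]

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def paa_dist(pa, pb, n):
    w = n // len(pa)                 # segment width
    return math.sqrt(w * sum((x - y) ** 2 for x, y in zip(pa, pb)))

q = [3.0, 4.0, 1.0, 0.0, 2.0, 2.0, 5.0, 4.0]
c = [1.0, 1.0, 2.0, 3.0, 0.0, 1.0, 4.0, 4.0]
lb = paa_dist(paa(q, 4), paa(c, 4), len(q))
true = euclid(q, c)
assert lb <= true + 1e-9             # lower bound never overestimates
```

Candidates whose PAA distance already exceeds the best match so far
can be discarded without computing the full distance.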
Abstract: This paper presents an approach that hybridizes two or more
artificial intelligence (AI) techniques to fuzzify the work-stress
level ranking and categorize the rating accordingly. A hybrid approach
combining two or more techniques has been considered here because
combining different techniques may neutralize each other's weaknesses,
generating a superior hybrid solution. Recent research has shown a
need for more valid and reliable tools for assessing work stress;
artificial intelligence techniques are therefore applied here to
provide a solution to a psychological application. This paper also
gives an overview of a novel, autonomous, interactive model for
analysing work stress that has been developed using multi-agent
systems, and describes the establishment of the intelligent
multi-agent decision analyser (IMADA), which uses a hybridized
technique of neural networks and fuzzy logic within the multi-agent
based framework.
Abstract: In the current Grid environment, efficient workload
management presents a significant challenge, and a profusion of de
facto standards has emerged encompassing resource discovery,
brokerage, and data transfer, among others. In addition, the real-time
resource status, essential for an optimal resource allocation
strategy, is often not readily accessible. To address these issues and
provide a cleaner abstraction of the Grid, with the potential to
generalize to arbitrary resource-sharing environments, this paper
proposes a new Condor-based pilot mechanism applied in the PanDA
architecture, PanDA-PF WMS, with the goal of providing a more generic
yet efficient resource allocation strategy. In this architecture, the
PanDA server acts primarily as a repository of user jobs, responding
to pilot requests from distributed, remote resources; scheduling
decisions are then made according to the real-time resource
information reported by the pilots. Pilot Factory is a
Condor-inspired solution for scalable pilot dissemination and
effectively functions as a resource provisioning mechanism through
which the user-job server, PanDA, reaches out to candidate resources
only on demand.
Abstract: The choice of data modeling technique for an information
system is determined by the objective of the resultant data model.
Dimensional modeling is the preferred technique for data destined for
data warehouses and data mining, producing data models that ease
analysis and queries, in contrast with entity-relationship modeling.
The establishment of data warehouses as components of information
system landscapes in many organizations has led to the further
development of dimensional modeling. This development has been
reported mainly for commercial database management systems rather than
open-source ones, making it less affordable for those in
resource-constrained settings. This paper presents dimensional
modeling of HIV patient information using open-source modeling tools.
It takes advantage of the fact that the regions most affected by HIV
(sub-Saharan Africa) are also heavily resource-constrained while
holding large quantities of HIV data. Two HIV data source systems were
studied to identify appropriate dimensions and facts; these were then
modeled using two open-source dimensional modeling tools. The use of
open-source tools would reduce the software costs of dimensional
modeling and in turn make data warehousing and data mining more
feasible even for those in resource-constrained settings that have
data available.
Abstract: The importance of the environmental efficiency of the electric power industry stems from the high demand for energy combined with global warming concerns. It is especially essential for the world's largest economies, such as that of the United States. The paper introduces a Data Envelopment Analysis (DEA) model of environmental efficiency using indicators of fossil fuel utilization, emissions rate, and electric power losses. DEA is advantageous in this situation over other approaches due to its nonparametric nature. The paper analyzes data for the period 1990-2006 by comparing actual yearly levels in each dimension with the best values of the partial indicators for the period. Positive factors of efficiency include the tendency of emissions rates to decline starting in 2000 and of electric power losses starting in 2004, together with the increasing trend of fuel utilization starting in 1999. As a result, the dynamics of environmental efficiency are positive starting in 2002. The main concern is the decline in fossil fuel utilization in 2006; this negative change should be reversed to comply with ecological and economic requirements.
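An input-oriented CCR DEA model of the kind referred to above can be
sketched with a small linear program; the formulation below is the
textbook envelopment form, and the input/output numbers are made up
for illustration, not the paper's indicator data.

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR DEA: for unit j0, minimize theta subject to a
# convex combination of all units using at most theta times j0's
# inputs while producing at least j0's outputs.

def ccr_efficiency(X, Y, j0):
    """X: (n_units, n_inputs), Y: (n_units, n_outputs); returns theta."""
    n, m = X.shape
    _, s = Y.shape
    # Decision vector z = [theta, lambda_1 .. lambda_n]; minimize theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):   # sum_j lambda_j x_ij - theta * x_{i,j0} <= 0
        A_ub.append(np.concatenate(([-X[j0, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):   # sum_j lambda_j y_rj >= y_{r,j0}
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[j0, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub))
    return res.fun

X = np.array([[2.0], [4.0]])   # e.g. fuel consumed per year (made up)
Y = np.array([[4.0], [4.0]])   # e.g. power delivered per year (made up)
print([round(ccr_efficiency(X, Y, j), 3) for j in range(2)])
```

The second unit produces the same output from twice the input, so its
efficiency score is 0.5 while the first unit scores 1.0.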
Abstract: Although there have recently been significant improvements
in the capabilities of mobile devices, rendering large terrains
remains tedious because of their resource constraints. This paper
focuses on an implementation of terrain rendering on a mobile device
in order to observe the issues and constraints that arise. Experiments
are performed using two datasets, with results based on rendering
speed and appearance, to ascertain both the issues and the
constraints. The results show a drop in frame rate as the number of
triangles increases. Since the resolution of a mobile device differs
from that of a computer, the terrain surface on the mobile device
looks less realistic than on a computer. Thus, more attention to the
development of terrain rendering on mobile devices is required. The
problems highlighted in this paper will be the focus of future
research and are of great importance for 3D visualization on mobile
devices.
Abstract: The aim of this paper is to examine factors related to the system environment (namely, system quality and vendor support) that influence ERP implementation success in Iranian companies. Implementation success is identified from the user satisfaction and organizational impact perspectives. The study adopts a survey questionnaire approach to collect empirical data. The questionnaire was distributed to ERP users, and a total of 384 responses were used for analysis. The results illustrate that both system quality and vendor support have a significant effect on ERP implementation success. This implies that companies must ensure they source the best available system and a vendor that is dependable, reliable and trustworthy.
Abstract: Avoidable unscheduled maintenance events and unnecessary
spare parts deliveries are mostly caused by an incorrect choice of the
underlying maintenance strategy. To supply spare parts for an
airline's aircraft faster and more efficiently, we examine options for
improving the underlying logistics network integrated in an existing
aviation industry network. This paper presents a dynamic prediction
model as decision support for maintenance method selection,
considering the requirements of an entire flight network. The
objective is to guarantee a high supply of spare parts through an
optimal interaction of the various network levels and thus to reduce
unscheduled maintenance events and minimize total costs. By using a
prognostics-based preventive maintenance strategy, unscheduled
component failures are avoided, increasing the availability and
reliability of the entire system. The model is intended for use in an
aviation company that utilizes a structured planning process based on
collected component failure data.
Abstract: The Bond Graph, as a unified multidisciplinary tool, is
widely used not only for dynamic modelling but also for Fault
Detection and Isolation because of its structural and causal
properties. A binary Fault Signature Matrix can be generated
systematically, but making the final binary decision is not always
feasible because of the problems such a method reveals. The purpose of
this paper is to introduce a methodology that improves the classical
binary decision-making method so that unknown and identical failure
signatures can be treated, improving robustness. The approach consists
of associating the evaluated residuals with component reliability data
to build a Hybrid Bayesian Network. This network is used in two
distinct inference procedures: one for the continuous part and the
other for the discrete part. The continuous nodes of the network yield
the prior probabilities of component failures, which are used by the
inference procedure on the discrete part to compute the posterior
probabilities of the failures. The developed methodology is applied to
a real steam generator pilot process.
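As a toy illustration of how reliability priors can disambiguate
identical failure signatures (a drastically simplified stand-in for
the hybrid Bayesian network, with made-up numbers), consider a
discrete Bayes update:

```python
# Discrete Bayes update: combine component-failure priors (from
# reliability data) with the likelihood of the observed residual
# pattern to rank candidate faults.

def posterior(priors, likelihoods):
    """priors[f] = P(fault f); likelihoods[f] = P(residuals | f)."""
    joint = {f: priors[f] * likelihoods[f] for f in priors}
    z = sum(joint.values())
    return {f: p / z for f, p in joint.items()}

priors = {'valve': 0.01, 'sensor': 0.05}      # from reliability data
likelihoods = {'valve': 0.9, 'sensor': 0.9}   # identical signatures
post = posterior(priors, likelihoods)

# With identical binary signatures the likelihoods tie; the
# reliability priors break the tie in favour of the sensor fault.
assert post['sensor'] > post['valve']
```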