Abstract: Through the 1980s, management accounting researchers
described the increasing irrelevance of traditional control and
performance measurement systems. The Balanced Scorecard (BSC)
is a critical business tool for many organizations. It is a
performance measurement system which translates mission and
strategy into objectives. The strategy map approach is a
development of the BSC in which certain causal relations
between objectives must be established. To recognize these
relations, experts usually rely on experience. It is also possible
to use regression for the same purpose. Structural Equation
Modeling (SEM), one of the most powerful methods of multivariate
data analysis, obtains more appropriate results than traditional
methods such as regression. In the present paper, we propose SEM
for the first time to identify the relations between objectives in
the strategy map, together with a test to measure the importance
of those relations. In SEM, factor analysis and hypothesis testing
are performed in the same analysis. SEM is known to be better than
other techniques at supporting analysis and reporting. Our
approach provides a framework which permits experts to design
the strategy map by applying a comprehensive and scientific method
together with their experience. This scheme is therefore more
reliable than previously established methods.
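As a minimal illustration of the kind of causal-chain estimation
involved (not the authors' SEM model, which would estimate latent
factors and all paths simultaneously), the following Python sketch
estimates path coefficients for a hypothetical three-perspective
chain with ordinary least-squares slopes; the variable names and
true coefficients are invented:

```python
import random
import statistics as st

def path_coef(x, y):
    """OLS slope of y on x: cov(x, y) / var(x)."""
    mx, my = st.fmean(x), st.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / st.variance(x)

random.seed(0)
# Hypothetical BSC chain: learning -> internal process -> financial
learning = [random.gauss(0, 1) for _ in range(500)]
internal = [0.7 * v + random.gauss(0, 0.5) for v in learning]
financial = [0.5 * v + random.gauss(0, 0.5) for v in internal]

b1 = path_coef(learning, internal)   # recovers roughly 0.7
b2 = path_coef(internal, financial)  # recovers roughly 0.5
```

A full SEM would additionally report fit statistics for the whole
map, which is what makes it preferable to link-by-link regression.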
Abstract: As a data structure for string processing problems, the
suffix array is widely known and extensively studied. But if the
string access pattern follows the "90/10" rule, the suffix array
cannot take advantage of the fact that we often look for something
we have just found. Although the splay tree is an efficient data
structure for small documents when the access pattern follows the
"90/10" rule, it requires many structures and an excessive amount
of pointer manipulation to process and search large documents
efficiently. In this paper, we propose a new and conceptually
powerful data structure for string search, called the splay suffix
array (SSA). This data structure combines the features of splay
trees and suffix arrays into a new approach which is suitable for
implementation on both conventional and clustered computers.
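For reference, the plain (non-splayed) suffix array search that the
SSA augments with splay-style reordering can be sketched as
follows; the naive O(n² log n) construction is for illustration
only:

```python
def build_suffix_array(text):
    """Sort all suffix start positions lexicographically (naive build)."""
    return sorted(range(len(text)), key=lambda i: text[i:])

def search(text, sa, pattern):
    """Find all occurrences of `pattern` via binary search on the suffix array."""
    lo, hi = 0, len(sa)
    while lo < hi:  # leftmost suffix >= pattern
        mid = (lo + hi) // 2
        if text[sa[mid]:] < pattern:
            lo = mid + 1
        else:
            hi = mid
    hits = []
    while lo < len(sa) and text[sa[lo]:sa[lo] + len(pattern)] == pattern:
        hits.append(sa[lo])
        lo += 1
    return sorted(hits)

sa = build_suffix_array("banana")   # [5, 3, 1, 0, 4, 2]
print(search("banana", sa, "ana"))  # [1, 3]
```

Under a "90/10" access pattern, every query here still pays the
full binary-search cost; moving recently matched suffixes toward
the front is exactly the gap the SSA targets.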
Abstract: The objective of this work, which is based on the
approach of simultaneous engineering, is to contribute to the
development of a CIM tool for the synthesis of functional design
dimensions expressed by average values and tolerance intervals. In
this paper, the dispersions method known as the Δl method, which
has proved reliable in the simulation of manufacturing dimensions,
is used to develop a methodology for automating the simulation.
This methodology is constructed around three procedures. The first
procedure verifies the functional requirements by automatically
extracting the functional dimension chains in the mechanical
sub-assembly. A second procedure then optimizes the dispersions on
the basis of unknown variables. The third procedure uses the
optimized values of the dispersions to compute the optimized
average values and tolerances of the functional dimensions in the
chains. A statistical and cost-based approach is integrated into
the methodology in order to take account of the capabilities of
the manufacturing processes and to distribute optimal values among
the individual components of the chains.
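For illustration, the two standard ways of stacking dispersions
along a functional dimension chain (the worst-case arithmetic sum
versus the statistical root-sum-square underlying a statistical
approach) can be sketched as follows; the three-link chain values
are hypothetical, not taken from the paper:

```python
from math import sqrt

# Hypothetical functional dimension chain (mm): (nominal, tolerance)
chain = [(40.0, 0.10), (25.0, 0.06), (15.0, 0.08)]

nominal = sum(d for d, _ in chain)               # resultant nominal dimension
worst_case = sum(t for _, t in chain)            # arithmetic (worst-case) stack
statistical = sqrt(sum(t**2 for _, t in chain))  # root-sum-square stack

print(nominal)                   # 80.0
print(round(worst_case, 2))      # 0.24
print(round(statistical, 4))     # 0.1414
```

The statistical stack is tighter than the worst-case one, which is
why combining it with process-capability data lets wider, cheaper
tolerances be distributed among the chain components.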
Abstract: The increasing popularity of wireless technologies
and mobile computing devices has enabled new application areas and
research. One of these new areas is pervasive systems in urban
environments, because urban environments are characterized by a
high concentration of these technologies and devices. In this
paper we show the process of designing a pervasive system for
urban environments, using a local zoo in Cali, Colombia, as a use
case. Based on an ethnographic study, we present the design of a
pervasive system for urban computing, based on a service-oriented
architecture, for a controlled environment, the Cali Zoo. The
reader will find a methodological approach for the design of
similar systems, using data collection methods, conceptual
frameworks for urban environments, and considerations for the
analysis and design of service-oriented systems.
Abstract: Mobile users with laptops need efficient access to
their home personal data or to the Internet from any place in the
world, regardless of their location or point of attachment,
especially while roaming outside the home subnet. An efficient
interpretation of the packet-loss problem encountered during such
roaming is the central concern of this work. The main previous
works considered in conjunction with this problem, such as
BER-systems, Amigos, and the ns-2 implementation, are reviewed and
discussed. Their drawbacks and limitations, namely stopping at
monitoring and not providing an actual solution for eliminating or
even restricting these losses, are pointed out. In addition, we
present the framework around which we built a Triple-R sequence as
a cost-effective solution to eliminate the packet losses and
bridge the gap between subnets, an area that until now has been
largely neglected. The results show that, in addition to the high
bit error rate of wireless mobile networks, the low efficiency of
the mobile-IP registration procedure is the main direct cause of
these packet losses. Furthermore, the interpretation of the packet
losses resulted in an illustrative triangle of the registration
process. This triangle should be further researched and analyzed
in our future work.
Abstract: The effect of the blade tip geometry of a high pressure
gas turbine on high speed leakage flows is studied experimentally
and computationally. For this purpose, two simplified models are
constructed: one models a flat blade tip and the second models a
cavity blade tip. Experimental results obtained in a transonic
wind tunnel show the static pressure distribution along the tip
wall and provide flow visualization. RANS computations were
carried out to provide further insight into the mean flow behavior
and to calculate the discharge coefficient, a measure of the flow
leaking over the tip. It is shown that in both tip geometries the
flow separates over the tip to form a separation bubble. The
bubble is higher for the cavity tip, while a complete shock wave
system of oblique waves ending with a normal wave can be seen for
the flat tip. The discharge coefficient for the flat tip shows
less dependence on the pressure ratio over the blade tip than that
of the cavity tip. However, the discharge coefficient for the
cavity tip is lower than that of the flat tip, showing a better
ability to reduce the leakage flow and thus increase turbine
efficiency.
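The discharge coefficient reported by such computations is the
ratio of the actual leakage mass flow to an ideal, loss-free mass
flow through the same gap. A minimal sketch, using the simplified
incompressible orifice relation rather than the compressible ideal
flow appropriate at transonic conditions, with invented tip-gap
numbers:

```python
from math import sqrt

def discharge_coefficient(m_dot_actual, area, rho, dp):
    """Cd = actual leakage mass flow / ideal mass flow.
    Ideal flow from the incompressible orifice relation
    m_ideal = A * sqrt(2 * rho * dp)."""
    m_ideal = area * sqrt(2.0 * rho * dp)
    return m_dot_actual / m_ideal

# Hypothetical tip-gap values (SI units): 0.012 kg/s measured leakage,
# 1 cm^2 gap area, air density 1.2 kg/m^3, 50 kPa pressure difference
cd = discharge_coefficient(m_dot_actual=0.012, area=1.0e-4,
                           rho=1.2, dp=50e3)
print(round(cd, 3))  # 0.346
```

A lower Cd for the cavity tip, as reported above, means the same
pressure ratio drives less leakage flow across the gap.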
Abstract: Atmospheric stability plays the most important role in
the transport and dispersion of air pollutants. Different methods
with varying degrees of complexity are used to determine
stability. Most of these methods are based on the relative
magnitude of convective and mechanical turbulence in atmospheric
motions. The Richardson number, the Monin-Obukhov length, the
Pasquill-Gifford stability classification, and the Pasquill-Turner
stability classification are the most common parameters and
methods. The Pasquill-Turner Method (PTM), which is employed in
this study, uses observations of wind speed, insolation, and time
of day to classify atmospheric stability with distinguishable
indices. In this study, a model is presented to determine
atmospheric stability conditions using PTM. As a case study,
meteorological data from the Mehrabad station in Tehran from 2000
to 2005 are applied to the model. Three different categories are
considered to deduce the pattern of stability conditions. First,
the overall pattern of stability classification is obtained; the
results show that the atmosphere is in stable, neutral, and
unstable conditions 38.77%, 27.26%, and 33.97% of the time,
respectively. It is also observed that days are mostly unstable
(66.50%) while nights are mostly stable (72.55%). Second, monthly
and seasonal patterns are derived; the results indicate that the
relative frequency of stable conditions decreases from January to
June and increases from June to December, while the results for
unstable conditions behave in exactly the opposite manner. Autumn
is the most stable season, with a relative frequency of 50.69% for
stable conditions, compared with 42.79%, 34.38%, and 27.08% for
winter, summer, and spring, respectively. The hourly stability
pattern is the third category; it shows that unstable conditions
are dominant from approximately 03-15 GMT in warm seasons and
04-12 GMT in cold seasons. Finally, the correlation between
atmospheric stability and CO concentration is obtained.
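A simplified sketch of the kind of wind-speed/insolation classifier
that PTM builds on is shown below. It collapses the split classes
of the standard Pasquill-Gifford table (A = very unstable through
F = stable) to single letters and uses common textbook thresholds;
it is not the authors' exact index scheme:

```python
def pasquill_class(wind, insolation=None, night_cloud=None):
    """Simplified Pasquill-Gifford stability class from 10 m wind speed (m/s).
    Daytime: insolation in {'strong', 'moderate', 'slight'}.
    Night: night_cloud in {'cloudy' (>= 4/8 cover), 'clear' (<= 3/8)}."""
    day = {
        'strong':   [(2, 'A'), (3, 'A'), (5, 'B'), (6, 'C'), (99, 'C')],
        'moderate': [(2, 'A'), (3, 'B'), (5, 'B'), (6, 'C'), (99, 'D')],
        'slight':   [(2, 'B'), (3, 'C'), (5, 'C'), (6, 'D'), (99, 'D')],
    }
    night = {
        'cloudy': [(2, 'E'), (3, 'E'), (5, 'D'), (6, 'D'), (99, 'D')],
        'clear':  [(2, 'F'), (3, 'F'), (5, 'E'), (6, 'D'), (99, 'D')],
    }
    table = day[insolation] if insolation else night[night_cloud]
    for upper_wind, cls in table:   # first band whose upper bound exceeds wind
        if wind < upper_wind:
            return cls
    return 'D'

print(pasquill_class(1.5, insolation='strong'))   # 'A' (very unstable day)
print(pasquill_class(4.0, night_cloud='clear'))   # 'E' (slightly stable night)
```

Applying such a classifier hour by hour to a multi-year record is
what yields the daily, monthly, and seasonal frequency patterns
reported above.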
Abstract: This experiment investigates the effects of fracture
parameters such as depth, length, width, angle, and number of
fractures on the conductance properties of laterite, using the
DUK-2B digital electrical measurement system combined with a
method of simulating fractures. The experimental results show that
changes in the fracture parameters affect the conductance
properties of laterite. The conductivity of laterite clearly
decreases as the depth, length, width, angle, or number of
fractures gradually increases. When the depth of a fracture
exceeds half the thickness of the soil body, the conductivity of
the laterite shows an evidently non-linear diminishing pattern and
the amplitude of the decrease tends to grow. The length of a
fracture has less effect on the conductivity than its depth. When
the width of a fracture reaches certain fixed values, the
conductivity becomes less sensitive to changes in width, and the
conductivity of the laterite remains at a stable level. When the
angle of a fracture is less than 45°, the decrease in conductivity
becomes more pronounced as the angle increases; when the angle
exceeds 45°, the change in conductivity with angle is relatively
gentle. An increasing number of fractures causes the other
fracture parameters to have a greater impact on the conductivity.
With moisture content and temperature unchanged, the depth and
angle of fractures are the major factors affecting the
conductivity of laterite soil, while quantity, length, and width
are minor influencing factors. The sensitivity of the fracture
parameters affecting the conductivity of laterite soil is:
depth > angle > quantity > length > width.
Abstract: Optical character recognition of cursive scripts
presents a number of challenging problems in both the segmentation
and recognition processes in different languages, including
Persian. In order to overcome these problems, we use a newly
developed Persian word segmentation method and a recognition-based
segmentation technique to overcome its segmentation problems. This
method is robust as well as flexible. It also increases the
system's tolerance to font variations. The implementation results
of this method on a comprehensive database show a high degree of
accuracy which meets the requirements for commercial use. Extended
with suitable pre- and post-processing, the method offers a simple
and fast framework for developing a full OCR system.
Abstract: The tourism industry has grown rapidly in recent years,
especially in Malaysia. In order to attract more tourists, the
Malaysian Government encourages any effort to grow the Malaysian
tourism industry. One of the efforts to attract more tourists to
Malacca, Malaysia, is the duck tour. The duck tour is an
amphibious sightseeing tour that operates with two types of
engines and therefore requires a high cost to operate and maintain
the vehicle. In other countries it is not new, but in Malaysia it
has only just been introduced and thus does not yet have any
systematic routing. Therefore, this paper proposes an optimization
technique to formulate and schedule this tour to minimize the
operating costs by casting it as a Travelling Salesman Problem
(TSP). The problem can then be solved by meta-heuristic
optimization techniques such as Tabu Search (TS) and Reactive Tabu
Search (RTS).
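A minimal tabu search over a swap neighbourhood for a small
symmetric TSP instance can be sketched as follows; the five-stop
distance matrix is invented, and a real duck-tour formulation would
add scheduling constraints:

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def tabu_search_tsp(dist, iterations=100, tenure=4, seed=1):
    """Best-improvement tabu search: always move to the best non-tabu
    neighbour (even if worse), with aspiration for new global bests."""
    random.seed(seed)
    n = len(dist)
    current = list(range(n))
    random.shuffle(current)
    best, best_len = current[:], tour_length(current, dist)
    tabu = {}  # swapped city pair -> iteration until which it is forbidden
    for it in range(iterations):
        move_sel, cand_sel, len_sel = None, None, float('inf')
        for i in range(n - 1):
            for j in range(i + 1, n):
                cand = current[:]
                cand[i], cand[j] = cand[j], cand[i]
                length = tour_length(cand, dist)
                move = (min(current[i], current[j]), max(current[i], current[j]))
                if tabu.get(move, -1) >= it and length >= best_len:
                    continue  # tabu and not aspirating
                if length < len_sel:
                    move_sel, cand_sel, len_sel = move, cand, length
        current = cand_sel
        tabu[move_sel] = it + tenure
        if len_sel < best_len:
            best, best_len = cand_sel[:], len_sel
    return best, best_len

# Hypothetical 5-stop duck-tour distance matrix (km)
D = [[0, 2, 9, 10, 7],
     [2, 0, 6, 4, 3],
     [9, 6, 0, 8, 5],
     [10, 4, 8, 0, 6],
     [7, 3, 5, 6, 0]]
tour, length = tabu_search_tsp(D)
print(length)  # optimal tour length for this matrix is 26
```

RTS differs mainly in adjusting the tabu tenure reactively when the
search revisits previously seen solutions.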
Abstract: This paper presents an approach of hybridizing two or
more artificial intelligence (AI) techniques to fuzzify the
work-stress level ranking and categorize the rating accordingly.
The use of two or more techniques (a hybrid approach) has been
considered in this case, as combining different techniques may
neutralize each other's weaknesses and generate a superior hybrid
solution. Recent research has shown that there is a need for more
valid and reliable tools for assessing work stress. Thus,
artificial intelligence techniques have been applied here to
provide a solution to a psychological application. An overview of
the novel and autonomous interactive model for analysing work
stress, which has been developed using multi-agent systems, is
also presented in this paper. The establishment of the intelligent
multi-agent decision analyser (IMADA), using a hybridized
technique of neural networks and fuzzy logic within the
multi-agent based framework, is also described.
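Fuzzifying a crisp stress score into linguistic categories, the
first step such a hybrid system performs, can be sketched with
triangular membership functions; the 0-100 scale and the set
boundaries below are invented for illustration, not IMADA's actual
parameters:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical work-stress fuzzy sets on a 0-100 score scale
SETS = {'low': (0, 0, 40), 'moderate': (20, 50, 80), 'high': (60, 100, 100)}

def fuzzify(score):
    """Degree of membership of a crisp score in each linguistic category."""
    return {label: round(tri(score, *abc), 2) for label, abc in SETS.items()}

print(fuzzify(65))  # {'low': 0.0, 'moderate': 0.5, 'high': 0.12}
```

In a hybrid scheme, a neural network could supply the crisp score
while the fuzzy layer turns it into the graded ranking described
above.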
Abstract: Ethical Education is a compulsory elective subject in
primary and secondary schools. The objective of Ethical Education
is the education of a personality with one's own identity, with
interiorized ethical standards, with mature moral judgement, and
therefore with behaviour determined by one's own beliefs; with a
positive attitude towards himself/herself and other people, and
who is therefore able to cooperate and to initiate cooperation. In
the paper we describe the contents and the principles of Ethical
Education. We also show that Ethical Education is a subject that
supports primary social-pathological prevention and citizenship
education. In this context we try to show that Ethical Education
contributes to the education of good people who are aware of the
necessity to respect social norms and are able to assume
responsibility for their own behaviour in any situation, at
present and in the future.
Abstract: In the current Grid environment, efficient workload
management presents a significant challenge, for which there are
numerous de facto standards encompassing resource discovery,
brokerage, and data transfer, among others. In addition, the
real-time resource status, essential for an optimal resource
allocation strategy, is often not readily accessible. To address
these issues and provide a cleaner abstraction of the Grid, with
the potential of generalizing to arbitrary resource-sharing
environments, this paper proposes a new Condor-based pilot
mechanism applied in the PanDA architecture, PanDA-PF WMS, with
the goal of providing a more generic yet efficient resource
allocation strategy. In this architecture, the PanDA server
primarily acts as a repository of user jobs, responding to pilot
requests from distributed, remote resources. Scheduling decisions
are subsequently made according to the real-time resource
information reported by pilots. Pilot Factory is a Condor-inspired
solution for scalable pilot dissemination and effectively
functions as a resource provisioning mechanism through which the
user-job server, PanDA, reaches out to candidate resources only on
demand.
Abstract: The choice of data modeling technique for an
information system is determined by the objective of the resultant
data model. Dimensional modeling is the preferred modeling
technique for data destined for data warehouses and data mining,
producing data models that ease analysis and queries, in contrast
with entity-relationship modeling. The establishment of data
warehouses as components of information system landscapes in
many organizations has subsequently led to the development of
dimensional modeling. This development has been significantly
greater, and more widely reported, for commercial database
management systems than for open-source ones, making dimensional
modeling less affordable for those in resource-constrained
settings. This paper presents dimensional modeling of HIV patient
information using open-source modeling tools. It aims to take
advantage of the fact that the regions most affected by the HIV
virus (sub-Saharan Africa) are also heavily resource-constrained
while having large quantities of HIV data. Two HIV data source
systems were studied to identify appropriate dimensions and facts;
these were then modeled using two open-source dimensional modeling
tools. The use of open source would reduce the software costs of
dimensional modeling and in turn make data warehousing and data
mining more feasible even for those in resource-constrained
settings that have data available.
Abstract: An energy and exergy study of an air-water combined
solar collector, called a dual purpose solar collector (DPSC), is
presented. The ε-NTU method is used, and the analysis is performed
for triangular channels. Parameters such as the air flow rate and
the water inlet temperature are studied. The results show that the
DPSC has better energy and exergy efficiency than a single
collector. In addition, the triangular passage with a water inlet
temperature of 60 °C shows better energy and exergy efficiency.
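For reference, the standard counter-flow ε-NTU relation (a textbook
heat-exchanger result, not the collector-specific formulation of
the paper) can be evaluated as:

```python
from math import exp

def effectiveness_counterflow(ntu, cr):
    """Counter-flow effectiveness from the standard e-NTU relations,
    where cr = Cmin/Cmax is the capacity-rate ratio."""
    if cr == 1.0:
        return ntu / (1.0 + ntu)
    e = exp(-ntu * (1.0 - cr))
    return (1.0 - e) / (1.0 - cr * e)

# Example: NTU = 2, capacity-rate ratio Cr = 0.5
eps = effectiveness_counterflow(2.0, 0.5)
print(round(eps, 3))  # 0.775
```

In the ε-NTU framework the collector's useful energy gain follows
from this effectiveness and the inlet temperature difference, which
is why the water inlet temperature appears as a key parameter.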
Abstract: In this paper we propose three-stage and two-stage
still gray-scale image compressors based on BTC. In our schemes,
we employ a combination of four techniques to reduce the bit
rate: quad-tree segmentation, bit plane omission, bit plane
coding using 32 visual patterns, and interpolative bit plane
coding. The experimental results show that the proposed schemes
achieve an average bit rate of 0.46 bits per pixel (bpp) for
standard gray-scale images with an average PSNR value of 30.25 dB,
which is better than the results of existing similar methods based
on BTC.
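The BTC base step that these schemes extend encodes each pixel
block by its mean, standard deviation, and a one-bit plane, then
reconstructs with two levels chosen to preserve the block mean and
variance. A sketch of classic BTC on one 4x4 block (the quad-tree,
omission, and visual-pattern stages are not shown):

```python
import math

def btc_block(block):
    """Classic Block Truncation Coding of one pixel block:
    quantize pixels to a 1-bit plane around the mean, and return
    reconstruction levels a (for 0-bits) and b (for 1-bits) that
    preserve the block's mean and variance."""
    m = len(block)
    mean = sum(block) / m
    sigma = math.sqrt(sum((p - mean) ** 2 for p in block) / m)
    bitmap = [1 if p >= mean else 0 for p in block]
    q = sum(bitmap)  # number of pixels at or above the mean
    if q in (0, m):  # flat block: both levels collapse to the mean
        return bitmap, mean, mean
    a = mean - sigma * math.sqrt(q / (m - q))
    b = mean + sigma * math.sqrt((m - q) / q)
    return bitmap, a, b

block = [121, 114, 56, 47, 37, 200, 247, 255,
         16, 0, 12, 169, 43, 5, 7, 251]
bitmap, a, b = btc_block(block)
decoded = [b if bit else a for bit in bitmap]
```

Plain BTC costs 2 bpp for 4x4 blocks (16 bitmap bits plus two
8-bit levels); the four techniques above are what push the rate
down toward 0.46 bpp.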
Abstract: The international harmonization of financial reporting
has been widely discussed during the last decades. This
harmonization is also affected by the national tax systems of the
analyzed countries. This paper provides some evidence on the
current national tax systems in selected countries of Central and
Eastern Europe. Linking the tax base to accounting profit might
decrease the administrative burden for the majority of SMEs, which
are the most important engine of each national economy.
Abstract: E-mail has become an important means of electronic
communication, but the viability of its usage is marred by
Unsolicited Bulk E-mail (UBE) messages. UBE comes in many types,
such as pornographic, virus-infected, and 'cry-for-help' messages,
as well as fake and fraudulent offers for jobs, winnings, and
medicines. UBE poses technical and socio-economic challenges to
the usage of e-mail. To meet this challenge and combat this
menace, we need to understand UBE. Towards this end, the current
paper presents a content-based textual analysis of more than 2700
body-enhancement medicinal UBE messages. Technically, this is an
application of text parsing and tokenization to unstructured
textual documents, which we approach using Bag Of Words (BOW) and
Vector Space Document Model techniques. We have attempted to
identify the most frequently occurring lexis in UBE documents that
advertise various products for body enhancement. An analysis of
the top 100 such lexis is also presented. We exhibit the
relationship between the occurrence of a word from the identified
lexis-set in a given UBE message and the probability that the
message advertises a fake medicinal product. To the best of our
knowledge and survey of the related literature, this is the first
formal attempt to identify the most frequently occurring lexis in
such UBE by textual analysis. Finally, this is a sincere attempt
to raise alertness against, and mitigate the threat of, such
luring but fake UBE.
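The BOW frequency analysis described above reduces, in essence, to
tokenizing each message and counting terms across the corpus; a
minimal sketch with invented UBE snippets and a tiny stop-word
list (a real pipeline would use a fuller stop-word set and the
actual 2700-message corpus):

```python
import re
from collections import Counter

STOPWORDS = {'the', 'a', 'to', 'and', 'of', 'for', 'your', 'you', 'is', 'in'}

def bag_of_words(documents):
    """Tokenize each document and count word occurrences across the corpus."""
    counts = Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z']+", doc.lower())
        counts.update(t for t in tokens if t not in STOPWORDS)
    return counts

# Hypothetical UBE snippets
ube = [
    "Amazing pills for body enhancement, order pills now",
    "Enhancement guaranteed, cheap pills shipped fast",
]
counts = bag_of_words(ube)
print(counts.most_common(2))  # [('pills', 3), ('enhancement', 2)]
```

Ranking `counts.most_common(100)` over the full corpus yields the
top-100 lexis set whose presence is then related to the probability
that a message advertises a fake product.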
Abstract: The Bond Graph, as a unified multidisciplinary tool, is
widely used not only for dynamic modelling but also for Fault
Detection and Isolation, because of its structural and causal
properties. A binary Fault Signature Matrix can be generated
systematically, but making the final binary decision is not always
feasible because of the problems revealed by such a method. The
purpose of this paper is to introduce a methodology for improving
the classical binary decision-making method, so that unknown and
identical failure signatures can be treated and robustness
improved. This approach consists of associating the evaluated
residuals and the component reliability data to build a Hybrid
Bayesian Network. This network is used in two distinct inference
procedures: one for the continuous part and the other for the
discrete part. The continuous nodes of the network provide the
prior probabilities of the component failures, which are used by
the inference procedure on the discrete part to compute the
posterior probabilities of the failures. The developed methodology
is applied to a real steam generator pilot process.
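For a single component and a single residual, the discrete-part
update described above is a plain Bayes-rule computation; a sketch
with invented reliability and detection numbers (the paper's
network handles many components and continuous residual evaluation
jointly):

```python
def posterior_failure(prior, p_fire_given_fault, p_fire_given_ok, residual_fired):
    """Single-component Bayes update: P(fault | residual observation)."""
    if residual_fired:
        like_fault, like_ok = p_fire_given_fault, p_fire_given_ok
    else:
        like_fault, like_ok = 1 - p_fire_given_fault, 1 - p_fire_given_ok
    num = like_fault * prior
    return num / (num + like_ok * (1 - prior))

# Hypothetical steam-generator component: failure prior 0.02 from
# reliability data; residual fires 90% of the time under a fault,
# with a 5% false-alarm rate
p = posterior_failure(0.02, 0.90, 0.05, residual_fired=True)
print(round(p, 3))  # 0.269
```

Combining residual evidence with reliability priors in this way is
what lets identical binary signatures be disambiguated: the
component with the higher posterior is the more plausible culprit.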
Abstract: Eight heavy metals (Cu, Cr, Zn, Hg, Pb, Cd, Ni and As)
were analyzed in sediment samples collected in the dry and wet
seasons from November 2009 to October 2010 in the West Port of
Peninsular Malaysia. The heavy metal concentrations (mg/kg dry
weight) ranged from 23.4 to 98.3 for Zn, 22.3 to 80 for Pb, 7.4 to
27.6 for Cu, 0.244 to 3.53 for Cd, 7.2 to 22.2 for Ni, 20.2 to 162
for As, 0.11 to 0.409 for Hg, and 11.5 to 61.5 for Cr. Metal
concentrations in the dry season were higher than in the rainy
season, except for copper and chromium. Analysis of variance with
the Statistical Analysis System (SAS) shows that the mean
concentrations of the metals in the two seasons (α = 0.05) are not
significantly different, which indicates that the metals are held
firmly in the sediment matrix. There are also significant
differences between the control station and the other stations.
According to the Interim Sediment Quality Guidelines (ISQG), the
sediments are moderately polluted, except for arsenic, which shows
the highest level of pollution.