Abstract: This article proposes a new methodology for SMEs (small and medium-sized enterprises) to characterize their quality performance, highlighting weaknesses and areas for improvement. The methodology aims to identify the principal causes of quality problems and to help prioritize improvement initiatives. It is a self-assessment methodology intended to be easy to implement by companies with a low maturity level in quality. The methodology is organized in six steps, which include gathering information about predetermined processes and subprocesses of quality management, defined on the basis of the well-known Juran trilogy for quality management (quality planning, quality control, and quality improvement), and predetermined results categories, defined on the basis of the quality concept. A set of tools for data collection and analysis, such as interviews, flowcharts, process analysis diagrams, and Failure Mode and Effects Analysis (FMEA), is used. The article also presents the conclusions obtained from the application of the methodology in two case studies.
Abstract: Information Technology (IT) projects are always accompanied by various risks, and because of the high failure rate of such projects, managing risks so as to neutralize, or at least decrease, their effects on project success is essential. In this paper, the fuzzy analytic hierarchy process (FAHP) is exploited as a risk evaluation methodology to prioritize and organize the risk factors faced in IT projects. A real case, a project for the design and implementation of an integrated information system in a vehicle manufacturing company in Iran, is studied. The related risk factors are identified, and expert qualitative judgments about these factors are then acquired. By translating these judgments into fuzzy numbers and using them as input to FAHP, the risk factors are ranked and prioritized, making project managers aware of the more important risks and enabling them to adopt suitable measures to deal with the most damaging ones.
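The ranking step can be illustrated with a minimal sketch: a hypothetical 3x3 triangular fuzzy comparison matrix is defuzzified by the centroid and priority weights are computed by the geometric-mean method. This is a simplified stand-in, not the exact FAHP procedure used in the paper.

```python
import numpy as np

# Hypothetical triangular fuzzy pairwise comparisons (l, m, u) among
# three risk factors; entry [i][j] ~ "how much riskier is i than j".
fuzzy = [
    [(1, 1, 1), (2, 3, 4), (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1), (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
]

# Defuzzify each triangular number by its centroid (l + m + u) / 3.
crisp = np.array([[(l + m + u) / 3 for (l, m, u) in row] for row in fuzzy])

# Priority weights via the geometric-mean method on the crisp matrix.
gm = crisp.prod(axis=1) ** (1 / crisp.shape[1])
weights = gm / gm.sum()

ranking = np.argsort(weights)[::-1]  # risk factors, most important first
print(weights, ranking)
```

With these illustrative judgments, factor 0 dominates every comparison and therefore receives the largest weight.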
Abstract: Lignocellulosic materials are a newly targeted feedstock for producing second-generation biofuels such as biobutanol. However, conversion is strongly hindered by the native structure of the biomass, so a pretreatment step to remove hemicelluloses and lignin is essential prior to enzymatic hydrolysis. The goals of pretreatment are to remove hemicelluloses and lignin, to increase biomass porosity, and to increase enzyme accessibility. The main goal of this research is to study the important variables, such as pretreatment temperature and time, that give the highest total sugar yield in the pretreatment step using dilute phosphoric acid. After pretreatment, the highest total sugar yield of 13.61 g/L was obtained under the optimal condition of 140°C for 10 min of pretreatment time using 1.75% (w/w) H3PO4 at a 15:1 liquid-to-solid ratio. A total sugar yield of 27.38 g/L was obtained for the two-stage process (pretreatment + enzymatic hydrolysis).
Abstract: Resource-constrained project scheduling is an NP-hard optimisation problem. There are many different heuristic strategies for shifting activities in time when resource requirements exceed the available amounts; these strategies are frequently based on activity priorities. In this paper, we assume that a suitable heuristic has been chosen to decide which activities should be performed immediately and which should be postponed, and we investigate the resource-constrained project scheduling problem (RCPSP) from the implementation point of view. We propose an efficient routine that, instead of shifting the activities, extends their duration, making it possible to break the duration down into active and sleeping subintervals. We can then apply the classical Critical Path Method, which needs only polynomial running time. This algorithm can easily be adapted for multiproject scheduling with limited resources.
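The polynomial-time CPM pass that the routine relies on can be sketched as follows, with hypothetical activity durations and precedence arcs (the active/sleeping breakdown of extended durations is omitted):

```python
# Classical Critical Path Method: one forward pass over a topologically
# ordered activity list gives earliest start/finish times in O(V + E).
durations = {"A": 3, "B": 2, "C": 4, "D": 1}            # hypothetical
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

order = ["A", "B", "C", "D"]                            # topological order
earliest_start, earliest_finish = {}, {}
for act in order:
    # An activity may start once all its predecessors have finished.
    earliest_start[act] = max(
        (earliest_finish[p] for p in predecessors[act]), default=0)
    earliest_finish[act] = earliest_start[act] + durations[act]

makespan = max(earliest_finish.values())
print(earliest_start, makespan)  # D starts at 7, project length 8
```

A symmetric backward pass over the reversed order yields latest start times and the critical path.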
Abstract: Although there are many theories and discussion of leadership, the necessity of having a new leadership paradigm was emphasized. The existing leadership characteristic of instruction and control revealed its limitations. Market competition becomes fierce and economic recession never ends worldwide. Of the leadership theories, servant leadership was introduced recently and is in line with the environmental changes of the organization. Servant leadership is a combination of two words, 'servant' and 'leader' and can be defined as the role of the leader who focuses on doing voluntary work for others with altruistic ethics, makes members, customers, and local communities a priority, and makes a commitment to satisfying their needs. This leadership received attention as one field of leadership in the late 1990s and secured its legitimacy. This study discusses the existing research trends of leadership, the concept, behavior characteristics, and lower dimensions of servant leadership, compares servant leadership with the existing leadership researches and diagnoses if servant leadership is a useful concept for further leadership researches. Finally, this study criticizes the limitations in the existing researches on servant leadership.
Abstract: This study aims to explore the differences and
similarities in perceptions of affective climate antecedents at the
workplace (intimacy, flexibility, employment stability, and team)
among Japanese and Thai Generations X and Y. The samples in this
study were Thai and Japanese workers who completed a work
environment questionnaire and provided demographic information.
Generational differences in perceptions (beliefs) of what factors
contribute to affective climate were investigated using t-test analysis.
Mean scores for each antecedent were ranked to determine how each
generation in each group prioritized the importance of all affective
climate antecedents. Japanese Generation Y perceived the importance
of employment stability for affective climate of their workplaces to be
significantly higher than did Japanese Generation X. Thai Generation
Y considered flexibility with a higher priority than did Thai
Generation X. Intimacy was perceived as highly important across
generations and countries in regard to affective climate. Results
suggest that managers should design workplaces for a mixture of
diverse generations, resulting in a better affective climate. Differences
in the importance of antecedents for affective climate among
Generations X and Y in two countries were clarified. In addition,
different preferences regarding work environment across Japanese
Generations X and Y and Thai Generations X and Y were discussed.
Abstract: Extracting in-play scenes in sport videos is essential for
quantitative analysis and effective video browsing of the sport
activities. Game analysis of badminton, as of other racket sports, requires detecting the start and end of each rally period in an automated manner. This paper describes an automatic serve scene detection method employing cubic higher-order local auto-correlation (CHLAC) and multiple regression analysis (MRA). By virtue of its shift-invariance and additivity, CHLAC can extract features of the postures and motions of multiple persons without segmenting and tracking each person, and it requires no prior knowledge. Specific scenes, such as serves, are then detected from the CHLAC features by linear regression (MRA). To demonstrate the effectiveness of our method,
the experiment was conducted on video sequences of five badminton
matches captured by a single ceiling camera. The averaged precision
and recall rates for the serve scene detection were 95.1% and 96.3%,
respectively.
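The shift-invariance property that makes (C)HLAC features attractive can be illustrated with a minimal 2-D sketch on a toy image using circular shifts; the paper's CHLAC operates on spatio-temporal cubes with a fixed displacement mask set, which this simplified version does not reproduce.

```python
import numpy as np

def hlac_features(img, offsets=((0, 0), (0, 1), (1, 0), (1, 1))):
    """Second-order local auto-correlation features: for each
    displacement r, sum over all positions x of img(x) * img(x + r),
    with circular (wrap-around) indexing."""
    return np.array([(img * np.roll(img, off, axis=(0, 1))).sum()
                     for off in offsets])

rng = np.random.default_rng(0)
img = rng.random((16, 16))

f = hlac_features(img)
f_shifted = hlac_features(np.roll(img, (3, 5), axis=(0, 1)))
print(np.allclose(f, f_shifted))  # shift-invariance: features unchanged
```

Because the features are sums over all positions, translating the scene leaves them unchanged, so no segmentation or tracking of the players is needed.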
Abstract: The self-organizing map (SOM) is a well-known neural network model with a wide range of applications. The main characteristics of the SOM are two-fold, namely dimension reduction and topology preservation: a high-dimensional data space is mapped to a low-dimensional space while the topological relations among the data are preserved. With such characteristics, the SOM is usually applied to data clustering and visualization tasks. However, the SOM has the main disadvantage that the number and structure of its neurons must be known prior to training, and these are difficult to determine. Several schemes have been proposed to tackle this deficiency, for example growing/expandable SOMs, hierarchical SOMs, and growing hierarchical SOMs. These schemes can dynamically expand the map, and even generate hierarchical maps, during training, and encouraging results have been reported. Basically, these schemes adapt the size and structure of the map according to the distribution of the training data; that is, they are data-driven, or data-oriented, SOM schemes. In this work, a topic-oriented SOM scheme suitable for document clustering and organization is developed. The proposed SOM automatically adapts the number as well as the structure of the map according to the identified topics. Unlike other data-oriented SOMs, our approach expands the map and generates the hierarchies according to both the topics and the characteristics of the neurons. Preliminary experiments give promising results and demonstrate the plausibility of the method.
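The classical SOM training loop that these schemes extend can be sketched as follows (a toy fixed-size 1-D map on synthetic 2-D data; the fixed map size chosen up front is exactly the deficiency the abstract discusses):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy data: two well-separated 2-D clusters.
data = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
                  rng.normal(1.0, 0.1, (50, 2))])

# A small fixed 1-D map of 5 neurons; this size and structure must be
# chosen before training in the classical SOM.
weights = rng.random((5, 2))
grid = np.arange(5)

for t in range(200):
    lr = 0.5 * (1 - t / 200)              # decaying learning rate
    sigma = 2.0 * (1 - t / 200) + 0.5     # decaying neighborhood radius
    x = data[rng.integers(len(data))]     # pick a random training sample
    bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
    h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2)) # neighborhood kernel
    weights += lr * h[:, None] * (x - weights)  # pull neighbors toward x

print(weights)  # neurons ordered along the data after training
```

Growing and hierarchical variants replace the fixed `weights` array with one that is expanded during training according to quantization error (or, in the topic-oriented scheme above, according to identified topics).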
Abstract: In order to develop forest management strategies for the tropical forest in Malaysia, surveying the forest resources and monitoring the forest area affected by logging activities are essential. Tremendous effort has gone into the classification of land cover related to forest resource management in this country, as it is a priority in all aspects of forest mapping using remote sensing and related technology such as GIS; indeed, classification is a compulsory step in any remote sensing research. The main objective of this paper is therefore to assess the classification accuracy of a classified forest map derived from Landsat TM data using different numbers of reference data points (200 and 388). The comparison was made between an observation approach (200 reference points) and a combined interpretation and observation approach (388 reference points). Five land cover classes, namely primary forest, logged-over forest, water bodies, bare land, and agricultural crop/mixed horticulture, could be identified by their differences in spectral response. Results showed that the overall accuracy from 200 reference points was 83.5% (kappa value 0.7502459; kappa variance 0.002871), which is considered acceptable or good for optical data. However, when the 200 reference points were increased to 388 in the confusion matrix, the accuracy improved from 83.5% to 89.17%, with the kappa statistic increasing from 0.7502459 to 0.8026135. The accuracy of this classification suggests that the strategy for the selection of training areas, the interpretation approaches, and the number of reference points used are important for achieving better classification results.
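The overall accuracy and kappa statistic reported above are computed from the confusion matrix as follows (a hypothetical 3-class matrix for illustration, not the paper's data):

```python
import numpy as np

# Hypothetical confusion matrix: rows = reference, columns = classified.
cm = np.array([[50,  3,  2],
               [ 4, 40,  1],
               [ 1,  2, 30]], dtype=float)

n = cm.sum()
overall_accuracy = np.trace(cm) / n   # correctly classified fraction

# Cohen's kappa: observed agreement p_o corrected for the chance
# agreement p_e implied by the row and column marginals.
p_o = overall_accuracy
p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2
kappa = (p_o - p_e) / (1 - p_e)

print(overall_accuracy, kappa)
```

Adding reference points changes the matrix counts, which is how the accuracy and kappa improve from 83.5%/0.75 to 89.17%/0.80 in the study.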
Abstract: We constructed a method of phase unwrapping for a typical wave-front by utilizing the maximizer of the posterior marginal (MPM) estimate corresponding to equilibrium statistical mechanics of the three-state Ising model on a square lattice on the basis of an analogy between statistical mechanics and Bayesian inference. We investigated the static properties of an MPM estimate from a phase diagram using Monte Carlo simulation for a typical wave-front with synthetic aperture radar (SAR) interferometry. The simulations clarified that the surface-consistency conditions were useful for extending the phase where the MPM estimate was successful in phase unwrapping with a high degree of accuracy and that introducing prior information into the MPM estimate also made it possible to extend the phase under the constraint of the surface-consistency conditions with a high degree of accuracy. We also found that the MPM estimate could be used to reconstruct the original wave-fronts more smoothly, if we appropriately tuned hyper-parameters corresponding to temperature to utilize fluctuations around the MAP solution. Also, from the viewpoint of statistical mechanics of the Q-Ising model, we found that the MPM estimate was regarded as a method for searching the ground state by utilizing thermal fluctuations under the constraint of the surface-consistency condition.
Abstract: Testable software has two inherent properties: observability and controllability. Observability facilitates observation of the internal behavior of software to the required degree of detail. Controllability allows creation of difficult-to-achieve states prior to the execution of various tests. In this paper, we describe COTT, a Controllability and Observability Testing Tool, for creating testable object-oriented software. COTT provides a framework that helps the user instrument object-oriented software to build the required controllability and observability. During testing, the tool facilitates the creation of difficult-to-achieve states required for testing difficult-to-test conditions and the observation of internal details of execution at the unit, integration, and system levels. The execution observations are logged in a test log file, which is used for post-analysis and to generate test coverage reports.
Abstract: This research proposes a Preemptive Possibilistic
Linear Programming (PPLP) approach for solving multiobjective
Aggregate Production Planning (APP) problem with interval demand
and imprecise unit price and related operating costs. The proposed
approach attempts to maximize profit and minimize changes of
workforce. It transforms the total profit objective that has imprecise
information to three crisp objective functions, which are maximizing
the most possible value of profit, minimizing the risk of obtaining the
lower profit and maximizing the opportunity of obtaining the higher
profit. The change of workforce level objective is also converted.
Then, the problem is solved according to the objective priorities. This is easier than solving the multiobjective problem simultaneously, as is done in existing approaches. The possible range of the interval demand is also used to increase the flexibility of obtaining a better production plan. A practical application to an electronics company is presented to
show the effectiveness of the proposed model.
Abstract: A challenging problem in radar signal processing is to
achieve reliable target detection in the presence of interferences. In
this paper, we propose a novel algorithm for automatic censoring of
radar interfering targets in log-normal clutter. The proposed
algorithm, termed the forward automatic censored cell averaging
detector (F-ACCAD), consists of two steps: removing the corrupted
reference cells (censoring) and the actual detection. Both steps are
performed dynamically by using a suitable set of ranked cells to
estimate the unknown background level and set the adaptive
thresholds accordingly. The F-ACCAD algorithm requires neither prior information about the clutter parameters nor knowledge of the number of interfering targets. The effectiveness of the F-ACCAD
algorithm is assessed by computing, using Monte Carlo simulations,
the probability of censoring and the probability of detection in
different background environments.
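The censoring-then-detection idea can be sketched as follows: a simplified ordered-statistic censored cell-averaging detector on synthetic exponential clutter. The actual F-ACCAD operates in log-normal clutter and selects the censoring point and thresholds dynamically rather than using the fixed values assumed here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Reference window: homogeneous clutter plus two interfering targets.
reference = rng.exponential(1.0, 16)
reference[3] += 20.0    # interfering target in the reference cells
reference[11] += 25.0   # another interferer

# Step 1 (censoring): rank the cells and keep only the k lowest,
# discarding cells likely corrupted by interfering targets.
k = 12
censored = np.sort(reference)[:k]

# Step 2 (detection): estimate the background level from the censored
# cells and compare the cell under test against an adaptive threshold.
background = censored.mean()
scale = 5.0                         # set by the desired false-alarm rate
threshold = scale * background

cell_under_test = 30.0              # strong target return
print(cell_under_test > threshold)  # True: target declared
```

Without the censoring step, the two interferers would inflate the background estimate and could mask the target (the classical CFAR capture effect).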
Abstract: Throughput is an important measure of the performance of a production system. Analyzing and modeling production throughput is complex in today's dynamic production systems because of their uncertainties: uncertainties materialize when the production line faces changes in setup time, machinery breakdowns, manufacturing lead time, and scrap. Besides, demand fluctuates from time to time for each product type. These uncertainties affect production performance. This paper proposes Bayesian inference for throughput modeling under five production uncertainties. The Bayesian model uses prior distributions that capture previous information about the uncertainties, while the likelihood distributions are associated with the observed data. The Gibbs sampling algorithm, a robust Markov chain Monte Carlo procedure, was employed for sampling the unknown parameters and estimating the posterior means of the uncertainties. The Bayesian model was validated with respect to the convergence and efficiency of its outputs. The results show that the proposed Bayesian models were capable of predicting the production throughput with an accuracy of 98.3%.
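The Gibbs-sampling step can be illustrated for a single uncertainty modeled as a normal variable with unknown mean and precision under conjugate priors. This is a minimal single-factor sketch with synthetic data, not the paper's five-uncertainty model.

```python
import numpy as np

rng = np.random.default_rng(7)
data = rng.normal(100.0, 5.0, 200)   # e.g. observed cycle times (synthetic)
n, xbar = len(data), data.mean()

# Conjugate priors: mu | tau ~ N(mu0, 1/(k0*tau)), tau ~ Gamma(a0, rate b0).
mu0, k0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3

mu, tau = xbar, 1.0
samples = []
for it in range(3000):
    # Draw mu from its normal full conditional given tau and the data.
    prec = (k0 + n) * tau
    mean = (k0 * mu0 + n * xbar) / (k0 + n)
    mu = rng.normal(mean, 1 / np.sqrt(prec))
    # Draw tau from its gamma full conditional given mu and the data.
    a = a0 + (n + 1) / 2
    b = b0 + 0.5 * (((data - mu) ** 2).sum() + k0 * (mu - mu0) ** 2)
    tau = rng.gamma(a, 1 / b)        # numpy uses shape/scale, so pass 1/b
    if it >= 500:                    # discard burn-in draws
        samples.append(mu)

posterior_mean = np.mean(samples)
print(posterior_mean)                # close to the true mean of 100
```

Alternating draws from the two full conditionals forms the Markov chain; the posterior mean of each uncertainty is then estimated by averaging the retained draws, as in the paper.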
Abstract: The development of Internet technology in recent years has led to a more active role of users in creating Web content. This has significant effects on both individual learning and collaborative knowledge building. This paper presents an integrative framework model to describe and explain learning and knowledge building with shared digital artifacts on the basis of Luhmann's systems theory and Piaget's model of equilibration. In this model, knowledge progress is based on cognitive conflicts resulting from incongruities between an individual's prior knowledge and the information contained in a digital artifact. Empirical support for the model is provided by 1) applying it descriptively to texts from Wikipedia, 2) examining knowledge-building processes using a social network analysis, and 3) presenting a survey of a series of experimental laboratory studies.
Abstract: The demand for energy is increasing faster than its generation, leading to power shortages in all sectors of society, and at peak hours the shortage is even higher. Unless we utilize energy-efficient technology, it is very difficult to minimize the shortage of energy, so energy efficiency programs and energy conservation have an important role. Energy-efficient technologies are cost-intensive, and hence not always possible to implement in a country like India. In the present study, an educational building with operating hours from 10:00 a.m. to 05:00 p.m. was selected to quantify the potential for lighting energy conservation. As the operating hours are in the daytime, integrating daylight with the artificial lighting system will definitely reduce the lighting energy consumption. Moreover, since the initial investment was given priority, the existing lighting installation was left unaltered. An automatic controller has been designed that operates as a function of the daylight through the windows, and the lighting system of the room functions accordingly. The results of integrating daylight were quite satisfactory for visual comfort as well as energy conservation.
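The daylight-linked switching logic described above can be sketched as a simple rule with hysteresis; the lux thresholds here are hypothetical, and the paper's controller design and setpoints may differ.

```python
def lamps_on(daylight_lux, state, on_below=300, off_above=500):
    """On/off daylight-linked switching of the (unaltered) lighting
    installation, with hysteresis so the lamps do not chatter around a
    single threshold. Threshold values are hypothetical."""
    if daylight_lux < on_below:
        return True                 # too dark: switch the lamps on
    if daylight_lux > off_above:
        return False                # enough daylight: switch them off
    return state                    # inside the dead band: keep state

print(lamps_on(200, False))  # dark morning: switch on
print(lamps_on(400, True))   # dead band: stay on
print(lamps_on(650, True))   # bright afternoon: switch off
```

The hysteresis band avoids rapid on/off cycling as clouds pass, which matters both for occupant comfort and for lamp life.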
Abstract: In today's world, where everything is changing rapidly and information technology is highly developed, many features of culture, society, politics, and the economy have changed. The advent of information technology and electronic data transmission has made communication easy, and fields like e-learning and e-commerce are easily accessible to everyone. One of these technologies is virtual training, and the quality of such education systems is critical. In this research, 131 questionnaires were prepared and distributed among university students at Toba University to identify the factors that affect the quality of learning from the perspective of the staff, students, professors, and the university itself. It is concluded that the important factors in virtual training are the quality of the professors, the quality of the staff, and the quality of the university; these factors have the highest priority in this education system and are necessary for improving virtual training.
Abstract: In this paper, the statistical properties of filtered or convolved signals are considered by deriving the resulting density functions, as well as the exact mean and variance expressions, given prior knowledge about the statistics of the individual signals in the filtering or convolution process. It is shown that the density function after linear convolution is a mixture density, where the number of density components is equal to the number of observations of the shortest signal. For circular convolution, the observed samples are characterized by a single density function, which is a sum of products.
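The mean and variance expressions can be checked numerically for the simplest case of an i.i.d. input filtered by a fixed short filter, where the standard results E[y] = mu_x * sum(h_k) and Var[y] = sigma_x^2 * sum(h_k^2) hold for fully overlapped output samples. This Monte Carlo sketch only verifies those two moments; the paper derives the full density functions.

```python
import numpy as np

rng = np.random.default_rng(0)
h = np.array([0.5, 1.0, -0.25])          # fixed filter (the short signal)
mu_x, sigma_x = 2.0, 3.0                 # i.i.d. input statistics

# Monte Carlo over many realizations of the random input signal.
trials, n = 20000, 64
x = rng.normal(mu_x, sigma_x, (trials, n))
y = np.array([np.convolve(row, h, mode="valid") for row in x])

sample_mean = y.mean()
sample_var = y.var()

print(sample_mean, mu_x * h.sum())                 # ~ 2.0 * 1.25 = 2.5
print(sample_var, sigma_x ** 2 * (h ** 2).sum())   # ~ 9.0 * 1.3125
```

`mode="valid"` restricts the check to output samples where the filter fully overlaps the input; the partially overlapped edge samples are exactly the ones whose differing densities give rise to the mixture described in the abstract.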
Abstract: The Available Bit Rate (ABR) service is the lower-priority service and the better service for data transmission. On wireline ATM networks, an ABR source continuously receives feedback from the switches about increases or decreases in bandwidth according to the changing network conditions, and a minimum bandwidth is guaranteed. In wireless networks, guaranteeing the minimum bandwidth is a challenging task, as the source is mobile, traveling from one cell to another, and re-establishing the virtual circuits from start to end each time causes transmission delay. In our proposed solution, we present a mechanism to provide more available bandwidth to the ABR source by reusing part of the old Virtual Channels and establishing new ones. We want the ABR source to transmit data continuously (non-stop) in order to avoid the delay; in the worst-case scenario, at least the minimum bandwidth is allocated. In order to keep the data flowing continuously, priority is given to the handoff ABR call over a new ABR call.
Abstract: In this paper, we present a user pattern learning algorithm based MDSS (Medical Decision Support System) for a ubiquitous environment. Most research focuses on hardware systems, hospital management, and the overall concept of a ubiquitous environment, even though these are hard to implement. Our objective in this paper is to design an MDSS framework that helps patients with medical treatment and with prevention for high-risk patients (COPD, heart disease, diabetes). The framework consists of a database, a CAD (computer-aided diagnosis support) system, and a CAP (computer-aided user vital sign prediction) system. It can be applied to develop user pattern learning algorithm based MDSSs for homecare and silver town (retirement community) services. In particular, the CAD component has decision-making competency: it compares the current vital signs with the user's normal-condition pattern data. In addition, the CAP predicts the user's vital signs using the patient's past data. The novel approach uses a neural network method, wireless vital sign acquisition devices, and a personal computer database system. An intelligent agent based MDSS will help elderly people and high-risk patients avoid sudden death and disease, give the physician online access to patients' data, and support the planning of medication service priorities (e.g., emergency cases).