Abstract: Despite many success stories of manufacturing safety, many organizations remain reluctant to invest in it, perceiving it as cost-increasing and time-consuming. A clear contributor may be the use of lagging indicators rather than leading-indicator measures. The study therefore proposes a combinatorial model for determining the best safety strategy. Combination theory and cost-benefit analysis were employed to develop a monetary saving/loss function in terms of the value of prevention and the cost of the prevention strategy. Documentation, interviews and a structured questionnaire were employed to collect before-and-after safety-programme records from a tobacco company for the periods 1993-2001 (pre-safety) and 2002-2008 (safety) for the model application. Three combinatorial alternatives A, B and C were obtained, yielding 4, 6 and 4 strategies respectively, with PPE and training being predominant. A total of 728 accidents were recorded over the 9-year pre-safety period and 163 accidents over the 7-year safety period. Six prevention activities (alternative B) yielded the best results, with savings in all years of operation except 2004. The study provides a leading resource for planning a successful safety programme.
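The combinatorial search described above can be sketched as a brute-force enumeration of prevention-activity combinations scored by a net monetary saving. The activity names and the `value`/`cost` callables below are illustrative assumptions, not the study's actual data or functions:

```python
from itertools import combinations

def best_strategy(activities, value, cost, k):
    """Enumerate all k-activity combinations of prevention activities and
    return the one maximizing net saving = value(combo) - cost(combo).
    `value` and `cost` are illustrative callables mapping a combination
    to money, not the study's exact functions."""
    best, best_net = None, float("-inf")
    for combo in combinations(activities, k):
        net = value(combo) - cost(combo)
        if net > best_net:
            best, best_net = combo, net
    return best, best_net
```

With, say, three hypothetical activities and additive value/cost, the call `best_strategy(["PPE", "Training", "Audit"], value, cost, 2)` returns the two-activity combination with the highest net saving.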
Abstract: Reverse engineering of full-genomic interaction networks based on compendia of expression data has been successfully applied to a number of model organisms. This study adapts these approaches for an important non-model organism: the major human fungal pathogen Candida albicans. During the infection process, the pathogen can adapt to a wide range of environmental niches and reversibly change its growth form. Given the importance of these processes, it is essential to know how they are regulated. This study presents a reverse engineering strategy able to infer full-genomic interaction networks for C. albicans based on linear regression, utilizing the sparseness criterion (LASSO). To overcome the limited amount of expression data and the small number of known interactions, we utilize different prior-knowledge sources to guide the network inference towards a knowledge-driven solution. Since no database of known interactions for C. albicans exists, we use a text-mining system which utilizes full-text research papers to identify known regulatory interactions. By comparing with these known regulatory interactions, we find an optimal value for the global modelling parameters weighting the influence of the sparseness criterion and the prior knowledge. Furthermore, we show that soft integration of prior knowledge additionally improves the performance. Finally, we compare the performance of our approach to state-of-the-art network inference approaches.
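A minimal sketch of the per-gene LASSO regression step, using scikit-learn's `Lasso` on synthetic data (the prior-knowledge weighting described in the abstract is omitted here; it could be emulated, for instance, by rescaling predictor columns):

```python
import numpy as np
from sklearn.linear_model import Lasso

def infer_network(X, alpha=0.1):
    """Infer a gene-regulatory adjacency matrix by sparse linear regression.

    X: (samples, genes) expression matrix. For each target gene g, its
    profile is regressed on all other genes; non-zero coefficients are
    taken as candidate regulatory edges (regulator -> target)."""
    n_genes = X.shape[1]
    W = np.zeros((n_genes, n_genes))          # W[r, g]: weight of edge r -> g
    for g in range(n_genes):
        predictors = np.delete(X, g, axis=1)  # all genes except the target
        model = Lasso(alpha=alpha, max_iter=10000)
        model.fit(predictors, X[:, g])
        idx = [r for r in range(n_genes) if r != g]
        W[idx, g] = model.coef_
    return W
```

On a toy matrix where one gene linearly drives another, the corresponding entry of `W` comes out clearly non-zero while unrelated genes receive (near-)zero weights; the sparseness penalty `alpha` plays the role of the global modelling parameter mentioned in the abstract.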
Abstract: In the current economy of increasing global
competition, many organizations are attempting to use knowledge as
one of the means to gain sustainable competitive advantage. Besides
large organizations, the success of SMEs can be linked to how well
they manage their knowledge. Despite the profusion of research
about knowledge management (KM) within large organizations, few
studies have analyzed KM in SMEs.
This research proposes a new framework showing the determinant
role of organizational dimensions on KM approaches. The paper
and its propositions are based on a literature review and analysis.
In this research, personalization versus codification,
individualization versus institutionalization, and IT-based versus
non-IT-based are highlighted as three distinct dimensions of knowledge
management approaches.
The study contributes to research by providing a more nuanced
classification of KM approaches and provides guidance to managers
about the types of KM approaches that should be adopted based on
the size, geographical dispersion and task nature of SMEs.
To the author's knowledge, the paper is the first of its kind to
examine whether there are suitable configurations of KM approaches for
SMEs with different dimensions. It gives valuable information, which
will hopefully help the SME sector to accomplish KM.
Abstract: An autonomous environmental monitoring system
(Smart Landfill) has been constructed for the quantitative
measurement of the components of landfill gas found at borehole
wells at the perimeter of landfill sites. The main components of
landfill gas are the greenhouse gases methane and carbon dioxide,
which have been monitored in the range 0-5 % by volume. This monitoring
system has not only been tested in the laboratory but has been
deployed in multiple field trials and the data collected successfully
compared with on-site monitors. This success shows the potential of
this system for application in environments where reliable gas
monitoring is crucial.
Abstract: While financial institutions have faced difficulties
over the years for a multitude of reasons, the major cause of serious
banking problems continues to be directly related to lax credit
standards for borrowers and counterparties, poor portfolio risk
management, or a lack of attention to changes in economic or other
circumstances that can lead to a deterioration in the credit standing of
a bank's counterparties. Credit risk is most simply defined as the
potential that a bank borrower or counterparty will fail to meet its
obligations in accordance with agreed terms. The goal of credit risk
management is to maximize a bank's risk-adjusted rate of return by
maintaining credit risk exposure within acceptable parameters. Banks
need to manage the credit risk inherent in the entire portfolio as well
as the risk in individual credits or transactions. Banks should also
consider the relationships between credit risk and other risks. The
effective management of credit risk is a critical component of a
comprehensive approach to risk management and essential to the
long-term success of any banking organization. In this research we
also study the relationship between credit risk indices and borrowers'
timely payback in Karafarin Bank.
Abstract: This paper adopts a notion of expectation-perception
gap of systems users as information systems (IS) failure. Problems
leading to the expectation-perception gap are identified and modelled
as five interrelated discrepancies or gaps throughout the process of
information systems development (ISD). It describes an empirical
study on how systems developers and users perceive the size of each
gap and the extent to which each problematic issue contributes to the
gap. The key to achieving success in ISD is to keep the expectationperception
gap closed by closing all 5 pertaining gaps. The gap model
suggests that most factors in IS failure are related to organizational,
cognitive and social aspects of information systems design.
Organization requirement analysis, being the weakest link of IS
development, is particularly worthy of investigation.
Abstract: The forces driving international markets are changing
continuously; therefore, companies need to gain a competitive edge in
such markets. Improving the company's products, processes and
practices is no longer auxiliary. Lean production is a production
management philosophy that consolidates work tasks with minimum
waste resulting in improved productivity. Lean production practices
can be mapped into many production areas. One of these is
Manufacturing Equipment and Technology (MET). Many lean
production practices can be implemented in MET, namely, specific
equipment configurations, total preventive maintenance, visual
control, new equipment/ technologies, production process
reengineering and shared vision of perfection. The purpose of this
paper is to investigate the implementation level of these six practices
in Jordanian industries. To achieve that a questionnaire survey has
been designed according to five-point Likert scale. The questionnaire
is validated through pilot study and through experts review. A sample
of 350 Jordanian companies was surveyed; the response rate was
83%. The respondents were asked to rate the extent of
implementation of each of the practices. A relationship conceptual
model is developed, hypotheses are proposed, and consequently the
essential statistical analyses are then performed. An assessment tool
that enables management to monitor the progress and the
effectiveness of lean practices implementation is designed and
presented. The results show that the average implementation level of
lean practices in MET is 77%, that Jordanian companies are
successfully implementing the considered lean production practices,
and that the presented model has a Cronbach's alpha value of 0.87,
which is good evidence of model consistency and result validity.
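The reported Cronbach's alpha of 0.87 is computed from the item variances and the variance of the total score; a minimal implementation (not the authors' code) is:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    where k is the number of items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Two perfectly correlated items give alpha = 1, and weakly related items drive it towards 0; values around 0.87, as reported, are conventionally read as good internal consistency.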
Abstract: Meeting users' requirements is one of the predictors of project success. There should be a match between the expectations of the users and the perception of key project personnel with respect to usability and functionality. The aim of this study is to compare key project personnel's and potential users' (customer representatives') evaluations of the relative importance of usability and functionality factors in a software design project. The Analytic Network Process (ANP) was used to analyze the relative importance of the factors. The results show that navigation and interaction are the most significant factors, and satisfaction and efficiency are the least important factors, for both groups. Further, it can be concluded that the similar orderings and scores of usability and functionality factors for both groups show that key project personnel have captured the expectations and requirements of potential users accurately.
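In ANP, global priorities are read off the limit of powers of the weighted supermatrix. A sketch of that limiting step, assuming the pairwise comparisons have already been aggregated into a column-stochastic supermatrix (the matrix below is illustrative):

```python
import numpy as np

def limit_priorities(supermatrix, tol=1e-9, max_iter=100):
    """Raise a column-stochastic ANP supermatrix to successive powers
    (by repeated squaring) until it converges; the columns of the limit
    matrix give the global priorities of the elements."""
    W = np.asarray(supermatrix, dtype=float)
    for _ in range(max_iter):
        W2 = W @ W
        if np.max(np.abs(W2 - W)) < tol:
            return W2
        W = W2
    return W
```

For a primitive stochastic matrix every column of the limit is the same stationary priority vector, which is what ANP reports as the global ranking of factors.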
Abstract: We review a knowledge extractor model in
constructing 3G Killer Applications. The success of 3G is essential
for the Government, as it became part of the Telecommunications
National Strategy. The 3G wireless technologies may reach a larger
area and increase the country's ICT penetration. In order to
understand future customers' needs, the operators require the proper
information (knowledge) lying inside. Our work approached future
customers as a complex system whose complex knowledge may expose
regular behavior. The hidden information from 3G future customers is
revealed by using fractal-based questionnaires. Afterward, further
statistical analysis is used to match the results with the operator's
strategic plan. The development of 3G applications also considers
their saturation time and the further improvement of the applications.
Abstract: As more people from non-technical backgrounds
are becoming directly involved with large-scale ontology
development, the focal point of ontology research has shifted
from the more theoretical ontology issues to problems
associated with the actual use of ontologies in real-world,
large-scale collaborative applications. Recently the National
Science Foundation funded a large collaborative ontology
development project for which a new formal ontology model,
the Ontology Abstract Machine (OAM), was developed to
satisfy some unique functional and data representation
requirements. This paper introduces the OAM model and the
related algorithms that enable maintenance of an ontology that
supports node-based user access. The successful software
implementation of the OAM model and its subsequent
acceptance by a large research community demonstrate its validity
and its real-world application value.
Abstract: Information Technology (IT) projects are always
accompanied by various risks and because of high rate of failure in
such projects, managing risks in order to neutralize or at least
decrease their effects on the success of the project is strongly
essential. In this paper, the fuzzy analytic hierarchy process (FAHP)
is exploited as a risk evaluation methodology to prioritize and
organize the risk factors faced in IT projects. A real case, the
design and implementation of an integrated information system at a
vehicle-manufacturing company in Iran, is studied. Related
risk factors are identified and then expert qualitative judgments about
these factors are acquired. Translating these judgments to fuzzy
numbers and using them as an input to FAHP, risk factors are then
ranked and prioritized by FAHP in order to make project managers
aware of the more important risks and to enable them to adopt
suitable measures to deal with the most damaging ones.
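One simplified way to turn triangular fuzzy judgments into a crisp risk ranking is centroid defuzzification followed by a standard AHP principal-eigenvector computation; the study may use a different FAHP variant (e.g. extent analysis), so this is only an illustrative sketch:

```python
import numpy as np

def fahp_weights(fuzzy_matrix):
    """Risk weights from an (n x n) matrix of triangular fuzzy pairwise
    judgments, each given as (l, m, u).

    Simplified FAHP: defuzzify each judgment by its centroid (l+m+u)/3,
    then take the principal eigenvector of the resulting crisp matrix
    as the priority (risk-ranking) vector, normalized to sum to 1."""
    tri = np.asarray(fuzzy_matrix, dtype=float)   # shape (n, n, 3)
    crisp = tri.mean(axis=2)                      # centroid defuzzification
    vals, vecs = np.linalg.eig(crisp)
    principal = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
    return principal / principal.sum()
```

Feeding in a judgment like "risk 1 is about twice as important as risk 2", encoded as the triangular number (1, 2, 3) with its reciprocal (1/3, 1/2, 1), yields a weight vector ranking risk 1 first.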
Abstract: This paper presents the transient population dynamics of phase singularities in the 2D Beeler-Reuter model. Two stochastic modelling approaches are examined: (i) the master equation approach with transition rates λ(n, t) = λ(t)n and μ(n, t) = μ(t)n, and (ii) the nonlinear Langevin equation approach with multiplicative noise. The exact general solution of the master equation with arbitrary time-dependent transition rates is given, followed by the exact solution of the mean-field equation for the nonlinear Langevin equation. It is demonstrated that the transient population dynamics is successfully described by a generalized logistic equation with a fractional higher-order nonlinear term. The necessity of introducing a time-dependent transition rate in the master equation approach, in order to incorporate the effect of nonlinearity, is also demonstrated.
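A generalized logistic equation with a fractional higher-order nonlinear term, dn/dt = lam*n - mu*n**(1+delta), can be integrated numerically; the parameter values and the plain Euler scheme below are illustrative only:

```python
import numpy as np

def generalized_logistic(n0, lam, mu, delta, dt=1e-3, steps=20000):
    """Euler integration of dn/dt = lam*n - mu*n**(1 + delta), a
    generalized logistic equation with a fractional higher-order
    nonlinear term. Returns the trajectory as an array."""
    n = n0
    traj = [n]
    for _ in range(steps):
        n = n + dt * (lam * n - mu * n ** (1 + delta))
        traj.append(n)
    return np.array(traj)
```

The population relaxes to the fixed point n* = (lam/mu)**(1/delta); with lam = mu = 1 and delta = 0.5 this is n* = 1, which the trajectory approaches monotonically from a small initial value.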
Abstract: This paper presents a novel approach for representing
the spatio-temporal topology of the camera network with overlapping
and non-overlapping fields of view (FOVs). The topology is
determined by tracking moving objects and establishing object
correspondence across multiple cameras. To track people successfully
in multiple camera views, we used the Merge-Split (MS) approach for
object occlusion in a single camera and the grid-based approach for
extracting accurate object features. In addition, we considered the
appearance of people and the transition time between entry and exit
zones for tracking objects across blind regions of multiple cameras
with non-overlapping FOVs. The main contribution of this paper is to
estimate transition times between various entry and exit zones, and to
graphically represent the camera topology as an undirected weighted
graph using the transition probabilities.
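The undirected weighted graph over entry/exit zones can be represented as an adjacency dictionary whose edge weights are empirical transition probabilities; the zone names below are hypothetical:

```python
from collections import Counter, defaultdict

def build_topology(transitions):
    """Build an undirected weighted graph over entry/exit zones.

    transitions: list of (zone_a, zone_b) pairs observed when a tracked
    object exits one camera's zone and re-appears at another (zones
    assumed distinct). The edge weight is the empirical probability of
    that pair among all observed transitions."""
    counts = Counter(frozenset(t) for t in transitions)
    total = sum(counts.values())
    graph = defaultdict(dict)
    for pair, c in counts.items():
        a, b = sorted(pair)
        graph[a][b] = graph.setdefault(b, {})[a] = c / total
    return dict(graph)
```

Because the pairs are stored as unordered sets, the weight is symmetric, matching the undirected graph representation described in the abstract; transition-time statistics could be attached to the same edges.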
Abstract: Retinal vascularity assessment plays an important role in the diagnosis of ophthalmic pathologies. The use of digital images for this purpose makes a computerized approach possible and has motivated the development of many methods for automated vascular tree segmentation. Metrics based on contingency tables for binary classification have been widely used for evaluating the performance of these algorithms and, concretely, accuracy has mostly been used as the measure of global performance in this topic. However, this metric matches human perception very poorly and has other notable deficiencies. Here, a new similarity function for measuring the quality of retinal vessel segmentations is proposed. This similarity function is based on characterizing the vascular tree as a connected structure with a measurable area and length. Tests indicate that this new approach behaves better than the current one. More generally, this concept of measuring descriptive properties may be used to design functions that more successfully measure the segmentation quality of other complex structures.
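One way to combine such descriptive properties into a single score is to take the geometric mean of the agreement ratios of area and length between the segmented and reference trees; this is an illustrative stand-in, not the exact function proposed in the study:

```python
import math

def vessel_similarity(area_seg, length_seg, area_ref, length_ref):
    """Illustrative similarity score in [0, 1]: geometric mean of the
    min/max ratios of total vessel area and total vessel length between
    a segmentation and a reference. Equals 1 only when both descriptors
    match the reference exactly."""
    r_area = min(area_seg, area_ref) / max(area_seg, area_ref)
    r_len = min(length_seg, length_ref) / max(length_seg, length_ref)
    return math.sqrt(r_area * r_len)
```

Unlike pixel-wise accuracy, a descriptor-based score of this kind penalizes a segmentation that misses half the vascular tree even when the image is dominated by background pixels.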
Abstract: Decisions are regularly made during a project or
daily life. Some decisions are critical and have a direct impact on
project or human success. Formal evaluation is thus required,
especially for crucial decisions, to arrive at the optimal solution
among alternatives to address issues. According to microeconomic
theory, all people's decisions can be modeled as indifference curves.
The proposed approach supports formal analysis and decision-making by
constructing an indifference curve model from previous experts'
decision criteria. The knowledge embedded in the system can be reused
to help naïve users select an alternative solution to a similar
problem. Moreover, the method is flexible enough to cope with an
unlimited number of factors influencing the decision-making. In
preliminary experiments, the alternative selections accurately
matched the experts' decisions.
Abstract: User-based Collaborative filtering (CF), one of the
most prevailing and efficient recommendation techniques, provides
personalized recommendations to users based on the opinions of other
users. Although the CF technique has been successfully applied in
various applications, it suffers from serious sparsity problems. The
cloud-model approach addresses the sparsity problem by
constructing the user's global preference, represented by a cloud
eigenvector. The user-based CF approach works well with dense
datasets, while the cloud-model CF approach performs better when the
dataset is sparse. In this paper, we present a
hybrid approach that integrates the predictions from both the
user-based CF and the cloud-model CF approaches. The experimental
results show that the proposed hybrid approach can ameliorate the
sparsity problem and provide an improved prediction quality.
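A sketch of the hybrid blending step: a user-based Pearson-weighted prediction is mixed with a global-preference estimate. Here the user's mean rating serves as a simple stand-in for the cloud eigenvector, and the blend weight `lam` is an assumed parameter:

```python
import numpy as np

def user_cf_predict(R, u, i):
    """User-based CF: Pearson-weighted average of other users' ratings
    on item i. A rating of 0 in R means 'missing'. Returns None when no
    usable neighbour exists (the sparse case)."""
    mask_u = R[u] > 0
    num, den = 0.0, 0.0
    for v in range(R.shape[0]):
        if v == u or R[v, i] == 0:
            continue
        common = mask_u & (R[v] > 0)          # items rated by both users
        if common.sum() < 2:
            continue
        w = np.corrcoef(R[u, common], R[v, common])[0, 1]
        if np.isnan(w):
            continue
        num += w * R[v, i]
        den += abs(w)
    return num / den if den > 0 else None

def hybrid_predict(R, u, i, lam=0.5):
    """Blend the user-based prediction with a global-preference estimate
    (user u's mean rating, a stand-in for the cloud-model component)."""
    global_pref = R[u][R[u] > 0].mean()
    local = user_cf_predict(R, u, i)
    if local is None:                         # sparse case: fall back
        return global_pref
    return lam * local + (1 - lam) * global_pref
```

When the data are dense the local term dominates the prediction quality; when they are sparse the global term takes over, which is exactly the complementarity the hybrid approach exploits.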
Abstract: The spiral development model has been used
successfully in many commercial systems and in a good number of
defense systems. This is due to its cost-effective incremental
commitment of funds (via an analogy of the spiral model to stud
poker) and its ability to develop hardware or to integrate
software, hardware, and systems. To support adaptive, semantic
collaboration between domain experts and knowledge engineers, a
new knowledge engineering process, called Spiral_OWL is proposed.
This model is based on the idea of iterative refinement, annotation
and structuring of the knowledge base. The Spiral_OWL model is
generated based on the spiral model and knowledge engineering
methodology. A central paradigm of the Spiral_OWL model is its
concentration on risk-driven determination of the knowledge
engineering process. The collaboration aspect comes into play during
the knowledge acquisition and knowledge validation phases. The design
rationale for the Spiral_OWL model is an easy-to-implement,
well-organized, iterative development cycle that proceeds as an
expanding spiral.
Abstract: Flow movement in unsaturated soil can be expressed
by a partial differential equation, known as the Richards equation.
The objective of this study is to find an appropriate implicit
numerical solution for the head-based Richards equation. Some of the
well-known finite difference schemes (fully implicit, Crank-Nicolson
and Runge-Kutta) have been utilized in this study. In addition, the
effects of different approximations of moisture capacity function,
convergence criteria and time stepping methods were evaluated. Two
different infiltration problems were solved to investigate the
performance of different schemes. These problems comprise vertical
water flow in wet and very dry soils. The numerical solutions of the
two problems were compared using four evaluation criteria and the
results of the comparisons showed that the fully implicit scheme is
better than the other schemes. In addition, using the standard
chord-slope method to approximate the moisture capacity function, an
automatic time stepping method and the difference between two
successive iterations as the convergence criterion in the fully
implicit scheme can lead to better and more reliable results for
simulating fluid movement in different unsaturated soils.
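A minimal sketch of one fully implicit time step, for a linearized head-based equation with constant hydraulic conductivity K and moisture capacity C (the actual scheme in the study iterates with h-dependent C(h) and K(h), e.g. via Picard iteration over the moisture capacity approximation):

```python
import numpy as np

def implicit_step(h, dz, dt, K, C):
    """One fully implicit step of the linearized head-based equation
    C * dh/dt = K * d2h/dz2 on a 1D grid with fixed-head (Dirichlet)
    boundaries. Assembles and solves the tridiagonal system directly."""
    n = len(h)
    r = K * dt / (C * dz ** 2)
    A = np.zeros((n, n))
    A[0, 0] = A[-1, -1] = 1.0            # Dirichlet boundary rows
    for i in range(1, n - 1):
        A[i, i - 1] = -r                 # backward-difference neighbour
        A[i, i] = 1 + 2 * r              # implicit diagonal term
        A[i, i + 1] = -r                 # forward-difference neighbour
    return np.linalg.solve(A, h.copy())
```

The fully implicit scheme is unconditionally stable, which is why it tolerates the large head gradients of the very dry infiltration problem better than explicit alternatives; a steady linear head profile is left unchanged by the step, as expected.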
Abstract: Amongst the consistently fluctuating conditions
prevailing today, changeability represents a strategic key factor for a
manufacturing company to achieve success on the international
markets. In order to cope with turbulence and the increasing level of
incalculability, the focus here is not only on the flexible design of
production systems but in particular on the employee as an enabler of
change. It is important to enable employees of manufacturing
companies to participate actively in change events and in change
decisions. To this end, the learning factory has been created, which is
intended to serve the development of change-promoting competences
and to sensitize employees to the necessity of change.
Abstract: We constructed a method of phase unwrapping for a typical wave-front by utilizing the maximizer of the posterior marginal (MPM) estimate corresponding to equilibrium statistical mechanics of the three-state Ising model on a square lattice on the basis of an analogy between statistical mechanics and Bayesian inference. We investigated the static properties of an MPM estimate from a phase diagram using Monte Carlo simulation for a typical wave-front with synthetic aperture radar (SAR) interferometry. The simulations clarified that the surface-consistency conditions were useful for extending the phase where the MPM estimate was successful in phase unwrapping with a high degree of accuracy and that introducing prior information into the MPM estimate also made it possible to extend the phase under the constraint of the surface-consistency conditions with a high degree of accuracy. We also found that the MPM estimate could be used to reconstruct the original wave-fronts more smoothly, if we appropriately tuned hyper-parameters corresponding to temperature to utilize fluctuations around the MAP solution. Also, from the viewpoint of statistical mechanics of the Q-Ising model, we found that the MPM estimate was regarded as a method for searching the ground state by utilizing thermal fluctuations under the constraint of the surface-consistency condition.
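A toy illustration of the MPM idea on a 1D three-state (Q = 3) Ising chain: Metropolis sampling at temperature T, with one clamped site playing the role of prior information, and the per-site argmax of the empirical marginals taken as the MPM estimate. All parameters are illustrative; the study's 2D SAR setting and surface-consistency constraints are not reproduced here:

```python
import numpy as np

def mpm_estimate(n_sites=10, J=4.0, T=1.0, sweeps=2000, seed=0):
    """MPM estimate for a 1D three-state ferromagnetic Potts/Ising chain:
    Metropolis sampling at temperature T, then per-site argmax of the
    empirical marginals. Site 0 is clamped to state 0 (prior info)."""
    rng = np.random.default_rng(seed)
    s = rng.integers(0, 3, n_sites)
    s[0] = 0                                  # clamped site
    counts = np.zeros((n_sites, 3))

    def local_energy(i, q):
        e = 0.0
        if i > 0:
            e -= J * (q == s[i - 1])          # reward aligned neighbours
        if i < n_sites - 1:
            e -= J * (q == s[i + 1])
        return e

    for sweep in range(sweeps):
        for i in range(1, n_sites):           # site 0 never updated
            q_new = int(rng.integers(0, 3))
            dE = local_energy(i, q_new) - local_energy(i, s[i])
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i] = q_new
        if sweep >= sweeps // 2:              # discard burn-in half
            counts[np.arange(n_sites), s] += 1
    return counts.argmax(axis=1)
```

Taking the marginal argmax rather than the single most probable configuration is what distinguishes MPM from MAP: the estimate averages over thermal fluctuations, which is the mechanism the abstract credits for the smoother reconstructed wave-fronts.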