Abstract: The paper focuses on the benefits of business process
modeling. Although this discipline has been developing for many years,
there is still a need to create new capabilities to meet ever-increasing
user needs. Because one of these needs concerns the conversion of
business process models from one standard to another, the authors have
developed a converter between the BPMN and EPC standards using
workflow patterns as an intermediate tool. Nowadays there are many
systems for business process modeling, and the variety of their output
formats is almost as great as that of the systems themselves. This
diversity further hampers the conversion of the models. The presented
study discusses problems caused by differences in the output formats of
various modeling environments.
Abstract: We present a probabilistic multinomial Dirichlet
classification model for multidimensional data with Gaussian process
priors. We consider an efficient computational method for obtaining
the approximate posteriors of the latent variables and parameters
needed to define the multiclass Gaussian process classification model.
We first investigate the process of inducing a posterior distribution
for the various parameters and the latent function by using variational
Bayesian approximations and an importance sampling method, and then
derive the predictive distribution of the latent function needed to
classify new samples. The proposed model is applied to a synthetic
multivariate dataset in order to verify its performance. Experimental
results show that our model is more accurate than the other
approximation methods.
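The abstract's variational-Bayes/importance-sampling inference is specific to the paper and is not reproduced here; purely as an illustration of the general task (multiclass Gaussian process classification on synthetic multivariate data), the following sketch uses scikit-learn's Laplace-approximation GP classifier instead, with all dataset parameters invented.

```python
# Illustrative sketch only: multiclass GP classification on synthetic data.
# This is NOT the paper's multinomial Dirichlet model; scikit-learn's
# Laplace-approximation classifier stands in for the general technique.
from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import train_test_split

# Synthetic multivariate, three-class dataset (parameters are made up)
X, y = make_classification(n_samples=300, n_features=4, n_informative=3,
                           n_redundant=0, n_classes=3,
                           n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(1.0), random_state=0)
gpc.fit(X_tr, y_tr)
acc = gpc.score(X_te, y_te)
print(f"test accuracy: {acc:.2f}")
```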
Abstract: Petri nets are the first standard for business
process modeling. This is most probably one of the core reasons why
all standards created afterwards are designed so that they can be
mapped onto Petri nets. The paper presents a business process
repository based on a universal database. The repository allows the
data about a given process to be stored in three different ways. The
business process repository is designed with regard to the
transformation of a given model into a Petri net so that it can be
easily simulated. Two different techniques for business process
simulation based on Petri nets, Yasper and Woflan, are discussed, and
their advantages and drawbacks are outlined. The way business process
models stored in the business process repository are simulated is shown.
Abstract: Fading noise degrades the performance of cellular
communication, most notably in femto- and pico-cells in 3G and 4G
systems. When the wireless channel consists of a small number of
scattering paths, the statistics of fading noise is not analytically
tractable and poses a serious challenge to developing closed
canonical forms that can be analysed and used in the design of
efficient and optimal receivers. In this context, noise is multiplicative
and is referred to as stochastically local fading. In many analytical
investigation of multiplicative noise, the exponential or Gamma
statistics are invoked. More recent advances by the author of this
paper utilized a Poisson modulated-weighted generalized Laguerre
polynomials with controlling parameters and uncorrelated noise
assumptions. In this paper, we investigate the statistics of multidiversity
stochastically local area fading channel when the channel
consists of randomly distributed Rayleigh and Rician scattering
centers with a coherent Nakagami-distributed line of sight component
and an underlying doubly stochastic Poisson process driven by a
lognormal intensity. These combined statistics form a unifying triply
stochastic filtered marked Poisson point process model.
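As a loose numerical illustration of a doubly stochastic scattering model (not the paper's triply stochastic derivation), the sketch below draws a Poisson number of Rayleigh scattering paths with a lognormal random intensity and adds a fixed line-of-sight term in place of the Nakagami-distributed component; all parameter values are hypothetical.

```python
# Monte Carlo sketch of a doubly stochastic fading envelope: scatterer
# count is Poisson with lognormal intensity; each path is a zero-mean
# complex Gaussian (Rayleigh envelope). A constant LOS term stands in
# for the paper's Nakagami-distributed component. Parameters illustrative.
import numpy as np

rng = np.random.default_rng(0)

def fading_sample(mean_log_intensity=2.0, sigma_log=0.5, los=1.0):
    lam = rng.lognormal(mean_log_intensity, sigma_log)  # random intensity
    n = rng.poisson(lam)                                # number of scatterers
    paths = (rng.normal(0, 1, n) + 1j * rng.normal(0, 1, n)) / np.sqrt(2)
    return los + paths.sum()                            # LOS + diffuse sum

envelopes = np.abs([fading_sample() for _ in range(10000)])
print(f"mean envelope: {envelopes.mean():.2f}")
```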
Abstract: This paper reviews the model-based qualitative and
quantitative Operations Management research in the context of
Construction Supply Chain Management (CSCM). The construction
industry has traditionally been blamed for low productivity, cost and
time overruns, waste, high fragmentation and adversarial
relationships. It has also been slower than other industries to adopt
the Supply Chain Management (SCM) concept and to develop models
that support decision-making and planning. However, over the last
decade there has been a distinct shift from a project-based to a
supply-based approach to construction management. CSCM has
emerged as a promising new approach for managing construction
operations, improving the performance of construction projects in
terms of cost, time and quality. Modeling the Construction Supply
Chain (CSC) offers the means to reap the benefits of SCM, make
informed decisions and gain competitive advantage. Different
modeling approaches and methodologies have been applied in the
multi-disciplinary and heterogeneous research field of CSCM. The
literature review reveals that a considerable percentage of the CSC
modeling research accommodates conceptual or process models
which present general management frameworks and do not relate to
acknowledged soft Operations Research methods. We particularly
focus on the model-based quantitative research and categorize the
CSCM models depending on their scope, objectives, modeling
approach, solution methods and software used. Although over the last
few years there has clearly been an increase in research papers on
quantitative CSC models, we identify that the relevant literature is
very fragmented with limited applications of simulation,
mathematical programming and simulation-based optimization. Most
applications are project-specific or study only parts of the supply
system. Thus, some complex interdependencies within construction
are neglected and the implementation of the integrated supply chain
management is hindered. We conclude this paper by giving future
research directions and emphasizing the need to develop optimization
models for integrated CSCM. We stress that CSC modeling needs a
multi-dimensional, system-wide and long-term perspective. Finally,
prior applications of SCM to other industries have to be taken into
account in order to model CSCs, but not without translating the
generic concepts to the context of construction industry.
Abstract: Verification and Validation of a simulated process
model is the most important phase of the simulator life cycle.
Evaluation of simulated process models based on Verification and
Validation techniques checks the closeness of each component model
(in a simulated network) with the real system/process with respect to
dynamic behaviour under steady state and transient conditions. The
process of Verification and Validation helps qualify the process
simulator for its intended purpose, whether that is providing
comprehensive training or design verification. In general, model
verification is carried out by comparison of simulated component
characteristics with the original requirement to ensure that each step
in the model development process completely incorporates all the
design requirements. Validation testing is performed by comparing
the simulated process parameters to the actual plant process
parameters either in standalone mode or integrated mode.
A Full Scope Replica Operator Training Simulator for PFBR
(Prototype Fast Breeder Reactor), named KALBR-SIM (Kalpakkam
Breeder Reactor Simulator), has been developed at IGCAR,
Kalpakkam, India, wherein the main participants are engineers and
experts from the Modeling, Process Design, and Instrumentation &
Control design teams. This paper discusses the Verification and
Validation process in general, the evaluation procedure adopted for
the PFBR operator training simulator, the methodology followed for
verifying the models, and the reference documents and standards
used. It details the importance of internal validation by design
experts, subsequent validation by an external agency consisting of
experts from various fields, model improvement by tuning based on
the experts' comments, final qualification of the simulator for the
intended purpose, and the difficulties faced while coordinating the
various activities.
Abstract: In this paper, the student admission process is studied to
optimize the assignment of vacant seats with three main objectives:
utilizing all vacant seats, satisfying the admission requirements of
all programs of study, and maintaining fairness among all candidates.
A Seat Assignment Method (SAM) is used to build the model and solve
the optimization problem with the help of the Northwest Corner
Method and the Least Cost Method. A closed-form formula is derived
for the priority of assigning a seat to a candidate based on SAM.
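The Northwest Corner Method named above is a standard way to build an initial feasible allocation for a transportation-type problem. Below is a minimal sketch, with seats-per-program and candidates-per-group numbers invented for illustration; the SAM priority formula itself is not reproduced.

```python
# Sketch of the Northwest Corner Method: walk the cost table from the
# top-left cell, allocating the maximum possible at each step. Here
# rows are programs (supply of seats) and columns are candidate groups
# (demand). Data are illustrative, not from the paper.
def northwest_corner(supply, demand):
    supply, demand = supply[:], demand[:]   # work on copies
    alloc = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        q = min(supply[i], demand[j])       # allocate as much as possible
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            i += 1                          # program's seats exhausted
        else:
            j += 1                          # candidate group fully seated
    return alloc

seats = [20, 30, 25]           # vacant seats per program (illustrative)
applicants = [15, 25, 20, 15]  # candidates per group (illustrative)
print(northwest_corner(seats, applicants))
```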
Abstract: Many practical industrial control processes are
multivariable. Due to the interaction among the variables and the
delays in the loops, it is very intricate to design a controller
directly for these processes. Therefore, the interaction of the
variables is first analyzed using the Relative Normalized Gain Array
(RNGA), which considers the time constants, static gains and delay
times of the processes. Based on the RNGA, the relative gain
array (RGA) and NI, the pairing (control configuration) of variables to
be controlled by decentralized control is selected. The equivalent
transfer function (ETF) of the process model is estimated as a first-order
process with delay using the corresponding elements of the
relative gain array and the relative average residence time array
(RARTA) of the processes. Secondly, a decentralized Proportional-
Integral (PI) controller is designed for each ETF simply using
frequency response specifications. Finally, the performance and
robustness of the algorithm are compared with existing related
approaches to validate the effectiveness of the proposed algorithm.
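The RGA used in the pairing step has a standard closed form, RGA = G ∘ (G⁻¹)ᵀ (element-wise product of the gain matrix with its inverse transpose). A minimal sketch with an illustrative 2×2 static gain matrix follows; the RNGA/RARTA time-weighting described in the abstract is not reproduced.

```python
# Relative gain array for control-loop pairing: RGA = G * inv(G).T,
# element-wise. Rows and columns of the RGA always sum to 1; pairings
# are chosen on elements closest to 1. Gain matrix is illustrative.
import numpy as np

def rga(G):
    return G * np.linalg.inv(G).T  # element-wise (Hadamard) product

G = np.array([[2.0, 0.5],
              [0.4, 1.0]])         # static gain matrix of a 2x2 process
print(np.round(rga(G), 3))
```

Here the diagonal elements are closest to 1, so the diagonal pairing (u1-y1, u2-y2) would be preferred.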
Abstract: Heightened concerns over the amount of carbon
emitted from coal-related processes are generating shifts to the
application of biomass. In co-gasification, coal is gasified along
with biomass: the biomass may be fed together with the coal
(co-feeding), or an independent biomass gasifier may be integrated
with the coal gasifier. The main aim of this work is to evaluate the
biomass introduction methods in coal co-gasification. This includes
the evaluation of the biomass concentration input (B0 to B100) and its
gasification performance. A process model is developed and
simulated in Aspen HYSYS, where both coal and biomass are
modelled according to their ultimate analyses. It was found that the
syngas produced increased with increasing biomass content for both
the co-feeding and independent schemes. However, the heating values
and heat duties decrease with biomass concentration as more CO2
is produced from complete combustion.
Abstract: Health analytics (HA) is used in healthcare systems
for effective decision making, management and planning of
healthcare and related activities. However, user resistance, the unique
nature of medical data content and structure (including
heterogeneous and unstructured data), and impromptu HA projects
have held up progress in HA applications. Notably, the accuracy
of outcomes depends on the skills and the domain knowledge of the
data analyst working on the healthcare data. Success of HA depends
on having a sound process model, effective project management and
availability of supporting tools. Thus, to overcome these challenges
through an effective process model, we propose an HA process model
with features from the Rational Unified Process (RUP) model and agile
methodology.
Abstract: This paper deals with advanced state estimation algorithms for the estimation of biomass concentration and specific growth rate in a typical fed-batch biotechnological process. This biotechnological process was represented by a nonlinear, mass-balance-based process model. An Extended Kalman Filter (EKF) and a Particle Filter (PF) were used to estimate the unmeasured state variables from oxygen uptake rate (OUR) and base consumption (BC) measurements. To obtain more general results, a simplified process model was used in the EKF and PF estimation algorithms. This model does not require any special growth kinetic equations and can be applied for state estimation in various bioprocesses. The investigation focused on comparing the estimation quality of the EKF and PF estimators under different measurement noises. The simulation results show that the Particle Filter algorithm requires significantly more computation time for state estimation but gives lower estimation errors for both biomass concentration and specific growth rate. Moreover, the tuning procedure for the Particle Filter is simpler than for the EKF. Consequently, the Particle Filter should be preferred in real applications, especially for monitoring industrial bioprocesses, where simplified implementation procedures are always desirable.
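A bootstrap particle filter of the kind compared in this abstract can be sketched on a toy scalar growth model; the actual OUR/BC measurement model and the mass-balance process model are not reproduced, and all noise levels and parameters below are illustrative.

```python
# Minimal bootstrap particle filter on a toy growth model
# x_k = x_{k-1} * (1 + mu*dt), observed with additive Gaussian noise.
# Illustrates the propagate / weight / resample cycle only; not the
# paper's bioprocess model. All parameters are invented.
import numpy as np

rng = np.random.default_rng(1)
n_particles, n_steps, mu, dt = 500, 50, 0.1, 0.1

x_true = 1.0
particles = rng.normal(1.0, 0.1, n_particles)   # initial particle cloud
for _ in range(n_steps):
    x_true *= 1 + mu * dt                       # true state evolution
    y = x_true + rng.normal(0, 0.05)            # noisy measurement
    # Propagate each particle with multiplicative process noise
    particles *= 1 + mu * dt + rng.normal(0, 0.01, n_particles)
    # Weight by measurement likelihood, then normalize
    w = np.exp(-0.5 * ((y - particles) / 0.05) ** 2)
    w /= w.sum()
    # Resample particles proportionally to their weights
    particles = particles[rng.choice(n_particles, n_particles, p=w)]

est = particles.mean()
print(f"true {x_true:.3f}, PF estimate {est:.3f}")
```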
Abstract: This paper presents a complete dynamic modeling
of a membrane distillation process. The model consists of two
consistent dynamic models: a 2D advection-diffusion equation
for modeling the whole process, and a modified heat equation
for modeling the membrane itself. The complete model describes
the temperature diffusion phenomenon across the feed, the membrane,
the permeate containers and the boundary layers of the membrane. It
gives an online and complete temperature profile for each point in the
domain, explains the heat conduction and convection mechanisms that
take place inside the process in terms of mathematical parameters, and
justifies the process behavior during the transient and steady state
phases. The process is monitored for any sudden change in performance
at any instant in time. In addition, the model assists in maintaining
production rates as desired, and gives recommendations during membrane fabrication
stages. System performance and parameters can be optimized
and controlled using this complete dynamic model. Evolution of
membrane boundary temperature with time, vapor mass transfer along
the process, and temperature difference between membrane boundary
layers are depicted and included. Simulations were performed over
the complete model with real membrane specifications. The plots
show consistency between 2D advection-diffusion model and the
expected behavior of the systems as well as literature. Evolution
of heat inside the membrane starting from transient response till
reaching steady state response for fixed and varying times is
illustrated.
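The diffusion part of such a temperature model can be illustrated with a standard explicit finite-difference scheme for the 1D heat equation; this is a generic numerical sketch, not the paper's 2D advection-diffusion model, and the grid, diffusivity and boundary temperatures are invented.

```python
# Explicit finite-difference sketch of the 1D heat equation
# dT/dt = alpha * d2T/dx2, a building block of temperature-profile
# models like the one described. Boundary temperatures stand in for
# hot-feed and cold-permeate sides. All values are illustrative.
import numpy as np

nx, alpha, dx, dt = 50, 1e-3, 0.01, 0.01
r = alpha * dt / dx**2                 # stability requires r <= 0.5
T = np.full(nx, 20.0)                  # initial temperature profile (C)
T[0], T[-1] = 60.0, 25.0               # fixed boundary temperatures

for _ in range(50000):                 # march to (near) steady state
    T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[0], T[-1] = 60.0, 25.0           # re-impose boundary conditions

print(f"mid-domain temperature: {T[nx // 2]:.1f}")
```

At steady state the profile is linear between the two boundary temperatures, which gives a quick correctness check on the scheme.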
Abstract: Business Processes (BPs) are the key instrument to
understand how companies operate at an organizational level, taking
an as-is view of the workflow, and how to address their issues by
identifying a to-be model. In recent years, the Business Process Model
and Notation (BPMN) has become a de facto standard for modeling
processes. However, this standard does not explicitly incorporate
Problem-Solving (PS) knowledge in the Process Modeling (PM) results,
so such knowledge cannot be shared or reused. Narrowing this gap is
today a challenging research area. In this paper we present a
framework able to capture the PS knowledge and to improve a
workflow. This framework extends the BPMN specification by
incorporating new general-purpose elements. A pilot scenario is also
presented and discussed.
Abstract: This paper presents Gaussian process model-based
short-term electric load forecasting. The Gaussian process model is
a nonparametric model whose output has a Gaussian distribution with
a mean and a variance. Multiple Gaussian process models, one for
each hour-ahead prediction, are used to forecast future
electric load demands up to 24 hours ahead in accordance with the
direct forecasting approach. A separable least-squares approach that
combines the linear least-squares method and a genetic algorithm is
applied to train these Gaussian process models. Simulation results
demonstrate the effectiveness of the proposed electric load
forecasting method.
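The standard GP regression equations (posterior mean and variance with an RBF kernel) underlie such predictors; the sketch below applies them to a toy daily load curve for a one-hour-ahead forecast. The separable least-squares/genetic-algorithm training from the abstract is not reproduced, and all hyperparameters and data are illustrative.

```python
# Minimal GP regression prediction: mean = k* K^{-1} y,
# var = k** - k* K^{-1} k*^T, with an RBF kernel. Toy load data and
# fixed hyperparameters; not the paper's trained models.
import numpy as np

def rbf(a, b, ell=1.0, sf=1.0):
    # squared-exponential kernel k(a,b) = sf^2 exp(-(a-b)^2 / (2 ell^2))
    d = a[:, None] - b[None, :]
    return sf**2 * np.exp(-0.5 * (d / ell) ** 2)

t = np.arange(24, dtype=float)                # past 24 hourly samples
load = 50 + 10 * np.sin(2 * np.pi * t / 24)   # toy daily load curve (MW)
sn = 0.1                                      # observation noise std

K = rbf(t, t) + sn**2 * np.eye(len(t))
y = load - load.mean()                        # center: GP assumes zero mean
t_star = np.array([24.0])                     # one hour ahead
k_star = rbf(t_star, t)

mean = load.mean() + k_star @ np.linalg.solve(K, y)   # predictive mean
var = rbf(t_star, t_star) - k_star @ np.linalg.solve(K, k_star.T)
print(f"hour-24 forecast: {mean[0]:.1f} +/- {np.sqrt(var[0, 0]):.2f}")
```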
Abstract: This paper presents a nonparametric identification of
continuous-time nonlinear systems by using a Gaussian process
(GP) model. The GP prior model is trained by an artificial bee colony
algorithm. The nonlinear function of the objective system is estimated
as the predictive mean function of the GP, and the confidence
measure of the estimated nonlinear function is given by the predictive
covariance of the GP. The proposed identification method is applied
to the modeling of a simplified electric power system. Simulation results
are shown to demonstrate the effectiveness of the proposed method.
Abstract: The optimization of business processes in trading companies is reviewed in this report. We present the “Wholesale Customer Order Handling Process” business process model, applicable to small and medium businesses. We propose applying an algorithm for automating customer order processing that significantly reduces labor costs and time expenditures and increases the profitability of companies. The optimized business process is an element of an information system for accounting the activity of a spare-parts trading network. The considered algorithm may also find application elsewhere in the trading industry.
Abstract: This paper proposes a bioprocess optimization procedure based on Relevance Vector Regression models and an evolutionary programming technique. The Relevance Vector Regression scheme allows the development of a compact and stable data-based process model while avoiding time-consuming modeling expenses. The model building and process optimization procedure can be carried out in a semi-automated way and repeated after every new cultivation run. The proposed technique was tested on a simulated mammalian cell cultivation process. The obtained results are promising and could be attractive for the optimization of industrial bioprocesses.
Abstract: Natural and man-made disasters have a significant negative impact on the environment. At the same time, there is an extensive effort to support management and decision making in emergency situations with information technologies. The purpose of this paper is therefore to propose design patterns applicable in emergency management, enabling better analysis and design of emergency management processes and therefore easier development and deployment of information systems in the field of emergency management. This is achieved by a detailed analysis of existing emergency management legislation, contingency plans and information systems. The result is a set of design patterns focused on emergency management processes that enable easier design of emergency plans and development of new information systems. These results will have a major impact on the development of new information systems as well as on more effective and faster resolution of emergencies.
Abstract: Web-based systems have become increasingly
important because the Internet and the World Wide Web
have become ubiquitous, surpassing all other technological
developments in our history. The Internet, and especially company
websites, have rapidly evolved in their scope and extent of use: from
being little more than fixed advertising material, i.e. "web
presences" with no particular influence on the company's
business, to being one of the most essential parts of the company's
core business.
Traditional software engineering approaches with process models
such as, for example, CMM and the Waterfall model do not work very
well, since web system development differs from traditional
development. The development differs in several ways: for example,
there is a large gap between traditional software engineering designs
and concepts and the low-level implementation model, and many
web-based system development activities are business-oriented (for
example, web applications are sales-oriented; web applications and
intranets are content-oriented) rather than engineering-oriented.
This paper introduces the Increment Iterative extreme
Programming (IIXP) methodology for developing web-based
systems. In contrast to existing methodologies, it is a
combination of different traditional and modern
software engineering and web engineering principles.
Abstract: This paper presents a framework for
organizational knowledge management, which seeks to deploy a
standardized structure for the integrated management of knowledge in
a common language based on domains, processes and global
indicators inspired by the COBIT 5 framework (ISACA, 2012),
and which supports the integration of three technologies: enterprise
information architecture (EIA), business process modeling (BPM)
and service-oriented architecture (SOA). The Gomak Framework is a
management platform that seeks to integrate the information
technology infrastructure, the structure of applications, the
information infrastructure, and the business logic and business model
to support a sound strategy of organizational knowledge management,
under a process-based approach and concurrent engineering. Concurrent
engineering (CE) is a systematic approach to integrated product
development that responds to customer expectations, involving all
perspectives in parallel from the beginning of the product life cycle
(European Space Agency, 2000).