Abstract: The system development life cycle (SDLC) is a
process used during the development of any system. The SDLC
consists of four main phases: analysis, design, implementation, and
testing. During the analysis phase, context diagrams and data flow
diagrams are used to produce the process model of a system.
Consistency between the context diagram and the lower-level data
flow diagrams is very important in smoothing the development
process of a system. However, manually checking consistency from
the context diagram to lower-level data flow diagrams using a
checklist is a time-consuming process. At the same time, the
limits of human ability to detect errors are among the
factors that influence the correctness and balancing of the
diagrams. This paper presents a tool that automates the
consistency check between Data Flow Diagrams (DFDs)
based on the rules of DFDs. The tool serves two purposes: as
an editor for drawing the diagrams and as a checker for verifying
the correctness of the diagrams drawn. The consistency check
from the context diagram to lower-level data flow diagrams is
embedded in the tool to overcome the manual checking
problem.
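As an illustration of the kind of balancing check such a tool automates, a minimal sketch follows. The diagram representation (dictionaries of nodes and flows) and the example flow names are assumptions for illustration, not the tool's actual data model: a context diagram and its level-0 refinement are balanced when the data flows crossing the system boundary (to or from external entities) match.

```python
# Hypothetical sketch of a DFD balancing (consistency) check between a
# context diagram and a level-0 DFD. Diagrams and flow names here are
# illustrative assumptions, not the tool's actual data model.

def external_flows(diagram):
    """Collect boundary flows as (entity, flow label, direction) triples,
    where direction is seen from the system's perspective."""
    kinds = {n["name"]: n["kind"] for n in diagram["nodes"]}
    flows = set()
    for src, label, dst in diagram["flows"]:
        if kinds[src] == "external":
            flows.add((src, label, "input"))    # data flows into the system
        if kinds[dst] == "external":
            flows.add((dst, label, "output"))   # data flows out to the entity
    return flows

def check_balanced(context, level0):
    """Return (flows missing from level0, extra flows in level0);
    both sets empty means the diagrams are balanced."""
    ctx, lvl = external_flows(context), external_flows(level0)
    return ctx - lvl, lvl - ctx

context = {
    "nodes": [{"name": "Customer", "kind": "external"},
              {"name": "Order System", "kind": "process"}],
    "flows": [("Customer", "order", "Order System"),
              ("Order System", "invoice", "Customer")],
}
level0 = {
    "nodes": [{"name": "Customer", "kind": "external"},
              {"name": "1 Validate Order", "kind": "process"},
              {"name": "2 Bill Customer", "kind": "process"}],
    "flows": [("Customer", "order", "1 Validate Order"),
              ("2 Bill Customer", "invoice", "Customer")],
}
missing, extra = check_balanced(context, level0)
```

Comparing only boundary flows (rather than full flow triples) is what lets the level-0 diagram refine the single context process into several processes without breaking balance.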
Abstract: In recent years, response surface methodology (RSM) has
attracted the attention of many quality engineers in different
industries. Most of the published literature on robust design
methodology is basically concerned with optimization of a single
response or quality characteristic which is often most critical to
consumers. For most products, however, quality is multidimensional,
so it is common to observe multiple responses in an experimental
situation. Through this paper, the interested reader will become
familiar with this methodology via a survey of the most cited
technical papers.
It is believed that the proposed procedure in this study can resolve
a complex parameter design problem with more than two responses.
It can be applied to those areas where there are large data sets and a
number of responses are to be optimized simultaneously. In addition,
the proposed procedure is relatively simple and can be implemented
easily by using ready-made standard statistical packages.
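The abstract does not name a specific multi-response technique, but one widely used way to optimize several responses simultaneously in RSM is the desirability-function approach (Derringer and Harrington). The sketch below is only an illustration of that idea, with hypothetical responses; it is not the surveyed procedure itself. Each response is mapped to a desirability d in [0, 1], and the overall desirability is the geometric mean of the individual values.

```python
# Illustrative sketch of the desirability-function approach to
# multi-response optimization. The responses, bounds, and values below
# are hypothetical assumptions, not from the surveyed papers.

def d_larger_is_better(y, low, high):
    """d = 0 below `low`, 1 above `high`, linear in between."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return (y - low) / (high - low)

def d_smaller_is_better(y, low, high):
    """Mirror image: d = 1 below `low`, 0 above `high`."""
    return d_larger_is_better(-y, -high, -low)

def overall_desirability(ds):
    """Geometric mean of the individual desirabilities."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical two-response example: maximize yield, minimize defects.
d1 = d_larger_is_better(82.0, low=70.0, high=90.0)   # yield = 82%
d2 = d_smaller_is_better(3.0, low=1.0, high=10.0)    # defect count = 3
D = overall_desirability([d1, d2])
```

Because the geometric mean is zero whenever any single d is zero, a design point that fails on one response is rejected outright, which is what makes this aggregation popular for parameter design with many responses.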
Abstract: The objectives of this research were to explore factors
influencing knowledge management processes in the manufacturing
industry and to develop a model to support knowledge management
processes. The studied factors were technology infrastructure, human
resources, knowledge sharing, and the culture of the organization. The
knowledge management processes included discovery, capture,
sharing, and application. Data were collected through questionnaires
and analyzed using multiple linear regression and multiple
correlation. The results showed that technology infrastructure, human
resources, knowledge sharing, and the culture of the organization
influenced the discovery and capture processes. However, knowledge
sharing had no influence on the sharing and application processes. A
model to support knowledge management processes was developed,
which indicated that sharing knowledge needed further improvement
in the organization.
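The fitting step of such a study can be sketched with ordinary least squares: each knowledge management process (for example, discovery) is regressed on the four studied factors. The coefficients and data below are synthetic assumptions that only illustrate the method, not the study's actual survey data or results.

```python
import numpy as np

# Sketch of multiple linear regression of one knowledge management
# process score on four factors. All data here are synthetic.

rng = np.random.default_rng(0)
n = 120
# Columns: technology infrastructure, human resources, knowledge
# sharing, organizational culture (e.g. Likert-scale survey scores).
X = rng.uniform(1, 5, size=(n, 4))
true_beta = np.array([0.4, 0.3, 0.2, 0.1])        # assumed effects
discovery = 0.5 + X @ true_beta + rng.normal(0, 0.05, n)

# Ordinary least squares with an intercept column.
A = np.c_[np.ones(n), X]
beta, *_ = np.linalg.lstsq(A, discovery, rcond=None)

# Coefficient of determination (R^2) of the fitted model.
resid = discovery - A @ beta
r2 = 1 - np.sum(resid ** 2) / np.sum((discovery - discovery.mean()) ** 2)
```

In practice the study's conclusion ("no influence") corresponds to a coefficient whose estimate is statistically indistinguishable from zero, which is assessed with t-tests on the fitted coefficients.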
Abstract: In view of growing competition in the service sector,
services are as much in need of modeling, analysis and improvement
as business or working processes. Graphical process models are
important means to capture process-related know-how for an
effective management of the service process. In this contribution, a
human performance analysis of process model development was
conducted, paying special attention to model development time and
the working method. It was found that modelers with higher
application experience needed significantly less time for mental
activities than modelers with lower application experience, spent
more time on labeling graphical elements, and achieved higher
process model quality in terms of activity label quality.
Abstract: A predictive clustering hybrid regression (pCHR)
approach was developed and evaluated using a dataset from an H2-
producing sucrose-based bioreactor operated for 15 months. The aim
was to model and predict the H2-production rate using available
information about the envirome and metabolome of the bioprocess.
Self-organizing maps (SOM) and the Sammon map were used to
visualize the dataset and to identify the main metabolic patterns and
clusters in the bioprocess data. Three metabolic clusters were
detected: acetate coupled with other metabolites, butyrate only, and
transition phases. The
developed pCHR model combines principles of k-means clustering,
kNN classification and regression techniques. The model performed
well in modeling and predicting the H2-production rate with mean
square error values of 0.0014 and 0.0032, respectively.
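The combination of k-means clustering, kNN-style cluster assignment, and per-cluster regression can be sketched as below. The implementation details (a 1-nearest-centroid assignment rule, linear per-cluster models, and the two-regime synthetic data) are simplifying assumptions for illustration, not the paper's actual pCHR model or bioreactor data.

```python
import numpy as np

# Illustrative predictive clustering hybrid regression: k-means
# partitions the training data, a linear regression is fitted per
# cluster, and new samples are routed to the nearest cluster's model.

def kmeans(X, k, iters=50):
    # Deterministic spread-out initialization (simple heuristic).
    centers = X[:: max(1, len(X) // k)][:k].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        new_centers = []
        for j in range(k):
            pts = X[labels == j]
            new_centers.append(pts.mean(axis=0) if len(pts) else centers[j])
        centers = np.array(new_centers)
    return centers, labels

def fit_pchr(X, y, k=2):
    centers, labels = kmeans(X, k)
    models = []
    for j in range(k):
        mask = labels == j
        A = np.c_[X[mask], np.ones(mask.sum())]   # design matrix + intercept
        coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        models.append(coef)
    return centers, models

def predict_pchr(model, X):
    centers, models = model
    labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
    A = np.c_[X, np.ones(len(X))]
    return np.array([A[i] @ models[j] for i, j in enumerate(labels)])

# Two regimes with different linear responses, loosely mimicking
# distinct metabolic clusters (synthetic data).
rng = np.random.default_rng(1)
X1 = rng.normal(0.0, 0.3, (50, 2)); y1 = 2.0 * X1[:, 0] + 1.0
X2 = rng.normal(5.0, 0.3, (50, 2)); y2 = -1.0 * X2[:, 1] + 3.0
X, y = np.vstack([X1, X2]), np.concatenate([y1, y2])
model = fit_pchr(X, y, k=2)
mse = float(np.mean((predict_pchr(model, X) - y) ** 2))
```

The point of the hybrid is that a single global regression cannot represent two regimes with different response surfaces, while per-cluster models recover each regime exactly on this synthetic data.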
Abstract: Nowadays, many organizations use systems that
support business processes wholly or partially. However, in some
application domains, like software development and health care
processes, a normative Process Aware System (PAS) is not suitable,
because flexible support is needed to respond rapidly to new
process models. On the other hand, a flexible Process Aware System
may be vulnerable to undesirable and fraudulent executions, which
imposes a tradeoff between flexibility and security. To address
this tradeoff, a genetic-based anomaly detection model for
logs of Process Aware Systems is presented in this paper. The
detection of an anomalous trace is based on discovering an
appropriate process model by using genetic process mining and
detecting traces that do not fit the discovered model as anomalous;
therefore, when used in a PAS, this model is an automated
solution that can support the coexistence of flexibility and security.
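The detection step alone can be sketched as follows. The genetic process-mining stage is out of scope here, so we assume it has already produced a model, represented (as a strong simplification) by the set of directly-follows transitions it allows; a trace using a transition outside that set does not fit and is flagged as anomalous. The example process and traces are hypothetical.

```python
# Minimal sketch of conformance-based anomaly detection on event logs.
# The "mined model" below is a hypothetical stand-in for the output of
# genetic process mining, simplified to a set of allowed transitions.

def directly_follows(trace):
    """All adjacent activity pairs occurring in the trace."""
    return {(a, b) for a, b in zip(trace, trace[1:])}

def is_anomalous(trace, model_transitions):
    """A trace is anomalous if it uses a transition the model forbids."""
    return not directly_follows(trace) <= model_transitions

# Hypothetical mined model for a small ordering process.
model = {("register", "check"), ("check", "approve"),
         ("check", "reject"), ("approve", "ship")}

normal = ["register", "check", "approve", "ship"]
fraud = ["register", "approve", "ship"]     # skips the mandatory check
```

A real conformance check would replay traces against the full mined model (including choices and concurrency) rather than comparing transition sets, but the flag-what-does-not-fit principle is the same.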
Abstract: Extensive rainfall disaggregation approaches have been developed and applied in climate change impact studies such as flood risk assessment and urban storm water management. In this study, five rainfall models capable of disaggregating daily rainfall data into hourly data were investigated for the rainfall record at Changi Airport, Singapore. The objectives of this study were (i) to study the temporal characteristics of hourly rainfall in Singapore, and (ii) to evaluate the performance of various disaggregation models. The models used included: (i) the Rectangular Pulse Poisson model (RPPM), (ii) the Bartlett-Lewis Rectangular Pulse model (BLRPM), (iii) the Bartlett-Lewis model with 2 cell types (BL2C), (iv) the Bartlett-Lewis Rectangular model with cell depth distribution dependent on duration (BLRD), and (v) the Neyman-Scott Rectangular Pulse model (NSRPM). All of these models were fitted using hourly rainfall data from 1980 to 2005 (obtained from the Changi meteorological station). The study results indicated that a weighting scheme inversely proportional to the variance could deliver more accurate outputs for fitting rainfall patterns in tropical areas, and that BLRPM performed relatively better than the other disaggregation models.
Abstract: The aim of a biological model is to understand the
integrated structure and behavior of complex biological systems as a
function of the underlying molecular networks to achieve simulation
and forecast of their operation. Although several approaches have
been introduced to take into account structural and environment-
related features, relatively little attention has been given to
representing the behavior of biological systems. The Abstract Biological Process
(ABP) model illustrated in this paper is an object-oriented model
based on UML (the standard object-oriented language). Its main
objective is to bring into focus the functional aspects of the
biological system under analysis.
Abstract: This paper presents a research agenda on the SCOR
model adaptation. The SCOR model is designed to measure supply
chain performance and logistics impact across the boundaries of
individual organizations. It is at the growth stage of its life cycle and
is enjoying the leverage of becoming the industry standard. The
SCOR model has been developed and widely used in the context of
developed countries. This research focuses on the SCOR model
adaptation for the manufacturing industry in developing countries.
With a necessary understanding of the characteristics, difficulties,
and problems of the manufacturing industry's supply chains in
developing countries, we will try to design an adapted model
with its building blocks: business process model, performance
measures, and best practices.
Abstract: This paper is focused on issues of process modeling
and two model-based control strategies of a fed-batch sugar
crystallization process applying the concept of artificial neural
networks (ANNs). The control objective is to force the operation to
follow the optimal supersaturation trajectory. It is achieved by
manipulating the feed flow rate of sugar liquor/syrup, considered as
the control input. The control task is rather challenging due to the
strong nonlinearity of the process dynamics and variations in the
crystallization kinetics. Two control alternatives are considered:
model predictive control (MPC) and feedback linearizing control
(FLC). Adequate ANN process models are first built as part of the
controller structures. The MPC algorithm outperforms the FLC
approach with respect to reference tracking and smoothness of the control
action. However, the MPC is computationally much more involved
since it requires an online numerical optimization, while for the FLC
an analytical control solution was determined.
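The receding-horizon logic of MPC can be sketched in a toy setting. A real implementation would use the trained ANN process model and a numerical optimizer; here an assumed first-order nonlinear model and a coarse grid search over the feed rate stand in for both, purely to illustrate the optimize-apply-re-optimize loop.

```python
# Toy sketch of a model predictive control loop. The process model,
# gains, and reference below are illustrative assumptions, not the
# paper's ANN model of sugar crystallization.

def model_step(x, u):
    """Assumed one-step process model: x is the controlled variable
    (e.g. supersaturation), u is the control input (e.g. feed rate)."""
    return x + 0.1 * (u - 0.5 * x * x)

def mpc_action(x, reference, horizon=5, candidates=None):
    """Pick the constant control over the horizon that minimizes the
    summed squared tracking error (coarse grid search as optimizer)."""
    if candidates is None:
        candidates = [i * 0.1 for i in range(21)]   # u in [0, 2]
    best_u, best_cost = candidates[0], float("inf")
    for u in candidates:
        xk, cost = x, 0.0
        for _ in range(horizon):
            xk = model_step(xk, u)
            cost += (xk - reference) ** 2
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

# Closed loop: apply only the first move, then re-optimize at the next
# step (the receding-horizon principle).
x, ref = 0.0, 1.0
for _ in range(40):
    u = mpc_action(x, ref)
    x = model_step(x, u)
```

The online optimization inside `mpc_action` at every step is exactly the computational burden the abstract contrasts with the analytical FLC solution.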
Abstract: Within the realm of e-government, the development has moved towards testing new means for democratic decision-making, like e-panels, electronic discussion forums, and polls. Although such new developments seem promising, they are not problem-free, and the outcomes are seldom used in the subsequent formal political procedures. Nevertheless, process models offer promising potential when it comes to structuring and supporting transparency of decision processes in order to facilitate the integration of the public into decision-making procedures in a reasonable and manageable way. Based on real-life cases of urban planning processes in Sweden, we present an outline for an integrated framework for public decision-making to: a) provide tools for citizens to organize discussion and create opinions; b) enable governments, authorities, and institutions to better analyse these opinions; and c) enable governments to account for this information in planning and societal decision-making by employing a process model for structured public decision-making.
Abstract: This paper is focused on issues of nonlinear dynamic process modeling and model-based predictive control of a fed-batch sugar crystallization process applying the concept of artificial neural networks as computational tools. The control objective is to force the operation to follow the optimal supersaturation trajectory. It is achieved by manipulating the feed flow rate of sugar liquor/syrup, considered as the control input. A feed-forward neural network (FFNN) model of the process is first built as part of the controller structure to predict the process response over a specified (prediction) horizon. The predictions are supplied to an optimization procedure to determine the values of the control action over a specified (control) horizon that minimizes a predefined performance index. The control task is rather challenging due to the strong nonlinearity of the process dynamics and variations in the crystallization kinetics. However, the simulation results demonstrated smooth behavior of the control actions and satisfactory reference tracking.
Abstract: Nowadays, the pace of business change is such that,
increasingly, new functionality has to be realized and reliably
installed in a matter of days, or even hours. Consequently, more and
more business processes are subject to continuous change. The
objective of this research in progress is to use the MAP model in a
conceptual modeling method for flexible and adaptive business
processes. This method can be used to capture the flexibility
dimensions of a business process; it takes inspiration from the
modularity concept in the object-oriented paradigm to establish a
hierarchical construction of the BP model. Its intent is to provide
a flexible modeling approach that allows companies to quickly adapt
their business processes.
Abstract: Software engineering education not only embraces
technical skills of software development but also necessitates
communication and interaction among learners. In this paper, it is
proposed to adapt the PBL methodology, which is especially designed
to be integrated into the software engineering classroom in order to
promote a collaborative learning environment. This approach helps students
better understand the significance of social aspects and provides a
systematic framework to enhance teamwork skills. The adaptation of
PBL facilitates the transition to an innovative software development
environment where cooperative learning can be actualized.
Abstract: Curing of paints by exposure to UV radiation is
emerging as one of the best film-forming techniques, an alternative
to traditional solvent-borne oxidative and thermal curing coatings.
The composition and chemistry of UV-curable coatings and the role of
multifunctional and monofunctional monomers, oligomers, and
photoinitiators have been discussed. The limitations imposed by
thermodynamic equilibrium and the tendency for acrylic double-bond
polymerization during the synthesis of multifunctional acrylates have
been presented. The aim of the present investigation was thus to
explore the reaction variables associated with the synthesis of multifunctional
acrylates. Zirconium oxychloride was evaluated as a catalyst against
a regular acid-functional catalyst. The catalyzed synthesis of glyceryl
acrylate and neopentyl glycol acrylate was conducted by varying the
following reaction parameters: two different reactant molar ratios
(1:4 and 1:6); catalyst usage of 2.5, 5.0, and 7.5 mol% on polyol;
and two different reaction temperatures (45 and 75 °C). The reaction
was monitored by determination of acid value and hydroxy value at
regular intervals, in addition to TLC, HPLC, and FTIR analyses of
intermediates and products. On the basis of the determination of
reaction progress over 1-60 hrs, the esterification reaction was
observed to follow second-order kinetics, with the rate constant
varying from 1×10^-4 to 7×10^-4. The thermal and catalytic
components of the second-order rate constant and the energy of
activation were also determined. The use of these kinetic and
thermodynamic parameters in the design of a reactor for the
manufacture of multifunctional acrylate esters has been presented.
The synthesized multifunctional acrylates were used to formulate and
apply a UV-curable clear coat, followed by determination of the
curing characteristics and mechanical properties of the cured film.
Overall curing times of less than 5 min were easily attained,
indicating the economic viability of the radiation-curable system
due to faster production schedules.
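The extraction of a second-order rate constant from time-series data can be sketched as below. For a reaction second order in the acid concentration [A], the integrated rate law is 1/[A]_t = 1/[A]_0 + k*t, so k is the slope of 1/[A] against t. The data here are synthetic assumptions, not the paper's acid-value measurements.

```python
# Illustrative fit of a second-order rate constant from synthetic
# concentration-versus-time data via the linearized rate law
# 1/[A]_t = 1/[A]_0 + k*t.

def second_order_k(times, concentrations):
    """Least-squares slope of 1/[A] versus t (closed-form fit)."""
    inv = [1.0 / c for c in concentrations]
    n = len(times)
    mean_t = sum(times) / n
    mean_y = sum(inv) / n
    num = sum((t - mean_t) * (y - mean_y) for t, y in zip(times, inv))
    den = sum((t - mean_t) ** 2 for t in times)
    return num / den

# Synthetic data generated from an assumed k = 4e-4 (arbitrary
# concentration/time units) and [A]_0 = 2.0.
k_true, a0 = 4e-4, 2.0
times = [0, 10, 20, 30, 40, 50, 60]
conc = [1.0 / (1.0 / a0 + k_true * t) for t in times]
k_est = second_order_k(times, conc)
```

On real data the linearity of 1/[A] versus t is itself the diagnostic that the reaction follows second-order kinetics; deviations from the line indicate a different order.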
Abstract: New methodologies for XOR-XNOR circuits are
proposed to improve speed and power, as these circuits are basic
building blocks of many arithmetic circuits. This paper evaluates and
compares the performance of various XOR-XNOR circuits. The
performance of the XOR-XNOR circuits based on TSMC 0.18 μm
process models over the full supply voltage range from 0.6 V
to 3.3 V is evaluated by comparing the simulation results
obtained from HSPICE. Simulation results reveal that the proposed
circuits exhibit lower PDP and EDP, and are more power-efficient and
faster than the best available XOR-XNOR circuits in the
literature.
Abstract: Business Process Modeling (BPM) is the first and
most important step in the business process management lifecycle.
Graph-based and rule-based formalisms are the two most predominant
formalisms on which process modeling languages are developed. BPM
technology continues to face challenges in coping with dynamic
business environments where requirements and goals are constantly
changing at execution time. Graph-based formalisms have problems
reacting to dynamic changes in a Business Process (BP) in its
runtime instances. In this research, an adaptive and flexible
framework based on the integration of an Object-Oriented diagramming
technique and the Petri Net modeling language is proposed in order
to support change management techniques for BPM and to increase the
representation capability of Object-Oriented modeling for dynamic
changes in runtime instances. The proposed framework is applied in a
higher education environment to achieve flexible, updatable, and
dynamic BPs.
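The Petri net execution semantics underlying such a framework can be sketched minimally: a transition is enabled when every input place holds a token, and firing it moves tokens from input to output places. The two-step course-approval net below is a hypothetical example, not the framework's actual model.

```python
# Minimal sketch of Petri net firing semantics. Transitions are
# (input places, output places) pairs; a marking maps places to
# token counts. The example net is hypothetical.

def enabled(marking, transition):
    """A transition is enabled when each input place holds a token."""
    inputs, _ = transition
    return all(marking.get(p, 0) >= 1 for p in inputs)

def fire(marking, transition):
    """Consume one token per input place, produce one per output place."""
    inputs, outputs = transition
    if not enabled(marking, transition):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

submit = (["draft"], ["submitted"])
approve = (["submitted"], ["approved"])

m0 = {"draft": 1}
m1 = fire(m0, submit)
m2 = fire(m1, approve)
```

Because markings are explicit data rather than fixed control flow, a runtime change to the process amounts to editing the transition set while in-flight instances keep their current markings, which is the flexibility argument made above.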
Abstract: Risk management is an essential part of project management, which plays a significant role in project success. Many failures associated with Web projects are the consequences of poor awareness of the risks involved and the lack of process models that can serve as a guideline for the development of Web-based applications. To circumvent this problem, contemporary process models have been devised for the development of conventional software. This paper introduces WPRiMA (Web Project Risk Management Assessment) as the tool used to implement RIAP, the risk identification architecture pattern model, which focuses on the data from the proprietor's and vendor's perspectives. The paper also illustrates how the WPRiMA tool works and how it can be used to calculate the risk level for a given Web project, to generate recommendations in order to facilitate risk avoidance in a project, and to improve the prospects of early risk management.
Abstract: This paper proposes a modeling methodology for the
development of data analysis solutions. The author introduces an
approach to address data warehousing issues at the enterprise level.
The methodology covers the requirements elicitation and analysis
stage as well as the initial design of the data warehouse. The paper
reviews an extended business process model, which satisfies the
needs of data warehouse development. The author considers the use of
business process models necessary, as they reflect both enterprise
information systems and business functions, which are important for
data analysis. The described approach divides development into
three steps with different levels of detail in model elaboration. It
makes it possible to gather requirements and display them to
business users in an easy manner.
Abstract: A software system goes through a number of stages
during its life, and a software process model gives a standard format
for planning, organizing, and running a project. The article presents
a new software development process model, named the "Divide and
Conquer Process Model", based on the idea of first dividing things
to make them simple and then gathering them to get the whole work
done. The article begins with the background of different software
process models and the problems in these models. This is followed by
the new divide and conquer process model, an explanation of its
different stages, and, at the end, its edge over other models.