Abstract: Market competition and the desire to gain an advantage in a globalized market drive companies towards innovation efforts. Project overload is an unpleasant phenomenon affecting employees inside organizations that try to make the most efficient use of their resources in order to be innovative. But what are the impacts of project overload on an organization's innovation capabilities? Advanced engineering (AE) teams inside a major heavy equipment manufacturer are suffering from project overload in their quest for innovation. In this paper, agent-based modeling (ABM) is used to examine the current reality of the company context, and of the AE team, where the opportunities and challenges for reducing the risk of project overload and moving towards innovation were identified. Project overload is likely to stifle innovation and creativity inside teams. On the other hand, motivation from appropriately challenging goals can help individuals alleviate the negative effects of low levels of project overload.
Abstract: This article illustrates a model selection management approach for virtual prototypes in interactive simulations. In these numerical simulations, the virtual prototype and its environment are modelled as a multiagent system, where every entity (prototype, human, etc.) is modelled as an agent. In particular, virtual prototyping agents that provide mathematical models of mechanical behaviour in the form of computational methods are considered. This work argues that the selection of an appropriate model in a changing environment, supported by the models' characteristics, can be managed by the a priori determination of specific exploitation and performance measures of virtual prototype models. As different models exist to represent a single phenomenon, it is not always possible to select the best one under all possible circumstances of the environment; instead, the most appropriate one should be selected according to the use case. The proposed approach consists in identifying relevant metrics or indicators for each group of models (e.g. entity models, global model), formulating their qualification, analysing the performance, and applying the qualification criteria. Then, a model can be selected based on the performance prediction obtained from its qualification. The authors hope that this approach will not only inform engineers and researchers about another approach for selecting virtual prototype models, but also assist virtual prototype engineers in systematic or automatic model selection.
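The selection step described above can be sketched as a small qualification table plus a context-dependent choice rule; the model names, error figures, and cost budget below are invented for illustration, not taken from the paper.

```python
# Hypothetical sketch of metric-based model selection for a virtual
# prototype: each candidate model is qualified a priori by indicators
# (here: expected error and computational cost), and the most
# appropriate model for the current simulation context is selected.

def select_model(candidates, max_cost):
    """Return the lowest-error model whose cost fits the budget
    imposed by the current simulation context."""
    feasible = [m for m in candidates if m["cost"] <= max_cost]
    if not feasible:
        # Fall back to the cheapest model if nothing fits the budget.
        return min(candidates, key=lambda m: m["cost"])["name"]
    return min(feasible, key=lambda m: m["error"])["name"]

# Qualification results determined offline (illustrative numbers).
models = [
    {"name": "rigid-body",    "error": 0.30, "cost": 1},
    {"name": "modal-reduced", "error": 0.10, "cost": 5},
    {"name": "full-FEM",      "error": 0.01, "cost": 50},
]

print(select_model(models, max_cost=10))   # interactive budget
print(select_model(models, max_cost=100))  # offline budget
```

Under an interactive budget the mid-fidelity model wins; with a generous budget the most accurate model is chosen.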
Abstract: This paper presents a time-controlled liquid-mixing
system for tanks as an application of a fuzzy time-control discrete
model. The system is designed for a wide range of industrial
applications. The simulated control system has three
inputs: volume, viscosity, and selection of product, along with
three external control adjustments for system calibration or to
take over control of the system autonomously in a local or
distributed environment. There are four controlling elements: a
rotary motor, a grinding motor, heating and cooling units, and valve
selection, each with a time-frame limit. The system measures three
controlled variables through its sensing mechanism for
feedback control. The design also allows the liquid-mixing
system to grind certain materials in the tanks and mix them with
fluids in a temperature-controlled environment to achieve a required
viscosity level. The design of the fuzzifier, inference engine, rule
base, defuzzifier, and discrete event control system is discussed.
Time-control fuzzy rules are formulated, applied, and tested through
MATLAB simulation of the system.
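A minimal sketch of the fuzzifier / rule base / inference / defuzzifier pipeline, with an invented rule base mapping viscosity to mixing time (the paper's actual rules and MATLAB implementation are not reproduced here):

```python
# Minimal fuzzy-control sketch: fuzzifier -> rule base -> inference ->
# defuzzifier, mapping a normalized fluid viscosity to a mixing-time
# command. Membership shapes and rules are illustrative only.

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def mixing_time(viscosity):
    """Mamdani-style inference with singleton consequents and
    weighted-average (centroid) defuzzification."""
    # Fuzzifier: degrees of membership of the crisp input.
    low    = tri(viscosity, -1.0, 0.0, 0.5)
    medium = tri(viscosity, 0.0, 0.5, 1.0)
    high   = tri(viscosity, 0.5, 1.0, 2.0)
    # Rule base: IF viscosity is X THEN mixing time is T (minutes).
    rules = [(low, 5.0), (medium, 15.0), (high, 30.0)]
    num = sum(w * t for w, t in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(mixing_time(0.25))  # partially "low", partially "medium"
```

At viscosity 0.25 the "low" and "medium" rules fire equally, so the defuzzified command lies between their consequents.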
Abstract: Flour from Mucuna beans (Mucuna pruriens) was
used to produce a texturized meat analogue with a single-screw
extruder, to monitor modifications in the proximate composition and
the functional properties at high moisture levels. Response surface
methodology based on a Box-Behnken design at three levels of barrel
temperature (110, 120, 130°C), screw speed (100, 120, 140 rpm), and
feed moisture (44, 47, 50%) was used in 17 runs. Regression models
describing the effect of the variables on the product responses were
obtained. Descriptive profile analyses and a consumer acceptability
test were carried out on the optimized flavoured extruded meat
analogue. Responses were mostly affected by barrel temperature and
moisture level, and to a lesser extent by screw speed. Optimization
results based on the desirability concept indicated that a barrel
temperature of 120.15°C, feed moisture of 47%, and screw speed of
119.19 rpm would produce a meat analogue with preferable proximate
composition and functional and sensory properties, which reflects
consumers' liking for the product.
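The desirability-based optimization can be illustrated with invented quadratic response models over the same factor ranges; the coefficients, response names, and targets below are hypothetical, not the fitted regressions from the study:

```python
# Illustrative desirability optimization over two made-up response
# surfaces in barrel temperature t, feed moisture m, screw speed s.
import itertools, math

def desirability(y, lo, hi):
    """Larger-is-better Derringer desirability, linear between lo and hi."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return (y - lo) / (hi - lo)

def protein(t, m, s):   # hypothetical response surface
    return 20 - 0.01 * (t - 120) ** 2 - 0.05 * (m - 47) ** 2 - 0.001 * (s - 120) ** 2

def texture(t, m, s):   # hypothetical response surface
    return 8 - 0.005 * (t - 125) ** 2 - 0.02 * (m - 46) ** 2 - 0.001 * (s - 115) ** 2

# Grid search over the design region, maximizing the geometric mean
# of the individual desirabilities (the overall desirability D).
best = max(
    itertools.product(range(110, 131, 5), range(44, 51, 3), range(100, 141, 10)),
    key=lambda x: math.sqrt(desirability(protein(*x), 10, 20) *
                            desirability(texture(*x), 4, 8)),
)
print(best)
```

The optimum compromises between the two response surfaces, which is exactly the role the desirability function plays in the study's optimization.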
Abstract: We report on the development of a model to
understand why the range of experience with respect to HIV
infection is so diverse, especially with respect to the latency period.
To investigate this, an agent-based approach is used to extract high-level
behaviour which cannot be described analytically from the set
of interaction rules at the cellular level. A network of independent
matrices mimics the chain of lymph nodes. Dealing with massively
multi-agent systems requires major computational effort. However,
parallelisation methods are a natural consequence and advantage of
the multi-agent approach and, using the MPI library, are here
implemented, tested and optimized. Our current focus is on the
various implementations of the data transfer across the network.
Three communication strategies are proposed and tested, showing
that the most efficient approach is communication based on the
natural lymph-network connectivity.
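The trade-off between the communication strategies can be sketched by counting messages per exchange step; the four-node chain below is a toy stand-in for the lymph network, and no MPI is used:

```python
# Schematic (non-MPI) comparison of two data-transfer strategies for
# a network of lymph-node matrices: all-to-all exchange versus
# communication restricted to the natural lymph-network connectivity.

# Hypothetical lymph-node graph: node -> directly connected nodes.
lymph_edges = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}

def messages_all_to_all(nodes):
    """Every node sends its boundary data to every other node."""
    return nodes * (nodes - 1)

def messages_connectivity(edges):
    """Each node sends only to its physical neighbours."""
    return sum(len(nbrs) for nbrs in edges.values())

n = len(lymph_edges)
print(messages_all_to_all(n), messages_connectivity(lymph_edges))
```

For a chain topology the connectivity-based strategy halves the message count even at four nodes, and the gap widens with network size, consistent with the efficiency result reported above.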
Abstract: Neural networks offer an alternative approach both
for identification and for control of nonlinear processes in process
engineering. The lack of software tools for the design of controllers
based on neural network models is particularly pronounced in this
field. SIMULINK is a widely used graphical code
development environment which allows system-level developers to
perform rapid prototyping and testing. Such a graphics-based
programming environment involves block-based code development
and offers a more intuitive approach to modeling and control tasks in
a great variety of engineering disciplines. In this paper a
SIMULINK-based neural tool has been developed for the analysis and
design of multivariable neural-network-based control systems. This
tool has been applied to the control of a high-purity distillation
column including nonlinear hydrodynamic effects. The proposed
control scheme offers an optimal response to both the theoretical and
the practical challenges posed by process control tasks, in particular
when both the quality improvement of distillation products and the
operational efficiency in economic terms are considered.
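The identification half of such a tool can be sketched in pure Python (rather than SIMULINK): a tiny one-hidden-layer network is fitted by gradient descent to data from an assumed nonlinear map. The network size, learning rate, and target function are all illustrative choices:

```python
# Sketch of neural identification of a nonlinear static process map.
import math, random

random.seed(0)
# Input/output data from an assumed nonlinear process (illustrative).
data = [(x / 10.0, math.tanh(x / 10.0) ** 2) for x in range(-20, 21)]

H = 4  # hidden neurons
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def predict(x):
    hidden = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * hidden[j] for j in range(H)) + b2

def mse():
    return sum((predict(x) - y) ** 2 for x, y in data) / len(data)

err0 = mse()
lr = 0.05
for _ in range(500):                 # plain stochastic gradient descent
    for x, y in data:
        hidden = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
        out = sum(w2[j] * hidden[j] for j in range(H)) + b2
        e = out - y
        b2 -= lr * e
        for j in range(H):
            gh = e * w2[j] * (1 - hidden[j] ** 2)   # backprop to layer 1
            w2[j] -= lr * e * hidden[j]
            w1[j] -= lr * gh * x
            b1[j] -= lr * gh

print(err0 > mse())  # training reduced the identification error
```

A graphical tool like the one described wraps exactly this identification loop in reusable blocks, so the engineer works with the model structure rather than the update equations.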
Abstract: The liberalization and privatization processes have
forced public utility companies to face new competitive challenges,
implementing strategies to gain market share and, at the same time,
keep the old customers. To this end, many companies have carried
out mergers, acquisitions and conglomerations in order to diversify
their business. This paper focuses on companies operating in the free
energy market in Italy. In the last decade, this sector has undergone
profound changes that have radically changed the competitive
scenario and have led companies to implement business
diversification strategies. Our work aims to evaluate the economic
and financial performance achieved by energy companies following
the beginning of the liberalization process, verifying its possible
relationship with the implemented diversification strategies.
Abstract: Droughts are complex, natural hazards that, to a
varying degree, affect some parts of the world every year. The range
of drought impacts is related to drought occurring in different stages
of the hydrological cycle; usually, different types of drought,
such as meteorological, agricultural, hydrological, and socio-economic,
are distinguished. Streamflow drought was analyzed by
the method of truncation level (at 70% level) on daily discharges
measured in 54 hydrometric stations in southwestern Iran. Frequency
analysis was carried out for annual maximum series (AMS) of
drought deficit volume and duration series. Some factors including
physiographic, climatic, geologic, and vegetation cover were studied
as influential factors in the regional analysis. According to the results
of factor analysis, six most effective factors were identified as area,
rainfall from December to February, the percent of area with
Normalized Difference Vegetation Index (NDVI)
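The truncation-level method can be sketched as follows; the discharge series and the exact Q70 ranking convention are illustrative, not the Iranian station data:

```python
# Sketch of the truncation-level method: drought events are the runs
# of days when discharge stays below the flow exceeded 70% of the
# time (Q70); each event has a duration and a deficit volume.

def truncation_level(flows, exceedance=0.70):
    """Flow exceeded `exceedance` fraction of the time (here: Q70)."""
    ranked = sorted(flows, reverse=True)
    return ranked[int(exceedance * len(ranked)) - 1]

def drought_events(flows, threshold):
    """Return (duration_days, deficit_volume) per below-threshold run."""
    events, dur, vol = [], 0, 0.0
    for q in flows:
        if q < threshold:
            dur += 1
            vol += threshold - q
        elif dur:
            events.append((dur, vol))
            dur, vol = 0, 0.0
    if dur:
        events.append((dur, vol))
    return events

flows = [5, 5, 5, 5, 5, 5, 5, 2, 1, 2, 5]   # synthetic daily discharge
q70 = truncation_level(flows)
print(q70, drought_events(flows, q70))
```

Annual maxima of these per-event deficit volumes and durations are what the frequency analysis above operates on.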
Abstract: In this study, we explore the use of information for inventory decisions in a healthcare organization (HO). We consider the scenario in which the HO can make use of information collected from correlated products to enhance its inventory planning. Motivated by our real-world observations that HOs adopt RFID and bar-coding systems for information collection, we examine the effectiveness of these systems for inventory planning with Bayesian information updating. We derive the optimal ordering decision and study the issue of Pareto improvement in the supply chain. Our analysis demonstrates that the RFID system outperforms the bar-coding system when the RFID installation cost and tag cost fall to a level comparable with that of the bar-coding system. We also show how an appropriately set wholesale pricing contract can achieve Pareto improvement in the HO supply chain.
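The flavour of Bayesian information updating here can be sketched with a normal-normal conjugate update and a newsvendor order rule; the priors, signal variances, and costs are invented, and the paper's actual model is not reproduced:

```python
# Stylized sketch: a normal prior on mean demand is updated with a
# demand signal from a correlated product; the order quantity then
# follows the newsvendor critical fractile.
from statistics import NormalDist

def posterior_mean(prior_mu, prior_var, signal, signal_var):
    """Normal-normal conjugate update of the mean-demand belief."""
    w = prior_var / (prior_var + signal_var)
    return prior_mu + w * (signal - prior_mu), (1 - w) * prior_var

def order_quantity(mu, var, underage, overage):
    """Newsvendor quantity at the critical fractile cu / (cu + co)."""
    frac = underage / (underage + overage)
    return NormalDist(mu, var ** 0.5).inv_cdf(frac)

# A more precise signal (smaller variance) moves the posterior closer
# to the observation -- the intuition behind RFID vs bar-coding here.
mu_rfid, _ = posterior_mean(100, 25, 120, 1)    # accurate RFID signal
mu_bar, _ = posterior_mean(100, 25, 120, 100)   # noisier bar-code signal
print(round(mu_rfid, 1), round(mu_bar, 1))
```

The higher-precision signal shifts the belief (and hence the order) much more strongly, which is why the value of the RFID system hinges on its cost relative to bar-coding.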
Abstract: Uterine and oviducal fluids are necessary for
capacitation of the spermatozoa and early embryonic development.
The aim of the present study was to determine the effects of estrous
cycle phases (follicular and luteal) on some biological parameters
(enzymes, electrolytes and total proteins) in uterine and oviducal
secretions of ewes. Oviducal and uterine fluids were collected,
diluted and centrifuged. According to our results, concentrations of
GPT, G6PDH, total proteins, K and Na were significantly (P
Abstract: This paper critiques several existing strategic
international human resource management (SIHRM) frameworks and
discusses their limitations to apply directly to emerging multinational
enterprises (EMNEs), especially those generated from China and
other BRICS nations. To complement the existing SIHRM
frameworks, key variables relevant to emerging economies are
identified and the extended model with particular reference to
EMNEs is developed with several research propositions. It is
believed that the extended model would better capture the recent
development of MNEs in transition, and alert emerging international
managers to address several human resource management challenges
in the global context.
Abstract: The Korean government has applied preliminary feasibility studies to new and large R&D programs since 2008. The study is carried out from the viewpoints of technology, policy, and economics; the separate analyses are then integrated to arrive at a definite result: whether a program is feasible or not. This paper describes the concept and method of the feasibility analysis, focusing on the technological viability assessment used in the technical analysis. It consists of a technology trend assessment and a technology level assessment. Through the analysis, we can estimate the chance of schedule delay or cost overrun occurring in the proposed plan.
Abstract: The availability of high-dimensional biological datasets, such as those from gene expression, proteomic, and metabolic experiments, can be leveraged for the diagnosis and prognosis of diseases. Many classification methods in this area have been studied to predict disease states and to separate predefined classes, such as patients with a particular disease versus healthy controls. However, most of the existing research focuses on a specific dataset; there is a lack of generic comparison between classifiers that might provide a guideline for biologists or bioinformaticians to select the proper algorithm for new datasets. In this study, we compare the performance of popular classifiers, namely Support Vector Machine (SVM), Logistic Regression, k-Nearest Neighbor (k-NN), Naive Bayes, Decision Tree, and Random Forest, on mock datasets. We mimic common biological scenarios by simulating various proportions of real discriminating biomarkers and different effect sizes thereof. The results show that SVM performs quite stably and reaches a higher AUC than the other methods, which may be explained by SVM's ability to minimize the probability of error. Moreover, Decision Tree, with its good applicability for diagnosis and prognosis, shows good performance in our experimental setup. Logistic Regression and Random Forest, however, depend strongly on the ratio of discriminators and perform better with a higher number of discriminators.
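A toy version of such a comparison (pure Python, no external libraries) can mimic the mock-data design: only a controlled subset of features discriminates between the classes. Nearest-centroid and k-NN stand in for the classifiers studied in the paper:

```python
# Mock two-class data where only the first N_DISCRIMINATING of
# N_FEATURES carry signal, evaluated with two simple classifiers.
import random

random.seed(42)
N_FEATURES, N_DISCRIMINATING, N_SAMPLES = 20, 5, 100

def sample(label):
    """Discriminating features are mean-shifted by the class label."""
    return [random.gauss(label * 2.0 if j < N_DISCRIMINATING else 0.0, 1.0)
            for j in range(N_FEATURES)], label

data = [sample(i % 2) for i in range(N_SAMPLES)]
train, test = data[: N_SAMPLES // 2], data[N_SAMPLES // 2:]

def centroid_predict(x):
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    cents = {}
    for lbl in (0, 1):
        pts = [f for f, l in train if l == lbl]
        cents[lbl] = [sum(col) / len(pts) for col in zip(*pts)]
    return min((0, 1), key=lambda l: dist(x, cents[l]))

def knn_predict(x, k=5):
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    nearest = sorted(train, key=lambda fl: dist(x, fl[0]))[:k]
    return round(sum(l for _, l in nearest) / k)   # majority vote

def accuracy(clf):
    return sum(clf(f) == l for f, l in test) / len(test)

print(accuracy(centroid_predict), accuracy(knn_predict))
```

Varying `N_DISCRIMINATING` and the mean shift reproduces, in miniature, the proportion-of-biomarkers and effect-size axes of the study's simulation design.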
Abstract: Dengue is a mosquito-borne infection that has risen to an alarming rate in recent decades. It is found in tropical and sub-tropical climates. In Malaysia, dengue has been declared one of the national health threats to the public. This study aimed to map the spatial distribution of dengue cases in the district of Hulu Langat, Selangor via a combination of Geographic Information System (GIS) and spatial statistic tools. Data related to dengue were gathered from various government health agencies. The locations of dengue cases were geocoded using a handheld Trimble Juno SB GPS. A total of 197 dengue cases occurring in 2003 were used in this study. The data were then aggregated to sub-district level and converted into GIS format. The study also used population and demographic data as well as the boundary of Hulu Langat. To assess the spatial distribution of dengue cases, three spatial statistics methods (Moran's I, average nearest neighbour (ANN), and kernel density estimation) were applied together with spatial analysis in the GIS environment. These three indices were used to analyze the spatial distribution and average distance of dengue incidence and to locate the hot spots of dengue cases. The results indicated that the dengue cases were clustered (p < 0.01) when analyzed using Moran's I, with a z-score of 5.03. The ANN analysis showed that the average nearest neighbour ratio is less than 1, namely 0.518755 (p < 0.0001). From this result, we can expect the dengue case pattern in Hulu Langat district to exhibit a clustered pattern. The z-score for dengue incidence within the district is -13.0525 (p < 0.0001). It was also found that significant spatial autocorrelation of dengue incidence occurs at an average distance of 380.81 meters (p < 0.0001). Several locations, especially residential areas, were also identified as hot spots of dengue cases in the district.
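The ANN index used above can be sketched directly: it is the ratio of the observed mean nearest-neighbour distance to the value expected under complete spatial randomness, 0.5 / sqrt(n / A), with values well below 1 indicating clustering. The coordinates below are invented, not the Hulu Langat cases:

```python
# Average nearest neighbour (ANN) ratio for a 2-D point pattern.
import math

def ann_ratio(points, area):
    def nn_dist(p):
        return min(math.dist(p, q) for q in points if q is not p)
    observed = sum(nn_dist(p) for p in points) / len(points)
    expected = 0.5 / math.sqrt(len(points) / area)  # random pattern
    return observed / expected

# Two tight clumps in a 100 x 100 study area: strongly clustered.
clustered = [(0, 0), (0, 1), (1, 0), (1, 1),
             (50, 50), (50, 51), (51, 50), (51, 51)]
print(round(ann_ratio(clustered, area=100 * 100), 3))  # well below 1
```

The z-score and p-value reported in the study come from comparing this ratio against its sampling distribution under randomness, which the sketch omits.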
Abstract: Urban road network traffic has become one of the
most studied research topics in the last decades. This is mainly due to
the enlargement of the cities and the growing number of motor
vehicles traveling in this road network. One of the most sensitive
problems is to verify if the network is congestion-free. Another
related problem is the automatic reconfiguration of the network
without building new roads to alleviate congestions. These problems
require an accurate model of the traffic to determine the steady state
of the system. An alternative is to simulate the traffic to see if there
are congestions and when and where they occur. One key issue is to
find an adequate model for road intersections. Once the model is
established, either a large-scale model is built, or the intersection
is represented by its performance measures and simulated for analysis.
In both cases, it is important to find a suitable queueing model to
represent the road intersection. In this paper, we propose to model
the road intersection as a BCMP queueing network, and we compare this
analytical model against a simulation model for validation.
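The analytical side can be illustrated with the simplest product-form building block, an M/M/1 node per approach lane (a BCMP network generalizes this to multiple customer classes and service disciplines); the rates below are illustrative:

```python
# Steady-state performance measures of an intersection modelled as
# independent M/M/1 approach lanes (toy product-form calculation).

def mm1_metrics(arrival, service):
    """Utilisation, mean number in system, and mean sojourn time for
    an M/M/1 queue (requires arrival < service for stability)."""
    assert arrival < service, "queue is unstable (congestion)"
    rho = arrival / service
    L = rho / (1 - rho)            # mean number in system
    W = 1 / (service - arrival)    # mean time in system (Little's law)
    return rho, L, W

# Four approach lanes: (arrival rate, service rate) per lane.
lanes = [(0.3, 1.0), (0.5, 1.0), (0.8, 1.0), (0.6, 1.0)]
total_L = sum(mm1_metrics(a, s)[1] for a, s in lanes)
print(round(total_L, 2))  # total mean number of vehicles queued
```

The stability assertion is the congestion-freeness check discussed above: any lane with arrival rate at or above its service rate grows without bound.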
Abstract: The Integrated Performance Modelling Environment
(IPME) is a powerful simulation engine for task simulation and
performance analysis. However, it lacks high-level cognition, such
as memory and reasoning, needed for complex simulation. This article
introduces a knowledge representation and reasoning scheme that can
accommodate uncertainty in simulations of military personnel with
IPME. This approach demonstrates how advanced reasoning models
that support similarity-based associative process, rule-based abstract
process, multiple reasoning methods and real-time interaction can be
integrated with conventional task network modelling to provide
greater functionality and flexibility when modelling operator
performance.
Abstract: Prostate cancer is one of the most frequent cancers in men and a major cause of mortality in most countries. Many diagnostic and treatment procedures for prostate disease require accurate detection of prostate boundaries in transrectal ultrasound (TRUS) images. This is a challenging and difficult task due to weak prostate boundaries, speckle noise, and the short range of gray levels. In this paper, a novel method for automatic prostate segmentation in TRUS images is presented. The method involves preprocessing (edge-preserving noise reduction and smoothing) and prostate segmentation. Speckle reduction has been achieved using a stick filter, and a top-hat transform has been implemented for smoothing. A feed-forward neural network and local binary patterns have been used together to find a point inside the prostate object. Finally, the prostate boundary is extracted using the inside point and an active contour algorithm. A number of experiments were conducted to validate this method, and the results showed that the new algorithm extracts the prostate boundary with an MSE of less than 4.6% relative to the boundary provided manually by physicians.
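The local binary pattern feature mentioned above can be sketched for a single 3x3 patch; the neighbour ordering convention below is one common choice, not necessarily the paper's:

```python
# 8-neighbour local binary pattern (LBP): each neighbour is
# thresholded against the centre pixel and the bits form one byte.

def lbp_code(patch):
    """LBP code of the centre pixel of a 3x3 patch; neighbours are
    read clockwise starting from the top-left corner."""
    c = patch[1][1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2),
             (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (i, j) in enumerate(order):
        if patch[i][j] >= c:
            code |= 1 << bit
    return code

patch = [[9, 8, 1],
         [7, 5, 2],
         [6, 4, 3]]
print(lbp_code(patch))
```

Histograms of these codes over image windows give the texture descriptor that, combined with the neural network, localizes a point inside the prostate.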
Abstract: System-level design based on high-level abstractions
is becoming increasingly important in hardware and embedded
system design. This paper analyzes meta-design techniques oriented
at developing meta-programs and meta-models for well-understood
domains. Meta-design techniques include meta-programming and
meta-modeling. At the programming level of the design process,
meta-design means developing generic components that are usable in a
wider context of application than original domain components. At the
modeling level, meta-design means developing design patterns that
describe general solutions to the common recurring design problems,
and meta-models that describe the relationship between different
types of design models and abstractions. The paper describes and
evaluates the implementation of meta-design in hardware design
domain using object-oriented and meta-programming techniques.
The presented ideas are illustrated with a case study.
Abstract: In modern distributed software systems, communication among the composing parts represents a critical point, but the idea of extending conventional programming languages with general-purpose communication constructs seems difficult to realize. As a consequence, there is a (growing) gap between the abstraction level required by distributed applications and the concepts provided by the platforms that enable communication. This work discusses how the Model Driven Software Development approach can be considered a mature technology to automatically generate the schematic part of applications related to communication, while providing high-level specialized languages useful in all phases of software production. To achieve this goal, a stack of languages (meta-metamodels) has been introduced in order to describe, at different levels of abstraction, the collaborative behavior of generic entities in terms of communication actions related to a taxonomy of messages. Finally, the generation of platforms for communication is viewed as a form of specification of language semantics, which provides executable models of applications together with model-checking support and effective runtime environments.
Abstract: Electrocardiogram (ECG) is considered to be the
backbone of cardiology. ECG is composed of P, QRS & T waves and
information related to cardiac diseases can be extracted from the
intervals and amplitudes of these waves. The first step in extracting
ECG features is the accurate detection of R peaks in the
QRS complex. We have developed a robust R-wave detector using
wavelets. The wavelets used for detection are Daubechies and
symmetric (Symlet) wavelets. The method requires no preprocessing;
it needs only correctly recorded ECG signals to perform the
detection. The signals were collected from the MIT-BIH arrhythmia
database, and the Lead-II recordings have been analyzed. MATLAB
7.0 has been used to develop the algorithm. The ECG signal under
test has been decomposed to the required level using the selected
wavelet and the selection of detail coefficient d4 has been done based
on energy, frequency and cross-correlation analysis of decomposition
structure of ECG signal. The robustness of the method is apparent
from the obtained results.
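As a schematic stand-in for the wavelet-based detector (a single Haar level, the simplest Daubechies wavelet, instead of the paper's decomposition to the d4 detail), large detail coefficients flag QRS-like transitions in a synthetic signal:

```python
# Level-1 Haar detail coefficients plus thresholding to locate sharp
# "R-peak" transitions in a synthetic spiky signal.
import math

SQRT2 = math.sqrt(2.0)

def haar_detail(signal):
    """Level-1 Haar detail coefficients (scaled pairwise differences)."""
    return [(signal[2 * i] - signal[2 * i + 1]) / SQRT2
            for i in range(len(signal) // 2)]

def detect_peaks(signal, factor=3.0):
    d = haar_detail(signal)
    thr = factor * (sum(abs(x) for x in d) / len(d))
    # Map each flagged coefficient back to its sample position.
    return [2 * i for i, x in enumerate(d) if abs(x) > thr]

# Flat baseline with two sharp "R peaks" at samples 8 and 20.
ecg = [0.0] * 24
ecg[8] = 5.0
ecg[20] = 5.0
print(detect_peaks(ecg))
```

Real ECG would need the deeper decomposition and the energy/cross-correlation-based coefficient selection described above, since baseline wander and T waves also produce detail energy at the coarser levels.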