Abstract: This study examined a habitat-suitability assessment method, the Ecological Niche Factor Analysis (ENFA). A virtual species was created and then released in a geographic information system model of a real landscape under three historical scenarios: (1) spreading, (2) equilibrium, and (3) overabundance. In each scenario, the virtual species was sampled and the simulated data sets were used as inputs for the ENFA to reconstruct the habitat suitability model. The 'equilibrium' scenario yielded the best results, in both quantity and quality, of the three scenarios. ENFA was sensitive to the distribution scenario but not to sample size. The use of a virtual species proved to be a very efficient method, allowing one to fully control the quality of the input data as well as to accurately evaluate the predictive power of the analyses.
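The marginality factor at the core of ENFA can be illustrated with a minimal sketch. It assumes the standard definition (species mean minus global mean, scaled by 1.96 global standard deviations); the landscape data below are synthetic stand-ins, not the study's:

```python
import numpy as np

def marginality(global_values, presence_values):
    """Per-variable ENFA marginality: how far the species' mean departs
    from the global mean, in units of 1.96 global standard deviations."""
    mu_g, sd_g = np.mean(global_values), np.std(global_values)
    return abs(np.mean(presence_values) - mu_g) / (1.96 * sd_g)

# Toy virtual species occupying warmer-than-average grid cells.
rng = np.random.default_rng(0)
landscape_temp = rng.normal(10.0, 3.0, size=10_000)   # all grid cells
species_temp = landscape_temp[landscape_temp > 13.0]  # sampled presences
m = marginality(landscape_temp, species_temp)
```

A marginality near 1 indicates a species occupying conditions far from the study-area average; a value near 0 indicates no habitat preference on that variable.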
Abstract: Induction machine models used for steady-state and transient analysis require machine parameters that are usually considered design parameters or data. Knowledge of induction machine parameters is very important for Indirect Field Oriented Control (IFOC): a mismatched set of parameters will degrade the speed and torque control response. This paper presents an improved approach to rotor time constant adaptation in IFOC for Induction Machines (IM). Our approach aims to improve the estimation accuracy of the fundamental model used for flux estimation. Based on a reduced-order IM model, the rotor fluxes and the rotor time constant are estimated using only the stator currents and voltages. This reduced-order model offers many advantages for real-time parameter identification of the IM.
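The "current model" flux equation that underlies reduced-order rotor flux estimators of this kind can be sketched as follows. The machine constants are illustrative stand-ins, not the paper's identified values:

```python
import numpy as np

# Hypothetical machine constants: Lm = magnetizing inductance [H],
# Tr = rotor time constant [s] (illustrative values only).
Lm, Tr = 0.2, 0.15
dt = 1e-4   # integration step [s]

def rotor_flux_step(psi_rd, i_sd):
    """One Euler step of the d-axis 'current model' flux estimator:
    Tr * dpsi/dt = Lm * i_sd - psi  (rotor-field-oriented frame)."""
    return psi_rd + dt * (Lm * i_sd - psi_rd) / Tr

# With a constant magnetizing current the flux settles at Lm * i_sd.
psi = 0.0
for _ in range(20_000):   # 2 s of simulated time
    psi = rotor_flux_step(psi, i_sd=5.0)
```

A mismatched Tr shifts the estimator's dynamics rather than its steady state here, which is why IFOC schemes adapt Tr online as the abstract describes.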
Abstract: Since the pioneering work of Zadeh, fuzzy set theory has been applied to a myriad of areas. Song and Chissom introduced the concept of fuzzy time series and applied some methods to the enrollments of the University of Alabama. In recent years, a number of techniques have been proposed for forecasting based on fuzzy set theory. These methods have used either enrollment numbers or differences of enrollments as the universe of discourse. In this communication, we modify the approach of Jilani, Burney, and Ardil by using the year-to-year percentage change as the universe of discourse, and we use enrollment figures for the University of Alabama to illustrate the proposed method. The proposed method results in better forecasting accuracy than existing models.
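The first steps of such a method, defining the universe of discourse on percentage changes and fuzzifying the observations, can be sketched as below. The series is the widely reproduced University of Alabama enrollment data (shown here for illustration), and the single-interval defuzzified forecast is a deliberate simplification of the full fuzzy-relation machinery:

```python
import numpy as np

enrollments = np.array([13055, 13563, 13867, 14696, 15460, 15311,
                        15603, 15861, 16807, 16919], dtype=float)
# Universe of discourse: year-to-year percentage change, not raw counts.
pct_change = 100.0 * np.diff(enrollments) / enrollments[:-1]

# Partition the universe into n equal-width intervals (a common first step).
n = 7
lo, hi = pct_change.min() - 0.5, pct_change.max() + 0.5
edges = np.linspace(lo, hi, n + 1)
mids = (edges[:-1] + edges[1:]) / 2.0

# Fuzzify: each observed change belongs to the interval containing it.
labels = np.clip(np.digitize(pct_change, edges) - 1, 0, n - 1)

# Naive defuzzified forecast: apply the midpoint of the last observed
# fuzzy set to the last enrollment figure.
forecast = enrollments[-1] * (1.0 + mids[labels[-1]] / 100.0)
```

Working in percentage changes keeps the universe of discourse stable as enrollments grow, which is the motivation behind the modification described above.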
Abstract: Much work has been done on predicting the fault proneness of software systems. However, the severity of the existing faults matters more than their number, since major faults matter most to a developer and need immediate attention. In this paper, we try to predict the level of impact of the faults existing in software systems. A neuro-fuzzy based predictor model is applied to NASA's public-domain defect dataset, coded in the C programming language. Correlation-based Feature Selection (CFS) evaluates the worth of a subset of attributes by considering the individual predictive ability of each feature along with the degree of redundancy between them, so CFS is used to select the metrics most highly correlated with the level of severity of faults. The results are compared with the prediction results of Logistic Model Trees (LMT), earlier reported as the best technique in [17], and are recorded in terms of Accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). The results show that the neuro-fuzzy based model provides relatively better prediction accuracy than the other models and hence can be used to model the level of impact of faults in function-based systems.
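The CFS merit criterion described above has a closed form (Hall's formulation): merit = k·r̄cf / sqrt(k + k(k-1)·r̄ff), where r̄cf is the mean feature-class correlation and r̄ff the mean feature-feature correlation. A sketch on synthetic data, not the NASA metrics:

```python
import numpy as np

def cfs_merit(X, y):
    """CFS merit of a feature subset: rewards class correlation,
    penalizes inter-feature redundancy."""
    k = X.shape[1]
    r_cf = np.mean([abs(np.corrcoef(X[:, i], y)[0, 1]) for i in range(k)])
    if k == 1:
        return r_cf
    r_ff = np.mean([abs(np.corrcoef(X[:, i], X[:, j])[0, 1])
                    for i in range(k) for j in range(i + 1, k)])
    return k * r_cf / np.sqrt(k + k * (k - 1) * r_ff)

rng = np.random.default_rng(1)
y = rng.normal(size=400)                               # stand-in 'severity'
informative = y + 0.1 * rng.normal(size=400)           # strong predictor
redundant = informative + 1.0 * rng.normal(size=400)   # noisy near-copy
noise = rng.normal(size=400)                           # irrelevant metric

m_single = cfs_merit(informative[:, None], y)
m_with_noise = cfs_merit(np.column_stack([informative, noise]), y)
m_with_redundant = cfs_merit(np.column_stack([informative, redundant]), y)
```

Adding either a redundant copy or an irrelevant metric lowers the merit relative to the single good predictor, which is exactly the behavior CFS exploits for metric selection.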
Abstract: Uncertainties in a serial production line affect the production throughput. These uncertainties cannot be prevented in a real production line; however, the uncertain conditions can be controlled by a robust prediction model. Thus, a hybrid model combining an autoregressive integrated moving average (ARIMA) and multiple polynomial regression is proposed to model the nonlinear relationship of production uncertainties with throughput. The uncertainties under consideration in this study are demand, breaktime, scrap, and lead-time. The nonlinear relationship of production uncertainties with throughput is examined in the form of quadratic and cubic regression models, whose adjusted R-squared values were 98.3% and 98.2%, respectively. We optimized the multiple quadratic regression (MQR) by considering the time series trend of the uncertainties using an ARIMA model. Finally, the hybrid ARIMA-MQR model achieves a better adjusted R-squared of 98.9%.
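The MQR half of such a hybrid can be sketched with ordinary least squares on linear plus squared terms. The data here are synthetic (two of the paper's four uncertainties, with made-up coefficients), and the ARIMA stage on the uncertainty trends is omitted:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
demand = rng.uniform(50, 150, n)   # hypothetical uncertainty input
scrap = rng.uniform(0, 10, n)      # hypothetical uncertainty input
# Synthetic throughput with a genuinely quadratic response plus noise.
throughput = (500 + 2.0 * demand - 0.005 * demand**2
              - 3.0 * scrap - 0.2 * scrap**2 + rng.normal(0, 2.0, n))

# Multiple quadratic regression (MQR): intercept, linear and squared terms.
X = np.column_stack([np.ones(n), demand, demand**2, scrap, scrap**2])
beta, *_ = np.linalg.lstsq(X, throughput, rcond=None)
pred = X @ beta

ss_res = np.sum((throughput - pred) ** 2)
ss_tot = np.sum((throughput - throughput.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - X.shape[1])  # adjusted R-squared
```

The adjusted R-squared reported in the abstract plays the same role as `r2_adj` here: it penalizes the extra quadratic terms so model orders can be compared fairly.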
Abstract: Renewed interest in propeller propulsion on aircraft configurations, combined with higher propeller loads, leads to the question of how the effects of the propulsion on model support disturbances should be accounted for. In this paper, the determination of engine power effects on the support interference of sting-mounted models is demonstrated by measurements on a four-engine turboprop aircraft. CFD results on a more generic model are presented in order to clarify the possible mechanism behind engine power effects on support interference. The engine slipstream induces a local change in the angle of sideslip at the model sting, thereby influencing the sting's near-field and far-field effects. Whether or not the net result of these changes in the disturbance pattern leads to a significant engine power effect depends on the configuration of the wind tunnel model and the test setup.
Abstract: In previous multi-solid models, the φ approach is used for the calculation of fugacity in the liquid phase. For the first time, in the proposed multi-solid thermodynamic model, the γ approach has been used for the calculation of fugacity in the liquid mixture. Several activity coefficient models were therefore studied, and the results show that the predictive Wilson model is more appropriate than the others. The results demonstrate that the γ approach using the predictive Wilson model agrees more closely with experimental data than the previous multi-solid models. This method also yields a new approach for performing stability analysis in phase equilibrium calculations. Meanwhile, the run time of the γ approach is less than that of the previous methods using the φ approach. The new model shows an average absolute deviation (AAD) of 0.75% from the experimental data, which is clearly lower than the error of the previous multi-solid models.
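For reference, the binary-mixture form of the Wilson activity coefficient model used in a γ approach can be written down directly; the parameter values in the check below are arbitrary, not fitted to any system:

```python
import numpy as np

def wilson_gamma1(x1, lam12, lam21):
    """Activity coefficient of component 1 in a binary mixture from the
    Wilson model; lam12, lam21 are the Wilson interaction parameters."""
    x2 = 1.0 - x1
    s1 = x1 + lam12 * x2
    s2 = x2 + lam21 * x1
    return np.exp(-np.log(s1) + x2 * (lam12 / s1 - lam21 / s2))
```

Two sanity checks follow from the model itself: γ1 = 1 for the pure component (x1 = 1) and for ideal parameters (Λ12 = Λ21 = 1), and the infinite-dilution limit is ln γ1∞ = -ln Λ12 + 1 - Λ21.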
Abstract: Quantitative precipitation forecasts (QPF) from an atmospheric model, used as input to a hydrological model in an integrated hydro-meteorological flood forecasting system, are operational in many countries worldwide. High-resolution numerical weather prediction (NWP) models with grid cell sizes between 2 and 14 km have great potential to contribute towards reasonably accurate QPF. In this study the potential of two NWP models to forecast precipitation for a flood-prone area in a tropical region is examined. The precipitation forecasts produced by the Fifth-Generation Penn State/NCAR Mesoscale (MM5) and Weather Research and Forecasting (WRF) models are statistically verified against the observed rain in the Kelantan River Basin, Malaysia. The statistical verification indicates that the models performed quite satisfactorily for low and moderate rainfall but less satisfactorily for heavy rainfall.
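The kind of statistical verification reported above can be sketched with a few standard scores. The 50 mm/day heavy-rain threshold and the rainfall values below are illustrative assumptions, not the study's data:

```python
import numpy as np

def verify(forecast, observed, heavy_mm=50.0):
    """Simple QPF verification: mean bias, RMSE, and probability of
    detection (POD) for the 'heavy rain' class above a threshold."""
    forecast, observed = np.asarray(forecast), np.asarray(observed)
    bias = np.mean(forecast - observed)
    rmse = np.sqrt(np.mean((forecast - observed) ** 2))
    heavy = observed >= heavy_mm
    pod = np.mean(forecast[heavy] >= heavy_mm) if heavy.any() else np.nan
    return bias, rmse, pod

obs = [2.0, 10.0, 0.0, 65.0, 30.0, 80.0]   # mm/day, illustrative
fct = [3.0, 8.0, 1.0, 40.0, 28.0, 55.0]    # underestimates the heavy days
bias, rmse, pod = verify(fct, obs)
```

The pattern in this toy example mirrors the abstract's finding: small errors on the light-rain days but a missed heavy-rain event, which drags down the POD for the heavy class.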
Abstract: A three-dimensional thermal model of full-penetration Nd:YAG laser welding, in transparent mode, of 1.25 mm thick DP600 alloy steel with a gap of 0.1 mm is analyzed. Three models studied the influence on the transient temperature of temperature-dependent thermal properties, temperature-independent thermal properties, and the peak value of specific heat at the phase transformation temperature, AC1. Another seven models studied the influence of the discretization meshes on the temperature distribution in the welded plate. It is shown that, for the effects of the thermal properties, errors of less than 4% in the maximum temperature in the fusion zone (FZ) and heat-affected zone (HAZ) were identified. The minimum discretization is at least one third of an increment per radius for the temporal discretization, while the spatial discretization requires two elements per radius and four elements through the thickness of the assembled plate; these therefore represent the minimum modeling requirements for laser welding in order to keep the errors below 5% compared to the fine mesh.
Abstract: This paper analyzes patterns in Monte Carlo data for a large number of variables and minterms in order to characterize circuit path length behavior. We propose models determined by a training process on shortest path lengths derived from a wide range of binary decision diagram (BDD) simulations. The model was created using a feed-forward neural network (NN) modeling methodology. Experimental results for ISCAS benchmark circuits show an RMS error of 0.102 for the shortest path length complexity estimates predicted by the NN model (NNM). Use of such a model can help reduce the time complexity of very large scale integrated (VLSI) circuits and the related computer-aided design (CAD) tools that use BDDs.
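A minimal feed-forward network of the kind described, trained with plain backpropagation, can be sketched as follows. The inputs and target are synthetic stand-ins (the real inputs would be the Monte Carlo BDD statistics, e.g. variable and minterm counts):

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-in training set: two normalized inputs, one smooth target that
# plays the role of a 'shortest path length' score.
X = rng.uniform(0.0, 1.0, size=(256, 2))
y = 0.6 * X[:, 0] + 0.3 * np.sin(3.0 * X[:, 1]) + 0.1

# One hidden layer of 8 tanh units, linear output.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, (h @ W2 + b2).ravel()

_, pred0 = forward(X)
mse0 = np.mean((pred0 - y) ** 2)
for _ in range(2000):                        # full-batch backprop
    h, pred = forward(X)
    err = (pred - y)[:, None] / len(X)       # dLoss/dOutput
    gW2 = h.T @ err; gb2 = err.sum(0)
    dh = (err @ W2.T) * (1 - h ** 2)         # backprop through tanh
    gW1 = X.T @ dh; gb1 = dh.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
_, pred1 = forward(X)
mse1 = np.mean((pred1 - y) ** 2)
```

The training loop drives the mean squared error well below its initial value, which is the same criterion (RMS error) the abstract uses to judge the NNM.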
Abstract: In this paper we present an autoregressive model with neural network modeling, trained with the standard error backpropagation algorithm, in order to predict the gross domestic product (GDP) growth rate of four countries. Specifically, we propose a kind of weighted regression, usable for econometric purposes, in which the initial inputs are multiplied by the neural network's final optimum input-to-hidden-layer weights obtained after the training process. The forecasts are compared with those of the ordinary autoregressive model, and we conclude that the proposed regression's forecasts significantly outperform those of the autoregressive model in the out-of-sample period. The idea behind this approach is to propose a parametric regression with weighted variables in order to test the statistical significance and the magnitude of the estimated autoregressive coefficients and simultaneously to estimate the forecasts.
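The weighted-input regression idea can be sketched on a synthetic AR(2) series. The "NN-derived" weights below are fixed hypothetical numbers standing in for the trained input-to-hidden-layer weights, and the series is simulated, not real GDP data:

```python
import numpy as np

rng = np.random.default_rng(4)
phi1, phi2, n = 0.5, 0.3, 400
g = np.zeros(n)                      # synthetic 'growth rate' series
for t in range(2, n):
    g[t] = phi1 * g[t - 1] + phi2 * g[t - 2] + rng.normal(0, 0.5)

# Lagged design matrix for an AR(2) regression.
X = np.column_stack([g[1:-1], g[:-2]])   # lags 1 and 2
y = g[2:]

# Hypothetical weights playing the role of the trained NN weights.
w = np.array([0.9, 1.1])
Xw = X * w                               # weighted-input regression

beta, *_ = np.linalg.lstsq(Xw, y, rcond=None)
forecast = Xw[-1] @ beta                 # one-step-ahead fitted value
```

With plain OLS the rescaling is absorbed into the coefficients (beta[i] * w[i] recovers the underlying AR coefficient), so the weighting changes how significance and magnitude are read off the fitted model rather than the fit itself.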
Abstract: In many cases, there is some time lag between the consumption of inputs and the production of outputs. This time lag effect should be considered in evaluating the performance of organizations. Recently, a couple of DEA models were developed to consider the time lag effect in the efficiency evaluation of research activities. The Multi-period Input (MpI) and Multi-period Output (MpO) models are integrated models that calculate simple efficiency while considering the time lag effect. However, these models cannot discriminate between efficient DMUs because of the nature of the basic DEA model, in which efficiency scores are limited to 1; that is, efficient DMUs cannot be discriminated because their efficiency scores are the same. Thus, this paper suggests a super-efficiency model for efficiency evaluation under consideration of the time lag effect, based on the MpO model. A case example using a long-term research project is given to compare the suggested model with the MpO model.
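The discrimination gain from super-efficiency can be shown in the simplest single-input, single-output CCR case, where the linear program collapses to a ratio. This is a deliberate simplification of the paper's MpO-based model (real multi-input applications need an LP solver), and the numbers are illustrative:

```python
import numpy as np

def super_efficiency(x, y, j0):
    """Super-efficiency of DMU j0 in the single-input single-output CCR
    case: its productivity ratio divided by the best ratio among the
    OTHER DMUs, so efficient units can score above 1."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    ratios = y / x
    others = np.delete(ratios, j0)
    return ratios[j0] / others.max()

# Illustrative aggregated (MpO-style) inputs and lagged outputs.
inputs = [1.0, 1.0, 1.0]
outputs = [2.0, 1.0, 0.5]
scores = [super_efficiency(inputs, outputs, j) for j in range(3)]
```

In a plain DEA evaluation the first DMU would simply score 1; under super-efficiency it scores above 1 because it is excluded from its own reference set, which is exactly what restores discrimination among efficient DMUs.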
Abstract: This study proposes a multi-response surface optimization problem (MRSOP) for determining the proper choices in a process parameter design (PPD) decision problem in the noisy environment of a grease position process in the electronics industry. The proposed model attempts to maximize dual process responses based on the mean of parts between failures on the left and right processes. The conventional modified simplex method and its hybridization with a stochastic operator from the hunting search algorithm are applied to determine the proper levels of the controllable design parameters affecting the quality performances. A numerical example demonstrates the feasibility of applying the proposed model to the PPD problem via the two iterative methods, and their advantages are also discussed. Numerical results demonstrate that the hybridization is superior to the conventional method alone. In this study, the mean of parts between failures on the left and right lines improved by approximately 39.51%. All experimental data presented in this research have been normalized to disguise actual performance measures, as the raw data are considered confidential.
Abstract: In this research we show that the dynamics of an action potential in a cell can be modeled with a linear combination of the dynamics of the gating state variables, and that the modeling error is negligible. Our findings can be used to simplify cell models and reduce the computational burden, i.e. they are useful for simulating action potential propagation in large-scale computations such as tissue modeling. We have verified our finding using several cell models.
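The central claim, fitting the potential trace as a linear combination of the gating traces, can be sketched with a least-squares fit. The gating shapes and coefficients below are synthetic stand-ins, not a real Hodgkin-Huxley solution:

```python
import numpy as np

# Synthetic 'gating variable' traces standing in for m, h, n.
t = np.linspace(0.0, 1.0, 500)
m = 1.0 / (1.0 + np.exp(-20.0 * (t - 0.3)))   # fast activation-like
h = np.exp(-3.0 * t)                          # inactivation-like decay
n = 1.0 / (1.0 + np.exp(-8.0 * (t - 0.5)))    # delayed rectifier-like

# 'Membrane potential' built from the gates plus a small disturbance.
rng = np.random.default_rng(5)
v = 80.0 * m - 30.0 * h + 25.0 * n - 65.0 + rng.normal(0, 0.2, t.size)

# Fit V(t) as a linear combination of the gating states plus a constant.
A = np.column_stack([m, h, n, np.ones_like(t)])
coef, *_ = np.linalg.lstsq(A, v, rcond=None)
rel_err = np.linalg.norm(v - A @ coef) / np.linalg.norm(v)
```

By construction the relative residual here is tiny; the paper's result is that the same kind of fit stays accurate for actual cell models, which is what justifies the simplification.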
Abstract: In this paper, we are concerned with the design and simulation studies of a modified extremum seeking control for nonlinear systems. The standard extremum seeking control has a simple structure, but it takes a long time to reach an optimal operating point. We consider a modification of the standard extremum seeking control aimed at reaching the optimal operating point more quickly. In the modification, a PD acceleration term is added before the integrator forming the principal control, so that the controlled objects are regulated to the optimal point smoothly. The proposed method is applied to the Monod and Williams-Otto models to investigate its effectiveness. Numerical simulation results show that the modified method improves the time response to the optimal operating point compared with the standard one.
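A perturbation-based extremum seeking loop with a PD term before the integrator can be sketched on a static quadratic map. All gains, the low-pass filter, and the map itself are illustrative assumptions, not the paper's tuning or its Monod/Williams-Otto plants:

```python
import numpy as np

J = lambda theta: -(theta - 2.0) ** 2   # toy map, optimum at theta = 2

dt, T = 1e-3, 30.0
a, omega = 0.2, 10.0          # dither amplitude / frequency
k, kp, kd = 2.0, 1.0, 0.05    # integrator gain, PD gains (illustrative)
wl = 1.0                      # low-pass corner for the gradient estimate

theta, xi_lp, xi_prev = 0.0, 0.0, 0.0
for i in range(int(T / dt)):
    t = i * dt
    dither = a * np.sin(omega * t)
    xi = np.sin(omega * t) * J(theta + dither)    # demodulated output
    xi_lp += dt * wl * (xi - xi_lp)               # low-pass: ~ (a/2) J'(theta)
    u = kp * xi_lp + kd * (xi_lp - xi_prev) / dt  # PD before the integrator
    xi_prev = xi_lp
    theta += dt * k * u                           # integrator (principal control)
```

Setting kd = 0 recovers the standard scheme; the derivative term adds anticipation to the gradient estimate, which is the mechanism the abstract credits for the faster response.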
Abstract: This paper describes an automatic algorithm to restore the shape of three-dimensional (3D) left ventricle (LV) models created from magnetic resonance imaging (MRI) data using a geometry-driven optimization approach. Our basic premise is to restore the LV shape such that the LV epicardial surface is smooth after the restoration. A geometrical measure known as the minimum principal curvature (κ2) is used to assess the smoothness of the LV, and this measure is used to construct the objective function of a two-step optimization process. The objective of the optimization is to achieve a smooth epicardial shape by iterative in-plane translation of the MRI slices; quantitatively, this yields a minimum sum of the magnitudes of κ2 wherever κ2 is negative. A limited-memory quasi-Newton algorithm, L-BFGS-B, is used to solve the optimization problem. We tested our algorithm on an in vitro theoretical LV model and on 10 in vivo patient-specific models containing significant motion artifacts. The results show that our method is able to automatically restore the shape of the LV models to smoothness without altering the general shape of the model. The magnitudes of the in-plane translations are also consistent with existing registration techniques and experimental findings.
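The structure of the optimization, shifting slices in-plane to minimize a smoothness objective, can be sketched in one dimension. Note the simplifications: each slice is reduced to a single centre coordinate, the roughness objective is a squared-second-difference proxy rather than the paper's κ2 measure, and plain gradient descent replaces L-BFGS-B:

```python
import numpy as np

rng = np.random.default_rng(6)
true_centres = np.linspace(0.0, 1.0, 12) ** 2   # smooth stack profile
centres = true_centres + rng.normal(0, 0.15, 12)  # motion-artefact jitter

def roughness(c):
    """Sum of squared second differences: large when the stack zig-zags."""
    return np.sum(np.diff(c, 2) ** 2)

def grad(c):
    """Analytic gradient of the roughness objective."""
    g = np.zeros_like(c)
    for i, v in enumerate(np.diff(c, 2)):   # v = c[i] - 2c[i+1] + c[i+2]
        g[i] += 2 * v; g[i + 1] -= 4 * v; g[i + 2] += 2 * v
    return g

r0 = roughness(centres)
shifts = np.zeros_like(centres)   # per-slice in-plane translations
for _ in range(500):
    shifts -= 0.05 * grad(centres + shifts)
r1 = roughness(centres + shifts)
```

As in the paper, the optimization variable is the per-slice translation, not the surface itself, so the recovered stack is the original data shifted into a smooth configuration.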
Abstract: Many approaches have been proposed for solving Sudoku puzzles. One of them is to model the puzzles as block world problems, and three models for Sudoku solvers have been based on this approach, each expressing the Sudoku solver as a parameterized multi-agent system. In this work, we propose a new model which is an improvement over the existing models. This paper presents the development of a Sudoku solver that implements all the proposed models, and some experiments have been conducted to determine the performance of each model.
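For contrast with the multi-agent formulations above, the baseline against which Sudoku solvers are usually measured is a plain single-agent backtracking search (this is not one of the paper's models):

```python
def valid(grid, r, c, v):
    """True if value v may be placed at (r, c) under Sudoku rules."""
    if v in grid[r]:
        return False
    if any(grid[i][c] == v for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != v for i in range(3) for j in range(3))

def solve(grid):
    """Depth-first backtracking: fill the first empty cell, recurse."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for v in range(1, 10):
                    if valid(grid, r, c, v):
                        grid[r][c] = v
                        if solve(grid):
                            return True
                        grid[r][c] = 0
                return False
    return True

# A cyclically shifted (and therefore valid) grid with three cells blanked.
puzzle = [
    [1, 2, 3, 4, 5, 6, 7, 8, 9],
    [4, 5, 6, 7, 8, 9, 1, 2, 3],
    [7, 8, 9, 1, 2, 3, 4, 5, 6],
    [2, 3, 4, 5, 6, 7, 8, 9, 1],
    [5, 6, 7, 8, 9, 1, 2, 3, 4],
    [8, 9, 1, 2, 3, 4, 5, 6, 7],
    [3, 4, 5, 6, 7, 8, 9, 1, 2],
    [6, 7, 8, 9, 1, 2, 3, 4, 5],
    [9, 1, 2, 3, 4, 5, 6, 7, 8],
]
puzzle[0][0] = puzzle[4][4] = puzzle[8][8] = 0
solved = solve(puzzle)
```

Multi-agent models distribute exactly these row, column, and block constraints across agents; the backtracking version keeps them in one search loop.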
Abstract: The application of a simple microcontroller implementing a three-input, single-output fuzzy logic controller with built-in Proportional-Integral-Derivative (PID) response control has been tested for an automatic voltage regulator. The fuzzifiers are based on a fixed range of the output voltage variables. The control output is used to drive the wiper motor of the autotransformer to adjust the voltage, using fuzzy logic principles, so that the voltage is stabilized. In this report, the author demonstrates how fuzzy logic can provide elegant and efficient solutions in the design of multivariable control based on experimental results rather than on mathematical models.
Abstract: Nowadays, developing countries seek to progress in science and technology and to close the technological gap with developed countries by increasing their capacities and by technology transfer from developed countries. To remain competitive, industry continually searches for new methods to evolve its products. The business model is one of the latest buzzwords in the Internet and electronic business world, and to be successful, organizations must look into the needs and wants of their customers. This research attempts to identify a specific feature of the company with a strong competitive advantage by analyzing the causes of customer satisfaction. Due to the rapid development of knowledge and information technology, business environments have become much more complicated, and information technology can help a firm aiming to gain a competitive advantage. This study explores the role and effect of Information and Communication Technology (ICT) in business models and customer satisfaction in firms, as well as the relationship between ICTs and strategic outsourcing.
Abstract: Cognitive models allow predicting some aspects of the utility and usability of human machine interfaces (HMI), and simulating the interaction with these interfaces. The prediction is based on a task analysis, which investigates what a user is required to do, in terms of actions and cognitive processes, to achieve a task. Task analysis facilitates the understanding of the system's functionalities. Cognitive models belong to the analytical approaches, which do not involve users during the development process of the interface. This article presents a study on the evaluation of human machine interaction with a contextual assistant's interface using the ACT-R and GOMS cognitive models. The present work shows how these techniques may be applied in HMI evaluation, design and research by emphasizing firstly the task analysis and secondly the execution time of the task. In order to validate and support our results, an experimental study of user performance was conducted at the DOMUS laboratory during interaction with the contextual assistant's interface. The results of our models show that the GOMS and ACT-R models give good and excellent predictions, respectively, of user performance at the task level as well as the object level; the simulated results are therefore very close to those obtained in the experimental study.
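The execution-time side of a GOMS analysis can be sketched with its simplest relative, the Keystroke-Level Model (KLM), which sums published per-operator times. The operator values are the commonly cited Card, Moran and Newell figures, and the task sequence below is a hypothetical interaction with an interface, not one of the paper's DOMUS tasks:

```python
# Commonly cited KLM operator times in seconds (Card, Moran & Newell).
KLM = {"K": 0.28,  # press a key or button
       "P": 1.10,  # point with a mouse to a target
       "H": 0.40,  # home hands between keyboard and mouse
       "M": 1.35}  # mental act of preparation

def predict_time(ops):
    """Predicted execution time for a sequence of KLM operators."""
    return sum(KLM[op] for op in ops)

# Hypothetical task: think, point at a control, click it, then home
# back to the keyboard and type two keys.
t = predict_time("MPKHKK")
```

Full GOMS and ACT-R models refine this additive picture with goal hierarchies and cognitive-architecture timing, which is where the abstract's "good" versus "excellent" prediction gap comes from.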