Abstract: IETF RFC 2002 originally introduced the Mobile-IP
protocol to support portable IP addresses for mobile devices
that frequently change their network access points to the
Internet. The protocol's inefficiency, mainly in handoff
management, produces large end-to-end packet delays during the
registration process and further degrades system efficiency
through packet losses between subnets. The criterion for
initiating a simple, fast full-duplex connection between the
home agent and the foreign agent, in order to reduce the roaming
duration, is the central issue considered in this paper.
State-transition Petri nets modeling the scenario-based CIA
(communication inter-agents) procedure, an extension of the
basic Mobile-IP registration process, were designed and
analyzed. A heuristic configuration file for the registration
parameters was created during a practical setup session on a
Cisco 1760 router running IOS 12.3(15)T. Finally, stand-alone
performance simulation results from MATLAB Simulink, within each
subnet and between subnets, are presented and show improved
end-to-end packet delays. The results verify the effectiveness
of our Mathcad analytical treatment and experimental
implementation: Mobile-IP with the CIA procedure achieves lower
end-to-end packet delays and improved packet flow between
subnets, reducing inter-subnet packet losses.
Abstract: This paper describes the optimization of a complex
dairy farm simulation model using two quite different methods of
optimization, the genetic algorithm (GA) and the Lipschitz
Branch-and-Bound (LBB) algorithm. These techniques have been
used to improve an agricultural system model developed by Dexcel
Limited, New Zealand, which describes a detailed representation of
pastoral dairying scenarios and contains an 8-dimensional parameter
space. The model incorporates the sub-models of pasture growth and
animal metabolism, which are themselves complex in many cases.
Each evaluation of the objective function, a composite 'Farm
Performance Index (FPI)', requires simulation of at least a one-year
period of farm operation with a daily time-step, and is therefore
computationally expensive. The problem of visualization of the
objective function (response surface) in high-dimensional spaces is
also considered in the context of the farm optimization problem.
Adaptations of the Sammon mapping and parallel coordinates
visualization are described which help visualize some important
properties of the model's output topography. From this study, it is
found that GA requires fewer function evaluations in optimization
than the LBB algorithm.
Abstract: This article explains societal security, continuity scenarios, and a methodological cyclic approach. Organizational challenges in societal security call for the implementation of international standards such as BS 25999-2 and the global ISO 22300 family of standards for business continuity management systems. An efficient global organizational system is distinguished by high entity complexity, connectivity, and interoperability, and in fact its relations are not only cooperative. A competing business faces numerous participating 'enemies', which play apparent or hidden opponent and antagonistic roles against a prosperous organizational system, turning the scene into a crisis or even a battle theatre. Business continuity scenarios are necessary for the preparedness, planning, management, and mastery of such 'a play' in real environments.
Abstract: In this paper we present a general formalism for the
establishment of the family of selective regressor affine projection
algorithms (SR-APA). The SR-APA, the SR regularized APA (SR-RAPA),
the SR partial rank algorithm (SR-PRA), the SR binormalized
data reusing least mean squares (SR-BNDR-LMS), and the SR normalized
LMS with orthogonal correction factors (SR-NLMS-OCF)
algorithms are established by this general formalism. We demonstrate
the performance of the presented algorithms through simulations
in an acoustic echo cancellation scenario.
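The selective-regressor idea can be illustrated with a toy sketch: keep a short window of recent regressors and update the filter only with the one that currently gives the largest a priori error. This is a simplified stand-in for the paper's SR-APA family (a normalized-LMS-style update rather than a full affine projection); all names and parameter values are illustrative assumptions.

```python
import random

def sr_nlms(x, d, taps=4, mu=0.5, K=3, eps=1e-8):
    """Toy selective-regressor NLMS: keep the K most recent
    (regressor, desired) pairs and update only with the pair that
    currently has the largest a priori error (illustrative
    stand-in for the SR-APA family)."""
    w = [0.0] * taps
    history = []
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]          # current regressor
        history = (history + [(u, d[n])])[-K:]   # window of K pairs
        def apriori_error(pair):
            reg, des = pair
            return abs(des - sum(wi * ri for wi, ri in zip(w, reg)))
        reg, des = max(history, key=apriori_error)  # select regressor
        e = des - sum(wi * ri for wi, ri in zip(w, reg))
        norm = sum(ri * ri for ri in reg) + eps
        w = [wi + mu * e * ri / norm for wi, ri in zip(w, reg)]
    return w

random.seed(0)
h = [0.6, -0.3, 0.2, 0.1]                        # unknown FIR system
x = [random.gauss(0, 1) for _ in range(2000)]
d = [sum(h[k] * x[n - k] for k in range(len(h)) if n - k >= 0)
     for n in range(len(x))]
w = sr_nlms(x, d)
print([round(wi, 3) for wi in w])
```

With a noiseless linear system and white input, the estimated weights converge to the unknown impulse response.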
Abstract: In this paper, we investigate the strategic stochastic air traffic flow management problem, which seeks to balance airspace capacity and demand under weather disruptions. The goal is to reduce the need for myopic tactical decisions that do not account for probabilistic knowledge of the near-future states of the National Airspace System (NAS). We present and discuss a scenario-based modeling approach built on a time-space stochastic process to depict weather disruption occurrences in the NAS. A solution framework is also proposed, along with a distributed implementation aimed at overcoming scalability problems. Issues related to this implementation are also discussed.
Abstract: Smart Dust particles are small smart materials used for generating weather maps. We investigate the question of the optimal number of Smart Dust particles necessary for generating precise, computationally feasible, and cost-effective 3-D weather maps. We also give an optimal matching algorithm for the generalized scenario in which there are N Smart Dust particles and M ground receivers.
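The paper's matching algorithm is not reproduced here, but the underlying assignment problem can be sketched with a brute-force baseline that pairs each of the M receivers with a distinct particle so that the total distance is minimized (feasible only for small N and M; a Hungarian-method solver would scale better; all coordinates are made-up examples):

```python
from itertools import permutations
from math import dist

def match(particles, receivers):
    """Exhaustively find the minimum-total-distance assignment of
    each receiver to a distinct particle. Assumes
    len(receivers) <= len(particles); cost is O(N!/(N-M)!)."""
    best_cost, best = float("inf"), None
    for perm in permutations(range(len(particles)), len(receivers)):
        cost = sum(dist(receivers[j], particles[perm[j]])
                   for j in range(len(receivers)))
        if cost < best_cost:
            best_cost, best = cost, perm
    return best, best_cost

particles = [(0, 0), (5, 5), (9, 1)]   # N = 3 particle positions
receivers = [(8, 2), (1, 1)]           # M = 2 receiver positions
assignment, cost = match(particles, receivers)
print(assignment, round(cost, 3))      # → (2, 0) 2.828
```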
Abstract: When reconstructing a scenario, it is necessary to
know the structure of the elements present in the scene in order
to interpret it. In this work we link 3D scene reconstruction to
evolutionary algorithms through stereo vision theory. We
consider stereo vision as a method that reconstructs a scene
from only a pair of images of it and some computation. Through
several images of a scene, captured from different positions,
stereo vision can give us an idea of the three-dimensional
characteristics of the world. Stereo vision usually requires two
cameras, by analogy with the mammalian visual system. In this
work we employ only one camera, which is translated along a path
and captures images at fixed intervals. Since we cannot perform
all the computations required for an exhaustive reconstruction,
we employ an evolutionary algorithm to partially reconstruct the
scene in real time. The algorithm employed is the fly algorithm,
which evolves "flies" that reconstruct the principal
characteristics of the world according to certain evolutionary
rules.
Abstract: In recent years, the world has witnessed significant work in the field of manufacturing. Special efforts have been made in the implementation of new technologies and management and control systems, among many others, which together have evolved the field. Following this trend, and given the scope of new projects and the need to turn existing flexible concepts into more autonomous and intelligent ones, i.e., to move toward more intelligent manufacturing, the present paper aims to contribute to the analysis and to a few customization issues of a new iCIM 3000 system at the IPSAM. Special emphasis is placed on the material flow problem. Besides a description and analysis of the system and its main parts, we also offer tips on how to define other possible alternative material flow scenarios and a partial analysis of the combinatorial nature of the problem. All of this is related to the use of simulation tools, which are briefly addressed with a special focus on the Witness simulation package. For better comprehension, these elements are supported by a few figures and expressions that help obtain the necessary data. Such data and others will be used in the future when simulating the scenarios in the search for the best material flow configurations.
Abstract: Lately, significant work in the area of Intelligent
Manufacturing has become public, applied mainly within an
industrial context. Special efforts have been made in the
implementation of new technologies and management and control
systems, among many others, which together have evolved the
field. Aware of this, and given the scope of new projects and
the need to turn existing flexible concepts into more autonomous
and intelligent ones, i.e., Intelligent Manufacturing, the
present paper aims to contribute to the design and analysis of
the material flow in systems, cells, or workstations under this
new "intelligent" denomination. Besides offering a conceptual
basis covering some key points to take into account and some
general principles for the design and analysis of the material
flow, it also offers tips on how to define other possible
alternative material flow scenarios and a classification of the
states of a system, cell, or workstation. All of this is related
to the use of simulation tools, which are briefly addressed with
a special focus on the Witness simulation package. For better
comprehension, these elements are supported by a detailed
layout, other figures, and a few expressions that could help
obtain the necessary data. Such data and others will be used in
the future when simulating the scenarios in the search for the
best material flow configurations.
Abstract: In this paper a stochastic scenario-based model predictive control scheme applied to molten salt storage systems in a concentrated solar tower power plant is presented. The main goal of this study is to build a tool for analyzing current and expected future resources in order to evaluate the weekly power to be offered on the electricity secondary market. This tool will allow the plant operator to maximize profits while hedging against the impact on the system of stochastic variables such as resource or sunlight shortage.
Solving the problem first requires a mixed logic dynamic model of the plant. The two stochastic variables, the incoming sunlight energy and the electricity demand from the secondary market, are modeled by least-squares regression. Robustness is achieved by drawing a number of random realizations of these variables and applying the most restrictive one to the system. This scenario-based control technique provides the plant operator with a confidence interval containing a given percentage of the possible realizations of the stochastic variables, such that robust control is always achieved within its bounds. The results obtained from many trajectory simulations show the existence of a 'reliable' interval, which experimentally confirms the robustness of the algorithm.
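The robustness step (draw a number of random realizations of an uncertain quantity and apply the most restrictive one) can be sketched as follows; the one-dimensional sunlight-energy model and all numeric values are illustrative assumptions, not the paper's plant model:

```python
import random

def robust_bound(sample_scenario, n_scenarios, seed=1):
    """Scenario-approach sketch: draw n random realizations of an
    uncertain quantity and keep the most restrictive (worst-case)
    one; a constraint enforced against this bound holds for all
    drawn scenarios by construction."""
    rng = random.Random(seed)
    draws = [sample_scenario(rng) for _ in range(n_scenarios)]
    return min(draws)  # least available energy = most restrictive

# hypothetical daily sunlight-energy model (MWh), truncated at zero
def sunlight(rng):
    return max(0.0, rng.gauss(10.0, 3.0))

worst = robust_bound(sunlight, 200)
committed = 0.8 * worst  # commit power only up to the worst case
print(round(worst, 2), round(committed, 2))
```

Scenario-approach theory relates the number of drawn realizations to the probability that a new, unseen realization violates the worst-case bound, which is what yields the confidence interval mentioned above.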
Abstract: Planning the transition period for the adoption of
alternative fuel-technology powertrains is a challenging task that
requires sophisticated analysis tools. In this study, a system
dynamics approach was applied to analyze the bi-directional
interaction between the development of the refueling-station
network and vehicle sales. In addition, the developed model was
used to estimate the
transition cost to reach a predefined target (share of alternative fuel
vehicles) in different scenarios. Several scenarios have been analyzed
to investigate the effectiveness and cost of incentives on the initial
price of vehicles, and on the evolution of fuel and refueling stations.
Obtained results show that a combined set of incentives is more
effective than any single specific type of incentive.
Abstract: Evolutionary Algorithms are population-based,
stochastic search techniques, widely used as efficient global
optimizers. However, many real-life optimization problems often
require finding optimal solutions to complex, high-dimensional,
multimodal problems involving computationally very expensive
fitness function evaluations. Use of evolutionary algorithms in such
problem domains is thus practically prohibitive. An attractive
alternative is to build meta models or use an approximation of the
actual fitness functions to be evaluated. These meta models are
orders of magnitude cheaper to evaluate than the actual function
evaluation. Many regression and interpolation tools are available to
build such meta models. This paper briefly discusses the
architectures and use of such meta-modeling tools in an evolutionary
optimization context. We further present two evolutionary algorithm
frameworks which involve use of meta models for fitness function
evaluation. The first framework, namely the Dynamic Approximate
Fitness based Hybrid EA (DAFHEA) model [14] reduces
computation time by controlled use of meta-models (in this case
approximate model generated by Support Vector Machine
regression) to partially replace the actual function evaluation by
approximate function evaluation. However, the underlying
assumption in DAFHEA is that the training samples for the
meta-model are generated from a single uniform model. This does not take
into account uncertain scenarios involving noisy fitness functions.
The second model, DAFHEA-II, an enhanced version of the original
DAFHEA framework, incorporates a multiple-model based learning
approach for the support vector machine approximator to handle
noisy functions [15]. Empirical results obtained by evaluating the
frameworks using several benchmark functions demonstrate their
efficiency.
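The controlled use of a meta-model can be sketched in a few lines; here a nearest-neighbour interpolator stands in for DAFHEA's SVM-regression surrogate, and the schedule of exact evaluations, the test function, and all parameters are illustrative assumptions:

```python
import random

def expensive_fitness(x):                 # stand-in for a costly simulation
    return (x - 2.0) ** 2 + 1.0

def surrogate(archive, x):
    """Cheap meta-model: predict fitness from the nearest exactly
    evaluated point (toy stand-in for an SVM-regression surrogate)."""
    return min(archive, key=lambda p: abs(p[0] - x))[1]

rng = random.Random(3)
pop = [rng.uniform(-10.0, 10.0) for _ in range(20)]
archive = []                              # (x, exact fitness) pairs
for gen in range(60):
    scored = []
    for x in pop:
        if gen % 5 == 0 or not archive:   # periodic controlled exact evals
            f = expensive_fitness(x)
            archive.append((x, f))
        else:
            f = surrogate(archive, x)     # cheap approximate evaluation
        scored.append((f, x))
    scored.sort()
    parents = [x for _, x in scored[:5]]  # truncation selection
    pop = [rng.choice(parents) + rng.gauss(0.0, 0.5) for _ in range(20)]

best = min(archive, key=lambda p: p[1])
print(round(best[0], 2), round(best[1], 2))
```

The key saving is that only every fifth generation pays for exact evaluations; the rest are served by the archive-backed meta-model, mirroring DAFHEA's controlled partial replacement of the true fitness function.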
Abstract: The liberalization and privatization processes have
forced public utility companies to face new competitive challenges,
implementing strategies to gain market share and, at the same time,
keep the old customers. To this end, many companies have carried
out mergers, acquisitions and conglomerations in order to diversify
their business. This paper focuses on companies operating in the free
energy market in Italy. In the last decade, this sector has undergone
profound changes that have radically changed the competitive
scenario and have led companies to implement business
diversification strategies. Our work aims to evaluate the
economic and financial performance achieved by energy companies
since the beginning of the liberalization process and to verify
its possible relationship with the diversification strategies
implemented.
Abstract: In this study, we explore the use of information for inventory decisions in a healthcare organization (HO). We consider the scenario in which the HO can make use of information collected from correlated products to enhance its inventory planning. Motivated by our real-world observation that HOs adopt RFID and bar-coding systems for information collection, we examine the effectiveness of these systems for inventory planning with Bayesian information updating. We derive the optimal ordering decision and study the issue of Pareto improvement in the supply chain. Our analysis demonstrates that an RFID system will outperform a bar-coding system when the RFID installation cost and tag cost fall to a level comparable with that of the bar-coding system. We also show how an appropriately set wholesale pricing contract can achieve Pareto improvement in the HO supply chain.
Abstract: The scenario of bypass transition is generally described
as follows: the low-frequency disturbances in the free-stream may
generate long stream-wise streaks in the boundary layer, which later
may trigger secondary instability, leading to rapid increase of
high-frequency disturbances. Then possibly turbulent spots emerge,
and through their merging, lead to fully developed turbulence. This
description, however, is insufficient in that it does not reveal
the inherent mechanism of transition: during the transition, a
large number of waves with different frequencies and wavenumbers
appear almost simultaneously, producing a Reynolds stress large
enough that the mean flow profile can change rapidly from
laminar to turbulent. In this paper, such a mechanism is
identified by analyzing DNS data of the transition.
Abstract: The availability of high-dimensional biological datasets, such as those from gene expression, proteomic, and metabolic experiments, can be leveraged for the diagnosis and prognosis of diseases. Many classification methods in this area have been studied to predict disease states and to separate predefined classes, such as patients with a particular disease versus healthy controls. However, most of the existing research focuses on a specific dataset; there is a lack of generic comparison between classifiers that might provide a guideline for biologists or bioinformaticians selecting the proper algorithm for new datasets. In this study, we compare the performance of popular classifiers, namely Support Vector Machine (SVM), Logistic Regression, k-Nearest Neighbor (k-NN), Naive Bayes, Decision Tree, and Random Forest, on mock datasets. We mimic common biological scenarios by simulating various proportions of truly discriminating biomarkers and different effect sizes thereof. The results show that SVM performs quite stably and reaches a higher AUC than the other methods, which may be explained by SVM's ability to minimize the probability of error. Moreover, Decision Tree, with its good applicability for diagnosis and prognosis, shows good performance in our experimental setup. Logistic Regression and Random Forest, however, depend strongly on the ratio of discriminators and perform better with a higher number of discriminators.
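The AUC used for such comparisons can be computed directly from classifier scores via the Mann-Whitney identity; this small sketch is illustrative only (the scores are made up, not from the paper's datasets):

```python
def auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic: the probability that
    a randomly chosen positive scores higher than a randomly
    chosen negative (ties count 1/2)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

pos = [0.9, 0.8, 0.6, 0.4]   # classifier scores for true positives
neg = [0.7, 0.5, 0.3, 0.2]   # classifier scores for true negatives
print(auc(pos, neg))          # → 0.8125
```

A value of 1.0 means the classifier ranks every positive above every negative; 0.5 is chance level, which is why AUC is a convenient scale for comparing classifiers across mock datasets.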
Abstract: Software testing is an important stage of the software development cycle. The current testing process involves a tester and electronic documents containing test case scenarios. In this paper we focus on a new approach to the testing process that uses automated test case generation and tester guidance based on a model of the system. Test case generation and model-based testing are not possible without a proper system model. We aim to provide better feedback from the testing process, thus eliminating unnecessary paperwork.
Abstract: A wireless ad-hoc network consists of wireless nodes
communicating without the need for a centralized administration,
in which all nodes potentially contribute to the routing
process. In this
paper, we report the simulation results of four different scenarios for
wireless ad hoc networks having thirty nodes. The performances of
proposed networks are evaluated in terms of number of hops per
route, delay and throughput with the help of OPNET simulator.
Channel speed of 1 Mbps and a simulation time of 600 sim-seconds
were used for all scenarios. The DSR routing protocol was used
for the analysis. The throughputs obtained from the four
scenarios are compared in Figure 3. The average media access
delays at node_20 for two routes and at node_20 for the four
different scenarios are compared in Figures 4 and 5. It is
observed that throughput degrades when packets follow different
hops for the same source-destination pair (i.e., it dropped from
1.55 Mbps to 1.43 Mbps, a decrease of about 7.7%, and then to
0.48 Mbps, about a third of the previous value).