Abstract: The first generation of Mobile Agent-based Intrusion Detection Systems had only two components, namely data collection and a single centralized analyzer. The disadvantage of this design is that if the connection to the analyzer fails, the entire system becomes useless. In this work, we propose a novel hybrid model for a Mobile Agent-based Distributed Intrusion Detection System to overcome this problem. The proposed model offers new features such as robustness, the capability of detecting intrusions against the IDS itself, and the capability of updating itself to detect new patterns of intrusion. In addition, our proposed model tackles some of the weaknesses of centralized Intrusion Detection System models.
Abstract: The rapid enlargement and physical development of cities have given rise to a number of urban crises and a decline in environmental quality. Consequently, the need to address the concept of quality and its improvement in urban environments, alongside quantitative issues, is clearly recognized. Within urban thought, the importance of these issues is evident not only with regard to sustainable development concepts and the improvement of public environmental quality, but also for the enhancement of social and behavioral models.
The major concern of the present article is to study the nature of urban environmental quality in urban development plans, which matters not only for the concept and aim of projects but also for their execution. Accordingly, this paper utilizes the planning capacities arising from environmental virtues in the planning procedure of the Moft Abad neighborhood. As a first step, the Analytical Hierarchy Process (AHP) is applied to assess quantitative environmental issues. The present conditions of Moft Abad indicate that the neighborhood generally suffers from a lack of qualitative parameters, and that previously adopted planning procedures could not follow the sustainable development paths aimed at environmental quality. The diminution of economic and environmental virtues has resulted in the diminution of residential and social virtues. Therefore, in order to enhance environmental quality in Moft Abad, the present paper proposes plans to create a safe, healthy, and lively neighborhood.
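The abstract's AHP step can be illustrated with a minimal sketch of deriving priority weights from a pairwise comparison matrix. The geometric-mean approximation to the principal eigenvector is used here, and the three criteria and comparison values are purely illustrative assumptions, not the paper's data.

```python
import math

def ahp_weights(matrix):
    """Approximate the AHP priority vector of a pairwise comparison
    matrix using the geometric-mean (row geometric mean) method."""
    n = len(matrix)
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo)
    return [g / total for g in geo]

# Hypothetical pairwise comparisons among three environmental criteria
# (e.g. safety vs. health vs. liveliness) -- values are illustrative only.
pairwise = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 0.5, 1.0],
]
weights = ahp_weights(pairwise)  # normalized priorities, sum to 1
```

In a full AHP application one would also check the consistency ratio of the comparison matrix before trusting the weights.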
Abstract: Zero-inflated models are commonly used to model count data with excess zeros, where the excess zeros may be structural zeros or zeros that occur by chance. Such data are found in various disciplines including finance, insurance, biomedicine, econometrics, ecology, and the health sciences, for example dental and sexual health epidemiology. The most popular zero-inflated models are the zero-inflated Poisson and zero-inflated negative binomial models. In addition, zero-inflated generalized Poisson and zero-inflated double Poisson models are discussed in some of the literature. Recently, the zero-inflated inverse trinomial and zero-inflated strict arcsine models have been advocated and shown to serve as alternative models for overdispersed count data caused by excessive zeros and unobserved heterogeneity. The purpose of this paper is to review the related literature and provide a variety of examples of the application of zero-inflated models in different disciplines. Different model selection methods used in model comparison are also discussed.
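The zero-inflated Poisson model the abstract names has a simple probability mass function: a point mass pi at zero mixed with a Poisson(lambda) component. A minimal sketch (parameter values illustrative):

```python
import math

def zip_pmf(k, lam, pi):
    """P(Y = k) under a zero-inflated Poisson: with probability pi the
    count is a structural zero; otherwise it is Poisson(lam)."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    if k == 0:
        return pi + (1 - pi) * poisson
    return (1 - pi) * poisson

# With pi = 0.3 structural zeros, P(Y=0) is much larger than under a
# plain Poisson(2), which is exactly the "excess zeros" being modeled.
p_zero = zip_pmf(0, lam=2.0, pi=0.3)
```

Fitting pi and lambda is typically done by maximum likelihood (e.g. EM), which this sketch omits.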
Abstract: A nonlinear finite element method with eight-noded isoparametric quadrilateral elements is used to predict the load-deformation behavior, including the bearing capacity, of foundations. A modified generalized plasticity model with a non-associated flow rule is applied for the analysis of the soil-footing system. The Von Mises and Tresca criteria are also used to simulate soil behavior. The modified generalized plasticity model is able to simulate load-deformation behavior, including softening. Localization phenomena are examined using different meshes, but have not been observed in the examples. Predictions by the modified generalized plasticity model show good agreement with laboratory data and theoretical predictions in comparison with the other models.
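The Von Mises and Tresca criteria mentioned above are standard yield conditions; a small sketch in terms of principal stresses, independent of the paper's finite element implementation:

```python
def von_mises(s1, s2, s3):
    """Von Mises equivalent stress from principal stresses:
    sqrt(0.5 * ((s1-s2)^2 + (s2-s3)^2 + (s3-s1)^2))."""
    return (0.5 * ((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2)) ** 0.5

def tresca(s1, s2, s3):
    """Tresca equivalent stress: the maximum principal stress difference."""
    return max(s1, s2, s3) - min(s1, s2, s3)

def has_yielded(equivalent_stress, sigma_y):
    """A stress state yields once the equivalent stress reaches the
    yield strength sigma_y."""
    return equivalent_stress >= sigma_y
```

For uniaxial tension both criteria coincide; they differ most in pure shear, where Tresca is the more conservative of the two.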
Abstract: The radiative exchange method is introduced as a numerical method for simulating radiative heat transfer in an absorbing, emitting, and isotropically scattering medium. In this method, the integro-differential radiative balance equation is solved using a newly introduced concept of the exchange factor. Even though the radiative source term is calculated on a mesh coarser than that used in computational fluid dynamics, calculating the exchange factor between different coarse elements via differential integration elements brings the results of the method close to those of the integro-differential radiative equation. A set of equations for calculating exchange factors in two- and three-dimensional Cartesian coordinate systems is presented, and the method is applied to the simulation of radiative heat transfer in a two-dimensional rectangular case and a three-dimensional simple cube. The results of using this method in different cases are verified by comparison with those of other numerical radiative models.
Abstract: A 7-step method (with 25 sub-steps) for assessing the risk of air pollutants is introduced. These steps are: pre-considerations, sampling, statistical analysis, exposure matrix and likelihood, dose-response matrix and likelihood, total risk evaluation, and discussion of findings. All of these terms are well understood; however, almost all of the steps have been modified, improved, and coupled in such a way that a comprehensive method results. Accordingly, SADRA (Statistical Analysis-Driven Risk Assessment) emphasizes the extensive and ongoing application of analytical statistics in traditional risk assessment models. A sulfur dioxide case study validates the claim and provides a good illustration of the method.
Abstract: A new paradigm for software design and development models software by its business process, translates the model into a process execution language, and has it run by a supporting execution engine. This process-oriented paradigm promotes modeling of software by less technical users or business analysts as well as rapid development. Since business process models may be shared by different organizations and sometimes even by different business domains, it is interesting to apply a technique used in traditional software component technology to design reusable business processes. This paper discusses an approach to apply a technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, with the aim that the process components can be reused in different process-based software models. The approach is quantitative because the quality of process component design is measured from technical features of the process components. The approach is also strategic because the measured quality is evaluated against business-oriented component management goals. A software tool has been developed to measure how good a process component design is according to the required managerial goals, compared to other designs. We also discuss how we benefit from reusable process components.
Abstract: The MABENA model is a complement to traditional models such as HCMS and CMS. The new factors that affect the preparation of strategic plans, and their sequential order in the MABENA model, form the platform of the road map presented in this paper. A review of the literature shows that factors such as emerging critical success factors for strategic planning, the improvement of international strategic models, the increasing maturity of companies, and emerging new needs call for a new model that can accommodate the new critical factors and resolve the limitations of previous strategic management models. The preparation of a strategic plan requires more factors than those introduced in traditional models, including determining future critical success factors and competencies, defining key processes, determining the maturity of those processes, and considering all aspects of the external environment. Describing these requirements, their outcomes, and their order develops the MABENA model's road map presented in this paper. The study presents a road map for the strategic planning of Iranian organizations.
Abstract: Wind power is among the most actively developing distributed generation (DG) technologies. The majority of wind power based DG technologies employ wind turbine induction generators (WTIG) instead of synchronous generators, for technical advantages such as reduced size, increased robustness, lower cost, and increased electromechanical damping. However, dynamic changes in wind speed make the amount of active/reactive power injected into or drawn from a WTIG-embedded distribution network highly variable. This paper analyzes the effect of wind speed changes on the active and reactive power penetration of a wind energy embedded distribution network. Four types of wind speed change, namely constant, linear, gust, and random, are considered in the analysis. The study is carried out by three-phase, non-linear, dynamic simulation of the distribution system component models. Results obtained from the investigation are presented and discussed.
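The four wind-speed change types named in the abstract can be sketched as simple time profiles. All parameter values below are illustrative assumptions, and the gust is modeled with the common 1-cosine shape, which may differ from the paper's exact profile.

```python
import math
import random

def wind_speed(t, kind, base=10.0, t_start=2.0, t_end=8.0):
    """Illustrative wind-speed profiles (m/s) at time t (s).
    Amplitudes, timings, and the gust shape are assumptions."""
    if kind == "constant":
        return base
    if kind == "linear":                  # ramp by +3 m/s over [t_start, t_end]
        if t <= t_start:
            return base
        if t >= t_end:
            return base + 3.0
        return base + 3.0 * (t - t_start) / (t_end - t_start)
    if kind == "gust":                    # 1-cosine gust, 4 m/s peak amplitude
        if t_start <= t <= t_end:
            frac = (t - t_start) / (t_end - t_start)
            return base + 2.0 * (1.0 - math.cos(2.0 * math.pi * frac))
        return base
    if kind == "random":                  # base plus bounded random turbulence
        return base + random.uniform(-1.0, 1.0)
    raise ValueError(f"unknown wind profile: {kind}")
```

Feeding such profiles into the WTIG model is what makes the injected active/reactive power time-varying in the kind of study described.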
Abstract: Stick models are widely used in preliminary studies of the behaviour of straight as well as skew bridges and viaducts subjected to earthquakes. The application of such models to highly curved bridges continues to pose challenging problems. A viaduct proposed in the foothills of the Himalayas in Northern India is chosen for the study. It has 8 simply supported spans at 30 m c/c, is doubly curved in the horizontal plane with a 20 m radius, and is inclined in the vertical plane as well. The superstructure consists of a box section. Three models have been used: a conventional stick model, an improved stick model, and a 3D finite element model. The improved stick model employs body constraints in order to study its capabilities. The first 8 frequencies differ by about 9.71% between the latter two models; the difference increases to 80% by the 50th mode. The viaduct was subjected to all three components of the El Centro earthquake of May 1940. The numerical integration was carried out using the Hilber-Hughes-Taylor method as implemented in SAP2000. Axial forces and moments in the bridge piers, as well as lateral displacements at the bearing levels, are compared for the three models. The axial forces, bending moments, and displacements differ by up to 25% between the improved stick model and the finite element model, whereas they differ by up to 35% in various sections between the improved stick model and the equivalent straight stick model. The difference for the torsional moment was as high as 75%. It is concluded that a stick model with body constraints for modelling the bearings and expansion joints is not desirable in very sharply S-curved viaducts, even for preliminary analysis. Such a model can be used only to determine the first 10 frequencies and mode shapes, not member forces. A 3D finite element analysis must be carried out for meaningful results.
Abstract: The major part of a lightweight timber construction consists of insulation. Mineral wool is the most commonly used insulation due to its cost efficiency and easy handling. The fiber orientation and porosity of this insulation material enable flow-through, so its air flow resistance is low. If leakage occurs in the insulated bay section, the convective flow may cause energy losses and infiltration of the exterior wall with moisture and particles. In particular, the infiltrated moisture may lead to thermal bridges and the growth of health-endangering mould and mildew. In order to prevent this problem, different numerical calculation models have been developed, all of which leave room for completion. Implementing the flow-through properties of mineral wool insulation may help to improve the existing models. Assuming that the real pressure difference between the interior and exterior surfaces is larger than the pressure difference prescribed in the standard test procedure for mineral wool, ISO 9053 / EN 29053, measurements were performed using the measurement setup for research on convective moisture transfer (MSRCMT).
These measurements show that structural inhomogeneities of mineral wool affect the permeability only at higher pressure differences, as applied in the MSRCMT. Additional microscopic investigations show that the location of a leak within the construction has a crucial influence on the air flow-through and the infiltration rate. The results clearly indicate that the empirical values for the acoustic resistance of mineral wool should not be used for the calculation of convective transfer mechanisms.
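The airflow resistance quantities at issue in the ISO 9053 / EN 29053 test follow directly from a steady-flow measurement. A minimal sketch using the standard's usual definitions (resistance R = dp/qv, resistivity r = R*A/d); the numerical values are illustrative, not measurement data:

```python
def airflow_resistance(delta_p, q_v):
    """Airflow resistance R = dp / qv in Pa*s/m^3, from the pressure
    drop dp (Pa) across the sample and the volume flow qv (m^3/s)."""
    return delta_p / q_v

def airflow_resistivity(delta_p, q_v, area, thickness):
    """Airflow resistivity r = (dp * A) / (qv * d) in Pa*s/m^2,
    normalizing R by sample cross-section A (m^2) and thickness d (m)."""
    return airflow_resistance(delta_p, q_v) * area / thickness

# Illustrative reading: 2 Pa drop, 1e-4 m^3/s flow through a 0.01 m^2,
# 50 mm thick mineral wool sample.
r = airflow_resistivity(2.0, 1e-4, 0.01, 0.05)
```

The abstract's point is that such low-pressure acoustic values extrapolate poorly to the larger pressure differences driving convective moisture transfer.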
Abstract: In this study we investigate the effects of silica nanoparticles (SiO2-NPs) on the structure and phase properties of supported lipid monolayers and bilayers, coupling surface pressure measurements, fluorescence microscopy, and atomic force microscopy. SiO2-NPs ranging from 10 nm to 100 nm in diameter are tested. Our results suggest, first, that the organization of the lipid molecules depends on their nature. Second, lipid molecules in the vicinity of large nanoparticle aggregates organize in the liquid-condensed phase, whereas small aggregates are localized in both the fluid liquid-expanded (LE) and the liquid-condensed (LC) phases. We also demonstrate by atomic force microscopy that, by measuring friction forces, it is possible to determine whether or not nanoparticle aggregates are covered by lipid monolayers and bilayers.
Abstract: The purpose of this study is to predict the collision frequency on horizontal tangents combined with vertical curves using artificial neural network methods. The proposed ANN models are compared with existing regression models. First, the variables that affect collision frequency were investigated; only the annual average daily traffic, section length, access density, rate of vertical curvature, and the smaller curve radius before and after the tangent were found to be statistically significant for the relevant combinations. Second, three statistical models (negative binomial, zero-inflated Poisson, and zero-inflated negative binomial) were developed using the significant variables for the three alignment combinations. Third, ANN models were developed by applying the same variables for each combination. The results clearly show that the ANN models have lower mean square error values than the statistical models. Similarly, the AIC values of the ANN models are smaller than those of the regression models for all combinations. Consequently, the ANN models have better statistical performance than the statistical models for estimating collision frequency. The ANN models presented in this paper are recommended for evaluating the safety impacts of 3D alignment elements on horizontal tangents.
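The AIC comparison used above follows the generic definition AIC = 2k - 2 ln L, with the lowest value preferred. A minimal sketch; the model names mirror the abstract, but the log-likelihoods and parameter counts are illustrative, not the paper's results:

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: AIC = 2k - 2 ln L (lower is better).
    Penalizes fit quality by the number of free parameters k."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fitted models with assumed log-likelihoods and sizes.
candidates = {
    "negative_binomial": aic(-410.2, 6),
    "zero_inflated_poisson": aic(-405.9, 7),
    "ann": aic(-398.4, 9),
}
best = min(candidates, key=candidates.get)
```

Note that comparing AIC across an ANN and a count-data regression requires a likelihood for the ANN's errors, an assumption the sketch leaves implicit.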
Abstract: The purpose of this paper is to demonstrate the ability
of a genetic programming (GP) algorithm to evolve a team of data
classification models. The GP algorithm used in this work is
"multigene" in nature, i.e. there are multiple tree structures (genes)
that are used to represent team members. Each team member assigns
a data sample to one of a fixed set of output classes. A majority vote,
determined using the mode (highest occurrence) of classes predicted
by the individual genes, is used to determine the final class
prediction. The algorithm is tested on a binary classification problem.
For the case study investigated, compact classification models are
obtained with comparable accuracy to alternative approaches.
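The team's voting rule described above, taking the mode of the classes predicted by the individual genes, can be sketched directly; the vote values are hypothetical:

```python
from collections import Counter

def team_predict(gene_outputs):
    """Final class = mode (most frequent class) of the genes' predictions.
    Counter.most_common breaks ties by first-seen order."""
    return Counter(gene_outputs).most_common(1)[0][0]

# Five hypothetical team members voting on a binary problem.
votes = [1, 0, 1, 1, 0]
final = team_predict(votes)   # -> 1, the majority class
```

In the multigene GP setting, each entry of `votes` would come from evaluating one gene (tree) on the same data sample.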
Abstract: This paper presents a parametric performance-based design model for optimizing hospital design. The design model operates with geometric input parameters defining the functional requirements of the hospital, and with input parameters in terms of performance objectives defining the design requirements and preferences of the hospital with respect to performance. The design model takes as its point of departure the hospital's functionalities, expressed as a set of defined parameters and rules describing the design requirements and preferences.
Abstract: In this paper, we evaluate the performance of several wavelet-based coding algorithms, namely 3D QT-L, 3D SPIHT, and JPEG2K. In the first step we carry out an objective comparison between the three coders. For this purpose, eight MRI head scan test sets of 256 x 256 x 124 voxels have been used. Results show the superior performance of the 3D SPIHT algorithm, while 3D QT-L outperforms JPEG2K. The second step consists of evaluating the robustness of the 3D SPIHT and JPEG2K coding algorithms over wireless transmission. The compressed datasets are transmitted over an AWGN or a Rayleigh wireless channel. Results show the superiority of JPEG2K under both channel models; JPEG2K proves more robust to coding errors. We thus conclude that error-correcting codes are necessary to protect the transmitted medical information.
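The AWGN channel evaluation described above can be illustrated with a generic Monte-Carlo sketch: BPSK symbols plus Gaussian noise, counting bit errors. This is a stand-in for the paper's full transmission chain (no source coding, no Rayleigh fading); all parameters are assumptions.

```python
import math
import random

def awgn_ber(snr_db, n_bits=20000, seed=42):
    """Estimate the bit-error rate of BPSK over an AWGN channel by
    Monte-Carlo simulation at a given Eb/N0 in dB."""
    rng = random.Random(seed)
    snr = 10 ** (snr_db / 10)
    sigma = math.sqrt(1 / (2 * snr))       # noise std for unit-energy symbols
    errors = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        symbol = 1.0 if bit else -1.0      # BPSK mapping: 1 -> +1, 0 -> -1
        received = symbol + rng.gauss(0, sigma)
        if (received > 0) != bool(bit):    # hard-decision detection
            errors += 1
    return errors / n_bits

ber_noisy = awgn_ber(0.0)    # 0 dB: noticeable error rate
ber_clean = awgn_ber(10.0)   # 10 dB: far fewer errors
```

In a robustness study like the abstract's, such channel errors would be injected into the compressed bitstream before decoding, which is exactly where embedded coders like SPIHT are fragile.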
Abstract: In many data mining applications, it is a priori known
that the target function should satisfy certain constraints imposed
by, for example, economic theory or a human-decision maker. In this
paper we consider partially monotone prediction problems, where the
target variable depends monotonically on some of the input variables
but not on all. We propose a novel method to construct prediction models, where monotone dependencies with respect to some of the input variables are preserved by construction. Our
method belongs to the class of mixture models. The basic idea is to
convolute monotone neural networks with weight (kernel) functions
to make predictions. By using simulation and real case studies,
we demonstrate the application of our method. To obtain a sound assessment of the performance of our approach, we use standard
neural networks with weight decay and partially monotone linear
models as benchmark methods for comparison. The results show that
our approach outperforms partially monotone linear models in terms
of accuracy. Furthermore, the incorporation of partial monotonicity
constraints not only leads to models that are in accordance with the
decision maker's expertise, but also reduces considerably the model
variance in comparison to standard neural networks with weight
decay.
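A standard way to make a network monotone in selected inputs by construction, in the spirit of the monotone networks described above, is to constrain the weights on the monotone path to be positive (here via exp) while using an increasing activation. This sketch is an illustration of that general device, not the paper's exact architecture; all parameter values are random placeholders.

```python
import math
import random

def monotone_net(x_mono, x_free, params):
    """Single-hidden-layer net guaranteed nondecreasing in x_mono:
    every weight on the x_mono path is exp()-positive and tanh is
    increasing, so the composition is monotone in x_mono."""
    w_mono, w_free, bias, w_out = params
    out = 0.0
    for j in range(len(w_out)):
        h = math.tanh(math.exp(w_mono[j]) * x_mono
                      + w_free[j] * x_free + bias[j])
        out += math.exp(w_out[j]) * h
    return out

# Random illustrative parameters for a 4-unit hidden layer.
rng = random.Random(0)
params = tuple([rng.gauss(0, 1) for _ in range(4)] for _ in range(4))
```

Training such a network is unconstrained in the raw parameters; the exp() reparameterization is what enforces the partial monotonicity.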
Abstract: The purpose of this research is to establish the experimental conditions for the removal of Cibacron Brilliant Yellow 3G-P dye (CBY) from aqueous solutions by sorption onto coffee husks as a low-cost sorbent. The effects of various experimental parameters (e.g. initial CBY dye concentration, sorbent mass, pH, temperature) were examined and the optimal experimental conditions were determined. The results indicated that the removal of the dye was pH dependent, and at an initial pH of 2 the dye was removed effectively. The CBY dye sorption data were fitted to the Langmuir, Freundlich, Temkin, and Dubinin-Radushkevich equilibrium models. The maximum sorption capacity of CBY dye onto coffee husks increased from 24.04 to 35.04 mg g-1 when the temperature was increased from 293 to 313 K. The calculated sorption thermodynamic parameters, including ΔG°, ΔH°, and ΔS°, indicated that CBY dye sorption onto coffee husks is spontaneous, endothermic, and mainly physical in nature.
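The Langmuir model named in the abstract, q = q_max·K·C/(1 + K·C), is commonly fitted via its linearized form C/q = C/q_max + 1/(K·q_max). A minimal sketch; the synthetic data below reuse the reported 313 K capacity of 35.04 mg/g, but the Langmuir constant K = 0.05 L/mg is an illustrative assumption, not the paper's value:

```python
def langmuir(c, q_max, k):
    """Langmuir isotherm: equilibrium uptake q (mg/g) at concentration
    c (mg/L), with capacity q_max and affinity constant k."""
    return q_max * k * c / (1 + k * c)

def fit_langmuir_linearized(cs, qs):
    """Fit q_max and k by ordinary least squares on the linearized form
    C/q = C/q_max + 1/(k*q_max): slope = 1/q_max, intercept = 1/(k*q_max)."""
    ys = [c / q for c, q in zip(cs, qs)]
    n = len(cs)
    mx, my = sum(cs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(cs, ys))
             / sum((x - mx) ** 2 for x in cs))
    intercept = my - slope * mx
    q_max = 1 / slope
    k = 1 / (intercept * q_max)
    return q_max, k

# Synthetic, noise-free data generated from the assumed parameters.
cs = [10.0, 25.0, 50.0, 100.0, 200.0, 400.0]
qs = [langmuir(c, 35.04, 0.05) for c in cs]
q_max_fit, k_fit = fit_langmuir_linearized(cs, qs)
```

With real scattered data, a nonlinear least-squares fit of the original form is usually preferred over the linearization, whose error weighting is distorted.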
Abstract: Liquidity risk management ranks among the key concepts applied in finance. Liquidity is defined as the capacity to obtain funding when needed, while liquidity risk is the threat to this capacity to generate cash at fair costs. In this paper we present the challenges of liquidity risk management resulting from the 2007-2009 global financial upheaval. We see five main regulatory liquidity risk management issues requiring revision in the coming years: liquidity measurement; intra-day and intra-group liquidity management; contingency planning and liquidity buffers; liquidity systems, controls, and governance; and finally models testing the viability of business liquidity models.
Abstract: Empirical force fields and density functional theory (DFT) were used to study the binding energies and structures of methylamine on the surface of activated carbons (ACs). This is a first step in studying the adsorption of alkyl amines on the surface of functionalized ACs. The force fields used were the Dreiding (DFF), Universal (UFF), and Compass (CFF) models. The generalized gradient approximation with the Perdew-Wang 91 (PW91) functional was used for the DFT calculations. In addition to obtaining the amine-carboxylic acid adsorption energies, the results were used to establish the reliability of the empirical models for these systems. CFF predicted a binding energy of -9.227 kcal/mol, in reasonable agreement with the PW91 value of -13.17 kcal/mol, compared to 0 kcal/mol for DFF and -0.72 kcal/mol for UFF. However, the CFF binding energies for the amine to ester and ketone disagreed with the PW91 results. The structures obtained from all models agreed with the PW91 results.