Abstract: Identifying and classifying intersections according to crash severity is essential for implementing safety countermeasures, and effective models are needed to compare and assess severity. Highway safety organizations have placed intersection safety among their priorities. Despite significant advances in highway safety, large numbers of severe crashes still occur on highways. Investigating the factors that influence crashes enables engineers to carry out calculations aimed at reducing crash severity. Previous studies lacked a model capable of simultaneously representing the influence of human factors, the road, the vehicle, weather conditions, and traffic features, including traffic volume and flow speed, on crash severity. This paper therefore aims to develop models that illustrate the simultaneous influence of these variables on crash severity in urban highways. The models presented in this study were developed as binary logit models and calibrated with SPSS, using the backward regression method to identify the significant variables. From the results obtained, it can be concluded that the main factors increasing crash severity on urban highways are driver age, driving in reverse gear, technical defects of the vehicle, collisions with motorcycles and bicycles, bridges, frontal-impact collisions, frontal-lateral collisions, and multi-vehicle crashes.
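As a hedged illustration of the modelling approach described above (a binary logit with backward elimination on p-values), the following Python sketch uses synthetic data and hypothetical predictor names such as driver_age and frontal_impact; it is not the study's SPSS procedure or data.

```python
# Illustrative sketch only: binary logit model of crash severity with backward
# elimination of non-significant predictors (p > 0.05), on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({
    "driver_age": rng.integers(18, 80, n),          # hypothetical predictors
    "frontal_impact": rng.integers(0, 2, n),
    "technical_defect": rng.integers(0, 2, n),
    "traffic_volume": rng.normal(1000, 200, n),
})
linear = -3 + 0.03 * X["driver_age"] + 1.2 * X["frontal_impact"]
severe = (rng.uniform(size=n) < 1 / (1 + np.exp(-linear))).astype(int)  # 1 = severe crash

predictors = list(X.columns)
while True:
    model = sm.Logit(severe, sm.add_constant(X[predictors])).fit(disp=0)
    pvals = model.pvalues.drop("const")
    if pvals.max() <= 0.05 or len(predictors) == 1:
        break
    predictors.remove(pvals.idxmax())               # drop the least significant variable

print(model.summary())
```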
Abstract: This paper presents a CFD analysis of the flow field
around a thin flat plate of infinite span inclined at 90° to a fluid
stream of infinite extent. Numerical predictions have been compared
to experimental measurements in order to assess the potential of the finite volume code for determining the aerodynamic forces acting on a bluff body immersed in a fluid stream of infinite extent.
Several turbulence models and spatial node distributions have
been tested. Flow field characteristics in the neighborhood of the flat
plate have been investigated, allowing the development of a
preliminary procedure to be used as guidance in selecting the
appropriate grid configuration and the corresponding turbulence
model for the prediction of the flow field over a two-dimensional
vertical flat plate.
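For reference, the aerodynamic force predictions discussed above are conventionally reported in non-dimensional form through the drag coefficient (a standard definition, stated here as background rather than taken from the paper):

$$ C_D = \frac{F_D}{\tfrac{1}{2}\,\rho\, U_\infty^{2}\, A} $$

where $F_D$ is the force on the plate normal to the free stream, $\rho$ the fluid density, $U_\infty$ the free-stream velocity, and $A$ the reference area (chord times unit span in the two-dimensional case).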
Abstract: This paper deals with the punching shear behavior and capacity of flat slabs produced from steel fiber reinforced self-compacting concrete (SFRSCC), analysed with the nonlinear finite element method. Nonlinear finite element analyses of nine slab specimens were carried out using the ANSYS software. A general description of the finite element method and of the theoretical modeling of concrete and reinforcement is presented. The nonlinear finite element analysis program ANSYS is utilized owing to its capability to predict both the response of reinforced concrete slabs in the post-elastic range and the ultimate strength of flat slabs produced from SFRSCC. The finite element analyses were first performed to verify the analytical model used in this research against the experimental test results; a parametric study was then carried out on the effects of the flexural reinforcement ratio, the upper reinforcement ratio, and the steel fiber volume fraction. A comparison between the experimental results and those predicted by the existing models is presented. Results and conclusions that may be useful for designers are presented.
Abstract: Turbulence modeling of large-scale flow over a vegetated surface is complex. Such problems involve large-scale computational domains, while the characteristics of the flow near the surface are also involved. In modeling large-scale flow, surface roughness, including vegetation, is generally taken into account by means of roughness parameters in the modified law of the wall. However, the turbulence structure within the canopy region cannot be captured with this method; instead, a method that applies source/sink terms to model plant drag can be used. These models have been developed and tested intensively, but only with simple surface geometries. This paper aims to compare the use of roughness parameters and of additional source/sink terms in modeling the effect of plant drag on wind flow over a complex vegetated surface. The RNG k-ε turbulence model with the non-equilibrium wall function was tested in both cases. In addition, the k-ω turbulence model, which is claimed to be computationally stable, was also investigated with the source/sink terms. All numerical results were compared to the experimental results obtained at the study site, Mason Bay, Stewart Island, New Zealand. In the near-surface region, it is found that the results obtained using the source/sink terms are more accurate than those using roughness parameters. The k-ω turbulence model with source/sink terms is more appropriate, as it is more accurate and more computationally stable than the RNG k-ε turbulence model. In the higher region, there is no significant difference among the results obtained from all simulations.
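For context, one widely used form of the plant-drag source/sink terms mentioned above adds a momentum sink of the type (an assumed, standard canopy-drag formulation, not necessarily the exact form used in the paper):

$$ S_{u_i} = -\rho\, C_d\, a\, |U|\, u_i $$

where $C_d$ is the canopy drag coefficient, $a$ the leaf (frontal) area density, $|U|$ the local wind speed, and $u_i$ the velocity component; corresponding source/sink terms are then added to the turbulence ($k$ and $\varepsilon$ or $\omega$) equations.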
Abstract: The paper compares different channel models used for modeling a Broadband Power-Line Communication (BPLC) system. The models compared are those of Zimmermann and Dostert, Philipps, Anatory et al., and the Anatory et al. generalized Transmission Line (TL) model. The validity of each model was assessed in the time domain against the ATP-EMTP software, which uses a transmission line approach. It is found that, for a power-line network with a minimum number of branches, all the models give signal/pulse time responses similar to those of the ATP-EMTP software; however, the Zimmermann and Dostert model shows the same amplitude but a different time delay. It is observed that when the number of branches is increased, only the results of the generalized TL theory approach are comparable with the ATP-EMTP results. The Multi-Carrier Spread Spectrum (MC-SS) system was also applied to check the implication of such behavior for the modulation schemes. It is observed that using the Philipps model on an underground cable can predict performance up to 25 dB better than the other channel models, which can misrepresent the actual performance of the system. Similarly, the modified Zimmermann and Dostert model under multipath conditions can predict performance about 5 dB better than that predicted by the generalized TL theory. It is therefore suggested that, for realistic BPLC system design and analysis, the model based on generalized TL theory be used.
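For reference, the Zimmermann and Dostert channel discussed above is usually written as the multipath frequency response (the widely cited form, quoted here as background rather than from this paper):

$$ H(f) = \sum_{i=1}^{N} g_i \; e^{-(a_0 + a_1 f^{k})\, d_i}\; e^{-j 2\pi f\, d_i / v_p} $$

where $g_i$ and $d_i$ are the weighting factor and length of path $i$, $a_0$, $a_1$ and $k$ are attenuation parameters, and $v_p$ is the propagation velocity.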
Abstract: Estimating the natural frequency of structures is very important; it is not usually simple to calculate and is sometimes complicated. A lack of knowledge about it can cause severe damage and hazardous effects.
In this paper, using two different finite element models based on the hydrodynamic mass of the fluid, the natural frequency of a special bearing (Fig. 1) subjected to an electric field (or a periodic force) is calculated for different stiffnesses and geometries. Finally, the results of the two models and the analytical solution are compared.
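As background for the comparison above, in a single degree-of-freedom idealisation the hydrodynamic (added) mass $m_h$ lowers the natural frequency according to the standard relation (assumed here as an illustration, not taken from the paper):

$$ f_n = \frac{1}{2\pi}\sqrt{\frac{k}{m + m_h}} $$

where $k$ is the stiffness and $m$ the structural mass, which is why the frequency is evaluated for different stiffnesses and geometries.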
Abstract: In recent years, copulas have become very popular in
financial research and actuarial science as they are more flexible in
modelling the co-movements and relationships of risk factors as compared
to the conventional linear correlation coefficient by Pearson.
However, a precise estimation of the copula parameters is vital in
order to correctly capture the (possibly nonlinear) dependence structure
and joint tail events. In this study, we employ two optimization heuristics, namely Differential Evolution and Threshold Accepting, to tackle the parameter estimation of multivariate t distribution models in the EML approach. Since the evolutionary optimizer does not rely on gradient search, the EML approach can be applied to the estimation of more complicated copula models such as high-dimensional copulas.
Our experimental study shows that the proposed method provides
more robust and more accurate estimates as compared to the IFM
approach.
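As a hedged illustration of the heuristic estimation described above, the following minimal Python sketch fits a bivariate t-copula (correlation rho, degrees of freedom nu) by maximum likelihood with Differential Evolution; the pseudo-observations, bounds, and settings are assumptions for demonstration, not the study's data or configuration.

```python
# Illustrative sketch only: maximum-likelihood fit of a bivariate t-copula with
# Differential Evolution, assuming the margins were already transformed to
# pseudo-observations u in (0, 1) (the canonical/EML setting).
import numpy as np
from scipy.stats import t as student_t, multivariate_t
from scipy.optimize import differential_evolution

def neg_loglik(params, u):
    rho, nu = params
    x = student_t.ppf(u, df=nu)                       # map pseudo-observations to t scores
    cov = np.array([[1.0, rho], [rho, 1.0]])
    joint = multivariate_t(loc=[0, 0], shape=cov, df=nu).logpdf(x)
    margins = student_t.logpdf(x, df=nu).sum(axis=1)
    return -(joint - margins).sum()                   # negative copula log-likelihood

# toy pseudo-observations drawn from a t-copula (rho = 0.6, nu = 8)
samples = multivariate_t(loc=[0, 0], shape=[[1, 0.6], [0.6, 1]], df=8).rvs(size=500, random_state=0)
u = student_t.cdf(samples, df=8)

result = differential_evolution(neg_loglik, bounds=[(-0.95, 0.95), (2.1, 30.0)], args=(u,), seed=1)
rho_hat, nu_hat = result.x
print(rho_hat, nu_hat)
```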
Abstract: The paper deals with the most important changes that have occurred in business because of social media and its impact on organisations and leadership in recent years. It seeks to synthesize existing research, theories and concepts, in order to understand "social destinations", and to provide a bridge from past research to future success. Becoming a "social destination" is a strategic and tactical leadership and management issue, and the paper presents the importance of destination leadership in choosing the way towards a social destination, together with some organisational models. It also presents some social media tools that can be used in transforming a destination into a social one. Adapting organisations to the twenty-first century means adopting social media as a way of life and a way of business.
Abstract: This paper discusses the design characteristics management accounting systems should have to be useful for strategic planning and control, and provides brief introductions to strategic variance analysis, profit-linked performance measurement models and the balanced scorecard. It shows how two multi-period, multi-product models are specified, how they can be related to Porter's strategy framework and to cost and revenue drivers, and how they can be used to support strategic planning, control and cost management.
Abstract: The Support Vector Machine (SVM) is a statistical learning tool that was initially developed by Vapnik in 1979 and later extended into the more complex concept of structural risk minimization (SRM). SVM is playing an increasing role in applications to detection problems in various engineering fields, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, SVM was applied to the detection of medical ultrasound images in the presence of partially developed speckle noise. The simulation was done for single-look and multi-look speckle models to give a complete overview of, and insight into, the newly proposed SVM-based detector. The structure of the SVM was derived and applied to clinical ultrasound images, and its performance in terms of the mean square error (MSE) metric was calculated. We showed that the SVM-detected ultrasound images have a very low MSE and are of good quality. The quality of the processed speckled images improved for the multi-look model. Furthermore, the contrast of the SVM-detected images was higher than that of the original non-noisy images, indicating that the SVM approach increased the distance between the pixel reflectivity levels (detection hypotheses) in the original images.
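As a hedged, simplified illustration of the ideas above (not the detector structure derived in the paper), the sketch below simulates single-/multi-look speckle as multiplicative gamma noise, trains a support vector regressor to recover pixel intensities from noisy neighborhoods, and reports the MSE; the image, kernel, and parameters are placeholders.

```python
# Illustrative sketch only: speckled-image recovery with an SVM regressor, scored by MSE.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
clean = rng.uniform(0.2, 1.0, size=(64, 64))        # stand-in for a noise-free ultrasound image

def add_speckle(img, looks):
    # multi-look speckle modelled as unit-mean multiplicative gamma noise; looks=1 is single-look
    return img * rng.gamma(shape=looks, scale=1.0 / looks, size=img.shape)

def neighborhoods(img, r=1):
    # flatten each (2r+1) x (2r+1) neighborhood into a feature vector (edges skipped)
    h, w = img.shape
    return np.array([img[i - r:i + r + 1, j - r:j + r + 1].ravel()
                     for i in range(r, h - r) for j in range(r, w - r)])

noisy = add_speckle(clean, looks=4)                  # change to looks=1 for the single-look case
X = neighborhoods(noisy)
y = clean[1:-1, 1:-1].ravel()                        # target pixel intensities

svr = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)
print("MSE:", mean_squared_error(y, svr.predict(X)))
```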
Abstract: The growing importance of software quality is leading to the development of new, sophisticated techniques that can be used to construct models for predicting quality attributes. One such technique is the Artificial Neural Network (ANN). This paper examined the application of ANN to software quality prediction using Object-Oriented (OO) metrics. Quality estimation here includes estimating the maintainability of software. The dependent variable in our study was maintenance effort. The independent variables were the principal components of eight OO metrics. The results showed that the Mean Absolute Relative Error (MARE) of the ANN model was 0.265. Thus, we found the ANN method useful in constructing software quality models.
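A minimal sketch of the modelling pipeline described above, assuming synthetic stand-ins for the eight OO metrics and the maintenance effort; the network size and number of principal components are illustrative choices, not those of the study.

```python
# Illustrative sketch only: principal components of OO metrics feed a small neural
# network predicting maintenance effort, scored with the Mean Absolute Relative Error (MARE).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                      # stand-in for eight OO metrics per class
effort = np.abs(X @ rng.normal(size=8)) + 1.0      # stand-in for maintenance effort

model = make_pipeline(StandardScaler(), PCA(n_components=4),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
model.fit(X, effort)
pred = model.predict(X)

mare = np.mean(np.abs(effort - pred) / effort)     # Mean Absolute Relative Error
print(f"MARE = {mare:.3f}")
```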
Abstract: The problem of estimating time-varying regression is
inevitably concerned with the necessity to choose the appropriate
level of model volatility - ranging from the full stationarity of instant
regression models to their absolute independence of each other. In the
stationary case the number of regression coefficients to be estimated
equals that of regressors, whereas the absence of any smoothness
assumptions augments the dimension of the unknown vector by the
factor of the time-series length. The Akaike Information Criterion
is a commonly adopted means of adjusting a model to the given
data set within a succession of nested parametric model classes,
but its crucial restriction is that the classes are rigidly defined by
the growing integer-valued dimension of the unknown vector. To
make the Kullback information maximization principle underlying the
classical AIC applicable to the problem of time-varying regression
estimation, we extend it to a wider class of data models in which
the dimension of the parameter is fixed, but the freedom of its values
is softly constrained by a family of continuously nested a priori
probability distributions.
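For orientation, the classical criterion referred to above is (standard definition):

$$ \mathrm{AIC} = 2k - 2\ln \hat{L} $$

where $k$ is the integer-valued number of free parameters and $\hat{L}$ the maximized likelihood; it is precisely this rigid, integer-valued $k$ that the proposed family of continuously nested prior distributions replaces with a soft constraint.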
Abstract: The purpose of this paper is to perform a multidisciplinary design and analysis (MDA) of honeycomb panels used in satellite structural design. All the analyses are based on clamped-free boundary conditions. In the present work, detailed finite element models of honeycomb panels are developed and analysed. Experimental tests were carried out on a honeycomb specimen, the goal being to compare the measurements with the preceding modal analysis performed with the finite element method as well as with the existing equivalent approaches. The results obtained show good agreement between the finite element analysis, the equivalent model and the test results; the difference in the first two frequencies is less than 4%, and less than 10% for the third frequency. The results of the equivalent model presented in this analysis are obtained with good accuracy. Moreover, the investigations carried out in this research address the modal analysis of the honeycomb plate under several aspects, including geometrical variation, by studying the influence of the dimensional parameters on the modal frequency, and the variation of the core and skin materials of the honeycomb. The various results obtained in this paper are promising and show that the geometry parameters and the type of material have an effect on the value of the modal frequency of the honeycomb plate.
Abstract: Several models have been introduced so far for the single electron box (SEB), all of which are restricted to the DC response and/or the low-temperature limit. In this paper we introduce, for the first time, a time-dependent, high-temperature analytical model for the SEB. The DC behavior of the introduced model is verified against the SIMON software, and its time behavior is verified against a newly published paper on the step response of the SEB.
Abstract: In this paper the design of maximally flat linear phase
finite impulse response (FIR) filters is considered. The problem is handled with two entirely different approaches. The first is a completely deterministic numerical approach in which the problem is formulated as a Linear Complementarity Problem (LCP). The other is based on combining a Markov Random Fields (MRF) approach with a messy genetic algorithm (MGA). Markov Random
Fields (MRFs) are a class of probabilistic models that have been
applied for many years to the analysis of visual patterns or textures.
Our objective is to establish MRFs as an interesting approach to
modeling messy genetic algorithms. We establish a theoretical result
that every genetic algorithm problem can be characterized in terms of
a MRF model. This allows us to construct an explicit probabilistic
model of the MGA fitness function and introduce the Ising MGA.
Experiments with the Ising MGA are less costly than those with the standard MGA, since far fewer computations are involved. The LCP requires the fewest computations of all. Results of the LCP,
random search, random seeded search, MGA, and Ising MGA are
discussed.
Abstract: The first generation of Mobile Agent based Intrusion Detection Systems had just two components, namely data collection and a single centralized analyzer. The disadvantage of this type of intrusion detection is that if the connection to the analyzer fails, the entire system becomes useless. In this work, we propose a novel hybrid model for a Mobile Agent based Distributed Intrusion Detection System to overcome this problem. The proposed model has new features such as robustness, the capability of detecting intrusions against the IDS itself, and the capability of updating itself to detect new patterns of intrusion. In addition, our proposed model is also capable
of tackling some of the weaknesses of centralized Intrusion Detection
System models.
Abstract: In this paper, the requirement for coke quality prediction, its role in blast furnaces, and the model output are explained. By applying an Artificial Neural Network (ANN) method using the back-propagation (BP) algorithm, a prediction model has been developed to predict CSR (coke strength after reaction). Important blast furnace functions
such as permeability, heat exchanging, melting, and reducing
capacity are mostly connected to coke quality. Coke quality is further
dependent upon coal characterization and coke making process
parameters. The ANN model developed is a useful tool for process
experts to adjust the control parameters in case of coke quality
deviations. The model also makes it possible to predict CSR for new
coal blends which are yet to be used in the coke plant. Input data to the model were structured into three modules covering the past two years, and the incremental models thus developed assist in identifying the group causing the deviation in CSR.
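As a hedged illustration of the back-propagation model described above, the sketch below trains a single hidden-layer network by hand-coded back-propagation on synthetic stand-ins for coal/process features and CSR, then predicts CSR for a hypothetical new blend; none of the features, data, or settings come from the paper.

```python
# Illustrative sketch only: one hidden layer trained by back-propagation on squared error.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))          # stand-in for coal characterisation + process parameters
y = (X @ rng.normal(size=6) + 0.3 * X[:, 0] * X[:, 1]).reshape(-1, 1)   # stand-in CSR values

n_hidden, lr = 8, 0.01
W1 = rng.normal(scale=0.3, size=(6, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.3, size=(n_hidden, 1)); b2 = np.zeros(1)

for epoch in range(2000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y
    # backward pass: back-propagation of the mean-squared-error gradient
    grad_out = 2 * err / len(X)
    grad_W2, grad_b2 = h.T @ grad_out, grad_out.sum(axis=0)
    grad_h = grad_out @ W2.T * (1 - h ** 2)
    grad_W1, grad_b1 = X.T @ grad_h, grad_h.sum(axis=0)
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

new_blend = rng.normal(size=(1, 6))    # hypothetical new coal blend
predicted_csr = np.tanh(new_blend @ W1 + b1) @ W2 + b2
print(predicted_csr)
```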
Abstract: This paper is a survey of current component-based software technologies and a description of the promoting and inhibiting factors in component-based software engineering (CBSE). The features that software components inherit are also discussed. Quality assurance issues in component-based software are also addressed. The research on the quality model of component-based systems starts with a study of what components are, CBSE, its development life cycle, and the pros and cons of CBSE. Various attributes are studied and compared, keeping in view various existing models for general systems and component-based systems (CBS). When illustrating the quality of a software component, an apt set of quality attributes for the description of the system (or components) should be selected. Finally, the research issues that can be extended are tabulated.
Abstract: Random Forests are a powerful classification technique, consisting of a collection of decision trees. One useful feature of Random Forests is the ability to determine the importance of each variable in predicting the outcome. This is done by permuting each variable and computing the change in prediction accuracy before and after the permutation. This variable importance calculation is similar to a one-factor-at-a-time experiment and is therefore inefficient. In this paper, we use a regular fractional factorial design to determine which variables to permute. Based on the results of the trials in the experiment, we calculate the individual importance of the variables with improved precision over the standard method. The method is illustrated with a study of student attrition at Monash University.
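For orientation, the standard one-factor-at-a-time permutation importance that the paper improves upon can be sketched as follows with scikit-learn on synthetic data; the fractional factorial variant proposed in the paper is not reproduced here.

```python
# Illustrative sketch only: baseline permutation importance for a Random Forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))                                   # synthetic predictors
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

imp = permutation_importance(rf, X_te, y_te, n_repeats=10, random_state=0)
print(imp.importances_mean)             # one-factor-at-a-time importance per variable
```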
Abstract: Natural resources management, including water resources, requires reliable estimates of time-varying environmental parameters. Small improvements in the estimation of environmental parameters would have great effects on management decisions. Noise reduction using wavelet techniques is an effective approach for the preprocessing of practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based preprocessing. Time series correlation and persistence, the minimum sufficient length for training the prediction model, and the maximum valid length of predictions were also investigated through a fractal assessment.
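As a hedged illustration of the wavelet-based preprocessing mentioned above, the sketch below soft-thresholds the detail coefficients of a synthetic river flow series using the universal threshold (a common choice assumed here, not necessarily the paper's procedure); it requires the PyWavelets package.

```python
# Illustrative sketch only: wavelet soft-threshold denoising of a river flow series.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.arange(1024)
flow = 50 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(scale=3.0, size=t.size)  # synthetic flow

coeffs = pywt.wavedec(flow, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745                  # noise estimate from finest details
thresh = sigma * np.sqrt(2 * np.log(flow.size))                 # universal threshold
coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")                          # preprocessed series for prediction
```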