Abstract: Understanding driving behavior is a complicated
research topic. An adequate model is needed to accurately describe
the speed, flow and density of multiclass traffic flow. In this
study, we propose the concept of the standard passenger car
equivalent (SPCE), instead of the passenger car equivalent (PCE), to
estimate the influence of heavy vehicles and slow cars. A traffic
cellular automaton model is employed to calibrate and validate the
results. According to the simulated results, the SPCE
transformations present good accuracy.
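The abstract does not reproduce the cellular automaton rules, so the sketch below uses the classic Nagel-Schreckenberg model on a ring road as a hypothetical stand-in for the kind of traffic cellular automaton calibrated here; all parameter values are illustrative assumptions, and heavy or slow vehicles could be represented by assigning some vehicles a lower maximum speed, which is where a PCE- or SPCE-style conversion of mixed traffic would enter.

```python
import random

def nasch_step(pos, vel, length, vmax, p_slow, rng):
    """One parallel update of the Nagel-Schreckenberg traffic CA on a ring road."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])   # vehicles in road order
    new_pos, new_vel = pos[:], vel[:]
    for k, i in enumerate(order):
        ahead = order[(k + 1) % n]
        gap = (pos[ahead] - pos[i] - 1) % length     # empty cells to the leader
        v = min(vel[i] + 1, vmax)                    # rule 1: accelerate
        v = min(v, gap)                              # rule 2: brake to avoid collision
        if v > 0 and rng.random() < p_slow:          # rule 3: random slowdown
            v -= 1
        new_vel[i] = v
        new_pos[i] = (pos[i] + v) % length           # rule 4: move
    return new_pos, new_vel

rng = random.Random(0)
L, N = 100, 30                     # road cells and vehicles (illustrative)
pos = rng.sample(range(L), N)      # distinct starting cells
vel = [0] * N
for _ in range(200):
    pos, vel = nasch_step(pos, vel, L, vmax=5, p_slow=0.3, rng=rng)
density = N / L                    # vehicles per cell
flow = sum(vel) / L                # cells traveled per time step, per cell
```

From repeated runs at different densities, one can trace the flow-density (fundamental) diagram against which equivalence factors such as PCE or SPCE are calibrated.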
Abstract: In this paper, a new alignment method based on the particle swarm optimization (PSO) technique is presented. The PSO algorithm is used to locate the optimal coupling position, with the highest optical power, in a three-degree-of-freedom alignment. The algorithm gives interesting results without the need to go through complex mathematical modeling of the alignment system. The proposed algorithm is validated through practical tests involving the alignment of two Single Mode Fibers (SMF) and the alignment of an SMF and a PCF fiber.
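As a minimal sketch of the idea, the snippet below runs a standard PSO over a box of stage positions; `coupling_power` is a hypothetical smooth stand-in for the measured optical power (the real objective would be a photodetector reading), and all swarm parameters are illustrative defaults, not the paper's settings.

```python
import math
import random

def pso_maximize(f, bounds, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization: maximize f over a box."""
    rng = random.Random(seed)
    dim = len(bounds)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                       # personal best positions
    pbest = [f(x) for x in X]
    g = max(range(n_particles), key=lambda i: pbest[i])
    G, gbest = P[g][:], pbest[g]                # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (G[d] - X[i][d]))
                X[i][d] = min(max(X[i][d] + V[i][d],
                                  bounds[d][0]), bounds[d][1])
            fx = f(X[i])
            if fx > pbest[i]:
                pbest[i], P[i] = fx, X[i][:]
                if fx > gbest:
                    gbest, G = fx, X[i][:]
    return G, gbest

# Hypothetical stand-in for the measured coupling power at stage position (x, y, z):
def coupling_power(p):
    x, y, z = p
    return math.exp(-((x - 0.12) ** 2 + (y + 0.05) ** 2 + (z - 0.4) ** 2) / 0.2)

best, power = pso_maximize(coupling_power, [(-1.0, 1.0)] * 3)
```

In a physical setup, each call to the objective would translate to moving the positioning stage and reading the detector, which is precisely why a derivative-free method like PSO avoids modeling the alignment system mathematically.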
Abstract: The aim of this article is to extend and develop
econometric and network-structure-based methods capable of
detecting price manipulation on the Tehran Stock Exchange. The
principal goal of the present study is to offer a model for
detecting price manipulation on the Tehran Stock Exchange. To do so,
a sample of 397 companies listed on the Tehran Stock Exchange was
selected using a separation method, and data on their prices and
trading volumes over the years 2001 to 2009 were collected. By
performing a runs test, a skewness test and a duration correlation
test, the selected companies were divided into two sets: manipulated
and non-manipulated companies. In the next stage, by investigating
the cumulative return process and the trading volume of the
manipulated companies, the starting date of the price manipulation
was identified. Then a logit model, an artificial neural network and
multiple discriminant analysis, using data on company size,
information transparency, P/E ratio and stock liquidity one year
prior to the manipulation, were designed to forecast price
manipulation of the stocks of companies listed on the Tehran Stock
Exchange. Finally, the forecasting power of the models was evaluated
using the data of a test set. The forecasting accuracy on the test
set was 92.1% for the logit model, 94.1% for the artificial neural
network and 90.2% for the multiple discriminant analysis model; all
three models therefore have high power to forecast price
manipulation, and there is no considerable difference among their
forecasting power.
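The logit model above uses firm characteristics (size, transparency, P/E, liquidity) as predictors. As an illustrative sketch only, with synthetic assumed data in which manipulated firms are drawn smaller and less liquid, logistic regression can be fitted by plain batch gradient descent:

```python
import math
import random

def train_logit(X, y, lr=0.1, epochs=500):
    """Fit logistic regression by batch gradient descent; returns [bias, w1, ...]."""
    n, d = len(X), len(X[0])
    w = [0.0] * (d + 1)
    for _ in range(epochs):
        grad = [0.0] * (d + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))      # predicted manipulation probability
            grad[0] += p - yi
            for j in range(d):
                grad[j + 1] += (p - yi) * xi[j]
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

def predict(w, xi):
    return 1 if w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)) > 0 else 0

# Synthetic, assumed data: manipulated firms (y=1) tend to have smaller
# (standardized) size and lower liquidity than non-manipulated ones.
rng = random.Random(1)
X, y = [], []
for _ in range(200):
    manip = rng.random() < 0.5
    size = rng.gauss(-1.0 if manip else 1.0, 1.0)
    liquidity = rng.gauss(-1.0 if manip else 1.0, 1.0)
    X.append([size, liquidity])
    y.append(1 if manip else 0)

w = train_logit(X, y)
accuracy = sum(predict(w, xi) == yi for xi, yi in zip(X, y)) / len(y)
```

The real study would fit such a model on the pre-manipulation firm data and report accuracy on a held-out test set, as summarized in the abstract.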
Abstract: The use of Virtual Reality (VR) in schools and higher education is proliferating. Due to its interactive and animated features, it is regarded as a promising technology for increasing students' spatial ability. Spatial ability is assumed to have a prominent role in science and engineering domains. However, research concerning individual differences such as spatial ability in the context of VR is still in its infancy. Moreover, empirical studies that focus on the features of VR that improve spatial ability are to date rare. Thus, this paper explores the possible educational value of VR in relation to spatial ability, drawing on studies in computer-based learning, in order to call for more research on spatial ability in the context of VR. It is believed that the incorporation of state-of-the-art VR technology for educational purposes should be justified by enhanced benefits for the target learners.
Abstract: As the world changes ever more rapidly, the demand for up-to-date information for resource management, environment monitoring and planning is increasing exponentially. Integrating remote sensing with GIS technology will significantly improve our ability to address these concerns. This paper presents an alternative way of updating GIS applications using image processing and high-resolution images. We show a method of high-resolution image segmentation using graphs and morphological operations, where a preprocessing step (a watershed operation) is required. A morphological process is then applied using the opening and closing operations. After this segmentation we can extract significant cartographic elements such as urban areas, streets or green areas. The result of this segmentation and extraction is then used to update GIS applications. Some examples are shown using aerial photography.
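The opening and closing operations mentioned above can be sketched in a few lines. This is a generic binary-morphology toy in pure Python with a 3x3 square structuring element, not the paper's implementation; real high-resolution imagery would use a grayscale morphology library.

```python
def erode(img, se):
    """Binary erosion: a pixel stays 1 only if every structuring-element offset is 1."""
    h, w = len(img), len(img[0])
    return [[int(all(0 <= i + di < h and 0 <= j + dj < w and img[i + di][j + dj]
                     for di, dj in se))
             for j in range(w)] for i in range(h)]

def dilate(img, se):
    """Binary dilation: a pixel becomes 1 if any structuring-element offset is 1."""
    h, w = len(img), len(img[0])
    return [[int(any(0 <= i + di < h and 0 <= j + dj < w and img[i + di][j + dj]
                     for di, dj in se))
             for j in range(w)] for i in range(h)]

SE = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)]  # 3x3 square element

def opening(img):   # erosion then dilation: removes small bright specks
    return dilate(erode(img, SE), SE)

def closing(img):   # dilation then erosion: fills small dark holes
    return erode(dilate(img, SE), SE)

# Toy "segmented" image: a 3x3 region of interest plus one isolated noise pixel.
img = [[0] * 7 for _ in range(7)]
for i in range(1, 4):
    for j in range(1, 4):
        img[i][j] = 1
img[5][5] = 1                 # isolated speck
op = opening(img)             # the speck disappears; the 3x3 region survives
```

After such cleanup, connected regions can be labeled and classified into cartographic elements (urban areas, streets, green areas) for the GIS update.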
Abstract: A supply chain consists of all stages involved, directly
or indirectly, and all functions performed, in fulfilling a customer
demand. In the two-stage transportation supply chain problem,
transportation costs constitute a significant proportion of final
product costs. For successful decision making in a two-stage supply
chain, it is often crucial to account explicitly for non-linear
transportation costs. In this paper, deterministic demand and a
finite supply of products are considered. The optimal distribution
level and the routing structure from the manufacturing plants to the
distribution centres and on to the end customers are determined
using a developed mathematical model, solved by a proposed particle
swarm optimization based genetic algorithm. A numerical analysis of
the case study is carried out to validate the model.
Abstract: The paper presents a coupled electromagnetic and
thermal field analysis of a busbar system (of rectangular
cross-section geometry) subjected to short-circuit conditions. The
laboratory model was validated against both an analytical solution
and experimental observations. The considered problem required the
computation of the detailed distribution of the power losses and of
the heat transfer modes. In this electromagnetic and thermal
analysis, different definitions of electric busbar heating were
considered and compared. The busbar system is a three-phase one and
consists of aluminum, painted aluminum and copper busbars. The
solution to the coupled field problem is obtained using the finite
element method and the QuickField™ program. Experiments have been
carried out using two different approaches and compared with the
computed results.
Abstract: The choice of finite element used to predict the
nonlinear static or dynamic response of complex structures is an
important factor. The main goal of this research work is therefore
to study the effect of the in-plane rotational degrees of freedom in
linear and geometrically nonlinear static and dynamic analysis of
thin shell structures using flat shell finite elements. For this
purpose: first, simple triangular and quadrilateral flat shell
finite elements are implemented in an incremental formulation based
on the updated Lagrangian corotational description for geometrically
nonlinear analysis. The triangular element is a combination of the
DKT and CST elements, while the quadrilateral is a combination of
the DKQ and bilinear quadrilateral membrane elements. In both
elements, the sixth degree of freedom is handled by introducing a
fictitious stiffness. Second, in the same code, the sixth degree of
freedom in these elements is handled differently: the in-plane
rotational d.o.f. is treated as an effective d.o.f. in the in-plane
field interpolation. Our goal is to compare the resulting shell
elements. Third, the analysis is extended to linear dynamic analysis
by direct integration using Newmark's implicit method. Finally, the
linear dynamic analysis is extended to geometrically nonlinear
dynamic analysis, where Newmark's method is used to integrate the
equations of motion and the Newton-Raphson method is employed to
iterate within each time step increment until equilibrium is
achieved. The obtained results demonstrate the effectiveness and
robustness of the interpolation of the in-plane rotational d.o.f.
and reveal the deficiencies of using a fictitious stiffness in
linear and nonlinear dynamic analysis.
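As a minimal illustration of Newmark's implicit (average acceleration) scheme used for the direct time integration, consider a single-degree-of-freedom linear oscillator; the paper applies the scheme to flat shell finite element models, with Newton-Raphson iteration in the nonlinear case, so this is only the time-stepping kernel under simplified assumptions.

```python
import math

def newmark_sdof(m, c, k, f, u0, v0, dt, steps, beta=0.25, gamma=0.5):
    """Implicit Newmark integration for m*u'' + c*u' + k*u = f(t).

    beta=1/4, gamma=1/2 is the unconditionally stable average
    acceleration variant with no numerical damping."""
    u, v = u0, v0
    a = (f(0.0) - c * v - k * u) / m
    hist = [u]
    keff = m / (beta * dt * dt) + gamma * c / (beta * dt) + k
    for n in range(1, steps + 1):
        t = n * dt
        # Effective load from the Newmark displacement/velocity expansions
        p = (f(t)
             + m * (u / (beta * dt * dt) + v / (beta * dt)
                    + (1.0 / (2.0 * beta) - 1.0) * a)
             + c * (gamma * u / (beta * dt) + (gamma / beta - 1.0) * v
                    + dt * (gamma / (2.0 * beta) - 1.0) * a))
        u_new = p / keff
        a_new = ((u_new - u) / (beta * dt * dt) - v / (beta * dt)
                 - (1.0 / (2.0 * beta) - 1.0) * a)
        v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        u, v, a = u_new, v_new, a_new
        hist.append(u)
    return hist

# Undamped oscillator with omega = 2: exact solution u(t) = cos(2 t),
# so after one full period T = pi the displacement returns to 1.
hist = newmark_sdof(m=1.0, c=0.0, k=4.0, f=lambda t: 0.0,
                    u0=1.0, v0=0.0, dt=math.pi / 1000, steps=1000)
```

In the geometrically nonlinear setting, `keff` becomes the tangent stiffness and the displacement update is iterated with Newton-Raphson until the residual vanishes within each step.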
Abstract: Information retrieval has the objective of studying
models and building systems that allow a user to find the documents
relevant to his information need. Information search remains a
difficult problem because of the difficulty of representing and
processing natural language, for instance polysemy. Intentional
structures promise to be a new paradigm for extending existing
document structures and enhancing the different phases of document
processing such as creation, editing, search and retrieval.
Recognizing the intentions of the authors of texts can reduce the
scale of this problem. In this article, we present an intention
recognition system based on a semi-automatic method for extracting
intentional information from a corpus of text. The system is also
able to update the ontology of intentions to enrich the knowledge
base containing all possible intentions of a domain. The approach
relies on the construction of a semi-formal ontology, considered as
the conceptualization of the intentional information contained in a
text. An experiment on scientific publications in the field of
computer science was conducted to validate this approach.
Abstract: Automatic methods of detecting changes through
satellite imaging are the object of growing interest, especially
because of numerous applications linked to analysis of the Earth's
surface or the environment (monitoring vegetation, updating maps,
risk management, etc.). This work implemented spatial analysis
techniques using images with different spatial and spectral
resolutions on different dates. The work was based on the principle
of control charts, in order to set the upper and lower limits beyond
which a change would be noted. Later, the a contrario approach was
used, by testing different thresholds for which the difference
calculated between two pixels was significant. Finally, labeled
images were considered, giving a particularly low difference, which
meant that the number of “false changes" could be estimated
according to a given limit.
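The control-chart idea above can be sketched directly: compute the per-pixel difference between the two dates, set control limits at the mean plus or minus k standard deviations (k = 3 is the conventional choice, assumed here), and flag pixels outside the limits. The images below are synthetic toy data, not satellite imagery.

```python
import random

def change_mask(img1, img2, k=3.0):
    """Flag pixels whose difference falls outside mean +/- k*sigma control limits."""
    diffs = [a - b for r1, r2 in zip(img1, img2) for a, b in zip(r1, r2)]
    n = len(diffs)
    mean = sum(diffs) / n
    sigma = (sum((d - mean) ** 2 for d in diffs) / n) ** 0.5
    ucl, lcl = mean + k * sigma, mean - k * sigma   # upper/lower control limits
    return [[int(not (lcl <= a - b <= ucl)) for a, b in zip(r1, r2)]
            for r1, r2 in zip(img1, img2)]

# Synthetic example: date-2 image equals date-1 plus sensor noise,
# except for one pixel where a genuine change occurred.
rng = random.Random(0)
img1 = [[rng.gauss(100.0, 2.0) for _ in range(20)] for _ in range(20)]
img2 = [[v + rng.gauss(0.0, 1.0) for v in row] for row in img1]
img2[5][5] += 50.0            # a genuine change
mask = change_mask(img1, img2)
```

The a contrario step described in the abstract would then vary the threshold and bound the expected number of "false changes" under the noise-only hypothesis.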
Abstract: The majority of pepper farmers in Malaysia use the
open-sun method for drying pepper berries. This method is
time-consuming and exposes the berries to rain and contamination. A
maintenance-friendly and properly enclosed dryer is therefore
desired. A dryer design with a solar collector and a chimney was
studied and adapted to suit the needs of small-scale pepper farmers
in Malaysia. The dryer will provide an environment with an optimum
operating temperature for drying pepper berries. The dryer model was
evaluated using commercially available computational fluid dynamics
(CFD) software in order to understand the heat and mass transfer
inside the dryer. Natural convection was the only mode of heat
transport considered in this study, in accordance with the idea of a
simple and maintenance-friendly design. To accommodate the effect of
the low buoyancy found in natural convection dryers, a biomass
burner was integrated into the solar dryer design.
Abstract: If an organization like Mellat Bank wants to identify
its customer market completely in order to reach its specified
goals, it can segment the market so as to offer the right product
package to the right segment. Our objective is to offer a
segmentation model for the Iranian banking market from Mellat Bank's
point of view. The methodology of this project combines
“segmentation on the basis of four qualitative variables" and
“segmentation on the basis of difference in means". The required
data were gathered from e-systems and the researcher's personal
observation. Finally, the research proposes that the organization
first form a four-dimensional matrix with 756 segments using four
variables, named value-based, behavioral, activity style and
activity level; second, calculate the mean profit for every cell of
the matrix at two distinct work levels (α1: normal condition and α2:
high-pressure condition) and compare the segments by checking two
conditions, namely 1) the homogeneity of every segment with its
sub-segments and 2) its heterogeneity with other segments, and thus
carry out the necessary segmentation process. The final proposal
(further explained by an operational example and a feedback
algorithm) is to test and update the model because of the dynamic
environment, technology and banking system.
Abstract: Data mining incorporates a group of statistical
methods used to analyze a set of information, or a data set. It
operates with models and algorithms, which are powerful tools with
great potential. They can help people to understand the patterns in
a certain chunk of information, so it is obvious that data mining
tools have a wide range of applications. For example, in theoretical
chemistry, data mining tools can be used to predict molecule
properties or to improve computer-assisted drug design.
Classification analysis is one of the major data mining
methodologies. The aim of the contribution is to create a
classification model able to deal with a huge data set with high
accuracy. For this purpose, logistic regression, Bayesian logistic
regression and random forest models were built using the R software.
A Bayesian logistic regression model in the Latent GOLD software was
created as well. These classification methods belong to supervised
learning methods.
It was necessary to reduce the dimension of the data matrix before
constructing the models, and thus factor analysis (FA) was used. The
models were applied to predict the biological activity of molecules,
potential new drug candidates.
Abstract: This paper seeks to develop a simple yet practical and
efficient control scheme that enables cooperating arms to handle a
flexible beam. Specifically, the problem studied herein is that of
two arms rigidly grasping a flexible beam, and thus capable of
generating forces/moments in such a way as to move the flexible beam
along a predefined trajectory. The paper develops a sliding mode
control law that provides robustness against model imperfection and
uncertainty. It also provides an implicit stability proof. Simulation
results for two three-joint arms moving a flexible beam are
presented to validate the theoretical results.
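To illustrate the sliding mode idea in miniature, the sketch below tracks a sinusoidal trajectory with a double integrator standing in for the (much richer) two-arm/flexible-beam dynamics; the sliding surface s = e' + lambda*e and a boundary-layer saturation in place of the discontinuous sign(s) are standard choices, and all gains and the disturbance are illustrative assumptions.

```python
import math

def simulate_smc(T=10.0, dt=1e-3, lam=5.0, eta=2.0, phi=0.05):
    """Sliding mode tracking of x_d(t) = sin(t) for x'' = u + d(t).

    Sliding surface: s = e' + lam*e, with e = x - x_d.
    Control: u = x_d'' - lam*e' - eta*sat(s/phi), robust as long as
    eta exceeds the disturbance bound (here |d| <= 0.5 < eta)."""
    x, v, t = 0.0, 0.0, 0.0
    err = 0.0
    for _ in range(int(T / dt)):
        xd, vd, ad = math.sin(t), math.cos(t), -math.sin(t)
        e, de = x - xd, v - vd
        s = de + lam * e
        sat = max(-1.0, min(1.0, s / phi))   # boundary layer vs. pure sign(s)
        u = ad - lam * de - eta * sat        # equivalent control + switching term
        d = 0.5 * math.sin(3.0 * t)          # bounded unmodeled disturbance
        a = u + d
        x += v * dt                          # explicit Euler integration
        v += a * dt
        t += dt
        err = abs(e)
    return err

final_err = simulate_smc()
```

Inside the boundary layer the error obeys e' = -lam*e + s with |s| <= phi, so the steady tracking error is bounded by roughly phi/lam, which is the usual robustness/chattering trade-off of the saturation variant.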
Abstract: In this paper, we develop a spatio-temporal graph as a
key component of our knowledge representation scheme. We design an
integrated representation scheme to depict not only the present and
past but also the future, in parallel with the spaces, in an
effective and intuitive manner. The resulting multi-dimensional
comprehensive knowledge structure accommodates a multi-layered
virtual world developing in time, maximizing the diversity of
situations in the historical context. This knowledge representation
scheme is to be used as the basis for simulating the situations
composing the virtual world and for implementing the knowledge that
virtual agents use to judge and evaluate situations in the virtual
world. To provide natural contexts for situated learning or
simulation games, the virtual stage set by this spatio-temporal
graph is to be populated by interrelated and changing agents and
other objects, abstracted in the ontology.
Abstract: This paper illustrates the use of a combined neural
network model for the classification of electrocardiogram (ECG)
beats. We present a trainable neural network ensemble approach to
developing a customized electrocardiogram beat classifier, in an
effort to further improve the performance of ECG processing and to
offer individualized health care.
We present a three-stage technique for the detection of premature
ventricular contractions (PVC) among normal beats and other heart
diseases. The method comprises denoising, feature extraction and
classification stages. First, we investigate the application of the
stationary wavelet transform (SWT) for noise reduction of the
electrocardiogram (ECG) signals. The feature extraction module then
extracts 10 ECG morphological features and one timing interval
feature. A number of multilayer perceptron (MLP) neural networks
with different topologies are then designed.
The performance of the different combination methods, as well as
the efficiency of the whole system, is presented. Among them,
stacked generalization, the proposed trainable combined neural
network model, possesses the highest recognition rate, around 95%.
This network therefore proves to be a suitable candidate for ECG
signal diagnosis systems. ECG samples of the different ECG beat
types were extracted from the MIT-BIH arrhythmia database for the
study.
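Stacked generalization trains a second-level combiner on the outputs of the base classifiers rather than averaging or voting over them. The toy sketch below is a deliberately simplified stand-in: the "base models" are fixed decision rules (the paper trains MLPs) and the combiner is a perceptron trained directly on their outputs (proper stacking would use out-of-fold base predictions). The 2-D features and labels are synthetic.

```python
import random

def stack_predict(base_models, meta_w, x):
    """Meta-level decision from base-model scores (stacked generalization)."""
    z = [m(x) for m in base_models]
    s = meta_w[0] + sum(w * zi for w, zi in zip(meta_w[1:], z))
    return 1 if s > 0 else 0

def train_meta(base_models, X, y, lr=0.1, epochs=200):
    """Train the combiner (a perceptron here) on base-model outputs."""
    w = [0.0] * (len(base_models) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = [m(xi) for m in base_models]
            s = w[0] + sum(wj * zj for wj, zj in zip(w[1:], z))
            pred = 1 if s > 0 else 0
            if pred != yi:
                upd = lr * (yi - pred)       # perceptron update rule
                w[0] += upd
                for j, zj in enumerate(z):
                    w[j + 1] += upd * zj
    return w

# Two weak base rules, each looking at one (hypothetical) feature axis;
# the true label is the AND of their individual verdicts, which neither
# base rule can capture alone but the trained combiner can.
base_models = [lambda x: 1.0 if x[0] > 0 else -1.0,
               lambda x: 1.0 if x[1] > 0 else -1.0]
rng = random.Random(0)
X = [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(200)]
y = [1 if a > 0 and b > 0 else 0 for a, b in X]
w = train_meta(base_models, X, y)
accuracy = sum(stack_predict(base_models, w, xi) == yi
               for xi, yi in zip(X, y)) / len(y)
```

The point of the construction is that the combiner learns which base classifier to trust in which regime, which is what lifts the ensemble above any single MLP in the reported results.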
Abstract: Wireless mobile communications have experienced
phenomenal growth over the last decades. The advances in wireless
mobile technologies have brought about a demand for high-quality
multimedia applications and services. For such applications and
services to work, a signaling protocol is required for establishing,
maintaining and tearing down multimedia sessions. The Session
Initiation Protocol (SIP) is an application-layer signaling protocol
based on a request/response transaction model. This paper considers
the SIP INVITE transaction over an unreliable medium, since it was
recently modified in Request for Comments (RFC) 6026. To help assure
that the functional correctness of this modification is achieved,
the SIP INVITE transaction is modeled and analyzed using Colored
Petri Nets (CPNs). Based on the model analysis, it is concluded that
the SIP INVITE transaction is free of livelocks and dead code and,
at the same time, has both desirable and undesirable deadlocks. The
SIP INVITE transaction should therefore be subjected to additional
updates in order to eliminate the undesirable deadlocks. In order to
reduce the cost of implementing and maintaining SIP, additional
remodeling of the SIP INVITE transaction is recommended.
Abstract: A number of routing algorithms based on the learning
automata technique have been proposed for communication networks.
However, there has been little work on the effect of variation in
graph sparsity on the performance of these algorithms. In this
paper, a comprehensive study is launched to investigate the
performance of LASPA, the first learning-automata-based solution to
dynamic shortest path routing, across different graph structures
with varying sparsities. The sensitivity of three main performance
parameters of the algorithm, namely the average number of processed
nodes, the average number of scanned edges and the average time per
update, to variation in graph sparsity is reported. Simulation
results indicate that the LASPA algorithm adapts well to sparsity
variation in the graph structure and gives much better outputs than
the existing dynamic and fixed algorithms in terms of the
performance criteria.
Abstract: In recent years, much research has been actively
conducted to mine the exploding Web world, especially user-generated
content (UGC) such as weblogs, for knowledge about various phenomena
and events in the physical world, and Web services based on the
Web-mined knowledge have also begun to be developed for the public.
However, there are few detailed investigations of how accurately
Web-mined data reflect physical-world data. It would be problematic
to blindly utilize Web-mined data in public Web services without
sufficiently ensuring their accuracy. Therefore, this paper
introduces the simplest Web sensor and a spatiotemporally normalized
Web sensor to extract spatiotemporal data about a target phenomenon
from weblogs retrieved by keyword(s) representing the target
phenomenon, and tries to validate the potential and reliability of
the Web-sensed spatiotemporal data through four kinds of granularity
analyses of the correlation coefficient with the per-day, per-region
temperature, rainfall, snowfall and earthquake statistics of the
Japan Meteorological Agency as physical-world data: spatial
granularity (a region's population density), temporal granularity
(time period, e.g., per day vs. per week), representation
granularity (e.g., “rain" vs. “heavy rain"), and media granularity
(weblogs vs. microblogs such as Tweets).
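The core validation step, correlating a Web-sensed time series with official statistics, reduces to a Pearson correlation coefficient. The daily post counts and rainfall figures below are hypothetical illustrative numbers, not the paper's data:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical daily counts of weblog posts matching a keyword such as
# "heavy rain", against the measured daily rainfall (mm) for one region:
posts    = [2, 3, 15, 40, 12, 4, 1, 0, 22, 35]
rainfall = [0, 1, 12, 55, 10, 2, 0, 0, 18, 42]
r = pearson(posts, rainfall)
```

The spatiotemporally normalized variant in the abstract would first divide each count by a normalizer (e.g., the region's total post volume or population density) before computing such correlations at each granularity.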
Abstract: Currently, web usage generates a huge amount of data
from users' activity. In general, a proxy server is a system that
supports users' web access, and the system can be managed using hit
rates. This research tries to improve the hit rates of a proxy
system by applying a data mining technique. The data sets were
collected from proxy servers at the university, and relationships
among several features were investigated. The model is used to
predict future website accesses. The association rule technique is
applied to obtain the relations among Date, Time, Main Group web,
Sub Group web and Domain name for the created model. The results
showed that this technique can predict the web content for the next
day; moreover, the future accesses of websites increased from 38.15%
to 85.57%.
This model can predict web page accesses, which tends to increase
the efficiency of proxy servers as a result. In addition, the
performance of internet access will be improved, helping to reduce
network traffic.
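The association rule step can be sketched with a brute-force miner over small transactions; each transaction below is a hypothetical proxy-log record combining day, time band, web group and domain (the real study's attribute values and thresholds are not given in the abstract):

```python
from collections import Counter
from itertools import combinations

def association_rules(transactions, min_support=0.3, min_conf=0.7):
    """Brute-force single-consequent rules A -> b meeting support/confidence."""
    n = len(transactions)
    counts = Counter()
    for t in transactions:
        items = frozenset(t)
        for size in range(1, len(items) + 1):
            for sub in combinations(sorted(items), size):
                counts[frozenset(sub)] += 1
    rules = []
    for itemset, cnt in counts.items():
        if len(itemset) < 2 or cnt / n < min_support:
            continue
        for b in itemset:                      # try each item as the consequent
            a = itemset - {b}
            conf = cnt / counts[a]             # support(A u b) / support(A)
            if conf >= min_conf:
                rules.append((tuple(sorted(a)), b, cnt / n, conf))
    return rules

# Hypothetical proxy-log transactions: {day, time band, web group, domain}.
logs = [
    {"Mon", "morning", "news", "cnn.com"},
    {"Mon", "morning", "news", "bbc.com"},
    {"Tue", "morning", "news", "cnn.com"},
    {"Wed", "evening", "video", "youtube.com"},
    {"Thu", "morning", "news", "cnn.com"},
]
rules = association_rules(logs, min_support=0.4, min_conf=0.7)
```

A proxy could use rules of the form (morning, news) -> cnn.com to pre-fetch the likely next-day content, which is how mined associations translate into improved hit rates.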