Abstract: Socio-economic variables strongly influence transportation
demand. Discrete choice analyses use socio-economic variables to
study travelers' mode choice and demand; however, calibrating a
discrete choice model requires extensive questionnaire surveys. An
aggregate model is therefore proposed instead. Historical passenger
volumes for high-speed rail and domestic civil aviation are employed
to calibrate and validate the model. In this study, models with
different socio-economic variables, namely oil price, GDP per capita,
CPI and economic growth rate, are compared. The results show that the
model with oil price outperforms the models with the other
socio-economic variables.
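The abstract does not state the aggregate model's functional form. As a minimal illustrative sketch only, calibrating a relation between passenger volume and a single socio-economic variable (say, oil price) could be done with ordinary least squares; the linear form and any data fed to it are assumptions, not the paper's method:

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b * x; returns (a, b).
    Here x could be oil price and y a passenger volume series
    (a hypothetical single-variable aggregate calibration)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # slope = covariance(x, y) / variance(x)
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b
```

Competing calibrations (oil price vs. GDP per capita, CPI, growth rate) could then be compared by their residual errors on held-out years.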
Abstract: The success of an electronic system in a System-on-Chip is highly dependent on the efficiency of its interconnection network, which is constructed from routers and channels (the routers move data across the channels between nodes). Since neither classical bus-based nor point-to-point architectures can provide scalable solutions and satisfy the tight power and performance requirements of future applications, the Network-on-Chip (NoC) approach has recently been proposed as a promising solution. Indeed, in contrast to the traditional solutions, the NoC approach can provide large bandwidth with moderate area overhead. The selected topology of the component interconnects plays a prime role in the performance of a NoC architecture, as do the routing and switching techniques that can be used. In this paper, we present two generic NoC architectures that can be customized to the specific communication needs of an application in order to reduce the area with minimal degradation of the latency of the system. An experimental study is performed to compare these structures with basic NoC topologies represented by the 2D mesh, Butterfly-Fat Tree (BFT) and SPIN. It is shown that the Cluster Mesh (CMesh) and MinRoot schemes achieve significant improvements in network latency and energy consumption with only negligible area overhead and complexity over existing architectures. In fact, compared with the basic NoC topologies, the CMesh and MinRoot schemes provide substantial savings in area as well, because they require fewer routers. The simulation results show that the CMesh and MinRoot networks outperform MESH, BFT and SPIN in the main performance metrics.
Abstract: This article discusses the problem of estimating the
orientation of inclined ground on which a human subject stands based
on information provided by the vestibular system, consisting of the
otolith and semicircular canals. It is assumed that the body segments
are not necessarily aligned and thus form an open kinematic chain.
The semicircular canals, analogous to a technical gyrometer, provide
a measure of the angular velocity, whereas the otolith, analogous to
a technical accelerometer, provides a measure of the translational
acceleration. Two solutions are proposed and discussed. The first is
based on a stand-alone Kalman filter that optimally fuses the two
measurements based on their dynamic characteristics and their noise
properties. In this case, no body dynamic model is needed. In the
second solution, a central extended disturbance observer that
incorporates a body dynamic model (internal model) is employed.
The merits of both solutions are discussed and demonstrated by
experimental and simulation results.
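As an illustration of the first solution, a minimal scalar sketch of such a sensor-fusing Kalman filter can be given, assuming a single tilt angle, white noise, and hand-picked process/measurement variances q and r (none of which are specified in the abstract):

```python
def kalman_tilt(gyro_rates, accel_angles, dt, q=1e-4, r=1e-2):
    """Scalar Kalman filter fusing gyrometer rates (rad/s) with
    accelerometer-derived tilt angles (rad); returns the angle
    estimate after each sample.  q and r are assumed variances."""
    theta, p = accel_angles[0], 1.0        # initialise from the accelerometer
    estimates = []
    for rate, z in zip(gyro_rates, accel_angles):
        theta += rate * dt                 # predict: integrate angular velocity
        p += q                             # ...and grow the state variance
        k = p / (p + r)                    # update: gain set by the two
        theta += k * (z - theta)           #   noise variances
        p *= (1.0 - k)
        estimates.append(theta)
    return estimates
```

The gain k realises exactly the trade-off described in the abstract: the gyrometer drives the short-term dynamics while the accelerometer anchors the long-term estimate, with no body dynamic model required.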
Abstract: Constant amplitude fatigue crack growth (FCG) tests
were performed on dissimilar metal welded plates of Type 316L
Stainless Steel (SS) and IS 2062 Grade A Carbon steel (CS). The
plates were welded by TIG welding using SS E309 as electrode. FCG
tests were carried out on the Side Edge Notch Tension (SENT)
specimens of 5 mm thickness, with crack initiator (notch) at base
metal region (BM), weld metal region (WM) and heat affected zones
(HAZ). The tests were performed at a test frequency of 10 Hz and at
load ratios (R) of 0.1 and 0.6. The FCG rate was found to increase
with stress ratio for the weld metals and base metals, whereas in the
case of the HAZ, FCG rates were almost equal at high ΔK. The FCG rate of the HAZ of
stainless steel was found to be lowest at low and high ΔK. At
intermediate ΔK, the WM showed the lowest FCG rate. The CS showed a
higher crack growth rate at all ΔK; however, the scatter band of the
data was found to be narrow. Fracture toughness (Kc) was found to
vary across the different locations of the weldments: Kc was lowest
for the weldment and highest for the HAZ of the stainless steel. A novel method of
characterizing the FCG behavior using an Infrared thermography
(IRT) camera was attempted. By monitoring the temperature rise at
the fast moving crack tip region, the amount of plastic deformation
was estimated.
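FCG data of this kind are commonly summarized with the Paris law, da/dN = C(ΔK)^m. A minimal sketch of integrating it to estimate crack-growth life follows; the edge-crack geometry factor Y, the constants C and m, and the step size are illustrative assumptions, not values from the paper:

```python
import math

def cycles_to_grow(a0, af, dsigma, C, m, Y=1.12, da=1e-5):
    """Integrate the Paris law da/dN = C * (dK)^m to estimate the
    number of cycles for a crack to grow from a0 to af (metres).
    dK = Y * dsigma * sqrt(pi * a) -- a simple edge-crack geometry."""
    a, cycles = a0, 0.0
    while a < af:
        dK = Y * dsigma * math.sqrt(math.pi * a)   # MPa*sqrt(m)
        dadN = C * dK ** m                          # m/cycle
        cycles += da / dadN                         # cycles spent on this step
        a += da
    return cycles
```

Fitting separate (C, m) pairs per region (BM, WM, HAZ) and per load ratio R is the usual way the curves compared in the abstract would be parameterized.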
Abstract: A Simultaneous Multithreading (SMT) Processor is
capable of executing instructions from multiple threads in the same
cycle. SMT was in fact introduced as a powerful extension of the
superscalar architecture to increase processor throughput.
Simultaneous Multithreading is a technique that permits multiple
instructions from multiple independent applications or threads to
compete for limited resources each cycle. While the fetch unit has been
identified as one of the major bottlenecks of SMT architecture, several
fetch schemes were proposed by prior works to enhance the fetching
efficiency and overall performance.
In this paper, we propose a novel fetch policy called queue situation
identifier (QSI), which counts certain long-latency instructions of
each thread each cycle and then selects which threads to fetch in the
next cycle. Simulation results show that, in the best case, our fetch
policy achieves a 30% speedup and also reduces the level-1 data cache
miss rate.
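The abstract does not spell out the QSI selection rule; one plausible minimal reading, preferring the threads with the fewest outstanding long-latency instructions, can be sketched as:

```python
def select_fetch_threads(long_latency_counts, n_fetch=2):
    """Pick the n_fetch thread ids with the fewest outstanding
    long-latency (e.g. cache-missing) instructions -- a sketch of
    QSI-style fetch prioritisation, with ties broken by thread id."""
    ranked = sorted(range(len(long_latency_counts)),
                    key=lambda t: (long_latency_counts[t], t))
    return ranked[:n_fetch]
```

For example, with per-thread counts [5, 0, 3, 1], threads 1 and 3 would be fetched in the next cycle, keeping the pipeline fed by threads least likely to stall.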
Abstract: A simple network model is developed in OPNET to
study the performance of the Wi-Fi protocol. The model is simulated
in OPNET and performance factors such as load, throughput and delay
are analysed from the model. Four applications, namely Oracle, HTTP,
FTP and voice, are applied over the wireless LAN network to determine
the throughput. The voice application utilises a considerable amount
of bandwidth, up to 5 Mbps; as a result, the 802.11g standard of the
Wi-Fi protocol, which can support a data rate of up to 54 Mbps, was
chosen. Results indicate that when the load in the Wi-Fi network is
increased, the queuing delay on the point-to-point links in the Wi-Fi
network reduces significantly until it is comparable to that of
WiMAX. In conclusion, the queuing delay of the Wi-Fi protocol for the
simulated network model was about 0.00001 s, comparable to WiMAX
network values.
Abstract: The acute toxicity of nano SiO2, ZnO, MCM-41 (mesoporous
silica), Cu, multi-wall carbon nanotubes (MWCNT), single-wall carbon
nanotubes (SWCNT) and coated Fe to the bacterium Vibrio fischeri was
evaluated using a homemade luminometer. The nominal effective
concentrations (EC) causing 20% and 50% inhibition of bioluminescence
were calculated using two mathematical models at exposure times of 5
and 30 minutes. The luminometer was designed with a photomultiplier
tube (PMT) detector. The luminol chemiluminescence reaction was used
to obtain the calibration graph.
In the linear calibration range, the correlation coefficient and
coefficient of variation (CV) were 0.988 and 3.21%, respectively,
demonstrating that the accuracy and reproducibility of the instrument
are suitable. An important part of this research was optimizing the
conditions for maximum bioluminescence. Under optimal conditions, the
Vibrio fischeri culture in liquid media was stirred at 120 rpm at a
temperature of 15 °C to 18 °C and incubated for 24 to 72 hours, while
the solid medium was held at 18 °C for 48 hours. A suspension of ZnO
nanoparticles, after 30 min of contact with Vibrio fischeri, showed
the highest toxicity
while SiO2 nanoparticles showed the lowest toxicity. After 5 min
exposure time, the toxicity of ZnO was the strongest and MCM-41
was the weakest toxicant component.
Abstract: Homogeneous composites of alumina and zirconia with a
small amount of MgO were obtained. High densification (>99%) was
achieved for the ZTA ceramic containing 0.05 wt% MgO at 1500 °C.
Abstract: Working memory (WM) can be defined as the system
which actively holds information in the mind to perform tasks in
spite of distraction. In contrast, short-term memory (STM) is a
system that represents the capacity for the active storage of
information without
distraction. There has been accumulating evidence that these types of
memory are related to higher cognition (HC). The aim of this study
was to verify the relationship between HC and memory (visual STM
and WM, auditory STM and WM). Fifty-nine primary school children were
tested with an intelligence test, mathematical tasks (HC) and memory
subtests. We have shown that visual, but not auditory, memory is a
significant predictor of higher cognition. The relevance of these
results is discussed.
Abstract: The efficiency of an image watermarking technique depends on the preservation of visually significant information. This is attained by embedding the watermark transparently with the maximum possible strength. The current paper presents an approach for still image digital watermarking in which the watermark embedding process employs the wavelet transform and incorporates Human Visual System (HVS) characteristics. The sensitivity of a human observer to contrast with respect to spatial frequency is described by the Contrast Sensitivity Function (CSF). The strength of the watermark within the decomposition subbands, which occupy an interval on the spatial frequencies, is adjusted according to this sensitivity. Moreover, the watermark embedding process is carried out on the subband coefficients that lie on edges, where distortions are less noticeable. The experimental evaluation of the proposed method shows very good results in terms of robustness and transparency.
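A rough illustration of CSF-weighted edge embedding follows; the per-level strengths, edge threshold, and subband layout below are invented for the sketch and are not the paper's actual values:

```python
# Hypothetical per-level embedding strengths, loosely mimicking a CSF:
# mid-frequency subbands tolerate (and hide) the strongest watermark.
CSF_ALPHA = {1: 0.02, 2: 0.08, 3: 0.05}   # level -> strength (assumed)

def embed(subbands, watermark, edge_threshold=10.0):
    """Additively embed a +/-1 watermark sequence into wavelet
    coefficients, scaled per level by a CSF-like weight and applied
    only to large coefficients (which tend to lie on edges)."""
    out, i = {}, 0
    for level, coeffs in subbands.items():
        alpha = CSF_ALPHA[level]
        marked = []
        for c in coeffs:
            if abs(c) >= edge_threshold and i < len(watermark):
                c = c + alpha * abs(c) * watermark[i]  # strength scales with |c|
                i += 1
            marked.append(c)
        out[level] = marked
    return out
```

Small coefficients (smooth regions, where distortion is visible) are left untouched, while strong edge coefficients carry a mark whose amplitude follows the per-level CSF weight.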
Abstract: This research proposes an approach to control or find the trajectory paths of an RRP robot when the prismatic joint malfunctions. For this situation, a minimum-energy dynamic optimization is applied. The RRP robot and similar systems are used in many areas, such as fire-fighter trucks, laboratory equipment and military trucks, for example a rocket launcher. In order to carry on with the assigned task, the trajectory paths must be computed. Here, open-loop control is applied, and the result of an example shows a reasonable solution which can be applied to the controllable system.
Abstract: The biodiversity crisis is one of the many crises that
started at the turn of the millennium. Its concrete form of
expression is still disputed, but there is a relatively high
consensus regarding the high rate of degradation and the urgent need
for action. The strategy
of action outlines a strong economic component, together with the
recognition of market mechanisms as the most effective policies to
protect biodiversity. In this context, biodiversity and ecosystem
services are natural assets that play a key role in economic strategies
and technological development to promote development and
prosperity. Developing and strengthening policies for transition to an
economy based on efficient use of resources is the way forward.
To emphasize the co-viability specific to the economy-ecosystem
services connection, the scientific approach addressed, on one hand,
how to implement policies for nature conservation and, on the other
hand, the concepts underlying the economic expression of ecosystem
services' value in the context of current technology. Following the analysis of
business opportunities associated with changes in ecosystem services,
it was concluded that the development of market mechanisms for nature
conservation is a trend that has become increasingly distinct in
recent years. Although there are still many controversial issues that
have already given rise to an obvious bias, international
organizations and national governments have initiated and
implemented, in cooperation or independently, such mechanisms.
Consequently, they have created the conditions for convergence
between private interests and the social interest in nature
conservation, so there are opportunities for ongoing business
development which leads, among other things, to positive effects on
biodiversity. Finally, it is pointed out that markets fail to
quantify the value of most ecosystem services. Existing price signals
reflect, at best, only a proportion of the total value of the
corresponding provision of food, water or fuel.
Abstract: In this paper we describe authentication for DHCP
(Dynamic Host Configuration Protocol) messages which provides
efficient key management and reduces the danger of replay attacks
without requiring an additional packet. The proposed authentication
for DHCP messages supports mutual authentication and provides both
entity authentication and message authentication. We applied the
authentication for DHCP messages to home network environments and
tested it through a home gateway.
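A simplified sketch of DHCP message authentication with replay protection follows, using HMAC-SHA-256 and an explicit monotonic counter; the paper's actual scheme and key management may differ (RFC 3118, for comparison, uses HMAC-MD5):

```python
import hashlib
import hmac

def auth_option(key, dhcp_payload, replay_counter):
    """Build an authentication token over a DHCP message: a
    monotonically increasing counter defeats replay without any
    extra packet, and the HMAC gives message authentication."""
    counter = replay_counter.to_bytes(8, 'big')
    mac = hmac.new(key, counter + dhcp_payload, hashlib.sha256).digest()
    return counter + mac

def verify(key, dhcp_payload, token, last_counter):
    """Accept only fresh counters with a valid MAC; returns
    (ok, counter_to_remember)."""
    counter = int.from_bytes(token[:8], 'big')
    if counter <= last_counter:            # replayed or stale message
        return False, last_counter
    mac = hmac.new(key, token[:8] + dhcp_payload, hashlib.sha256).digest()
    return hmac.compare_digest(mac, token[8:]), counter
```

Running the exchange in both directions with per-peer keys gives the mutual entity authentication the abstract mentions.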
Abstract: Quantum cryptography offers a way of key agreement,
which is unbreakable by any external adversary. Authentication is
of crucial importance, as perfect secrecy is worthless if the identity
of the addressee cannot be ensured before sending important information.
Message authentication has been studied thoroughly, but no
approach seems to be able to explicitly counter meet-in-the-middle
impersonation attacks. The goal of this paper is the development of
an authentication scheme that is resistant to active adversaries
controlling the communication channel. The scheme is built on top
of a key-establishment protocol and is unconditionally secure if built
upon quantum cryptographic key exchange. In general, the security
is the same as for the key-agreement protocol lying underneath.
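A standard building block for unconditionally secure message authentication on top of QKD-derived keys is Wegman-Carter universal hashing; a minimal polynomial-hash sketch is given below (the field size and message encoding are illustrative, and this is not claimed to be the paper's exact construction):

```python
P = 2 ** 61 - 1   # a Mersenne prime; the hash works over GF(P)

def wc_tag(message_blocks, k, r):
    """Wegman-Carter style tag: evaluate the message as a polynomial
    at the secret point k over GF(P), then one-time-pad with r.
    With k and r drawn fresh from shared (e.g. QKD) key material,
    the forgery probability is bounded by roughly len(message)/P
    regardless of the adversary's computing power."""
    h = 0
    for b in message_blocks:
        h = (h * k + b) % P   # Horner evaluation of the polynomial
    return (h + r) % P
```

Because security rests on the key's secrecy rather than a hardness assumption, the overall scheme inherits the security of the underlying key-agreement protocol, as the abstract states.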
Abstract: Simulations are developed in this paper with the usual DSGE model equations. The model is based on a simplified version of the Smets-Wouters equations in use at the European Central Bank, which involves 10 macroeconomic variables: consumption, investment, wages, inflation, capital stock, interest rates, production, capital accumulation, labour and the credit rate, and allows the banking system to be taken into consideration. Throughout the simulations, this model is used to evaluate the impact of rate shocks reproducing the actions of the European Central Bank during 2008.
Abstract: An adaptive fuzzy PID controller with gain scheduling is proposed in this paper. The structure of the proposed gain-scheduled fuzzy PID (GS_FPID) controller consists of both a fuzzy PI-like controller and a fuzzy PD-like controller. The fuzzy PI-like and PD-like controllers are weighted through adaptive gain scheduling, which is also determined by fuzzy logic inference. A modified genetic algorithm, called the accumulated genetic algorithm, is designed to learn the parameters of the fuzzy inference system. In order to learn the number of fuzzy rules required for the TSK model, the fuzzy rules are learned in an accumulated way. In other words, the parameters learned in the previous rules are accumulated and updated along with the parameters in the current rule. It is shown that the proposed GS_FPID controllers learned by the accumulated GA perform well not only for regular linear systems but also for higher-order and time-delayed systems.
Abstract: Currently, most distance learning courses can only
deliver standard material to students. Students receive course
content passively, which neglects the goal of education: "to suit
the teaching to the ability of students". Providing appropriate
course content according to students' ability is the main goal of
this paper.
Besides offering a series of conventional learning services,
abundant available information, and instant message delivery, a
complete online learning environment should be able to distinguish
between students' abilities and provide learning courses that best
suit them. However, if a distance learning site contains
well-designed course content but fails to provide adaptive courses,
students will gradually lose their interest and confidence in
learning, resulting in ineffective or discontinued learning. In this
paper, an
intelligent tutoring system is proposed and it consists of several
modules working cooperatively in order to build an adaptive learning
environment for distance education. The operation of the system is
based on the results of a Self-Organizing Map (SOM), which divides
students into different groups according to their learning ability
and learning interests, and then provides them with suitable course
content. Accordingly, the problems of information overload and
internet traffic can be alleviated because the amount of traffic
accessing the same content is reduced.
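A toy sketch of the SOM-based grouping step is given below; the map size, learning schedule, and feature encoding are all assumptions, since the abstract does not give the paper's configuration:

```python
import random

def train_som(data, n_units=3, epochs=50, lr=0.5, seed=1):
    """Train a tiny 1-D self-organizing map on student feature
    vectors (e.g. ability/interest scores) and return the unit
    weights plus each sample's best-matching unit -- its 'group'."""
    rnd = random.Random(seed)
    dim = len(data[0])
    w = [[rnd.random() for _ in range(dim)] for _ in range(n_units)]

    def bmu_of(x):
        return min(range(n_units),
                   key=lambda u: sum((w[u][d] - x[d]) ** 2
                                     for d in range(dim)))

    for epoch in range(epochs):
        radius = max(1.0 * (1 - epoch / epochs), 0.01)  # shrinking neighbourhood
        alpha = lr * (1 - epoch / epochs) + 0.01        # decaying learning rate
        for x in data:
            bmu = bmu_of(x)
            for u in range(n_units):
                if abs(u - bmu) <= radius:              # move BMU and neighbours
                    for d in range(dim):
                        w[u][d] += alpha * (x[d] - w[u][d])
    return w, [bmu_of(x) for x in data]
```

Each resulting group would then be mapped to a course-content variant of matching difficulty.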
Abstract: Reachability graph (RG) generation suffers from the
problem of exponential space and time complexity. To alleviate the
more critical problem of time complexity, this paper presents a new
approach to RG generation for Petri net (PN) models of parallel
processes. Independent RGs for each parallel process in the PN
structure are generated in parallel, and the cross-product of these
RGs yields the exhaustive state space from which the RG of the given
parallel system is determined. The complexity analysis of the
presented algorithm shows a significant decrease in the time
complexity cost of RG generation. The proposed technique is
applicable to parallel programs having multiple threads with the
synchronization problem.
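The cross-product step can be sketched as follows for two independent processes, each given as a small labelled reachability graph; synchronization constraints, which would prune this exhaustive space down to the final RG, are omitted from the sketch:

```python
from collections import deque

def cross_product(rg_a, rg_b, init_a, init_b):
    """Interleave two independent reachability graphs (dicts mapping
    a state to its [(transition, next_state), ...] list) into the
    exhaustive state space of the combined parallel system."""
    start = (init_a, init_b)
    graph, seen, todo = {}, {start}, deque([start])
    while todo:
        sa, sb = todo.popleft()
        succs = []
        for t, na in rg_a.get(sa, []):   # process A fires, B stays put
            succs.append((t, (na, sb)))
        for t, nb in rg_b.get(sb, []):   # process B fires, A stays put
            succs.append((t, (sa, nb)))
        graph[(sa, sb)] = succs
        for _, s in succs:
            if s not in seen:
                seen.add(s)
                todo.append(s)
    return graph
```

Since each component RG is generated independently, the expensive part parallelises, which is the source of the time-complexity saving the abstract claims.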
Abstract: This study focuses on the emission of black carbon (BC)
from open field burning of corn residues. Real-time BC
concentration was measured with a Micro Aethalometer in field
burning and simulated open burning in a chamber (SOC)
experiments. The average concentration of BC was 1.18±0.47 mg/m3
in the field and 0.89±0.63 mg/m3 in the SOC. The emission factor
deduced from the field experiments was 0.50±0.20 g BC/kg dm, and
0.56±0.33 g BC/kg dm from the SOC experiments, which are in good
agreement with other studies. In 2007, the total burned area of the
corn crop was 8,000 ha, resulting in an emission load of 20 tons of
BC, corresponding to 44.5 million kg of CO2 equivalent. Therefore,
the control of open burning in corn fields represents a significant
global warming reduction option.
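The emission-load arithmetic can be reproduced as follows; note that the fuel load (5 t dry matter/ha) and the BC global-warming potential (2225) used here are back-calculated assumptions chosen so the numbers come out consistent with the abstract, not values the abstract states:

```python
def bc_emission_load(area_ha, fuel_load_t_per_ha, ef_g_per_kg, gwp_bc):
    """Scale a per-kilogram BC emission factor up to a regional
    emission load and its CO2 equivalent."""
    dry_matter_kg = area_ha * fuel_load_t_per_ha * 1000.0  # burned biomass
    bc_kg = dry_matter_kg * ef_g_per_kg / 1000.0           # g/kg -> kg
    co2e_kg = bc_kg * gwp_bc                               # CO2 equivalent
    return bc_kg, co2e_kg
```

With the assumed inputs, 8,000 ha at 5 t dm/ha and an EF of 0.50 g BC/kg dm gives 20,000 kg (20 tons) of BC, and a GWP of 2225 yields the 44.5 million kg CO2-equivalent figure.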
Abstract: Fingerprint-based identification is a well-known
biometric system in the area of pattern recognition and has always
been under study because of its important role in forensic science,
where it can help the government criminal justice community. In this
paper, we propose a framework for identifying individuals by
means of fingerprints. Unlike most conventional fingerprint
identification frameworks, the extracted geometrical element
features (GEFs) go through a discretization process. The intention
of discretization in this study is to obtain individually unique
features that reflect individual variance in order to discriminate
one person from another. Previously, discretization has been shown
to give particularly efficient identification on English
handwriting, with an accuracy of 99.9%, and on the discrimination of
twins' handwriting, with an accuracy of 98%. Due to its high
discriminative power, this method is adopted into this framework as
an independent method for assessing the accuracy of fingerprint
identification.
Finally, the experimental results show that the identification
accuracy of the proposed system using discretization is 100% for
FVC2000, 93% for FVC2002 and 89.7% for FVC2004, which is much better
than that of the conventional, existing fingerprint identification
systems (72% for FVC2000, 26% for FVC2002 and 32.8% for FVC2004).
The results indicate that the discretization approach boosts the
classification effectively, and may therefore also be suitable for
other biometric features besides handwriting and fingerprints.
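The abstract does not define the discretization rule applied to the GEFs; one common scheme, equal-width binning of each continuous feature over its observed range, can be sketched as:

```python
def discretize(features, n_bins=8):
    """Map each continuous feature to an equal-width bin index over
    its observed range -- one common discretization scheme (the
    paper's exact rule is not specified in the abstract)."""
    cols = list(zip(*features))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    out = []
    for row in features:
        binned = []
        for v, l, h in zip(row, lo, hi):
            if h == l:                              # constant feature
                binned.append(0)
            else:
                b = int((v - l) / (h - l) * n_bins)
                binned.append(min(b, n_bins - 1))   # clamp the max value
        out.append(binned)
    return out
```

The discrete codes can then be matched exactly, which is what makes the binning step so discriminative when the bins are tuned to individual variance.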