Abstract: An embedded hardware simulator is a valuable computer-aided
tool for embedded application development. This paper focuses
on the ARM926EJ-S MMU, builds state transition models and
formally verifies critical properties of the models. The state transition
models include an instruction-loading model, a data-reading model, and
a data-writing model. The properties of the models are described in the
CTL specification language and are verified in VIS. The results
obtained in VIS demonstrate that the critical properties of the MMU are
satisfied in the state transition models. The verified models can be
used to implement the MMU component in our simulator. At the
end of the paper, experimental results show that the MMU
successfully accomplishes memory access requests from the CPU.
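Properties of this kind are expressed as CTL formulas over the model's state variables. Purely for illustration (the signal names and the property below are assumptions, not the paper's actual specification), a liveness requirement that every TLB miss eventually completes a page-table walk might be written in VIS-style CTL as:

```
AG(tlb_miss=1 -> AF(walk_done=1));
```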
Abstract: An analysis of the conventional blood pressure estimation
method using an oscillometric sphygmomanometer was performed
through a computer simulation using an arterial pressure-volume
(APV) model. Traditionally, the maximum amplitude algorithm (MAA)
is applied to the oscillation waveforms of the APV model to obtain the
mean arterial pressure and the characteristic ratio. The estimation of
mean arterial pressure and characteristic ratio was significantly
affected by the shape of the blood pressure waveforms and the cutoff
frequency of the high-pass filter (HPF) circuitry. Experimental errors
arise from these effects when estimating blood pressure. To find an
algorithm independent of the influence of waveform shapes and HPF
parameters, the volume oscillation of the APV model and the phase
shift of the oscillation, obtained with the fast Fourier transform (FFT),
were examined while increasing the cuff pressure from 1 mmHg to
200 mmHg (1 mmHg per second). A phase shift in the volume
oscillation was then observed only between the systolic and diastolic
blood pressures. The same results were also obtained from
simulations performed on two different arterial blood pressure
waveforms and one hyperthermia waveform.
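The phase of a cuff-pressure oscillation at a known frequency can be extracted with the FFT roughly as follows; the sampling rate, test frequency, and signals here are illustrative assumptions, not the APV model's output.

```python
import numpy as np

def oscillation_phase(signal, fs, f0):
    """Return the phase (radians) of the component of `signal`
    closest to frequency f0, using the FFT (fs = sampling rate)."""
    n = len(signal)
    spectrum = np.fft.rfft(signal * np.hanning(n))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    k = np.argmin(np.abs(freqs - f0))   # nearest FFT bin to f0
    return np.angle(spectrum[k])

# Example: two 1 Hz sinusoids shifted by 90 degrees
fs = 100.0
t = np.arange(0, 10, 1.0 / fs)
a = np.sin(2 * np.pi * 1.0 * t)
b = np.sin(2 * np.pi * 1.0 * t + np.pi / 2)
shift = oscillation_phase(b, fs, 1.0) - oscillation_phase(a, fs, 1.0)
```

Differencing the phases of two signals at the same bin cancels the window's own phase contribution, so `shift` recovers the relative phase.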
Abstract: This work studied the batch biosorption of Pb(II)
ions from aqueous solution by Luffa charcoal. The effects of operating
parameters such as adsorption contact time, initial solution pH, and
initial Pb(II) concentration on the sorption of Pb(II) were
investigated. The results showed that the adsorption of Pb(II) ions
was initially rapid, and the equilibrium time was 10 h. The adsorption
kinetics of Pb(II) ions onto Luffa charcoal were best described by
the pseudo-second-order model. A pH of 5.0 was favorable for the
adsorption and removal of Pb(II) ions. The Freundlich adsorption
isotherm model fitted the adsorption of Pb(II) ions better than the
Langmuir and Temkin isotherms. The highest monolayer
adsorption capacity obtained from the Langmuir isotherm model was
51.02 mg/g. This study demonstrated that Luffa charcoal could be
used for the removal of Pb(II) ions in water treatment.
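The two isotherm fits can be reproduced from equilibrium data via their standard linearised forms; the Ce/qe values below are hypothetical illustrations, not the study's measurements.

```python
import numpy as np

# Hypothetical equilibrium data: Ce (mg/L), qe (mg/g) — assumptions only.
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
qe = np.array([12.0, 19.0, 28.0, 38.0, 49.0])

# Langmuir (linearised): Ce/qe = Ce/qm + 1/(qm*KL)
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qm = 1.0 / slope            # monolayer capacity (mg/g)
KL = slope / intercept      # Langmuir constant (L/mg)

# Freundlich (linearised): ln qe = ln KF + (1/n) * ln Ce
f_slope, f_intercept = np.polyfit(np.log(Ce), np.log(qe), 1)
KF, n = np.exp(f_intercept), 1.0 / f_slope
```

Comparing the correlation coefficients of the two linear fits is the usual way to decide which isotherm describes the data better.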
Abstract: This paper reports the feasibility of the ARMA model
to describe a bursty video source transmitting over an AAL5 ATM link
(VBR traffic). The traffic represents the activity of the action movie
"Lethal Weapon 3" transmitted over the ATM network using the Fore
System AVA-200 ATM video codec with a peak rate of 100 Mbps
and a frame rate of 25. The model parameters were estimated for a
single video source and independently multiplexed video sources. It
was found that the ARMA(2, 4) model is well suited to the real data
in terms of average rate traffic profile, probability density function,
autocorrelation function, burstiness measure, and the pole-zero
distribution of the filter model.
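An ARMA(p, q) process of the kind fitted here can be simulated directly from its defining recursion; the coefficients below are illustrative placeholders, not the estimates reported in the paper.

```python
import numpy as np

def simulate_arma(phi, theta, n, sigma=1.0, seed=0):
    """Simulate ARMA(p, q): x[t] = sum(phi_i * x[t-i]) + e[t] + sum(theta_j * e[t-j])."""
    rng = np.random.default_rng(seed)
    p, q = len(phi), len(theta)
    e = rng.normal(0.0, sigma, n + q)   # white-noise innovations
    x = np.zeros(n + p)
    for t in range(p, n + p):
        te = t - p + q                  # index of current shock in e
        ar = sum(phi[i] * x[t - 1 - i] for i in range(p))
        ma = e[te] + sum(theta[j] * e[te - 1 - j] for j in range(q))
        x[t] = ar + ma
    return x[p:]

# Placeholder ARMA(2, 4) coefficients (stationary AR part)
x = simulate_arma(phi=[0.5, 0.2], theta=[0.3, 0.1, 0.05, 0.02], n=5000)
```

The sample autocorrelation and empirical density of such a synthetic trace are what one compares against the real VBR traffic profile.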
Abstract: This paper presents the combination of different precipitation data sets with a distributed hydrological model, in order to examine the flood runoff reproducibility of sparsely gauged catchments. The precipitation data sets were obtained from rain-gauge observations, satellite-based estimates (TRMM), and a numerical weather prediction (NWP) model, and were then coupled with the super tank model. The case study was conducted in three basins (small, medium, and large) located in Central Vietnam. Calculated hydrographs based on ground-observed rainfall showed the best fit to measured stream flow, while those obtained from TRMM and NWP showed high uncertainty in peak discharges. However, calculated hydrographs using the adjusted rain field depicted a promising alternative for the application of TRMM and NWP in flood modeling for sparsely gauged catchments, especially for the extension of forecast lead time.
Abstract: A mathematical model for determining the overall efficiency
of a multistage tractor gearbox, including all gear, lubricant, and
surface-finish-related parameters and operating conditions, is
presented. Sliding friction, rolling friction and windage losses were
considered as the main sources of power loss in the gearing system. A
computer code in FORTRAN was developed to simulate the model.
Sliding friction contributes about 98% of the total power loss for
gear trains operating at relatively low speeds (less than 2000 rpm
input speed). Rolling frictional losses decrease with increased load
while windage losses are only significant for gears running at very
high speeds (greater than 3000 rpm). The results also showed that the
overall efficiency varies over the path of contact of the gear meshes,
ranging from 94% to 99.5%.
Abstract: This paper investigates the indices of a creative city in
Isfahan. Its main aim is to evaluate the quantitative status of the
creative-city indices in Isfahan and to analyze the dispersion and
distribution of these indices across the city. To this end, the study
analyzes the creative-city indices in the fifteen areas of Isfahan using
secondary data, a questionnaire, the TOPSIS model, Shannon entropy,
and SPSS. On this basis, the fifteen areas of Isfahan have been
ranked on 12 factors of creative-city indices. The results show
that the fifteen areas of Isfahan do not benefit equally from
the creative indices and that there are large differences between the
areas of the city.
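The ranking machinery described above (Shannon-entropy weighting followed by TOPSIS) can be sketched as follows; the decision matrix, the number of areas, and the indicator values are illustrative assumptions, not the study's data.

```python
import numpy as np

# Hypothetical decision matrix: rows = urban areas, columns = creative-city
# indicators (all treated as benefit criteria); values are illustrative.
X = np.array([[7., 5., 9.],
              [6., 8., 4.],
              [9., 6., 7.],
              [4., 7., 5.]])

# Shannon entropy weights: low-entropy (discriminating) criteria weigh more
P = X / X.sum(axis=0)
k = 1.0 / np.log(X.shape[0])
entropy = -k * (P * np.log(P)).sum(axis=0)
w = (1 - entropy) / (1 - entropy).sum()

# TOPSIS: distance to ideal and anti-ideal alternatives
R = X / np.sqrt((X ** 2).sum(axis=0))   # vector normalisation
V = R * w
ideal, anti = V.max(axis=0), V.min(axis=0)
d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
score = d_minus / (d_plus + d_minus)    # closeness coefficient in (0, 1)
ranking = np.argsort(-score)            # best area first
```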
Abstract: This paper presents a physics-based model for
high-voltage fast-recovery diodes. The model provides a good
trade-off between reverse recovery time and forward voltage drop
realized through a combination of lifetime control and emitter
efficiency reduction techniques. The minority carrier lifetime can be
extracted from the reverse recovery transient response and forward
characteristics. This paper also shows that decreasing the amount of
excess carriers stored in the drift region results in softer recovery
characteristics, which can be achieved using a lower doping level. The
developed model is verified by experiment and the measurement data
agrees well with the model.
Abstract: Professional development is the focus of this study. It
reports on questionnaire data that examined the perceived
effectiveness of the Train the Trainer model of technology
professional development for elementary teachers. Eighty-three
selected teachers, called Information Technology Coaches, received
four half-day sessions and one after-school in-service session. Subsequently,
coaches shared the information and skills acquired during training
with colleagues. Results indicated that participants felt comfortable
as Information Technology Coaches and felt well prepared because
of their technological professional development. Overall, participants
perceived the Train the Trainer model to be effective. The outcomes
of this study suggest that the use of the Train the Trainer model, a
known professional development model, can be an integral and
interdependent component of the newer, more comprehensive
learning-community professional development model.
Abstract: Linear induction motors are used in various industries
but they exhibit some specific phenomena that cause
problems. The most important of these is called the end effect.
The end effect decreases efficiency, power factor, and output force, and
unbalances the phase currents. This phenomenon is more pronounced
in medium- and high-speed machines. In this paper a factor, EEF, is
obtained by an accurate equivalent circuit model, to determine the
end effect intensity. In this way, all design parameters affecting the
end effect are described. The accuracy of this equivalent circuit model is
evaluated by two-dimensional finite-element analysis using ANSYS.
The results show the accuracy of the equivalent circuit model.
Abstract: In a metal forming process, the friction between the
material and the tools influences the process by modifying the stress
distribution of the workpiece. This frictional behaviour is often taken
into account by using a constant coefficient of friction in the finite
element simulations of sheet metal forming processes. However,
the friction coefficient varies in time and space with many parameters.
The Stribeck friction model is investigated in this study to predict
the springback behaviour of AA6061-T4 sheets during the V-bending
process. The coefficient of friction in the Stribeck curve depends on
sliding velocity and contact pressure. The plane-strain bending
process is simulated in ABAQUS/Standard. We compared the
computed punch load-stroke curves and springback obtained with a
constant coefficient of friction and with the defined friction model. The
results clearly showed that the new friction model provides better
agreement between experiments and results of numerical simulations.
The influence of the friction models on the stress distribution in the
workpiece is also studied numerically.
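The shape of a Stribeck curve can be sketched in a few lines. The functional form below (exponential decay from static to Coulomb friction plus a viscous term) is one common parameterisation; all parameter values are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

def stribeck_mu(v, mu_s=0.15, mu_c=0.08, v_s=0.05, delta=2.0, k_v=0.02):
    """Generic Stribeck friction curve: decays from the static level mu_s
    to the Coulomb level mu_c with sliding speed, plus a viscous term.
    All parameter values are illustrative assumptions."""
    v = np.abs(v)
    return mu_c + (mu_s - mu_c) * np.exp(-(v / v_s) ** delta) + k_v * v

v = np.linspace(0.0, 1.0, 200)   # sliding velocity (m/s)
mu = stribeck_mu(v)
```

The characteristic dip (boundary, mixed, then hydrodynamic lubrication) is what distinguishes this model from a constant coefficient of friction in the simulation.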
Abstract: Smoke discharge is a main cause of the air pollution
problem from industrial plants. The obstruction caused by a building
affects the dispersion of discharged air pollutants. In this research, a mathematical
model of the smoke dispersion from two sources and one source with
a structural obstacle is considered. The governing equation of the
model is an isothermal mass transfer model in a viscous fluid. The
finite element method is used to approximate the solutions of the
model. Linear triangular elements have been used for discretising
the domain, and time integration has been carried out by a semi-implicit
finite difference method. The simulations of smoke dispersion in
cases of one chimney and two chimneys are presented. The maximum
calculated smoke concentrations of both cases are compared and then
used to support decisions on smoke discharge and air pollution
control in industrial areas.
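The semi-implicit time integration described above can be illustrated on a one-dimensional advection-diffusion analogue of the mass-transfer equation (diffusion treated implicitly, advection explicitly). A finite-difference grid stands in for the paper's triangular finite elements, and all parameter values are assumptions.

```python
import numpy as np

# 1D advection-diffusion sketch: dc/dt + u*dc/dx = D*d2c/dx2.
# Semi-implicit step: diffusion implicit, advection explicit.
nx, L = 50, 1.0
dx = L / (nx - 1)
u, D, dt = 0.5, 0.01, 0.005               # illustrative parameters
x = np.linspace(0, L, nx)
c = np.exp(-((x - 0.2) / 0.05) ** 2)      # initial smoke plume

# Implicit diffusion matrix (Dirichlet c = 0 at both ends)
beta = D * dt / dx ** 2
A = np.eye(nx) * (1 + 2 * beta)
for i in range(1, nx - 1):
    A[i, i - 1] = A[i, i + 1] = -beta
A[0, :], A[-1, :] = 0, 0
A[0, 0] = A[-1, -1] = 1.0

for _ in range(50):
    rhs = c - u * dt * np.gradient(c, dx)  # explicit advection
    rhs[0] = rhs[-1] = 0.0
    c = np.linalg.solve(A, rhs)            # implicit diffusion solve
```

The plume drifts downstream while its peak concentration decays, which is the quantity compared between the one- and two-chimney cases.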
Abstract: Run-off is considered an important hydrological factor in feasibility studies of river engineering and irrigation-related projects under arid and semi-arid conditions. Flood control is one of the crucial factors: its management, while mitigating destructive consequences, abstracts a considerable volume of renewable water resources. The methodology applied here was based on Mizumura, who applied a mathematical simple-tank model to simulate the rainfall-run-off process in a particular water basin using data from the observational hydrograph. The model was applied in the Dez River water basin adjacent to the Greater Dezful region, Iran, in order to simulate and estimate floods. Results indicated that the hydrographs calculated using the simple tank method and the SCS-CN model were in close proximity to the observed hydrographs. It was also found that, on average, the flood times and discharge peaks from the simple tank were closer to the observational data than those from the CN method. On the other hand, the flood volume calculated with the CN model was significantly closer to the observational data than that of the simple tank model.
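A single linear tank of the kind used in such rainfall-run-off simulations can be sketched in a few lines; the outflow coefficient and the storm series are illustrative assumptions, not the Dez basin calibration.

```python
import numpy as np

def simple_tank(rainfall, k=0.3, s0=0.0):
    """Single linear tank: rainfall fills storage s, and the outlet
    releases q = k * s each time step. k and s0 are assumptions."""
    s, out = s0, []
    for p in rainfall:
        s += p          # rainfall adds to storage
        q = k * s       # linear outlet discharge
        s -= q
        out.append(q)
    return np.array(out)

# Hypothetical storm: 5 mm/h for 4 hours, then 20 dry hours
rain = [5.0] * 4 + [0.0] * 20
q = simple_tank(rain)
```

The outflow series rises during the storm, peaks at its end, and then recedes geometrically, which is the hydrograph shape compared against observations.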
Abstract: An adaptive backstepping controller for the inverted pendulum is designed by using the general motion control model. Backstepping is a nonlinear control technique based on the Lyapunov design approach, used when higher derivatives of parameter estimates appear. For easy parameter adaptation, the mathematical model of the inverted pendulum is converted into the motion control model. This conversion is performed by taking functions of the unknown parameters and dynamics of the system. By using the motion control model equations, the inverted pendulum is simulated without any information about either the parameters or the measurable dynamics. These results are also compared with the adaptive backstepping controller extended with integral action given in [1].
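The backstepping procedure (choose a virtual control for the first state, then cancel the resulting error dynamics with the actual input) can be illustrated on a double integrator, a simpler stand-in for the pendulum's motion control model; the gains and initial state are arbitrary assumptions.

```python
# Backstepping for a double integrator x1' = x2, x2' = u.
# z1 = x1, virtual control alpha = -k1*z1, z2 = x2 - alpha,
# Lyapunov V = (z1^2 + z2^2)/2 gives u = -z1 - k2*z2 + alpha_dot.
k1, k2 = 2.0, 2.0          # illustrative gains
x1, x2 = 1.0, 0.0          # initial state
dt = 0.001
for _ in range(10000):     # 10 s of Euler simulation
    z1 = x1
    alpha = -k1 * z1       # virtual control for x2
    z2 = x2 - alpha
    u = -z1 - k2 * z2 - k1 * x2   # -z1 - k2*z2 + alpha_dot
    x1 += dt * x2
    x2 += dt * u
```

With these gains the closed-loop error dynamics are linear and stable, so both states converge to zero.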
Abstract: We depend upon explanation in order to “make sense”
of our world. And making sense is all the more important when
dealing with change. But, what happens if our explanations are
wrong? This question is examined with respect to two types of
explanatory model. Models based on labels and categories we shall
refer to as “representations.” More complex models involving
stories, multiple algorithms, rules of thumb, questions, and ambiguity we
shall refer to as “compressions.” Both compressions and
representations are reductions. But representations are far more
reductive than compressions. Representations can be treated as a set
of defined meanings – coherence with regard to a representation is
the degree of fidelity between the item in question and the definition
of the representation, of the label. By contrast, compressions contain
enough degrees of freedom and ambiguity to allow us to make
internal predictions so that we may determine our potential actions in
the possibility space. Compressions are explanatory via mechanism.
Representations are explanatory via category. Managers often
confuse their evocation of a representation (category inclusion) with
the creation of a context of compression (description of mechanism).
When this type of explanatory error occurs, more errors follow. In
the drive for efficiency, such substitutions are all too often proclaimed
– at the manager's peril.
Abstract: This study analyzes the characteristics determining
members' willingness to invest in cooperatives, using an ordered logit
model. The data were collected in a field survey among 122
cooperative members in north-central China. The descriptive analysis
of the survey evidence suggests that cooperatives in China generally
have a poor ability to deliver processing services related to
product packaging, grading, and storage; perform poorly in
profitability; and are unable to provide returns to capital or obtain
agricultural loans. The regression results demonstrate that members'
farm size, their satisfaction with cooperative price-preferential
services, and their attitudes toward cooperative operational scale and
development potential have a statistically significant impact on
willingness to invest.
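An ordered logit model maps a member's linear index x'b to probabilities over ordered response categories through a set of increasing cutpoints. A minimal sketch, with a hypothetical linear index and illustrative cutpoints:

```python
import math

def ordered_logit_probs(xb, cutpoints):
    """Category probabilities for an ordered logit model:
    P(y <= j) = logistic(c_j - x'b). Cutpoints must be increasing."""
    logistic = lambda z: 1.0 / (1.0 + math.exp(-z))
    cum = [logistic(c - xb) for c in cutpoints] + [1.0]
    probs = [cum[0]] + [cum[j] - cum[j - 1] for j in range(1, len(cum))]
    return probs

# Hypothetical member with linear index x'b = 0.8 and three cutpoints,
# giving four ordered willingness-to-invest categories.
p = ordered_logit_probs(0.8, cutpoints=[-1.0, 0.5, 2.0])
```

In estimation, the coefficients and cutpoints are chosen to maximise the likelihood of the observed category for each of the 122 members.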
Abstract: This paper describes the development of a numerical finite element algorithm used for the analysis of reinforced concrete structures equipped with a seismic energy-absorbing damper device and subjected to earthquake excitation. For this purpose a finite element program code for the analysis of reinforced concrete frame buildings was developed. The performance of the developed program code is evaluated by analyzing a model of a reinforced concrete frame building. The results show that using a damper device as a seismic energy dissipation system can effectively reduce the structural response of a framed structure during an earthquake.
Abstract: The fault-proneness of a software module is the
probability that the module contains faults. To predict the fault-proneness
of modules, different techniques have been proposed, including
statistical methods, machine learning techniques, neural
network techniques, and clustering techniques. The aim of the proposed
study is to explore whether metrics available in the early lifecycle
(i.e. requirement metrics), metrics available in the late lifecycle (i.e.
code metrics), and the combination of the two can be used to identify
fault-prone modules using a Genetic Algorithm technique. This approach has been
tested with real defect datasets from NASA software projects written
in the C programming language. The results show that the fusion of
requirement and code metrics gives the best prediction model for
detecting faults, compared with the commonly used code-based
model.
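A Genetic Algorithm for selecting a metric subset can be sketched as a bit-mask search; the "useful metric" target, the fitness function, and all GA settings below are toy assumptions used only to illustrate the mechanics (selection, crossover, mutation), not the study's setup.

```python
import numpy as np

rng = np.random.default_rng(42)
n_metrics = 8
# Hypothetical ground-truth set of useful metrics (unknown to the GA)
target = np.array([1, 0, 1, 1, 0, 0, 1, 0])

def fitness(mask):
    # Toy fitness: agreement with the useful-metric set; in practice this
    # would be a classifier's fault-prediction accuracy on the mask.
    return int((mask == target).sum())

pop = rng.integers(0, 2, size=(20, n_metrics))
for _ in range(40):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(-scores)][:10]          # truncation selection
    cut = rng.integers(1, n_metrics, size=10)
    kids = np.array([np.concatenate([parents[i % 10][:c],
                                     parents[(i + 1) % 10][c:]])
                     for i, c in enumerate(cut)])    # one-point crossover
    flip = rng.random(kids.shape) < 0.05             # bit-flip mutation
    kids[flip] ^= 1
    pop = np.vstack([parents, kids])                 # elitist replacement

best = pop[np.argmax([fitness(ind) for ind in pop])]
```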
Abstract: The increasing volume of information on the
Internet creates an increasing need to develop new (semi-)automatic
methods for retrieving documents and ranking them according to
their relevance to the user query. In this paper, after a brief review
on ranking models, a new ontology-based approach for ranking
HTML documents is proposed and evaluated in various
circumstances. Our approach is a combination of conceptual,
statistical and linguistic methods. This combination preserves the
precision of ranking without losing speed. Our approach
exploits natural language processing techniques for extracting
phrases and stemming words. Then an ontology-based conceptual
method will be used to annotate documents and expand the query.
To expand a query, the spread activation algorithm is improved so
that the expansion can be done in various aspects. The annotated
documents and the expanded query will be processed to compute
the relevance degree exploiting statistical methods. The outstanding
features of our approach are (1) combining conceptual, statistical
and linguistic features of documents, (2) expanding the query with
its related concepts before comparing to documents, (3) extracting
and using both words and phrases to compute relevance degree, (4)
improving the spread activation algorithm to do the expansion based
on weighted combination of different conceptual relationships and
(5) allowing variable document vector dimensions. A ranking
system called ORank is developed to implement and test the
proposed model. The test results are included at the end of the
paper.
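A weighted spreading-activation step of the kind the improved algorithm builds on can be sketched as follows; the concept graph, relation weights, and decay factor are illustrative assumptions, not ORank's actual ontology.

```python
# Minimal weighted spreading activation over a concept graph.
# Edges carry relation weights; activation decays as it propagates.
graph = {
    "car": [("vehicle", 0.9), ("engine", 0.7)],
    "vehicle": [("transport", 0.8)],
    "engine": [("fuel", 0.6)],
    "transport": [],
    "fuel": [],
}

def spread(seeds, graph, decay=0.5, steps=2):
    """Propagate activation from seed concepts along weighted relations,
    keeping the strongest activation reaching each concept."""
    act = dict(seeds)
    for _ in range(steps):
        new = dict(act)
        for node, a in act.items():
            for nbr, w in graph.get(node, []):
                new[nbr] = max(new.get(nbr, 0.0), a * w * decay)
        act = new
    return act

# Expand a single-term query "car" into related concepts
expanded = spread({"car": 1.0}, graph)
```

The expanded, weighted concept set is then matched against annotated documents when computing relevance degrees.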
Abstract: In this communication a quantitative modeling
approach is applied to construct a model for the exchange of gases
from an open sewer channel to the atmosphere. Data on the gas
exchange of the open sewer channel from January 1979 to December
2006 are utilized for the construction of the model. The study reveals
that the stream flow of the open sewer channel exchanges toxic gases
continuously on a time-varying scale. We find that the quantitative
modeling approach yields a more parsimonious model for these
exchanges. The usual diagnostic tests are applied to confirm model
adequacy. This model is beneficial for planning and managerial
bodies seeking to improve implemented policies to overcome future
environmental problems.