Abstract: Droughts are complex natural hazards that, to varying
degrees, affect some part of the world every year. The range of
drought impacts is related to drought occurring in different stages
of the hydrological cycle, and different types of drought,
such as meteorological, agricultural, hydrological, and socioeconomic,
are usually distinguished. Streamflow drought was analyzed by
the truncation level method (at the 70% level) on daily discharges
measured at 54 hydrometric stations in southwestern Iran. Frequency
analysis was carried out for the annual maximum series (AMS) of
drought deficit volume and duration. Factors including
physiographic, climatic, geologic, and vegetation cover characteristics
were studied as influential factors in the regional analysis. According
to the results of factor analysis, the six most effective factors were
identified as area, rainfall from December to February, the percent of
area with Normalized Difference Vegetation Index (NDVI)
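The truncation-level analysis described above can be sketched as follows. The synthetic discharge series is illustrative, not the stations' data; Q70 is taken as the discharge exceeded 70% of the time, i.e. the 30th percentile of the daily flows.

```python
import numpy as np

def drought_events(q, threshold):
    """Scan a daily discharge series and return (duration, deficit volume)
    pairs for each run of days below the truncation level."""
    events, dur, vol = [], 0, 0.0
    for value in q:
        if value < threshold:
            dur += 1
            vol += threshold - value
        elif dur > 0:
            events.append((dur, vol))
            dur, vol = 0, 0.0
    if dur > 0:
        events.append((dur, vol))
    return events

# Synthetic one-year daily discharge series (m^3/s); real input would be
# the records of the 54 hydrometric stations.
rng = np.random.default_rng(0)
q = rng.gamma(shape=2.0, scale=10.0, size=365)

# Q70 truncation level: the discharge exceeded 70% of the time,
# i.e. the 30th percentile of the daily flows.
q70 = np.percentile(q, 30)
events = drought_events(q, q70)

# Annual maxima of deficit duration and volume feed the AMS
# frequency analysis.
max_duration = max(d for d, _ in events)
max_deficit = max(v for _, v in events)
```

Collecting such annual maxima over many years and stations gives the series on which the frequency analysis is carried out.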
Abstract: Four design alternatives for lateral force-resisting
systems of tall buildings in Dubai, UAE are presented. Quantitative
comparisons between the different designs are also made. This paper
is intended to provide different feasible lateral systems to be used in
Dubai in light of the available seismic hazard studies of the UAE.
The different lateral systems are chosen in conformance with the
International Building Code (IBC). Moreover, the expected behavior
of each system is highlighted and light is shed on some of the cost
implications associated with lateral system selection.
Abstract: Software projects are very dynamic and require
recurring adjustments of their project plans. These adjustments can be
understood as reconfigurations of the schedule, the resource
allocation, and other design elements. Yet, during the planning and
execution of a software project, the integration of the project's
specific activities with the activities that take part in the organization's
common activity flow should be considered. This article presents the
results of a systematic review of aspects related to the dynamic
reconfiguration of software projects, emphasizing the integration of
project management with organizational flows. A series of studies
from the year 2000 to the present was analyzed. The results of this
work show that there is a diversity of techniques and strategies for
the dynamic reconfiguration of software projects. However, few
approaches consider the integration of software project activities with
the activities that take part in the organization's common workflow.
Abstract: The Korean government has applied preliminary feasibility studies to new and large R&D programs since 2008. The study is carried out from the viewpoints of technology, policy, and economics. The separate analyses are then integrated to arrive at a definite result: whether a program is feasible or infeasible. This paper describes the concept and method of the feasibility analysis, focusing on technological viability assessment for the technical analysis. It consists of technology trend assessment and technology level assessment. Through the analysis, we can determine the chance of schedule delay or cost overrun occurring in the proposed plan.
Abstract: Urban road network traffic has become one of the
most studied research topics in the last decades. This is mainly due to
the enlargement of the cities and the growing number of motor
vehicles traveling in this road network. One of the most sensitive
problems is to verify if the network is congestion-free. Another
related problem is the automatic reconfiguration of the network
without building new roads to alleviate congestion. These problems
require an accurate model of the traffic to determine the steady state
of the system. An alternative is to simulate the traffic to see if there
are congestions and when and where they occur. One key issue is to
find an adequate model for road intersections. Once the model is
established, either a large-scale model is built, or the intersection is
represented by its performance measures and simulated for analysis.
In both cases, it is important to find an appropriate queueing model to
represent the road intersection. In this paper, we propose to model the road
intersection as a BCMP queueing network and we compare this
analytical model against a simulation model for validation.
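As a small, self-contained illustration of validating an analytical queueing result against simulation, the sketch below uses a single M/M/1 queue rather than the multi-class BCMP network studied in the paper; the arrival and service rates are illustrative assumptions.

```python
import random

def mm1_mean_sojourn(lam, mu, n=200_000, seed=1):
    """Estimate the mean sojourn time of an M/M/1 queue by simulation,
    using Lindley's recursion for the waiting time of successive customers."""
    rnd = random.Random(seed)
    wait, total = 0.0, 0.0
    for _ in range(n):
        service = rnd.expovariate(mu)
        total += wait + service             # sojourn = waiting + service
        interarrival = rnd.expovariate(lam)
        wait = max(0.0, wait + service - interarrival)
    return total / n

lam, mu = 0.6, 1.0                          # arrival and service rates
analytic = 1.0 / (mu - lam)                 # M/M/1 mean sojourn time
simulated = mm1_mean_sojourn(lam, mu)
```

The closed-form value 1/(mu - lam) and the simulated estimate should agree closely; this is the kind of cross-check the abstract describes between the BCMP model and the simulation model.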
Abstract: A novel typical day prediction model has been built and validated against measured data from a grid-connected solar photovoltaic (PV) system in Macau. Unlike the conventional statistical method used in previous studies of PV systems, which obtains results by averaging nearby continuous points, the present typical day statistical method obtains the value at every minute of a typical day by averaging discontinuous points at the same minute on different days. This typical day statistical method, based on discontinuous point averaging, makes it possible to obtain Gaussian-shaped dynamical distributions of solar irradiance and output power for a yearly or monthly typical day. Based on the yearly typical day statistical analysis results, the maximum possible accumulated output energy in a year under on-site climate conditions and the corresponding optimal PV system running time are obtained. Periodic Gaussian-shaped prediction models for solar irradiance, output energy, and system energy efficiency have been built, and their coefficients have been determined from the yearly, maximum, and minimum monthly typical day Gaussian distribution parameters, which are obtained by iterating to a minimum Root Mean Squared Deviation (RMSD). With the present model, the dynamical effects due to the time of day are kept, and the day-to-day uncertainty due to changing weather is smoothed but still included. The periodic Gaussian-shaped correlations for solar irradiance, output power, and system energy efficiency compare favorably with data from the PV system in Macau and prove to be an improvement over previous models.
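The same-minute averaging idea can be sketched as follows on synthetic data; the Gaussian daily shape, noise level, and 30-day sample are illustrative assumptions, not the Macau measurements, and the crude parameter estimates stand in for the paper's minimum-RMSD iteration.

```python
import numpy as np

minutes = np.arange(24 * 60)

def gaussian(t, peak, center, width):
    """Bell-shaped daily profile of the kind fitted in the typical day model."""
    return peak * np.exp(-((t - center) ** 2) / (2.0 * width ** 2))

# Synthetic "measured" irradiance for 30 days: a Gaussian daily shape plus
# day-to-day weather noise (all numbers are illustrative).
rng = np.random.default_rng(42)
clear_sky = gaussian(minutes, 800.0, 12 * 60, 150.0)   # W/m^2, peak at noon
days = clear_sky + rng.normal(0.0, 50.0, size=(30, minutes.size))

# Typical day statistic: average the SAME minute across different days
# (discontinuous points), not neighbouring minutes within one day.
typical_day = days.mean(axis=0)

# Crude Gaussian parameter estimates read off the typical day curve.
peak_est = typical_day.max()
center_est = minutes[typical_day.argmax()]
```

Averaging across days at a fixed minute preserves the time-of-day dynamics while smoothing the day-to-day weather variation, which is the key distinction the abstract draws.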
Abstract: The peculiarities of the nanoscale structure-phase
states formed after electroexplosive carburizing and subsequent
electron-beam treatment of technically pure titanium surfaces in different regimes are established by
transmission electron diffraction microscopy, and the physical mechanisms are discussed. Electroexplosive carburizing leads to the formation of a surface layer
(40 μm thick) with microhardness increased 3.5-fold. It consists of β-titanium, graphite (monocrystals 100-150 nm,
polycrystals 5-10 nm, amorphous particles 3-5 nm), TiC (5-10 nm), and β-TiO2 (2-20 nm). After electron-beam treatment, which further increases the microhardness, the surface layer consists of TiC.
Abstract: The purpose of this study was to develop and examine a
Teaching Commitment Scale of Health and Physical Education
(TCS-HPE) for Taiwanese elementary school teachers. First, an
original scale with 40 items was developed based on teaching
commitment theory and the related literature; then both stratified
random sampling and cluster sampling were used to sample participants.
During the first stage, 300 teachers were sampled and 251 valid scales
(83.7%) were returned. The data were then analyzed by exploratory
factor analysis, which accounted for 74.30% of the total variance,
supporting construct validity. The Cronbach's alpha coefficient of the
full scale was 0.94, and the subscale coefficients were between 0.80
and 0.96. In the second stage, 400 teachers were sampled and 318
valid scales (79.5%) were returned. Finally, this study used
confirmatory factor analysis to test the validity and reliability of the
TCS-HPE. The results showed that the fit indices reached
acceptable criteria (χ²(246) = 557.64, p
Abstract: Repetitive systems are systems that perform a simple
task repetitively on a fixed pattern; they are widespread in
industrial fields. Hence, many researchers have been
interested in those systems, especially in the field of iterative learning
control (ILC). In this paper, we propose a finite-horizon tracking
control scheme for linear time-varying repetitive systems with uncertain
initial conditions. The scheme is derived both analytically
and numerically for state-feedback systems and only numerically for
output-feedback systems. Then, it is extended to stable systems with
input constraints. All numerical schemes are developed in the forms
of linear matrix inequalities (LMIs). A distinguishing feature of the
proposed scheme, compared with existing iterative learning control, is that
the scheme guarantees the tracking performance exactly even under
uncertain initial conditions. The simulation results demonstrate the
good performance of the proposed scheme.
Abstract: An empirical study of web applications that use
software frameworks is presented here. The analysis is based on two
approaches. In the first, developers using such frameworks are
required, based on their experience, to assign weights to parameters
such as database connection. In the second approach, a performance
testing tool, OpenSTA, is used to compute start time and other such
measures. From such an analysis, it is concluded that open source
software is superior to proprietary software. The motivation behind
this research is to examine ways in which a quantitative assessment
can be made of software in general and frameworks in particular.
Concepts such as metrics and architectural styles are discussed along
with previously published research.
Abstract: Some meta-schedulers query the information system of individual supercomputers in order to submit jobs to the least busy supercomputer on a computational Grid. However, this information can become outdated by the time a job starts due to changes in scheduling priorities. The MSR scheme is based on Multiple Simultaneous Requests and can take advantage of opportunities resulting from these priority changes. This paper presents the SWARM meta-scheduler, which can speed up the execution of large sets of tasks by minimizing the job queuing time through the submission of multiple requests. Performance tests have shown that this new meta-scheduler is faster than an implementation of the MSR scheme and the gLite meta-scheduler. SWARM has been used through the GridQTL project beta-testing portal during the past year. Statistics are provided for this usage and demonstrate its capacity to reliably achieve a substantial reduction in execution time under production conditions.
Abstract: In this paper, we propose a single-sample-path-based
algorithm with state aggregation to optimize the average reward of
singularly perturbed Markov reward processes (SPMRPs) with
large-scale state spaces. It is assumed that such a reward process
depends on a set of parameters. Unlike other kinds of
Markov chains, SPMRPs have their own hierarchical structure. Based
on this special structure, our algorithm can alleviate the computational
load of the performance optimization. Moreover, our method can be
applied online because it evolves with the simulated sample path.
Compared with the original algorithm for general MRPs, a new
gradient formula for the average reward performance metric of
SPMRPs is introduced, which is proved in the Appendix. Based on
these gradients, the schedule of the iterative algorithm, which relies
on a single sample path, is presented. A special case in which the
parameters only affect the disturbance matrices is then analyzed, and
a precise comparison is made between our algorithm and the existing
ones aimed at solving these problems for general Markov reward
processes. When applied to SPMRPs, our method converges faster in
these cases. Furthermore, to illustrate the practical value of
SPMRPs, a simple example of multiprogramming in computer
systems is presented and simulated. In correspondence with this
practical model, the physical meaning of SPMRPs in networks of
queues is clarified.
Abstract: Plasmodium vivax malaria differs from P. falciparum malaria in that a person suffering from a P. vivax infection can suffer relapses of the disease. This is due to the parasite being able to remain dormant in the liver of the patient, from where it can re-infect the patient after a passage of time. During this stage, the patient is classified as being in the dormant class. The model describing the transmission of P. vivax malaria consists of a human population divided into four classes: the susceptible, the infected, the dormant, and the recovered. The effect of a time delay on the transmission of this disease is studied. The time delay is the period in which the P. vivax parasite develops inside the mosquito (vector) before the vector becomes infectious (i.e., able to pass on the infection). We analyze our model using standard dynamical modeling methods. Two stable equilibrium states, a disease-free state E0 and an endemic state E1, are found to be possible. It is found that the E0 state is stable when a newly defined basic reproduction number G is less than one. If G is greater than one, the endemic state E1 is stable. The conditions for the endemic equilibrium state E1 to be a stable spiral node are established. For realistic values of the parameters in the model, it is found that solutions in phase space are trajectories spiraling into the endemic state. It is shown that limit cycle and chaotic behaviors can only be achieved with unrealistic parameter values.
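A minimal numerical sketch of a four-compartment model of this type (susceptible, infected, dormant, recovered) is given below, using simple Euler integration; the rates are illustrative assumptions, not the paper's fitted parameters, and the mosquito incubation delay is omitted for brevity.

```python
# Illustrative rates (per day) for an S-I-D-R model of P. vivax
# transmission; these are NOT the paper's parameter values.
beta = 0.5    # transmission rate
gamma = 0.1   # rate of leaving the infected class
p = 0.4       # fraction of leaving infected who become dormant
alpha = 0.05  # relapse rate from the dormant (liver-stage) class
omega = 0.02  # rate at which recovered individuals lose immunity

def step(s, i, d, r, dt):
    """One Euler step; the four population fractions always sum to one."""
    ds = -beta * s * i + omega * r
    di = beta * s * i + alpha * d - gamma * i
    dd = p * gamma * i - alpha * d
    dr = (1.0 - p) * gamma * i - omega * r
    return s + dt * ds, i + dt * di, d + dt * dd, r + dt * dr

# Start near the disease-free state and integrate toward equilibrium.
s, i, d, r = 0.99, 0.01, 0.0, 0.0
for _ in range(200_000):
    s, i, d, r = step(s, i, d, r, 0.1)
```

With beta/gamma greater than one here, the trajectory settles toward an endemic state, mirroring the stability of E1 when the reproduction number G exceeds one.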
Abstract: In this work a surgical simulator is produced which
enables a training otologist to conduct a virtual, real-time prosthetic
insertion. The simulator provides the Ear, Nose and Throat surgeon
with real-time visual and haptic responses during virtual cochlear
implantation into a 3D model of the human Scala Tympani (ST). The
parametric model is derived from measured data as published in the
literature and accounts for human morphological variance, such as
differences in cochlear shape, enabling patient-specific pre-operative
assessment. Haptic modeling techniques use real physical data and
insertion force measurements to develop a force model which
mimics the physical behavior of an implant as it collides with the ST
walls during an insertion. Output force profiles are acquired from the
insertion studies conducted in this work to validate the haptic model.
The simulator provides the user with real-time, quantitative insertion
force information and the associated electrode position as the user inserts the
virtual implant into the ST model. The information provided by this
study may also be of use to implant manufacturers for design
enhancements as well as for training specialists in optimal force
administration, using the simulator. The paper reports on the methods
for anatomical modeling and haptic algorithm development, with
focus on simulator design, development, optimization and validation.
The techniques may be transferrable to other medical applications
that involve prosthetic device insertions where user vision is
obstructed.
Abstract: The oil and gas industry has moved towards Load and
Resistance Factor Design through API RP2A - LRFD and the
recently published international standard, ISO-19902, for design of
fixed steel offshore structures. The ISO 19902 is intended to provide
a harmonized design practice that offers a balanced structural fitness
for the purpose, economy and safety. As part of an ongoing work, the
reliability analysis of tubular joints of the jacket structure has been
carried out to calibrate the load and resistance factors for the design
of offshore platforms in Malaysia, as proposed in the ISO.
Probabilistic models have been established for the load effects (wave,
wind and current) and the tubular joints strengths. In this study the
First Order Reliability Method (FORM), coded in MATLAB, has been
employed to evaluate the reliability index of typical joints designed
using API RP2A - WSD and ISO 19902.
Abstract: Dr. Eliyahu Goldratt did the pioneering work in
the development of the Theory of Constraints. Since then, many more
researchers around the globe have been working to enhance this body of
knowledge. In this paper, an attempt has been made to compile the
salient features of this theory from the work done by Goldratt and
other researchers. This paper will provide a good starting point for
potential researchers interested in working on the Theory of Constraints.
The paper will also help practicing managers by clarifying their
concepts of the theory and will facilitate its successful
implementation in their working areas.
Abstract: In this paper, we propose a novel improvement to the generalized Lloyd algorithm (GLA). Our algorithm makes use of an M-tree index built on the codebook, which makes it possible to reduce the number of distance computations when the nearest codewords are searched. Our method does not impose the use of any specific distance function, but works with any metric distance, making it more general than many other fast GLA variants. Finally, we present the positive results of our performance experiments.
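For reference, the baseline being accelerated can be sketched as a plain GLA iteration; the Euclidean distance, two-cluster data, and initialization scheme below are illustrative assumptions, and the exhaustive nearest-codeword search marked in the comments is the step an M-tree index would prune.

```python
import numpy as np

def gla(points, k, iters=20, seed=0):
    """Plain generalized Lloyd algorithm (GLA) for codebook design.

    The dominant cost is the exhaustive nearest-codeword search in each
    iteration; this is the step the M-tree index prunes, and the pruning
    works for any metric distance, not just the Euclidean one used here.
    """
    rng = np.random.default_rng(seed)
    codebook = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # Exhaustive nearest-codeword search (what the M-tree accelerates).
        dists = np.linalg.norm(points[:, None, :] - codebook[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Centroid update: each codeword becomes the mean of its cell.
        for j in range(k):
            cell = points[labels == j]
            if len(cell):
                codebook[j] = cell.mean(axis=0)
    return codebook, labels

# Two well-separated illustrative clusters; the codebook should settle
# with one codeword per cluster.
rng = np.random.default_rng(1)
points = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
                    rng.normal(10.0, 0.1, (20, 2))])
codebook, labels = gla(points, k=2)
```

Each iteration performs a full pass of distance computations between every training point and every codeword, which is exactly the cost that grows with codebook size and that an index over the codebook can reduce.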
Abstract: A satellite is being integrated and tested by BISEE (Beijing Institute of Spacecraft Environment Engineering). This paper describes the infrared lamp array simulation technology used for satellite thermal balance and thermal vacuum tests. These tests were performed in the KM6 space environment simulator in Beijing, China. New software and hardware developed by BISEE, along with enhanced heat flux uniformity, allowed the thermal balance and thermal vacuum tests to be accomplished well. The flux uniformity of the lamp array satisfied the test requirement. A monitored background radiometer offered reliable heat flux measurements with remarkable repeatability. The simulation software supplied accurate thermal flux distribution predictions.
Abstract: The present paper proposes high performance nonlinear
force controllers for a servopneumatic real-time fatigue test
machine. A CompactRIO® controller, fully programmed in the
LabVIEW language, was used. Fuzzy logic control
algorithms were evaluated to tune the integral and derivative
components in the development of hybrid controllers, namely FLC
P and hybrid FLC PID real-time controllers. Their
behaviours were described using state diagrams. The main
contribution is to ensure a smooth transition between control states,
avoiding discrete transitions in controller outputs. Steady-state errors
lower than 1.5 N were reached, without retuning the controllers.
Good results were also obtained for sinusoidal tracking tasks from
1/π to 8/π Hz.
Abstract: Glaucoma diagnosis involves extracting three features
of the fundus image: the optic cup, the optic disc, and the vasculature.
At present, manual diagnosis is expensive, tedious, and time consuming.
A number of studies have been conducted to automate this process.
However, the variability between the diagnostic capability of an
automated system and that of an ophthalmologist has yet to be established.
This paper discusses the efficiency of, and the variability between,
ophthalmologist opinion and a digital technique, thresholding. The
efficiency and variability measures are based on image quality
grading: poor, satisfactory, or good. The images are separated into
four channels: gray, red, green, and blue. A scientific investigation
was conducted with three ophthalmologists, who graded the images
based on image quality. The images are thresholded using
multi-thresholding and graded in the same way as by the
ophthalmologists. A comparison of the grades from the
ophthalmologists and from thresholding is made. The results show
that there is a small variability between the results of the
ophthalmologists and the digital thresholding.
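As a sketch of the kind of thresholding step involved, the function below computes a single Otsu threshold on one grayscale channel; the paper's multi-thresholding and its quality-grading protocol are more involved than this illustration.

```python
import numpy as np

def otsu_threshold(image):
    """Return the Otsu threshold (0-255) of an 8-bit grayscale image array
    by maximizing the between-class variance over all candidate cuts."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()   # class weights
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0      # background mean
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1  # foreground mean
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t
```

Applied per channel (gray, red, green, blue), such thresholds separate bright structures like the optic disc from the background, after which the segmented image can be graded for quality.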