Abstract: This research was conducted to determine the responses of chickpeas to drought in different periods (early period, late period, no irrigation, and two irrigations as control). The trial was laid out in a randomized complete block design with three replications in 2010 and 2011 in Konya, Turkey. The genotypes consisted of seven ICARDA lines, two certified lines, and one local population. Averaged over years and genotypes, the early-period stress treatment gave the highest seed yield (207.47 kg da-1), followed by the control (202.33 kg da-1), the late-period treatment (144.64 kg da-1), and the non-irrigated treatment (106.93 kg da-1). The genotypes were strongly affected by drought, and the lowest seed yield was obtained from the non-irrigated plots. Averaged over years and stress treatments, the highest yield (196.01 kg da-1) was obtained from genotype 22255. The yield variation can be attributed to the different responses of the genotypes to drought.
Abstract: This paper presents an experimental case using sensory thermography to describe temperature behavior at the median nerve after a repetitive-motion activity. Thermography is a noninvasive technique that poses no biological hazard and has been applied in many experiments to search for temperature patterns that help explain diseases such as cancer and cumulative trauma disorders (CTDs). An infrared sensory thermography technology was developed to carry out this study. Three women in good physical condition, two right-handed and one left-handed, were selected for repetitive-motion tests over four days; sensory thermography sensors were placed over the median nerve at both wrists to take measurements. The evaluation lasted 3 hours 30 minutes in a temperature-controlled environment, with 20 minutes of stabilization at the beginning and end of the operation. The temperature distributions were statistically evaluated and showed similar patterns.
Abstract: This paper introduces a technique for simulating a single-server exponential queuing system. The technique, called the Q-Simulator, is a computer program that simulates the effect of traffic intensity on all system average quantities, given the arrival and/or service rates. The Q-Simulator has three phases, namely the formula-based method, the uncontrolled simulation, and the controlled simulation. It generates graphs (crystal solutions) for all results of the simulation or calculation and can be used to estimate desirable average quantities such as waiting times, queue lengths, etc.
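The Q-Simulator itself is not given in implementation detail; as a sketch only, the formula-based phase and an uncontrolled simulation of an M/M/1 queue (the single-server exponential system above) might look like the following, where both function names are hypothetical:

```python
import random

def mm1_formulas(lam, mu):
    """Formula-based phase: closed-form steady-state M/M/1 averages."""
    rho = lam / mu                  # traffic intensity, must be < 1
    W = 1.0 / (mu - lam)            # mean time in system
    Wq = rho / (mu - lam)           # mean waiting time in queue
    return {"rho": rho, "W": W, "Wq": Wq, "L": lam * W, "Lq": lam * Wq}

def mm1_simulate(lam, mu, n_customers=200_000, seed=1):
    """Simulation phase: estimate the mean queueing wait via the Lindley
    recursion W(n+1) = max(0, W(n) + S(n) - A(n+1))."""
    rng = random.Random(seed)
    wq, total = 0.0, 0.0
    for _ in range(n_customers):
        total += wq                           # record this customer's wait
        service = rng.expovariate(mu)
        interarrival = rng.expovariate(lam)
        wq = max(0.0, wq + service - interarrival)
    return total / n_customers
```

Comparing the simulated estimate against the closed-form value (e.g. Wq = 4 for lam = 0.8, mu = 1) is the kind of cross-check a controlled simulation phase could automate.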
Abstract: We propose a new fiber lens structure for large distance
measurement in which a polymer layer is added to a conventional
fiber lens. The proposed fiber lens can adjust the working distance by
properly choosing the refractive index and thickness of the polymer
layer. In our numerical analysis, for a fiber lens radius of 120 μm, the working distance of the proposed fiber lens is about 10 mm, roughly 30 times larger than that of a conventional fiber lens.
Abstract: This paper presents parametric probability density models for call holding times (CHTs) in an emergency call center, based on actual data collected over more than a week in the public Emergency Information Network (EIN) in Mongolia. When candidate distributions from the Gamma family are fitted to the call holding time data, the empirical CHT histogram is underestimated over its whole area because of spikes of higher probability and long tails of lower probability. Therefore, we provide a parametric model based on a mixture of lognormal distributions, with explicit analytical expressions, for modeling the CHTs of PSNs. Finally, we show that the CHTs for PSNs are fitted reasonably well by a mixture of lognormal distributions via the expectation-maximization (EM) algorithm. This result is significant in that it provides a useful mathematical tool, a mixture of lognormal distributions, in an explicit analytical form.
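The paper's exact model is not reproduced here, but the core idea, fitting a lognormal mixture by running Gaussian-mixture EM on log-transformed data (the log of a lognormal variate is normal), can be sketched as follows; the quantile initialisation and fixed iteration count are assumptions of this illustration:

```python
import math

def em_lognormal_mixture(samples, k=2, iters=100):
    """Fit a k-component lognormal mixture by Gaussian-mixture EM on the
    log-transformed data (the log of a lognormal variate is normal)."""
    x = sorted(math.log(s) for s in samples)
    n = len(x)
    # quantile-based initialisation of the component log-means
    mu = [x[(2 * j + 1) * n // (2 * k)] for j in range(k)]
    mean = sum(x) / n
    var = [sum((xi - mean) ** 2 for xi in x) / n] * k
    w = [1.0 / k] * k
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        R = [[0.0] * k for _ in range(n)]
        for i, xi in enumerate(x):
            p = [w[j] / math.sqrt(2 * math.pi * var[j]) *
                 math.exp(-(xi - mu[j]) ** 2 / (2 * var[j])) for j in range(k)]
            s = sum(p) or 1e-300
            for j in range(k):
                R[i][j] = p[j] / s
        # M-step: re-estimate weights, means and variances
        for j in range(k):
            nj = sum(R[i][j] for i in range(n))
            w[j] = nj / n
            mu[j] = sum(R[i][j] * x[i] for i in range(n)) / nj
            var[j] = max(sum(R[i][j] * (x[i] - mu[j]) ** 2
                             for i in range(n)) / nj, 1e-6)
    return w, mu, var  # mixture weights, log-means, log-variances
```

With well-separated components, the recovered log-means converge close to the generating parameters.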
Abstract: The effect of different tempering temperatures and heat treatment times on the corrosion resistance of austenitic stainless steels in oxalic acid was studied in this work using conventional weight loss and electrochemical measurements. Typical 304 and 316 stainless steel samples were tempered at 150 °C, 250 °C and 350 °C after being austenitized at 1050 °C for 10 minutes. These samples were then immersed in 1.0 M oxalic acid and their weight losses were measured every five days for 30 days. The results show that corrosion of both types of ASS samples increased with tempering temperature and time, owing to the precipitation of chromium carbides at the grain boundaries of these metals. Electrochemical results also confirm that 304 ASS is more susceptible to corrosion than 316 ASS in this medium, which is attributed to the molybdenum in the composition of the latter. The metallographic images of these samples showed a non-uniform distribution of precipitated chromium carbides at the grain boundaries, as well as unevenly distributed carbides and retained austenite phases, which cause galvanic effects in the medium.
Abstract: Experience indicates that one of the most prominent reasons some ERP implementations fail is the selection of an improper ERP package. Among the important factors resulting in inappropriate ERP selections, one is ignoring the preliminary activities that should be carried out before the evaluation of ERP packages. Another factor yielding unsuitable selections is that organizations often employ prolonged and costly selection processes, to such an extent that sometimes the process is never finalized, or the evaluation team performs many key final activities incompletely or inaccurately owing to exhaustion, lack of interest or out-of-date data. In this paper, a systematic approach for choosing an ERP package is introduced that recommends activities to be done before and after the main selection phase. The proposed approach also incorporates ideas that accelerate the selection process while reducing the probability of an erroneous final selection.
Abstract: Existing work in temporal logic on representing the execution of infinitely many transactions uses linear-time temporal logic (LTL) and models only two-step transactions. In this paper, we use the comparatively efficient branching-time computational tree logic CTL and extend the transaction model to a class of multi-step transactions by introducing distinguished propositional variables to represent the read and write steps of n multi-step transactions accessing m data items infinitely many times. We prove that the well-known correspondence between acyclicity of conflict graphs and serializability for finite schedules extends to infinite schedules. Furthermore, in the case of transactions accessing the same set of data items in (possibly) different orders, serializability corresponds to the absence of cycles of length two. This result is used to give an efficient encoding of the serializability condition into CTL.
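For finite schedules, the acyclicity test mentioned above can be stated directly; the following sketch checks conflict-serializability of a schedule of read/write steps (the tuple encoding of steps is an assumption of this illustration, not the paper's CTL encoding):

```python
def conflict_graph(schedule):
    """Build conflict-graph edges from a schedule given as a list of
    (transaction, action, item) steps, with action in {'r', 'w'}.
    Two steps conflict if they come from different transactions, touch
    the same item, and at least one of them is a write."""
    edges = set()
    for i, (t1, a1, x1) in enumerate(schedule):
        for t2, a2, x2 in schedule[i + 1:]:
            if t1 != t2 and x1 == x2 and 'w' in (a1, a2):
                edges.add((t1, t2))
    return edges

def is_serializable(schedule):
    """Conflict-serializable iff the conflict graph is acyclic (DFS check)."""
    adj = {}
    for u, v in conflict_graph(schedule):
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set())
    state = {u: 0 for u in adj}  # 0 = unvisited, 1 = on stack, 2 = done
    def dfs(u):
        state[u] = 1
        for v in adj[u]:
            if state[v] == 1 or (state[v] == 0 and dfs(v)):
                return True  # back edge found: cycle
        state[u] = 2
        return False
    return not any(state[u] == 0 and dfs(u) for u in adj)
```

In the special case the abstract highlights, transactions accessing the same items in different orders, a violation shows up as a cycle of length two, i.e. a mutual pair of conflict edges.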
Abstract: Purpose: Planning and dosimetry of different VMAT algorithms (SmartArc, Ergo++, Autobeam) are compared with IMRT for head and neck cancer patients. Modelling was performed to rule out the causes of discrepancies between planned and delivered dose. Methods: Five HNC patients previously treated with IMRT were re-planned with SmartArc (SA), Ergo++ and Autobeam. Plans were compared with each other and against IMRT, and evaluated using DVHs for PTVs and OARs, delivery time, monitor units (MU) and dosimetric accuracy. Modelling of control point (CP) spacing, leaf-end separation and MLC/aperture shape was performed to rule out causes of discrepancies between planned and delivered doses. Additionally, estimated arc delivery times, overall plan generation times and the effect of CP spacing and number of arcs on plan generation times were recorded. Results: Single-arc SmartArc plans (SA4d) were generally better than IMRT and double-arc plans (SA2Arcs) in terms of homogeneity and target coverage. Double-arc plans seemed to play a positive role in achieving an improved conformity index (CI) and better sparing of some organs at risk (OARs) compared to step-and-shoot IMRT (ss-IMRT) and SA4d. Overall, Ergo++ plans achieved the best CI for both PTVs. Dosimetric validation of all VMAT plans without modelling was found to be lower than for ss-IMRT. Total MUs required for delivery were on average 19%, 30%, 10.6% and 6.5% lower than ss-IMRT for SA4d, SA2d (single arc with 2° gantry spacing), SA2Arcs and Autobeam plans respectively. Autobeam was the most efficient in terms of actual treatment delivery times, whereas Ergo++ plans took the longest to deliver. Conclusion: Overall, SA single-arc plans on average achieved the best target coverage and homogeneity for both PTVs. SA2Arc plans showed improved CI and sparing of some OARs. Very good dosimetric results were achieved with modelling. Ergo++ plans achieved the best CI. Autobeam resulted in the fastest treatment delivery times.
Abstract: The traditional method for essential oil extraction from agarwood (Aquilaria crassna) is to soak it in water followed by hydrodistillation. The effects of various agarwood pretreatments (ethanol, acid, alkaline, enzymes, and ultrasound) and of subcritical water extraction (SWE) were studied and compared with the traditional method. The major components of agarwood oil from hydrodistillation were aroma compounds as follows: aristol-9-en-8-one (21.53%), selina-3,7(11)-diene (12.96%), τ-himachalene (9.28%), β-guaiene (5.79%), hexadecanoic acid (4.90%) and guaia-3,9-diene (4.21%), whereas agarwood oil from the ethanol and ultrasound pretreatments and from SWE consisted mainly of fatty acid compounds. Extraction of agarwood oil using these pretreatments could improve the oil yield up to 2 times that of the traditional method. The sample pretreated with diluted acid (H2SO4) at pH 4 gave components quite similar to those of the traditional method. Therefore, the enhancement of essential oil extraction from agarwood depends on the type of oil required, which in turn determines the extraction method.
Abstract: In this paper a new method for increasing the speed of the SAGCM-APD is proposed. Utilizing carrier rate equations in the different regions of the structure, a circuit model of the structure is obtained. In this research, in addition to the frequency response, the effect of the added charge layer on transient parameters such as slew rate and rise and fall times has been considered. Finally, by trading off physical parameters such as the widths and dopings of the different layers, a noticeable decrease in breakdown voltage has been achieved. The simulation results illustrate the improvements of the proposed structure in comparison with conventional SAGCM-APD structures.
Abstract: Bangalore City faces an acute atmospheric pollution problem due to the heavy increase in traffic and developmental activities in recent years. The present study attempts to assess the trend of ambient air quality at three stations, viz., AMCO Batteries Factory, Mysore Road; Graphite India Factory, KHB Industrial Area, Whitefield; and Ananda Rao Circle, Gandhinagar, with respect to some of the major criteria pollutants, namely suspended particulate matter (SPM), oxides of nitrogen (NOx), and sulphur dioxide (SO2). The sites are representative of the various kinds of growth (commercial, residential and industrial) prevailing in Bangalore, which contribute to air pollution. The concentration of SO2 at all locations showed a falling trend, due to the use of refined petrol and diesel in recent years. The concentration of NOx showed an increasing trend but remained within the permissible limits. The concentration of SPM showed a mixed trend. The correlation between modelled and observed values is found to vary from 0.4 to 0.7 for SO2, 0.45 to 0.65 for NOx and 0.4 to 0.6 for SPM. About 80% of the data fall within the error band of ±50%. Forecast tests for the best-fit models showed the same trend as the actual values in most cases. However, the deviation observed in a few cases could be attributed to changes in the quality of petroleum products, increases in traffic volume, the introduction of LPG as a fuel in many types of automobiles, poor road conditions, prevailing meteorological conditions, etc.
Abstract: The paper considers a single-server queue with fixed-size batch Poisson arrivals and exponential service times, a model that is useful for a buffer that accepts messages arriving as fixed-size batches of packets and releases them one packet at a time. Transient performance measures for queues have long been recognized as complementary to steady-state analysis. The focus of the paper is on the use of the functions that arise in the analysis of the transient behaviour of the queuing system. The paper exploits practical modelling to obtain a solution to the integral equation encountered in the analysis. The results obtained indicate that under heavy-load conditions there is a significant disparity between the transient and steady-state statistics.
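As a hedged illustration of the model (not the paper's analytical solution), the transient mean number of packets in the system under fixed-size batch Poisson arrivals and exponential single-packet service can be estimated by Monte-Carlo simulation:

```python
import random

def batch_mm1_transient(lam, mu, batch, probe_times=(1.0, 5.0, 20.0),
                        runs=2000, seed=7):
    """Estimate E[N(t)], the mean number of packets in the system at each
    probe time, for batch Poisson arrivals (batch rate lam, fixed batch
    size) and exponential service (rate mu), starting from an empty buffer."""
    rng = random.Random(seed)
    means = [0.0] * len(probe_times)
    for _ in range(runs):
        n = 0
        next_arrival = rng.expovariate(lam)
        next_departure = float("inf")
        idx = 0
        while idx < len(probe_times):
            nxt = min(next_arrival, next_departure)
            # record N(t) at every probe time that falls before the next event
            while idx < len(probe_times) and probe_times[idx] < nxt:
                means[idx] += n
                idx += 1
            if next_arrival <= next_departure:
                n += batch                       # whole batch joins at once
                if next_departure == float("inf"):
                    next_departure = next_arrival + rng.expovariate(mu)
                next_arrival += rng.expovariate(lam)
            else:
                n -= 1                           # one packet leaves at a time
                next_departure = (next_departure + rng.expovariate(mu)
                                  if n else float("inf"))
    return [m / runs for m in means]
```

Starting from an empty buffer, the estimates at early probe times fall well below the long-run level, which is exactly the transient/steady-state disparity the abstract points to under heavy load.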
Abstract: The goal of this project is to design a system to recognize voice commands. Most voice recognition systems contain two main modules: feature extraction and feature matching. In this project, the MFCC algorithm is used for the feature extraction module; using this algorithm, the cepstral coefficients are calculated on the mel frequency scale. The vector quantization (VQ) method is used to reduce the amount of data and decrease computation time. In the feature matching stage, Euclidean distance is applied as the similarity criterion. Owing to the accuracy of the algorithms used, the accuracy of this voice command system is high: with at least five repetitions of each command in a single training session, and two repetitions in each testing session, a zero error rate in command recognition is achieved.
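No implementation is given in the abstract; the feature-matching stage described, Euclidean distance against per-command VQ codebooks, might be sketched as follows (the command names and 2-D code vectors in the usage below are hypothetical stand-ins for trained MFCC codebooks):

```python
import math

def distortion(frames, codebook):
    """Average Euclidean distance from each feature frame to its nearest
    code vector in the codebook (the VQ matching criterion)."""
    return sum(min(math.dist(f, c) for c in codebook)
               for f in frames) / len(frames)

def recognize(frames, codebooks):
    """Return the command whose codebook yields the lowest distortion
    over the utterance's feature frames."""
    return min(codebooks, key=lambda cmd: distortion(frames, codebooks[cmd]))
```

For example, with codebooks {"on": [(0.0, 0.0), (1.0, 1.0)], "off": [(5.0, 5.0), (6.0, 6.0)]}, an utterance whose frames lie near the origin is matched to "on".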
Abstract: The main aim of this research is to investigate a novel technique for implementing a more natural and intelligent conversation system. Conversation systems are designed to converse like a human as far as their intelligence allows; sometimes we can think of them as the embodiment of Turing's vision. Such a system usually returns a predetermined answer in a predetermined order, but real conversations abound with uncertainties of various kinds. This research focuses on an integrated natural language processing approach, which includes an integrated knowledge-base construction module, a conversation understanding and generation module, and a state manager module. We discuss the effectiveness of this approach based on an experiment.
Abstract: Modeling of a heterogeneous industrial fixed-bed reactor for the selective dehydrogenation of heavy paraffins over a Pt-Sn-Al2O3 catalyst is the subject of the current study. By applying mass and momentum balances to an appropriate reactor element and using pressure drop, rate and deactivation equations, a detailed model of the reactor has been obtained. Mass balance equations have been written for five different components. In order to estimate reactor production over time, the reactor model, a set of partial differential equations, ordinary differential equations and algebraic equations, has been solved numerically. The mole percentages of paraffins, olefins, dienes, aromatics and hydrogen as functions of time and reactor radius have been found by numerical solution of the model. The model results have been compared with industrial reactor data at different operating times, and the comparison successfully confirms the validity of the proposed model.
Abstract: Supply chain networks are frequently hit by unplanned events which lead to disruptions and cause operational and financial consequences. It is neither possible to avoid disruption risk entirely, nor are network members able to prepare for every possible disruptive event. Therefore, continuity planning should be established to support effective operational responses in supply chain networks in times of emergency. In this research, network-related degrees of freedom, which determine the options for responsive actions, are derived from interview data. The findings are further embedded into a common risk management process. The paper provides support for researchers and practitioners in identifying the network-related options for responsive actions and in determining the need to improve reaction capabilities.
Abstract: In an assessment of the extractability of metals in green liquor dregs from the chemical recovery circuit of a semi-chemical pulp mill, extractable concentrations of heavy metals in artificial gastric fluid were between 10 (Ni) and 717 (Zn) times higher than those in artificial sweat fluid. Only Al (6.7 mg/kg; d.w.), Ni (1.2 mg/kg; d.w.) and Zn (1.8 mg/kg; d.w.) showed extractability in the artificial sweat fluid, whereas Al (730 mg/kg; d.w.), Ba (770 mg/kg; d.w.) and Zn (1290 mg/kg; d.w.) showed clear extractability in the artificial gastric fluid. As certain heavy metals were clearly soluble in the artificial gastric fluid, careful handling of this residue is recommended in order to prevent the uptake of green liquor dregs through the human gastrointestinal tract.
Abstract: The purpose of this research was to determine the role of the 49 kDa immunogenic protein from V. alginolyticus, which is capable of initiating the expression of MHC class II molecules at receptors of Cromileptes altivelis. The method used was an in vivo experiment in which the 49 kDa immunogenic protein from V. alginolyticus was tested in Cromileptes altivelis (250-300 g) using three boosters of the immunogenic protein injected intramuscularly. The response of the expressed MHC molecules was demonstrated using immunocytochemistry and SEM. The results indicated that the 49 kDa V. alginolyticus adhesin, which has an immunogenic character, could trigger the expression of MHC class II on grouper receptors, as proven by immunocytochemical staining and SEM with labeling by an anti-MHC antibody (anti-mouse). This visible expression is based on the binding between antigen epitopes and the anti-MHC antibody at the receptor. Using immunocytochemistry, the intracellular MHC response to in vivo induction by the immunogenic adhesin from V. alginolyticus was demonstrated.
Abstract: HSDPA is a new feature introduced in the Release-5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers outstanding improvements in packet throughput and also significantly reduces the packet call transfer delay compared to the Release-99 DSCH. To date, the HSDPA system has used turbo coding, a coding technique that closely approaches the Shannon limit. However, the main drawbacks of turbo coding are high decoding complexity and high latency, which make it unsuitable for some applications such as satellite communications, where the transmission distance itself introduces latency due to the limited speed of light. Hence, in this paper it is proposed to use LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity. LDPC coding does, however, increase the encoding complexity. Although the complexity of the transmitter increases at the NodeB, the end user benefits in terms of receiver complexity and bit error rate. In this paper, the LDPC encoder is implemented using a sparse parity-check matrix H to generate codewords, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops sharply as the number of iterations increases, with only a small increase in Eb/No, which is not possible with turbo coding. The same BER was also achieved using fewer iterations, and hence the latency and receiver complexity are decreased with LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better-quality, more reliable and more robust data services. In other words, while realistic data rates are only a few Mbps, the actual quality and number of users achieved will improve significantly.
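Full belief propagation is beyond an abstract-level sketch; as a much simplified, hedged illustration of syndrome-driven iterative decoding with a sparse parity-check matrix H, a hard-decision bit-flipping decoder (a standard, simpler relative of belief propagation, not the paper's decoder) over a toy parity-check matrix looks like:

```python
def bit_flip_decode(H, r, max_iters=20):
    """Hard-decision bit-flipping decoding for a binary code with
    parity-check matrix H (list of 0/1 rows) and received word r.
    Each iteration flips the bit involved in the most unsatisfied checks."""
    c = list(r)
    for _ in range(max_iters):
        syndrome = [sum(H[i][j] * c[j] for j in range(len(c))) % 2
                    for i in range(len(H))]
        if not any(syndrome):
            return c  # all parity checks satisfied: valid codeword
        # vote: how many unsatisfied checks involve each bit position
        votes = [sum(H[i][j] for i in range(len(H)) if syndrome[i])
                 for j in range(len(c))]
        c[votes.index(max(votes))] ^= 1
    return c
```

On a toy (7,4) Hamming parity-check matrix, a single flipped bit in a codeword is corrected in one iteration; full belief propagation replaces the hard votes with probabilistic messages exchanged between bit and check nodes.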