Abstract: We present the development of a new underwater laser
cutting process in which a water-jet has been used along with the
laser beam to remove the molten material through the kerf.
Conventional underwater laser cutting usually employs a high-pressure
gas jet along with the laser beam to create a dry condition in the
cutting zone and to eject the molten material. This generates a large
number of gas bubbles and turbulence in the water, and produces
aerosols and waste gas, which may contaminate the surrounding
atmosphere when cutting radioactive components such as spent nuclear fuel. The
water-jet assisted underwater laser cutting process produces much
less turbulence and fewer aerosols in the atmosphere. Some
water-vapor bubbles form at the laser-metal-water interface;
however, they tend to condense as they rise through the
surrounding water. We present the design and development of a
water-jet assisted underwater laser cutting head and the parametric
study of the cutting of AISI 304 stainless steel sheets with a 2 kW
CW fiber laser. The cutting performance is similar to that of gas-assisted
laser cutting; however, the process efficiency is reduced by
heat convection into the water-jet and by laser beam scattering from the vapor. This
process may be attractive for underwater cutting of nuclear reactor
components.
Abstract: Studies were carried out to determine the in vitro
susceptibility of the typhoid pathogens to combined action of Euphorbia hirta, Euphorbia heterophylla and Phyllanthus niruri. Clinical isolates of the typhoid bacilli were subjected to susceptibility testing using agar diffusion technique and the minimum inhibitory
concentration (MIC) determined with tube dilution technique. These
isolates, when challenged with doses of the extracts from the three
medicinal plants, showed zones of inhibition as wide as 26±0.2 mm, 22±0.1 mm and 18±0.0 mm, respectively. The minimum inhibitory concentration (MIC) tests revealed organisms inhibited at varying
concentrations of extracts: E. hirta (S. typhi 0.250mg/ml, S. paratyphi A 0.125mg/ml, S. paratyphi B 0.185mg/ml and S. paratyphi C 0.225mg/ml), E. heterophylla (S. typhi 0.280mg/ml, S. paratyphi A
0.150mg/ml, S. paratyphi B 0.200mg/ml and S. paratyphi C 0.250mg/ml) and P. niruri (S. typhi 0.150mg/ml, S. paratyphi A 0.100mg/ml, S. paratyphi B 0.115mg/ml and S. paratyphi C 0.125mg/ml). The results of the synergy between the three plants in
the ratio of 1:1:1 showed very low MICs for the test pathogens, as follows: S. typhi 0.025 mg/ml, S. paratyphi A 0.080 mg/ml, S. paratyphi B 0.015 mg/ml and S. paratyphi C 0.10 mg/ml, with
diameter zones of inhibition (DZI) of 35±0.2 mm,
28±0.4 mm, 20±0.1 mm and 32±0.3 mm, respectively. The secondary
metabolites were identified using simple methods and HPLC. Organic components such as anthraquinones, various alkaloids,
tannins, 6-ethoxy-1,2,3,4-tetrahydro-2,2,4-trimethyl and steroids were identified. The prevalence of Salmonella infections, which can be deadly, is still very high in parts of Nigeria. The synergistic action of these three plants is very strong. It is concluded that pharmaceutical companies should take advantage of these findings to develop new
anti-typhoid drugs from these plants.
Abstract: In this paper we present the Semantic Assistant Agent
(SAA), an open-source digital library agent that accepts user queries
for finding information in the digital library, harvests resource
metadata, and stores it semantically. SAA uses Semantic Web technologies to
improve browsing and searching for resources in the digital library. All
metadata stored in the library are available in RDF format for
querying and processing by SemanSreach, which is part of the SAA
architecture. The architecture includes a generic RDF-based model
that represents relationships among objects and their components.
Queries against these relationships are supported by an RDF triple
store.
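The kind of triple-pattern querying the architecture describes can be sketched with a minimal in-memory store. This is an illustration only, not SAA's implementation; the class, resource names and predicates below are all hypothetical stand-ins for RDF metadata.

```python
# Minimal in-memory triple store illustrating subject-predicate-object
# pattern queries over RDF-style metadata (names are illustrative).

class TripleStore:
    def __init__(self):
        self.triples = set()

    def add(self, s, p, o):
        self.triples.add((s, p, o))

    def query(self, s=None, p=None, o=None):
        """Return triples matching the pattern; None acts as a wildcard."""
        return [t for t in self.triples
                if (s is None or t[0] == s)
                and (p is None or t[1] == p)
                and (o is None or t[2] == o)]

store = TripleStore()
store.add("book:1", "dc:title", "Digital Libraries")
store.add("book:1", "dc:creator", "A. Author")
store.add("book:2", "dc:creator", "A. Author")

# Relationship query: all resources created by "A. Author"
by_author = [s for s, p, o in store.query(p="dc:creator", o="A. Author")]
```

A production system would use a real RDF library and SPARQL rather than this toy pattern matcher, but the wildcard-pattern query is the core operation a triple store supports.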
Abstract: This paper is a continuation of our daily energy peak-load forecasting approach using our modified network, which belongs to the recurrent network family and is called the feed-forward and feed-back multi-context artificial neural network (FFFB-MCANN). The inputs to the network were exogenous variables, such as the previous and current change in the weather components and the previous and current status of the day, and endogenous variables, such as the past change in the loads. An endogenous variable, the current change in the loads, was used as the network output. Experiments show that using both endogenous and exogenous variables as inputs to the FFFB-MCANN produces better results than using either type alone. Experiments also show that using the changes in variables, such as the change in weather components and in past load, as inputs to the FFFB-MCANN rather than their absolute values has a dramatic impact and produces better accuracy.
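The central preprocessing idea above, feeding first differences (changes) of the weather and load series instead of absolute values, can be sketched as follows. The data, shapes and the simple two-feature layout are hypothetical; the paper's actual network uses more inputs.

```python
import numpy as np

# Sketch: build *change* (first-difference) features for one-step-ahead
# peak-load delta forecasting. Synthetic data stands in for real
# weather and load records.

rng = np.random.default_rng(0)
temperature = rng.normal(25, 3, size=30)                       # exogenous series
peak_load = 100 + 2 * temperature + rng.normal(0, 1, size=30)  # endogenous series

d_temp = np.diff(temperature)  # current change in a weather component
d_load = np.diff(peak_load)    # change in the load

# Input at day t: [change in weather at t, past change in load at t-1];
# target: change in load at t.
X = np.column_stack([d_temp[1:], d_load[:-1]])
y = d_load[1:]
```

These (X, y) pairs would then be fed to the forecasting network; the point of the differencing step is that the model sees stationary deltas rather than trending absolute levels.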
Abstract: Quantum computation using qubits made of two-component Bose-Einstein condensates (BECs) is analyzed. We construct a general framework for quantum algorithms to be executed using the collective states of the BECs. The use of BECs allows for an increase of energy scales via bosonic enhancement, resulting in two-qubit gate operations that can be performed in a time reduced by a factor of N, where N is the number of bosons per qubit. We illustrate the scheme by an application to Deutsch's and Grover's algorithms, and discuss possible experimental implementations. Decoherence effects are analyzed under both general conditions and for the proposed experimental implementation.
Abstract: In order to evaluate the effects of soil organic
matter and biofertilizer on chickpea quality and biological
nitrogen fixation, field experiments were carried out in 2007
and 2008 growing seasons. In this research the effects of
different strategies for soil fertilization were investigated on
grain yield and yield components, minerals, organic compounds
and cooking time of chickpea. Experimental units were
arranged in split-split plots based on randomized complete
blocks with three replications. Main plots consisted of (G1):
establishing a mixed vegetation of Vicia pannonica and
Hordeum vulgare and (G2): control, as green manure levels.
Also, five strategies for obtaining the base fertilizer
requirement including (N1): 20 t.ha-1 farmyard manure; (N2):
10 t.ha-1 compost; (N3): 75 kg.ha-1 triple super phosphate;
(N4): 10 t.ha-1 farmyard manure + 5 t.ha-1 compost and (N5):
10 t.ha-1 farmyard manure + 5 t.ha-1 compost + 50 kg.ha-1
triple super phosphate were considered in sub plots.
Furthermore, four levels of biofertilizers consisting of (B1):
Bacillus lentus + Pseudomonas putida; (B2): Trichoderma
harzianum; (B3): Bacillus lentus + Pseudomonas putida +
Trichoderma harzianum; and (B4): control (without
biofertilizers) were arranged in sub-sub plots. Results showed
that integrating biofertilizers (B3) and green manure (G1)
produced the highest grain yield. The highest yields were
obtained in the G1×N5 interaction. Comparison of all 2-way
and 3-way interactions showed that G1N5B3 was the superior
treatment. The significant increase in N, P2O5, K2O, Fe and Mg
content in leaves and grains emphasized the superiority of this
treatment, because each of these nutrients has an established
role in chlorophyll synthesis and the photosynthetic capacity of
the crops. The combined application of compost, farmyard
manure and chemical phosphorus (N5), in addition to giving the
highest yield, produced the best grain quality due to high
protein, starch and total sugar contents, low crude fiber and
reduced cooking time.
Abstract: Evaluation of curriculum quality, as one of the most important components of the university system, is necessary at different levels of higher education. The main purpose of this study was to survey the curriculum quality of the Actuarial Science field. Case: Shahid Beheshti University and the Higher Education Institute of Eco Insurance (according to the viewpoints of students, alumni, employers and faculty members). Descriptive statistics (mean, tables, percentage, and frequency distribution) and inferential statistics (chi-square) were used to analyze the data. Six criteria were considered for the quality of the curriculum: objectives, content, teaching and learning methods, space and facilities, time, and assessment of learning. The content, teaching and learning methods, space and facilities, and assessment of learning criteria were at a relatively desirable level, while the objectives and time criteria were at a desirable level. Overall, the quality of the curriculum of the Actuarial Science field was at a relatively desirable level.
Abstract: In this paper, the effect of tool shape and condition
(sharp and worn cutting tools of both vee and knife-edge profile)
and of the cutting conditions (depth of cut and cutting speed) on
tool deflection and cutting force in the turning operation is
investigated by measuring the cutting forces. The workpiece material was mild steel
and the cutting tool was made of high speed steel. Cutting forces
were measured by a dynamometer (type P.E.I. serial No 154). The
dynamometer essentially consisted of a cantilever structure which
held the cutting tool. Deflection of the cantilever was measured by an
L.V.D.T (Mercer 122) deflection indicator. No cutting fluid was used
during the turning operations. A modern CNC lathe machine (Okuma
LH35-N) was used for the tests. It was noted that worn vee profile
tools tended to produce a greater increase in the vertical force
component than the axial component, whereas knife tools tended to
show a more pronounced increase in the axial component.
Abstract: A scalable QoS aware multicast deployment in
DiffServ networks has become an important research dimension in
recent years. Although multicasting and differentiated services are
two complementary technologies, the integration of the two
technologies is a non-trivial task due to architectural conflicts
between them. A popular solution proposed is to extend the
functionality of the DiffServ components to support multicasting. In
this paper, we propose an algorithm to construct an efficient QoS-driven
multicast tree, taking into account the available bandwidth per
service class. We also present an efficient way to provision the
limited available bandwidth for supporting heterogeneous users. The
proposed mechanism is evaluated using simulated tests. The
simulation results reveal that our algorithm can effectively minimize
bandwidth use and transmission cost.
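One classic building block for bandwidth-aware tree construction is a widest-path (maximum-bottleneck-bandwidth) computation, which routes toward each receiver along the path whose scarcest link has the most available bandwidth. The sketch below illustrates that primitive only; it is not the paper's algorithm, and the network, node names and capacities are hypothetical.

```python
import heapq

# Widest-path computation: for each node, the best achievable
# bottleneck bandwidth from the source (a modified Dijkstra that
# maximizes min-capacity instead of minimizing path length).

def widest_paths(graph, source):
    """graph: {node: {neighbor: available_bandwidth}}."""
    best = {source: float("inf")}
    heap = [(-best[source], source)]
    while heap:
        neg_bw, u = heapq.heappop(heap)
        bw = -neg_bw
        if bw < best.get(u, 0):
            continue  # stale heap entry
        for v, cap in graph[u].items():
            bottleneck = min(bw, cap)  # path bandwidth = weakest link
            if bottleneck > best.get(v, 0):
                best[v] = bottleneck
                heapq.heappush(heap, (-bottleneck, v))
    return best

net = {
    "s": {"a": 10, "b": 4},
    "a": {"b": 3, "r1": 5},
    "b": {"r2": 8},
    "r1": {}, "r2": {},
}
bw = widest_paths(net, "s")  # e.g. receiver r1 via a: min(10, 5) = 5
```

A QoS-driven multicast tree would combine such per-receiver bandwidth information with per-class admission decisions and cost minimization.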
Abstract: Following the loss of NASA's Space Shuttle
Columbia in 2003, it was determined that problems in the agency's
organization created an environment that led to the accident. One
component of the proposed solution resulted in the formation of the
NASA Engineering Network (NEN), a suite of information retrieval
and knowledge-sharing tools. This paper describes the
implementation of communities of practice, which are formed along
engineering disciplines. Communities of practice enable engineers to
leverage their knowledge and best practices, to collaborate, and to
take what they learn back to their jobs and embed it into the
agency's procedures. This case study offers insight into using
traditional engineering disciplines for virtual collaboration, including
lessons learned during the creation and establishment of NASA's
communities.
Abstract: Current image-based individual human recognition
methods, such as fingerprints, face, or iris biometric modalities
generally require a cooperative subject, views from certain aspects,
and physical contact or close proximity. These methods cannot
reliably recognize non-cooperating individuals at a distance in the
real world under changing environmental conditions. Gait, which
concerns recognizing individuals by the way they walk, is a relatively
new biometric without these disadvantages. The inherent gait
characteristic of an individual makes it irreplaceable and useful in
visual surveillance.
In this paper, an efficient gait recognition system for human
identification is proposed, based on extracting two features: the
width vector of the binary silhouette and the MPEG-7 region-based
shape descriptors. In the proposed method, foreground objects,
i.e., humans and other moving objects, are extracted by estimating
background information by a Gaussian Mixture Model (GMM) and
subsequently, a median filtering operation is performed to remove
noise from the background-subtracted image. A moving target
classification algorithm is used to separate human beings (i.e.,
pedestrians) from other foreground objects (e.g., vehicles). Shape and boundary
information is used in the moving target classification algorithm.
Subsequently, width vector of the outer contour of binary silhouette
and the MPEG-7 Angular Radial Transform coefficients are taken as
the feature vector. Next, the Principal Component Analysis (PCA)
is applied to the selected feature vector to reduce its dimensionality.
These extracted feature vectors are used to train a Hidden Markov
Model (HMM) for the identification of individuals. The proposed
system is evaluated on gait sequences, and the experimental
results show the efficacy of the proposed algorithm.
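The first of the two features, the width vector of the binary silhouette, is simple enough to sketch directly: for each row of the silhouette, record the span between the leftmost and rightmost foreground pixels. The toy silhouette below is illustrative; a real system would obtain it from GMM background subtraction followed by median filtering.

```python
import numpy as np

# Width vector of a binary silhouette: per-row distance between the
# outermost foreground pixels (0 where the row is empty).

def width_vector(silhouette):
    """silhouette: 2-D 0/1 array. Returns per-row silhouette width."""
    widths = np.zeros(silhouette.shape[0], dtype=int)
    for r, row in enumerate(silhouette):
        cols = np.flatnonzero(row)
        if cols.size:
            widths[r] = cols[-1] - cols[0] + 1
    return widths

sil = np.array([
    [0, 0, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 0, 0],
])
wv = width_vector(sil)  # one width value per row
```

In the full pipeline this vector, together with the MPEG-7 Angular Radial Transform coefficients, would be reduced by PCA and fed to the HMM.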
Abstract: The common bean is the most important grain legume for direct human consumption in the world, and BCMV is one of the world's most serious bean diseases, capable of reducing the yield and quality of the harvested product. To determine the best tolerance index to BCMV and to identify tolerant genotypes, 2 field experiments were conducted. Twenty-five common bean genotypes were sown in 2 separate randomized complete block designs with 3 replications, under contamination and non-contamination conditions. On the basis of the correlations among the indices, GMP, MP and HARM were determined to be the most suitable tolerance indices. Principal components analysis indicated that the first 2 components together explained 98.52% of the variation in the data. The first and second components were named potential yield and stress susceptibility, respectively. Based on the BCMV tolerance index assessment and biplot analysis, WA8563-4, WA8563-2 and Cardinal were the genotypes that exhibited potential seed yield under both contamination and non-contamination conditions.
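The tolerance indices named above have standard definitions in the stress-tolerance literature, computed from yield under non-stress (Yp) and stress (Ys) conditions; the sketch below uses those standard formulas with hypothetical yields, not the study's data.

```python
import math

# Standard stress-tolerance indices from non-stress (yp) and
# stress (ys) yields.

def mp(yp, ys):    # Mean Productivity: arithmetic mean
    return (yp + ys) / 2

def gmp(yp, ys):   # Geometric Mean Productivity
    return math.sqrt(yp * ys)

def harm(yp, ys):  # Harmonic mean of the two yields
    return 2 * yp * ys / (yp + ys)

yp, ys = 3.2, 2.0  # hypothetical t/ha without and with BCMV contamination
indices = {"MP": mp(yp, ys), "GMP": gmp(yp, ys), "HARM": harm(yp, ys)}
```

Genotypes scoring high on all three indices combine good potential yield with good yield under infection, which is why these indices correlate well with tolerance.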
Abstract: Studies in neuroscience suggest that both global and
local feature information are crucial for perception and recognition of
faces. It is widely believed that local feature is less sensitive to
variations caused by illumination, expression and illumination. In
this paper, we target at designing and learning local features for face
recognition. We designed three types of local features. They are
semi-global feature, local patch feature and tangent shape feature.
The design of the semi-global feature aims at taking advantage of
global-like features while avoiding suppressing the AdaBoost
algorithm when boosting weak classifiers built from small local
patches. The design of the local patch feature targets the automatic
selection of discriminative features, and is thus different from
traditional approaches, in which local patches are usually selected manually to cover
the salient facial components. Also, shape feature is considered in
this paper for frontal view face recognition. These features are
selected and combined under the framework of boosting algorithm
and cascade structure. The experimental results demonstrate that the
proposed approach outperforms the standard eigenface method and
Bayesian method. Moreover, the selected local features and the
observations from the experiments are informative for research on
local feature design for face recognition.
Abstract: Fuel and oxidant gas delivery plate, or fuel cell
plate, is a key component of a Proton Exchange Membrane (PEM)
fuel cell. To manufacture low-cost and high performance fuel cell
plates, advanced computer modeling and finite element structural
analysis are used as virtual prototyping tools for the optimization
of the plates at the early design stage. The present study examines
thermal stress analysis of the fuel cell plates that are produced
using a patented, low-cost fuel cell plate production technique
based on screen-printing. Design optimization is applied to
minimize the maximum stress within the plate, subject to strain
constraint with both geometry and material parameters as design
variables. The study reveals the characteristics of the printed
plates, and provides guidelines for the structure and material design
of the fuel cell plate.
Abstract: This paper deals with a numerical analysis of the
transient response of composite beams with strain rate dependent
mechanical properties by use of a finite difference method. The
equations of motion based on Timoshenko beam theory are derived.
The geometric nonlinearity effects are taken into account with von
Kármán large deflection theory. The finite difference method in
conjunction with Newmark average acceleration method is applied to
solve the differential equations. A modified progressive damage
model which accounts for strain rate effects is developed based on
the material property degradation rules and modified Hashin-type
failure criteria and added to the finite difference model. The
components of the model are implemented into a computer code in
Mathematica 6. Glass/epoxy laminated composite beams with
constant and strain rate dependent mechanical properties under
dynamic load are analyzed. Effects of strain rate on dynamic
response of the beam for various stacking sequences, load and
boundary conditions are investigated.
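The Newmark average-acceleration scheme named above can be sketched on a single-degree-of-freedom oscillator m·u'' + c·u' + k·u = f(t); the beam problem couples this time integrator with finite differences in space. The parameters below are hypothetical and the code is an illustration of the integrator only, not the paper's damage model.

```python
import numpy as np

# Newmark average-acceleration time integration (beta = 1/4,
# gamma = 1/2): unconditionally stable, no numerical damping.

def newmark(m, c, k, f, dt, n_steps, u0=0.0, v0=0.0):
    beta, gamma = 0.25, 0.5
    u, v = u0, v0
    a = (f(0.0) - c * v - k * u) / m          # initial acceleration
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
    hist = [u]
    for i in range(1, n_steps + 1):
        t = i * dt
        # effective load from current state
        rhs = (f(t)
               + m * (u / (beta * dt**2) + v / (beta * dt)
                      + (0.5 / beta - 1.0) * a)
               + c * (gamma * u / (beta * dt)
                      + (gamma / beta - 1.0) * v
                      + dt * (gamma / (2 * beta) - 1.0) * a))
        u_new = rhs / keff
        a_new = ((u_new - u) / (beta * dt**2)
                 - v / (beta * dt) - (0.5 / beta - 1.0) * a)
        v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        u, v, a = u_new, v_new, a_new
        hist.append(u)
    return np.array(hist)

# Free vibration of an undamped unit oscillator from u0 = 1:
# the exact solution is cos(t), so after one period u returns near 1.
resp = newmark(m=1.0, c=0.0, k=1.0, f=lambda t: 0.0,
               dt=0.01, n_steps=628, u0=1.0)
```

For the nonlinear beam with strain-rate-dependent properties, the stiffness and load terms are updated each step, but the predictor-corrector structure stays the same.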
Abstract: This paper introduces a process for the module-level integration of computer-based systems. It is based on the Six Sigma Process Improvement Model, where the goal of the process is to improve the overall quality of the system under development. We also present a conceptual framework that shows how this process can be implemented as an integration solution. Finally, we provide a partial implementation of key components of the conceptual framework.
Abstract: In this research, we attempted to identify the diabetes status of people (healthy, prediabetic and diabetic) from noninvasive palm perspiration measurements. Data gathered from 200 subjects were organized into two clusters (1. Individual Attributes and 2. Palm Perspiration Attributes). To reduce the dimensionality of these data clusters, the Principal Component Analysis method was used. The data clusters prepared in this way were classified with Support Vector Machines. The most successful classification rates were 82% for the glucose parameter and 84% for the HbA1c parameter.
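The dimensionality-reduction step can be sketched as PCA via the singular value decomposition: center the attributes, then project each subject's feature vector onto the top-k principal components before handing it to a classifier. The random data below is a stand-in for the palm perspiration attributes, and the attribute count is hypothetical.

```python
import numpy as np

# PCA via SVD: project rows of X onto the first k principal components.

def pca_reduce(X, k):
    Xc = X - X.mean(axis=0)                        # center each attribute
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                           # scores in reduced space

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))   # 200 subjects, 10 attributes (synthetic)
Z = pca_reduce(X, k=3)           # reduced features for the classifier
```

The columns of Z come out ordered by explained variance, which is why keeping only the first few components discards little information; Z would then be the input to the SVM.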
Abstract: The flow of a third grade fluid in an orthogonal rheometer is studied. We employ the admissible velocity field proposed in [5]. We solve the problem and obtain the velocity field as well as the components of the Cauchy stress tensor. We compare the results with those from [9]. Some diagrams of the velocity and Cauchy stress component profiles are presented for different values of the material constants and compared with the corresponding values for a linearly viscous fluid.
Abstract: Parsing is important in Linguistics and Natural
Language Processing to understand the syntax and semantics of a
natural language grammar. Parsing natural language text is
challenging because of problems such as ambiguity and inefficiency.
Also, the interpretation of natural language text depends on context-based
techniques. A probabilistic component is essential to resolve
ambiguity in both syntax and semantics thereby increasing accuracy
and efficiency of the parser. The Tamil language has some inherent
features which make parsing more challenging. In order to obtain
solutions, a lexicalized and statistical approach is to be applied in
parsing with the aid of a language model. Statistical models mainly focus on
semantics of the language, which suits large-vocabulary tasks,
whereas structural methods focus on syntax, which models
small-vocabulary tasks. A statistical language model based on trigrams
for the Tamil language, with a medium vocabulary of 5000 words, has
been built. Though statistical parsing gives better performance
through trigram probabilities and a large vocabulary, it has some
disadvantages, such as a focus on semantics rather than syntax and a
lack of support for free word order and long-term relationships. To
overcome these disadvantages, a structural component is to be
incorporated into statistical language models, which leads to the
implementation of hybrid language models. This paper attempts
to build a phrase-structured hybrid language model which resolves
the above-mentioned disadvantages. In developing the hybrid
language model, a new part-of-speech tag set for the Tamil language
has been developed, with more than 500 tags providing wider
coverage. A phrase-structured Treebank has been developed with 326
Tamil sentences covering more than 5000 words. A hybrid
language model has been trained on the phrase-structured Treebank
using the immediate-head parsing technique. A lexicalized and
statistical parser employing this hybrid language model and
immediate-head parsing gives better results than pure grammar-
and trigram-based models.
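The trigram probability component can be sketched as maximum-likelihood estimation from counts: P(w3 | w1, w2) = count(w1, w2, w3) / count(w1, w2). The toy English-like corpus below stands in for Tamil text, and a real model would add smoothing for unseen trigrams.

```python
from collections import defaultdict

# Maximum-likelihood trigram language model from raw counts.

def train_trigrams(sentences):
    tri = defaultdict(int)
    bi = defaultdict(int)
    for sent in sentences:
        words = ["<s>", "<s>"] + sent + ["</s>"]  # pad sentence boundaries
        for i in range(2, len(words)):
            tri[(words[i - 2], words[i - 1], words[i])] += 1
            bi[(words[i - 2], words[i - 1])] += 1
    return tri, bi

def prob(tri, bi, w1, w2, w3):
    """P(w3 | w1, w2) by maximum likelihood (0.0 if context unseen)."""
    if bi[(w1, w2)] == 0:
        return 0.0
    return tri[(w1, w2, w3)] / bi[(w1, w2)]

corpus = [["the", "parser", "reads", "text"],
          ["the", "parser", "builds", "trees"]]
tri, bi = train_trigrams(corpus)
p = prob(tri, bi, "the", "parser", "reads")  # 1 of 2 continuations
```

In the hybrid model these probabilities are combined with the phrase-structure component, so the parser gets both the statistical evidence and the syntactic constraints the pure trigram model lacks.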
Abstract: This paper presents the results of a study aimed at
establishing the temperature distribution during the welding of
magnesium alloy sheets by Pulsed Current Gas Tungsten Arc
Welding (PCGTAW) and Constant Current Gas Tungsten Arc
Welding (CCGTAW) processes. Pulsing of the GTAW welding
current influences the dimensions and solidification rate of the fused
zone; it also reduces the weld pool volume, and hence produces a narrower bead. In
this investigation, the base material was 2 mm thick AZ 31
B magnesium alloy, which is finding use in aircraft, automobile and
high-speed train components. A finite element analysis was carried
out using ANSYS, and the results of the FEA were compared with
the experimental results. It is evident from this study that the finite
element analysis using ANSYS can be effectively used to model
PCGTAW process for finding temperature distribution.