Abstract: To study the efficacy of green manure application as a chickpea pre-plant treatment, field experiments were carried out during the 2007 and 2008 growing seasons. The effects of different soil fertilization strategies on grain yield, yield components, minerals, organic compounds and cooking time of chickpea were investigated. Experimental units were arranged in split-split plots based on a randomized complete block design with three
replications. Main plots consisted of (G1): establishing a mixed
vegetation of Vicia pannonica and Hordeum vulgare and (G2):
control, as green manure levels. Also, five strategies for obtaining the
base fertilizer requirement including (N1): 20 t.ha-1 farmyard manure;
(N2): 10 t.ha-1 compost; (N3): 75 kg.ha-1 triple super phosphate;
(N4): 10 t.ha-1 farmyard manure + 5 t.ha-1 compost and (N5): 10 t.ha-1
farmyard manure + 5 t.ha-1 compost + 50 kg.ha-1 triple super
phosphate were considered in sub-plots. Furthermore, four levels of biofertilizers, namely (B1): Bacillus lentus + Pseudomonas putida; (B2): Trichoderma harzianum; (B3): Bacillus lentus + Pseudomonas putida + Trichoderma harzianum; and (B4): control (without biofertilizers), were arranged in sub-sub plots. Results
showed that integrating biofertilizers (B3) and green manure (G1)
produced the highest grain yield, with the maximum obtained under the G1×N5 interaction. Comparison of all two-way and three-way interactions identified G1N5B3 as the superior treatment. Significant increases in N, P2O5, K2O, Fe and Mg content in leaves and grains underscored the superiority of this treatment, since each of these nutrients has an established role in chlorophyll synthesis and the photosynthetic capacity of crops. The
combined application of compost, farmyard manure and chemical
phosphorus (N5), in addition to producing the highest yield, gave the best grain quality, with high protein, starch and total sugar contents, low crude fiber and reduced cooking time.
Abstract: Hepatitis B and hepatitis C are among the most
significant hepatic infections all around the world that may lead to
hepatocellular carcinoma. This study is the first performed at the blood transfusion centre of Omar hospital, Lahore. It aims to
determine the sero-prevalence of these diseases by screening the
apparently healthy blood donors who might be the carriers of HBV or
HCV and pose a high risk of transmission. It also compares the sensitivity of two diagnostic tests: chromatographic immunoassay (one-step test device) and Enzyme-Linked Immunosorbent Assay (ELISA). Blood serum of 855
apparently healthy blood donors was screened for Hepatitis B surface
antigen (HBsAg) and for anti-HCV antibodies. SPSS version 12.0 and the Chi-square (χ2) test were used for statistical analysis. The sero-prevalence of HCV was 8.07% by the device method and 9.12% by ELISA, and that of HBV was 5.6% by the device and 6.43% by
ELISA. The unavailability of vaccination against HCV makes it more
prevalent. Comparing the two diagnostic methods, ELISA proved to
be more sensitive.
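The Chi-square comparison of the two detection methods can be illustrated on a 2x2 table. Note that the positive counts below (69 and 78 for HCV) are reconstructed from the reported percentages of 855 donors and are therefore illustrative assumptions, and the plain Pearson formula treats the two samples as independent:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# HCV positives reconstructed from the reported rates: 8.07% and 9.12% of 855 donors
device_pos, elisa_pos, n_donors = 69, 78, 855
chi2 = chi_square_2x2(device_pos, n_donors - device_pos,
                      elisa_pos, n_donors - elisa_pos)
```

The statistic is compared against the χ2 critical value with one degree of freedom (3.84 at the 5% level) to decide whether the detection rates of the two methods differ significantly.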
Abstract: Hydrogen diffusion is the main cause of corrosion fatigue in corrosive environments. Analyzing this phenomenon requires an understanding of the behavior of hydrogen during diffusion. Predicting hydrogen embrittlement, a principal corrosive contributor to fracture, requires solving a combination of equations mathematically. The key to obtaining these equations is knowledge of the source that causes diffusion and drives atoms into the material, called the driving force; it is produced by a gradient of either electrical or chemical potential. In this work, we consider the gradient of chemical potential to obtain the governing equation. During diffusion, some atoms may be trapped, but trapping can be neglected under certain conditions. Based on the phenomenon of hydrogen embrittlement, the thermodynamic and chemical properties of hydrogen are considered and related to fracture mechanics. In particular, a stress intensity factor is obtained by using fugacity as a property of hydrogen or other gases. Although the diffusive behavior and the embrittlement mechanism are common to other gases, for clarity we describe them for hydrogen; focusing on a specific gas makes the importance of this relation easier to understand.
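In the dilute limit the chemical-potential driving force (mu = mu0 + RT ln c) reduces to a concentration gradient, so hydrogen transport obeys Fick's second law. A minimal explicit finite-difference sketch with zero-flux boundaries, with trapping neglected (as the abstract notes is permissible under some conditions); the parameter values are illustrative, not material data:

```python
def diffuse_1d(conc, d_coef, dx, dt, steps):
    """Explicit scheme for Fick's second law dc/dt = D d2c/dx2,
    zero-flux boundaries, hydrogen trapping neglected."""
    c = list(map(float, conc))
    r = d_coef * dt / dx ** 2
    assert r <= 0.5, "explicit scheme stability limit violated"
    n = len(c)
    for _ in range(steps):
        nxt = c[:]
        nxt[0] = c[0] + r * (c[1] - c[0])        # zero-flux left wall
        nxt[-1] = c[-1] + r * (c[-2] - c[-1])    # zero-flux right wall
        for i in range(1, n - 1):
            nxt[i] = c[i] + r * (c[i + 1] - 2.0 * c[i] + c[i - 1])
        c = nxt
    return c
```

With closed (zero-flux) walls the total hydrogen content is conserved exactly by this discretization while an initial concentration spike spreads out, which is the behavior the chemical-potential analysis quantifies.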
Abstract: This paper analyses the structural changes in the education sector since the introduction of the liberalization policy in India. It explains how so-called non-profit trusts and societies appropriated the liberalization policy and established themselves as a new capitalist class in the higher education sector. Over
the decades, the policy has relied on the private sector to maintain market equilibrium. The state has also witnessed the private sector's incompatibility with inculcating the values of social justice. The most important consequence of the policy is the rise of a new capitalist class and of academic capitalism.
When the state came to realize that it could no longer cope with market demands, it opened higher education to the private sector. Concessions and tax exemptions were provided to the
trusts and societies to establish higher education institutions. There
is a basic difference between western countries and India in
providing higher education by the trusts and societies. In western
countries the big business houses contributed their surplus
revenues to promote higher education and research as a
complementary service to society and the nation. In India, several entrepreneurs entered the education sector with a business motive. Over time, they accumulated wealth at the cost of students and through concessions from the government. Four major results
can now be identified: production of manpower in view of market
demands; reduction of standards in higher education; bypassing the
values of social justice; and the rise of new capitalist class from the
business of education. This paper tries to substantiate these issues
with the inputs from case studies.
Abstract: Signature amortization schemes have been introduced for authenticating multicast streams, in which a single signature is amortized over several packets. The hash value of each packet is computed, and some hash values are appended to other packets, forming what is known as a hash chain. These schemes divide the stream into blocks, each block being a number of packets; the signature packet in these schemes is either the first or the last packet of the block. Amortization schemes are efficient solutions in terms of computation and communication overhead, especially in real-time environments. The main factor affecting an amortization scheme is its hash chain construction. Some studies show that signing the first packet of each block reduces the receiver's delay and prevents DoS attacks; other studies show that signing the last packet reduces the sender's delay. To our knowledge, there are no studies showing which is better, signing the first or the last packet, in terms of authentication probability and resistance to packet loss.
In this paper we introduce another scheme for authenticating multicast streams that is robust against packet loss, reduces the overhead, and at the same time prevents the DoS attacks experienced by the receiver. Our scheme, the Multiple Connected Chain signing the First packet (MCF), appends the hash values of specific packets to other packets, then appends some hashes to the signature packet, which is sent as the first packet in the block. This scheme is especially efficient in terms of receiver's delay. We discuss and evaluate the performance of our proposed scheme against those that sign the last packet of the block.
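The general signature-amortization idea (compute per-packet hashes, chain them backwards, and sign only the first packet) can be sketched as follows. This is a minimal illustration of the generic first-packet hash chain, not the authors' MCF construction, and the "signature" is stood in for by a plain SHA-256 digest where a real scheme would sign that digest with a private key:

```python
import hashlib

HASH_LEN = 32  # SHA-256 digest size

def build_block(packets):
    """Chain a block: each packet carries the hash of its successor."""
    augmented = []
    next_hash = b""
    for payload in reversed(packets):
        data = payload + next_hash      # append successor's hash (empty for last packet)
        next_hash = hashlib.sha256(data).digest()
        augmented.append(data)
    augmented.reverse()
    # Stand-in "signature packet": digest of the first packet; a real scheme
    # would sign this value so one signature authenticates the whole block.
    signature = hashlib.sha256(augmented[0]).digest()
    return signature, augmented

def verify_block(signature, augmented):
    """Verify forward from the signed first packet through the hash chain."""
    if hashlib.sha256(augmented[0]).digest() != signature:
        return False
    for i in range(len(augmented) - 1):
        carried = augmented[i][-HASH_LEN:]
        if hashlib.sha256(augmented[i + 1]).digest() != carried:
            return False
    return True
```

Note the tradeoff the abstract discusses: signing the first packet forces the sender to buffer the whole block before transmission (the sender's delay), while the receiver can authenticate each packet on arrival.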
Abstract: In this paper, an efficient method for personal identification based on the pattern of the human iris is proposed. It is composed of image acquisition, image preprocessing to unwrap the iris into a flat representation, conversion into eigenirises, and a decision stage carried out using only a one-dimensional reduction of the iris. By comparing eigenirises it is determined whether two irises are similar. The results show that the proposed method is quite effective.
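The eigeniris decision stage is in the spirit of eigenface-style PCA. A minimal sketch on synthetic flattened "iris" vectors; the data, dimensions and nearest-neighbour matching below are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np

def eigeniris_basis(train, k):
    """PCA basis from flattened, preprocessed iris images (rows of `train`)."""
    mean = train.mean(axis=0)
    centered = train - mean
    # Eigenface trick: eigen-decompose the small n x n Gram matrix
    # instead of the large pixel-by-pixel covariance matrix.
    vals, vecs = np.linalg.eigh(centered @ centered.T)
    order = np.argsort(vals)[::-1][:k]
    basis = centered.T @ vecs[:, order]
    basis /= np.linalg.norm(basis, axis=0)
    return mean, basis

def project(x, mean, basis):
    """Reduce an iris image to a k-dimensional eigeniris code."""
    return (x - mean) @ basis

def identify(probe, gallery_codes, mean, basis):
    """Return the index of the nearest gallery code in the reduced space."""
    d = np.linalg.norm(gallery_codes - project(probe, mean, basis), axis=1)
    return int(np.argmin(d))
```

Comparing the short eigeniris codes rather than the full images is what makes the decision stage cheap, as the abstract's one-dimensional reduction suggests.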
Abstract: The goal of this paper is to examine the effects of laser
radiation on skin wound healing, using infrared thermography as a non-invasive method for monitoring skin temperature changes during laser treatment. Thirty Wistar rats were used in this study. A skin lesion was made on the leg of each rat. The animals
were exposed to laser radiation (λ = 670 nm, P = 15 mW, DP = 16.31
mW/cm2) for 600 s. Thermal images of wound were acquired before
and after laser irradiation. The results have demonstrated that the
tissue temperature decreases from 35.5±0.50°C in the first treatment
day to 31.3±0.42°C after the third treatment day. This value is close
to the normal value of the skin temperature and indicates the end of
the skin repair process. In conclusion, the improvements in the
wound healing following exposure to laser radiation have been
revealed by infrared thermography.
Abstract: In this research, STNEP is studied considering network adequacy and an investment cost limit, using a decimal codification genetic algorithm (DCGA). The goal is to obtain maximum network adequacy with the lowest expansion cost for a specified investment. The proposed approach is applied to Garver's 6-bus network. The results show that when network adequacy is considered in solving the STNEP problem, the GA-based method proposes, among the expansion plans for a given investment, the configuration with relatively lower expansion cost and higher adequacy. Finally, from the curve of adequacy versus expansion cost, it can be said that more optimal network expansion configurations are obtained with lower investment costs.
Abstract: The mathematical equations for separation of a binary aqueous solution are developed using the Spiegler-Kedem theory. The characteristics of a B-9 hollow fibre module of Du Pont are determined using these equations, and the results are compared with the experimental results of Ohya et al. The agreement between these results is found to be excellent.
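For reference, the standard Spiegler-Kedem relation between rejection, volume flux Jv, reflection coefficient sigma and solute permeability Pm can be sketched as below; the numerical parameter values used here are illustrative, not the B-9 module's fitted constants:

```python
import math

def spiegler_kedem_rejection(jv, sigma, pm):
    """Rejection R as a function of volume flux Jv (Spiegler-Kedem model):
    F = exp(-Jv (1 - sigma) / Pm),  R = sigma (1 - F) / (1 - sigma F).
    R rises with flux and approaches sigma as Jv -> infinity."""
    f = math.exp(-jv * (1.0 - sigma) / pm)
    return sigma * (1.0 - f) / (1.0 - sigma * f)
```

Fitting sigma and Pm to measured rejection-versus-flux data is how membrane module characteristics of the kind reported here are typically extracted.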
Abstract: This paper presents an evolutionary method for designing
electronic circuits and numerical methods associated with
monitoring systems. The instruments described here have been used
in studies of weather and climate changes due to global warming, and
also in medical patient supervision. Genetic Programming systems
have been used both for designing circuits and sensors, and also for
determining sensor parameters. The authors advance the thesis that
the software side of such a system should be written in computer
languages with strong mathematical and logical foundations in order to prevent software obsolescence and achieve program correctness.
Abstract: Short-Term Load Forecasting (STLF) plays an important role in the economic and secure operation of power systems. In this paper, a Continuous Genetic Algorithm (CGA) is employed to evolve the optimal structure and connection weights of large neural networks for the one-day-ahead electric load forecasting problem. This study describes the process of developing three-layer feed-forward large neural networks for load forecasting and then presents a heuristic search algorithm for an important task of this process, i.e. optimal network structure design. The proposed method is applied to STLF of a local utility. Data are clustered according to the differences in their characteristics. Special days are extracted from the normal training sets and handled separately. In this way, a solution is provided for all load types, including working days, weekends and special days. We find good performance for the large neural networks; the proposed methodology consistently gives lower percentage errors. Thus, it can be applied to automatically design an optimal load forecaster based on historical data.
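The idea of evolving real-valued network weights with a continuous GA can be sketched as follows. This is a toy illustration only (a 1-4-1 network fitted to a synthetic daily curve, with elitism, blend crossover and Gaussian mutation); the paper's networks, encoding and operators are larger and more elaborate:

```python
import numpy as np

HIDDEN = 4
DIM = 3 * HIDDEN + 1   # w1 (4) + b1 (4) + w2 (4) + b2 (1)

def mlp_forecast(w, x):
    """Decode a flat real-valued chromosome into a 1-HIDDEN-1 network."""
    w1 = w[:HIDDEN]
    b1 = w[HIDDEN:2 * HIDDEN]
    w2 = w[2 * HIDDEN:3 * HIDDEN]
    b2 = w[3 * HIDDEN]
    h = np.tanh(np.outer(x, w1) + b1)
    return h @ w2 + b2

def fitness(w, x, y):
    """Negative mean squared forecasting error (higher is better)."""
    return -np.mean((mlp_forecast(w, x) - y) ** 2)

def continuous_ga(x, y, pop=40, gens=200, seed=0):
    """Evolve chromosomes with elitism, blend crossover and Gaussian mutation."""
    rng = np.random.default_rng(seed)
    P = rng.normal(size=(pop, DIM))
    for _ in range(gens):
        f = np.array([fitness(w, x, y) for w in P])
        elite = P[np.argsort(f)[::-1]][:pop // 2]
        pa = elite[rng.integers(0, len(elite), pop // 2)]
        pb = elite[rng.integers(0, len(elite), pop // 2)]
        alpha = rng.random((pop // 2, 1))
        children = alpha * pa + (1 - alpha) * pb
        children += rng.normal(scale=0.05, size=(pop // 2, DIM))
        P = np.vstack([elite, children])
    f = np.array([fitness(w, x, y) for w in P])
    return P[np.argmax(f)]
```

Because the chromosome is real-valued, crossover and mutation act directly on the weights, avoiding the encoding/decoding step of a binary GA.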
Abstract: The performance of the Optical Code Division Multiplexing / Wavelength Division Multiplexing (OCDM/WDM) technique for an Optical Packet Switch is investigated. The impact on performance of the impairment due to both Multiple Access Interference and beat noise is studied. The Packet Loss Probability due to output packet contentions is evaluated as a function of the main switch and traffic parameters when Gold coherent optical codes are adopted. The Packet Loss Probability of the OCDM/WDM switch can reach 10^-9 when M = 16 wavelengths, a Gold code of length L = 511, and only 24 wavelength converters are used in the switch.
Abstract: This study investigated the suitability of a Lahar/HDPE composite as a primary material for low-cost small-scale biogas digesters. While sources of raw materials for biogas are abundant in the Philippines, the cost of the technology has made widespread utilization of this resource an indefinite proposition. Aside from capital economics, another problem arises from the space requirements of current digester designs. These problems may be simultaneously addressed by fabricating digesters on a smaller, household scale to reach a wider market, and by using materials that accommodate optimization of overall design and fabrication cost without sacrificing operational efficiency. This study involved actual fabrication of the Lahar/HDPE composite at varying compositions and geometries, subsequent mechanical and thermal characterization, and statistical analysis to find intrinsic relationships between variables. From the results, the Lahar/HDPE composite was found to be feasible as a digester material from both mechanical and economic standpoints.
Abstract: This work presents a numerical simulation of the interaction of an incident shock wave, propagating from left to right, with a cone placed in a shock tube. The mathematical model is based on a non-stationary, viscous and axisymmetric flow. The discretization of the Navier-Stokes equations is carried out by the finite volume method in integral form, along with the Flux Vector Splitting method of Van Leer. An adequate combination of time-stepping parameter, CFL coefficient and mesh size is selected to ensure numerical convergence. The numerical simulation considers a shock tube filled with air. The incident shock wave propagates to the right with a given Mach number and crosses the cone, leaving behind it a stationary detached shock wave in front of the nose of the cone. This type of interaction is observed over the duration of the flow.
Abstract: Utilizing echoic intensity and distribution from different organs and local details of the human body, ultrasonic imaging can capture important medical pathological changes, which unfortunately may be affected by ultrasonic speckle noise. A feature-preserving ultrasonic image denoising and edge enhancement scheme is put forth, which includes two terms, anisotropic diffusion and edge enhancement, controlled by the optimum smoothing time. In this scheme, the anisotropic diffusion is governed by the local coordinate transformation and the first- and second-order normal derivatives of the image, while the edge enhancement is done by the hyperbolic tangent function. Experiments on real ultrasonic images indicate effective preservation of edges, local details and ultrasonic echoic bright strips during denoising by our scheme.
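As a point of reference for the diffusion term, the classical Perona-Malik anisotropic diffusion (a stand-in here, not the paper's coordinate-transform scheme) smooths homogeneous regions while an edge-stopping function shuts diffusion down across strong gradients:

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, dt=0.2):
    """Classical Perona-Malik diffusion: smooth flat regions, keep strong edges."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)   # edge-stopping conductance
    for _ in range(n_iter):
        dn = np.roll(u, -1, axis=0) - u
        dn[-1, :] = 0.0                       # clamp wrap-around at borders
        ds = np.roll(u, 1, axis=0) - u
        ds[0, :] = 0.0
        de = np.roll(u, -1, axis=1) - u
        de[:, -1] = 0.0
        dw = np.roll(u, 1, axis=1) - u
        dw[:, 0] = 0.0
        u = u + dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```

The contrast parameter kappa plays the role of the edge threshold: neighbour differences well below kappa (speckle) are diffused away, while differences well above it (tissue boundaries, echoic bright strips) are preserved.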
Abstract: Software evolution control requires a deep understanding of changes and their impact on the different heterogeneous artifacts of a system, and an understanding of descriptive knowledge of the developed software artifacts is a prerequisite for the success of the evolutionary process. Implementing an evolutionary process means making changes, more or less important, to many heterogeneous software artifacts such as source code, analysis and design models, unit tests, XML deployment descriptors, user guides, and others. These changes can be a source of degradation in the functional, qualitative or behavioral terms of the modified software. Hence the need for a unified approach to the extraction and representation of the different heterogeneous artifacts, in order to ensure a unified and detailed description of them, exploitable by several software tools and allowing those responsible for the evolution to reason about the changes concerned.
Abstract: Heterogeneous repolarization causes dispersion of the T-wave and has been linked to arrhythmogenesis. Such heterogeneities appear due to differential expression of ionic currents in different regions of the heart, in both healthy and diseased animals and humans. Mice are important animals for the study of heart diseases because of the ability to create transgenic animals. We used our previously reported model of mouse ventricular myocytes to develop a 2D mouse ventricular tissue model consisting of 14,000 cells (apical or septal ventricular myocytes) and to study the stability of action potential propagation and Ca2+ dynamics. The 2D tissue model was implemented as a FORTRAN program for high-performance multiprocessor computers that runs on 36 processors. Our tissue model is able to simulate heterogeneities not only in action potential repolarization, but also in intracellular Ca2+ transients. The multicellular model reproduced experimentally observed velocities of action potential propagation and demonstrated the importance of incorporating realistic Ca2+ dynamics for action potential propagation. The simulations show that relatively sharp gradients of repolarization are predicted to exist in 2D mouse tissue models, and that they are primarily determined by the cellular properties of ventricular myocytes. Abrupt local gradients of channel expression can cause alternans at longer pacing basic cycle lengths than gradual changes, and the development of alternans depends on the site of stimulation.
Abstract: Problems of high complexity have long been a challenge in combinatorial optimization. Due to their non-deterministic polynomial characteristics, these problems usually entail an unreasonable search budget, so combinatorial optimization has attracted numerous researchers to develop better algorithms. Most recent academic research focuses on enhancing conventional evolutionary algorithms and incorporating local heuristics such as VNS, 2-opt and 3-opt. Although the performance gains from introducing local strategies are significant, these improvements do not carry over to different problems. Therefore, this research proposes a meta-heuristic evolutionary algorithm, BBEA, which can be applied to several types of problems. The results validate that BBEA is able to solve these problems even without the design of local strategies.
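For concreteness, the 2-opt local heuristic mentioned above (here on a travelling-salesman tour) repeatedly reverses a tour segment whenever doing so shortens the tour. This is a minimal sketch of the generic heuristic, independent of BBEA itself:

```python
import math

def tour_length(tour, pts):
    """Total length of a closed tour over points `pts`."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, pts):
    """Reverse segments while any reversal shortens the tour (first-improvement)."""
    best = list(tour)
    improved = True
    while improved:
        improved = False
        for i in range(1, len(best) - 1):
            for j in range(i + 2, len(best) + 1):
                cand = best[:i] + best[i:j][::-1] + best[j:]
                if tour_length(cand, pts) < tour_length(best, pts) - 1e-12:
                    best, improved = cand, True
    return best
```

Such a move untangles crossing edges, which is exactly the kind of problem-specific local repair that BBEA aims to do without.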
Abstract: One of the major features of hypermedia learning is its non-linear structure, which allows learners the opportunity of flexible navigation to accommodate their own needs. Nevertheless, such flexibility can also cause problems, such as inefficient navigation and disorientation, for some learners, especially those with Field Dependent (FD) cognitive styles. As a result, students' learning performance can deteriorate and, in turn, they can develop negative attitudes towards hypermedia learning systems. It has been suggested that visual elements can be used to compensate for these dilemmas. However, it is unclear whether these visual elements improve learning or whether problems still exist. The aim of this study is to investigate the effect of students' cognitive styles and visual elements on students' learning performance and attitudes in a hypermedia learning environment. The Cognitive Style Analysis (CSA), learning outcome measures in terms of pre- and post-tests, a practical task, and an Attitude Questionnaire (AQ) were administered to a sample of 60 university students. The findings revealed that FD students performed equally to Field Independent (FI) students. Also, FD students experienced more disorientation in the hypermedia learning system, where they depend heavily on the visual elements for navigation and orientation purposes. Furthermore, they had more positive attitudes towards the visual elements, which saved them from navigation and disorientation dilemmas. In contrast, FI students were more comfortable and did not get disturbed by, or did not need, some of the visual elements in the hypermedia learning system.
Abstract: Nowadays, more engineering systems are using some
kind of Artificial Intelligence (AI) for the development of their
processes. Some well-known AI techniques include artificial neural
nets, fuzzy inference systems, and neuro-fuzzy inference systems
among others. Furthermore, many decision-making applications base
their intelligent processes on Fuzzy Logic, due to the capability of Fuzzy Inference Systems (FIS) to deal with problems that are based on user knowledge and experience. Also, knowing that users vary widely and generally provide uncertain data, this information can be used and properly processed by a FIS. To properly handle uncertainty and inexact system input values, FIS normally use Membership Functions (MF) that represent a degree of user satisfaction with certain conditions and/or constraints. In defining the parameters of the MFs, the knowledge of experts in the field is very important. This knowledge defines the MF shape used to process the user inputs, and through fuzzy reasoning and inference mechanisms the FIS can provide an "appropriate" output.
However, an important issue immediately arises: how can it be assured that the obtained output is the optimum solution? How can it be guaranteed that each MF has an optimum shape? A viable solution to these questions is optimization of the MF parameters. In this paper a novel parameter optimization process is presented. The process for FIS parameter optimization consists of five simple steps that can be easily realized off-line. The proposed process is demonstrated by its implementation on an Intelligent Interface section dealing with the on-line customization / personalization of internet portals applied to E-commerce.
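The role a membership function plays in inference can be sketched with a triangular MF and a zero-order Sugeno-style weighted-average output; the MF parameters and rule consequents below are hypothetical, purely for illustration of what the optimization process would tune:

```python
def tri_mf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def sugeno_output(x, rules):
    """Weighted average of crisp consequents, weighted by rule firing strength.
    rules: list of ((a, b, c), consequent) pairs with triangular antecedent MFs."""
    num = den = 0.0
    for (a, b, c), consequent in rules:
        w = tri_mf(x, a, b, c)
        num += w * consequent
        den += w
    return num / den if den else 0.0
```

Shifting the (a, b, c) parameters changes which rules fire and how strongly, which is precisely why optimizing MF shapes changes the quality of the FIS output.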