Abstract: Selection of maize (Zea mays) hybrids with wide adaptability across diverse farming environments is important before recommending them, in order to achieve a high rate of hybrid adoption. Grain yield of 14 maize hybrids, tested in a randomized complete-block design with four replicates across 22 environments in Iran, was analyzed using the site regression (SREG) stability model. The biplot technique facilitates a visual evaluation of superior genotypes, which is useful for cultivar recommendation and mega-environment identification. The objectives of this study were (i) to identify suitable hybrids with both high mean performance and high stability, and (ii) to determine mega-environments for maize production in Iran. Biplot analysis identified two mega-environments in this study. The first mega-environment included KRM, KSH, MGN, DZF A, KRJ, DRB, DZF B, SHZ B, and KHM, where hybrid G10 was the best-performing hybrid. The second mega-environment included ESF B, ESF A, and SHZ A, where hybrid G4 was the best hybrid. According to the ideal-hybrid biplot, hybrid G10 was better than all other hybrids, followed by hybrids G1 and G3. These hybrids were identified as the best hybrids, combining high grain yield with high yield stability. GGE biplot analysis provided a framework for identifying target testing locations that discriminate among genotypes and identify those that are both high yielding and stable.
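As a minimal illustration of the SREG/GGE idea (not the study's data), the biplot axes come from a rank-2 singular value decomposition of the environment-centered yield matrix. The sketch below uses a small hypothetical genotype-by-environment matrix; all yield values are invented for illustration.

```python
import numpy as np

# Hypothetical genotype-by-environment yield matrix (4 genotypes x 3
# environments); values are illustrative, not the study's data.
Y = np.array([
    [6.1, 5.8, 7.0],
    [5.5, 6.2, 6.4],
    [7.2, 6.9, 7.8],
    [5.0, 5.5, 5.9],
])

# SREG (GGE) removes the environment main effect, leaving genotype
# main effect plus genotype-by-environment interaction (G + GE).
G_plus_GE = Y - Y.mean(axis=0)

# A rank-2 SVD gives the scores plotted on the two biplot axes.
U, s, Vt = np.linalg.svd(G_plus_GE, full_matrices=False)
genotype_scores = U[:, :2] * s[:2]   # genotype markers on the biplot
environment_scores = Vt[:2, :].T     # environment markers on the biplot

# The sign factor makes the result independent of the SVD's arbitrary
# sign convention: the genotype with the largest first-axis score has
# the highest mean performance across these environments.
sign = np.sign(environment_scores[:, 0].mean())
best = int(np.argmax(genotype_scores[:, 0] * sign))
```

Here `best` picks out the hypothetical genotype with the highest mean performance, mirroring how the "ideal hybrid" is read off the biplot's first axis.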
Abstract: In this paper we present a novel approach to estimating human body configuration from silhouettes. We address this problem under the Bayesian framework, using an effective model-based MCMC (Markov Chain Monte Carlo) method in which the best configuration is defined as the maximum a posteriori (MAP) estimate in the Bayesian model. This model-based MCMC uses the human body model to drive the sampling of the solution space: it converts the original high-dimensional space into a restricted subspace constructed from the human model, and uses a hybrid sampling algorithm. We choose an explicit human model and carefully select the likelihood functions to represent the best configuration solution. Experiments show that this method obtains accurate configurations efficiently for different humans viewed from multiple viewpoints.
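The MAP-by-MCMC idea can be sketched in one dimension. The toy log-posterior below stands in for the (much higher-dimensional) body-configuration posterior, and plain Metropolis-Hastings stands in for the paper's model-driven hybrid sampler; the proposal width and mode location are arbitrary choices for illustration.

```python
import math
import random

random.seed(0)

# Toy log-posterior: a 1-D Gaussian centered at 2.0 stands in for the
# body-configuration posterior described in the abstract.
def log_posterior(x):
    return -0.5 * (x - 2.0) ** 2

# Plain Metropolis-Hastings; the paper's sampler additionally restricts
# proposals to a body-model subspace, which this sketch does not do.
x = 0.0
best_x, best_lp = x, log_posterior(x)
for _ in range(5000):
    proposal = x + random.gauss(0.0, 0.5)       # random-walk proposal
    # Accept with probability min(1, p(proposal)/p(x)).
    if math.log(random.random()) < log_posterior(proposal) - log_posterior(x):
        x = proposal
    if log_posterior(x) > best_lp:               # track the MAP sample
        best_x, best_lp = x, log_posterior(x)

# best_x approximates the MAP configuration (here, near 2.0).
```

Tracking the highest-probability sample visited, rather than averaging samples, is what turns the sampler into a MAP estimator.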
Abstract: This study investigated the sampling and analysis of water quality in water reservoirs and water towers installed in two kinds of residential buildings and in school facilities. Water-quality data were collected for correlation analysis with the frequency of sanitization of the water reservoir, obtained by questioning building managers about the inspection charts recorded for the reservoir equipment. A statistical software package (SPSS) was applied to the data of the two groups (cleaning frequency and water quality) for regression analysis, in order to determine the optimal frequency of sanitization. The correlation coefficient (R) in this paper represents the degree of correlation, with values of R ranging from +1 to -1. After investigating three categories of drinking-water users, this study found that the frequency of sanitization of the water reservoir significantly influenced drinking-water quality. A higher frequency of sanitization (more than four times per year) implied a higher quality of drinking water. The results indicate that water reservoirs and water towers should be sanitized at least twice annually to achieve safe drinking water.
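The correlation-and-regression step the abstract describes reduces to a few formulas. The sketch below computes Pearson's R and a least-squares line on hypothetical frequency/quality data (the study's measurements are not reproduced here).

```python
# Hypothetical data: annual sanitization frequency vs. a water-quality
# score; values are illustrative, not the study's measurements.
freq = [1, 2, 3, 4, 5, 6]           # sanitizations per year
quality = [55, 62, 70, 78, 83, 88]  # water-quality score

n = len(freq)
mean_f = sum(freq) / n
mean_q = sum(quality) / n
cov = sum((f - mean_f) * (q - mean_q) for f, q in zip(freq, quality))
var_f = sum((f - mean_f) ** 2 for f in freq)
var_q = sum((q - mean_q) ** 2 for q in quality)

r = cov / (var_f * var_q) ** 0.5    # Pearson's R, in [-1, +1]
slope = cov / var_f                  # least-squares regression slope
intercept = mean_q - slope * mean_f  # regression intercept
```

A positive `r` close to +1, as here, is the pattern the study reports: more frequent sanitization goes with higher water quality.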
Abstract: Ethanol is widely used as a therapeutic reagent against hepatocellular carcinoma (HCC, or hepatoma), as it can induce hepatocellular carcinoma cell apoptosis at low concentration through a multifactorial process regulated by several unknown proteins. This paper provides a simple and accessible proteomic strategy for exploring differentially expressed proteins in the apoptotic pathway. The appropriate concentrations of ethanol required to induce HepG2 cell apoptosis were first assessed by MTT assay, Giemsa staining, and fluorescence staining. Next, the central proteins involved in the apoptosis pathway were determined using 2D-PAGE, SDS-PAGE, and bio-software analysis. Finally, the downregulation of two proteins, AFP and survivin, was confirmed by immunocytochemistry and reverse transcriptase PCR (RT-PCR). The simple, useful method demonstrated here provides a new approach to proteomic analysis of key bio-regulatory processes, including proliferation, differentiation, apoptosis, immunity, and metastasis.
Abstract: This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to specify the structure of the model completely. Two different types of neural networks were applied to a pulping problem. A three-layer feedforward neural network, trained with Preconditioned Conjugate Gradient (PCG) methods, was used in this investigation. Preconditioning improves convergence by lowering the condition number and increasing the clustering of the eigenvalues: the idea is to solve the modified problem M^-1 A x = M^-1 b, where M is a positive-definite preconditioner closely related to A. We focused mainly on PCG-based training methods originating from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves Update (PCGF), Preconditioned Conjugate Gradient with Polak-Ribiere Update (PCGP), and Preconditioned Conjugate Gradient with Powell-Beale Restarts (PCGB). In the simulations, the behavior of the PCG methods proved robust against phenomena such as oscillations due to large step sizes.
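The M^-1 A x = M^-1 b idea can be made concrete with a linear preconditioned conjugate gradient solver. The sketch below uses a simple Jacobi preconditioner (M = diag(A)) on a tiny symmetric positive-definite system; the matrix, the preconditioner choice, and the tolerances are illustrative, not the paper's training setup, where PCG is applied to nonlinear network training.

```python
import numpy as np

# Jacobi-preconditioned conjugate gradient for A x = b, illustrating the
# M^-1 A x = M^-1 b idea with M = diag(A).
def pcg(A, b, tol=1e-10, max_iter=100):
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x                      # initial residual
    M_inv = 1.0 / np.diag(A)           # Jacobi preconditioner inverse
    z = M_inv * r                      # preconditioned residual
    p = z.copy()                       # initial search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)          # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p      # standard PCG beta update
        rz = rz_new
    return x

# Small SPD test system (illustrative, not the pulping data).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = pcg(A, b)
```

The better the preconditioner clusters the eigenvalues of M^-1 A, the fewer iterations PCG needs, which is the convergence benefit the abstract cites.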
Abstract: We have measured the pressure drop and convective heat transfer coefficient of water-based Al (25 nm), Al2O3 (30 nm), and CuO (50 nm) nanofluids flowing through a uniformly heated circular tube in the fully developed laminar flow regime. The experimental data for the nanofluid friction factor show good agreement with the analytical prediction of the Darcy equation for single-phase flow. After reducing the experimental results to the form of Reynolds, Rayleigh, and Nusselt numbers, the results show how the local Nusselt number and temperature are distributed along the non-dimensional axial distance from the tube entry. The nanofluids were found to behave as Newtonian fluids, based on the linear relationship between shear stress and shear rate, for three series of Al, Al2O3, and CuO-water nanofluids with concentrations ranging from 0.25 to 2.5 vol%. In addition, the thermophysical properties of the nanofluids (viscosity, specific heat, and density) were measured in order to verify the validity of the property equations developed by other researchers in this area; the difference between the experimental equations and the measured values did not exceed 3.5%. The study also quantified the increase in heat transfer coefficient for the three nanofluids (Al, Al2O3, and CuO-water): the increases are, respectively, 45%, 32%, and 25% with insulation, and 36%, 23%, and 19% without insulation, demonstrating that using insulation gives a better heat transfer enhancement than not using it. Three types of nanoparticles, one metallic and two oxides, were thus compared to determine which gives the best increase in heat transfer.
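The single-phase laminar comparison the abstract mentions is the classical Darcy result f = 64/Re, combined with the Darcy-Weisbach pressure drop. The sketch below evaluates both for illustrative water-like conditions (the density, viscosity, velocity, and tube dimensions are assumed values, not the experiment's).

```python
# Darcy friction factor check for fully developed laminar pipe flow,
# the comparison the abstract reports; all numbers are illustrative.
rho = 998.0   # kg/m^3, water-like density
mu = 1.0e-3   # Pa*s, dynamic viscosity
v = 0.05      # m/s, mean velocity
D = 0.01      # m, tube diameter
L = 1.0       # m, tube length

Re = rho * v * D / mu                  # Reynolds number (laminar if < ~2300)
f = 64.0 / Re                          # Darcy friction factor, laminar theory
dp = f * (L / D) * rho * v**2 / 2.0    # pressure drop, Darcy-Weisbach (Pa)
```

Agreement between measured friction factors and f = 64/Re is the standard validation that a nanofluid still behaves as a single-phase Newtonian fluid in laminar flow.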
Abstract: One of the main advantages of the LO paradigm is that it makes good-quality, shareable learning material available through the Web. The effectiveness of the retrieval process requires a formal description of the resources (metadata) that closely fits the user's search criteria; in spite of the huge international efforts in this field, educational metadata schemata often fail to fulfil this requirement. This work aims to improve the situation through the definition of a metadata model capturing specific didactic features of shareable learning resources. It classifies LOs into "teacher-oriented" and "student-oriented" categories, in order to describe the role an LO is to play when it is integrated into the educational process. This article describes the model and a first experimental validation process carried out in a controlled environment.
Abstract: 3-hydroxy-3-methylglutaryl coenzyme A reductase (HMGR) catalyzes the conversion of HMG-CoA to mevalonate using NADPH and is involved in the rate-controlling step of the mevalonate pathway. Inhibition of HMGR is considered an effective way to lower cholesterol levels, making it a drug target for treating hypercholesterolemia, a major risk factor for cardiovascular disease. To discover novel HMGR inhibitors, we performed structure-based pharmacophore modeling combined with molecular dynamics (MD) simulation. Four HMGR inhibitors were used for MD simulation, and a representative structure from each simulation was selected by clustering analysis. Four structure-based pharmacophore models were generated using these representative structures. The generated models were validated and then used in virtual screening to find novel scaffolds for inhibiting HMGR. The screened compounds were filtered by applying drug-like properties and used in molecular docking. Finally, four hit compounds were obtained, and these complexes were refined using energy minimization. These compounds may be potential leads for designing novel HMGR inhibitors.
Abstract: The problem of robust stability and robust stabilization for a class of discrete-time uncertain systems with time delay is investigated. Based on the Chebyshev inequality, and by constructing a new augmented Lyapunov function, improved sufficient conditions ensuring exponential stability and stabilization are established. These conditions are expressed in the form of linear matrix inequalities (LMIs), whose feasibility can easily be checked using the MATLAB LMI Toolbox. Compared with some previous results in the literature, the newly obtained criteria are less conservative. Two numerical examples are provided to demonstrate the improvement and effectiveness of the proposed method.
Abstract: This paper explores the effectiveness of machine learning techniques in detecting firms that issue fraudulent financial statements (FFS) and deals with the identification of factors associated with FFS. To this end, a number of experiments were conducted using representative learning algorithms, trained on a data set of 164 fraud and non-fraud Greek firms over the period 2001-2002. Deciding which particular method to choose is a complicated problem; a good alternative to choosing only one method is to create a hybrid forecasting system incorporating a number of possible solution methods as components (an ensemble of classifiers). For this purpose, we have implemented a hybrid decision support system that combines the representative algorithms using a stacking variant methodology and achieves better performance than any of the examined simple and ensemble methods. To sum up, this study indicates that the investigation of financial information can be used in the identification of FFS and underlines the importance of financial ratios.
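The stacking idea of combining base classifiers through a meta-learner can be sketched with scikit-learn. The data below are synthetic (a stand-in for the 164-firm ratio data set), and the particular base learners and meta-learner are assumed choices for illustration, not the paper's exact algorithms.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the 164-firm financial-ratio data set.
X, y = make_classification(n_samples=164, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Stacking: cross-fitted base-learner predictions become the features
# of a meta-learner, the core of the "stacking variant" the abstract
# describes (base/meta learner choices here are illustrative).
stack = StackingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("forest", RandomForestClassifier(n_estimators=50, random_state=0)),
    ],
    final_estimator=LogisticRegression(),
)
stack.fit(X_tr, y_tr)
accuracy = stack.score(X_te, y_te)
```

The meta-learner sees only the base learners' predictions, so it learns when to trust each component, which is how a stacked ensemble can beat any single member.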
Abstract: Grid composite structures have many applications in the aerospace industry, where they frequently carry transverse loads. In the present paper, a stiffened composite cylindrical shell with a clamped-free boundary condition under a transverse end load was studied experimentally and numerically. Electrical strain gauges were employed to measure the strains, and a finite element analysis was performed to validate the experimental results; the FEM software used was ANSYS 11. In addition, the results for the stiffened composite shell were compared with those for an unstiffened composite shell. It was observed that the intersection of two stiffeners has an important effect in decreasing the stress in the shell. Fairly good agreement was observed between the numerical and the measured results. To the best of our knowledge, no comparable investigation of grid composite structures has been reported in recent studies.
Abstract: In any distributed system, process scheduling plays a vital role in determining the efficiency of the system. Process scheduling algorithms are used to ensure that the components of the system maximize their utilization and complete all assigned processes within a specified period of time. This paper focuses on the development of a comparative simulator for distributed process scheduling algorithms. The objectives of the work carried out include the development of the comparative simulator, as well as a comparative study of three distributed process scheduling algorithms: the sender-initiated, receiver-initiated, and hybrid sender-receiver-initiated algorithms. The comparative study was based on the Average Waiting Time (AWT) and Average Turnaround Time (ATT) of the processes involved. The simulation results show that the performance of the algorithms depends on the number of nodes in the system.
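The two metrics the simulator compares, AWT and ATT, are easy to define concretely. The sketch below computes both for first-come-first-served service on a single node with hypothetical burst times (the paper's simulator spreads processes across multiple nodes; this is only the metric computation).

```python
# FCFS on a single node, illustrating the AWT/ATT metrics the simulator
# compares (process data are hypothetical; all processes arrive at t = 0).
bursts = [5, 3, 8, 6]                  # CPU burst times

waiting, turnaround, clock = [], [], 0
for burst in bursts:
    waiting.append(clock)              # time spent queued before service
    clock += burst
    turnaround.append(clock)           # completion time minus arrival (0)

awt = sum(waiting) / len(waiting)          # Average Waiting Time
att = sum(turnaround) / len(turnaround)    # Average Turnaround Time
```

Note that ATT always equals AWT plus the mean burst time, so a scheduling algorithm improves both metrics only by reordering or redistributing the queueing delay.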
Abstract: The separation of dissolved gas, including dissolved oxygen, can enable a human to breathe under water. In a sudden shipwreck or tsunami, a person can drown quickly because they cannot breathe under water. To avoid this crisis, dissolved gas separated from the water can be used for breathing while submerged, and air can be used while escaping from the water. In this thesis, we investigated the separation characteristics of dissolved gas using a pipe-type polypropylene hollow fiber membrane and a nude-type polysulfone hollow fiber membrane. Hollow fiber membranes with good characteristics under water are used to separate the dissolved gas, while membranes with good characteristics in air are used to transfer air. A combination of membranes with good separation characteristics under water and good transfer characteristics in air can be used to breathe under water instantly and survive a crisis. The results showed that polypropylene performed better than polysulfone under both air and water conditions.
Abstract: All around the world, pulp and paper industries are among the largest production plants, and environmental pollution is the biggest challenge facing pulp manufacturing operations. The concern in these industries is to produce a high volume of paper of high quality and low cost without affecting the environment. The results obtained from this bleaching study show that activation of peroxide is an effective method of reducing the total applied charge of chlorine dioxide, which is harmful to the environment, and also show that softwood and hardwood Kraft pulps responded linearly to the peroxide treatments. During the bleaching process the production plant generates chlorinated compounds. In the trial stages, the chlorine dioxide charge was reduced by 3 kg/ton, lowering the pulp brightness from 65% ISO to 60% ISO, and the dosing point was returned to the E-stage charges by pre-treating the Kraft pulps with hydrogen peroxide. In its quest to be environmentally friendly, the pulp and paper industry has developed elemental chlorine free (ECF) and totally chlorine free (TCF) bleaching, and has been looking at ways to turn the ECF process into a TCF process while still remaining competitive. This prompted the present research to investigate the capability of hydrogen peroxide, as a catalyst, to reduce chlorine dioxide.
Abstract: A previous study of a new metal gasket showed that contact width and contact stress are important design parameters for optimizing metal gasket performance, and an optimum design based on elastic and plastic contact stress was found. However, the influence of flange surface roughness had not been investigated thoroughly, and flanges exhibit many kinds of surface roughness. In this study, we developed a gasket model that includes the effect of flange surface roughness. A finite element method was employed to develop the simulation. A uniform quadratic mesh was used for the gasket material and a gradually refined quadrilateral mesh for the flange. The gasket model was simulated in two stages: a forming simulation and a tightening simulation. The simulation results show that a smoother flange surface yields a higher slope of force per unit length, which means that the gasket is squeezed more firmly against the flange. The slope of force per unit length for the 400-MPa gasket mode was higher than for the 0-MPa gasket mode.
Abstract: This work considered the thermodynamic feasibility of scrubbing volatile organic compounds (VOCs) into biodiesel, in view of designing a gas treatment process with this absorbent. A detailed vapour-liquid equilibrium investigation was performed using the original UNIFAC group contribution method. The four biodiesels studied in this work are methyl oleate, methyl palmitate, methyl linolenate, and ethyl stearate. The original UNIFAC procedure was used to estimate the infinite dilution activity coefficients of 13 selected volatile organic compounds in the biodiesels. The calculations were done at a VOC mole fraction of 9.213x10^-8. Ethyl stearate gave the most favourable phase equilibrium. Close agreement was found between the infinite dilution activity coefficient of toluene found in this work and those reported in the literature. Thermodynamic models can thus efficiently be used to calculate a vast amount of phase equilibrium behaviour from a limited number of experimental data.
Abstract: Recent years have seen a growing trend towards the integration of multiple information sources to support large-scale prediction of protein-protein interaction (PPI) networks in model organisms. Despite advances in computational approaches, the combination of multiple "omic" datasets representing the same type of data, e.g. different gene expression datasets, has not been rigorously studied. Furthermore, there is a need to investigate further the inference capability of powerful approaches, such as fully-connected Bayesian networks, in the context of PPI network prediction. This paper addresses these limitations by proposing a Bayesian approach to integrating multiple datasets, some of which encode the same type of "omic" data, to support the identification of PPI networks. The case study reported involved the combination of three gene expression datasets relevant to human heart failure (HF). In comparison with two traditional methods, the Naive Bayesian and maximum likelihood ratio approaches, the proposed technique can accurately identify known PPIs and can be applied to infer potentially novel interactions.
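The naive Bayesian baseline the abstract compares against combines evidence multiplicatively. The sketch below shows that scheme for one candidate protein pair; the prior odds and the per-dataset likelihood ratios are invented numbers for illustration, not values from the HF case study.

```python
# Naive-Bayes-style evidence integration: each "omic" dataset contributes
# a likelihood ratio for a candidate protein pair, and the combined score
# is their product times the prior odds (all numbers are illustrative).
prior_odds = 0.001                 # prior odds that a random pair interacts

# Likelihood ratios from three (hypothetical) gene expression datasets.
likelihood_ratios = [12.0, 5.0, 30.0]

posterior_odds = prior_odds
for lr in likelihood_ratios:
    posterior_odds *= lr           # assumes datasets are conditionally
                                   # independent given interaction status

posterior_prob = posterior_odds / (1.0 + posterior_odds)
```

The conditional-independence assumption in the product is exactly what breaks down when several datasets encode the same type of "omic" data, which motivates the paper's richer Bayesian treatment.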
Abstract: Since 1984, many schemes have been proposed for digital signature protocols, among them schemes based on the discrete logarithm and on factorization. More recently, a new identification scheme based on iterated function systems (IFS) was proposed and shown to be more efficient. In this study the proposed identification scheme is transformed into a digital signature scheme by using a one-way hash function; it is a generalization of the GQ signature scheme. The attractor of the IFS is used to obtain the public key from a private one, and in the encryption and decryption of a hash function. Our aim is to provide techniques and tools that may be useful for developing cryptographic protocols. Comparisons between the proposed scheme and a fractal digital signature scheme based on the RSA setting, as well as with the conventional Guillou-Quisquater signature and RSA signature schemes, are performed to show that the proposed scheme is efficient and offers high performance.
Abstract: There are three distinct stages in the evolution of
economic thought, namely:
1. in the first stage, the major concern was to accelerate
economic growth and increase the availability of material
goods, especially in developing economies with very low
living standards, because faster economic growth was seen
as the route to poverty eradication.
2. in the second stage, economists drew a distinction between
growth and development. Development was seen as going
beyond economic growth, bringing certain changes in the
structure of the economy, with more equitable distribution
of the benefits of growth and with growth becoming
automatic and sustained.
3. the third stage has now been reached. Our concern is now
with "sustainable development", that is, development not
only for the present but also for the future.
Thus the focus has shifted from "sustained growth" to "sustainable
development". Sustainable development brings to the fore the long-term
relationship between ecology and economic development.
Since the creation of UNEP in 1972, it has worked for development
without destruction, that is, for environmentally sound and
sustainable development. It was realised that the environment cannot
be viewed in a vacuum: it is not separate from development, nor in
competition with it. UNEP advocated the integration of the environment
with development, whereby ecological factors enter into development
planning, socio-economic policies, cost-benefit analysis, trade,
technology transfer, waste management, education, and other specific areas.
Industrialisation has contributed to the growth of the economies of
several countries. It has improved the living standards of their people
and provided benefits to society. In the process, however, it has also
created great environmental problems, such as climate change, forest
destruction and denudation, and soil erosion and desertification.
On the other hand, industry has provided jobs and improved the
prospects of wealth for the industrialists, while working-class
communities have simply had to put up with high levels of pollution
in order to keep their jobs and protect their incomes.
The environmental problem has many roots; they may lie in the
political, economic, cultural and technological conditions of modern
society. Experts concede, however, that industrial growth lies
somewhere close to the heart of the matter. Therefore, the objective
of this paper is not to document all the roots of the environmental
crisis, but rather to discuss the effects of industrial growth and
development.
We have come to the conclusion that although public intervention
is often unnecessary to ensure that perfectly competitive markets
function in society's best interests, such intervention is necessary
when firms or consumers pollute.
Abstract: The spatial variation in plant species associated with intercropping is intended to reduce resource competition between species and increase yield potential. A field experiment on corn (Zea mays L.) and soybean (Glycine max L.) intercropping was carried out in a replacement-series design, with weed contamination treatments consisting of: weed free, infestation of redroot pigweed, infestation of jimsonweed, and simultaneous infestation of redroot pigweed and jimsonweed, in Karaj, Iran during the 2007 growing season. The experimental design was a randomized complete block in a factorial arrangement with three replicates. Significant (P≤0.05) differences in yield were observed in intercropping. Corn yield was higher in intercropping, but soybean yield was significantly reduced by corn when intercropped. However, total productivity and land use efficiency were high under the intercropping system, even with contamination by either weed species. The aggressivity of corn relative to soybean revealed the greater competitive ability of corn. A land equivalent ratio (LER) greater than 1 in all treatments indicated an intercropping advantage, and LER was highest in the 50:50 (corn/soybean) weed-free treatment. These findings suggest that intercropping corn and soybean increases total productivity per unit area and improves land use efficiency. Considering the experimental findings, corn-soybean intercropping (50:50) may be recommended for its yield advantage, more efficient utilization of resources, and weed suppression as a biological control.
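The land equivalent ratio used as the advantage criterion above has a simple form: LER is the sum, over crops, of intercrop yield divided by monoculture yield. The sketch below evaluates it for hypothetical corn and soybean yields (illustrative values, not the trial's data).

```python
# Land Equivalent Ratio for a hypothetical corn-soybean intercrop:
# LER = sum over crops of (intercrop yield / monoculture yield).
corn_inter, corn_mono = 5.2, 7.0   # t/ha, illustrative values
soy_inter, soy_mono = 1.4, 2.8     # t/ha, illustrative values

ler = corn_inter / corn_mono + soy_inter / soy_mono

# LER > 1 indicates an intercropping advantage: growing the two crops
# together needs less land than growing each alone for the same output.
advantage = ler > 1.0
```

An LER of, say, 1.24 means a sole-cropped farm would need 24% more land to match the intercrop's combined output, which is the sense in which the study reports an advantage in all treatments.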