Influence of Drought on Yield and Yield Components in White Bean

In order to study seed yield and seed yield components in bean under reduced-irrigation conditions and to assess the drought tolerance of genotypes, 15 lines of white bean were evaluated in two separate randomized complete block (RCB) designs with three replications under stress and non-stress conditions. Analysis of variance showed significant differences among varieties for the traits under study, indicating the existence of genetic variation among varieties. The results indicate that drought stress reduced seed yield, number of seeds per plant, biological yield, and number of pods in white bean. Under non-stress conditions, yield was highly correlated with biological yield, whereas under stress conditions it was highly correlated with harvest index. Stepwise regression showed that selection can be based on biological yield, harvest index, number of seeds per pod, seed length, and 100-seed weight. Path analysis showed that the highest direct effect, being positive, was related to biological yield under non-stress conditions and to harvest index under stress conditions. Factor analysis was carried out for the stress and non-stress conditions; four factors explained more than 76 percent of the total variation. Several selection indices, such as the Stress Susceptibility Index (SSI), Geometric Mean Productivity (GMP), Mean Productivity (MP), Stress Tolerance Index (STI), and Tolerance Index (TOL), were used to study the drought tolerance of the genotypes. The best indices for selecting tolerant genotypes were STI, GMP, and MP, which showed the greatest correlations with seed yield under both stress and non-stress conditions. In the classification of genotypes based on phenotypic characteristics using cluster analysis (UPGMA), all genotypes were classified into five separate groups under stress and non-stress conditions.
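
The selection indices named above have standard formulations in terms of each genotype's yield under stress (Ys) and non-stress (Yp) conditions. As a minimal sketch of how they can be computed, assuming NumPy and the commonly used definitions (the abstract does not restate the formulas):

```python
import numpy as np

def drought_tolerance_indices(yp, ys):
    """Common drought-tolerance selection indices.

    yp : array of genotype yields under non-stress (potential) conditions
    ys : array of genotype yields under stress conditions
    """
    yp = np.asarray(yp, dtype=float)
    ys = np.asarray(ys, dtype=float)

    si = 1.0 - ys.mean() / yp.mean()      # stress intensity of the trial
    ssi = (1.0 - ys / yp) / si            # Stress Susceptibility Index
    tol = yp - ys                         # Tolerance Index
    mp = (yp + ys) / 2.0                  # Mean Productivity
    gmp = np.sqrt(yp * ys)                # Geometric Mean Productivity
    sti = (yp * ys) / yp.mean() ** 2      # Stress Tolerance Index

    return {"SSI": ssi, "TOL": tol, "MP": mp, "GMP": gmp, "STI": sti}
```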

Coding of DWT Coefficients using Run-length Coding and Huffman Coding for the Purpose of Color Image Compression

In the present paper, we propose a simple and effective method to compress an image. The method reduces the size of an image without significantly compromising its quality. The Haar wavelet transform is used to transform the original image, and after quantization and thresholding of the DWT coefficients, run-length coding and Huffman coding schemes are used to encode the image. The DWT is the basis of the quite popular JPEG 2000 technique.
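
A minimal sketch of this pipeline, assuming Python with the PyWavelets package, a single-level Haar transform, and arbitrary threshold and quantization-step values; the final Huffman stage, omitted here, would operate on the run/value pairs:

```python
import numpy as np
import pywt  # PyWavelets

def compress_sketch(image, threshold=10.0, step=8.0):
    """Haar DWT -> thresholding -> uniform quantization -> run-length coding."""
    # Single-level 2D Haar transform: approximation + (horizontal, vertical, diagonal) details
    cA, (cH, cV, cD) = pywt.dwt2(np.asarray(image, dtype=float), "haar")

    coeffs = np.concatenate([c.ravel() for c in (cA, cH, cV, cD)])
    coeffs[np.abs(coeffs) < threshold] = 0.0      # thresholding small coefficients
    q = np.round(coeffs / step).astype(int)       # uniform quantization

    # Run-length encode the (mostly zero) quantized coefficient stream
    runs = []
    value, count = q[0], 1
    for v in q[1:]:
        if v == value:
            count += 1
        else:
            runs.append((value, count))
            value, count = v, 1
    runs.append((value, count))
    return runs  # these run/value pairs would then be Huffman-coded
```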

Multi-Walled Carbon Nanotubes/Polyacrylonitrile Composite as Novel Semi-Permeable Mixed Matrix Membrane in Reverse Osmosis Water Treatment Process

A novel and simple method is introduced for rapid and highly efficient water treatment by the reverse osmosis (RO) method using a multi-walled carbon nanotube (MWCNT) / polyacrylonitrile (PAN) polymer composite as a flexible, highly efficient, reusable, and semi-permeable mixed matrix membrane (MMM). For this purpose, MWCNTs were directly synthesized and on-line purified by a chemical vapor deposition (CVD) process, followed by directing the MWCNT bundles towards an ultrasonic bath, in which PAN polymer was simultaneously suspended inside a solid porous silica support in water at a temperature of ~70 °C. The fabrication of the MMM was finally completed by a hot isostatic pressing (HIP) process. In accordance with the analytical figures of merit, the efficiency of the fabricated MMM was ~97%. The rate of the water treatment process was also evaluated to be 6.35 L min-1. The results reveal that the CNT-based MMM is suitable for rapid treatment of different forms of industrial, sea, drinking, and well water samples.

Suspended Matter Model on Alsat-1 Image by MLP Network and Mathematical Morphology: Prototypes by K-Means

In this article, we propose a methodology for the characterization of the suspended matter along the Bay of Algiers. An approach based on a multilayer perceptron (MLP) trained by back-propagation of the gradient, optimized by the Levenberg-Marquardt (LM) algorithm, is used. Emphasis is placed on the choice of the components of the training base, for which a comparative study was made of four methods: random selection and three variants of classification by K-Means. The samples are taken from a suspended-matter image obtained by an analytical model based on polynomial regression that takes in-situ measurements into account. The mask that selects the zone of interest (water in our case) was produced using a multispectral classification by the ISODATA algorithm. To improve the classification result, this mask was cleaned using the tools of mathematical morphology. The results of this study, presented in the form of curves, tables, and images, show the soundness of our methodology.
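
The study compares random selection against K-Means-based selection of training prototypes without detailing the exact variants; one common scheme, sketched below under that assumption with scikit-learn and hypothetical array names, is to cluster the candidate samples and keep the sample nearest each centroid:

```python
import numpy as np
from sklearn.cluster import KMeans

def select_prototypes(samples, targets, n_prototypes=50, random_state=0):
    """Pick training prototypes as the samples closest to K-Means centroids.

    samples : (n, d) array of candidate input vectors (e.g. pixel spectra)
    targets : (n,) array of associated suspended-matter values
    """
    km = KMeans(n_clusters=n_prototypes, n_init=10, random_state=random_state)
    labels = km.fit_predict(samples)

    proto_idx = []
    for k in range(n_prototypes):
        members = np.flatnonzero(labels == k)
        # keep the member nearest to its cluster centre as the prototype
        d = np.linalg.norm(samples[members] - km.cluster_centers_[k], axis=1)
        proto_idx.append(members[np.argmin(d)])

    proto_idx = np.array(proto_idx)
    return samples[proto_idx], targets[proto_idx]
```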

The Sublimation Energy of Metal versus Temperature and Pressure and its Influence on Blow-off Impulse

Based on thermodynamic theory, the dependence of the sublimation energy of metal on temperature and pressure is discussed, and the results indicate that the sublimation energy decreases linearly with increasing temperature and pressure. Combined with this result, the blow-off impulse of aluminum induced by a pulsed X-ray is simulated by the smoothed particle hydrodynamics (SPH) method. The numerical results show that, when the change of sublimation energy with temperature and pressure is taken into account, the blow-off impulse of aluminum is larger than in the case where the sublimation energy is assumed to be constant.
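
The fitted relation is not given in the abstract; a hedged way to express the stated linear decrease, with E_{s,0} the sublimation energy at a reference state (T_0, P_0) and α, β assumed positive coefficients, is:

```latex
E_s(T, P) \approx E_{s,0} - \alpha\,(T - T_0) - \beta\,(P - P_0), \qquad \alpha,\ \beta > 0
```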

Application of Mutual Information based Least dependent Component Analysis (MILCA) for Removal of Ocular Artifacts from Electroencephalogram

The electrical potentials generated during eye movements and blinks are one of the main sources of artifacts in electroencephalogram (EEG) recordings and can propagate widely across the scalp, masking and distorting brain signals. In recent times, signal separation algorithms have been widely used for removing artifacts from the observed EEG data. In this paper, a recently introduced signal separation algorithm, Mutual Information based Least dependent Component Analysis (MILCA), is employed to separate ocular artifacts from EEG. The aim of MILCA is to minimize the mutual information (MI) between the independent components (estimated sources) under a pure rotation. The performance of this algorithm is compared with eleven popular algorithms (Infomax, Extended Infomax, FastICA, SOBI, TDSEP, JADE, OGWE, MS-ICA, SHIBBS, Kernel-ICA, and RADICAL) in terms of the actual independence and uniqueness of the estimated source components obtained for different sets of EEG data with ocular artifacts, using a reliable MI estimator. Results show that MILCA is best at separating the ocular artifacts from the EEG and is recommended for further analysis.
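
MILCA itself is not available in common Python toolkits, so the sketch below uses scikit-learn's FastICA purely as a stand-in to illustrate the generic ICA-based artifact-removal workflow (decompose, zero out the components judged to be ocular, reconstruct); identifying which components are ocular is assumed to be done separately:

```python
import numpy as np
from sklearn.decomposition import FastICA

def remove_ocular_artifacts(eeg, artifact_components):
    """Generic ICA-based ocular-artifact removal.

    eeg : (n_samples, n_channels) array of EEG recordings
    artifact_components : indices of estimated sources judged to be ocular
    """
    ica = FastICA(n_components=eeg.shape[1], random_state=0)
    sources = ica.fit_transform(eeg)       # estimated independent components

    sources[:, artifact_components] = 0.0  # zero out the ocular components
    return ica.inverse_transform(sources)  # reconstruct the cleaned EEG
```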

Adaptive Early Packet Discarding Policy Based on Two Traffic Classes

Unlike the best-effort service provided by the Internet today, next-generation wireless networks will support real-time applications. This paper proposes an adaptive early packet discard (AEPD) policy to improve the performance of real-time TCP traffic over ATM networks and avoid the fragmentation problem. Three main aspects are incorporated in the proposed policy. First, quality-of-service (QoS) guarantees are provided for real-time applications by implementing priority scheduling. Second, the partially corrupted packets problem is resolved by differentiating the buffered cells of one packet from another. Third, a threshold is adapted dynamically, using fuzzy logic applied to the traffic behavior, to maintain high throughput under a variety of load conditions. The simulation is run for two priority classes of input traffic: real-time and non-real-time. Simulation results show that the proposed AEPD policy improves throughput and fairness over a static threshold under the same traffic conditions.
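
The fuzzy rule base is not spelled out in the abstract; the following is only a toy sketch of the underlying idea, lowering the discard threshold as the buffer fills and the arrival rate rises, with assumed membership functions and threshold bounds:

```python
def triangular(x, a, b, c):
    """Triangular membership function centred at b with feet at a and c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def adapt_threshold(buffer_occupancy, arrival_rate, t_min=0.3, t_max=0.9):
    """Toy fuzzy adaptation of the early-packet-discard threshold.

    buffer_occupancy, arrival_rate : normalized to [0, 1]
    Returns a discard threshold (fraction of buffer) between t_min and t_max.
    """
    # Fuzzify the inputs with a 'high' triangular set on each
    occ_high = triangular(buffer_occupancy, 0.4, 1.0, 1.6)
    load_high = triangular(arrival_rate, 0.4, 1.0, 1.6)

    # Rule: the fuller the buffer AND the heavier the load, the lower the threshold
    pressure = min(occ_high, load_high)
    return t_max - pressure * (t_max - t_min)
```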

Exergy Analysis of a Cogeneration Plant

Cogeneration may be defined as a system in which electricity production and recovery of the thermal value of the exhaust gases take place simultaneously. The examination is based on data from an active cogeneration plant. In this study, the aim is to determine which component of the system should be revised first to raise the efficiency and decrease the exergy loss. For this purpose, a second-law (exergy) analysis is applied to each component in order to consider the effects of the environmental conditions and to take the quality of energy into account as well as its quantity. The exergy balance equations are derived and the exergy loss is calculated for each component. Exergy losses of 44.44% in the heat exchanger, 29.59% in the combustion chamber, 18.68% in the steam boiler, 5.25% in the gas turbine, and 2.03% in the compressor are calculated.
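
For reference, a commonly used steady-state form of the component exergy balance, with ψ the specific flow exergy of a stream, subscript 0 denoting the dead (environmental) state, and kinetic and potential terms neglected, is:

```latex
\dot{X}_{\mathrm{dest}} = \sum_k \left(1 - \frac{T_0}{T_k}\right)\dot{Q}_k \;-\; \dot{W}
\;+\; \sum_{\mathrm{in}} \dot{m}\,\psi \;-\; \sum_{\mathrm{out}} \dot{m}\,\psi,
\qquad \psi = (h - h_0) - T_0\,(s - s_0)
```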

The Global Crisis, Remittance Transfers, and Livelihoods of the Poor

With the global financial crisis turning into what more and more appears to be a prolonged "Great Recession", we are witnessing marked reductions in remittance transfers to developing countries, with the likely possibility that overall flows will decline even further in the near future. With countless families relying on remittance inflows as a source of income to maintain their economic livelihood, a reduction would put many at risk of falling into, or deeper into, poverty. Recognizing the importance of remittance inflows as a lifeline to the poor, policy should aim to (1) reduce the barriers to remitting in both sending and receiving nations, thus easing the decline in transfers; (2) leverage the development impacts of remittances; and (3) buffer vulnerable groups dependent on remittance transfers as a source of livelihood through sound countercyclical macroeconomic policies.

Some Physico-Chemical Characteristics and Mineral Contents of Gilaburu (Viburnum opulus L.) Fruits in Turkey

Gilaburu (Viburnum opulus L.) grows naturally in Anatolia. In this study, some physico-chemical characteristics (sugar, acid, protein, crude fat, crude fiber, ash, etc.) and the mineral composition of Gilaburu fruit were investigated. The length, width, thickness, weight, total soluble solids, protein, crude ash, crude fiber, and crude oil of the fruit were found to be 1.12 cm, 1.58 cm, 1.87 cm, 0.87 g, 14.73%, 0.2%, 0.11%, 6.56% and 0.4%, respectively. The mean weight, length, width, and thickness of the fruit seed were determined to be 0.08 g, 7.76 cm, 7.67 cm, and 1.66, respectively. In addition, 27 mineral elements (Al, Mg, Na, Ba, Ca, Ni, Cd, P, Cr, Pb, S, Cu, Se, Fe, K, Sr, Li, Zn, V, Ag, Bi, Co, Mn, B, Ga, In, Ti) were analyzed. Gilaburu (Viburnum opulus L.) fruit was richest in K (10764.764 ppm), Mg (1289.088 ppm), and P (1304.169 ppm).

An Optimization of the New Die Design of Sheet Hydroforming by Taguchi Method

During the last few years, several sheet hydroforming processes have been introduced. Despite the advantages of these methods, they have some limitations. The two main processes are standard hydroforming and hydromechanical deep drawing. A new sheet hydroforming die set is proposed that has the advantages of both processes and eliminates their limitations. In this method, a polyurethane plate is used as part of the die set to control the blank holder force. This paper outlines the Taguchi optimization methodology, which is applied to optimize the effective parameters in forming cylindrical cups with the new sheet hydroforming die set. The process parameters evaluated in this research are polyurethane hardness, polyurethane thickness, forming pressure path, and polyurethane hole diameter. A design of experiments based on Taguchi's L9 orthogonal array was used, and analysis of variance (ANOVA) was employed to analyze the effect of these parameters on the forming pressure. The analysis of the results showed that the optimal combination for low forming pressure is a harder polyurethane, a larger polyurethane hole diameter, and a thinner polyurethane plate. Finally, a confirmation test was performed with the optimal combination of parameters, and it was shown that the Taguchi method is suitable for this optimization problem.
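
As a hedged illustration of the analysis step, the sketch below lists the standard L9(3^4) orthogonal array and computes the smaller-is-better signal-to-noise ratio (appropriate here, since low forming pressure is desired) together with factor-level main effects; the measured responses themselves are not reproduced and would be supplied by the experimenter:

```python
import numpy as np

# Standard Taguchi L9(3^4) orthogonal array: 9 runs, 4 factors at 3 levels
# (columns: e.g. hardness, thickness, pressure path, hole diameter)
L9 = np.array([
    [1, 1, 1, 1],
    [1, 2, 2, 2],
    [1, 3, 3, 3],
    [2, 1, 2, 3],
    [2, 2, 3, 1],
    [2, 3, 1, 2],
    [3, 1, 3, 2],
    [3, 2, 1, 3],
    [3, 3, 2, 1],
])

def sn_smaller_is_better(responses):
    """Smaller-is-better S/N ratio per run; responses is (9,) or (9, n_repeats)."""
    y = np.asarray(responses, dtype=float)
    if y.ndim == 1:
        y = y[:, None]
    return -10.0 * np.log10((y ** 2).mean(axis=1))

def main_effects(sn):
    """Mean S/N ratio at each level of each factor; the best level maximizes S/N."""
    return {f: [sn[L9[:, f] == lvl].mean() for lvl in (1, 2, 3)]
            for f in range(L9.shape[1])}
```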

Applying Wavelet Entropy Principle in Fault Classification

The ability to detect and classify the type of fault plays a great role in the protection of power systems. This procedure is required to be precise, with minimal time consumption. In this paper, detection of the fault type has been implemented using wavelet analysis together with the wavelet entropy principle. The power system is simulated using PSCAD/EMTDC. Different types of faults were studied, producing various current waveforms. These current waveforms were decomposed by wavelet analysis into approximations and details at different levels. The wavelet entropy of these decompositions is analyzed, leading to a successful methodology for fault classification. The suggested approach is tested on different fault types and proves successful in identifying the type of fault.
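
The mother wavelet and decomposition depth are not fixed in the abstract; assuming PyWavelets, a db4 wavelet, and five levels, the sketch below shows one common definition of wavelet entropy (Shannon entropy of the relative energy per decomposition band), which would be computed for each fault-current waveform and fed to the classifier:

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_entropy(signal, wavelet="db4", level=5):
    """Shannon entropy of the relative wavelet energy across decomposition bands."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)   # [cA_n, cD_n, ..., cD_1]
    energies = np.array([np.sum(c ** 2) for c in coeffs]) # energy per band
    p = energies / energies.sum()                         # relative energy
    p = p[p > 0]
    return -np.sum(p * np.log(p))
```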

An Efficient Protocol for Cyclic Somatic Embryogenesis in Neem (Azadirachta indica A. Juss.)

Neem is a highly heterozygous and commercially important perennial plant. Conventionally, it is propagated by seeds, which lose viability within two weeks. The strictly cross-pollinating nature of the plant is a serious barrier to genetic improvement by conventional methods. Alternative methods of tree improvement, such as somatic hybridization, mutagenesis, and genetic transformation, require an efficient in vitro plant regeneration system. In this regard, somatic embryogenesis, particularly secondary somatic embryogenesis, may offer an effective system for large-scale plant propagation without affecting the clonal fidelity of the regenerants. It can be used for synthetic seed production, which further supports the conservation of this tree species, which is otherwise very difficult. The present report describes the culture conditions necessary to induce and maintain repetitive somatic embryogenesis, for the first time, in neem. Of the various treatments tested, somatic embryos were induced directly from immature zygotic embryos of neem on MS + TDZ (0.1 μM) + ABA (4 μM) in more than 76% of cultures. Direct secondary somatic embryogenesis occurred from primary somatic embryos on MS + IAA (5 μM) + GA3 (5 μM) in 12.5% of cultures. The embryogenic competence of the explant as well as of the primary embryos was maintained for a long period by repeated subcultures at frequent intervals. A maximum of 10% of these somatic embryos were converted into plantlets.

Development of Content Management System with Animated Graph

An animated graph makes a good impression when presenting information. However, not many people are able to produce one, because the process of generating an animated graph requires some technical skill. This work presents a Content Management System with Animated Graph (CMS-AG). It is a web-based system enabling users to produce an effective and interactive graphical report in a short time. It allows for three levels of user authentication and provides profile updating, account management, template management, graph management, and change tracking. The system development applies an incremental development approach, object-oriented concepts, and Web programming technologies. The design architecture promotes a new approach to reporting. It also helps users cut unnecessary expenses, save time, and learn new skills at the different user levels. In this paper, the developed system is described.
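
As a hedged illustration of the kind of hand-written scripting CMS-AG is intended to spare its users, the following minimal matplotlib sketch animates a bar chart of hypothetical report values and saves it as a GIF (assuming the pillow writer is installed); the CMS would produce comparable animated graphs through its web interface instead:

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

# Hypothetical monthly report values to animate
values = np.array([12, 18, 9, 22, 30, 25, 27, 33, 29, 35, 40, 38])

fig, ax = plt.subplots()
bars = ax.bar(range(len(values)), np.zeros(len(values)))
ax.set_ylim(0, values.max() * 1.1)
ax.set_xlabel("Month")
ax.set_ylabel("Value")

def update(frame):
    # Grow each bar toward its final height over the animation frames
    for bar, v in zip(bars, values):
        bar.set_height(v * (frame + 1) / 30)
    return bars

anim = FuncAnimation(fig, update, frames=30, interval=50, blit=False)
anim.save("report.gif", writer="pillow")  # the GIF would be embedded in the report page
```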

Level of Service Based Methodology for Municipal Infrastructure Management

The development of levels of service in a municipal context is a flexible vehicle to assist in performing quality-cost trade-off analysis for municipal services. This trade-off depends on the willingness of a community to pay as well as on the condition of the assets. The community's perspective on the performance of an asset, from a service point of view, may be quite different from the municipality's perspective on the performance of the same asset, from a condition point of view. This paper presents a three-phase level-of-service-based methodology for water mains that consists of: 1) development of an analytic hierarchy model of level of service, 2) development of a fuzzy weighted sum model of the water main condition index, and 3) derivation of a fuzzy-logic-based function that maps level of service to the asset condition index. This mapping will assist asset managers in quantifying the condition improvement required to meet service goals and in making more informed decisions on interventions and related priorities.
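
The second phase uses a fuzzy weighted sum; as a simplified (crisp) sketch of the two building blocks, the code below derives AHP priority weights from a hypothetical pairwise-comparison matrix via the principal eigenvector and aggregates per-factor condition scores with a plain weighted sum:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix (principal eigenvector)."""
    pairwise = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    return principal / principal.sum()

def weighted_condition_index(scores, weights):
    """Simple (crisp) weighted-sum condition index from per-factor scores in [0, 1]."""
    return float(np.dot(weights, np.asarray(scores, dtype=float)))

# Hypothetical 3x3 comparison of condition factors (e.g. breaks, age, soil aggressiveness)
pairwise = [[1,   3,   5],
            [1/3, 1,   2],
            [1/5, 1/2, 1]]
w = ahp_weights(pairwise)
print(weighted_condition_index([0.7, 0.5, 0.9], w))
```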

Optical Characterization of a Microwave Plasma Torch for Hydrogen Production

Hydrogen sulfide (H2S) is a very toxic gas that is produced in very large quantities in the oil and gas industry. It cannot be flared to the atmosphere, so Claus-process-based gas plants are used to recover the sulfur and convert the hydrogen to water. In this paper, we present the optical characterization of an atmospheric-pressure microwave plasma torch for H2S dissociation into hydrogen and sulfur. The torch is operated at 2.45 GHz with power up to 2 kW. Three different gases can be injected simultaneously into the plasma torch. Visual imaging and optical emission spectroscopy are used to characterize the plasma for varying gas flow rates and microwave powers. The plasma length, emission spectra, and temperature are presented. The experimental results obtained validate our earlier published simulation results for the plasma torch.
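
The abstract does not state how the plasma temperature is extracted from the emission spectra; one standard OES approach is a Boltzmann plot over several lines of the same species, sketched below with the line data (statistical weights, transition probabilities, upper-level energies) assumed to come from a spectral database:

```python
import numpy as np

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def boltzmann_plot_temperature(intensity, wavelength_nm, g_upper, a_ki, e_upper_ev):
    """Excitation temperature from relative line intensities via a Boltzmann plot.

    For each line: ln(I * lambda / (g_k * A_ki)) = C - E_k / (k_B * T),
    so a linear fit against the upper-level energy E_k gives T from the slope.
    """
    y = np.log(np.asarray(intensity) * np.asarray(wavelength_nm)
               / (np.asarray(g_upper) * np.asarray(a_ki)))
    slope, _ = np.polyfit(np.asarray(e_upper_ev), y, 1)
    return -1.0 / (K_B_EV * slope)  # temperature in K
```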

A Type-2 Fuzzy Adaptive Controller for a Class of Nonlinear Systems

In this paper, we propose a robust adaptive fuzzy controller for a class of nonlinear systems with unknown dynamics. The method is based on a type-2 fuzzy logic system used to approximate the unknown nonlinear functions. The design of the on-line adaptation scheme of the proposed controller is based on the Lyapunov technique. Simulation results are given to illustrate the effectiveness of the proposed approach.
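
The rule structure and type-reduction method are not specified in the abstract; the sketch below assumes a single-input interval type-2 approximator with Gaussian sets of uncertain width and the closed-form Nie-Tan type reduction, with the consequent parameters theta left to be updated by the Lyapunov-based adaptive law:

```python
import numpy as np

def it2_gaussian(x, centers, sigma_lower, sigma_upper):
    """Lower and upper memberships of interval type-2 Gaussian sets (uncertain width)."""
    mu_l = np.exp(-0.5 * ((x - centers) / sigma_lower) ** 2)
    mu_u = np.exp(-0.5 * ((x - centers) / sigma_upper) ** 2)
    return mu_l, mu_u

def it2_fls_output(x, centers, sigma_l, sigma_u, theta):
    """Interval type-2 fuzzy approximator output y = theta^T * phi(x).

    theta : adjustable consequent parameters (adapted on-line by the adaptive law)
    """
    lower, upper = it2_gaussian(x, np.asarray(centers),
                                np.asarray(sigma_l), np.asarray(sigma_u))
    firing = 0.5 * (lower + upper)   # Nie-Tan closed-form type reduction
    phi = firing / firing.sum()      # normalized fuzzy basis functions
    return float(np.dot(theta, phi))
```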

Clustering Unstructured Text Documents Using Fading Function

Clustering unstructured text documents is an important issue in the data mining community and has a number of applications, such as document archive filtering, document organization, topic detection, and subject tracing. In the real world, some of the already clustered documents may no longer be of importance, while new documents of greater significance may appear. Most of the work done so far in clustering unstructured text documents overlooks this aspect. This paper addresses the issue by using a fading function. The unstructured text documents are clustered, and for each cluster a statistics structure called a Cluster Profile (CP) is maintained. The cluster profile incorporates the fading function, which keeps account of the time-dependent importance of the cluster. The work proposes a novel algorithm, the Clustering n-ary Merge Algorithm (CnMA), for unstructured text documents, which uses Cluster Profiles and the fading function. Experimental results illustrating the effectiveness of the proposed technique are also included.
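
The exact form of the fading function and Cluster Profile is not given in the abstract; a toy sketch under the common assumption of an exponential decay of the form 2^(-λ·Δt), tracking only a decayed document count, is:

```python
def fading(delta_t, decay=0.01):
    """Exponential fading function: weight of an observation delta_t time units old."""
    return 2.0 ** (-decay * delta_t)

class ClusterProfile:
    """Toy cluster profile whose importance decays unless refreshed by new documents.

    A real profile would also carry text statistics (e.g. term-frequency sums),
    omitted here.
    """

    def __init__(self, t0):
        self.weight = 1.0
        self.last_update = t0

    def add_document(self, t):
        # Decay the accumulated weight for the elapsed time, then count the new document
        self.weight = self.weight * fading(t - self.last_update) + 1.0
        self.last_update = t

    def importance(self, t):
        return self.weight * fading(t - self.last_update)
```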

Estimation of the Bit Side Force by Using Artificial Neural Network

Horizontal wells are proven to be better producers because they can be extended for a long distance in the pay zone. Engineers have the technical means to forecast well productivity for a given horizontal length. However, experience has shown that the actual production rate is often significantly less than forecasted. It is a difficult task, if not impossible, to identify the real reason why a horizontal well is not producing what was forecasted. Often the source of the problem lies in the drilling of the horizontal section, such as permeability reduction in the pay zone due to mud invasion, or snaky well patterns created during drilling. Although drillers aim to drill a constant-inclination hole in the pay zone, the more frequent outcome is a sinusoidal wellbore trajectory. The two factors which play an important role in wellbore tortuosity are the inclination and the side force at the bit. A constant-inclination horizontal well can only be drilled if the bit face is maintained perpendicular to the longitudinal axis of the bottom hole assembly (BHA) while keeping the side force at the bit nil. This approach assumes that there is no formation force at the bit. Hence, an appropriate BHA can be designed if the bit side force and bit tilt are determined accurately. The Artificial Neural Network (ANN) is superior to existing analytical techniques. In this study, neural networks have been employed as a general approximation tool for estimating the bit side forces. A number of samples are analyzed with the ANN for the bit side force, and the results are compared with exact analysis. A back-propagation neural network (BPN) is used to approximate the bit side forces. The resulting low relative error on the test data indicates the usability of the BPN in this area.
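
The input features and network topology are not listed in the abstract; the sketch below assumes scikit-learn's MLPRegressor (a feed-forward network trained by back-propagation), hypothetical file names, and hypothetical BHA/operating features, with the relative test error computed against the exact analytical values:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

# X: hypothetical BHA/operating features (e.g. WOB, stabilizer spacing, hole inclination)
# y: bit side force from the exact analytical model; both loaded from the user's own data
X, y = np.load("features.npy"), np.load("side_force.npy")

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(20, 10), max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)

# Relative error of the network against the exact analysis on the held-out samples
rel_err = np.abs(model.predict(X_test) - y_test) / np.maximum(np.abs(y_test), 1e-9)
print("mean relative error:", rel_err.mean())
```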