Assessing the Impact of Contour Strips of Perennial Grass with Bio-fuel Potentials on Aquatic Environment

The use of contour strips of perennial vegetation with bio-fuel potential can improve surface water quality by reducing NO3-N and sediment outflow from cropland to surface water bodies. It also offers the economic benefit of producing ethanol. In this study, the Soil and Water Assessment Tool (SWAT) model was applied to a watershed in Iowa, USA to examine the effectiveness of contour strips of switchgrass in reducing NO3-N outflows from crop fields to rivers and lakes. Numerical experiments were conducted to identify subbasins in the watershed with high water quality impact, and to examine the effects of strip size on NO3-N reduction under various meteorological conditions, i.e., dry, average and wet years. Useful information was obtained for evaluating the economic feasibility of growing switchgrass for bio-fuel in contour strips. The results can assist in cost-benefit analysis and decision-making in best management practices for environmental protection.

A Comparative Study of Turbulence Models Performance for Turbulent Flow in a Planar Asymmetric Diffuser

This paper presents a computational study of the separated flow in a planar asymmetric diffuser. The steady RANS equations for turbulent incompressible fluid flow and six turbulence closures are used in the present study. The commercial code FLUENT 6.3.26 was used for solving the set of governing equations with the various turbulence models. Five of the turbulence models are available directly in the code, while the v2-f turbulence model was implemented via User Defined Scalars (UDS) and User Defined Functions (UDF). A series of computational analyses was performed to assess the performance of the turbulence models at different grid densities. The results show that the standard k-ω, SST k-ω and v2-f models clearly performed better than the other models when an adverse pressure gradient was present. The RSM model showed acceptable agreement with the velocity and turbulent kinetic energy profiles but failed to predict the locations of the separation and reattachment points. The standard k-ε and the low-Re k-ε models delivered very poor results.

Prediction of Soil Exchangeable Sodium Ratio Based on Soil Sodium Adsorption Ratio

Researchers have long had trouble measuring the Exchangeable Sodium Ratio (ESR) in salt-affected soils. This parameter is often determined using laborious and time-consuming laboratory tests, but it may be more appropriate and economical to develop a method that uses a simpler soil salinity index. The aim of this study was to determine the relationship between the exchangeable sodium ratio (ESR) and the sodium adsorption ratio (SAR) in some salt-affected soils of the Khuzestan plain. For this purpose, two experimental areas (S1, S2) of Khuzestan Province, Iran were selected and four treatments with three replications were applied using series of double rings. The treatments consisted of 25 cm, 50 cm, 75 cm and 100 cm water applications. The statistical results of the study indicated that, to predict soil ESR from soil SAR, the linear regression models ESR = 0.2048 + 0.0066 SAR (R2 = 0.53) and ESR = 0.0564 + 0.0171 SAR (R2 = 0.76) can be recommended for pilots S1 and S2, respectively.
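
The reported regressions can be applied directly. The following minimal Python sketch (the function name and example SAR values are illustrative, not from the study) estimates ESR from measured SAR for each pilot area:

import numpy as np

# Linear models reported in the abstract: ESR = a + b * SAR
PILOT_MODELS = {
    "S1": (0.2048, 0.0066),  # R^2 = 0.53
    "S2": (0.0564, 0.0171),  # R^2 = 0.76
}

def predict_esr(sar_values, pilot="S1"):
    """Predict ESR from SAR using the pilot-specific linear regression."""
    a, b = PILOT_MODELS[pilot]
    return a + b * np.asarray(sar_values, dtype=float)

# Illustrative SAR measurements (hypothetical values)
sar = [5.0, 12.0, 30.0, 55.0]
print("S1:", predict_esr(sar, "S1"))
print("S2:", predict_esr(sar, "S2"))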

A Data Warehouse System to Help Assist Breast Cancer Screening in Diagnosis, Education and Research

Early detection of breast cancer is considered a major public health issue. Breast cancer screening is not generalized to the entire population due to a lack of resources, staff and appropriate tools. Systematic screening can result in a volume of data which cannot be managed by present computer architectures, either in terms of storage capabilities or in terms of exploitation tools. We propose in this paper to design and develop a data warehouse system in radiology-senology (DWRS). The aim of such a system is, on the one hand, to support this important volume of information coming from multiple sources of data and images and, on the other hand, to help assist breast cancer screening in diagnosis, education and research.

Emergency Response Plan Establishment and Computerization through the Analysis of the Disasters Occurring on Long-Span Bridges by Type

In this paper, a strategy for long-span bridge disaster response was developed, divided into risk analysis, business impact analysis, and an emergency response plan. At the risk analysis stage, the critical risk was estimated; the critical risk was "car accident." The critical process for each critical-risk class was assessed at the business impact analysis stage. The critical process was the task related to road conditions and traffic safety. Based on the results of the preceding analyses, an emergency response plan was established. By making the order of the standard operating procedures clear, an effective plan for dealing with disasters was formulated. Finally, prototype software was developed based on the research findings. This study laid the foundation of an information-technology-based disaster response guideline and is significant in that it computerized the disaster response plan to improve the plan's accessibility.

Bandwidth Optimization through Dynamic Routing in ATM Networks: Genetic Algorithm and Tabu Search Approach

Asynchronous Transfer Mode (ATM) is widely used in telecommunications systems to send data, video and voice at very high speed. In ATM networks, optimizing the bandwidth through dynamic routing is an important consideration. Previous research shows that traditional optimization heuristics result in suboptimal solutions. In this paper we explore non-traditional optimization techniques and compare two such algorithms, Genetic Algorithm (GA) and Tabu Search (TS), for solving the dynamic routing problem in ATM networks, which in turn optimizes the bandwidth. The optimized bandwidth could make some attractive business applications feasible, such as high-speed LAN interconnection and teleconferencing. We have also performed a comparative study of the selection mechanisms in GA, identified the best selection mechanism, and proposed a new initialization technique which improves the efficiency of the GA.
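
As a hedged illustration of the GA side of such a comparison, the sketch below evolves an assignment of traffic demands to candidate routes that minimizes a hypothetical bandwidth-cost function; the cost table, GA operators and parameters are assumptions for illustration, not the authors' actual formulation:

import random

# Hypothetical example: each of 4 traffic demands can be routed over one of 3
# candidate paths; COST[d][p] is an assumed bandwidth cost for that choice.
COST = [
    [4, 7, 5],
    [6, 3, 8],
    [5, 5, 2],
    [9, 4, 6],
]
N_DEMANDS, N_PATHS = len(COST), len(COST[0])

def fitness(chrom):
    # Lower total bandwidth cost -> higher fitness.
    return -sum(COST[d][p] for d, p in enumerate(chrom))

def tournament(pop, k=3):
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    cut = random.randint(1, N_DEMANDS - 1)
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.1):
    return [random.randrange(N_PATHS) if random.random() < rate else g for g in chrom]

def run_ga(pop_size=30, generations=50):
    pop = [[random.randrange(N_PATHS) for _ in range(N_DEMANDS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop = [mutate(crossover(tournament(pop), tournament(pop))) for _ in range(pop_size)]
    return max(pop, key=fitness)

best = run_ga()
print("best routing:", best, "total cost:", -fitness(best))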

Phosphine Mortality Estimation for Simulation of Controlling Pest of Stored Grain: Lesser Grain Borer (Rhyzopertha dominica)

There is a world-wide need for the development of sustainable management strategies to control pest infestation and the development of phosphine (PH3) resistance in the lesser grain borer (Rhyzopertha dominica). Computer simulation models can provide a relatively fast, safe and inexpensive way to weigh the merits of various management options. However, the usefulness of simulation models relies on the accurate estimation of important model parameters, such as mortality. Concentration and time of exposure are both important in determining mortality in response to a toxic agent. Recent research indicated the existence of two resistance phenotypes in R. dominica in Australia, weak and strong, and revealed that the presence of resistance alleles at two loci confers strong resistance, thus motivating the construction of a two-locus model of resistance. Experimental data sets on purified pest strains, each corresponding to a single genotype of our two-locus model, were also available. Hence it became possible to explicitly include the mortalities of the different genotypes in the model. In this paper we describe how we used two generalized linear models (GLM), the probit and logistic models, to fit the available experimental data sets. We used a direct algebraic approach, the generalized inverse matrix technique, rather than the traditional maximum likelihood estimation, to estimate the model parameters. The results show that both the probit and logistic models fit the data sets well, but the former is much better in terms of smaller least-squares (numerical) errors. Meanwhile, the generalized inverse matrix technique achieved accuracy similar to that of maximum likelihood estimation, but is less time-consuming and computationally demanding.
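
A minimal sketch of the generalized inverse approach for the probit case is given below: observed mortality proportions are transformed to probits and the linear coefficients are obtained with a Moore-Penrose pseudoinverse instead of maximum likelihood. The dose variable, data values and link details here are illustrative assumptions, not the paper's data:

import numpy as np
from scipy.stats import norm

# Hypothetical dose-mortality data: exposure metric x (e.g. log concentration-time)
# and observed mortality proportions p, clipped away from 0 and 1.
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
p = np.clip(np.array([0.05, 0.20, 0.55, 0.85, 0.97]), 1e-4, 1 - 1e-4)

# Probit transform of the observed proportions.
z = norm.ppf(p)

# Design matrix for the linear predictor z = a + b * x.
X = np.column_stack([np.ones_like(x), x])

# Direct algebraic (least-squares) estimate via the generalized inverse.
a, b = np.linalg.pinv(X) @ z

def predicted_mortality(x_new):
    """Predicted mortality under the fitted probit model."""
    return norm.cdf(a + b * np.asarray(x_new))

print("coefficients:", a, b)
print("predicted mortality at x=1.8:", predicted_mortality(1.8))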

Asymmetric and Kind of Bracing Effects on Steel Frames Under Earthquake Loads

Because of architectural conditions and structural use, the center of mass and the center of stiffness sometimes do not coincide, and the structure is irregular. The structure may also be asymmetric because of asymmetric bracing in plan, which leads to an unbalanced distribution of stiffness, or because of an unbalanced distribution of mass. Both conditions lead to eccentricity and torsion in the structure. The inability of ordinary codes to evaluate the performance of steel structures against earthquakes has led to design based on performance levels or the capacity spectrum. Using these methods, it is possible to design a structure whose behavior under different earthquakes is predictable. In this article, 5-story buildings with different percentages of asymmetry, arising from stiffness changes and the kind of bracing (X and chevron bracing), have been designed. Static and dynamic nonlinear analyses under three acceleration records have been performed. Finally, the performance level of the structures has been evaluated.

Behavior of RC Buildings to Tsunami Action

The present report describes the characteristics of the damage to, and the behavior of, reinforced concrete buildings under tsunami action. The discussion is based on a field damage survey of selected cities located on the coast of the zone affected by the Great East Japan Earthquake of March 11, 2011. This earthquake is the most powerful known earthquake to have hit Japan, with a magnitude of 9.0 and an epicenter located 129 km off the coast of Sendai city. The earthquake triggered a destructive tsunami with run-up heights of up to 40 meters that mainly affected cities located on the Pacific Ocean coast of the Tohoku region (the north-east region of Japan). Reinforced concrete buildings in general resisted the tsunami without collapse; however, non-structural elements such as panels and ceilings were severely damaged. The analysis of the damage has made it possible to understand the behavior of RC buildings under tsunami attack and to establish recommendations for their use as tsunami refuges in places where the natural topography makes it impossible to reach hilltops or other safer places.

Romanian Single-parent Families: Quality of Life

The increasing divorce rate, rising fertility outside marriage and changing values over the last decades have led to a high prevalence of single-parent families. Currently, worldwide, single-parent families represent about a quarter of all families. Recent changes in the structure of single-parent families, as well as the multitude of factors that influence the quality of life of these families, require the development of new research tools in order to provide foundations for social policies addressed to this type of family. The purpose of this paper is to present an analysis of the quality of life of single-parent families in Romania, based on data collected through a research methodology developed by the authors within a scientific research project funded by a national grant called Partnerships in priority areas.

Automatic Musical Genre Classification Using Divergence and Average Information Measures

Recently, much research has been conducted to identify pertinent parameters and adequate models for automatic music genre classification. In this paper, two measures based on information theory concepts are investigated for mapping the feature space to the decision space. A Gaussian Mixture Model (GMM) is used as a baseline and reference system. Various strategies are proposed for the training and testing sessions, with matched or mismatched conditions: long training and long testing, and long training and short testing. For all experiments, the file sections used for testing were never used during training. With matched conditions, all examined measures yield the best and similar scores (almost 100%). With mismatched conditions, the proposed measures yield better scores than the GMM baseline system, especially for the short testing case. It is also observed that the average discrimination information measure is most appropriate for music category classification, whereas the divergence measure is more suitable for music subcategory classification.
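
As an illustration of a divergence-style decision measure (the exact measures used in the paper are not defined here), the sketch below scores a test feature set against per-genre Gaussian models with a symmetric Kullback-Leibler divergence and picks the closest genre; the feature data and genre models are placeholders:

import numpy as np

def gaussian_fit(features):
    """Fit a single full-covariance Gaussian to a (frames x dims) feature matrix."""
    mu = features.mean(axis=0)
    cov = np.cov(features, rowvar=False) + 1e-6 * np.eye(features.shape[1])
    return mu, cov

def kl_gauss(m0, c0, m1, c1):
    """KL divergence between multivariate Gaussians N(m0,c0) || N(m1,c1)."""
    d = len(m0)
    c1_inv = np.linalg.inv(c1)
    diff = m1 - m0
    return 0.5 * (np.trace(c1_inv @ c0) + diff @ c1_inv @ diff - d
                  + np.log(np.linalg.det(c1) / np.linalg.det(c0)))

def symmetric_divergence(model_a, model_b):
    return kl_gauss(*model_a, *model_b) + kl_gauss(*model_b, *model_a)

# Placeholder data: per-genre training features and one test excerpt (random here).
rng = np.random.default_rng(0)
genre_models = {g: gaussian_fit(rng.normal(loc=i, size=(200, 12)))
                for i, g in enumerate(["classical", "jazz", "rock"])}
test_model = gaussian_fit(rng.normal(loc=1, size=(80, 12)))

scores = {g: symmetric_divergence(test_model, m) for g, m in genre_models.items()}
print("predicted genre:", min(scores, key=scores.get))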

Tritium Determination in Danube River Water in Serbia by Liquid Scintillation Counter

The tritium activity concentration in Danube river water in Serbia has been determined using a Quantulus 1220 liquid scintillation counter. During December 2010, water samples were taken along the entire course of the Danube through Serbia, from the Hungarian-Serbian border to the Romanian-Serbian border. This investigation is particularly important because of the proximity of the Paks nuclear reactor in Hungary. Sample preparation was performed by a standard test method using Optiphase HiSafe 3 scintillation cocktail. We used a rapid method for the preparation of environmental samples, without electrolytic enrichment.

Surface Charge Based Rapid Method for Detection of Microbial Contamination in Drinking Water and Food Products

Microbial contamination in drinking water and the food industry, most of which is of fecal origin, is a serious threat to humans. Escherichia coli is one of the most common and prevalent such contaminants. We have developed a sensor for rapid and early detection of contaminants, taking E. coli as a threat indicator organism. The sensor is based on the co-polymerization of aniline and formaldehyde in the form of a thin film over a glass surface using the vacuum deposition technique. Doping the thin film with particular combinations of Fe-Al and Fe-Cu in different concentrations changes its non-conducting properties to those of a p-type semiconductor. This property is exploited to detect different contaminants, which are believed to carry different surface charges. It was found through experiments that different microbes at the same OD (0.600 at 600 nm) have different conductivities in solution. The doping concentration was also found to be specific in attracting microbes on the basis of surface charge. This is a simple, cost-effective and quick detection method which not only decreases the measurement time but also gives early warnings for highly contaminated samples.

Optimizing Dialogue Strategy Learning Using Learning Automata

Modeling the behavior of dialogue management in the design of a spoken dialogue system using statistical methodologies is currently a growing research area. This paper presents work on developing an adaptive learning approach to optimize dialogue strategy. At the core of our system is a method that formalizes dialogue management as sequential decision making under uncertainty whose underlying probabilistic structure is a Markov chain. Researchers have mostly focused on model-free algorithms for automating the design of dialogue management using machine learning techniques such as reinforcement learning. However, model-free algorithms face a dilemma in balancing exploration and exploitation. Hence we present a model-based online policy learning algorithm using interconnected learning automata for optimizing dialogue strategy. The proposed algorithm is capable of deriving an optimal policy that prescribes what action should be taken in various states of the conversation so as to maximize the expected total reward to attain the goal, and it incorporates good exploration and exploitation in its updates to improve the naturalness of human-computer interaction. We test the proposed approach using the PARADISE evaluation framework for access to the railway information system.
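
As a hedged sketch of the learning-automaton building block (the paper's exact update scheme and interconnection are not specified here), the code below implements a standard linear reward-inaction automaton that maintains a probability distribution over dialogue actions for a single state and reinforces rewarded actions; the action names and reward signal are illustrative:

import random

class LearningAutomaton:
    """Linear reward-inaction (L_R-I) automaton over a fixed action set."""
    def __init__(self, actions, learning_rate=0.1):
        self.actions = list(actions)
        self.lr = learning_rate
        self.probs = [1.0 / len(actions)] * len(actions)

    def choose(self):
        return random.choices(range(len(self.actions)), weights=self.probs)[0]

    def update(self, chosen, reward):
        # Reward-inaction: probabilities move only when the environment rewards.
        if reward <= 0:
            return
        for i in range(len(self.probs)):
            if i == chosen:
                self.probs[i] += self.lr * (1.0 - self.probs[i])
            else:
                self.probs[i] -= self.lr * self.probs[i]

# Illustrative use: one automaton per dialogue state, rewarded by a stub environment.
automaton = LearningAutomaton(["ask_destination", "confirm", "give_timetable"])
for _ in range(500):
    a = automaton.choose()
    reward = 1.0 if automaton.actions[a] == "ask_destination" else 0.0  # stub reward
    automaton.update(a, reward)
print(dict(zip(automaton.actions, [round(p, 3) for p in automaton.probs])))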

Analysis of a Spatiotemporal Phytoplankton Dynamics: Higher Order Stability and Pattern Formation

In this paper, for an understanding of phytoplankton dynamics in a marine ecosystem, a susceptible and an infected class of the phytoplankton population are considered in a spatiotemporal domain. Here, the susceptible phytoplankton grows logistically and the growth of infected phytoplankton is governed by an instantaneous Holling type-II infection response function. The dynamics are studied in terms of the local and global stability of the system, and the possibility of Hopf bifurcation is further explored, taking the half-saturation constant as the bifurcation parameter in the temporal domain. It is also observed that the reaction-diffusion system exhibits spatiotemporal chaos and pattern formation in the phytoplankton dynamics, which plays a particularly important role for the spatially extended phytoplankton system. The effect of the diffusion coefficient on the spatial system is also obtained for both the one- and two-dimensional cases. Furthermore, we explore the higher-order stability analysis of the spatial phytoplankton system for both the linear and nonlinear systems. Finally, a few numerical simulations are carried out for pattern formation.
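
A plausible form of the underlying model, written here only as an assumed sketch consistent with the stated ingredients (logistic growth of the susceptible class S, Holling type-II infection of the infected class I, and diffusion), is

\begin{aligned}
\frac{\partial S}{\partial t} &= r S\Bigl(1 - \frac{S+I}{K}\Bigr) - \frac{\lambda S I}{H + S} + D_S \nabla^2 S,\\
\frac{\partial I}{\partial t} &= \frac{\lambda S I}{H + S} - \mu I + D_I \nabla^2 I,
\end{aligned}

where r is the intrinsic growth rate, K the carrying capacity, \lambda the force of infection, H the half-saturation constant, \mu the mortality of infected phytoplankton, and D_S, D_I the diffusion coefficients; all symbols are assumptions for illustration, since the abstract does not state the equations.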

Evaluation of Tension Capacity of Pile (Case Study in Sandy Soil)

High-rise building construction is increasing along the southern coast of the Caspian Sea because of tourist attractions and the limited residential area. Given the saturated alluvial deposits, transferring the loads of tall structures to the soil through piles is inevitable. Although most of these piles are under compression forces, tension piles are used in special conditions. Few studies have been conducted because of the limited use of these piles. In this study, the tension capacity of open-ended pipe piles was tested at full scale. The lengths of the bored piles ranged from 420 to 480 cm, and all were 120 cm in diameter. The results of testing 7 piles were compared with the predictions of relations given by other researchers.

Design of Multi-disease Diagnosis Processor using Hypernetworks Technique

In this paper, we propose a disease diagnosis hardware architecture based on the hypernetworks technique. It can be used to diagnose three different diseases (SPECT Heart, Leukemia, Prostate cancer). Generally, disparate diseases require a dedicated diagnosis hardware model for each disease. By exploiting the similarities among the three disease diagnosis processors, we design a single diagnosis processor that can diagnose all three diseases. Our proposed architecture, which combines the three processors into one, reduces hardware size without a decrease in accuracy.
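
For readers unfamiliar with the hypernetwork classification scheme such hardware realizes, a minimal software sketch is given below: a library of small hyperedges (random feature subsets with a class label) votes on a sample by counting matches. The data, hyperedge order and matching rule are illustrative assumptions, not the processor's actual design:

import random

def build_hyperedges(samples, labels, order=3, copies=20):
    """Sample 'order'-sized feature subsets from training samples as labeled hyperedges."""
    edges = []
    n_features = len(samples[0])
    for x, y in zip(samples, labels):
        for _ in range(copies):
            idx = tuple(sorted(random.sample(range(n_features), order)))
            edges.append((idx, tuple(x[i] for i in idx), y))
    return edges

def classify(edges, x):
    """Each hyperedge that matches the sample casts one vote for its class."""
    votes = {}
    for idx, values, y in edges:
        if all(x[i] == v for i, v in zip(idx, values)):
            votes[y] = votes.get(y, 0) + 1
    return max(votes, key=votes.get) if votes else None

# Illustrative binary-feature data (e.g. binarized diagnostic attributes).
train_x = [[1, 0, 1, 1, 0, 1], [0, 1, 0, 0, 1, 0], [1, 1, 1, 0, 0, 1], [0, 0, 0, 1, 1, 0]]
train_y = ["disease", "normal", "disease", "normal"]
edges = build_hyperedges(train_x, train_y)
print(classify(edges, [1, 0, 1, 0, 0, 1]))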

A Bayesian Network Reliability Modeling for FlexRay Systems

The increasing importance of FlexRay systems in the automotive domain continues to inspire related research. One primary issue in this research is verifying the reliability of FlexRay systems, either from the protocol aspect or from the system design aspect. However, research rarely discusses the effect of network topology on system reliability. In this paper, we illustrate how to model the reliability of FlexRay systems with various network topologies using a well-known probabilistic reasoning technique, the Bayesian network. In this illustration, we especially investigate the effectiveness of the error containment built into the star topology and of the fault-tolerant midpoint synchronization algorithm adopted in the FlexRay communication protocol. Through a FlexRay steer-by-wire case study, the influence of different topologies on the failure probability of the FlexRay steer-by-wire system is demonstrated. The notable value of this research is to show that Bayesian network inference is a powerful and feasible method for the reliability assessment of FlexRay systems.
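
To make the modeling idea concrete, the sketch below evaluates a tiny hypothetical Bayesian network by exhaustive enumeration: component failures (ECUs and a star coupler) are root nodes, and the system-failure node is a deterministic function of them. The topology, failure probabilities and failure logic are invented for illustration and are not taken from the case study:

from itertools import product

# Hypothetical prior failure probabilities of root components.
P_FAIL = {"ECU1": 0.01, "ECU2": 0.01, "ECU3": 0.01, "StarCoupler": 0.005}

def system_fails(state):
    """Assumed failure logic for a star topology: the system fails if the
    star coupler fails or if two or more ECUs fail."""
    if state["StarCoupler"]:
        return True
    return sum(state[e] for e in ("ECU1", "ECU2", "ECU3")) >= 2

def p_system_failure():
    total = 0.0
    names = list(P_FAIL)
    for failures in product([False, True], repeat=len(names)):
        state = dict(zip(names, failures))
        weight = 1.0
        for n in names:
            weight *= P_FAIL[n] if state[n] else 1.0 - P_FAIL[n]
        if system_fails(state):
            total += weight
    return total

print("P(system failure) =", p_system_failure())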

A Case Study on Product Development Performance Measurement

In recent years, increased competition and lower profit margins have necessitated a focus on improving the performance of the product development process, an area that has traditionally been excluded from detailed steering and evaluation. A systematic improvement requires a good understanding of the current performance, which is why the interest in product development performance measurement has increased dramatically. This paper presents a case study that evaluates the product development performance measurement system used in a Swedish company that is part of a global corporate group. The study is based on internal documentation and eighteen in-depth interviews with stakeholders involved in the product development process. The results from the case study include a description of which metrics are in use, how they are employed, and their effect on the quality of the performance measurement system. In particular, the presence of a well-defined process proved to have a major impact on the quality of the performance measurement system in this case.

Spread Spectrum Code Estimation by Particle Swarm Algorithm

In the context of spectrum surveillance, a new method to recover the code of a spread spectrum signal is presented, where the receiver has no knowledge of the transmitter's spreading sequence. In our previous paper, we used a genetic algorithm (GA) to recover the spreading code. Although genetic algorithms (GAs) are well known for their robustness in solving complex optimization problems, increasing the length of the code often leads to an unacceptably slow convergence speed. To solve this problem, we introduce Particle Swarm Optimization (PSO) into code estimation in spread spectrum communication systems. In the search process for code estimation, the PSO algorithm has the merits of rapid convergence to the global optimum without being trapped in local suboptima, and good robustness to noise. In this paper we describe how to implement PSO as a component of a search algorithm for code estimation. Swarm intelligence offers a number of advantages due to the use of mobile agents, among them scalability, fault tolerance, adaptation, speed, modularity, autonomy and parallelism. These properties make swarm intelligence very attractive for spread spectrum code estimation, and they also make it suitable for a variety of other kinds of channels. Our results compare swarm-based algorithms with genetic algorithms and also show the performance of the PSO algorithm in the code estimation process.
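
A minimal sketch of how PSO could be applied to this problem is given below: particle positions are real vectors whose signs are interpreted as chips of a candidate ±1 spreading code, and the fitness is the correlation with a noisy received sequence. The signal model, code length and PSO parameters are assumptions made for illustration rather than the paper's actual setup:

import numpy as np

rng = np.random.default_rng(1)
CODE_LEN = 31
true_code = rng.choice([-1.0, 1.0], size=CODE_LEN)       # unknown spreading code
received = true_code + 0.5 * rng.normal(size=CODE_LEN)   # assumed noisy observation

def fitness(position):
    """Correlation between the candidate code (sign of position) and the received chips."""
    return np.sign(position) @ received

def pso(n_particles=40, iters=200, w=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(-1, 1, size=(n_particles, CODE_LEN))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_val.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        vals = np.array([fitness(p) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmax()].copy()
    return np.sign(gbest)

estimate = pso()
print("chips recovered:", int((estimate == true_code).sum()), "of", CODE_LEN)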