Study of the Efficacy of Green Manure Application as a Chickpea Pre-Plant Treatment

In order to study the efficacy of green manure application as a chickpea pre-plant treatment, field experiments were carried out in the 2007 and 2008 growing seasons. In this research the effects of different soil fertilization strategies on grain yield and yield components, minerals, organic compounds and cooking time of chickpea were investigated. Experimental units were arranged in split-split plots based on randomized complete blocks with three replications. Main plots consisted of (G1): establishing a mixed vegetation of Vicia pannonica and Hordeum vulgare, and (G2): control, as green manure levels. Also, five strategies for supplying the base fertilizer requirement, including (N1): 20 t ha-1 farmyard manure; (N2): 10 t ha-1 compost; (N3): 75 kg ha-1 triple superphosphate; (N4): 10 t ha-1 farmyard manure + 5 t ha-1 compost; and (N5): 10 t ha-1 farmyard manure + 5 t ha-1 compost + 50 kg ha-1 triple superphosphate, were considered in the sub plots. Furthermore, four levels of biofertilizers, consisting of (B1): Bacillus lentus + Pseudomonas putida; (B2): Trichoderma harzianum; (B3): Bacillus lentus + Pseudomonas putida + Trichoderma harzianum; and (B4): control (without biofertilizers), were arranged in the sub-sub plots. Results showed that integrating biofertilizers (B3) and green manure (G1) produced the highest grain yield, and the highest yields were obtained with the G1×N5 interaction. Comparison of all two-way and three-way interactions showed that G1N5B3 was the superior treatment. Significant increases in N, P2O5, K2O, Fe and Mg content in leaves and grains underscored the superiority of this treatment, because each of these nutrients has a well-established role in chlorophyll synthesis and the photosynthetic capacity of crops. The combined application of compost, farmyard manure and chemical phosphorus (N5), in addition to giving the highest yield, produced the best grain quality owing to high protein, starch and total sugar contents, low crude fiber and reduced cooking time.

Fast Document Segmentation Using Contour and X-Y Cut Technique

This paper describes a fast and efficient method for page segmentation of documents containing non-rectangular blocks. The segmentation is based on an edge-following algorithm using a small window of 16 by 32 pixels. The segmentation is very fast since only the border pixels of each paragraph are used, without scanning the whole page. Still, the segmentation may contain errors if the space between adjacent blocks is smaller than the window used in edge following. Consequently, this paper reduces this error by first identifying the missed segmentation points using the direction information from edge following, and then applying an X-Y cut at the missed segmentation points to separate the connected columns. The advantage of the proposed method is the fast identification of missed segmentation points. This methodology is faster, with less overhead, than other algorithms that need to access many more pixels of a document.
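For readers unfamiliar with the X-Y cut step, the sketch below shows the classic recursive projection-profile version in Python. The paper applies the cut only locally at missed segmentation points, whereas this illustration recurses over a whole region; the binary-image representation and the min_gap threshold are assumptions made for the example.

```python
import numpy as np

def xy_cut(page, min_gap=16, regions=None, y0=0, x0=0):
    """Recursive X-Y cut on a binary page image (1 = ink, 0 = background):
    split at the widest all-background gap in the horizontal or vertical
    projection profile until no gap of at least min_gap pixels remains."""
    if regions is None:
        regions = []
    if page.size == 0 or page.sum() == 0:            # nothing to segment
        return regions

    def largest_gap(profile):
        best_start, best_len, start = -1, 0, None
        for i, v in enumerate(list(profile) + [1]):  # sentinel closes a trailing gap
            if v == 0 and start is None:
                start = i
            elif v != 0 and start is not None:
                if i - start > best_len:
                    best_start, best_len = start, i - start
                start = None
        return best_start, best_len

    r_start, r_len = largest_gap(page.sum(axis=1))   # gaps between rows
    c_start, c_len = largest_gap(page.sum(axis=0))   # gaps between columns

    if max(r_len, c_len) < min_gap:                  # no splittable whitespace left
        regions.append((y0, x0, y0 + page.shape[0], x0 + page.shape[1]))
        return regions
    if r_len >= c_len:                               # cut along the widest row gap
        mid = r_start + r_len // 2
        xy_cut(page[:mid], min_gap, regions, y0, x0)
        xy_cut(page[mid:], min_gap, regions, y0 + mid, x0)
    else:                                            # cut along the widest column gap
        mid = c_start + c_len // 2
        xy_cut(page[:, :mid], min_gap, regions, y0, x0)
        xy_cut(page[:, mid:], min_gap, regions, y0, x0 + mid)
    return regions
```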

Sensor Network Based Emergency Response and Navigation Support Architecture

In an emergency, combining data from a Wireless Sensor Network with the knowledge gathered from various other information sources and with navigation algorithms could help safely guide people to a building exit while avoiding risky areas. This paper presents an emergency response and navigation support architecture for data gathering, knowledge manipulation, and navigational support in an emergency situation. In the normal state, the system monitors the environment. When an emergency event is detected, the system sends messages to first responders and immediately distinguishes risky areas from safe areas to establish escape paths. The main functionalities of the system include gathering data from a wireless sensor network deployed in a multi-story indoor environment, processing it with the information available in a knowledge base, and sharing the decisions made with first responders and people in the building. The proposed architecture is intended to reduce the risk of loss of human life by evacuating people faster and with less congestion in an emergency environment.
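As a concrete illustration of the navigational-support idea, the sketch below runs Dijkstra over a hypothetical floor-plan graph in which sensor-reported risk inflates edge costs. The graph, node names and risk_penalty value are illustrative assumptions, not the paper's algorithm.

```python
import heapq

def safest_exit_path(graph, risk, start, exits, risk_penalty=1000.0):
    """Dijkstra over a floor-plan graph. graph[u] = [(v, distance), ...];
    risk[u] in [0, 1] from sensor readings; risky nodes get a large extra cost
    so paths avoid them unless no alternative exists."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        if u in exits:                       # first exit popped is the cheapest one
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return list(reversed(path))
        for v, length in graph.get(u, []):
            cost = d + length + risk_penalty * risk.get(v, 0.0)
            if cost < dist.get(v, float("inf")):
                dist[v] = cost
                prev[v] = u
                heapq.heappush(pq, (cost, v))
    return None                              # no reachable exit

# Example: corridor nodes A-D, fire detected near C
graph = {"A": [("B", 5), ("C", 2)], "B": [("D", 4)], "C": [("D", 1)], "D": []}
risk = {"C": 0.9}
print(safest_exit_path(graph, risk, "A", exits={"D"}))   # ['A', 'B', 'D']
```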

Faster FPGA Routing Solution using DNA Computing

There are many classical algorithms for routing in FPGAs, but using DNA computing the routes can be found efficiently and quickly. The run-time complexity of DNA algorithms is much lower than that of the classical algorithms used for FPGA routing. Research in DNA computing is still at an early stage. The high information density of DNA molecules and the massive parallelism involved in DNA reactions make DNA computing a powerful tool. Many research accomplishments have shown that any procedure that can be programmed on a silicon computer can be realized as a DNA computing procedure. In this paper we propose a two-tier approach to the FPGA routing problem. First, the geometric FPGA detailed routing task is transformed into a Boolean satisfiability equation with the property that any assignment of input variables that satisfies the equation specifies a valid routing; the absence of a satisfying assignment implies that the layout is unroutable. In the second step, a DNA search algorithm is applied to this Boolean equation to find routing alternatives, exploiting the properties of DNA computation. The simulated results are satisfactory and indicate the applicability of DNA computing to the FPGA routing problem.
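The sketch below illustrates the first tier only, i.e. the SAT formulation: toy nets compete for candidate routing tracks, and clauses enforce "each net gets a track" and "no two nets share a track". The net and track names are invented for illustration, and an exhaustive search stands in for the DNA-based parallel search.

```python
from itertools import product

def routing_sat_clauses(nets):
    """Toy SAT encoding of detailed routing. Variable (n, t) means net n uses
    track t. Clauses: each net picks at least one of its candidate tracks, and
    two nets sharing a candidate track may not both pick it.
    Literals are (sign, net, track) triples, sign = +1 or -1."""
    clauses = []
    for n, tracks in nets.items():
        clauses.append([(+1, n, t) for t in tracks])          # at least one track
    names = list(nets)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            for t in set(nets[a]) & set(nets[b]):             # shared track -> conflict
                clauses.append([(-1, a, t), (-1, b, t)])
    return clauses

def brute_force_sat(nets, clauses):
    """Exhaustive search standing in for the DNA-based massively parallel search."""
    for choice in product(*[[(n, t) for t in ts] for n, ts in nets.items()]):
        assign = set(choice)
        def lit_true(sign, n, t):
            return ((n, t) in assign) if sign > 0 else ((n, t) not in assign)
        if all(any(lit_true(*lit) for lit in clause) for clause in clauses):
            return dict(choice)                               # a valid routing
    return None                                               # layout is unroutable

nets = {"net1": ["T1", "T2"], "net2": ["T2", "T3"], "net3": ["T1", "T3"]}
print(brute_force_sat(nets, routing_sat_clauses(nets)))
# {'net1': 'T1', 'net2': 'T2', 'net3': 'T3'}
```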

Intelligent Neural Network Based STLF

Short-Term Load Forecasting (STLF) plays an important role in the economic and secure operation of power systems. In this paper, a Continuous Genetic Algorithm (CGA) is employed to evolve the optimum structure and connection weights of large neural networks for the one-day-ahead electric load forecasting problem. This study describes the process of developing three-layer feed-forward large neural networks for load forecasting and then presents a heuristic search algorithm for an important task in this process, i.e., optimal network structure design. The proposed method is applied to STLF for a local utility. Data are clustered according to differences in their characteristics. Special days are extracted from the normal training sets and handled separately. In this way, a solution is provided for all load types, including working days, weekends and special days. We find good performance for the large neural networks, and the proposed methodology consistently gives lower percentage errors. Thus, it can be applied to automatically design an optimal load forecaster based on historical data.
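A minimal sketch of the idea follows: a real-coded (continuous) GA evolves the weights of a small three-layer feed-forward forecaster against a MAPE fitness. The network size, GA operators and synthetic data are illustrative assumptions; the paper additionally evolves the network structure itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(w, x, n_in=24, n_hid=8):
    """Three-layer feed-forward net: 24 hourly loads in, next-day peak load out."""
    i = n_in * n_hid
    W1, b1 = w[:i].reshape(n_in, n_hid), w[i:i + n_hid]
    W2, b2 = w[i + n_hid:i + 2 * n_hid].reshape(n_hid, 1), w[-1]
    return (np.tanh(x @ W1 + b1) @ W2 + b2).ravel()

def continuous_ga(X, y, pop_size=40, gens=50, n_in=24, n_hid=8):
    """Real-coded GA: elitism, blend crossover, Gaussian mutation, MAPE fitness."""
    dim = n_in * n_hid + n_hid + n_hid + 1
    pop = rng.normal(0.0, 0.5, (pop_size, dim))
    mape = lambda w: np.mean(np.abs((mlp(w, X, n_in, n_hid) - y) / y))
    for _ in range(gens):
        order = np.argsort([mape(w) for w in pop])
        elite = pop[order[: pop_size // 4]]              # keep the best quarter
        children = []
        while len(children) < pop_size - len(elite):
            a, b = elite[rng.integers(len(elite), size=2)]
            alpha = rng.uniform(size=dim)
            child = alpha * a + (1 - alpha) * b          # blend crossover
            child += rng.normal(0.0, 0.05, dim)          # Gaussian mutation
            children.append(child)
        pop = np.vstack([elite, children])
    return min(pop, key=mape)

# Toy usage: 200 synthetic "days" of 24 hourly loads and their next-day peaks
X = rng.uniform(0.5, 1.5, (200, 24))
y = X.mean(axis=1) + 0.1
w_best = continuous_ga(X, y)
```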

Performance Evaluation of the OCDM/WDM Technique for Optical Packet Switches

The performance of the Optical Code Division Multiplexing/Wavelength Division Multiplexing (OCDM/WDM) technique for optical packet switches is investigated. The impact on performance of the impairments due to both Multiple Access Interference and beat noise is studied. The Packet Loss Probability due to output packet contentions is evaluated as a function of the main switch and traffic parameters when coherent optical Gold codes are adopted. The Packet Loss Probability of the OCDM/WDM switch can reach 10^-9 when M = 16 wavelengths, a Gold code of length L = 511, and only 24 wavelength converters are used in the switch.
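The sketch below is a contention-only toy Monte-Carlo model, included just to make the role of the shared wavelength-converter pool concrete. It ignores the code dimension, MAI and beat noise, assumes all arrivals target a single output fibre, and uses illustrative parameter values; it is not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

def contention_loss(n_inputs=16, n_wavelengths=16, n_converters=24,
                    load=0.8, n_slots=100_000):
    """Toy Monte-Carlo estimate of packet loss from output contention only:
    packets arriving on the same output fibre and wavelength are served one per
    slot; surplus packets use the shared converter pool while idle wavelengths
    remain on the fibre, otherwise they are dropped."""
    lost = total = 0
    for _ in range(n_slots):
        # each input carries a packet with probability `load`, random wavelength
        arrivals = rng.random(n_inputs) < load
        wl = rng.integers(n_wavelengths, size=n_inputs)[arrivals]
        total += wl.size
        counts = np.bincount(wl, minlength=n_wavelengths)
        excess = (counts - 1).clip(min=0).sum()          # packets needing conversion
        free = (counts == 0).sum()                       # idle wavelengths on the fibre
        convertible = min(excess, n_converters, free)
        lost += excess - convertible
    return lost / max(total, 1)

print(contention_loss())
```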

Eco-Roof Systems in Subtropical Climates for Sustainable Development and Mitigation of Climate Change

The benefits of eco-roofs are quite well known; however, very little research has been conducted on the implementation of eco-roofs in subtropical climates such as Australia's. There are many challenges facing Australia as it moves into the future, and climate change is proving to be one of the leading ones. In order to move forward with the mitigation of climate change, the impacts of rapid urbanization need to be offset, and eco-roofs are one way to achieve this. This study presents the energy savings and environmental benefits of implementing eco-roofs in subtropical climates. An experimental set-up was installed at the Rockhampton campus of Central Queensland University, where two shipping containers were converted into small offices, one with an eco-roof and one without. These were used for temperature, humidity and energy consumption data collection. In addition, a computational model was developed using the Design Builder software (state-of-the-art building energy simulation software) to simulate the energy consumption and environmental parameters of the shipping containers, allowing comparison between simulated and real-world data. This study found that eco-roofs are very effective in subtropical climates and provide energy savings of about 13%, which agrees well with the simulated results.

Formation and Evaluation of Lahar/HDPE Hybrid Composite as a Structural Material for Household Biogas Digester

This study investigated the suitability of a Lahar/HDPE composite as a primary material for low-cost, small-scale biogas digesters. While sources of raw materials for biogas are abundant in the Philippines, the cost of the technology has made widespread utilization of this resource an indefinite proposition. Aside from capital economics, another problem arises with the space requirements of current digester designs. These problems may be simultaneously addressed by fabricating digesters on a smaller, household scale to reach a wider market, and by using materials that allow optimization of the overall design and fabrication cost without sacrificing operational efficiency. This study involved actual fabrication of the Lahar/HDPE composite at varying compositions and geometries, subsequent mechanical and thermal characterization, and statistical analysis to find intrinsic relationships between variables. From the results, the Lahar/HDPE composite was found to be feasible for use as a digester material from both mechanical and economic standpoints.

Feature Preserving Nonlinear Diffusion for Ultrasonic Image Denoising and Edge Enhancement

By utilizing the echoic intensity and distribution from different organs and local details of the human body, ultrasonic images can capture important pathological changes, which unfortunately may be obscured by ultrasonic speckle noise. A feature-preserving ultrasonic image denoising and edge enhancement scheme is put forth, which includes two terms, anisotropic diffusion and edge enhancement, controlled by the optimum smoothing time. In this scheme, the anisotropic diffusion is governed by the local coordinate transformation and the first- and second-order normal derivatives of the image, while the edge enhancement is performed by a hyperbolic tangent function. Experiments on real ultrasonic images indicate that our scheme effectively preserves edges, local details and ultrasonic echoic bright strips during denoising.
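The sketch below uses the classic Perona-Malik formulation as a stand-in for the paper's local-coordinate diffusion, plus a tanh-based contrast stretch echoing the edge-enhancement term. The edge-stopping function, kappa, gain, boundary handling and iteration count are illustrative assumptions.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=30, kappa=0.05, dt=0.2):
    """Perona-Malik-style anisotropic diffusion: smooth within homogeneous
    regions while the edge-stopping function g suppresses diffusion across
    strong gradients (periodic boundaries via np.roll, for brevity)."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)              # edge-stopping function
    for _ in range(n_iter):
        dN = np.roll(u, -1, axis=0) - u                  # differences to the
        dS = np.roll(u,  1, axis=0) - u                  # four neighbours
        dE = np.roll(u, -1, axis=1) - u
        dW = np.roll(u,  1, axis=1) - u
        u += dt * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)
    return u

def tanh_edge_enhance(img, gain=3.0):
    """Hyperbolic-tangent contrast stretch around the mean, echoing the paper's
    use of tanh for edge enhancement (gain is an illustrative parameter)."""
    centred = img - img.mean()
    return np.tanh(gain * centred / (np.abs(centred).max() + 1e-9))
```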

Grid Computing in Physics and Life Sciences

Certain sciences, such as physics, chemistry and biology, have a strong computational aspect and use computing infrastructures to advance their scientific goals. Often, high-performance and/or high-throughput computing infrastructures such as clusters and computational Grids are applied to satisfy computational needs. In addition, these sciences are sometimes characterised by scientific collaborations requiring resource sharing, which is typically provided by Grid approaches. In this article, I discuss Grid computing approaches in High Energy Physics as well as in bioinformatics and highlight some of my experience in both scientific domains.

Traffic Signal Design and Simulation for Vulnerable Road Users Safety and Bus Preemption

Most pedestrian-car accidents at signalized intersections occur because pedestrians cannot cross the intersection safely within the green phase. From the pedestrian's viewpoint, there are two main reasons. The first is that some pedestrians, such as the elderly, cannot speed up to cross the intersection in time. The other is that pedestrians do not sense that the signal phase is about to change and that their right-of-way is about to end. Developing signal logic to protect pedestrians crossing an intersection is the first purpose of this study. The second purpose is to improve the reliability and reduce the delay of public transportation service; therefore, bus preemption is also considered in the designed signal logic. In this study, traffic data from the intersection of Chong-Qing North Road and Min-Zu West Road, Taipei, Taiwan, are employed to calibrate and validate the signal logic by simulation. VISSIM 5.20, a microscopic traffic simulation package, is employed to simulate the signal logic. The simulated results show that the signal logic presented in this study can successfully protect pedestrians crossing the intersection, and the bus preemption design can reduce the average delay. However, the pedestrian safety and bus preemption signals considerably increase the average delay of cars. Thus, whether to apply the pedestrian safety and bus preemption signal logic to an isolated intersection should be evaluated carefully.
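As a toy illustration of such signal logic (not the paper's calibrated rules), the sketch below extends the pedestrian green in small steps while detectors still report pedestrians in the crosswalk, and grants a full green when a bus is detected upstream; all timing constants are assumed values.

```python
MIN_GREEN, MAX_GREEN, EXTENSION = 15, 45, 3   # seconds (illustrative values)

def next_phase_length(base_green, pedestrians_in_crosswalk, bus_approaching):
    """Return the main-phase green time for this cycle.

    - Pedestrian protection: while detectors still see pedestrians in the
      crosswalk near the end of green, extend in small steps up to MAX_GREEN.
    - Bus preemption: if a bus is detected upstream on the main approach,
      hold the main-street green at MAX_GREEN so the bus clears the stop line
      without stopping.
    """
    green = max(base_green, MIN_GREEN)
    if bus_approaching:
        return MAX_GREEN
    while pedestrians_in_crosswalk > 0 and green + EXTENSION <= MAX_GREEN:
        green += EXTENSION
        pedestrians_in_crosswalk -= 1        # assume one group clears per extension
    return green

# Example: two slow pedestrian groups still crossing, no bus approaching
print(next_phase_length(base_green=20, pedestrians_in_crosswalk=2,
                        bus_approaching=False))   # 26
```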

A Hybrid Classification Method using Artificial Neural Network Based Decision Tree for Automatic Sleep Scoring

In this paper we propose a new classification method for automatic sleep scoring using an artificial neural network based decision tree. It treats sleep scoring as a series of two-class problems and solves them with a decision tree made up of a group of neural network classifiers, each of which uses a dedicated feature set and targets only one specific sleep stage in order to maximize the classification effect. A single electroencephalogram (EEG) signal is used for our analysis rather than multiple biological signals, which greatly simplifies the data acquisition process. Experimental results demonstrate that the average epoch-by-epoch agreement between visual scoring and the proposed method in separating 30 s wakefulness+S1, REM, S2 and SWS epochs was 88.83%. This study shows that the proposed method performs well in all four stages and can effectively limit error propagation at the same time. It could, therefore, be an efficient method for automatic sleep scoring. Additionally, since it requires only a small volume of data, it could be suited to pervasive applications.
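The sketch below mimics the cascade structure only: each node makes one stage-versus-rest decision on a single-channel EEG epoch. Simple band-power thresholds stand in for the paper's trained neural network classifiers, and the thresholds, frequency bands and sampling rate are illustrative assumptions.

```python
import numpy as np

def band_power(epoch, fs, low, high):
    """Relative spectral power of a single-channel EEG epoch in [low, high) Hz."""
    spectrum = np.abs(np.fft.rfft(epoch)) ** 2
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    return spectrum[(freqs >= low) & (freqs < high)].sum() / (spectrum.sum() + 1e-12)

def cascade_score(epoch, fs=100):
    """Decision-tree of two-class decisions, one stage peeled off per node, in
    the spirit of the paper; threshold rules replace the neural classifiers."""
    delta = band_power(epoch, fs, 0.5, 4)    # slow-wave activity
    alpha = band_power(epoch, fs, 8, 13)
    beta  = band_power(epoch, fs, 13, 30)
    if delta > 0.5:                          # node 1: SWS vs rest
        return "SWS"
    if alpha + beta > 0.5:                   # node 2: wakefulness+S1 vs rest
        return "W+S1"
    if beta > 0.2:                           # node 3: REM vs rest
        return "REM"
    return "S2"                              # remaining stage

epoch = np.random.default_rng(0).standard_normal(30 * 100)   # one 30 s epoch at 100 Hz
print(cascade_score(epoch))
```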

A Fast Block-based Evolutionary Algorithm for Combinatorial Problems

Problems with high complexity have long been a challenge in combinatorial optimization. Because of their non-deterministic polynomial (NP) characteristics, these problems usually demand an unreasonable search budget, and combinatorial optimization has therefore attracted numerous researchers to develop better algorithms. Most recent academic research focuses on enhancing conventional evolutionary algorithms and incorporating local heuristics such as VNS, 2-opt and 3-opt. Although the introduction of local strategies yields significant performance gains, these improvements do not carry over to different problem types. Therefore, this research proposes a block-based evolutionary algorithm (BBEA), a meta-heuristic that can be applied to several types of problems. The performance results validate that BBEA can solve these problems even without the design of local strategies.

Entanglement-based Quantum Computing by Diagrams of States

We explore entanglement in composite quantum systems and how its peculiar properties are exploited in quantum information and communication protocols by means of Diagrams of States, a novel method to graphically represent and analyze how quantum information is processed during computations performed by quantum circuits. We present quantum diagrams of states for Bell state generation, measurements and projections, for dense coding and quantum teleportation, for probabilistic quantum machines designed to perform approximate quantum cloning and the universal NOT and, finally, for quantum privacy amplification based on entanglement purification. Diagrams of states prove to be a useful approach to analyzing quantum computations, by offering an intuitive graphic representation of the processing of quantum information. They also help in conceiving novel quantum computations, from describing the desired information processing to deriving the final implementation by quantum gate arrays.
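As a concrete reference point for the first computation discussed (Bell state generation), the short NumPy sketch below applies a Hadamard followed by a CNOT to |00>. It illustrates the underlying circuit algebra only, not the diagram-of-states notation itself.

```python
import numpy as np

# Bell-state generation circuit: Hadamard on the first qubit, then CNOT.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

ket00 = np.array([1, 0, 0, 0], dtype=float)        # |00>
bell = CNOT @ np.kron(H, I) @ ket00                # (|00> + |11>)/sqrt(2)
print(np.round(bell, 3))                           # approx [0.707 0. 0. 0.707]
```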

A Fully Parallel Reverse Converter

The residue number system (RNS) is popular in high-performance computing applications because of its carry-free nature. The challenges of RNS system design lie in the moduli set selection and in the reverse conversion from the residue representation to the weighted representation. In this paper, we propose a fully parallel reverse conversion algorithm for the moduli set {r^n - 2, r^n - 1, r^n}, based on simple mathematical relationships. An efficient hardware realization of this algorithm is also presented. The proposed converter is considerably faster and results in hardware savings compared to other reverse converters.
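For reference, the sketch below performs reverse conversion with the textbook Chinese Remainder Theorem for this moduli set (using an odd radix r so that the three moduli are pairwise coprime). It shows what the converter must compute, not the paper's optimized parallel architecture.

```python
from math import gcd, prod

def crt_reverse_convert(residues, moduli):
    """Textbook CRT reverse conversion from RNS residues to the weighted number X."""
    assert all(gcd(a, b) == 1 for i, a in enumerate(moduli) for b in moduli[i + 1:]), \
        "moduli must be pairwise coprime"
    M = prod(moduli)
    x = 0
    for r_i, m_i in zip(residues, moduli):
        M_i = M // m_i
        x += r_i * M_i * pow(M_i, -1, m_i)   # modular inverse of M_i mod m_i
    return x % M

# Example with the moduli set {r^n - 2, r^n - 1, r^n} for r = 3, n = 2: {7, 8, 9}
r, n = 3, 2
moduli = [r**n - 2, r**n - 1, r**n]
X = 123
residues = [X % m for m in moduli]
print(crt_reverse_convert(residues, moduli))   # 123
```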

On the Parameter Optimization of Fuzzy Inference Systems

Nowadays, more engineering systems are using some kind of Artificial Intelligence (AI) in their processes. Well-known AI techniques include artificial neural networks, fuzzy inference systems, and neuro-fuzzy inference systems, among others. Furthermore, many decision-making applications base their intelligent processes on fuzzy logic, due to the capability of Fuzzy Inference Systems (FIS) to deal with problems that are based on user knowledge and experience. Also, since users have widely varying characteristics and generally provide uncertain data, this information can be used and properly processed by a FIS. To properly handle uncertainty and inexact system input values, a FIS normally uses Membership Functions (MF) that represent the degree of user satisfaction with certain conditions and/or constraints. In order to define the parameters of the MFs, knowledge from experts in the field is very important; this knowledge defines the MF shape used to process the user inputs, and through fuzzy reasoning and inference mechanisms the FIS can provide an "appropriate" output. However, an important issue immediately arises: how can it be assured that the obtained output is the optimum solution? How can it be guaranteed that each MF has an optimum shape? A viable solution to these questions is MF parameter optimization. In this paper a novel parameter optimization process is presented. The process for FIS parameter optimization consists of five simple steps that can be easily realized off-line. The proposed process is demonstrated through its implementation in an intelligent interface dealing with the online customization/personalization of Internet portals applied to e-commerce.
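To make the notion of MF parameter optimization concrete, the sketch below grid-searches the peak of a triangular membership function so that it best matches expert-provided satisfaction degrees. It is a one-parameter toy standing in for one step of the off-line process; all names, scales and values are illustrative assumptions.

```python
import numpy as np

def tri_mf(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

def tune_peak(x_samples, target_mu, a=0.0, c=10.0):
    """Toy MF parameter optimization: grid-search the peak b of a triangular MF
    so it best fits expert-provided satisfaction degrees (target_mu)."""
    candidates = np.linspace(a + 0.1, c - 0.1, 99)
    errors = [np.mean((tri_mf(x_samples, a, b, c) - target_mu) ** 2)
              for b in candidates]
    return candidates[int(np.argmin(errors))]

# Example: the expert says satisfaction peaks around 7 on a 0-10 scale
x = np.linspace(0, 10, 21)
target = tri_mf(x, 0, 7, 10)
print(round(tune_peak(x, target), 2))        # ~7.0
```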

Assessment of Resistance of Wheat Genotypes (T. aestivum and T. durum) To Boron Toxicity

Research on boron (B) toxicity problems has recently received considerable attention, especially in the dry regions of the world. Development of varieties resistant to B toxicity is a high priority in these regions, where the soils have high levels of B. Thus, this study aimed to assess the resistance of wheat genotypes to B toxicity using agronomic and physiological parameters. For this aim, a pot experiment based on a completely randomized design with three replications was conducted using a calcareous Ustochrepts soil. Twenty different wheat genotypes of T. aestivum and T. durum were used in the study. Boron fertilizer at levels of 0 (-B) and 30 mg B kg-1 (+B) as H3BO3 was applied to the pots. After harvest, plant dry matter yield was recorded, and total B concentrations in the tops of the wheat plants were determined. The results revealed a large variation among wheat genotypes in their physiological and agronomic susceptibility to B toxicity.

Robust Stability in Multivariable Neural Network Control using Harmonic Analysis

Robust stability and performance are the two most basic features of feedback control systems. The harmonic balance analysis technique enables analysis of the stability of limit cycles arising from a neural-network-based control system operating over nonlinear plants. In this work a robust stability analysis based on harmonic balance is presented and applied to neural-based control of a nonlinear binary distillation column with unstructured uncertainty. We develop ways to describe uncertainty in the form of neglected nonlinear dynamics and high harmonics for the plant and the controller, respectively. Finally, conclusions about the performance of the neural control system are drawn using the Nyquist stability margin together with the structured singular values of the uncertainty as a robustness measure.
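For background, the classical single-loop harmonic balance (describing function) condition predicts a sustained oscillation (limit cycle) of amplitude A and frequency ω when

    G(jω) · N(A) = -1,   i.e.   G(jω) = -1 / N(A),

where G(jω) is the linear part of the loop and N(A) is the describing function of the nonlinearity. A limit cycle is indicated where the Nyquist plot of G(jω) intersects the -1/N(A) locus, and the distance between the two loci provides a stability margin. This standard formulation is quoted here only as context; the paper's multivariable analysis with unstructured uncertainty and structured singular values goes beyond it.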

Analysis of the Visual Preference of Patterns in Pedestrian Roads

The purpose of this study is to analyze the visual preference of patterns in pedestrian roads. In this study, animation was applied to the evaluation of dynamic streetscapes. Six patterns of pedestrian road were selected in order to analyze visual preference. The shapes are straight, s-curve and zigzag, and the ratios of building height to road width are 2:1 and 1:1. Twelve adjective pairs used in the field investigation were selected from adjectives commonly used in streetscape evaluation: interesting-boring, simple-complex, calm-noisy, open-enclosed, active-inactive, lightly-depressing, regular-irregular, unique-usual, rhythmic-not rhythmic, united-not united, stable-unstable and tidy-untidy. Dynamic streetscape should be considered important in pedestrian shopping malls and parks because it is an attraction. Therefore, the s-curve pedestrian road, which this study found to be the most visually preferred, should be designed in such areas. Also, the ratio of building height to road width along pedestrian roads should be reduced.

Study on the Production of Chromite Refractory Brick from Local Chromite Ore

Chromite is one of the principal ores of chromium, in which the metal exists as a complex oxide (FeO·Cr2O3). Prepared chromite can be widely used as a refractory in high-temperature applications. This study describes the use of local chromite ore as a refractory material. To assess the feasibility of the local chromite, its chemical analysis and refractoriness were first measured. To produce the chromite refractory brick, the material was pressed under a 400-ton press, dried, and fired at 1580°C for fifty-two hours. Then, the standard properties that a chromite brick should possess, such as cold crushing strength, apparent porosity, apparent specific gravity, bulk density and water absorption, were measured. According to the results obtained, the brick made from local chromite ore was suitable for use as a refractory brick.