Effect of Nano-Silver on Growth of Saffron in Flooding Stress

Saffron (Crocus sativus) is cultivated as a spice, medicinal, and aromatic plant species. In autumn, heavy rainfall can cause flooding stress and inhibit the growth of saffron. This research was therefore conducted to study the effect of silver ions (as an ethylene inhibitor) on the growth of saffron under flooding conditions. Saffron corms were soaked in one of four concentrations of nano-silver (0, 40, 80, or 120 ppm) and then planted under flooding or non-flooding conditions. Results showed that root number, root length, root fresh and dry weight, and leaf fresh and dry weight were reduced by 10 days of flooding stress. Soaking saffron corms in 40 or 80 ppm nano-silver counteracted the effect of flooding stress on root number, increasing it. Furthermore, 40 ppm nano-silver increased root length under stress, and 80 ppm nano-silver increased leaf dry weight under flooding stress.

Space Time Processing with Adaptive STBC-OFDM Systems

In this paper, optimum adaptive loading algorithms are applied to a multicarrier system with a Space-Time Block Coding (STBC) scheme, combined with space-time processing based on singular value decomposition (SVD) of the channel matrix, over Rayleigh fading channels. The SVD method is employed in the MIMO-OFDM system to overcome subchannel interference. Chow's and Campello's algorithms are implemented to obtain a bit and power allocation for each subcarrier, assuming instantaneous channel knowledge. The adaptive loaded SVD-STBC scheme is capable of providing both full rate and full diversity for any number of transmit antennas. The effectiveness of these techniques is demonstrated through simulation of an adaptive loaded SVD-STBC system, and the comparison shows that the proposed algorithms ensure better performance in the MIMO case.
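
As a rough illustration of the processing chain described above, the sketch below decomposes a Rayleigh-fading MIMO channel by SVD into parallel eigen-subchannels and then performs a Campello-style greedy bit and power loading over them; the SNR gap, noise power, and target bit budget are illustrative assumptions rather than values from the paper.

```python
import numpy as np

def greedy_bit_loading(channel_gains, total_bits, noise_power=1.0, gap_db=3.0):
    """Campello-style greedy loading sketch: assign bits one at a time to the
    subchannel with the smallest incremental power cost."""
    gap = 10 ** (gap_db / 10.0)
    bits = np.zeros(len(channel_gains), dtype=int)
    power = np.zeros(len(channel_gains))
    for _ in range(total_bits):
        # incremental power needed to add one more bit on each subchannel
        inc = gap * noise_power * (2.0 ** (bits + 1) - 2.0 ** bits) / channel_gains
        k = int(np.argmin(inc))
        power[k] += inc[k]
        bits[k] += 1
    return bits, power

# Rayleigh-fading MIMO channel; SVD turns it into parallel eigen-subchannels.
rng = np.random.default_rng(0)
nt, nr = 4, 4
H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
_, s, _ = np.linalg.svd(H)           # singular values of the channel matrix
gains = s ** 2                        # per-subchannel power gains
bits, power = greedy_bit_loading(gains, total_bits=16)
print("bits per eigen-subchannel:", bits)
print("power per eigen-subchannel:", np.round(power, 2))
```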

Grid-based Supervised Clustering - GBSC

This paper presents a supervised clustering algorithm, namely Grid-Based Supervised Clustering (GBSC), which is able to identify clusters of arbitrary shape and size without presuming any canonical form for the data distribution. The GBSC needs no prespecified number of clusters, is insensitive to the order of the input data objects, and is capable of handling outliers. Built on a combination of grid-based and density-based clustering, and aided by the downward closure property of density used in bottom-up subspace clustering, the GBSC can notably reduce its search space and avoid memory-confinement problems during execution. On two-dimensional synthetic datasets, the GBSC identifies clusters with different shapes and sizes correctly. The GBSC also outperforms five other supervised clustering algorithms in experiments on several UCI datasets.
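
The grid/density idea underlying the GBSC can be conveyed with the simplified (and unsupervised) sketch below: points are mapped to grid cells, cells with enough points are treated as dense, and adjacent dense cells are merged into clusters while sparse cells are left as outliers. The class-label handling and the downward-closure-based search-space pruning of the actual GBSC are not reproduced here.

```python
import numpy as np
from collections import deque, defaultdict

def grid_density_clusters(points, cell_size=0.5, min_pts=5):
    """Simplified grid/density sketch: map points to grid cells, keep cells with
    at least min_pts points as 'dense', and merge neighbouring dense cells."""
    cells = defaultdict(list)
    for idx, (x, y) in enumerate(points):
        cells[(int(np.floor(x / cell_size)), int(np.floor(y / cell_size)))].append(idx)

    dense = {c for c, members in cells.items() if len(members) >= min_pts}
    labels = np.full(len(points), -1)           # -1 marks outliers / sparse cells
    cluster_id, visited = 0, set()
    for start in dense:
        if start in visited:
            continue
        queue = deque([start]); visited.add(start)
        while queue:                             # flood-fill over adjacent dense cells
            cx, cy = queue.popleft()
            labels[cells[(cx, cy)]] = cluster_id
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    nb = (cx + dx, cy + dy)
                    if nb in dense and nb not in visited:
                        visited.add(nb); queue.append(nb)
        cluster_id += 1
    return labels
```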

Reliability Analysis in Electrical Distribution System Considering Preventive Maintenance Applications on Circuit Breakers

This paper presents the results of a preventive maintenance application-based study and the modeling of failure rates in circuit breakers of electrical distribution systems, a critical issue in system reliability assessment. In the analysis conducted in this paper, the impacts of failure rate variations caused by preventive maintenance are examined, considered as part of a Reliability Centered Maintenance (RCM) application program. A number of load point reliability indices are derived using a mathematical model of the failure rate, which is established from data observed in a distribution system.
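
As a minimal sketch of how load point data feed into system indices, the code below computes the standard SAIFI, SAIDI, and CAIDI indices before and after an assumed maintenance-driven reduction of the failure rate; the load point data and the 40% reduction factor are hypothetical and are not the paper's model.

```python
def system_indices(load_points):
    """Compute standard system reliability indices from per-load-point data.
    Each load point: (customers, failure_rate per yr, outage_duration hrs)."""
    total_customers = sum(c for c, _, _ in load_points)
    saifi = sum(c * lam for c, lam, _ in load_points) / total_customers
    saidi = sum(c * lam * r for c, lam, r in load_points) / total_customers
    caidi = saidi / saifi
    return saifi, saidi, caidi

# Hypothetical effect of breaker preventive maintenance: failure rate scaled down.
base = [(400, 0.25, 4.0), (250, 0.15, 3.0), (350, 0.30, 5.0)]
maintained = [(c, lam * 0.6, r) for c, lam, r in base]   # assumed 40% reduction
print("before PM:", system_indices(base))
print("after  PM:", system_indices(maintained))
```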

Debye Layer Confinement of Nucleons in Nuclei by Laser Ablated Plasma

Following the laser ablation studies leading to a theory of nuclear confinement by a Debye layer mechanism, we present here numerical evaluations for the known stable nuclei, where the Coulomb repulsion is included as a rather minor component, especially for larger nuclei. In this paper, the physical conditions required for the formation and stability of nuclei, particularly endothermic nuclei above a certain mass number, whose formation is an open astrophysical question, have been investigated. Using the Debye layer mechanism, the nuclear surface energy, the Fermi energy, and the Coulomb repulsion energy, it is possible to find conditions under which the process of nucleation was permitted in the early universe. Our numerical calculations indicate that about 200 seconds after the big bang, at a temperature of about 100 keV and in the subrelativistic regime, with nucleon density nearly equal to normal nuclear density (on the order of 10^38 nucleons per cm^3), all endothermic and exothermic nuclei have been formed.
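
For reference, the standard textbook forms of the quantities named above are collected below; they illustrate the ingredients of the energy balance and are not necessarily the exact expressions used in the paper's evaluation.

```latex
% Standard forms of the quantities entering the energy balance (illustrative only).
\begin{align}
  \lambda_D &= \sqrt{\frac{\varepsilon_0 k_B T}{n\,e^2}}            && \text{(Debye length of the screening layer)}\\
  E_F       &= \frac{\hbar^2}{2 m_N}\bigl(3\pi^2 n\bigr)^{2/3}      && \text{(Fermi energy of the nucleon gas)}\\
  E_C       &= \frac{3}{5}\,\frac{Z^2 e^2}{4\pi\varepsilon_0 R}     && \text{(Coulomb energy of a uniformly charged sphere)}\\
  E_S       &= 4\pi R^2 \sigma \;\propto\; a_S\,A^{2/3}             && \text{(surface energy, liquid-drop form)}
\end{align}
```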

High-Intensity Nanosecond Pulsed Electric Field Effects on Early Physiological Development in Arabidopsis thaliana

The influence of pulsed electric fields on early physiological development in Arabidopsis thaliana was studied. Inside a 4-mm electroporation cuvette, pre-germination seeds were subjected to high-intensity, nanosecond electrical pulses generated using a laboratory-assembled pulsed electric field system. The field strength was varied from 5 to 20 kV·cm⁻¹, while the pulse width and pulse number were maintained at 10 ns and 100, respectively, corresponding to specific treatment energies from 300 J·kg⁻¹ to 4.5 kJ·kg⁻¹. Statistical analyses of the average leaf area 5 and 15 days after pulsed electric field treatment showed that the effects become significant in the second week after treatment, with a maximum increase of 80% compared to the control (P < 0.01).
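
A back-of-envelope check of the stated treatment energies is sketched below, assuming square pulses and a specific energy W = σE²τN/ρ with an assumed medium conductivity of about 1.2 S/m and unit water density; under those assumptions the 10 ns, 100-pulse treatments give roughly 300 J·kg⁻¹ at 5 kV·cm⁻¹ and close to the quoted upper value at 20 kV·cm⁻¹, since the energy scales with E².

```python
def specific_energy(E_field_kV_cm, pulse_width_s, n_pulses,
                    conductivity_S_m=1.2, density_kg_m3=1000.0):
    """Specific treatment energy (J/kg) for square pulses:
    W = sigma * E^2 * tau * N / rho.  Conductivity and density are assumed."""
    E = E_field_kV_cm * 1e5            # kV/cm -> V/m
    return conductivity_S_m * E**2 * pulse_width_s * n_pulses / density_kg_m3

for field in (5, 10, 15, 20):          # kV/cm, 10 ns pulses, 100 pulses
    print(field, "kV/cm ->", round(specific_energy(field, 10e-9, 100)), "J/kg")
# roughly 300 J/kg at 5 kV/cm, rising with E^2 toward the kJ/kg range at 20 kV/cm
```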

Applying Gibbs Sampler for Multivariate Hierarchical Linear Model

Among the various HLM techniques, the Multivariate Hierarchical Linear Model (MHLM) is desirable, particularly when multivariate criterion variables are collected and the covariance structure carries information valuable for data analysis. In order to reflect prior information or to obtain stable results when the sample size and the number of groups are not sufficiently large, Bayesian methods have often been employed in hierarchical data analysis. In these cases, although the Markov Chain Monte Carlo (MCMC) method is a rather powerful tool for parameter estimation, MCMC procedures have not been formulated for the MHLM. For this reason, this research presents concrete procedures for parameter estimation through the use of the Gibbs sampler. Lastly, several future topics for the use of the MCMC approach for HLM are discussed.
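
To show the structure of the Gibbs iteration, the sketch below implements a sampler for a much-simplified two-level univariate normal model with improper reference priors; the full MHLM conditionals (multivariate normal and inverse-Wishart updates) follow the same cycling pattern but are not reproduced here.

```python
import numpy as np

def gibbs_hierarchical_normal(y_groups, n_iter=2000, seed=0):
    """Gibbs sampler sketch for a simple two-level normal model:
    y_ij ~ N(theta_j, sigma2), theta_j ~ N(mu, tau2), with flat/reference priors."""
    rng = np.random.default_rng(seed)
    J = len(y_groups)
    n = np.array([len(g) for g in y_groups])
    ybar = np.array([np.mean(g) for g in y_groups])
    theta, mu, sigma2, tau2 = ybar.copy(), ybar.mean(), 1.0, 1.0
    draws = []
    for _ in range(n_iter):
        # theta_j | rest : precision-weighted combination of group mean and mu
        prec = n / sigma2 + 1.0 / tau2
        mean = (n * ybar / sigma2 + mu / tau2) / prec
        theta = rng.normal(mean, np.sqrt(1.0 / prec))
        # mu | rest
        mu = rng.normal(theta.mean(), np.sqrt(tau2 / J))
        # sigma2 | rest (scaled inverse chi-square draw)
        sse = sum(np.sum((g - t) ** 2) for g, t in zip(y_groups, theta))
        sigma2 = sse / rng.chisquare(n.sum())
        # tau2 | rest
        tau2 = np.sum((theta - mu) ** 2) / rng.chisquare(J)
        draws.append((mu, sigma2, tau2))
    return np.array(draws)
```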

Turbulent Forced Convection Flow in a Channel over Periodic Grooves Using Nanofluids

Turbulent forced convection flow in a two-dimensional channel over periodic grooves is numerically investigated. The finite volume method is used, and the effect of the turbulence model is studied. The Reynolds number is varied from 10,000 to 30,000 for a rib-height to channel-height ratio (B/H) of 2. The downstream wall is heated by a uniform heat flux, while the upstream wall is insulated. Different types of nanoparticles, such as SiO2, Al2O3, and ZnO, with water as the base fluid, are investigated. The volume fraction is varied from 1% to 4%, and the nanoparticle diameter from 20 nm to 50 nm. The results reveal a 114% heat transfer enhancement compared to water in the grooved channel when SiO2 nanoparticles are used with a volume fraction of 4% and a nanoparticle diameter of 20 nm.
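
The nanofluid is typically characterized through effective thermophysical properties; the sketch below uses the classical volume-weighted mixture rules for density and heat capacity and the Maxwell model for thermal conductivity, with illustrative water and SiO2 property values. The correlations actually adopted in the study may differ.

```python
def nanofluid_properties(phi, rho_f, cp_f, k_f, rho_p, cp_p, k_p):
    """Effective properties from classical mixture rules (the study may use
    different correlations): volume-weighted density/heat capacity and the
    Maxwell model for thermal conductivity."""
    rho_nf = (1 - phi) * rho_f + phi * rho_p
    cp_nf = ((1 - phi) * rho_f * cp_f + phi * rho_p * cp_p) / rho_nf
    k_nf = k_f * (k_p + 2 * k_f - 2 * phi * (k_f - k_p)) / \
                 (k_p + 2 * k_f + phi * (k_f - k_p))
    return rho_nf, cp_nf, k_nf

# Illustrative values: water base fluid with SiO2 particles at 4% volume fraction.
rho, cp, k = nanofluid_properties(phi=0.04,
                                  rho_f=998.2, cp_f=4182.0, k_f=0.613,
                                  rho_p=2200.0, cp_p=703.0, k_p=1.2)
print(f"rho={rho:.1f} kg/m3, cp={cp:.0f} J/kgK, k={k:.3f} W/mK")
```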

An Experimental Investigation on the Effect of Deep Cold Rolling Parameters on Surface Roughness and Hardness of AISI 4140 Steel

Deep cold rolling (DCR) is a cold working process that readily produces a smooth and work-hardened surface by plastic deformation of surface irregularities. In the present study, the influence of the main deep cold rolling process parameters on the surface roughness and hardness of AISI 4140 steel was studied using a fractional factorial design of experiments. The surface integrity of the work material was assessed by identifying the predominant factors among the selected parameters, their order of significance, and the factor levels that minimize surface roughness and/or maximize surface hardness. It was found that the ball diameter, rolling force, initial surface roughness, and number of tool passes are the most pronounced parameters, having great effects on the workpiece surface during the deep cold rolling process. A simple, inexpensive, newly developed DCR tool, with an interchangeable collet for using different ball diameters, was used throughout the experimental work presented in this paper.
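
A minimal sketch of the factorial analysis is given below: main effects in a two-level fractional factorial are estimated as the difference between the mean response at the high and low levels of each factor. The 2^(4-1) design and the roughness values are made up for illustration and are not the experimental data of the study.

```python
import itertools
import numpy as np

def main_effects(design, responses):
    """Main effect of each factor in a two-level design: mean response at the
    high (+1) level minus mean response at the low (-1) level."""
    design = np.asarray(design, dtype=float)
    responses = np.asarray(responses, dtype=float)
    return {j: responses[design[:, j] > 0].mean() - responses[design[:, j] < 0].mean()
            for j in range(design.shape[1])}

# Hypothetical 2^(4-1) fractional factorial (defining relation D = ABC) for
# factors: ball diameter, rolling force, initial roughness, number of passes.
base = list(itertools.product([-1, 1], repeat=3))
design = [(a, b, c, a * b * c) for a, b, c in base]
roughness = [0.42, 0.35, 0.38, 0.30, 0.33, 0.27, 0.29, 0.22]   # made-up Ra values
print(main_effects(design, roughness))
```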

Grid Independence Study of Flow Past a Square Cylinder Using the Multi-Relaxation-Time Lattice Boltzmann Method

Numerical calculations of flow around a square cylinder are presented using the multi-relaxation-time lattice Boltzmann method at a Reynolds number of 150. The effects of upstream location, downstream location, and blockage are investigated systematically. A detailed analysis is given in terms of time traces of the drag and lift coefficients, power spectra of the lift coefficient, vorticity contour visualizations, and phase diagrams. A number of physical quantities, including the mean drag coefficient, Strouhal number, and root-mean-square values of the drag and lift coefficients, are calculated and compared with well-resolved experimental data and numerical results available in the open literature. The results show that the upstream length, downstream length, and height of the computational domain should be at least 7.5, 37.5, and 12 cylinder diameters, respectively.
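
A typical post-processing step for such runs is sketched below: the Strouhal number is taken from the dominant peak of the lift-coefficient power spectrum, alongside the mean drag and the rms fluctuations. The synthetic signals serve only as a self-check of the routine and are not results from the simulations.

```python
import numpy as np

def bulk_quantities(cd, cl, dt, D=1.0, U=1.0):
    """Mean drag, rms of drag/lift fluctuations, and Strouhal number from the
    dominant peak of the lift-coefficient power spectrum."""
    cd, cl = np.asarray(cd), np.asarray(cl)
    cd_mean = cd.mean()
    cd_rms = np.sqrt(np.mean((cd - cd.mean()) ** 2))
    cl_rms = np.sqrt(np.mean((cl - cl.mean()) ** 2))
    spec = np.abs(np.fft.rfft(cl - cl.mean())) ** 2
    freqs = np.fft.rfftfreq(len(cl), d=dt)
    f_shed = freqs[np.argmax(spec[1:]) + 1]         # skip the zero-frequency bin
    return cd_mean, cd_rms, cl_rms, f_shed * D / U   # St = f D / U

# Synthetic check: a lift signal oscillating at St = 0.16 is recovered correctly.
t = np.arange(0, 200, 0.05)
cl = 0.4 * np.sin(2 * np.pi * 0.16 * t)
cd = 1.5 + 0.05 * np.sin(2 * np.pi * 0.32 * t)       # drag fluctuates at twice f
print(bulk_quantities(cd, cl, dt=0.05))
```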

Reducing the Number of Constraints in Non-Safe Petri Nets

This paper addresses the problem of forbidden states in non-safe Petri nets. To prevent the system from entering forbidden states, linear constraints can be assigned to them, and these constraints can then be enforced on the system using control places. However, when the number of constraints is large, a large number of control places must be added, which complicates the system model. There are methods for reducing the number of constraints in safe Petri nets, but no systematic method exists for non-safe Petri nets. In this paper we propose a method for reducing the number of constraints in non-safe Petri nets, based on solving an integer linear programming problem.
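
The flavor of the integer-programming formulation can be conveyed by the generic separating-constraint sketch below (using the PuLP modeling library): a single integer constraint a·m ≤ b is sought that every legal marking satisfies and every forbidden marking violates. The markings, bounds, and objective are hypothetical, and the paper's actual formulation may differ.

```python
import pulp

legal = [(1, 0, 2), (0, 1, 1), (2, 0, 0)]       # hypothetical allowed markings
forbidden = [(2, 2, 0), (0, 3, 1)]              # hypothetical forbidden markings

n_places = 3
prob = pulp.LpProblem("constraint_reduction", pulp.LpMinimize)
a = [pulp.LpVariable(f"a_{i}", lowBound=0, upBound=10, cat="Integer")
     for i in range(n_places)]
b = pulp.LpVariable("b", lowBound=0, upBound=50, cat="Integer")

prob += pulp.lpSum(a) + b                        # keep coefficients small
for m in legal:                                  # legal markings satisfy a.m <= b
    prob += pulp.lpSum(a[i] * m[i] for i in range(n_places)) <= b
for m in forbidden:                              # forbidden markings violate it
    prob += pulp.lpSum(a[i] * m[i] for i in range(n_places)) >= b + 1

prob.solve()
print([pulp.value(v) for v in a], pulp.value(b))
```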

The Traffic Prediction Multi-path Energy-aware Source Routing (TP-MESR) in Ad hoc Networks

The purpose of this study is to suggest energy-efficient routing for ad hoc networks, which are composed of nodes with limited energy. Among the diverse problems in such networks, the limited energy supply of nodes makes energy management critical, and a number of protocols have been proposed for energy conservation and energy efficiency. In this study, the main limitation of EA-MPDSR, an energy-efficient routing protocol that uses only two paths, is addressed and improved upon. The proposed TP-MESR uses a multi-path routing technique and a traffic prediction function to increase the number of paths beyond two. Its efficiency is verified against EA-MPDSR using the network simulator NS-2. To give the work academic rigor and describe the protocol systematically, the research guidelines suggested by Hevner (2004) are applied. The proposed TP-MESR alleviates the problems of existing multi-path routing related to overhead, radio interference, and packet reassembly, and its contribution to the effective use of energy in ad hoc networks is confirmed.
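
A purely illustrative sketch of combining traffic prediction with energy awareness in path selection is given below; the scoring function, weights, and node data are hypothetical and are not taken from the TP-MESR specification.

```python
def select_paths(candidate_paths, residual_energy, predicted_traffic,
                 max_paths=3, alpha=0.7):
    """Hypothetical scoring sketch: rank candidate paths by a weighted mix of the
    bottleneck residual energy along the path and the predicted traffic on it,
    then keep up to max_paths routes (more than the two used by EA-MPDSR)."""
    def score(path):
        bottleneck = min(residual_energy[n] for n in path)       # weakest node
        load = sum(predicted_traffic.get(n, 0.0) for n in path)  # expected load
        return alpha * bottleneck - (1 - alpha) * load
    return sorted(candidate_paths, key=score, reverse=True)[:max_paths]

# Illustrative use with made-up node energies (J) and predicted packet rates.
energy = {"A": 40, "B": 75, "C": 60, "D": 90, "E": 55}
traffic = {"A": 5, "B": 2, "C": 8, "D": 1, "E": 3}
paths = [["A", "B", "D"], ["A", "C", "D"], ["A", "E", "D"], ["A", "B", "E", "D"]]
print(select_paths(paths, energy, traffic))
```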

Investigating Quality Metrics for Multimedia Traffic in OLSR Routing Protocol

An ad hoc wireless network comprises mobile terminals linked and communicating with each other without the aid of traditional infrastructure. Optimized Link State Routing (OLSR) is a proactive routing protocol in which routes are discovered and updated continuously so that they are available when needed. Hello messages generated by a node seek information about its neighbors, and if a neighbor fails to respond to a specified number of hello messages, regulated by the neighborhood hold time, the node assumes that the neighbor is no longer in range. This paper evaluates the OLSR routing protocol in a random-mobility network with various neighborhood hold time intervals. The throughput and delivery ratio are also evaluated to assess its efficiency under multimedia loads.
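
The hold-time mechanism described above can be sketched as simple neighbor-table bookkeeping: each received hello refreshes a neighbor's expiry time, and an entry that is not refreshed within the neighborhood hold time (commonly a small multiple of the hello interval) is purged. The interval and multiplier below are illustrative defaults, not the values studied in the paper.

```python
class NeighborTable:
    """Sketch of OLSR-style neighbor bookkeeping: every received HELLO refreshes
    a neighbor's expiry time, set to the neighborhood hold time; a neighbor
    whose entry is not refreshed within that window is assumed out of range."""
    def __init__(self, hello_interval=2.0, hold_multiplier=3):
        self.hold_time = hello_interval * hold_multiplier   # e.g. 3 missed HELLOs
        self.expiry = {}                                     # neighbor -> expiry time

    def on_hello(self, neighbor, now):
        self.expiry[neighbor] = now + self.hold_time

    def purge(self, now):
        lost = [n for n, t in self.expiry.items() if t <= now]
        for n in lost:
            del self.expiry[n]
        return lost                                          # neighbors considered lost

table = NeighborTable(hello_interval=2.0, hold_multiplier=3)
table.on_hello("node7", now=0.0)
print(table.purge(now=5.0))   # []        -> still within the 6 s hold time
print(table.purge(now=6.5))   # ['node7'] -> dropped after the hold time expires
```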

Methodology of Realization for Supervisor and Simulator Dedicated to a Semiconductor Research and Production Factory

In the micro- and nano-technology industry, the «clean rooms» dedicated to chip manufacturing are equipped with the most sophisticated equipment and tools. They use a large number of resources according to strict specifications to achieve optimal operation and results. The distribution of «utilities» to production is ensured by teams who use a supervision tool. Studies show the value of controlling the various parameters of production and/or distribution, in real time, through a reliable and effective supervision tool. This paper covers a large part of the functions that the supervisor must provide, together with complementary diagnostic and simulation functionalities that prove very useful in our case, where the supervised installations are complex and in constant evolution.

Practical Issues for Real-Time Video Tracking

In this paper we present an algorithm that allows object tracking close to real time in Full HD video. The frame rate (FR) of a video stream is considered to be between 5 and 30 frames per second, so real-time track building is achieved if the algorithm can process 5 or more frames per second. The principal idea is to use fast algorithms during preprocessing to obtain the key points and then track them. The point-matching procedure during assignment depends strongly on the number of points; because of this, we limit the number of points, keeping only the most informative of them.
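
One possible realization of this strategy, sketched below with OpenCV, detects fast binary-descriptor keypoints, keeps only the strongest responses to cap the matching cost, and matches consecutive frames by brute-force Hamming distance; the choice of ORB and the point budget are assumptions, not necessarily the detectors used in the paper.

```python
import cv2

def informative_keypoints(gray, max_points=500):
    """Detect fast binary-descriptor keypoints and keep only the strongest ones
    (by detector response) to bound the cost of the matching/assignment step."""
    orb = cv2.ORB_create(nfeatures=max_points * 4)
    kps, desc = orb.detectAndCompute(gray, None)
    order = sorted(range(len(kps)), key=lambda i: kps[i].response, reverse=True)
    keep = order[:max_points]
    return [kps[i] for i in keep], desc[keep]

def match_frames(desc_prev, desc_curr):
    """Brute-force Hamming matching between consecutive frames."""
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return sorted(matcher.match(desc_prev, desc_curr), key=lambda m: m.distance)

# prev_gray and curr_gray would be consecutive Full HD frames converted with
# cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY); capping the point count keeps the
# assignment step fast enough for roughly 5+ frames per second.
```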

e-Learning Program with Voice Assistance for Tactile Braille

With the increasing incidence of glaucoma, diabetic retinopathy, retinitis pigmentosa, and similar conditions, the number of people with vision loss is also increasing in Japan. It is difficult for the visually impaired to learn and acquire braille because most of them are middle-aged. In addition, the number of braille teachers in Japan is insufficient and decreasing, which makes the situation even more difficult for the visually impaired. Therefore, we research and develop a Web-based e-learning program for tactile braille that works together with a braille display and voice assistance.

School Design and Energy Efficiency

Auckland has a temperate climate with comfortable, warm, dry summers and mild, wet winters. An Auckland school normally does not need air conditioning for cooling during the summer and only needs heating during the winter. Space heating energy is the major portion of winter school energy consumption, and winter energy consumption is the major portion of annual school energy consumption. School building thermal design should therefore focus on winter thermal performance to reduce space heating energy. Design data and energy consumption data from a number of Auckland schools are used in this study. This pilot study investigates the relationships between the schools' energy consumption data and their building design data in order to improve future school design for energy efficiency.

Information Gain Ratio Based Clustering for Investigation of Environmental Parameters Effects on Human Mental Performance

Clustering methods developed in data mining theory can be successfully applied to the investigation of various kinds of dependencies between environmental conditions and human activities. It is known that environmental parameters such as temperature, relative humidity, atmospheric pressure, and illumination have significant effects on human mental performance. To investigate the effect of these parameters, a data mining clustering technique based on entropy and the Information Gain Ratio (IGR), K(Y|X) = (H(X) − H(Y|X))/H(Y), where H(Y) = −Σ p_i ln(p_i), is used. This technique allows the boundaries of the clusters to be adjusted. It is shown that the information gain ratio grows monotonically with the degree of connectivity between two variables. This approach has some advantages over, for example, correlation analysis, owing to its relatively lower sensitivity to the shape of the functional dependence. A variant of an algorithm implementing the proposed method, together with some analysis of the above problem of environmental effects, is also presented. It is shown that the proposed method converges in a finite number of steps.
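
A minimal sketch of the entropy computations is given below; it follows the IGR definition quoted above (which differs from the more common gain-ratio normalization), applied to discretized environmental and performance labels that are made up for illustration.

```python
import numpy as np
from collections import Counter

def entropy(labels):
    """H = -sum p_i ln p_i over the empirical distribution of the labels."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

def conditional_entropy(y, x):
    """H(Y|X) = sum_x p(x) * H(Y | X = x)."""
    y, x = np.asarray(y), np.asarray(x)
    return sum(np.mean(x == v) * entropy(y[x == v]) for v in np.unique(x))

def igr(y, x):
    """IGR as defined in the abstract: K(Y|X) = (H(X) - H(Y|X)) / H(Y)."""
    return (entropy(x) - conditional_entropy(y, x)) / entropy(y)

# Example: an environmental parameter (e.g. temperature) discretized into cluster
# bins x, and a discretized performance score y; both label sequences are made up.
x = [0, 0, 0, 1, 1, 1, 2, 2, 2, 2]
y = [0, 0, 1, 1, 1, 1, 2, 2, 2, 1]
print(round(igr(y, x), 3))
```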

Numerical Investigation of the Effect of Flow and Heat Transfer of a Semi-Cylindrical Obstacle Located in a Channel

In this study, a semi-cylindrical obstacle placed in a channel is considered in order to determine its effect on the flow and heat transfer around it. Both orientations of the semi-cylinder are used in the numerical analysis: first the front face of the semi-cylinder is placed perpendicular to the flow, and then the rear face. The study is carried out numerically using the commercial software ANSYS 11.0, with the well-known κ-ε model applied as the turbulence model. The Reynolds number is in the range of 10^4 to 10^5, and air is assumed as the working fluid. The results show that heat transfer increased by approximately 15% in the front-face case, while it was enhanced by up to 28% in the rear-face case.

Multiple Job Shop-Scheduling using Hybrid Heuristic Algorithm

In this paper, multi-processor job shop scheduling problems are solved by a heuristic algorithm based on a hybrid of priority dispatching rules guided by an ant colony optimization algorithm. The objective is to minimize the makespan, i.e., the total completion time, and the simultaneous presence of several kinds of pheromones is allowed. Using a suitable hybrid of priority dispatching rules improves the process of finding the best solution. The ant colony optimization algorithm not only strengthens the proposed algorithm, but also decreases the total working time by reducing setup times and adjusting the production line, so that similar jobs use the same production lines. Another advantage of this algorithm is that similar (but not identical) machines can be considered, so these machines can process a job with different processing and setup times. To evaluate this capability and the algorithm itself, a number of test problems are solved and the associated results are analyzed. The results show a significant decrease in throughput time. They also show that the algorithm is able to recognize the bottleneck machine and to schedule jobs efficiently.
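
The way pheromones can guide the choice among priority dispatching rules is sketched below: at each dispatching decision a rule is drawn with probability proportional to pheromone^α · heuristic^β, and trails are evaporated and then reinforced in proportion to the inverse makespan of the resulting schedule. The rule set, desirability values, and makespan are illustrative, and the job-shop simulator that would evaluate each schedule is not shown.

```python
import random

RULES = ["SPT", "LPT", "MWKR", "FIFO"]          # candidate priority dispatching rules

def choose_rule(pheromone, heuristic, alpha=1.0, beta=1.0):
    """Pick a dispatching rule with probability proportional to
    pheromone^alpha * heuristic^beta (the usual ACO transition rule)."""
    weights = [pheromone[r] ** alpha * heuristic[r] ** beta for r in RULES]
    return random.choices(RULES, weights=weights, k=1)[0]

def update_pheromone(pheromone, chosen_rules, makespan, rho=0.1):
    """Evaporate all trails, then reinforce the rules used by an ant in
    proportion to the quality (inverse makespan) of its schedule."""
    for r in pheromone:
        pheromone[r] *= (1 - rho)
    for r in chosen_rules:
        pheromone[r] += 1.0 / makespan
    return pheromone

# Illustrative loop body: each ant builds a schedule by repeatedly calling
# choose_rule() at every dispatching decision, evaluates the resulting makespan
# with a job-shop simulator (not shown), and then reinforces its trail.
pheromone = {r: 1.0 for r in RULES}
heuristic = {"SPT": 1.2, "LPT": 0.8, "MWKR": 1.0, "FIFO": 0.9}   # assumed desirabilities
rule = choose_rule(pheromone, heuristic)
pheromone = update_pheromone(pheromone, [rule], makespan=125.0)
```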