A Fast Sensor Relocation Algorithm in Wireless Sensor Networks

Sensor relocation repairs coverage holes caused by node failures. One way to repair a coverage hole is to find a redundant node to replace the faulty node. Most existing approaches take a long time to find redundant nodes because the redundant nodes are scattered randomly around the sensing field. To record the precise positions of sensor nodes, most studies assume that GPS is installed on every node. However, the high cost and power consumption of GPS are heavy burdens for sensor nodes. We therefore propose a fast sensor relocation algorithm that arranges redundant nodes into redundant walls without GPS. A redundant wall is constructed at the position where the average distance to each sensor node is shortest, so the walls can guide sensor nodes to redundant nodes in minimum time. Simulation results show that our algorithm finds a proper redundant node in minimum time and reduces the relocation time with low message complexity.
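As a purely geometric illustration of the placement criterion (the algorithm itself works without GPS, so real nodes would not hold these coordinates), the point minimizing the average Euclidean distance to a set of positions is the geometric median, which can be approximated by Weiszfeld's iteration; the node coordinates below are hypothetical:

```python
import numpy as np

def geometric_median(points, iters=100, eps=1e-9):
    """Weiszfeld iteration: approximates the point minimizing the
    average Euclidean distance to a set of points."""
    y = points.mean(axis=0)                # start from the centroid
    for _ in range(iters):
        d = np.maximum(np.linalg.norm(points - y, axis=1), eps)
        w = 1.0 / d                        # inverse-distance weights
        y_new = (points * w[:, None]).sum(axis=0) / w.sum()
        if np.linalg.norm(y_new - y) < eps:
            break
        y = y_new
    return y

# Hypothetical sensor coordinates in a 100 m x 100 m field.
nodes = np.random.rand(50, 2) * 100
print(geometric_median(nodes))
```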

Effects of Aggressive Ammonium Nitrate on Durability Properties of Concrete Using Sandstone and Granite Aggregates

The storage of chemical fertilizers in concrete buildings often leads to durability problems due to chemical attack, with the damage mostly caused by certain ammonium salts. The main purpose of this research is to investigate the durability properties of concrete exposed to ammonium nitrate solution. In this investigation, experiments are conducted on concrete grades G50 and G60. Leaching is induced using a 20% ammonium nitrate solution. The durability properties investigated are water absorption, volume of permeable voids, and sorptivity. Compressive strength, pH value, and degradation depth are measured after set periods of leaching. The experiments show a decrease in compressive strength and an increase in porosity. The experimental data also show that the pH value decreases with increased leaching time, while the degradation depth of the concrete increases with leaching time. Comparing the two grades, G60 concrete is more resistant to ammonium nitrate attack.

Computational Aspects of Regression Analysis of Interval Data

We consider linear regression models where both the input data (the values of the independent variables) and the output data (the observations of the dependent variable) are interval-censored. We introduce a possibilistic generalization of the least squares estimator, the so-called OLS-set for the interval model. This set captures the impact of the loss of information caused by interval censoring on the OLS estimator and provides a tool for quantifying this effect. We study complexity-theoretic properties of the OLS-set. We also deal with restricted versions of the general interval linear regression model, in particular the crisp input – interval output model. We argue that natural descriptions of the OLS-set in the crisp input – interval output case cannot be computed in polynomial time. We then derive easily computable approximations of the OLS-set which can be used in place of the exact description, and we illustrate the approach with an example.
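The abstract does not spell out the approximations it derives; as a hedged illustration of what the OLS-set contains, one can sample data realizations from the intervals and collect the resulting OLS estimates, giving a naive inner (Monte Carlo) approximation of the set:

```python
import numpy as np

def ols_set_sample(X_lo, X_hi, y_lo, y_hi, n_samples=1000, rng=None):
    """Sample OLS estimates over data realizations drawn from the given
    intervals; the bounding box (or convex hull) of the returned estimates
    is an inner approximation of the OLS-set, not the exact description."""
    rng = np.random.default_rng(rng)
    betas = []
    for _ in range(n_samples):
        X = rng.uniform(X_lo, X_hi)   # one realization of the interval inputs
        y = rng.uniform(y_lo, y_hi)   # one realization of the interval outputs
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        betas.append(beta)
    return np.array(betas)
```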

An Energy-Efficient Distributed Unequal Clustering Protocol for Wireless Sensor Networks

Wireless sensor networks have been extensively deployed and researched. One of the major issues in wireless sensor networks is developing an energy-efficient clustering protocol, since clustering provides an effective way to prolong the lifetime of a wireless sensor network. In this paper, we compare several clustering protocols that significantly affect the balancing of energy consumption, and we propose an Energy-Efficient Distributed Unequal Clustering (EEDUC) algorithm which provides a new way of creating distributed clusters. In EEDUC, each sensor node sets a waiting time, considered as a function of its residual energy and the number of neighboring nodes, and EEDUC uses this waiting time to distribute cluster heads. We also propose an unequal clustering mechanism to solve the hot-spot problem. Simulation results show that EEDUC distributes the cluster heads well, balances the energy consumption among them, and increases the network lifetime.
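The abstract states only that the waiting time is a function of residual energy and the number of neighboring nodes; the rule below is a hypothetical stand-in illustrating the idea that nodes with more energy and more neighbors wait less and so announce themselves as cluster heads first:

```python
import random

def waiting_time(residual_energy, max_energy, n_neighbors, max_neighbors,
                 t_max=1.0, jitter=0.05):
    """Hypothetical waiting-time rule in the spirit of EEDUC; the exact
    EEDUC formula is not given in the abstract."""
    energy_term = 1.0 - residual_energy / max_energy
    density_term = 1.0 - n_neighbors / max_neighbors
    # A small random jitter breaks ties between identical nodes.
    return t_max * 0.5 * (energy_term + density_term) + random.uniform(0, jitter)
```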

Formulation, Analysis and Validation of Takagi-Sugeno Fuzzy Modeling for Robotic Manipulators

This paper proposes a methodology for analyzing the dynamic behavior of a robotic manipulator in continuous time. Initially, the nonlinear system is decomposed into linear submodels and analyzed in the context of Linear Parameter Varying (LPV) systems. The obtained linear submodels, which represent the local dynamic behavior of the robotic manipulator at selected operating points, are grouped in a Takagi-Sugeno fuzzy structure. The resulting fuzzy model is analyzed and validated through analog simulation as a universal approximator of the robotic manipulator.
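A minimal sketch of the Takagi-Sugeno structure described here: the global dynamics are a normalized membership-weighted blend of the local linear submodels. The local matrices and Gaussian memberships below are invented for illustration and are not the manipulator's actual submodels:

```python
import numpy as np

def ts_blend(x, models, memberships):
    """Takagi-Sugeno blending: xdot = sum_i w_i(x) * A_i x with normalized
    rule weights (the input term B_i u is omitted for brevity)."""
    w = np.array([mu(x) for mu in memberships])
    w = w / w.sum()                            # normalize rule activations
    return sum(wi * (A @ x) for wi, A in zip(w, models))

# Two hypothetical local submodels valid around different operating points.
A1 = np.array([[0.0, 1.0], [-2.0, -0.5]])
A2 = np.array([[0.0, 1.0], [-4.0, -0.8]])
mu1 = lambda x: np.exp(-((x[0] - 0.0) ** 2))   # membership centered at x1 = 0
mu2 = lambda x: np.exp(-((x[0] - 1.0) ** 2))   # membership centered at x1 = 1
print(ts_blend(np.array([0.4, 0.0]), [A1, A2], [mu1, mu2]))
```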

Using Memetic Algorithms for the Solution of Technical Problems

The intention of this paper is to help users of evolutionary algorithms adapt them more easily to the problem at hand. For many problems in the technical field it is not necessary to reach an optimal solution; it suffices to reach a good solution in time. In many cases the solution is undetermined, or no method exists to determine it directly. For these cases an evolutionary algorithm can be useful. This paper gives the user rules of thumb that make it easier to decide whether a problem is suitable for an evolutionary algorithm and how to design one.
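As a generic illustration of the memetic idea named in the title (an evolutionary loop with a local search applied to each offspring), assuming a simple mutation-based scheme with greedy coordinate search; none of these choices is prescribed by the paper:

```python
import random

def memetic_minimize(f, dim, pop=30, gens=100, step=0.1):
    """Memetic algorithm sketch: evolutionary selection and mutation,
    plus a greedy local-search 'meme' refining each child."""
    P = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=f)
        parents = P[: pop // 2]               # truncation selection
        children = []
        for p in parents:
            c = [x + random.gauss(0, step) for x in p]   # Gaussian mutation
            for i in range(dim):                         # greedy local search
                for d in (-step, step):
                    trial = c[:]
                    trial[i] += d
                    if f(trial) < f(c):
                        c = trial
            children.append(c)
        P = parents + children
    return min(P, key=f)

print(memetic_minimize(lambda v: sum(x * x for x in v), dim=3))
```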

Fifth Order Variable Step Block Backward Differentiation Formulae for Solving Stiff ODEs

Implicit block methods based on the backward differentiation formulae (BDF) for the solution of stiff initial value problems (IVPs) are derived using variable step size. We construct a variable step size block method that stores all the coefficients of the method, with a simplified strategy for controlling the step size, with the intention of optimizing the performance in terms of precision and computation time. The strategy keeps the step size constant, halves it, or increases it to 1.9 times the previous step size; the decision to change the step size is determined by the local truncation error (LTE). Numerical results are provided to support the efficiency of the method.
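The step-size strategy can be sketched directly from the abstract; only the acceptance thresholds below are assumptions:

```python
def next_step(h, lte, tol, safety=0.8):
    """Step-size control described in the abstract: keep the step constant,
    halve it, or increase it to 1.9 times the previous step, based on the
    local truncation error.  Thresholds are illustrative choices."""
    if lte > tol:                      # step rejected: halve and retry
        return 0.5 * h, False
    if lte < safety * 0.1 * tol:       # error comfortably small: grow step
        return 1.9 * h, True
    return h, True                     # otherwise keep the step size
```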

Characterization of Corn Cobs from Microwave and Potassium Hydroxide Pretreatment

The complexity of lignocellulosic biomass requires a pretreatment step to improve the yield of fermentable sugars. The efficiency of pretreating corn cobs with microwave irradiation and potassium hydroxide, followed by enzymatic hydrolysis, was investigated. The objective of this work was to characterize the optimal pretreatment conditions of corn cobs using microwave and potassium hydroxide to enhance enzymatic hydrolysis. Corn cobs were submerged in potassium hydroxide solutions of different concentrations at various temperatures and residence times. The pretreated corn cobs were then hydrolyzed to produce reducing sugar for analysis. The morphology and microstructure of the samples were investigated by thermogravimetric analysis (TGA), scanning electron microscopy (SEM), and X-ray diffraction (XRD). The results showed that lignin and hemicellulose were removed by the microwave/potassium hydroxide pretreatment, and the crystallinity of the pretreated corn cobs was higher than that of the untreated material. The method was compared with autoclave and conventional heating, and the results indicated that microwave-alkali treatment is an efficient way to improve the enzymatic hydrolysis rate by increasing the substrate's accessibility to hydrolysis enzymes.

Primer Design with Specific PCR Product using Particle Swarm Optimization

Before performing polymerase chain reactions (PCR), a feasible primer set is required. Many primer design methods have been proposed for designing feasible primer sets. However, the majority of these methods require a relatively long time to obtain an optimal solution, since large quantities of template DNA need to be analyzed, and the designed primer sets usually do not provide a specific PCR product. In recent years, evolutionary computation has been applied to PCR primer design and has yielded promising results. In this paper, a particle swarm optimization (PSO) algorithm is proposed to solve primer design problems associated with providing a specific product for PCR experiments. A test set of the gene CYP1A1, associated with a heightened lung cancer risk, was analyzed, and accuracy and running time were compared with the genetic algorithm (GA) and the memetic algorithm (MA). The comparison indicated that the proposed PSO method for primer design finds optimal or near-optimal primer sets and effective PCR products in a relatively short time.
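A minimal global-best PSO skeleton of the kind the paper builds on is sketched below; the primer-specific fitness (melting temperature, GC content, dimer checks, product specificity) is abstracted into a user-supplied function and all parameter values are conventional defaults, not the paper's settings:

```python
import random

def pso_minimize(fitness, dim, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, lo=0.0, hi=1.0):
    """Standard global-best particle swarm optimization."""
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]                  # personal best positions
    gbest = min(pbest, key=fitness)            # global best position
    for _ in range(iters):
        for i, x in enumerate(X):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - x[d])
                           + c2 * r2 * (gbest[d] - x[d]))
                x[d] = min(max(x[d] + V[i][d], lo), hi)   # clamp to bounds
            if fitness(x) < fitness(pbest[i]):
                pbest[i] = x[:]
        gbest = min(pbest, key=fitness)
    return gbest
```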

A Novel Architecture for Wavelet based Image Fusion

In this paper, we focus on the fusion of images from different sources using multiresolution wavelet transforms. Based on a review of popular image fusion techniques used in data analysis, different pixel- and energy-based methods are evaluated experimentally. A novel architecture with a hybrid algorithm is proposed which applies a pixel-based maximum selection rule to the low-frequency approximations and filter-mask-based fusion to the high-frequency details of the wavelet decomposition. The key feature of the hybrid architecture is the combination of the advantages of pixel- and region-based fusion in a single image, which can support the development of sophisticated algorithms that enhance edges and structural details. A Graphical User Interface is developed for image fusion to make the research outcomes available to the end user. To make the GUI available for medical, industrial, and commercial activities without a MATLAB installation, a standalone executable application is also developed using the MATLAB Compiler Runtime.
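A hedged sketch of the hybrid rule, assuming PyWavelets for the decomposition and a local-energy window standing in for the filter mask (the authors' actual mask, wavelet, and level are not given in the abstract):

```python
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def hybrid_fuse(img_a, img_b, wavelet="db2", level=2, win=5):
    """Pixel-wise maximum on the low-frequency approximation,
    local-energy selection on the high-frequency detail bands."""
    ca = pywt.wavedec2(img_a, wavelet, level=level)
    cb = pywt.wavedec2(img_b, wavelet, level=level)
    fused = [np.maximum(ca[0], cb[0])]          # approximation: pixel max
    for da, db in zip(ca[1:], cb[1:]):          # details: energy mask
        bands = []
        for sa, sb in zip(da, db):
            ea = uniform_filter(sa * sa, win)   # local energy of each band
            eb = uniform_filter(sb * sb, win)
            bands.append(np.where(ea >= eb, sa, sb))
        fused.append(tuple(bands))
    return pywt.waverec2(fused, wavelet)
```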

Measuring the Comprehensibility of a UML-B Model and a B Model

Software maintenance, which involves making enhancements, modifications and corrections to existing software systems, consumes more than half of developer time. Specification comprehensibility plays an important role in software maintenance, as it permits the system properties to be understood more easily and quickly. The use of a formal notation such as B increases a specification's precision and consistency. However, the notation is regarded as being difficult to comprehend. A semi-formal notation such as the Unified Modelling Language (UML) is perceived as more accessible, but it lacks formality. Combining both notations could perhaps produce a specification that is not only accurate and consistent but also accessible to users. This paper presents an experiment conducted on a model that integrates the use of both UML and B notations, namely UML-B, versus a B model alone. The objective of the experiment was to evaluate the comprehensibility of a UML-B model compared to a traditional B model. The measurement used in the experiment focused on the efficiency of performing the comprehension tasks. The experiment employed a cross-over design and was conducted on forty-one subjects, including undergraduate and master's students. The results show that the notation used in the UML-B model is more comprehensible than the B model.

General Process Control for Intelligent Systems

The development of the intelligent assembly cell concept includes a new kind of solution for creating the structures of automated and flexible assembly systems. The current trend toward increasing final product quality is supported by time analysis of the entire manufacturing process. The primary requirement of manufacturing is to produce as many products as possible, as quickly as possible, at the lowest possible cost, but of course with the highest quality. Such requirements can be satisfied only if all the elements entering and affecting the production cycle are in a fully functional condition. These elements consist of sensory equipment and intelligent control elements that are essential for building intelligent manufacturing systems. The intelligent behavior of the control system will rest on real-time monitoring of the system's important parameters. An intelligent manufacturing system should itself be a system that can flexibly respond to changes in what enters and exits the process, in interaction with its surroundings.

A Semi- One Time Pad Using Blind Source Separation for Speech Encryption

We propose a new perspective on speech communication using blind source separation. The original speech is mixed with key signals, which consist of a mixing matrix, chaotic signals, and random noise. However, parts of the keys (the mixing matrix and the random noise) are not needed for decryption. In a practical implementation, one can encrypt the speech by changing the noise signal every time. Hence, the present scheme obtains the advantages of One Time Pad encryption while avoiding its drawbacks in key exchange. It is demonstrated that the proposed scheme is immune to traditional attacks.
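A toy sketch of the encryption side, assuming a logistic-map chaotic key, a random mixing matrix as the secret, and FastICA standing in for the receiver's blind source separation; this illustrates the mixing idea only and is not the paper's exact construction:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 8000
t = np.arange(n) / 8000.0

speech = np.sin(2 * np.pi * 440 * t)       # stand-in for a speech frame
chaos = np.empty(n)
chaos[0] = 0.3                             # logistic-map chaotic key signal
for i in range(1, n):
    chaos[i] = 3.99 * chaos[i - 1] * (1 - chaos[i - 1])
noise = rng.standard_normal(n)             # fresh random noise every session

S = np.c_[speech, chaos - chaos.mean(), noise]
A = rng.standard_normal((3, 3))            # secret mixing matrix
X = S @ A.T                                # transmitted ciphertext mixtures

# Receiver side: blind source separation recovers the sources up to
# permutation and scale, after which the speech channel is identified.
est = FastICA(n_components=3, random_state=0).fit_transform(X)
```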

Remediation of Petroleum Hydrocarbon-contaminated Soil Slurry by Fenton Oxidation

The objective of this study was to evaluate the optimal treatment conditions of the Fenton oxidation process for removing contaminants from soil slurry contaminated by petroleum hydrocarbons. This research studied several factors that affect the removal efficiency of petroleum hydrocarbons in soil slurry, including the molar ratio of hydrogen peroxide (H2O2) to ferrous ion (Fe2+), the pH, and the reaction time. The results demonstrated that the optimum conditions were a H2O2:Fe2+ molar ratio of 200:1 and a pH of 4.0; the reaction rate increased rapidly from the starting point to the 7th hour, with a destruction kinetic rate (k) of 0.24 h-1. Approximately 96% removal of petroleum hydrocarbons was observed (initial total petroleum hydrocarbon (TPH) concentration = 70±7 g kg-1).
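Assuming the quoted destruction kinetic rate refers to first-order decay, the TPH concentration over time would follow

```latex
C(t) = C_0 \, e^{-kt}, \qquad k = 0.24\ \mathrm{h^{-1}}
```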

The Effect of Increment in Simulation Samples on a Combined Selection Procedure

Statistical selection procedures are used to select the best simulated system from a finite set of alternatives. In this paper, we present a procedure that can be used to select the best system when the number of alternatives is large. The proposed procedure combines Ranking and Selection with Ordinal Optimization. In order to improve the performance of Ordinal Optimization, the Optimal Computing Budget Allocation (OCBA) technique is used to determine the best simulation lengths for all simulated systems and to reduce the total computation time. We also examine the effect of increasing the number of simulation samples on the combined procedure. The numerical illustration clearly shows the effect of this increment on the proposed combined selection procedure.
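For reference, the classic OCBA allocation ratios used to set the simulation lengths can be sketched as follows; the example means and standard deviations are hypothetical, and ties among means are not handled in this sketch:

```python
import numpy as np

def ocba_allocation(means, stds, budget):
    """Classic OCBA ratios: each non-best design i receives budget
    proportional to (sigma_i / delta_{b,i})^2, and the best design b gets
    sigma_b * sqrt(sum_{i != b} N_i^2 / sigma_i^2); counts are then
    normalized to the total budget."""
    means, stds = np.asarray(means, float), np.asarray(stds, float)
    b = means.argmin()                        # best = smallest mean here
    delta = means - means[b]
    ratio = np.zeros_like(means)
    nb = delta != 0                           # mask of non-best designs
    ratio[nb] = (stds[nb] / delta[nb]) ** 2
    ratio[b] = stds[b] * np.sqrt(np.sum(ratio[nb] ** 2 / stds[nb] ** 2))
    return np.round(budget * ratio / ratio.sum()).astype(int)

print(ocba_allocation([1.0, 1.2, 1.5, 2.0], [0.3, 0.3, 0.4, 0.5], budget=1000))
```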

Numerical Solution of the Equations of Salt Diffusion into the Potato Tissues

Fick's second law equations for the unsteady-state diffusion of salt into potato tissue were solved numerically. The set of equations resulting from the implicit discretization was solved using the Thomas algorithm to find the salt concentration profiles in the solid phase. The required effective diffusivity and equilibrium distribution coefficient were determined experimentally. Cylindrical samples of potato were infused with aqueous NaCl solutions of 1-3% concentration, and the variation in the salt concentration of the brine was determined over time. Solute concentration profiles of the samples were determined by measuring the salt uptake of potato slices. For the studied conditions, the equilibrium distribution coefficients were found to depend on salt concentration, whereas the effective diffusivity was only slightly affected by brine concentration.
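A minimal sketch of the solver: the Thomas algorithm applied to the tridiagonal system produced by an implicit (backward Euler) discretization of 1-D Fickian diffusion. The planar geometry, grid, and parameter values below are illustrative placeholders, not the paper's cylindrical setup or measured diffusivity:

```python
import numpy as np

def thomas(a, b, c, d):
    """Thomas algorithm for a tridiagonal system: a = sub-, b = main-,
    c = super-diagonal, d = right-hand side.  O(n) forward elimination
    followed by back substitution."""
    n = len(d)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# One implicit step of u_t = D u_xx with Dirichlet boundaries held at the
# brine concentration (illustrative values).
n, D, dx, dt, c_brine = 50, 1e-9, 1e-4, 1.0, 2.0
r = D * dt / dx**2
u = np.zeros(n)
u[0] = u[-1] = c_brine
a = np.full(n, -r); b = np.full(n, 1 + 2 * r); c = np.full(n, -r)
a[0] = c[0] = 0.0;  b[0] = 1.0        # fixed boundary rows
a[-1] = c[-1] = 0.0; b[-1] = 1.0
u = thomas(a, b, c, u)
```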

Adaptive Anisotropic Diffusion for Ultrasonic Image Denoising and Edge Enhancement

Utilizing the echoic intensity and distribution of different organs and the local details of the human body, ultrasonic images can capture important pathological changes, but they are affected by ultrasonic speckle noise. A feature-preserving ultrasonic image denoising and edge enhancement scheme is put forth which includes two terms, anisotropic diffusion and edge enhancement, controlled by the optimum smoothing time. In this scheme, the anisotropic diffusion is governed by a local coordinate transformation and the first- and second-order normal derivatives of the image, while the edge enhancement is performed by the hyperbolic tangent function. Experiments on real ultrasonic images indicate that our scheme effectively preserves edges, local details, and echoic bright strips during denoising.
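For orientation, a classic Perona-Malik anisotropic diffusion step is sketched below as a baseline; the scheme described above adds a local coordinate transformation and tanh-based edge enhancement that are not reproduced here, and the parameter values are illustrative:

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, dt=0.2):
    """Classic Perona-Malik diffusion: smoothing is suppressed across
    strong gradients, so edges are preserved while speckle is averaged."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)    # edge-stopping conductivity
    for _ in range(n_iter):
        # Finite differences to the four neighbours.
        dN = np.roll(u, -1, 0) - u
        dS = np.roll(u, 1, 0) - u
        dE = np.roll(u, -1, 1) - u
        dW = np.roll(u, 1, 1) - u
        u += dt * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)
    return u
```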

The Effects of Detector Spacing on Travel Time Prediction on Freeways

Loop detectors report traffic characteristics in real time and are at the core of the traffic control process. Intuitively, one would expect that as the density of detection increases, so would the quality of the estimates derived from detector data. However, as detector deployment increases, so does the associated operating and maintenance cost. Traffic agencies therefore often need to decide where to add new detectors and which detectors should continue receiving maintenance, given their resource constraints. This paper evaluates the effect of detector spacing on freeway travel time estimation. A freeway section (Interstate-15) in the Salt Lake City metropolitan region is examined. The research reveals that travel time accuracy does not necessarily deteriorate with increased detector spacing; rather, the actual location of the detectors has far greater influence on the quality of travel time estimates. The study presents an innovative computational approach that delivers optimal detector locations through a Genetic Algorithm formulation.
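A hedged sketch of the GA search described above, with the travel-time estimation error abstracted into a user-supplied function and all GA settings invented for illustration:

```python
import random

def ga_detector_locations(candidates, k, tt_error, pop=40, gens=200, pm=0.2):
    """GA sketch: an individual is a set of k candidate detector locations;
    fitness is a user-supplied travel-time estimation error function."""
    def mutate(ind):
        ind = set(ind)
        ind.remove(random.choice(list(ind)))
        ind.add(random.choice([c for c in candidates if c not in ind]))
        return frozenset(ind)

    P = [frozenset(random.sample(candidates, k)) for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=lambda s: tt_error(sorted(s)))
        survivors = P[: pop // 2]              # truncation selection
        children = []
        while len(children) < pop - len(survivors):
            a, b = random.sample(survivors, 2)
            child = frozenset(random.sample(list(a | b), k))  # subset crossover
            if random.random() < pm:
                child = mutate(child)
            children.append(child)
        P = survivors + children
    return sorted(min(P, key=lambda s: tt_error(sorted(s))))
```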

Extended Well-Founded Semantics in Bilattices

One of the most used assumptions in logic programming and deductive databases is the so-called Closed World Assumption (CWA), according to which the atoms that cannot be inferred from a program are considered to be false (i.e., a pessimistic assumption). One of the most successful semantics for conventional logic programs based on the CWA is the well-founded semantics. However, the CWA is not applicable in all circumstances in which information is handled; that is, the well-founded semantics, if conventionally defined, would behave inadequately in various cases. The solution we adopt in this paper is to extend the well-founded semantics so that it can be based on other assumptions as well. The basis of (default) negative information in the well-founded semantics is given by the so-called unfounded sets. We extend this concept by considering optimistic, pessimistic, skeptical, and paraconsistent assumptions, used to complete missing information in a program. Our semantics, called the extended well-founded semantics, also expresses imperfect information, considered to be missing/incomplete, uncertain, and/or inconsistent, by using bilattices as multivalued logics. We provide a method for computing the extended well-founded semantics and show that the Kripke-Kleene semantics is captured by considering a skeptical assumption. We also show that the computation of our semantics takes polynomial time.
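Purely as an illustration of the truth space involved (the paper works over arbitrary bilattices), Belnap's four-valued bilattice FOUR can be encoded with pairs of evidence values, giving both the truth ordering and the knowledge ordering:

```python
# Belnap's FOUR: a value is (evidence-for, evidence-against).
FALSE, TRUE = (0, 1), (1, 0)
BOTTOM, TOP = (0, 0), (1, 1)     # unknown / inconsistent

def meet_t(x, y):   # conjunction in the truth ordering
    return (min(x[0], y[0]), max(x[1], y[1]))

def join_t(x, y):   # disjunction in the truth ordering
    return (max(x[0], y[0]), min(x[1], y[1]))

def neg(x):         # negation swaps evidence for and against
    return (x[1], x[0])

def join_k(x, y):   # knowledge-ordering join: accumulate evidence
    return (max(x[0], y[0]), max(x[1], y[1]))

assert meet_t(TRUE, FALSE) == FALSE
assert join_k(TRUE, FALSE) == TOP      # conflicting evidence is inconsistent
assert join_k(BOTTOM, TRUE) == TRUE    # learning resolves the unknown
```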

A Subjectively Influenced Router for Vehicles in a Four-Junction Traffic System

A subjectively influenced router for vehicles in a four-junction traffic system is presented. The router is based on a 3-layer Backpropagation Neural Network (BPNN) and a greedy routing procedure. The BPNN detects the priorities of vehicles based on subjective criteria; the subjective criteria and the routing procedure depend on the user's routing plan for the vehicles. The routing procedure selects vehicles from their junctions based on their priorities and routes them concurrently through the traffic system. That is, when the router is provided with a desired vehicle selection criterion and routing procedure, it routes vehicles with a reasonable junction clearing time. The cost evaluation of the router determines its efficiency. In the case of a routing conflict, the router routes the vehicles in consecutive order and quarantines faulty vehicles. The simulations presented indicate that this approach is an effective strategy for structuring a subjective vehicle router.