Practical Applications and Connectivity Algorithms in Future Wireless Sensor Networks

Like any sentient organism, a smart environment relies first and foremost on sensory data captured from the real world. The sensory data come from sensor nodes of different modalities deployed at different locations, forming a Wireless Sensor Network (WSN). Embedding smart sensors in humans has been a research challenge due to the limitations these sensors impose, from restricted computational capability to limited power. In this paper, we first propose a practical WSN application that would enable blind people to see what their neighboring partners can see. The challenge is that the actual mapping from input images to brain patterns is highly complex and not well understood. We also study the connectivity problem in 3D/2D wireless sensor networks and propose efficient distributed algorithms to achieve the required connectivity of the system. We provide a new connectivity algorithm, CDCA, which connects disconnected parts of a network using cooperative diversity. Through simulations, we analyze the connectivity gains and energy savings provided by this novel form of cooperative diversity in WSNs.
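As a toy illustration of the kind of connectivity gain cooperative diversity can provide (not the paper's CDCA, whose details are not given here), the following Python sketch finds the directly connected clusters of a deployment and then tests whether one cluster can reach another when its members transmit cooperatively, modeled simply as summing their received powers against a decoding threshold; the path-loss exponent alpha and the threshold are assumed values.

    # Toy illustration (not the paper's CDCA): nodes connect directly when
    # within radio range r; a disconnected cluster may additionally be
    # bridged if several of its nodes transmit cooperatively, modeled here
    # as comparing the *sum* of their received powers against a threshold.
    import itertools, math

    def clusters(nodes, r):
        # Union-find over direct links (distance <= r).
        parent = list(range(len(nodes)))
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i
        for i, j in itertools.combinations(range(len(nodes)), 2):
            if math.dist(nodes[i], nodes[j]) <= r:
                parent[find(i)] = find(j)
        groups = {}
        for i in range(len(nodes)):
            groups.setdefault(find(i), []).append(i)
        return list(groups.values())

    def cooperative_link(nodes, senders, target, threshold, alpha=2.0):
        # Received power ~ d^-alpha per sender; cooperation sums contributions.
        power = sum(math.dist(nodes[s], nodes[target]) ** -alpha for s in senders)
        return power >= threshold

    nodes = [(0, 0), (1, 0), (1, 1), (5, 0), (6, 0)]
    parts = clusters(nodes, r=1.5)          # two disconnected clusters
    a, b = parts
    print(parts, cooperative_link(nodes, a, b[0], threshold=0.1))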

Detecting and Tracking Vehicles in Airborne Videos

In this work, we present an automatic vehicle detection system for airborne videos using combined features. We propose a pixel-wise classification method for vehicle detection using Dynamic Bayesian Networks. Although classification is performed pixel-wise, relations among neighboring pixels in a region are preserved in the feature extraction process. The main novelty of the detection scheme is that the extracted combined features comprise not only pixel-level but also region-level information. Tracking is then performed on the detected vehicles using an efficient Kalman filter with dynamic particle sampling. Experiments were conducted on a wide variety of airborne videos. The proposed framework assumes no prior information about camera height, orientation, or target object size. The results demonstrate the flexibility and good generalization ability of the proposed method on a challenging dataset.
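The tracking stage can be illustrated with a minimal constant-velocity Kalman filter; this is a generic sketch, not the paper's efficient Kalman filter with dynamic particle sampling, and the noise covariances Q and R are assumed values.

    # Minimal constant-velocity Kalman filter for 2-D vehicle tracks; a toy
    # stand-in for the paper's tracker (which adds dynamic particle sampling).
    import numpy as np

    dt = 1.0
    F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
    H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)   # observe position only
    Q = 0.01 * np.eye(4)                                # process noise (assumed)
    R = 1.0 * np.eye(2)                                 # measurement noise (assumed)

    x = np.zeros(4)           # state: [px, py, vx, vy]
    P = np.eye(4)
    for z in [np.array([1.0, 0.9]), np.array([2.1, 2.0]), np.array([2.9, 3.1])]:
        x, P = F @ x, F @ P @ F.T + Q                   # predict
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
        x = x + K @ (z - H @ x)                         # update
        P = (np.eye(4) - K @ H) @ P
        print(np.round(x, 2))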

Memory Effects in Randomly Perturbed Nematic Liquid Crystals

We study the typical domain size and configuration character of a randomly perturbed system exhibiting continuous symmetry breaking. As a model system we use rod-like objects within a cubic lattice interacting via a Lebwohl–Lasher-type interaction, describing their local direction with a headless unit director field. Examples of such systems are nematic liquid crystals (LCs) and nanotubes. We further introduce impurities of concentration p, which impose random anisotropy field-type disorder on the directors. We study the domain-type pattern of molecules as a function of p, the anchoring strength w between a neighboring director and an impurity, the temperature, and the history of the samples. In the simulations we quenched the directors from either a random or a homogeneous initial configuration. Our results show that the history of the system strongly influences: i) the average domain coherence length; and ii) the range of ordering in the system. In the random case the obtained order is always short ranged (SR). On the contrary, in the homogeneous case SR order is obtained only for strong enough anchoring and large enough concentration p; in the other cases the ordering is either quasi long range (QLR) or long range (LR). We further studied memory effects for the random initial configuration. With increasing external ordering field B, either QLR or LR is realized.
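A minimal sketch of the simulation ingredients, assuming a standard Metropolis scheme, a Lebwohl–Lasher nearest-neighbour coupling, and a random anisotropy term of strength w at impurity sites (lattice size, temperature, and step width are toy values):

    # Sketch of a Metropolis sweep for a Lebwohl-Lasher-type lattice of
    # headless unit directors n_i, with nearest-neighbour energy -P2(n_i . n_j)
    # and a random-anisotropy term -w (n_i . e_i)^2 at impurity sites.
    import numpy as np
    rng = np.random.default_rng(0)

    L, w, p, T = 6, 1.0, 0.1, 0.5
    n = rng.normal(size=(L, L, L, 3)); n /= np.linalg.norm(n, axis=-1, keepdims=True)
    imp = rng.random((L, L, L)) < p                      # impurity sites
    e = rng.normal(size=(L, L, L, 3)); e /= np.linalg.norm(e, axis=-1, keepdims=True)

    def site_energy(n, i, j, k):
        E = 0.0
        for d in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
            m = n[(i+d[0]) % L, (j+d[1]) % L, (k+d[2]) % L]
            c = n[i, j, k] @ m
            E -= 1.5 * c * c - 0.5                       # -P2(cos theta), headless
        if imp[i, j, k]:
            E -= w * (n[i, j, k] @ e[i, j, k]) ** 2      # random anisotropy field
        return E

    for _ in range(1000):                                # Metropolis updates
        i, j, k = rng.integers(L, size=3)
        old, E0 = n[i, j, k].copy(), site_energy(n, i, j, k)
        trial = old + 0.3 * rng.normal(size=3); trial /= np.linalg.norm(trial)
        n[i, j, k] = trial
        if rng.random() >= np.exp(min(0.0, -(site_energy(n, i, j, k) - E0) / T)):
            n[i, j, k] = old                             # reject the move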

Intra Prediction using Weighted Average of Pixel Values According to Prediction Direction

In this paper, we propose a method to reduce quantization error. In H.264/AVC, low-pass filtering is applied to the neighboring samples of the current block in order to reduce quantization error. However, this has the weakness that the low-pass filtering is performed regardless of the prediction direction, so it may not reduce quantization error effectively. The proposed method takes the prediction direction into account in the low-pass filtering and uses a threshold condition to reduce flag bits. Compared with the conventional method in H.264/AVC, the proposed method achieves an average bit-rate reduction of 1.534%, with bit-rate reductions between 0.580% and 3.567% across the experiments.
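The idea can be sketched as follows: the conventional filter smooths the reference samples with fixed [1,2,1]/4 weights for every mode, whereas a direction-aware variant chooses the weights per prediction direction. The weight choices below are hypothetical, for illustration only, and do not reproduce the H.264/AVC specification filter:

    # Illustrative sketch (not the H.264/AVC specification filter): smoothing
    # the reference samples with weights chosen per prediction direction,
    # instead of the direction-agnostic [1,2,1]/4 filter used for every mode.
    def smooth(ref, weights):
        wl, wc, wr = weights
        s = wl + wc + wr
        out = [ref[0]]
        for i in range(1, len(ref) - 1):
            out.append((wl * ref[i-1] + wc * ref[i] + wr * ref[i+1] + s // 2) // s)
        out.append(ref[-1])
        return out

    ref = [100, 104, 96, 120, 118, 90, 95, 99]
    print(smooth(ref, (1, 2, 1)))   # conventional low-pass, any direction
    print(smooth(ref, (0, 3, 1)))   # hypothetical weights biased toward the
                                    # neighbour along the prediction direction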

A Model for Analysis of the Induced Voltage of a 115 kV On-Line Acting on a Neighboring 22 kV Off-Line

This paper presents a model for analyzing the induced voltage of transmission lines (energized) acting on neighboring distribution lines (de-energized). Owing to environmental restrictions, 22 kV distribution lines need to be installed under 115 kV transmission lines. When the two circuits are installed in parallel like this, an induced voltage appears on the de-energized line that can harm operators. This work uses ATP-EMTP modeling to analyze the phenomenon before field testing. The simulation results are used to find solutions to prevent danger to operators working on the pole.
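For intuition (and only as a back-of-envelope check, not the ATP-EMTP model), the electrostatically induced voltage on a floating de-energized conductor follows a capacitive divider between the line-to-line and conductor-to-ground capacitances; the capacitance values below are purely illustrative:

    # Back-of-envelope capacitive-coupling estimate (not the ATP-EMTP model):
    # a de-energized conductor floating near an energized line takes a voltage
    # set by the capacitive divider between the coupling and ground
    # capacitances.  Values below are purely illustrative.
    V_line = 115e3 / 3 ** 0.5        # phase-to-ground voltage of the 115 kV line
    C_mutual = 2e-12                 # line-to-conductor coupling capacitance per metre
    C_ground = 8e-12                 # conductor-to-ground capacitance per metre
    V_induced = V_line * C_mutual / (C_mutual + C_ground)
    print(f"induced voltage ~ {V_induced/1e3:.1f} kV")   # ~13.3 kV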

An Edge Detection and Filtering Mechanism of Two Dimensional Digital Objects Based on Fuzzy Inference

The general idea behind the filter is to average a pixel using other pixel values from its neighborhood while simultaneously taking care of important image structures such as edges. The main concern of the proposed filter is to distinguish between variations of the captured digital image due to noise and those due to image structure. Edges give an image its appearance of depth and sharpness; a loss of edges makes the image appear blurred or unfocused. However, noise smoothing and edge enhancement are traditionally conflicting tasks: since most noise filtering behaves like a low-pass filter, blurring of edges and loss of detail are a natural consequence, and techniques that remedy this inherent conflict often generate new noise through the enhancement. In this work a new fuzzy filter is presented for the noise reduction of images corrupted with additive noise. The filter consists of three stages: (1) define fuzzy sets in the input space to compute a fuzzy derivative for eight different directions; (2) construct a set of IF-THEN rules to perform fuzzy smoothing according to the contributions of neighboring pixel values; and (3) define fuzzy sets in the output space to obtain the filtered image and the edge image. Experimental results are presented to show the feasibility of the proposed approach on two-dimensional objects.
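Stage (1) can be sketched as follows, computing a simple derivative toward each of the eight compass neighbours and mapping it to a triangular membership in a fuzzy set "small"; the membership width K is an assumed parameter:

    # Sketch of stage (1): gradient magnitudes toward the eight compass
    # neighbours of a pixel, mapped to a fuzzy membership "small".
    import numpy as np

    DIRS = [(-1,-1), (-1,0), (-1,1), (0,-1), (0,1), (1,-1), (1,0), (1,1)]

    def fuzzy_derivatives(img, y, x, K=30.0):
        memb = {}
        for dy, dx in DIRS:
            d = abs(float(img[y+dy, x+dx]) - float(img[y, x]))
            memb[(dy, dx)] = max(0.0, 1.0 - d / K)   # triangular "small" set
        return memb

    img = np.array([[10, 10, 200], [12, 11, 210], [9, 13, 205]], float)
    print(fuzzy_derivatives(img, 1, 1))   # low membership across the edge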

CT-based Monte Carlo Dose Calculations for Proton Therapy Using a New Interface Program

The purpose of this study is to introduce a new interface program for calculating dose distributions with the Monte Carlo method in complex heterogeneous systems, such as organs or tissues, in proton therapy. The interface program was developed in MATLAB and includes a friendly graphical user interface with several tools, such as image property adjustment and results display. A quadtree decomposition technique was used as the image segmentation algorithm to create optimal geometries from Computed Tomography (CT) images for proton-beam dose calculations. The technique yields a set of non-overlapping squares of different sizes in every image. In this way the segmentation resolution is high enough in and near heterogeneous areas to preserve the precision of the dose calculations, and low enough in homogeneous areas to directly reduce the number of cells. Furthermore, a cell-reduction algorithm can be used to combine neighboring cells of the same material. The method has been validated in two ways: first, against experimental data obtained with an 80 MeV proton beam at the Cyclotron and Radioisotope Center (CYRIC) of Tohoku University, and second, against data based on the polybinary tissue calibration method, also performed at CYRIC. These results are presented in this paper. The program can read the output file of the Monte Carlo code, let a region of interest be selected manually, and plot the proton-beam dose distribution superimposed onto the CT images.
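The segmentation step can be sketched as a classical quadtree split on a power-of-two image, where a square is subdivided while its intensity range exceeds a homogeneity tolerance; the tolerance and the toy two-material "slice" below are illustrative only, and the sketch is in Python rather than the MATLAB of the actual program:

    # Sketch of quadtree decomposition of a (power-of-two) CT slice: a square
    # is split while its intensity range exceeds a homogeneity tolerance,
    # giving small cells near tissue interfaces and large cells elsewhere.
    import numpy as np

    def quadtree(img, x, y, size, tol, leaves):
        block = img[y:y+size, x:x+size]
        if size == 1 or block.max() - block.min() <= tol:
            leaves.append((x, y, size))               # homogeneous cell
            return
        h = size // 2
        for ox, oy in ((0, 0), (h, 0), (0, h), (h, h)):
            quadtree(img, x + ox, y + oy, h, tol, leaves)

    img = np.zeros((8, 8)); img[:, 4:] = 100          # toy two-material "slice"
    leaves = []
    quadtree(img, 0, 0, 8, tol=10, leaves=leaves)
    print(len(leaves), leaves)                         # four homogeneous cells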

A Novel Fuzzy Technique for Image Noise Reduction

A new fuzzy filter is presented for the noise reduction of images corrupted with additive noise. The filter consists of two stages. In the first stage, all pixels of the image are processed to identify noisy pixels: a fuzzy rule-based system associates with each pixel a degree, a real number in the range [0,1] that expresses the likelihood that the pixel is not noisy. In the second stage, another fuzzy rule-based system uses the output of the first to perform fuzzy smoothing by weighting the contributions of neighboring pixel values. Experimental results are presented to show the feasibility of the proposed filter; these results are also compared to other filters by numerical measures and visual inspection.
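A minimal sketch of the two stages, with an assumed median-deviation heuristic standing in for the first rule base and a degree-weighted neighbourhood mean for the second:

    # Sketch of the two-stage idea: stage one assigns each pixel a degree in
    # [0, 1] of being noise-free (a simple median-deviation heuristic here);
    # stage two smooths each pixel as a degree-weighted mean of its neighbours.
    import numpy as np

    def degrees(img, K=40.0):
        pad = np.pad(img, 1, mode="edge")
        med = np.zeros_like(img, float)
        for y in range(img.shape[0]):
            for x in range(img.shape[1]):
                med[y, x] = np.median(pad[y:y+3, x:x+3])
        return np.clip(1.0 - np.abs(img - med) / K, 0.0, 1.0)

    def fuzzy_smooth(img):
        deg = degrees(img)
        padi, padd = np.pad(img, 1, mode="edge"), np.pad(deg, 1, mode="edge")
        out = np.empty_like(img, float)
        for y in range(img.shape[0]):
            for x in range(img.shape[1]):
                w, v = padd[y:y+3, x:x+3], padi[y:y+3, x:x+3]
                out[y, x] = (w * v).sum() / w.sum() if w.sum() > 0 else img[y, x]
        return out

    img = np.full((5, 5), 50.0); img[2, 2] = 255.0     # one impulse-like outlier
    print(fuzzy_smooth(img)[2, 2])                      # pulled back toward 50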

Hybrid Neuro-Fuzzy Approach for Automatic Generation Control of a Two-Area Interconnected Power System

The main objective of Automatic Generation Control (AGC) is to balance total system generation against system load and losses so that the desired frequency and the power interchange with neighboring systems are maintained. Any mismatch between generation and demand causes the system frequency to deviate from its nominal value, and a large frequency deviation may lead to system collapse. This necessitates a very fast and accurate controller to maintain the nominal system frequency. This paper presents a novel artificial intelligence (AI) technique, a Hybrid Neuro-Fuzzy (HNF) approach, for AGC. The advantage of this controller is that it can handle non-linearities while being faster than conventional controllers. The effectiveness of the proposed controller in increasing the damping of local and inter-area modes of oscillation is demonstrated on a two-area interconnected power system. The results show that the intelligent controller has an improved dynamic response and is at the same time faster than a conventional controller.
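For intuition about the control objective, the toy simulation below drives a one-area linearised frequency deviation back to zero with a fixed integral gain; the HNF controller of the paper would, in effect, replace this fixed gain with an adaptive neuro-fuzzy one. All parameter values are assumed:

    # Toy single-load-step AGC simulation (not the paper's HNF controller):
    # one-area linearised frequency response with an integral controller.
    M, D, Ki, dt = 10.0, 1.0, 0.5, 0.01      # inertia, damping, gain (assumed)
    df, ace_int, dPL = 0.0, 0.0, 0.2         # freq deviation, integrator, load step
    for step in range(3000):
        ace_int += Ki * (-df) * dt           # integral of the area control error
        dPm = ace_int                        # commanded generation change
        ddf = (dPm - dPL - D * df) / M       # swing-equation-like dynamics
        df += ddf * dt
    print(round(df, 4))                      # deviation driven back toward zero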

In Search of New Laws for a Gluten Kingdom

The enthusiasm for gluten avoidance in a growing market is met by improvements in sensitive detection methods for analysing gluten content. Paradoxically, manufacturers employ no such systems in the production process yet continue to market their products as gluten free, a significant risk posed to an undetermined coeliac population. This paper reconciles an immunological response that causes gastrointestinal scarring and villous atrophy with the conventional description of personal injury. The thesis delves into evaluating potential inadequacies of gluten labelling laws, which not only present a diagnostic challenge for general practitioners in the UK but also expose a less than adequate form of legal protection available to those who suffer adverse reactions as a result of gluten ingestion. Central to this discussion is whether a claim brought in misrepresentation, in negligence, and/or under the Consumer Protection Act 1987 could be sustained. An interesting comparison is then made with the legal regimes of neighboring jurisdictions, furthering the theme of a legally un-catered-for gluten kingdom.

Identification and Analysis of Binding Site Residues in Protein-Protein Complexes

We have developed an energy-based approach for identifying the binding sites and the residues important for binding in protein-protein complexes. We found that residues and residue pairs with charged and aromatic side chains are important for binding; these residues tend to form cation-π, electrostatic, and aromatic interactions. Our observations have been verified against the experimental binding specificity of protein-protein complexes and show good agreement with experiment. The analysis of surrounding hydrophobicity reveals that binding residues are less hydrophobic than non-binding sites, which suggests that the hydrophobic core is important for folding and stability whereas the surface-seeking residues play a critical role in binding. Further, the propensity of residues in the binding sites of receptors and ligands, the number of medium- and long-range contacts, and the influence of neighboring residues are discussed.
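The surrounding-hydrophobicity measure can be sketched as the sum of hydrophobicity indices of all residues whose C-alpha atoms lie within a cutoff (commonly 8 Å) of the residue considered; the index values below are an illustrative subset, not the scale used in the paper:

    # Sketch of "surrounding hydrophobicity": for each residue, sum the
    # hydrophobicity indices of all residues whose C-alpha atoms lie within
    # a cutoff.  Index values here are illustrative only.
    import math

    HYDRO = {"ALA": 0.87, "PHE": 2.87, "LYS": 1.64, "ASP": 0.66}  # toy subset

    def surrounding_hydrophobicity(residues, cutoff=8.0):
        out = []
        for i, (name_i, ca_i) in enumerate(residues):
            h = sum(HYDRO[name_j]
                    for j, (name_j, ca_j) in enumerate(residues)
                    if j != i and math.dist(ca_i, ca_j) <= cutoff)
            out.append((name_i, round(h, 2)))
        return out

    residues = [("ALA", (0, 0, 0)), ("PHE", (5, 0, 0)), ("LYS", (20, 0, 0))]
    print(surrounding_hydrophobicity(residues))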

Analysis of a TBM Tunneling Effect on Surface Subsidence: A Case Study from Tehran, Iran

The development and extension of large cities has induced a need for shallow tunnels in the soft ground of built-up areas. Estimating the ground settlement caused by tunnel excavation is an important engineering task. In this paper, the prediction of surface subsidence caused by tunneling in one section of line seven of the Tehran subway is considered. Based on the studied geotechnical conditions of the region, a tunnel of 26.9 km length has been excavated by a mechanized method using an EPB-TBM with a diameter of 9.14 m. Settlement is estimated using both analytical methods and the numerical finite element method. The numerical method gives a settlement of 5 cm in this section, while the analytical results (Bobet and Loganathan-Poulos) are 5.29 and 12.36 cm, respectively. According to the results of this study, and owing to the saturation of this section, there is good agreement between the Bobet and numerical methods. Therefore, the tunneling process in this section needs special consolidation measures and a support system before the passage of the tunnel boring machine.
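As a point of reference (a standard Gaussian, Peck-type settlement trough, not the Bobet or Loganathan-Poulos solutions used in the paper), the transverse surface settlement can be sketched as S(x) = S_max exp(-x^2 / (2 i^2)), with i the trough-width parameter; the values below are illustrative:

    # Illustrative Gaussian (Peck-type) transverse settlement trough: a
    # standard baseline, not the analytical solutions used in the paper.
    import math

    def settlement(x, s_max, i):
        return s_max * math.exp(-x * x / (2.0 * i * i))

    s_max, i = 0.05, 10.0                 # 5 cm at the centreline, i assumed
    for x in (0.0, 5.0, 10.0, 20.0):      # transverse offset from tunnel axis, m
        print(f"x = {x:5.1f} m   S = {settlement(x, s_max, i)*100:5.2f} cm")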

Hippocampus Segmentation using a Local Prior Model on its Boundary

Segmentation techniques based on Active Contour Models have benefited strongly from the use of prior information during their evolution. Shape prior information is captured from a training set and is introduced into the optimization procedure to restrict the evolution to allowable shapes; in this way, the evolution converges onto regions even with weak boundaries. Although significant effort has been devoted to different ways of capturing and analyzing prior information, very little thought has been given to the way of combining image information with prior information. This paper focuses on a more natural way of incorporating the prior information in the level set framework. As a proof of concept the method is applied to hippocampus segmentation in T1-MR images. Hippocampus segmentation is a very challenging task, due to the multivariate surrounding region and the missing boundary with the neighboring amygdala, whose intensities are identical. The proposed method mimics the way humans segment and thus improves segmentation accuracy.
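One common way a shape prior restricts evolution to allowable shapes is to project each evolved shape onto a PCA subspace learned from the training set; the sketch below illustrates that projection step with toy shape vectors and is not the paper's level-set formulation:

    # Minimal sketch of a shape prior constraining contour evolution: shapes
    # from a training set are reduced with PCA, and each evolution step
    # projects the current shape onto the learned subspace, keeping it
    # "allowable" (illustrative toy vectors, not the paper's method).
    import numpy as np

    train = np.array([[0., 0., 1., 1.], [0., 0.1, 1.1, 1.], [0., -0.1, 0.9, 1.]])
    mean = train.mean(0)
    U, S, Vt = np.linalg.svd(train - mean, full_matrices=False)
    basis = Vt[:2]                                   # top-2 shape modes

    def project_to_prior(shape):
        coeff = basis @ (shape - mean)
        return mean + basis.T @ coeff                # nearest allowable shape

    evolved = np.array([0.3, 0.0, 1.0, 1.2])         # a raw evolution step
    print(np.round(project_to_prior(evolved), 3))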

Dispersed Error Control based on Error Filter Design for Improving Halftone Image Quality

The error diffusion method generates worm artifacts and weakens the edges of the halftone image when a continuous gray-scale image is reproduced as a binary image. First, to enhance the edges, we propose an edge-enhancing filter that considers the quantization error information and the gradient of the neighboring pixels. Furthermore, to remove the worm artifacts that often appear in a halftone image, we adaptively add random noise to the weights of the error filter.
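The worm-removal idea can be sketched with classic Floyd-Steinberg error diffusion in which the (7,3,5,1)/16 weights are jittered per pixel and renormalised so the error stays conserved; the jitter scheme below is an assumed simple variant, not the paper's filter design:

    # Sketch of error diffusion with randomly perturbed filter weights: the
    # classic Floyd-Steinberg weights are jittered a little per pixel so the
    # quantisation error takes varying paths, one simple way to break up
    # worm artifacts (perturbation scheme assumed, not the paper's).
    import numpy as np
    rng = np.random.default_rng(0)

    def halftone(img, noise=0.1):
        f = img.astype(float).copy()
        out = np.zeros_like(f)
        h, w = f.shape
        for y in range(h):
            for x in range(w):
                out[y, x] = 255.0 if f[y, x] >= 128 else 0.0
                err = f[y, x] - out[y, x]
                wts = np.array([7, 3, 5, 1], float) / 16.0
                wts += rng.normal(0.0, noise, 4) * wts   # jitter the weights
                wts /= wts.sum()                          # keep error conserved
                for (dy, dx), wk in zip([(0, 1), (1, -1), (1, 0), (1, 1)], wts):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        f[yy, xx] += err * wk
        return out

    print(halftone(np.full((4, 8), 128.0)))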

Gabriel-constrained Parametric Surface Triangulation

The Boundary Representation of a 3D manifold contains FACES (connected subsets of a parametric surface S : R^2 → R^3). In many science and engineering applications it is cumbersome and algebraically difficult to deal with the polynomial set and constraints (LOOPs) representing the FACE. For this reason, a Piecewise Linear (PL) approximation of the FACE is needed, usually represented in terms of triangles (i.e. 2-simplices). Solving the FACE triangulation problem requires producing quality triangles which are: (i) independent of the arguments of S, (ii) sensitive to the local curvatures, (iii) compliant with the boundaries of the FACE, and (iv) topologically compatible with the triangles of the neighboring FACEs. The existing literature provides no guarantees for point (iii). This article contributes to the topic of triangulations conforming to the boundaries of the FACE by applying the concept of the parameter-independent Gabriel complex, which improves the correctness of the triangulation regarding aspects (iii) and (iv). In addition, the article applies the geometric concept of a ball tangent to the surface at a point to address points (i) and (ii). Additional research is needed on algorithms that (i) take advantage of the concepts presented in the proposed heuristic algorithm and (ii) can be proved correct.
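The Gabriel condition underlying the parameter-independent Gabriel complex can be stated compactly: an edge (p, q) is Gabriel if the ball having pq as its diameter contains no other sample point. A minimal test over 3D surface samples:

    # Sketch of the Gabriel test at the heart of a Gabriel complex: an edge
    # (p, q) is kept only if the ball whose diameter is pq contains no other
    # sample point.  Points here are 3-D surface samples.
    import math

    def is_gabriel_edge(p, q, points):
        c = tuple((a + b) / 2.0 for a, b in zip(p, q))   # ball centre
        r = math.dist(p, q) / 2.0                        # ball radius
        return all(math.dist(c, s) >= r for s in points if s not in (p, q))

    pts = [(0, 0, 0), (2, 0, 0), (1, 0.2, 0), (5, 5, 0)]
    print(is_gabriel_edge(pts[0], pts[1], pts))   # False: third point intrudes
    print(is_gabriel_edge(pts[0], pts[2], pts))   # True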

An Efficient and Optimized Multi Constrained Path Computation for Real Time Interactive Applications in Packet Switched Networks

Quality of Service (QoS) routing aims to find paths between source and destination that satisfy the QoS requirements while efficiently using the network resources and the underlying routing algorithm, i.e., to find low-cost paths that satisfy given QoS constraints. One of the key issues in providing end-to-end QoS guarantees in packet networks is determining a feasible path that satisfies a number of QoS constraints. We present an Optimized Multi-Constrained Routing (OMCR) algorithm for the computation of constrained paths for QoS routing in computer networks. OMCR applies the distance-vector approach to construct a shortest path for each destination with respect to a given optimization metric, from which a set of feasible paths is derived at each node. OMCR is able to find feasible paths as well as optimize the utilization of network resources. OMCR operates with the hop-by-hop, connectionless routing model of the IP Internet and does not create any loops while finding feasible paths. Nodes running OMCR need not maintain a global view of the network state, such as topology and resource information, and routing updates are sent only to neighboring nodes, whereas the counterpart link-state routing method depends on complete network state for constrained path computation and thus incurs excessive communication overhead.
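The flavour of the computation can be sketched with a Bellman-Ford-style relaxation that minimises cost while pruning paths that violate an additive delay bound; this is a simplification for illustration, not the OMCR algorithm itself:

    # Sketch of multi-constrained path search in the distance-vector spirit:
    # Bellman-Ford-style relaxation that minimises cost while pruning any
    # path violating an additive delay bound (a simplification of OMCR).
    def constrained_paths(edges, n, src, max_delay):
        # edges: (u, v, cost, delay); best[v] = (cost, delay, path)
        INF = float("inf")
        best = {v: (INF, INF, []) for v in range(n)}
        best[src] = (0.0, 0.0, [src])
        for _ in range(n - 1):                       # standard |V|-1 rounds
            for u, v, c, d in edges:
                cu, du, pu = best[u]
                if cu + c < best[v][0] and du + d <= max_delay:
                    best[v] = (cu + c, du + d, pu + [v])   # feasible, cheaper
        return best

    edges = [(0, 1, 1.0, 5.0), (1, 2, 1.0, 5.0), (0, 2, 5.0, 2.0)]
    print(constrained_paths(edges, 3, 0, max_delay=8.0)[2])  # direct link wins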

The Implicit Methods for the Study of Tolerance

Tolerance is a tool for achieving social cohesion, particularly among individuals and groups with different values. The aim is to study the characteristics of the ethnic tolerance of the inhabitants of Latvia. Ethnic tolerance is treated as a set of conscious and unconscious orientations of the individual in social interaction and inter-ethnic communication. The study uses empirical tools that allow the explicit and implicit levels of the emotional component of the ethnic tolerance of Latvia's residents to be identified. Explicit measurements were made using self-report techniques, which yielded indices of the ethnic tolerance and the ethnic identity of the participants. The implicit component was studied using methods based on the emotional priming effect. During the processing of the results, indicators of positive and negative implicit attitudes towards members of one's own and other ethnicities were calculated, as well as the explicit parameters of the ethnic tolerance and ethnic identity of Latvia's residents. The implicit measurements of the attitudes of neighboring ethnic groups towards each other showed a mutually negative attitude, whereas the explicit measurements indicate a neutral attitude. The data obtained contribute to further study of the ethnic tolerance of Latvia's residents.

Elastic-Plastic Contact Analysis of Single Layer Solid Rough Surface Model using FEM

Evaluation of the contact pressure and of surface and subsurface contact stresses is essential to understand the functional response of surface coatings; the contact behavior depends mainly on surface roughness, material properties, layer thickness, and the manner of loading. Contact parameter evaluation of real rough-surface contacts mostly relies on statistical single-asperity contact approaches. In this work, a three-dimensional layered solid rough surface in contact with a rigid flat is modeled and analyzed using the finite element method. The rough surface of the layered solid is generated by an FFT approach. The generated rough surface is exported to the FEM-based ANSYS package, in which bottom-up solid modeling is employed to create a deformable solid model with a layered rough surface on top; the discretization and contact analysis are carried out with the same package. Unlike in many other contact models, the elastic, elastoplastic, and plastic deformations are continuous in the present finite element method. The Young's modulus to yield strength ratio of the layer is varied in the present work to observe its effect on the contact parameters, while keeping the surface roughness and substrate material properties constant. The contacting asperities attain elastic, elastoplastic, and plastic states continuously, and asperity interaction phenomena are inherently included. The resulting contact parameters show that neighboring-asperity interaction and the Young's modulus to yield strength ratio of the layer influence the bulk deformation and consequently affect the interface strength.
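The FFT approach to rough surface generation can be sketched as filtering white noise in the frequency domain with a Gaussian spectrum to impose a correlation length, then rescaling to a target RMS roughness; the spectrum and parameters below are assumed for illustration:

    # Sketch of FFT-based rough surface generation: filter white noise in the
    # frequency domain with a Gaussian spectrum to impose a correlation
    # length, then scale to a target RMS roughness (parameters assumed).
    import numpy as np
    rng = np.random.default_rng(1)

    def rough_surface(n=64, corr_len=8.0, sigma=1.0):
        noise = rng.normal(size=(n, n))
        fx = np.fft.fftfreq(n)[:, None]
        fy = np.fft.fftfreq(n)[None, :]
        H = np.exp(-(fx**2 + fy**2) * (np.pi * corr_len) ** 2)   # Gaussian filter
        z = np.real(np.fft.ifft2(np.fft.fft2(noise) * H))
        z *= sigma / z.std()                       # scale to target RMS roughness
        return z - z.mean()

    z = rough_surface()
    print(round(z.std(), 3), z.shape)              # 1.0 (64, 64)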

Current Situation and Possible Solutions of Acid Rain in South Korea

Environmental statistics reveal that acid rain pollution in South Korea is a serious issue, yet public awareness is low. Even after a gradual decrease of pollutant emissions in Korea, the acidity has not been reduced. The atmosphere has no boundaries, and the influence of neighboring countries such as China is apparent. Governmental efforts among China, Japan and Korea have been made on this issue; however, not much progress has been observed. Along with the governmental activities, therefore, active monitoring of the pollution among the countries and the promotion of environmental awareness at the civil level, especially in middle and high schools, are highly recommended. It is this young generation, not the current one, who will inherit the damaged country.