Clustering in WSN Based on Minimum Spanning Tree Using Divide and Conquer Approach

Due to severe energy constraints in WSNs, clustering is an efficient way to manage the energy of the sensors. Many methods have already been proposed in the area of clustering, and research is still ongoing to make clustering more energy efficient. In this paper we propose minimum spanning tree (MST) based clustering using a divide and conquer approach. MST based clustering was first proposed in the 1970s for large databases. Here we take the divide and conquer approach and implement it for wireless sensor networks, subject to the constraints attached to sensor networks. The divide and conquer approach is implemented in such a way that we do not have to construct the whole MST before clustering: we find an edge that will be part of the MST of the corresponding graph and, if that edge can be removed based on certain constraints, split the graph into clusters at that point, thereby saving a great deal of computation.
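
The sketch below illustrates the general idea of MST based clustering by edge removal: clusters emerge when candidate MST edges that violate a constraint (here, a maximum edge length) are cut. It is a minimal illustration rather than the paper's divide and conquer variant, and the sensor coordinates, threshold, and union-find helpers are assumptions made for the example.

```python
# Minimal sketch of MST-based clustering by edge removal (classic approach);
# the paper's divide-and-conquer variant, which prunes edges before the full
# MST is built, is not reproduced here.
import math
from itertools import combinations

def mst_clusters(nodes, max_edge_len):
    """nodes: list of (x, y) sensor positions; edges longer than
    max_edge_len are treated as cluster boundaries and never merged."""
    parent = list(range(len(nodes)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    # Kruskal: process candidate MST edges in order of increasing length.
    edges = sorted(
        (math.dist(nodes[a], nodes[b]), a, b)
        for a, b in combinations(range(len(nodes)), 2)
    )
    for length, a, b in edges:
        if length > max_edge_len:
            break                            # this MST edge is "cut": keep components separate
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb                  # edge joins two clusters

    clusters = {}
    for i in range(len(nodes)):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())

if __name__ == "__main__":
    sensors = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10)]
    print(mst_clusters(sensors, max_edge_len=3.0))   # -> [[0, 1, 2], [3, 4]]
```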

Markov Game Controller Design Algorithms

Markov games are a generalization of the Markov decision process to a multi-agent setting. The two-player zero-sum Markov game framework offers an effective platform for designing robust controllers. This paper presents two novel controller design algorithms that use ideas from the game theory literature to produce reliable controllers that are able to maintain performance in the presence of noise and parameter variations. A more widely used approach for controller design is H∞ optimal control, which suffers from high computational demand and may at times be infeasible. Our approach generates an optimal control policy for the agent (controller) via a simple linear program, enabling the controller to learn about the unknown environment. The controller faces an unknown environment; in our formulation this environment corresponds to the behavior rules of the noise, which is modeled as the opponent. The proposed controller architectures attempt to improve controller reliability by gradually mixing algorithmic approaches drawn from the game theory literature with the Minimax-Q Markov game solution approach, in a reinforcement learning framework. We test the proposed algorithms on a simulated inverted pendulum swing-up task and compare their performance against standard Q-learning.
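
As a hedged illustration of the linear program mentioned above, the sketch below solves the per-state matrix game used in Minimax-Q style approaches: the controller's mixed policy maximizes its worst-case value against the opponent (the noise). The Q matrix is an illustrative placeholder, not a value function learned on the inverted pendulum task from the paper.

```python
# Hedged sketch: per-state matrix game solved by linear programming, as in
# Minimax-Q, to obtain the controller's policy against a worst-case opponent.
import numpy as np
from scipy.optimize import linprog

def minimax_policy(Q):
    """Q[a, o]: agent's payoff for agent action a vs. opponent action o.
    Returns (policy over agent actions, game value) from the LP:
        max_{v, pi} v  s.t.  sum_a pi[a] * Q[a, o] >= v  for every o,
                             sum_a pi[a] = 1,  pi >= 0.
    """
    n_a, n_o = Q.shape
    # Decision variables x = [pi_1 ... pi_n_a, v]; linprog minimizes, so use -v.
    c = np.zeros(n_a + 1)
    c[-1] = -1.0
    # v - sum_a pi[a] * Q[a, o] <= 0 for each opponent action o.
    A_ub = np.hstack([-Q.T, np.ones((n_o, 1))])
    b_ub = np.zeros(n_o)
    # Probabilities sum to one.
    A_eq = np.append(np.ones(n_a), 0.0).reshape(1, -1)
    b_eq = np.array([1.0])
    bounds = [(0.0, 1.0)] * n_a + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:n_a], res.x[-1]

if __name__ == "__main__":
    # Matching-pennies-like payoff: the optimal policy mixes 50/50, value 0.
    Q = np.array([[1.0, -1.0], [-1.0, 1.0]])
    pi, value = minimax_policy(Q)
    print(pi, value)
```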

RadMote: A Mobile Framework for Radiation Monitoring in Nuclear Power Plants

Wireless Sensor Networks (WSNs) have attracted the attention of many researchers. This has resulted in their rapid integration into very different areas such as precision agriculture, environmental monitoring, object and event detection, and military surveillance. Given current WSN characteristics, this technology is particularly useful in industrial settings where security, reliability, and autonomy are essential, such as nuclear power plants, chemical plants, and others. In this paper we present a system based on WSNs to monitor environmental conditions around and inside a nuclear power plant, specifically radiation levels. Sensor nodes equipped with radiation sensors are deployed in fixed positions throughout the plant. In addition, plant staff are equipped with mobile devices with higher capabilities than the sensors, such as PDAs, which are able to monitor radiation levels and other conditions around them. The system enables communication between the PDAs, which form a mobile ad hoc network (MANET), and allows workers to monitor remote conditions in the plant. It is particularly useful during stoppage periods for inspection or in the event of an accident, to prevent risk situations.

A Case Study on Appearance Based Feature Extraction Techniques and Their Susceptibility to Image Degradations for the Task of Face Recognition

Over the past decades, automatic face recognition has become a highly active research area, mainly due to the countless application possibilities in both the private and the public sector. Numerous algorithms have been proposed in the literature to cope with the problem of face recognition; nevertheless, a group of methods commonly referred to as appearance based has emerged as the dominant solution to the face recognition problem. Many comparative studies concerned with the performance of appearance based methods have already been presented in the literature, not rarely with inconclusive and often contradictory results. No consensus has been reached within the scientific community regarding the relative ranking of the efficiency of appearance based methods for the face recognition task, let alone regarding their susceptibility to appearance changes induced by various environmental factors. To tackle these open issues, this paper assesses the performance of the three dominant appearance based methods, principal component analysis, linear discriminant analysis, and independent component analysis, and compares them on an equal footing (i.e., with the same preprocessing procedure, with parameters optimized for the best possible performance, etc.) in face verification experiments on the publicly available XM2VTS database. In addition to the comparative analysis on the XM2VTS database, ten degraded versions of the database are also employed in the experiments to evaluate the susceptibility of the appearance based methods to various image degradations which can occur in "real-life" operating conditions. Our experimental results suggest that linear discriminant analysis ensures the most consistent verification rates across the tested databases.
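
As a small illustration of the appearance based pipeline compared in the paper, the sketch below projects vectorized face images onto a PCA (eigenface) subspace and makes a distance-based verification decision. The random data, subspace dimension, and threshold are assumptions; the paper's preprocessing and the XM2VTS verification protocol are not reproduced.

```python
# Minimal eigenface-style PCA projection sketch; the random matrix stands in
# for vectorized face images, and the distance threshold is illustrative.
import numpy as np

def pca_fit(X, n_components):
    """X: (n_samples, n_pixels) matrix of vectorized face images."""
    mean = X.mean(axis=0)
    Xc = X - mean
    # SVD of the centered data gives the principal axes directly.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return mean, Vt[:n_components]            # (mean face, projection basis)

def pca_project(x, mean, basis):
    return basis @ (x - mean)                  # low-dimensional feature vector

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    train = rng.random((50, 32 * 32))          # 50 fake 32x32 "faces"
    mean, basis = pca_fit(train, n_components=20)
    probe, gallery = train[0], train[1]
    # Verification decision: compare feature-space distance to a threshold.
    d = np.linalg.norm(pca_project(probe, mean, basis) -
                       pca_project(gallery, mean, basis))
    print("accept" if d < 5.0 else "reject")
```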

Effect of Pretreatment Method on the Content of Phenolic Compounds, Vitamin C and Antioxidant Activity of Dried Dill

Dill contains a range of phytochemicals, such as vitamin C and polyphenols, which significantly contribute to its total antioxidant activity. The aim of the current research was to determine the best blanching method for processing dill prior to microwave vacuum drying, based on the content of phenolic compounds, vitamin C, and free radical scavenging activity. Two blanching media were used, water and steam, and for part of the samples a microwave pretreatment was additionally applied. Vitamin C content, phenolic content, and DPPH˙ radical scavenging activity in the dried dill were evaluated. Blanching affected all tested parameters, and the blanching conditions proved to be very important. After evaluation of the results, blanching at 90 °C for 30 seconds was established as the best pretreatment method for dill.

Operational Risks Classification for Information Systems with Service-Oriented Architecture (Including Loss Calculation Example)

This article presents the results of a study conducted to identify operational risks for information systems (IS) with service-oriented architecture (SOA). Analysis of current approaches to risk and system error classification revealed that system error classes have never been used for SOA risk estimation. Additionally, system error classes are not normally supported experimentally with real enterprise error data. In the study, several categories from existing error classification systems are applied, and three new error categories with sub-categories are identified. As a part of operational risk, a new error classification scheme is proposed for SOA applications. It is based on errors of real information systems that act as service providers for an application with service-oriented architecture. The proposed classification approach has been used to classify SOA system errors for two different enterprises (oil and gas industry, metal and mining industry). In addition, we have conducted research to identify possible losses from operational risks.
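
As an illustration of the kind of loss calculation the title refers to, the sketch below computes an expected annual loss per error class as occurrence rate times average loss per incident. The error classes, rates, and monetary figures are invented for the example and do not come from the two enterprises studied.

```python
# Illustrative expected-loss arithmetic: annual loss expectancy per error
# class = incidents per year x average loss per incident. All figures are
# placeholders, not data from the study.
error_classes = {
    # class: (incidents per year, average loss per incident)
    "service interface error": (12, 1_500.0),
    "data mapping error":      (4,  9_000.0),
    "infrastructure outage":   (1, 40_000.0),
}
total = 0.0
for name, (rate, loss) in error_classes.items():
    ale = rate * loss
    total += ale
    print(f"{name}: expected annual loss {ale:,.0f}")
print(f"total expected annual loss {total:,.0f}")
```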

Integrated Subset Split for Balancing Network Utilization and Quality of Routing

The overlay approach has been widely used by many service providers for Traffic Engineering (TE) in large Internet backbones. In the overlay approach, logical connections are set up between edge nodes to form a full-mesh virtual network on top of the physical topology. IP routing is then run over the virtual network. Traffic engineering objectives are achieved by carefully routing logical connections over the physical links. Although the overlay approach has been implemented in many operational networks, it has a number of well-known scaling issues. This paper proposes a new approach to achieve traffic engineering without full-mesh overlaying, with the help of an integrated approach and an equal subset split method. Traffic engineering needs to determine the optimal routing of traffic over the existing network infrastructure by efficiently allocating resources in order to optimize traffic performance on an IP network. Even though constraint-based routing [1] in Multi-Protocol Label Switching (MPLS) has been developed to address this need, it is not widely tested or debugged, so Internet Service Providers (ISPs) resort to TE methods under Open Shortest Path First (OSPF), the most commonly used intra-domain routing protocol. Determining OSPF link weights for optimal network performance is an NP-hard problem. As it is not feasible to solve this problem exactly, we present a subset split method that improves efficiency and performance by minimizing the maximum link utilization in the network via a small number of link weight modifications. The results of this method are compared against the results of the MPLS architecture [9] and other heuristic methods.
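
To make the optimization target concrete, the sketch below computes the maximum link utilization of a toy network under shortest-path (OSPF style) routing for a given set of link weights, and shows how a single weight change can reroute a demand. The topology, capacities, and demand are assumptions, and the paper's subset split heuristic itself is not reproduced.

```python
# Illustrative sketch of the traffic-engineering objective: maximum link
# utilization under shortest-path routing for given link weights.
import networkx as nx

def max_link_utilization(G, demands):
    """G: directed graph with 'weight' and 'capacity' on each edge.
    demands: {(src, dst): traffic volume}. Routes every demand on its
    shortest path and returns the worst load/capacity ratio."""
    load = {e: 0.0 for e in G.edges}
    for (s, t), volume in demands.items():
        path = nx.shortest_path(G, s, t, weight="weight")
        for u, v in zip(path, path[1:]):
            load[(u, v)] += volume
    return max(load[e] / G.edges[e]["capacity"] for e in G.edges)

if __name__ == "__main__":
    G = nx.DiGraph()
    G.add_edge("A", "B", weight=1, capacity=10)
    G.add_edge("B", "C", weight=1, capacity=5)
    G.add_edge("A", "C", weight=3, capacity=10)
    demands = {("A", "C"): 4.0}
    print(max_link_utilization(G, demands))   # 0.8: demand rides A->B->C, stressing B->C
    G.edges["A", "B"]["weight"] = 5           # a single weight change reroutes the demand
    print(max_link_utilization(G, demands))   # 0.4: demand now takes the direct A->C link
```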

Architectural, Technological and Performance Issues in Enterprise Applications

Enterprise applications are complex systems that are hard to develop and deploy in organizations. Although software application development tools, frameworks, methodologies, and patterns are developing rapidly, many projects fail, incurring large costs. There are challenging issues that programmers and designers face while working on enterprise applications. In this paper, we present three of the most significant issues: architectural, technological, and performance issues. The important subjects in each area are pointed out and recommendations are given. Under architectural issues, the lifecycle, meta-architecture, and guidelines are discussed. The .NET and Java EE platforms are presented under technological issues. The importance of performance, performance measurement, and profilers are explained under performance issues.

Machine Scoring Model Using Data Mining Techniques

This article proposes a methodology for computer numerical control (CNC) machine scoring. The case study company is a manufacturer of hard disk drive parts in Thailand. In this company, samples of parts manufactured on CNC machines are taken randomly for quality inspection. These inspection data are used to decide whether to shut down a machine if it shows a tendency to produce parts that are out of specification. A large amount of data is produced in this process, and data mining can be a very useful technique for analyzing it. In this research, data mining techniques were used to construct a machine scoring model called the 'machine priority assessment model (MPAM)'. This model helps to ensure that machines with a higher risk of producing defective parts are inspected before those with a lower risk. If defect-prone machines are identified sooner, defective parts and rework can be reduced, thereby improving overall productivity. The results showed that the proposed method can be successfully implemented and that approximately 351,000 baht of opportunity cost could have been saved in the case study company.
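
The sketch below shows, under stated assumptions, a scoring step in the spirit of MPAM: a classifier trained on historical inspection data estimates each machine's probability of producing defective parts, and machines are ranked for inspection by that score. The features, classifier choice, and synthetic data are illustrative, not the paper's model.

```python
# Hedged sketch of a machine-priority scoring step: rank machines by their
# estimated defect probability so the riskiest machines are inspected first.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Synthetic inspection records: [mean deviation from nominal, rework count].
X = rng.random((200, 2))
y = (X[:, 0] + 0.3 * X[:, 1] + 0.1 * rng.standard_normal(200) > 0.9).astype(int)
model = LogisticRegression().fit(X, y)

# Current snapshot of three machines; score = predicted defect probability.
machines = {"CNC-01": [0.2, 0.1], "CNC-02": [0.9, 0.7], "CNC-03": [0.5, 0.4]}
scores = {m: model.predict_proba([f])[0, 1] for m, f in machines.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: inspection priority score {score:.2f}")
```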

Strategies for Developing e-LMS for Tanzania Secondary Schools

Tanzanian secondary schools in rural areas are geographically and socially isolated, and hence face a number of problems in obtaining learning materials, resulting in poor performance in national examinations. E-learning, defined as the use of information and communication technology (ICT) to support educational processes, has motivated Tanzania to apply ICT in its education system. There have been efforts to improve secondary school education using ICT through several projects. ICT for e-learning in Tanzanian rural secondary schools is one of the research projects conceived by the University of Dar-es-Salaam through its College of Engineering and Technology. The main objective of the project is to develop a tool that enables ICT to support rural secondary schools. The project is comprehensive, with a number of components, one being the development of an e-learning management system (e-LMS) for Tanzania secondary schools. This paper presents strategies for developing the e-LMS. It shows the importance of integrating an action research methodology with the modeling methods presented by model-driven architecture (MDA) and the usefulness of the Unified Modeling Language (UML) for modeling. The benefits of MDA are combined with development based on the software development life cycle (SDLC) process, from the analysis and requirements phase through the design and implementation stages, as employed by the object-oriented system analysis and design approach. The paper also explains the reuse of open source code from open source learning platforms for the context-sensitive development of the e-LMS for Tanzania secondary schools.

Novel Anti-leukemia Calanone Compounds by Quantitative Structure-Activity Relationship AM1 Semiempirical Method

A Quantitative Structure-Activity Relationship (QSAR) study for discovering novel, more active Calanone derivatives as anti-leukemia compounds has been conducted. The experimental activities of six Calanone compounds against the leukemia cell line L1210 were used as the material of the research. Calculation of the theoretical predictors (independent variables) was performed with the AM1 semiempirical method. The QSAR equation was determined by Principal Component Regression (PCR) analysis, with log IC50 as the dependent variable and atomic net charges, dipole moment (μ), and the n-octanol/water partition coefficient (log P) as independent variables. Three novel Calanone derivatives obtained in this research have higher activity against the L1210 leukemia cell line than pure Calanone.
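
The sketch below shows a principal component regression fit of the kind used to build such a QSAR equation: the descriptor matrix is standardized, reduced to principal components, and regressed against log IC50. The descriptor values are synthetic placeholders rather than the AM1-calculated data for the six Calanone compounds.

```python
# Illustrative principal component regression (PCR) fit: regress log IC50 on
# principal components of a descriptor matrix (e.g. atomic net charges,
# dipole moment, log P). The numbers below are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.standard_normal((6, 5))      # 6 compounds x 5 descriptors (placeholder)
log_ic50 = X @ np.array([0.8, -0.5, 0.3, 0.0, 0.1]) + 0.05 * rng.standard_normal(6)

pcr = make_pipeline(StandardScaler(), PCA(n_components=3), LinearRegression())
pcr.fit(X, log_ic50)

# Predict the activity of a hypothetical new derivative.
new_descriptor = rng.standard_normal((1, 5))
print("predicted log IC50:", pcr.predict(new_descriptor)[0])
```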

A Review on WEB Resources in Teaching of Geotechnical Engineering

The use of computer hardware and software in education and training dates to the early 1940s, when American researchers developed flight simulators that used analog computers to generate simulated onboard instrument data. Computer software is widely used to help engineers and undergraduate students solve their problems quickly and more accurately. This paper presents a list of computer software used in geotechnical engineering.

Limitations of the Analytic Hierarchy Process Technique with Respect to Geographically Distributed Stakeholders

The selection of appropriate requirements for product releases can make a big difference in a product's success. The selection of requirements is done using different requirements prioritization techniques. These techniques are based on pre-defined and systematic steps to calculate the requirements' relative weights. Prioritization is complicated by new development settings, shifting from traditional co-located development to geographically distributed development. Stakeholders connected to a project may be distributed all over the world. This geographical distribution of stakeholders makes it hard to prioritize requirements, as each stakeholder has their own perception of and expectations for the requirements in a software project. This paper discusses limitations of the Analytic Hierarchy Process (AHP) with respect to geographically distributed stakeholders' (GDS) prioritization of requirements. It also provides a solution, in the form of a modified AHP, for prioritizing requirements for GDS. We conduct two experiments and analyze the results in order to discuss the limitations of AHP with respect to GDS. The modified AHP variant is also validated in this paper.
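
For reference, the sketch below shows the standard AHP step that the modified variant builds on: deriving requirement priorities from one stakeholder's pairwise comparison matrix via its principal eigenvector, together with a consistency check. Aggregating judgements from geographically distributed stakeholders, the focus of the paper, is not shown; the 3x3 matrix is an illustrative example.

```python
# Standard AHP priority derivation for a single stakeholder: principal
# eigenvector of the pairwise comparison matrix plus consistency ratio.
import numpy as np

def ahp_priorities(A):
    """A[i, j]: how much more important requirement i is than j (Saaty scale)."""
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # priority vector
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # Saaty's random index
    return w, ci / ri                              # priorities, consistency ratio

if __name__ == "__main__":
    # R1 vs R2 vs R3 pairwise judgements from one stakeholder.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3., 1.0, 2.0],
                  [1/5., 1/2., 1.0]])
    w, cr = ahp_priorities(A)
    print("priorities:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```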

Research on Applying the Continuity of Care Document to Generate a Medical Record with Entry Level

Transferring patient information between medical care sites is necessary to deliver better patient care and to reduce medical costs, so the development of electronic medical records is an important worldwide trend. The Continuity of Care Document (CCD) is a product of collaboration between the CDA and CCR standards. In this study, we develop a system to generate entry-level medical records based on the CCD template module.

Development System for Emotion Detection Based on Brain Signals and Facial Images

Detection of human emotions has many potential applications. One application is to quantify audience attentiveness in order to evaluate the acoustic quality of a concert hall. Subjective audio preferences reported by the audience are used. To obtain a fair evaluation of acoustic quality, this research proposes a system for multimodal emotion detection: one modality is based on brain signals measured using electroencephalography (EEG), and the second modality is sequences of facial images. In the experiment, a customized audio signal consisting of normal and disordered sounds was used. The audio signal was played in order to stimulate positive or negative emotional feedback from the volunteers. EEG signals from the temporal lobes (electrodes T3 and T4) were used to measure the brain response, and sequences of facial images were used to monitor facial expressions while the volunteers listened to the audio signal. From the EEG signal, features were extracted from changes in the brain waves, particularly in the alpha and beta bands. Facial expression features were extracted based on analysis of motion images. We implement an advanced optical flow method to detect the most active facial muscles as the expression changes from neutral to another emotion, represented as vector flow maps. To reduce the complexity of detecting the emotional state, the vector flow maps are transformed into a compass mapping that represents the major directions and velocities of facial movement. The results showed that beta wave power increases when the disordered sound stimulus is given, although each volunteer gave different emotional feedback. Based on the features derived from the facial images, the optical flow compass mapping is promising as additional information for making decisions about emotional feedback.
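
As a hedged sketch of the EEG feature step, the code below estimates alpha (8-13 Hz) and beta (13-30 Hz) band power for a single channel with Welch's method. The synthetic signal and sampling rate are assumptions; the actual recording setup and the facial optical-flow branch of the system are not reproduced.

```python
# Relative alpha/beta band power for one EEG channel (e.g. T3), estimated
# with Welch's method on a synthetic signal.
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, lo, hi):
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    band = (freqs >= lo) & (freqs < hi)
    return psd[band].sum() * (freqs[1] - freqs[0])   # integrate PSD over the band

if __name__ == "__main__":
    fs = 128                                   # assumed sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)
    # Synthetic "T3" trace: a 10 Hz (alpha) and a 20 Hz (beta) component plus noise.
    eeg = (np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
           + 0.2 * np.random.default_rng(0).standard_normal(t.size))
    alpha = band_power(eeg, fs, 8, 13)
    beta = band_power(eeg, fs, 13, 30)
    print("alpha/beta power ratio:", round(alpha / beta, 2))
```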

Towards an Understanding of How Information Technology Enables Innovation – The Innovators' Perceptions

This research attempts to explore gaps in the Information Systems (IS) and innovation literatures by developing a model of Information Technology (IT) capability in enabling innovation. The research was conducted using semi-structured interviews with six innovators in business consulting, financial, healthcare, and academic organizations. The interview results suggest four elements of IT-enabled innovation capability, namely information (the ability to capture ideas and knowledge), connectivity (the ability to bridge geographical boundaries and mobilize human resources), communication (the ability to attain and engage in relationships between human resources), and transformation (the ability to change functions and process integrations), in defining an IT-enabled innovation platform. The results also suggest innovators' roles and IT capabilities.

Challenges to Enable Quick Start of an Environmental Monitoring with Wireless Sensor Network Technology

With the advancement of wireless sensor network technology, its practical utilization is becoming an important challenge. This paper gives an overview of our past environmental monitoring project and discusses the process of starting the monitoring by classifying it into four steps. The steps to start environmental monitoring can be complicated, but they are not well discussed by researchers in wireless sensor network technology. This paper describes our activities and challenges in each of the four steps to ease the process, and discusses future challenges in enabling a quick start of environmental monitoring.

Wheat Bran Carbohydrates as Substrate for Bifidobacterium lactis Development

The present study addresses problems and solutions related to the production of new functional foods. Wheat (Triticum aestivum L.) bran obtained from the industrial mill company “Dobeles dzirnavieks” was investigated as a raw material providing nutrients for Bifidobacterium lactis Bb-12. Enzymatic hydrolysis of wheat bran starch was carried out with α-amylase from Bacillus amyloliquefaciens (Sigma Aldrich). Viscozyme L (Sigma Aldrich) was used to release reducing sugars. Bifidobacterium lactis Bb-12 (Probio-Tec®, CHR Hansen) was cultivated in the enzymatically hydrolysed wheat bran mash. All procedures ensured that the number of active Bifidobacterium lactis Bb-12 in the final product reached 10⁵ CFU g⁻¹. After the enzymatic and bacterial fermentations, samples were freeze-dried for analysis of chemical compounds. All experiments were performed at the Faculty of Food Technology of the Latvia University of Agriculture from January to March 2013. The obtained results show that both types of wheat bran (enzymatically treated and non-treated) influenced the fermentative activity and the number of viable Bifidobacterium lactis Bb-12 in the wheat bran mash. Acidity increased strongly during wheat bran mash fermentation. The main objective of this work was to create a low-energy, functional, enzymatically and bacterially treated food from wheat bran using enzymatic hydrolysis of carbohydrates and subsequent cultivation of Bifidobacterium lactis Bb-12.

Matching-Based Cercospora Leaf Spot Detection in Sugar Beet

In this paper, we propose a robust disease detection method, called adaptive orientation code matching (Adaptive OCM), which is developed from a robust image registration algorithm, orientation code matching (OCM), to achieve continuous and site-specific detection of changes in plant disease. We use a two-stage framework to realize our research purpose. In the first stage, adaptive OCM is employed; it not only enables continuous and site-specific observation of disease development, but also shows excellent robustness when searching for non-rigid plant objects under changes in scene illumination, translation, small rotation, and occlusion. In the second stage, a machine learning method, a support vector machine (SVM) based on a two-dimensional (2D) xy-color histogram feature, is used for pixel-wise disease classification and quantification. The indoor experimental results demonstrate the feasibility and potential of the proposed algorithm, which could be implemented in real field situations for better observation of plant disease development.
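
As an illustration of the second-stage classifier, the sketch below converts RGB pixels to xy chromaticity coordinates and labels them as diseased or healthy with an SVM. The training pixels and labels are synthetic stand-ins rather than Cercospora leaf spot data, and the paper's 2D histogram binning is reduced to raw per-pixel chromaticity for brevity.

```python
# Hedged sketch of pixel-wise classification on xy chromaticity features
# with an SVM; synthetic greenish/brownish pixels stand in for real data.
import numpy as np
from sklearn.svm import SVC

def rgb_to_xy(rgb):
    """rgb: (N, 3) array in [0, 1]; returns (N, 2) chromaticity coordinates."""
    # sRGB -> CIE XYZ (linear approximation, D65 primaries).
    M = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = rgb @ M.T
    s = xyz.sum(axis=1, keepdims=True) + 1e-9
    return xyz[:, :2] / s                      # (x, y) chromaticity

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    healthy = rng.normal([0.2, 0.6, 0.2], 0.05, (100, 3))   # greenish pixels
    diseased = rng.normal([0.5, 0.4, 0.2], 0.05, (100, 3))  # brownish spots
    X = rgb_to_xy(np.clip(np.vstack([healthy, diseased]), 0, 1))
    y = np.array([0] * 100 + [1] * 100)
    clf = SVC(kernel="rbf").fit(X, y)
    test_pixel = rgb_to_xy(np.array([[0.45, 0.38, 0.2]]))
    print("diseased" if clf.predict(test_pixel)[0] == 1 else "healthy")
```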

Corporate Culture and Innovation: Implications for Reward Systems

Continuous innovation is becoming a necessity if firms want to stay competitive. Different factors influence the rate of innovation in a firm, among which corporate culture has often been recognized as one of the most important. In this paper we argue that the development of a corporate culture that supports and fosters innovation must be accompanied by an appropriate reward system. A study conducted among Croatian firms showed that a statistically significant relationship exists between a corporate culture that supports innovation and reward system features.