Low Energy Method for Data Delivery in Ubiquitous Network

Recent advances in wireless sensor networks have led to many routing methods designed for energy efficiency. Although many routing methods have been proposed for ubiquitous sensor networks (USN), no single routing method remains energy-efficient when the network environment varies. Rather than securing hosts one by one, a network security model can focus on controlling network access to the various hosts and the services they offer. When ubiquitous sensor networks are deployed in hostile environments, an adversary may compromise some sensor nodes and use them to inject false sensing reports. False reports lead not only to false alarms but also to the depletion of the limited energy resources of battery-powered networks; the interleaved hop-by-hop authentication scheme detects such false reports through interleaved authentication. This paper presents the LMDD (low energy method for data delivery) algorithm, which provides energy efficiency by dynamically changing the protocols installed at the sensor nodes. The algorithm switches protocols based on the output of fuzzy logic, which expresses the fitness of each protocol for the current environment.
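
As a rough illustration of the protocol-switching idea, the sketch below scores two hypothetical protocols with hand-written membership functions over assumed metrics (residual energy and node density) and switches only when another protocol is clearly fitter; the metrics, rule base, and hysteresis margin are assumptions, not the LMDD specification.

```python
# Minimal sketch of fuzzy protocol-fitness scoring (illustrative only).

def ramp_up(x, a, b):
    """Membership rising linearly from 0 at a to 1 at b."""
    return min(1.0, max(0.0, (x - a) / (b - a)))

def protocol_fitness(residual_energy, node_density):
    """Fuzzy fitness in [0, 1] of two hypothetical protocols (assumed rule base)."""
    high_energy = ramp_up(residual_energy, 0.2, 0.8)
    low_energy = 1.0 - high_energy
    dense = ramp_up(node_density, 0.3, 0.9)
    sparse = 1.0 - dense
    return {
        "clustering": min(high_energy, dense),      # suits dense, energy-rich networks
        "flat_multihop": max(low_energy, sparse),   # suits sparse or depleted networks
    }

def select_protocol(current, residual_energy, node_density, margin=0.1):
    """Switch only when another protocol is clearly fitter (hysteresis margin)."""
    fitness = protocol_fitness(residual_energy, node_density)
    best = max(fitness, key=fitness.get)
    return best if fitness[best] > fitness.get(current, 0.0) + margin else current

print(select_protocol("flat_multihop", residual_energy=0.9, node_density=0.8))
```

The hysteresis margin is there so that small fluctuations in the fuzzy fitness do not cause constant protocol thrashing at the nodes.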

Investigating the Determinants of Purchase Intention in C2C E-Commerce

This study examines the determinants of purchase intention in C2C e-commerce. Specifically, the role of instant messaging in the C2C e-commerce context is investigated. In addition to instant messaging, we brought in two antecedents of purchase intention, trust and customer satisfaction, to establish a theoretical research model. Structural equation modeling using LISREL was used to analyze the data. We discuss the research findings and suggest implications for researchers and practitioners.

Improvement of a Label Extraction Method for a Risk Search System

This paper proposes a method for improving the classification efficiency of a classification model. The model is used in a risk search system and extracts specific labels from articles posted on bulletin board sites, so that the system can analyze the important discussions formed by those articles. The improvement introduces ensemble learning, which combines multiple classification models, and incorporates expressions related to the specific labels into the generation of word vectors. The paper applies the improved method to articles collected from three bulletin board sites selected by users and verifies its effectiveness.
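
A minimal sketch of the ensemble idea with scikit-learn is given below, assuming TF-IDF word vectors augmented with label-related expressions and a hard-voting combination of three classifiers; the toy articles, labels, and model choices are illustrative, not the authors' pipeline.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB

# Hypothetical articles and risk labels; the label-related expressions appended
# to each document stand in for the paper's extra expression features.
texts = [
    "data leak discussed on the board leak breach",
    "routine maintenance announcement",
    "users report account takeover attempts takeover fraud",
    "new forum skin released",
]
labels = ["risk", "no_risk", "risk", "no_risk"]

vectorizer = TfidfVectorizer(ngram_range=(1, 2))
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("nb", MultinomialNB()),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ],
    voting="hard",  # majority vote over the individual models
)
model = make_pipeline(vectorizer, ensemble)
model.fit(texts, labels)
print(model.predict(["possible password breach mentioned in a thread"]))
```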

Modeling of Fluid Flow in 2D Triangular, Sinusoidal, and Square Corrugated Channels

The main focus of this work is the hydrodynamic and thermal analysis of plate heat exchanger channels with triangular, sinusoidal, and square corrugation patterns. The study numerically models and validates a triangular corrugated channel with dimensions and parameters taken from the open literature, and then models and analyzes sinusoidal and square corrugated channels with reference to the triangular model. Initially, a 2D model with an extensive local analysis of the triangular corrugated channel was carried out, in which the local pressure drop, wall shear stress, friction factor, static temperature, heat flux, Nusselt number, and surface heat transfer coefficient were analyzed to interpret the hydrodynamic and thermal phenomena occurring in the flow. To build confidence in this model, the predicted values were compared with experimental results from the literature for nearly the same case. Furthermore, a holistic numerical study of the sinusoidal and square channels, together with global comparisons with the triangular corrugation under the same conditions, was carried out. Finally, electric and fluid cooling were compared by varying the boundary condition: constant wall temperature and constant wall heat flux boundary conditions were employed, and the resulting differences in Nusselt number were explained. The results obtained can be used to arrive at an optimal design, a compromise between heat transfer and pressure drop.
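
The local quantities mentioned above can be post-processed from CFD output with standard definitions; the short sketch below shows one such set of assumed formulas (Darcy friction factor and Nusselt number via the local heat transfer coefficient) with invented numbers, not the values or exact expressions used in this study.

```python
# Post-processing sketch for channel CFD results (assumed formulas and data).

def darcy_friction_factor(dp, hyd_diameter, length, density, mean_velocity):
    """f = (dp * D_h) / (0.5 * rho * u^2 * L), with dp in Pa."""
    return dp * hyd_diameter / (0.5 * density * mean_velocity**2 * length)

def nusselt_number(wall_heat_flux, wall_temp, bulk_temp, hyd_diameter, conductivity):
    """Nu = h * D_h / k with h = q'' / (T_wall - T_bulk)."""
    h = wall_heat_flux / (wall_temp - bulk_temp)
    return h * hyd_diameter / conductivity

# Illustrative numbers only (water-like properties, arbitrary channel).
print(darcy_friction_factor(dp=250.0, hyd_diameter=0.01, length=0.3,
                            density=998.0, mean_velocity=0.5))
print(nusselt_number(wall_heat_flux=5000.0, wall_temp=330.0, bulk_temp=300.0,
                     hyd_diameter=0.01, conductivity=0.6))
```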

Innovation Strategy in Slovak Businesses

The aim of the paper is to develop, on the basis of a detailed analysis of the literature and of the authors' own research, a model for the development and implementation of innovation strategy in business. The paper presents the main results of the authors' survey of 462 respondents, which shows the current use of innovation strategy in Slovak enterprises. The research and analysis provided the basis for a model of developing and implementing an innovation strategy, which is explained in the paper in detail, step by step, with emphasis on the implementation process. Implementation of the innovation strategy is described by a separate model. The paper contains recommendations for the successful implementation of an innovation strategy; these recommendations should serve business managers as a valuable tool when implementing an innovation strategy.

Slug Tracking Simulation of Severe Slugging Experiments

Experimental data from an atmospheric air/water terrain slugging case have been made available by the Shell Amsterdam research center and have been subject to numerical simulation and comparison with a one-dimensional two-phase slug tracking simulator under development at the Norwegian University of Science and Technology. The code is based on tracking liquid slugs in pipelines using a Lagrangian grid formulation implemented in C++ with object-oriented techniques. An existing hybrid spatial discretization scheme is tested, in which the stratified regions are modelled by the two-fluid model. The slug regions are treated as incompressible, thus requiring a single momentum balance over the whole slug. Upon comparison with the experimental data, the period of the simulated severe slugging cycle is observed to be sensitive to slug generation in the horizontal parts of the system. Two different slug initiation methods have been tested with the slug tracking code, and grid dependency has been investigated.
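
A highly simplified sketch of the Lagrangian bookkeeping is shown below, assuming slugs are objects whose tail and front borders move with a prescribed translational velocity and merge when a front overtakes the following tail; the actual simulator's two-fluid stratified regions and incompressible slug momentum balance are not reproduced.

```python
# Schematic Lagrangian slug tracking: borders move with assumed velocities and
# overlapping slugs are merged. Illustrative bookkeeping only.
from dataclasses import dataclass

@dataclass
class Slug:
    tail: float      # upstream border position [m]
    front: float     # downstream border position [m]
    velocity: float  # assumed translational (bubble-nose) velocity [m/s]

def advance(slugs, dt):
    """Move slug borders one time step and merge overlapping slugs."""
    for s in slugs:
        s.tail += s.velocity * dt
        s.front += s.velocity * dt
    slugs.sort(key=lambda s: s.tail)
    merged = [slugs[0]]
    for s in slugs[1:]:
        if s.tail <= merged[-1].front:   # previous front has caught this tail
            merged[-1].front = max(merged[-1].front, s.front)
        else:
            merged.append(s)
    return merged

slugs = [Slug(0.0, 2.0, 1.2), Slug(2.5, 3.5, 0.8)]
for _ in range(10):
    slugs = advance(slugs, dt=0.5)
print([(round(s.tail, 2), round(s.front, 2)) for s in slugs])
```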

Study on the Effect of Road Infrastructure, Socio-Economic and Demographic Features on Road Crashes in Bangladesh

Road crashes not only claim lives and inflict injuries but also create an economic burden on society through lost productivity. The problem of deaths and injuries resulting from road traffic crashes is now acknowledged to be a global phenomenon, with authorities in virtually all countries concerned about the growth in the number of people killed and seriously injured on their roads. However, the road crash situation in a developing country such as Bangladesh is much worse than that of developed countries. To develop proper countermeasures, it is necessary to identify the factors affecting crash occurrence. The objective of the study is to examine the effect of district-wise road infrastructure, socio-economic, and demographic features on crash occurrence. The unit of analysis is the individual district, which has not been explored much in the past. Reported crash data obtained from the Bangladesh Road Transport Authority (BRTA) for the years 2004 to 2010 are used to develop a negative binomial model. The model results reveal the effect of road length (both paved and unpaved), road infrastructure, and several socio-economic characteristics on district-level crash frequency in Bangladesh.
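
A sketch of a district-level negative binomial crash-frequency model fitted with statsmodels is shown below; the covariate names and toy observations are assumptions standing in for the BRTA data.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical district-level observations (not BRTA data).
df = pd.DataFrame({
    "crashes":      [120, 45, 230, 80, 150, 60, 95, 40],
    "paved_km":     [850, 300, 1400, 500, 900, 350, 600, 280],
    "unpaved_km":   [400, 600, 300, 700, 450, 650, 500, 720],
    "population_k": [2500, 900, 5200, 1400, 3100, 1100, 1800, 800],
})

# Negative binomial GLM with a fixed dispersion parameter for the sketch.
model = smf.glm(
    "crashes ~ paved_km + unpaved_km + population_k",
    data=df,
    family=sm.families.NegativeBinomial(alpha=1.0),
).fit()
print(model.summary())
```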

Statistical Reliability Based Modeling of Series and Parallel Operating Systems using Extreme Value Theory

This paper presents a new method for computing the reliability of a system whose components are arranged in series or in parallel. In this method we estimate the life distribution function of the whole structure using the asymptotic Type I Extreme Value (EV), or Gumbel, distribution. We use the EV distribution in its minimum form to estimate the life distribution function of a series structure and in its maximum form for a parallel system. All parameters are estimated by the method of moments. The reliability function, the failure (hazard) rate, and the p-th percentile point of each function are determined. Other important indices, such as the Mean Time To Failure (MTTF) and the Mean Time To Repair (MTTR), are also computed for non-repairable and renewal systems in both series and parallel structures.
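
The quantities listed above can be evaluated directly from the fitted Gumbel distributions, as in the SciPy sketch below, which uses the minimum-type EV distribution for the series system and the maximum-type for the parallel system; the location and scale parameters are illustrative rather than moment estimates from real data.

```python
# Type I (Gumbel) extreme-value reliability quantities with SciPy.
from scipy.stats import gumbel_l, gumbel_r

loc, scale = 1000.0, 120.0   # assumed parameters (e.g. from the method of moments)

series_life = gumbel_l(loc, scale)    # minimum-type EV: series-system lifetime
parallel_life = gumbel_r(loc, scale)  # maximum-type EV: parallel-system lifetime

t = 900.0
for name, dist in [("series", series_life), ("parallel", parallel_life)]:
    reliability = dist.sf(t)            # R(t) = 1 - F(t)
    hazard = dist.pdf(t) / dist.sf(t)   # failure (hazard) rate
    p90 = dist.ppf(0.90)                # 90th percentile point
    mttf = dist.mean()                  # MTTF for a non-repairable system
    print(name, round(reliability, 4), round(hazard, 6), round(p90, 1), round(mttf, 1))
```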

A Novel Model for Simultaneously Minimising Costs and Risks in Just-in-Time Systems Using Multi-Backup Suppliers: Part 2 - Results

This paper applies the inventory model developed in Part 1 to a simplified problem in order to simultaneously reduce costs and risks in JIT systems. The model determines an optimal ordering strategy for procuring raw materials, using regular multi-external and local backup suppliers, to reduce the total cost of the products while at the same time reducing the risks that arise from this cost reduction within production systems. A comparison between the cost of using the JIT system and that of using the proposed inventory model shows the superiority of the inventory model.

Institutional Efficiency of Commonhold Industrial Parks Using a Polynomial Regression Model

Based on the assumptions of neo-classical economics and rational choice/public choice theory, this paper investigates the regulation of industrial land use in Taiwan by homeowners associations (HOAs) as opposed to traditional government administration. The comparison, which applies transaction cost theory and a polynomial regression analysis, showed that HOAs are superior to conventional government administration in terms of transaction costs and overall efficiency. A case study comparing Taiwan's commonhold industrial park, NangKang Software Park, with traditional government counterparts was analyzed using limited data on costs and returns. This empirical study of the relative efficiency of governmental and private institutions supports the theoretical proposition. Numerical results demonstrate the efficiency of the established model.
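
For illustration, a minimal polynomial regression fit of the kind used in the comparison could look like the NumPy sketch below; the governance-scale versus transaction-cost data points are invented and do not come from the case study.

```python
import numpy as np

# Hypothetical observations: governance scale x vs. relative transaction cost y.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([5.2, 3.9, 3.1, 2.8, 3.0, 3.6])

coeffs = np.polyfit(x, y, deg=2)   # fit y = a*x^2 + b*x + c
model = np.poly1d(coeffs)
print(coeffs)                      # estimated a, b, c
print(model(3.5))                  # predicted cost at an intermediate scale
```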

Using Radial Basis Function Neural Networks to Calibrate Water Quality Model

Modern management of water distribution systems (WDS) needs water quality models that can accurately predict the dynamics of water quality variations within the distribution system. Before water quality models can be applied to solve system problems, they must be calibrated. Although previous researchers have used genetic algorithm (GA) solvers to calibrate the relevant parameters, these are difficult to apply to medium- or large-scale real systems because of the long computational time. In this paper a new method is designed that combines a macro model and a detailed model to optimize the water quality parameters. The combined algorithm uses radial basis function (RBF) metamodeling as a surrogate to be optimized, in order to reduce the number of time-consuming water quality simulations, and enables rapid calibration of the pipe wall reaction coefficients of the chlorine model of a large-scale WDS. Two case studies show that the method is more efficient and promising and deserves to be generalized in the future.
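
The surrogate idea can be sketched as below with SciPy: a handful of expensive simulator runs are fitted with an RBF interpolator, and the cheap surrogate is then optimized in place of the simulator; the placeholder error function, sampling plan, and kernel are assumptions, not the paper's macro/detailed model combination.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def expensive_simulation(k_wall):
    """Placeholder for the chlorine water-quality model error vs. measurements."""
    return (k_wall - 0.35) ** 2 + 0.01 * np.sin(20 * k_wall)

# 1. Sample a handful of wall-reaction coefficients and run the full model.
samples = np.linspace(0.05, 0.8, 8).reshape(-1, 1)
errors = np.array([expensive_simulation(k[0]) for k in samples])

# 2. Build the RBF surrogate of the error surface.
surrogate = RBFInterpolator(samples, errors, kernel="thin_plate_spline")

# 3. Optimize the cheap surrogate instead of the simulator.
res = minimize(lambda k: surrogate(np.atleast_2d(k))[0], x0=[0.4],
               bounds=[(0.05, 0.8)])
print("calibrated wall reaction coefficient ~", res.x[0])
```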

Graphic Analysis of Genotype by Environment Interaction for Maize Hybrid Yield Using Site Regression Stability Model

Selection of maize (Zea mays) hybrids with wide adaptability across diverse farming environments is important before recommending them, in order to achieve a high rate of hybrid adoption. Grain yield of 14 maize hybrids, tested in a randomized complete block design with four replicates across 22 environments in Iran, was analyzed using the site regression (SREG) stability model. The biplot technique facilitates a visual evaluation of superior genotypes, which is useful for cultivar recommendation and mega-environment identification. The objectives of this study were (i) to identify suitable hybrids with both high mean performance and high stability and (ii) to determine mega-environments for maize production in Iran. Biplot analysis identified two mega-environments in this study. The first mega-environment included KRM, KSH, MGN, DZF A, KRJ, DRB, DZF B, SHZ B, and KHM, where the G10 hybrid was the best performer. The second mega-environment included ESF B, ESF A, and SHZ A, where the G4 hybrid was the best. According to the ideal-hybrid biplot, the G10 hybrid was better than all other hybrids, followed by the G1 and G3 hybrids; these were identified as the best hybrids, combining high grain yield with high yield stability. GGE biplot analysis provided a framework for identifying target testing locations that discriminate among genotypes and for selecting genotypes that are high yielding and stable.
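
The SREG/GGE decomposition behind the biplot amounts to an environment-centered singular value decomposition of the genotype-by-environment yield matrix, as in the sketch below with an invented small yield table; the scaling and data are illustrative only.

```python
import numpy as np

# rows = genotypes (G1..G4), columns = environments (E1..E3); hypothetical yields
yields = np.array([
    [6.1, 5.4, 7.0],
    [5.8, 5.9, 6.2],
    [6.5, 5.1, 7.4],
    [5.2, 6.0, 5.8],
])

centered = yields - yields.mean(axis=0, keepdims=True)   # remove environment means
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# Symmetric scaling of scores for the 2-D biplot (PC1, PC2).
genotype_scores = U[:, :2] * np.sqrt(S[:2])
environment_scores = Vt[:2, :].T * np.sqrt(S[:2])
explained = (S[:2] ** 2).sum() / (S ** 2).sum()

print(genotype_scores)
print(environment_scores)
print(f"variation explained by PC1+PC2: {explained:.1%}")
```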

Human Body Configuration using Bayesian Model

In this paper we present a novel approach to human body configuration based on the silhouette. We address this problem within a Bayesian framework and use an effective model-based MCMC (Markov Chain Monte Carlo) method to solve the configuration problem, in which the best configuration is defined as the MAP (maximum a posteriori) estimate in the Bayesian model. The model-based MCMC uses the human body model to drive the MCMC sampling over the solution space: it converts the original high-dimensional space into a restricted subspace constructed from the human model and uses a hybrid sampling algorithm. We choose an explicit human model and carefully select the likelihood functions to represent the best configuration solution. The experiments show that this method obtains accurate configurations in less time for different humans seen from multiple views.
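
A toy Metropolis-Hastings search for the MAP estimate is sketched below; the three-dimensional parameter vector, Gaussian prior, and stand-in likelihood are assumptions that only illustrate the sample-and-keep-the-best idea, not the silhouette likelihood or the model-driven hybrid sampler described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_posterior(theta):
    """Assumed log prior + log likelihood for a 3-D toy configuration."""
    log_prior = -0.5 * np.sum(theta ** 2)                          # standard normal prior
    log_lik = -0.5 * np.sum((theta - np.array([1.0, -0.5, 0.3])) ** 2) / 0.1
    return log_prior + log_lik

theta = np.zeros(3)
best_theta, best_lp = theta, log_posterior(theta)
for _ in range(5000):
    proposal = theta + 0.1 * rng.standard_normal(3)   # random-walk proposal
    lp_new, lp_old = log_posterior(proposal), log_posterior(theta)
    if np.log(rng.random()) < lp_new - lp_old:        # Metropolis accept/reject step
        theta = proposal
        if lp_new > best_lp:                          # keep the best sample as MAP estimate
            best_theta, best_lp = proposal, lp_new
print("MAP estimate ~", np.round(best_theta, 3))
```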

Advanced Neural Network Learning Applied to Pulping Modeling

This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to specify completely the structure of the model. Two different types of neural networks were applied to the pulping problem; in this investigation, three-layer feedforward neural networks trained with Preconditioned Conjugate Gradient (PCG) methods were used. Preconditioning improves convergence by lowering the condition number and increasing the clustering of the eigenvalues. The idea is to solve the modified problem M⁻¹Ax = M⁻¹b, where M is a positive-definite preconditioner closely related to A. We focused on PCG-based training methods originating from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves update (PCGF), with Polak-Ribiere update (PCGP), and with Powell-Beale restarts (PCGB). In the simulations the PCG methods proved robust against phenomena such as oscillations due to large step sizes.
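
The preconditioning idea, solving M⁻¹Ax = M⁻¹b instead of Ax = b, can be illustrated on a plain linear system as in the SciPy sketch below with a Jacobi (diagonal) preconditioner; the specific PCGF/PCGP/PCGB network-training updates are not reproduced.

```python
# Preconditioned conjugate gradient on a toy SPD system: solve M^-1 A x = M^-1 b.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, LinearOperator

n = 200
A = diags([-1.0, 4.0, -1.0], offsets=[-1, 0, 1], shape=(n, n)).tocsr()  # SPD matrix
b = np.ones(n)

# Jacobi preconditioner: M = diag(A), applied as M^-1 v.
inv_diag = 1.0 / A.diagonal()
M = LinearOperator((n, n), matvec=lambda v: inv_diag * v)

x, info = cg(A, b, M=M)
print("converged" if info == 0 else f"cg info = {info}",
      "residual =", np.linalg.norm(A @ x - b))
```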

A Framework of the Factors Affecting the Adoption of ICT for Physical Education

Physical education (PE) is still neglected in schools despite its academic, social, psychological, and health benefits. Based on the assumption that Information and Communication Technologies (ICTs) can contribute to the development of PE in schools, this study aims to design a model of the factors affecting the adoption of ICTs for PE in schools. The proposed model is based on a sound theoretical framework. It was designed following a literature review of technology adoption theories and of the factors affecting ICT adoption for physical education. The technology adoption model that best fitted all ICT adoption factors was then chosen as the basis for the proposed model. The Unified Theory of Acceptance and Use of Technology (UTAUT) was found to be the most adequate theoretical framework for modeling the factors affecting ICT adoption for physical education.

Location Update Cost Analysis of Mobile IPv6 Protocols

Mobile IP has been developed to provide continuous information network access to mobile users. In IP-based mobile networks, location management is an important component of mobility management that enables the system to track the location of a mobile node between consecutive communications. It includes two important tasks: location update and call delivery. Location update is associated with signaling load; frequent updates degrade the overall performance of the network and lead to under-utilization of resources. It is therefore necessary to devise mechanisms that minimize the update rate. Mobile IPv6 (MIPv6) and Hierarchical MIPv6 (HMIPv6) are the potential candidates for deployment in mobile IP networks for mobility management, and studies have shown HMIPv6 to perform better than MIPv6 because it reduces the signaling overhead by making the registration process local. In this paper, we present a performance analysis of MIPv6 and HMIPv6 using an analytical model. A location update cost function is formulated based on the fluid flow mobility model, and the impact of cell residence time, cell residence probability, and user mobility is investigated. Numerical results are obtained and presented in graphical form. It is shown that HMIPv6 outperforms MIPv6 only for high-mobility users; for low-mobility users, the performance of the two schemes is almost equivalent.
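
A back-of-the-envelope version of such a cost comparison is sketched below: the fluid-flow model gives a mean boundary-crossing rate, and per-crossing binding-update costs are summed for MIPv6 and HMIPv6; the crossing-rate formula, unit signaling costs, and cell/domain geometry are assumptions for illustration, not the paper's cost functions.

```python
import math

def crossing_rate(velocity, perimeter, area):
    """Fluid-flow model: mean boundary-crossing rate of a mobile node."""
    return velocity * perimeter / (math.pi * area)

def mipv6_cost(subnet_rate, c_ha=10.0, c_cn=6.0, n_cn=2):
    """Every subnet crossing triggers binding updates to the HA and all CNs (assumed costs)."""
    return subnet_rate * (c_ha + n_cn * c_cn)

def hmipv6_cost(subnet_rate, domain_rate, c_map=4.0, c_ha=10.0, c_cn=6.0, n_cn=2):
    """Intra-domain crossings update only the MAP; domain crossings reach HA/CNs."""
    return subnet_rate * c_map + domain_rate * (c_ha + n_cn * c_cn)

subnet_rate = crossing_rate(velocity=15.0, perimeter=4_000.0, area=1_000_000.0)
domain_rate = crossing_rate(velocity=15.0, perimeter=16_000.0, area=16_000_000.0)
print("MIPv6  cost/s:", round(mipv6_cost(subnet_rate), 3))
print("HMIPv6 cost/s:", round(hmipv6_cost(subnet_rate, domain_rate), 3))
```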

Measuring Relative Efficiency of Korean Construction Company using DEA/Window

The sub-prime mortgage crisis that began in the US is regarded as the worst economic crisis since the Great Depression of the early 20th century. Hidden problems in the efficient operation of businesses were disclosed all at once, and many financial institutions went bankrupt or filed for court receivership. The collapse of the physical market led to the bankruptcy of manufacturing and construction businesses. This study analyzes the dynamic efficiency of construction businesses over the five years around the global financial crisis. By uncovering the trend and stability of the efficiency of construction businesses, the study aims to improve management efficiency in the ever-changing construction market. Variables were selected by analyzing corporate information on the top 20 construction businesses in Korea, which were analyzed for static efficiency in 2008 and dynamic efficiency between 2006 and 2010. Unlike other studies, this study deduces the efficiency trend and stability of construction businesses over five years by using the DEA/Window model. From the analysis results, efficient and inefficient companies could be identified. In addition, the relative efficiency among DMUs was measured by comparing the relationship between the input and output variables of the construction businesses. This study can serve as a reference for improving the management efficiency of companies with low efficiency, based on the efficiency analysis of construction businesses.
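
For a single year, the relative efficiency of each DMU can be obtained from an input-oriented CCR linear program, as in the SciPy sketch below with invented two-input/one-output data for five firms; the window analysis over 2006-2010 is not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[120, 80, 150, 90, 110],      # input 1 (e.g. assets) per DMU
              [300, 200, 420, 260, 310]])   # input 2 (e.g. employees) per DMU
Y = np.array([[500, 380, 560, 400, 520]])   # output (e.g. revenue) per DMU
n = X.shape[1]

def ccr_efficiency(k):
    """Efficiency of DMU k: min theta s.t. X @ lam <= theta * x_k, Y @ lam >= y_k."""
    c = np.r_[1.0, np.zeros(n)]                               # variables: theta, lambda_1..n
    A_ub = np.vstack([np.c_[-X[:, [k]], X],                   # X @ lam - theta * x_k <= 0
                      np.c_[np.zeros((Y.shape[0], 1)), -Y]])  # -Y @ lam <= -y_k
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

print([round(ccr_efficiency(k), 3) for k in range(n)])
```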

Bi-Criteria Latency Optimization of Intra-and Inter-Autonomous System Traffic Engineering

Traffic Engineering (TE) is the process of controlling how traffic flows through a network in order to facilitate efficient and reliable network operations while simultaneously optimizing network resource utilization and traffic performance. TE improves the management of data traffic within a network and provides better utilization of network resources. Many research works consider intra-AS and inter-AS Traffic Engineering separately, but in reality one influences the other; hence the network performance of both intra- and inter-Autonomous System (AS) traffic is not optimized properly. To achieve a better joint optimization of intra- and inter-AS TE, we propose a joint optimization technique that considers intra-AS features during inter-AS TE and vice versa. This work focuses on an important criterion, latency, both within an AS and between ASes, and proposes a bi-criteria latency optimization model. Overall network performance, in terms of latency, can thus be improved by this joint optimization technique.
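
One simple way to scalarize the two latency criteria is a weighted sum over link attributes, as in the NetworkX sketch below; the topology, the intra- and inter-AS latency figures, and the weighting factor are assumptions used only to illustrate the trade-off.

```python
import networkx as nx

G = nx.Graph()
edges = [  # (u, v, intra-AS latency ms, inter-AS latency ms), invented topology
    ("A1", "A2", 5, 0), ("A2", "B1", 2, 20), ("A1", "B2", 3, 35),
    ("B1", "B2", 4, 0), ("B2", "C1", 2, 15), ("B1", "C1", 6, 25),
]
for u, v, intra, inter in edges:
    G.add_edge(u, v, intra=intra, inter=inter)

alpha = 0.5  # assumed relative importance of intra- vs. inter-AS latency

def combined(u, v, data):
    """Scalarized bi-criteria edge weight."""
    return alpha * data["intra"] + (1 - alpha) * data["inter"]

path = nx.shortest_path(G, "A1", "C1", weight=combined)
latency = nx.shortest_path_length(G, "A1", "C1", weight=combined)
print(path, "scalarized latency:", latency)
```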

Region Segmentation based on Gaussian Dirichlet Process Mixture Model and its Application to 3D Geometric Structure Detection

Image-based 3D scenes can now be found in many popular vision systems, computer games, and virtual reality tours, so it is important to segment the ROI (region of interest) from input scenes as a preprocessing step for geometric structure detection in a 3D scene. In this paper, we propose a method for segmenting the ROI based on tensor voting and a Dirichlet process mixture model. In particular, to estimate geometric structure information for a 3D scene from a single outdoor image, we apply tensor voting and the Dirichlet process mixture model to image segmentation. Tensor voting is used based on the fact that homogeneous regions in an image are usually close together on a smooth region, and therefore the tokens corresponding to the centers of these regions have high saliency values. The proposed approach is a novel nonparametric Bayesian segmentation method using a Gaussian Dirichlet process mixture model to automatically segment various natural scenes. Finally, our method labels regions of the input image into the coarse categories "ground", "sky", and "vertical" for 3D applications. The experimental results show that our method successfully segments coarse regions in many complex natural scene images for 3D.
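
The nonparametric clustering step can be approximated with scikit-learn's truncated variational Dirichlet-process mixture, as sketched below on a synthetic two-band image with (x, y, intensity) features; the feature choice and data are assumptions, and the tensor voting stage is omitted.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
h, w = 40, 60
image = np.vstack([np.full((20, w), 0.2), np.full((20, w), 0.8)])  # two synthetic bands
image += 0.05 * rng.standard_normal(image.shape)

# Per-pixel features: normalized position plus intensity.
ys, xs = np.mgrid[0:h, 0:w]
features = np.column_stack([xs.ravel() / w, ys.ravel() / h, image.ravel()])

dpgmm = BayesianGaussianMixture(
    n_components=8,                                  # truncation level
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    random_state=0,
).fit(features)

labels = dpgmm.predict(features).reshape(h, w)
print("regions found:", np.unique(labels).size)
```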

The Fundamental Reliance of Iterative Learning Control on Stability Robustness

Iterative learning control (ILC) aims to achieve zero tracking error of a specific command. This is accomplished by iteratively adjusting the command given to a feedback control system, based on the tracking error observed in the previous iteration. One would like the iterations to converge to zero tracking error in spite of any error present in the model used to design the learning law. First, this need for stability robustness is discussed, followed by the need for robustness of the property that the transients are well behaved. Methods of producing the needed robustness to parameter variations and to singular perturbations are presented. Then a method involving reverse-time runs is given that lets the real-world behavior produce the ILC gains in such a way as to eliminate the need for a mathematical model; since the real world is producing the gains, there is no issue of model error. Provided the world behaves linearly, the approach gives an ILC law with both stability robustness and good transient robustness, without the need to generate a model.
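
A minimal first-order ILC update of the form u_{k+1} = u_k + L e_k is sketched below on a toy first-order plant; the plant, learning gain, and reference are assumptions, and the robust and reverse-time designs discussed above are not reproduced.

```python
import numpy as np

a, b = 0.9, 0.5          # assumed plant: y[t+1] = a*y[t] + b*u[t]
N = 50                   # trial length
reference = np.sin(np.linspace(0, 2 * np.pi, N))
L_gain = 0.8 / b         # learning gain (convergence requires |1 - L*b| < 1)

def run_trial(u):
    """Simulate one trial of the plant from rest."""
    y = np.zeros(N)
    for t in range(N - 1):
        y[t + 1] = a * y[t] + b * u[t]
    return y

u = np.zeros(N)
for k in range(30):
    y = run_trial(u)
    e = reference - y
    u = u + L_gain * np.roll(e, -1)   # u[t] is corrected by e[t+1], since u[t] drives y[t+1]
    # (np.roll wraps the last sample; harmless for this illustrative run)
print("final RMS tracking error:", np.sqrt(np.mean((reference - run_trial(u)) ** 2)))
```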