Human Verification in a Video Surveillance System Using Statistical Features

A human verification system is presented in this paper. The system consists of several steps: background subtraction, thresholding, line connection, region growing, morphology, star skeletonization, feature extraction, feature matching, and decision making. The proposed system combines the advantages of star skeletonization with simple statistical features. Correlation matching and probability voting are used for verification, followed by a logical operation in the decision-making stage. The proposed system uses a small number of features, and its reliability is convincing.
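The first two stages of such a pipeline (background subtraction and thresholding) can be sketched in a few lines of NumPy; the frame contents and the threshold of 30 below are invented for illustration and are not the authors' configuration.

```python
import numpy as np

def subtract_and_threshold(frame, background, thresh=30):
    """Absolute-difference background subtraction followed by binary
    thresholding; returns a foreground mask (1 = foreground)."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return (diff > thresh).astype(np.uint8)

# Toy 4x4 frame: a bright "object" appears in the lower-right corner.
background = np.full((4, 4), 50, dtype=np.uint8)
frame = background.copy()
frame[2:, 2:] = 200
mask = subtract_and_threshold(frame, background)
```

In the full system this mask would then be cleaned up by the line-connection, region-growing, and morphology stages before skeletonization.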

A New Time Dependent, High Temperature Analytical Model for the Single-electron Box in Digital Applications

Several models have been introduced so far for the single-electron box (SEB), all of which were restricted to the DC response and/or the low-temperature limit. In this paper we introduce, for the first time, a time-dependent, high-temperature analytical model for the SEB. The DC behavior of the introduced model is verified against the SIMON simulator, and its time-domain behavior is verified against a recently published paper on the step response of the SEB.

Identification of Wideband Sources Using Higher Order Statistics in Noisy Environment

This paper deals with the localization of wideband sources. We develop a new approach for estimating the parameters of wideband sources. The method is based on higher-order statistics of the recorded data, which eliminate the Gaussian components from the signals received on the various hydrophones; the sea-bottom noise is in fact regarded as Gaussian. Thanks to the coherent signal-subspace algorithm based on the cumulant matrix of the received data, instead of the cross-spectral matrix, the correlated wideband sources are accurately located even in a very noisy environment. We demonstrate the performance of the proposed algorithm on real data recorded during an underwater acoustics experiment.
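The reason fourth-order cumulants suppress Gaussian noise can be demonstrated numerically: the fourth-order auto-cumulant of a Gaussian process vanishes, while that of a non-Gaussian source does not. The sketch below is a generic illustration of this property, not the paper's coherent signal-subspace estimator.

```python
import numpy as np

def fourth_cumulant(x):
    """Fourth-order auto-cumulant c4 = E[x^4] - 3 E[x^2]^2 for zero-mean x;
    it is identically zero for Gaussian signals."""
    x = x - x.mean()
    return np.mean(x ** 4) - 3 * np.mean(x ** 2) ** 2

rng = np.random.default_rng(0)
n = 200_000
gaussian_noise = rng.standard_normal(n)          # models the sea-bottom noise
bpsk_source = rng.choice([-1.0, 1.0], size=n)    # a non-Gaussian source

c4_noise = fourth_cumulant(gaussian_noise)   # near 0
c4_source = fourth_cumulant(bpsk_source)     # near -2 (theoretical value)
```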

Challenges to Technological Advancement in Economically Weak Countries: An Assessment of the Nigerian Educational Situation

Nigeria is considered one of the many countries in sub-Saharan Africa with a weak economy and gross deficiencies in technology and engineering. Available data from international monitoring and regulatory organizations show that technology is pivotal in determining the economic strength of nations all over the world. Education is critical to technology acquisition, development, dissemination and adaptation. Thus, this paper seeks to critically assess and discuss the issues and challenges facing technological advancement in Nigeria, particularly in the education sector, and also proffers solutions to resuscitate the Nigerian education system towards achieving national technological and economic sustainability, such that Nigeria can compete favourably with other technologically driven economies of the world in the not-too-distant future.

FIR Filter Design via Linear Complementarity Problem, Messy Genetic Algorithm, and Ising Messy Genetic Algorithm

In this paper the design of maximally flat linear-phase finite impulse response (FIR) filters is considered. The problem is handled with two entirely different approaches. The first is a completely deterministic numerical approach, in which the problem is formulated as a Linear Complementarity Problem (LCP). The other is based on combining a Markov Random Field (MRF) approach with the messy genetic algorithm (MGA). MRFs are a class of probabilistic models that have been applied for many years to the analysis of visual patterns and textures. Our objective is to establish MRFs as an interesting approach to modeling messy genetic algorithms. We establish a theoretical result that every genetic algorithm problem can be characterized in terms of an MRF model. This allows us to construct an explicit probabilistic model of the MGA fitness function and to introduce the Ising MGA. Experiments with the Ising MGA are less costly than those with the standard MGA, since far fewer computations are involved; the LCP requires the fewest computations of all. Results of the LCP, random search, random seeded search, MGA, and Ising MGA are discussed.
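For readers unfamiliar with the LCP formulation, the generic problem is: given M and q, find z >= 0 with w = Mz + q >= 0 and z·w = 0. The brute-force solver below (enumeration of complementary index sets, practical only for small n) is a didactic sketch; the paper's filter-design LCP and its solver are not reproduced here, and the M and q values are invented.

```python
import itertools
import numpy as np

def solve_lcp_enum(M, q, tol=1e-9):
    """Solve the LCP (w = M z + q, z >= 0, w >= 0, z.w = 0) by trying
    every complementary index set; exponential in n, illustration only."""
    n = len(q)
    for pattern in itertools.product([False, True], repeat=n):
        active = [i for i in range(n) if pattern[i]]   # z_i > 0 here, so w_i = 0
        z = np.zeros(n)
        if active:
            try:
                z[active] = np.linalg.solve(M[np.ix_(active, active)], -q[active])
            except np.linalg.LinAlgError:
                continue
        w = M @ z + q
        if (z >= -tol).all() and (w >= -tol).all():
            return z, w
    return None

# Small positive-definite example (invented numbers).
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-5.0, -6.0])
z, w = solve_lcp_enum(M, q)
```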

A Sub-Pixel Image Registration Technique with Applications to Defect Detection

This paper presents a useful sub-pixel image registration method using line segments and a sub-pixel edge detector. In this approach, straight line segments are first extracted from gray images at the pixel level before applying the sub-pixel edge detector. Next, all sub-pixel line edges are mapped onto the orientation-distance parameter space to solve for line correspondences between images. Finally, the registration parameters with sub-pixel accuracy are solved analytically via two linear least-squares problems. The present approach can be applied to various fields where fast registration with sub-pixel accuracy is required. To illustrate, it is applied to the inspection of printed circuits on a flat panel. A numerical example shows that the present approach is effective and accurate when target images contain a sufficient number of line segments, which is true in many industrial problems.
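In the orientation-distance (theta, rho) parameterization, a rigid transform with rotation phi and translation (tx, ty) maps a line to theta' = theta + phi and rho' = rho + tx*cos(theta') + ty*sin(theta'), which makes the final estimation linear. The sketch below illustrates this relation on synthetic numbers; the paper's actual pair of least-squares problems and its correspondence step are not reproduced, and all values are invented.

```python
import numpy as np

def register_from_lines(theta, rho, theta_p, rho_p):
    """Estimate rotation phi and translation (tx, ty) from matched line
    parameters, using theta' = theta + phi and
    rho' = rho + tx*cos(theta') + ty*sin(theta')."""
    phi = float(np.mean(theta_p - theta))        # no angle-wrap handling here
    A = np.column_stack([np.cos(theta_p), np.sin(theta_p)])
    t, *_ = np.linalg.lstsq(A, rho_p - rho, rcond=None)
    return phi, t

# Synthetic check: three lines under a known transform (invented values).
theta = np.array([0.1, 0.9, 1.7])
rho = np.array([5.0, -2.0, 3.0])
phi_true, tx_true, ty_true = 0.05, 1.5, -0.7
theta_p = theta + phi_true
rho_p = rho + tx_true * np.cos(theta_p) + ty_true * np.sin(theta_p)
phi_est, t_est = register_from_lines(theta, rho, theta_p, rho_p)
```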

Analytical Model Prediction: Micro-Cutting Tool Forces with the Effect of Friction on Machining Titanium Alloy (Ti-6Al-4V)

In this paper, a methodology for predicting tool forces in oblique machining is introduced by adopting the orthogonal cutting technique. The analytical calculation is mostly based on the Devries model, and parts of the methodology are adopted from the Armarego-Brown model. Model validation is performed by comparing experimental data with the predicted results for machining titanium alloy (Ti-6Al-4V) from a micro-cutting tool perspective. Good agreement with the experiments is observed. A detailed friction model affecting the tool forces has also been examined, with reasonable results obtained.
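Neither the Devries nor the Armarego-Brown formulation is reproduced here; as a generic textbook stand-in, the classical Merchant orthogonal-cutting relations below show the kind of closed-form force prediction involved. All numbers are illustrative, not measured Ti-6Al-4V data.

```python
import math

def merchant_cutting_force(tau_s, width, feed, alpha, beta):
    """Merchant's orthogonal-cutting force estimate.
    tau_s: shear strength [MPa]; width, feed: [mm];
    alpha: rake angle, beta: friction angle [rad]. Returns Fc [N]."""
    phi = math.pi / 4 - (beta - alpha) / 2            # Merchant shear-angle relation
    shear_force = tau_s * width * feed / math.sin(phi)
    return shear_force * math.cos(beta - alpha) / math.cos(phi + beta - alpha)

# Illustrative parameter values only.
Fc = merchant_cutting_force(tau_s=600.0, width=2.0, feed=0.1,
                            alpha=math.radians(5), beta=math.radians(30))
```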

Customer Knowledge and Service Development, the Web 2.0 Role in Co-production

The paper is concerned with the relationships between SSME and ICTs and focuses on the role of Web 2.0 tools in the service development process. The research presented aims at exploring how collaborative technologies can support and improve service processes, highlighting customer centrality and value co-production. The core idea of the paper is the centrality of user participation, with collaborative technologies as enabling factors; Wikipedia is analyzed as an example. The result of this analysis is the identification and description of a pattern characterising specific services in which users, by means of web tools, collaborate as value co-producers during the service process. The pattern of collaborative co-production across several categories of services, including knowledge-based services, is then discussed.

Genetic Algorithm Parameters Optimization for Bi-Criteria Multiprocessor Task Scheduling Using Design of Experiments

Multiprocessor task scheduling is an NP-hard problem, and the Genetic Algorithm (GA) has proven to be an excellent technique for finding an optimal solution. In the past, several GA-based methods have been proposed for this problem, but all of them consider a single criterion. In the present work, minimization of a bi-criteria multiprocessor task scheduling objective, the weighted sum of makespan and total completion time, is considered. The efficiency and effectiveness of a genetic algorithm depend on its parameters, such as the crossover and mutation operators, crossover probability, selection function, etc. The effects of the GA parameters on minimization of the bi-criteria fitness function, and the subsequent setting of the parameters, have been studied using the central composite design (CCD) approach of response surface methodology (RSM) from Design of Experiments. The experiments have been performed at different levels of the GA parameters, and an analysis of variance has been carried out to identify the parameters significant for minimizing makespan and total completion time simultaneously.
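A central composite design in coded units is straightforward to generate; the sketch below shows the standard construction (2^k factorial corners, 2k axial points, center replicates) for two hypothetical GA parameters. It is a generic DoE illustration, not the paper's actual design matrix.

```python
import itertools

def central_composite_design(k, alpha=None, center_runs=1):
    """Coded CCD points for k factors: 2^k factorial corners, 2k axial
    (star) points at +/-alpha, plus center runs. The default alpha gives
    a rotatable design: alpha = (2**k) ** 0.25."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25
    corners = [list(p) for p in itertools.product([-1.0, 1.0], repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = a
            axial.append(pt)
    centers = [[0.0] * k for _ in range(center_runs)]
    return corners + axial + centers

# Two GA parameters (e.g. crossover and mutation probability) in coded units.
design = central_composite_design(k=2, center_runs=5)
```

Each coded point would then be decoded to real parameter ranges, the GA run at that setting, and the fitness values fed to the RSM/ANOVA analysis.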

Decomposition of Graphs into Induced Paths and Cycles

A decomposition of a graph G is a collection ψ of subgraphs H1, H2, ..., Hr of G such that every edge of G belongs to exactly one Hi. If each Hi is either an induced path or an induced cycle in G, then ψ is called an induced path decomposition of G. The minimum cardinality of an induced path decomposition of G is called the induced path decomposition number of G, denoted πi(G). In this paper we initiate a study of this parameter.
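The definition can be made concrete with a small checker that verifies a candidate ψ: every edge covered exactly once, and each part induced (chord-free) in G. This is an illustrative utility, not part of the paper.

```python
import itertools

def is_induced_path_decomposition(edges, parts):
    """Check that `parts` (each a vertex sequence; a cycle repeats its first
    vertex at the end) decomposes the graph with edge set `edges` into
    induced paths and induced cycles."""
    E = {frozenset(e) for e in edges}
    used = []
    for seq in parts:
        part_edges = {frozenset((seq[i], seq[i + 1])) for i in range(len(seq) - 1)}
        if not part_edges <= E:
            return False                      # a claimed edge is not in G
        verts = set(seq)
        induced = {frozenset(p) for p in itertools.combinations(verts, 2)
                   if frozenset(p) in E}
        if induced != part_edges:
            return False                      # a chord makes the part non-induced
        used.extend(part_edges)
    return sorted(map(sorted, used)) == sorted(map(sorted, E))

# C5 is itself an induced cycle, so {C5} is a decomposition with pi_i = 1;
# in C4, the path 0-1-2-3 is not induced (vertices 0 and 3 are adjacent)
# and leaves edge 3-0 uncovered.
ok = is_induced_path_decomposition([(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)],
                                   [[0, 1, 2, 3, 4, 0]])
bad = is_induced_path_decomposition([(0, 1), (1, 2), (2, 3), (3, 0)],
                                    [[0, 1, 2, 3]])
```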

Effect of Oxygen Annealing on the Surface Defects and Photoconductivity of Vertically Aligned ZnO Nanowire Array

Post-growth annealing of a solution-grown ZnO nanowire array is performed under a controlled oxygen ambience. The role of annealing on surface defects, and its consequences for the dark/photo-conductivity and photosensitivity of the nanowire array, are investigated. Surface defect properties are explored using contact-angle, photoluminescence, Raman-spectroscopy, and XPS measurements. The contact angle of the NW films is reduced by oxygen annealing, and the nanowire film surface changes from hydrophobic (96°) to hydrophilic (16°). Raman and XPS spectroscopy reveal that oxygen annealing improves the crystal quality of the nanowire films. The defect band emission intensity (relative to the band edge emission, ID/IUV) is reduced from 1.3 to 0.2 after annealing at 600 °C under a 10 SCCM flow of oxygen. An order-of-magnitude enhancement in dark conductivity is observed in the O2-annealed samples, while the photoconductivity is slightly reduced due to the lower concentration of surface-related oxygen defects.

An Experimental and Numerical Investigation on Gas Hydrate Plug Flow in the Inclined Pipes and Bends

Gas hydrates can agglomerate and block multiphase oil and gas pipelines when water is present at hydrate-forming conditions. With "Cold Flow Technology", the aim is to condition gas hydrates so that they can be transported as a slurry mixture without risk of agglomeration. During a pipeline shutdown, however, hydrate particles may settle in bends and build hydrate plugs. An experimental setup has been designed and constructed to study the flow of such plugs during start-up operations. Experiments have been performed using a model fluid and model hydrate particles. The propagation of the initial plugs in a bend was recorded with impedance probes along the pipe. The experimental results show a dispersion of the plug front. A peak in pressure drop was also recorded when the plugs were passing the bend. The evolution of the plugs has been simulated by numerical integration of the incompressible mass balance equations with an imposed mixture velocity. The slip between particles and the carrier fluid has been calculated using a drag relation together with a particle-fluid force balance.
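The kind of mass-balance integration described can be illustrated with a first-order upwind scheme transporting a plug concentration at an imposed mixture velocity. The grid, velocity, and plug shape below are invented, and the paper's slip/drag closure is omitted; note that the scheme's numerical diffusion also smears the plug front.

```python
import numpy as np

def advect_upwind(c, u, dx, dt, steps):
    """First-order upwind integration of dc/dt + u*dc/dx = 0 (u > 0):
    a minimal sketch of a transported plug concentration profile."""
    c = c.copy()
    lam = u * dt / dx
    assert lam <= 1.0, "CFL condition violated"
    for _ in range(steps):
        c[1:] = c[1:] - lam * (c[1:] - c[:-1])   # vectorized upwind update
    return c

x = np.linspace(0.0, 1.0, 101)
c0 = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)   # initial plug
c1 = advect_upwind(c0, u=0.5, dx=0.01, dt=0.01, steps=80)
# The plug's center of mass moves by u*dt*steps = 0.4 while total mass
# is conserved; the front spreads due to numerical diffusion.
```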

Drafting the Design and Development of Micro-Controller Based Portable Soil Moisture Sensor for Advancement in Agro Engineering

Moisture is an important consideration in many areas, ranging from irrigation, soil chemistry, golf courses, corrosion and erosion, road conditions, weather prediction, and livestock feed moisture levels to water seepage. Vegetation and crops always depend more on the moisture available at the root level than on the occurrence of precipitation. In this paper, the design of an instrument is discussed that measures the variation in the moisture content of soil. This is done by measuring the amount of water in the soil via the variation in its capacitance, using a capacitive sensor. The greatest advantage of the soil moisture sensor is reduced water consumption. The sensor can also be used to set lower and upper thresholds to maintain optimum soil moisture saturation; this minimizes wilting, contributes to deeper plant root growth, reduces soil run-off and leaching, and creates less favorable conditions for insects and fungal diseases. The capacitance method is preferred because it provides the absolute amount of water content and can measure water content at any depth.
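The capacitance-to-moisture conversion and the lower/upper threshold logic can be sketched as below; the calibration capacitances and thresholds are placeholder values, not the instrument's measured calibration.

```python
def moisture_percent(c_meas, c_dry=10.0, c_wet=60.0):
    """Map a measured probe capacitance (pF) linearly onto 0-100 %
    moisture between dry- and wet-soil calibration points.
    The calibration values here are placeholders, not measured data."""
    frac = (c_meas - c_dry) / (c_wet - c_dry)
    return 100.0 * min(1.0, max(0.0, frac))   # clamp to the valid range

def irrigation_state(moisture, low=30.0, high=70.0):
    """Lower/upper threshold logic: start watering below `low`,
    stop above `high`, otherwise hold the current pump state."""
    if moisture < low:
        return "PUMP_ON"
    if moisture > high:
        return "PUMP_OFF"
    return "HOLD"

reading = moisture_percent(35.0)
state = irrigation_state(reading)
```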

MC and IC – What Is the Relationship?

MC (Management Control) and IC (Internal Control) – what is the relationship? This empirical study into the definitions of MC and IC is based on the wider understandings of the terms Internal Control and Management Control, in which attention is focused not only on the financial aspects but also on the softer aspects of the business, such as culture, behaviour, standards and values. The narrower understandings of Management Control focus mainly on the hard, financial aspects of business operation. The definitions of Management Control and Internal Control are often used interchangeably, and the results of this empirical study reveal that Management Control is part of Internal Control; there is no causal link between the two concepts. Based on the interpretations of the respondents, the term Management Control has moved from a broad term to a more limited one covering the soft aspects of influencing behaviour, performance measurement, incentives and culture. This paper is an exploratory study based on qualitative research and on a qualitative matrix-method analysis of the thematic definitions of the terms Management Control and Internal Control.

Development of a RAM Simulation Model for Acid Gas Removal System

A reliability, availability and maintainability (RAM) model has been built for an acid gas removal plant; the resulting system analysis will play an important role in any process modifications required to achieve optimum performance. Due to the complexity of the plant, the model was based on a Reliability Block Diagram (RBD) with a Monte Carlo simulation engine. The model has been validated against actual plant data as well as local expert opinion, resulting in an acceptable simulation model. The results from the model show that operation and maintenance can be further improved, leading to a reduction in the annual production loss.
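A minimal version of the RBD-plus-Monte-Carlo idea: each block alternates exponentially distributed up and down times, and a series structure is up only when every block is up. The MTBF/MTTR figures below are invented, not the plant's data, and multiplying averaged per-block availabilities is a crude series estimate for independent blocks.

```python
import random

def simulate_block_availability(mtbf, mttr, horizon, rng):
    """Alternating up/down renewal process for one block with exponential
    failure and repair times; returns the uptime fraction over `horizon`."""
    t, up = 0.0, 0.0
    while t < horizon:
        ttf = rng.expovariate(1.0 / mtbf)          # time to failure
        up += min(ttf, horizon - t)
        t += ttf
        if t >= horizon:
            break
        t += rng.expovariate(1.0 / mttr)           # repair time
    return up / horizon

rng = random.Random(42)
runs = 2000
# Two hypothetical blocks in series over one year of operation (8760 h).
a1 = sum(simulate_block_availability(1000.0, 50.0, 8760.0, rng)
         for _ in range(runs)) / runs
a2 = sum(simulate_block_availability(2000.0, 20.0, 8760.0, rng)
         for _ in range(runs)) / runs
series_availability = a1 * a2   # steady-state values: ~0.952 * ~0.990
```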

Artificial Neural Network Prediction for Coke Strength after Reaction and Data Analysis

In this paper, the need for coke quality prediction, its role in blast furnaces, and the model output are explained. Applying an Artificial Neural Network (ANN) with the back-propagation (BP) algorithm, a prediction model has been developed for coke strength after reaction (CSR). Important blast furnace functions such as permeability, heat exchange, melting, and reducing capacity are closely connected to coke quality. Coke quality in turn depends on coal characterization and coke-making process parameters. The ANN model developed is a useful tool for process experts to adjust the control parameters in case of coke quality deviations. The model also makes it possible to predict CSR for new coal blends that are yet to be used in the coke plant. The input data to the model were structured into three modules covering the past two years, and the incremental models thus developed assist in identifying the group causing the deviation of CSR.
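A miniature version of a BP-trained network is sketched below on synthetic stand-in data (three invented inputs, one target); the real model's coal-characterization inputs, architecture, and plant data are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 3 invented features -> 1 target.
X = rng.uniform(-1, 1, size=(200, 3))
y = (0.5 * X[:, 0] - 0.3 * X[:, 1] ** 2 + 0.2 * X[:, 2]).reshape(-1, 1)

# One hidden tanh layer, trained by plain batch back-propagation.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros((1, 1))
lr = 0.1

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(X)
loss0 = float(np.mean((pred0 - y) ** 2))

for _ in range(500):
    h, pred = forward(X)
    err = pred - y                             # dL/dpred (up to a constant)
    gW2 = h.T @ err / len(X); gb2 = err.mean(0, keepdims=True)
    dh = (err @ W2.T) * (1 - h ** 2)           # back-propagate through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0, keepdims=True)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred1 = forward(X)
loss1 = float(np.mean((pred1 - y) ** 2))       # should drop well below loss0
```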

Analytical Study of Component Based Software Engineering

This paper is a survey of current component-based software technologies and a description of the promotion and inhibition factors in component-based software engineering (CBSE). The features that software components inherit are also discussed, and quality assurance issues in component-based software are addressed. The research on the quality model of component-based systems starts with a study of what components are, CBSE and its development life cycle, and the pros and cons of CBSE. Various attributes are studied and compared in view of the existing quality models for general systems and component-based systems. When describing the quality of a software component, an apt set of quality attributes for the description of the system (or its components) should be selected. Finally, the research issues that can be extended are tabulated.

Enhancing Performance of Bluetooth Piconets Using Priority Scheduling and Exponential Back-Off Mechanism

Bluetooth is a personal wireless communication technology being applied in many scenarios. It is an emerging standard for short-range, low-cost, low-power wireless access. Existing MAC (Medium Access Control) scheduling schemes provide only best-effort service for all master-slave connections, and providing QoS (Quality of Service) support for different connections is challenging because of the master-driven TDD (Time Division Duplex) design. Moreover, no available solution supports both the delay and bandwidth guarantees required by real-time applications. This paper addresses the issue of how to enhance QoS support in a Bluetooth piconet. The Bluetooth specification proposes a Round Robin scheduler as a possible solution for scheduling transmissions in a piconet. We propose an algorithm that reduces bandwidth waste and enhances the efficiency of the network. We define token counters to estimate the traffic of real-time slaves. To increase bandwidth utilization, a back-off mechanism is then introduced for best-effort slaves to decrease the frequency of polling idle slaves. Simulation results demonstrate that our scheme achieves better performance than Round Robin scheduling.
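The two mechanisms described, token counters prioritizing real-time slaves and exponential back-off for idle best-effort slaves, can be sketched as a single polling round; the field names and the back-off cap of 32 below are invented for illustration and are not taken from the paper.

```python
def next_poll_order(slaves):
    """One polling round: real-time slaves are served highest-token-count
    first; a best-effort slave is polled only when its skip counter expires,
    and its back-off doubles (capped) after each idle poll."""
    rt = sorted((s for s in slaves if s["real_time"]),
                key=lambda s: -s["tokens"])
    order = [s["id"] for s in rt]
    for s in slaves:
        if s["real_time"]:
            continue
        s["skip"] -= 1
        if s["skip"] <= 0:
            order.append(s["id"])
            if s["idle"]:                          # nothing to send: back off
                s["backoff"] = min(s["backoff"] * 2, 32)
            else:                                  # had data: reset back-off
                s["backoff"] = 1
            s["skip"] = s["backoff"]
    return order

slaves = [
    {"id": "rt1", "real_time": True, "tokens": 5},
    {"id": "rt2", "real_time": True, "tokens": 9},
    {"id": "be1", "real_time": False, "idle": True, "backoff": 1, "skip": 1},
]
round1 = next_poll_order(slaves)   # rt2 first (more tokens), then rt1, then be1
round2 = next_poll_order(slaves)   # be1 has backed off and is skipped
```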

The Effects of Biomass Parameters on the Dissolved Organic Carbon Removal in a Sponge Submerged Membrane Bioreactor

A novel sponge submerged membrane bioreactor (SSMBR) was developed to effectively remove organics and nutrients from wastewater. Sponge is introduced within the SSMBR as a medium for the attached growth of biomass. This paper evaluates the effects of new and acclimatized sponges on dissolved organic carbon (DOC) removal from wastewater at different mixed liquor suspended solids (MLSS) concentrations of the sludge. A series of experimental studies showed that the acclimatized sponge performed better than the new sponge, and that the optimum DOC removal was achieved at 10 g/L of MLSS with the acclimatized sponge. Moreover, the paper analyses the relationship between the MLSS_sponge/MLSS_sludge ratio and the DOC removal efficiency of the SSMBR. The results showed a non-linear relationship between the biomass parameters of the sponge and the sludge and the DOC removal efficiency of the SSMBR; a second-order polynomial function could reasonably represent these relationships.
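A second-order polynomial relationship of this kind is easy to fit with NumPy's `polyfit`; the ratio and removal numbers below are invented stand-ins for the paper's measurements, shown only to illustrate the fitting step.

```python
import numpy as np

# Illustrative (made-up) data: biomass ratio MLSS_sponge/MLSS_sludge vs
# DOC removal efficiency (%), rising and then flattening near an optimum.
ratio = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])
removal = np.array([70.0, 81.0, 88.0, 92.0, 93.0, 91.5])

coeffs = np.polyfit(ratio, removal, deg=2)   # second-order polynomial fit
fit = np.poly1d(coeffs)

# Coefficient of determination of the fit.
r2 = 1 - np.sum((removal - fit(ratio)) ** 2) / np.sum((removal - removal.mean()) ** 2)
```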

Validation of Automation Systems using Temporal Logic Model Checking and Groebner Bases

Validation of an automation system is an important issue. The goal is to check that the system under investigation, modeled by a Petri net, never enters undesired states. Usually, tools dedicated to Petri nets, such as DESIGN/CPN, are used for reachability analysis. The biggest problem with this approach is that it is often impossible to generate the full occurrence graph of the system because it is too large. In this paper, we show how computational methods such as temporal logic model checking and Groebner bases can be used to verify the correctness of the design of an automation system. We report our experimental results with two automation systems: an Automated Guided Vehicle (AGV) system and a traffic light system. Validation of these two systems took from 10 to 30 seconds on a PC, depending on the optimization parameters.
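The explicit-state approach that the paper improves upon can be sketched as a BFS over Petri-net markings; the toy mutual-exclusion net below is invented (it is not the AGV or traffic-light model) and shows why the occurrence graph blows up: `seen` must enumerate every reachable marking.

```python
from collections import deque

def reachable_markings(transitions, m0):
    """Explicit reachability analysis of a Petri net by BFS over markings.
    Each transition is a (consume, produce) pair of place vectors; the
    state space must be finite for this to terminate."""
    seen = {m0}
    queue = deque([m0])
    while queue:
        m = queue.popleft()
        for consume, produce in transitions:
            if all(m[i] >= consume[i] for i in range(len(m))):   # enabled?
                m2 = tuple(m[i] - consume[i] + produce[i] for i in range(len(m)))
                if m2 not in seen:
                    seen.add(m2)
                    queue.append(m2)
    return seen

# Toy mutual-exclusion net: places (idle1, crit1, idle2, crit2, lock).
t = [
    ((1, 0, 0, 0, 1), (0, 1, 0, 0, 0)),   # process 1 enters its critical section
    ((0, 1, 0, 0, 0), (1, 0, 0, 0, 1)),   # process 1 leaves
    ((0, 0, 1, 0, 1), (0, 0, 0, 1, 0)),   # process 2 enters
    ((0, 0, 0, 1, 0), (0, 0, 1, 0, 1)),   # process 2 leaves
]
marks = reachable_markings(t, (1, 0, 1, 0, 1))
# Undesired state: both processes in their critical sections at once.
violates = any(m[1] >= 1 and m[3] >= 1 for m in marks)
```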