Performance Improvement of Moving Object Recognition and Tracking Algorithm using Parallel Processing of SURF and Optical Flow

This paper proposes a method for parallel processing of SURF and Optical Flow for moving object recognition and tracking. Object recognition and tracking is one of the most important tasks in computer vision, but its large number of operations slows processing speed, making real-time recognition and tracking difficult. The proposed method combines SURF, a typical feature-extraction technique, with Optical Flow for tracking the moving object to overcome this drawback and achieve real-time recognition and tracking, and applies parallel processing techniques to improve speed. First, an image from a database and an image acquired through the camera are analyzed using SURF to recognize the same object, and a region of interest (ROI) is set for tracking the movement of feature points using Optical Flow. Second, multi-threading is used to improve processing and recognition speed through parallel processing. Finally, the performance is evaluated and the efficiency of the algorithm is verified through experiments.
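
A minimal sketch of the recognition-then-tracking pipeline described above, assuming OpenCV's contrib SURF implementation and pyramidal Lucas-Kanade optical flow; the file names, matching ratio, and the simple two-thread split are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): SURF matching runs on a worker
# thread while the main loop tracks matched keypoints with Lucas-Kanade flow.
import threading
import cv2
import numpy as np

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)   # requires opencv-contrib
matcher = cv2.BFMatcher(cv2.NORM_L2)
db_img = cv2.imread("object_db.png", cv2.IMREAD_GRAYSCALE)  # hypothetical DB image
db_kp, db_desc = surf.detectAndCompute(db_img, None)

tracked_pts = None                     # shared between threads
lock = threading.Lock()

def recognize(frame_gray):
    """Match SURF features of the camera frame against the DB image (worker thread)."""
    global tracked_pts
    kp, desc = surf.detectAndCompute(frame_gray, None)
    if desc is None:
        return
    good = [m for m, n in matcher.knnMatch(db_desc, desc, k=2)
            if m.distance < 0.7 * n.distance]
    pts = np.float32([kp[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    with lock:
        tracked_pts = pts if len(pts) >= 8 else None

cap = cv2.VideoCapture(0)
prev_gray = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    with lock:
        pts = tracked_pts
    if pts is None:
        # recognition runs in parallel so the capture loop is not blocked
        threading.Thread(target=recognize, args=(gray.copy(),), daemon=True).start()
    elif prev_gray is not None:
        new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        with lock:
            tracked_pts = new_pts[status.flatten() == 1].reshape(-1, 1, 2)
    prev_gray = gray
```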

Design Alternatives for Lateral Force-Resisting Systems of Tall Buildings in Dubai, UAE

Four design alternatives for lateral force-resisting systems of tall buildings in Dubai, UAE are presented. Quantitative comparisons between the different designs are also made. This paper is intended to provide different feasible lateral systems to be used in Dubai in light of the available seismic hazard studies of the UAE. The different lateral systems are chosen in conformance with the International Building Code (IBC). Moreover, the expected behavior of each system is highlighted, and some of the cost implications associated with lateral system selection are discussed.

Technology Trend and Level Assessment Using Patent Data for Preliminary Feasibility Study on R&D Program

The Korean government has applied preliminary feasibility studies to new, large-scale R&D programs since 2008. The study is carried out from the viewpoints of technology, policy, and economics; the separate analyses are then integrated to arrive at a definite result: whether a program is feasible or not. This paper describes the concept and method of the feasibility analysis, focusing on the technological viability assessment used in the technical analysis. It consists of technology trend assessment and technology level assessment. Through the analysis, we can determine the chance of a schedule delay or cost overrun occurring in the proposed plan.

Experimental Study on Smart Anchor Head

Since prestressed concrete members rely on the tensile strength of the prestressing strands to resist loads, the loss of even a few of them could be catastrophic. Therefore, it is important to measure the present residual prestress force. Although there are some techniques for obtaining the present prestress force, some problems still remain. One method is to install a load cell in front of the anchor head, but this may increase cost. A load cell is a transducer that uses the elastic property of its material. The anchor head is also an elastic body, so it could itself be used to monitor the present prestress force. Features of fiber optic sensors, such as small size, high sensitivity, and high durability, make it possible to assign a sensing function to the anchor head. This paper presents the concept of a smart anchor head that acts as a load cell and an experiment on its applicability. Test results showed that the smart anchor head worked well and exhibited a strong linear relationship between load and response.

Richtmyer-Meshkov Instability and Gas-Particle Interaction of Contoured Shock-Tube Flows: A Numerical Study

In this paper, computational fluid dynamics (CFD) is utilized to characterize a prototype biolistic delivery system, a biomedical device based on the contoured shock tube (CST) design, with the aim of investigating shock-induced flow instabilities within the contoured shock tube. The shock/interface interactions, i.e., the growth of perturbations at an interface between two fluids of different densities, are interrogated. The key features of the gas dynamics and gas-particle interaction are discussed.

Investigation of Wintering and Breeding Habitat Selection by the Asiatic Houbara Bustard (Chlamydotis macqueenii) in the Central Steppe of Iran

The Asiatic Houbara (Chlamydotis macqueenii) is a flagship and vulnerable species. In-situ conservation of this threatened species demands knowledge of its habitat selection. The aim of this study was to determine the habitat variables influencing the birds' wintering and breeding site selection in semi-arid central Iran. Habitat features of the detected nest and pellet sites were compared with paired and random plots by quantifying a number of habitat variables. In wintering habitat use at the micro scale, houbara selected sites where vegetation cover was significantly lower compared to control sites (p < 0.001). Areas with a low number of larger plant species (p = 0.03) that were not too close to a vegetation patch (p

Finite-Horizon Tracking Control for Repetitive Systems with Uncertain Initial Conditions

Repetitive systems are systems that perform a simple task in a fixed pattern repetitively and are widespread in industrial fields. Hence, many researchers have been interested in such systems, especially in the field of iterative learning control (ILC). In this paper, we propose a finite-horizon tracking control scheme for linear time-varying repetitive systems with uncertain initial conditions. The scheme is derived both analytically and numerically for state-feedback systems and only numerically for output-feedback systems. Then, it is extended to stable systems with input constraints. All numerical schemes are developed in the form of linear matrix inequalities (LMIs). A distinguishing feature of the proposed scheme compared with existing iterative learning control is that it guarantees the tracking performance exactly even under uncertain initial conditions. The simulation results demonstrate the good performance of the proposed scheme.
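
Since the abstract only names the LMI formulation, the following is a generic illustration (not the paper's tracking conditions) of posing and solving a linear matrix inequality with CVXPY: a discrete-time Lyapunov feasibility problem for an example system matrix A.

```python
# Generic LMI feasibility sketch (illustrative only; not the paper's LMIs):
# find P > 0 such that A^T P A - P < 0, certifying stability of x[k+1] = A x[k].
import cvxpy as cp
import numpy as np

A = np.array([[0.9, 0.2],
              [0.0, 0.8]])                     # example stable system matrix
n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               A.T @ P @ A - P << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
prob.solve(solver=cp.SCS)
print(prob.status, np.round(P.value, 3))
```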

A New Approach to Workforce Planning

In today's global and competitive market, manufacturing companies are working hard towards improving their production system performance. Most companies develop production systems that can help in cost reduction. Manufacturing systems consist of different elements including production methods, machines, processes, control and information systems. Human issues are an important part of manufacturing systems, yet most companies do not pay sufficient attention to them. In this paper, a workforce planning (WP) model is presented. A non-linear programming model is developed in order to minimize the hiring, firing, training and overtime costs. The purpose is to determine the number of workers for each worker type, the number of workers trained, and the number of overtime hours. Moreover, a decision support system (DSS) based on the proposed model is introduced using the Excel-Lingo software interfacing feature. This model will help to improve the interaction between the workers, managers and the technical systems in manufacturing.
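
As a hedged illustration of the kind of model described (simplified here to a linear program rather than the paper's non-linear formulation), the sketch below minimizes hiring, firing, training, and overtime costs over worker types with PuLP; all worker types, coefficients, and capacities are made-up placeholders.

```python
# Simplified linear sketch of a workforce-planning model (placeholder data;
# a linearized stand-in for the non-linear model described in the abstract).
import pulp

types = ["junior", "senior"]
init = {"junior": 20, "senior": 10}            # current headcount (assumed)
demand_hours = 6000                            # required labor hours (assumed)
reg_hours = 160                                # regular hours per worker per period
cost = {"hire": 800, "fire": 1200, "train": 500, "ot": 25}

m = pulp.LpProblem("workforce_planning", pulp.LpMinimize)
hire = pulp.LpVariable.dicts("hire", types, lowBound=0, cat="Integer")
fire = pulp.LpVariable.dicts("fire", types, lowBound=0, cat="Integer")
train = pulp.LpVariable("train_junior_to_senior", lowBound=0, cat="Integer")
ot = pulp.LpVariable.dicts("overtime_hours", types, lowBound=0)

# headcount after hiring, firing, and training
head = {"junior": init["junior"] + hire["junior"] - fire["junior"] - train,
        "senior": init["senior"] + hire["senior"] - fire["senior"] + train}

m += (pulp.lpSum(cost["hire"] * hire[t] + cost["fire"] * fire[t] for t in types)
      + cost["train"] * train
      + pulp.lpSum(cost["ot"] * ot[t] for t in types))        # total cost objective
m += pulp.lpSum(reg_hours * head[t] + ot[t] for t in types) >= demand_hours
for t in types:
    m += ot[t] <= 40 * head[t]                 # overtime capped per worker

m.solve(pulp.PULP_CBC_CMD(msg=False))
for v in m.variables():
    print(v.name, v.value())
```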

Optimal Path Planning under A Priori Information in Stochastic, Time-varying Networks

A novel path planning approach is presented to find the optimal path in stochastic, time-varying networks under a priori traffic information. Most existing studies make use of dynamic programming to find the optimal path; however, those methods have been shown to be unable to obtain the globally optimal value, and designing efficient algorithms is another challenge. This paper employs a decision-theoretic framework for defining the optimal path: for a given source S and destination D in an urban transit network, we seek an S-D path of lowest expected travel time, where the link travel times are discrete random variables. To overcome the deficiencies of dynamic programming methods, such as the curse of dimensionality and violation of the principle of optimality, an integer programming model is built to realize the assignment of discrete travel time variables to arcs. Simultaneously, pruning techniques are applied to reduce the computational complexity of the algorithm. The final experiments show the feasibility of the novel approach.
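
The abstract does not give the integer program itself, so the following is only a baseline sketch of the decision-theoretic objective: computing the S-D path of lowest expected travel time when each arc's travel time is a discrete random variable. Here the expectation is pushed onto the arcs and a standard shortest-path search is run; the toy network and distributions are invented for illustration.

```python
# Baseline sketch: lowest *expected* travel time path when arc times are
# discrete random variables (toy network; not the paper's integer program).
import heapq

# arc -> list of (travel_time, probability)
arcs = {
    ("S", "A"): [(4, 0.5), (8, 0.5)],
    ("S", "B"): [(5, 1.0)],
    ("A", "D"): [(3, 0.7), (9, 0.3)],
    ("B", "D"): [(6, 0.9), (12, 0.1)],
}

def expected(dist):
    return sum(t * p for t, p in dist)

graph = {}
for (u, v), dist in arcs.items():
    graph.setdefault(u, []).append((v, expected(dist)))

def dijkstra(src, dst):
    """Shortest path on expected arc costs."""
    heap = [(0.0, src, [src])]
    seen = set()
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(heap, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

print(dijkstra("S", "D"))   # (10.8, ['S', 'A', 'D'])
```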

Robust Detection of R-Wave Using Wavelet Technique

The electrocardiogram (ECG) is considered to be the backbone of cardiology. The ECG is composed of P, QRS and T waves, and information related to cardiac diseases can be extracted from the intervals and amplitudes of these waves. The first step in extracting ECG features is the accurate detection of the R peaks in the QRS complex. We have developed a robust R-wave detector using wavelets. The wavelets used for detection are from the Daubechies and Symlet families. The method does not require any preprocessing; it only needs correctly recorded ECG signals to perform the detection. The data have been collected from the MIT-BIH arrhythmia database, and the signals from Lead II have been analyzed. MATLAB 7.0 has been used to develop the algorithm. The ECG signal under test is decomposed to the required level using the selected wavelet, and the detail coefficient d4 is selected based on energy, frequency, and cross-correlation analysis of the decomposition structure of the ECG signal. The robustness of the method is apparent from the obtained results.
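
A minimal sketch of the described pipeline using PyWavelets rather than MATLAB: decompose a Lead-II signal with a Daubechies wavelet, keep only the level-4 detail band (d4), reconstruct, and pick R peaks from the resulting energy signal. The record file name, threshold, and refractory period are illustrative assumptions.

```python
# Illustrative R-peak detection sketch: db4 wavelet decomposition, keep only the
# d4 detail coefficients, reconstruct, and pick peaks (thresholds are ad hoc).
import numpy as np
import pywt
from scipy.signal import find_peaks

fs = 360                                   # MIT-BIH sampling rate (Hz)
ecg = np.loadtxt("mitbih_leadII.txt")      # hypothetical 1-D Lead-II recording

level = 4
coeffs = pywt.wavedec(ecg, "db4", level=level)
# zero everything except the level-4 detail coefficients before reconstruction
kept = [np.zeros_like(c) for c in coeffs]
kept[1] = coeffs[1]                        # coeffs = [a4, d4, d3, d2, d1]
detail = pywt.waverec(kept, "db4")[: len(ecg)]

envelope = detail ** 2                     # emphasize QRS energy
peaks, _ = find_peaks(envelope,
                      height=0.3 * envelope.max(),
                      distance=int(0.25 * fs))   # refractory period ~250 ms
print("detected R peaks at samples:", peaks)
```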

A Neural-Network-Based Fault Diagnosis Approach for Analog Circuits by Using Wavelet Transformation and Fractal Dimension as a Preprocessor

This paper presents a new method of analog fault diagnosis based on back-propagation neural networks (BPNNs) using wavelet decomposition and fractal dimension as preprocessors. The proposed method has the capability to detect and identify faulty components in an analog electronic circuit with tolerance by analyzing its impulse response. Using wavelet decomposition to preprocess the impulse response drastically de-noises the inputs to the neural network. The second preprocessing step, based on fractal dimension, extracts unique features, which are then fed to a neural network as inputs for further classification. A comparison of our work with [1] and [6], which also employ back-propagation (BP) neural networks, reveals that our system requires a much smaller network and performs significantly better in fault diagnosis of analog circuits thanks to the proposed preprocessing techniques.
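
Since the abstract only outlines the preprocessing chain, the sketch below shows the general idea under stated assumptions: wavelet approximation coefficients compress and de-noise an impulse response, Katz's formula supplies a simple fractal-dimension feature (not necessarily the estimator used in the paper), and a small back-propagation network classifies the fault. The data files and labels are placeholders.

```python
# Sketch of the wavelet + fractal-dimension preprocessing and a BPNN classifier
# (placeholder data; Katz's formula used as one simple fractal-dimension estimate).
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def katz_fd(x):
    """Katz fractal dimension of a 1-D sequence."""
    pts = np.column_stack([np.arange(len(x)), x])
    L = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))   # total curve length
    d = np.max(np.linalg.norm(pts - pts[0], axis=1))           # maximum extent
    n = len(x) - 1
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

def features(impulse_response):
    a3 = pywt.wavedec(impulse_response, "db2", level=3)[0]     # de-noised approximation
    return np.append(a3, katz_fd(impulse_response))

# placeholder dataset: rows = simulated impulse responses, y = faulty-component label
X_raw = np.load("impulse_responses.npy")        # hypothetical file
y = np.load("fault_labels.npy")                 # hypothetical labels
X = np.array([features(r) for r in X_raw])

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000).fit(X, y)
print("training accuracy:", clf.score(X, y))
```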

Weed Classification using Histogram Maxima with Threshold for Selective Herbicide Applications

Information on weed distribution within the field is necessary to implement spatially variable herbicide application. Since hand labor is costly, an automated weed control system could be feasible. This paper deals with the development of an algorithm for a real-time specific weed recognition system based on the histogram maxima with threshold of an image, which is used for weed classification. The algorithm is specifically developed to classify images into broad and narrow classes for real-time selective herbicide application. The developed system has been tested on weeds in the lab, and the tests have shown the system to be very effective in weed identification. Further, the results show very reliable performance on images of weeds taken under varying field conditions. The analysis of the results shows over 95 percent classification accuracy on 140 sample images (broad and narrow), with 70 samples from each category of weeds.
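
A rough sketch of the kind of processing implied: segment vegetation, build a column-wise histogram of the binary mask, and compare its maximum against a threshold to separate broad-leaf from narrow-leaf images. The excess-green segmentation rule, the threshold value, and the decision rule below are assumptions, not the authors' exact algorithm.

```python
# Rough sketch only: vegetation segmentation + column-histogram maximum compared
# against a threshold to label an image as broad or narrow leaf (constants assumed).
import cv2
import numpy as np

img = cv2.imread("weed_sample.jpg").astype(np.float32)   # hypothetical field image
b, g, r = cv2.split(img)
exg = 2 * g - r - b                                      # excess-green index
veg = (exg > 20).astype(np.uint8)                        # binary vegetation mask

hist = veg.sum(axis=0)                                   # column-wise histogram
peak = hist.max()
threshold = 0.5 * veg.shape[0]                           # assumed: half the image height

label = "broad" if peak > threshold else "narrow"
print(f"histogram maximum = {peak}, classified as {label} leaf weed")
```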

Hierarchical PSO-Adaboost Based Classifiers for Fast and Robust Face Detection

We propose a fast and robust hierarchical face detection system which finds and localizes face images with a cascade of classifiers. Three modules contribute to the efficiency of our detector. First, heterogeneous feature descriptors are exploited to enrich the feature types and feature numbers for face representation. Second, a PSO-Adaboost algorithm is proposed to efficiently select discriminative features from a large pool of available features and combine them into the final ensemble classifier. Compared with the standard exhaustive Adaboost for feature selection, the new PSO-Adaboost algorithm reduces the training time by up to a factor of 20. Finally, a three-stage hierarchical classifier framework is developed for rapid background removal. In particular, candidate face regions are detected more quickly by using a large window size in the first stage. Nonlinear SVM classifiers are used instead of decision stump functions in the last stage to remove the remaining complex non-face patterns that cannot be rejected in the previous two stages. Experimental results show that our detector achieves superior performance on the CMU+MIT frontal face dataset.
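
To illustrate the three-stage rejection idea only (the PSO-based feature selection itself is not reproduced here), the sketch below chains a cheap boosted screen, a stronger boosted ensemble, and a nonlinear SVM, rejecting early whenever a stage is confident a window is background. The synthetic data and confidence thresholds are assumptions.

```python
# Sketch of a three-stage rejection cascade on synthetic two-class data
# (illustrative stand-in for the paper's hierarchical face/non-face classifier).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=40, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stage1 = AdaBoostClassifier(n_estimators=5, random_state=0).fit(X_tr, y_tr)    # cheap screen
stage2 = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)  # boosted ensemble
stage3 = SVC(kernel="rbf", probability=True, random_state=0).fit(X_tr, y_tr)   # hard patterns

def cascade_predict(x):
    """Reject early if any stage is confident the sample is background (class 0)."""
    x = x.reshape(1, -1)
    for clf, reject_below in ((stage1, 0.3), (stage2, 0.4), (stage3, 0.5)):
        p_face = clf.predict_proba(x)[0, 1]
        if p_face < reject_below:
            return 0                      # early rejection skips the later stages
    return 1

preds = np.array([cascade_predict(x) for x in X_te])
print("cascade accuracy:", (preds == y_te).mean())
```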

A New Spectral-based Approach to Query-by-Humming for MP3 Songs Database

In this paper, we propose a new approach to query-by-humming, focusing on an MP3 song database. Since melody representation is much more difficult for MP3 songs than for symbolic performance data, we choose to extract feature descriptors from the vocal part of the songs. Our approach is based on signal filtering, sub-band spectral processing, MDCT coefficient analysis, and peak energy detection that ignores the background music as much as possible. Finally, we apply a dual dynamic programming algorithm for feature similarity matching. Experiments demonstrate its online performance in precision and efficiency.
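
The abstract names a dual dynamic programming matcher; as a simplified stand-in, the sketch below implements a plain dynamic-programming alignment (DTW) between a hummed contour and candidate song contours, which illustrates the similarity-matching step only. The toy contours are invented.

```python
# Illustrative dynamic-programming (DTW) similarity between two pitch/energy
# contours; a simplified stand-in for the paper's dual dynamic programming matcher.
import numpy as np

def dtw_distance(query, reference):
    n, m = len(query), len(reference)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(query[i - 1] - reference[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m] / (n + m)              # length-normalized alignment cost

hummed = np.array([60, 60, 62, 64, 64, 65, 67])        # toy MIDI-like contour
candidates = {"song_a": np.array([60, 62, 64, 65, 67, 69]),
              "song_b": np.array([50, 50, 49, 47, 45, 45])}
best = min(candidates, key=lambda k: dtw_distance(hummed, candidates[k]))
print("best match:", best)
```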

Color Constancy using Superpixel

Color constancy algorithms are generally based on simplified assumptions about the spectral distribution or the reflection attributes of the scene surface. However, in reality, these assumptions are too restrictive. A methodology is proposed to extend existing algorithms by applying color constancy locally to image patches rather than globally to the entire image. In this paper, a method based on low-level image features using superpixels is proposed. Superpixel segmentation partitions an image into regions that are approximately uniform in size and shape. Instead of using the entire pixel set for estimating the illuminant, only the superpixels with the most valuable information are used. Based on large-scale experiments on real-world scenes, it can be concluded that the estimation is more accurate using superpixels than when using the entire image.
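
A minimal sketch of the local strategy described, assuming SLIC superpixels from scikit-image and a grey-world illuminant estimate restricted to the most "valuable" superpixels; using luminance variance as the selection criterion, as well as the file name and the 25% cut-off, are assumptions for illustration.

```python
# Sketch: SLIC superpixels + grey-world illuminant estimate restricted to the
# highest-variance superpixels (selection criterion assumed for illustration).
import numpy as np
from skimage import io, img_as_float
from skimage.segmentation import slic

img = img_as_float(io.imread("scene.jpg"))            # hypothetical RGB image
labels = slic(img, n_segments=200, compactness=10)

# rank superpixels by luminance variance and keep the top 25 %
scores = []
for lab in np.unique(labels):
    region = img[labels == lab]
    scores.append((region.mean(axis=1).var(), lab))
keep = {lab for _, lab in sorted(scores, reverse=True)[: len(scores) // 4]}

mask = np.isin(labels, list(keep))
illuminant = img[mask].mean(axis=0)                   # grey-world on selected pixels
corrected = np.clip(img / (illuminant / illuminant.mean()), 0, 1)
print("estimated illuminant (RGB):", np.round(illuminant, 3))
```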

Fundamental Concepts of Theory of Constraints: An Emerging Philosophy

Dr. Eliyahu Goldratt did the pioneering work in the development of the Theory of Constraints. Since then, many more researchers around the globe have been working to enhance this body of knowledge. In this paper, an attempt has been made to compile the salient features of this theory from the work done by Goldratt and other researchers. This paper will provide a good starting point for potential researchers interested in working on the Theory of Constraints. The paper will also help practicing managers by clarifying their concepts of the theory and will facilitate its successful implementation in their working areas.

A Novel Nucleus-Based Classifier for Discrimination of Osteoclasts and Mesenchymal Precursor Cells in Mouse Bone Marrow Cultures

Bone remodeling occurs by the balanced action of bone resorbing osteoclasts (OC) and bone-building osteoblasts. Increased bone resorption by excessive OC activity contributes to malignant and non-malignant diseases including osteoporosis. To study OC differentiation and function, OC formed in in vitro cultures are currently counted manually, a tedious procedure which is prone to inter-observer differences. Aiming for an automated OC-quantification system, classification of OC and precursor cells was done on fluorescence microscope images based on the distinct appearance of fluorescent nuclei. Following ellipse fitting to nuclei, a combination of eight features enabled clustering of OC and precursor cell nuclei. After evaluating different machine-learning techniques, LOGREG achieved 74% correctly classified OC and precursor cell nuclei, outperforming human experts (best expert: 55%). In combination with the automated detection of total cell areas, this system allows various cell parameters to be measured and, most importantly, proteins involved in osteoclastogenesis to be quantified.
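
A condensed sketch of the nucleus pipeline under stated assumptions: threshold a fluorescence image, fit ellipses to nucleus contours with OpenCV, derive a few shape features (only a subset of the eight used in the paper), and classify them with logistic regression. The file names and label array are placeholders.

```python
# Condensed sketch (placeholder data): ellipse fitting on nucleus contours,
# a few shape features, and a logistic-regression classifier.
import cv2
import numpy as np
from sklearn.linear_model import LogisticRegression

def nucleus_features(gray):
    """Ellipse-based shape features for each detected nucleus in one image."""
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    feats = []
    for c in contours:
        if len(c) < 5:                      # fitEllipse needs at least 5 points
            continue
        (cx, cy), (ax1, ax2), angle = cv2.fitEllipse(c)
        major, minor = max(ax1, ax2), min(ax1, ax2)
        area = cv2.contourArea(c)
        feats.append([major, minor, major / max(minor, 1e-6), area])
    return feats

# hypothetical training set: per-nucleus features and OC / precursor labels
images = [cv2.imread(f, cv2.IMREAD_GRAYSCALE) for f in ["culture1.png", "culture2.png"]]
X = np.array([f for img in images for f in nucleus_features(img)])
y = np.load("nucleus_labels.npy")           # 1 = osteoclast nucleus, 0 = precursor

clf = LogisticRegression(max_iter=1000).fit(X, y)
print("training accuracy:", clf.score(X, y))
```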

In Search of an SVD and QRcp Based Optimization Technique of ANN for Automatic Classification of Abnormal Heart Sounds

Artificial Neural Networks (ANNs) have been extensively used for the classification of heart sounds because of their discriminative training ability and easy implementation. However, an ANN suffers from over-parameterization if the number of nodes is not chosen properly. In such cases, when the dataset has redundancy within it, the ANN is trained along with this redundant information, which results in poor validation. Also, a larger network means more computational expense, resulting in higher hardware- and time-related costs. Therefore, an optimum design of the neural network is needed for real-time detection of pathological patterns, if any, from the heart sound signal. The aims of this work are to (i) select a set of input features that are effective for the identification of heart sound signals and (ii) make an optimum selection of nodes in the hidden layer for a more effective ANN structure. Here, we present an optimization technique that involves Singular Value Decomposition (SVD) and QR factorization with column pivoting (QRcp) to optimize an empirically chosen, over-parameterized ANN structure. The input nodes of the ANN structure are optimized by SVD followed by QRcp, while only SVD is required to prune undesirable hidden nodes. Results are presented for classifying 12 common pathological cases and normal heart sounds.
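
A brief sketch of the two linear-algebra steps named in the abstract, using SciPy: estimate an effective rank from the singular values of a feature (or hidden-activation) matrix, then use QR with column pivoting to rank the corresponding columns and keep the most independent ones. The random placeholder matrix and the 1% rank threshold are assumptions.

```python
# Sketch of SVD-based rank estimation plus QR with column pivoting (QRcp) for
# selecting the most independent columns (input features or hidden activations).
import numpy as np
from scipy.linalg import qr

X = np.random.default_rng(0).standard_normal((200, 30))  # placeholder feature matrix
X[:, 5] = X[:, 0] + 0.01 * X[:, 1]                        # inject redundancy

s = np.linalg.svd(X, compute_uv=False)
rank = int(np.sum(s > 0.01 * s[0]))       # assumed relative threshold on singular values
print("effective rank:", rank)

# QR with column pivoting orders columns by decreasing "novelty";
# keep the first `rank` pivoted columns as the selected features/nodes.
_, _, piv = qr(X, pivoting=True)
selected = np.sort(piv[:rank])
print("selected columns:", selected)
```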

An Optimization Analysis on an Automotive Component with Fatigue Constraint Using HyperWorks Software for Environmental Sustainability

The finite element analysis (FEA) software HyperWorks is utilized in re-designing an automotive component to reduce its mass. Reducing component mass contributes towards environmental sustainability by saving the world's valuable metal resources and by reducing carbon emissions through improved overall vehicle fuel efficiency. A shape optimization analysis was performed on a rear spindle component. Pre-processing and solving procedures were performed using HyperMesh and RADIOSS, respectively. Shape variables were defined using HyperMorph. Then the optimization solver OptiStruct was utilized with fatigue life set as a design constraint. Since stress-number of cycles (S-N) theory deals with uni-axial stress, the signed von Mises stress on the component was used for looking up damage on the S-N curve, and the Gerber criterion was used for mean stress correction. The optimization analysis resulted in a mass reduction of 24% of the original mass. The study proved that the adopted approach has high potential for use in support of environmental sustainability.
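
To make the mean-stress step concrete: under the Gerber criterion, the equivalent fully reversed stress amplitude used for the S-N look-up can be written as sigma_ar = sigma_a / (1 - (sigma_m / sigma_u)^2). The short sketch below evaluates this relation for illustrative stress values; it is a textbook form of the criterion, not output from the authors' HyperWorks setup.

```python
# Gerber mean-stress correction: equivalent fully reversed amplitude for the
# S-N look-up, sigma_ar = sigma_a / (1 - (sigma_m / sigma_u)**2).
def gerber_equivalent_amplitude(sigma_a, sigma_m, sigma_u):
    """sigma_a: alternating stress, sigma_m: mean stress, sigma_u: ultimate strength (same units)."""
    return sigma_a / (1.0 - (sigma_m / sigma_u) ** 2)

# illustrative numbers only (MPa)
print(gerber_equivalent_amplitude(sigma_a=120.0, sigma_m=80.0, sigma_u=500.0))  # ~123.2
```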

Bridging Quantitative and Qualitative Approaches to Glaucoma Detection

Glaucoma diagnosis involves extracting three features of the fundus image: the optic cup, the optic disc, and the vasculature. Present manual diagnosis is expensive, tedious, and time consuming. A number of studies have been conducted to automate this process. However, the variability between the diagnostic capability of an automated system and that of an ophthalmologist has yet to be established. This paper discusses the efficiency and variability between ophthalmologists' opinions and a digital technique, thresholding. The efficiency and variability measures are based on image quality grading: poor, satisfactory, or good. The images are separated into four channels: gray, red, green, and blue. A scientific investigation was conducted with three ophthalmologists who graded the images based on image quality. The images are thresholded using multithresholding and graded in the same way as done by the ophthalmologists. A comparison between the ophthalmologists' grades and the threshold-based grades is made. The results show that there is small variability between the results of the ophthalmologists and the digital thresholding.
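
A small sketch of the digital side of the comparison, assuming scikit-image's multi-Otsu thresholding applied separately to the gray, red, green, and blue channels; the file name and the choice of three intensity classes are assumptions, and the mapping from the thresholded output to a poor/satisfactory/good grade is not specified in the abstract, so it is left out.

```python
# Sketch: split a fundus image into gray/R/G/B channels and apply multi-Otsu
# thresholding to each (the grade mapping is not given in the abstract).
import numpy as np
from skimage import io, color
from skimage.filters import threshold_multiotsu

rgb = io.imread("fundus.jpg")                        # hypothetical fundus image
channels = {"gray": color.rgb2gray(rgb),
            "red": rgb[..., 0], "green": rgb[..., 1], "blue": rgb[..., 2]}

for name, ch in channels.items():
    thresholds = threshold_multiotsu(ch, classes=3)  # three intensity classes
    regions = np.digitize(ch, bins=thresholds)
    print(name, "thresholds:", thresholds, "region sizes:", np.bincount(regions.ravel()))
```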