A Study of RSCMAC Enhanced GPS Dynamic Positioning

The purpose of this research is to develop and apply the RSCMAC to enhance the dynamic accuracy of the Global Positioning System (GPS). GPS devices provide accurate positioning, speed detection, and a highly precise time standard over more than 98% of the Earth's surface. The overall operation of the Global Positioning System involves 24 GPS satellites in space; signal transmission over two carrier frequencies (Link 1 and Link 2) with two sets of pseudo-random codes (the C/A code and the P code); and ground monitoring stations or client GPS receivers. With signals from only four satellites, the client's position and elevation can be determined rapidly, and the more satellites received, the more accurately the position can be decoded. The standard positioning accuracy of simplified GPS receivers has improved greatly, but because of satellite clock error, tropospheric delay, and ionospheric delay, current measurement accuracy remains at the 5–15 m level. To increase dynamic GPS positioning accuracy, most researchers rely on an inertial navigation system (INS) or on additional sensors or maps for assistance. This research instead exploits the RSCMAC's advantages of fast learning, guaranteed learning convergence, and the capability to solve time-dependent dynamic system problems, combined with a static positioning calibration structure, to improve GPS dynamic accuracy. The improvement is achieved by using the RSCMAC with GPS receivers to collect dynamic error data, predicting the error, and then applying the predicted error to correct the GPS dynamic positioning data. The ultimate aim is to reduce the dynamic positioning error of inexpensive GPS receivers, so that their economic value rises as their accuracy improves.
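The error prediction and correction loop described above can be illustrated with a minimal sketch. A plain one-dimensional CMAC stands in for the RSCMAC, and the tile widths, learning rate, and error data are all hypothetical, not the paper's:

```python
# Minimal CMAC sketch: learn the GPS positioning error as a function of
# time, then subtract the predicted error from a raw reading.
# All parameters (tilings, tile width, learning rate, data) are illustrative.

class CMAC:
    def __init__(self, n_layers=8, resolution=0.5, lr=0.2):
        self.n_layers = n_layers      # number of overlapping tilings
        self.resolution = resolution  # tile width
        self.lr = lr                  # LMS learning rate
        self.weights = {}             # sparse weight table

    def _active_tiles(self, x):
        # each layer is a shifted quantization of the input
        return [(layer, int((x + layer * self.resolution / self.n_layers)
                            // self.resolution))
                for layer in range(self.n_layers)]

    def predict(self, x):
        return sum(self.weights.get(t, 0.0) for t in self._active_tiles(x))

    def train(self, x, target):
        err = target - self.predict(x)
        for t in self._active_tiles(x):
            self.weights[t] = self.weights.get(t, 0.0) + self.lr * err / self.n_layers

# synthetic error profile: the error drifts slowly with time (metres)
model = CMAC()
for epoch in range(50):
    for t in range(20):
        true_error = 5.0 + 0.3 * t
        model.train(float(t), true_error)

raw_fix = 100.0                            # raw GPS coordinate (synthetic)
corrected = raw_fix - model.predict(10.0)  # subtract the predicted error
```

In the paper's scheme, the network would be trained on dynamic error data collected with the GPS receiver, and its prediction subtracted from each raw fix.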

Residue and Temporal Trend of Polychlorinated Biphenyls (PCBs) in Surface Soils from Bacninh, Vietnam

An evaluation of PCB residues in surface soils from Bacninh, Vietnam was carried out. Sixty representative soil samples were collected from the centre of Bacninh and three surrounding districts. The analytical results indicated widespread contamination by total PCBs in Bacninh. In industrial and urban zones, total PCBs concentrations ranged from

Measuring the Level of Housing Defects in the Build-Then-Sell Housing Delivery System

When the Malaysian government announced the implementation of the Build-Then-Sell (BTS) system in 2007, proponents of the BTS argued that this new system would deliver houses with fewer defects. However, there has been no empirical data to support their argument. This study was therefore conducted to measure the level of housing defects in the BTS housing delivery system. A survey was conducted among the occupiers of six BTS residential areas, identified through the media. Because of the small population, all households in these BTS residential areas were asked to participate in the study so that the researcher could collect data concerning defects. A questionnaire was employed as the data collection instrument and distributed to the respondents. The results show that the level of defects in the BTS houses is low, as the rate of defects for all elements is slight. Such a low level of defects apparently affects only the aesthetic value of the houses.

Many-Sided Self Risk Analysis Model for Information Asset to Secure Stability of the Information and Communication Service

Information and communication service providers (ICSP) that are significant in size and provide Internet-based services take administrative, technical, and physical protection measures via the information security check service (ISCS). These protection measures are the minimum action necessary to secure the stability and continuity of the information and communication services (ICS) that they provide. Information assets are essential to providing ICS, so deciding the relative importance of target assets for protection is a critical procedure. The risk analysis model described in this study, designed to decide the relative importance of information assets, evaluates information assets from many angles in order to choose which ones should be given priority for protection. Many-sided risk analysis (MSRS) grades the importance of information assets based on evaluation of major security check items, evaluation of the dependency on the information and communication facility (ICF) and the influence on potential incidents, and evaluation of major items according to their service classification, in order to identify the ISCS target. MSRS could be an efficient risk analysis model to help ICSPs identify their core information assets and apply information protection measures to them first, so that the stability of the ICS can be ensured.

Clarification of Synthetic Juice through Spiral Wound Ultrafiltration Module at Turbulent Flow Region and Cleaning Study

Synthetic juice was clarified through a spiral wound ultrafiltration (UF) membrane module under two different operating conditions, with and without permeate recycle, in the turbulent flow regime, and the performance of the membrane was analyzed during clarification. The synthetic juice was a mixture of deionized water, sucrose, and pectin. The operating conditions were: feed flowrate of 10 lpm, pressure drop of 413.7 kPa, and Reynolds number of 5000. The permeate was analyzed in terms of volume reduction factor (VRF), viscosity (Pa.s), °Brix, TDS (mg/l), electrical conductivity (μS), and turbidity (NTU). It was observed that the permeate flux declined with operating time under both conditions, owing to increasing concentration polarization and growth of a gel layer on the membrane surface. Without permeate recycle, the membrane fouled faster than with recycle: the VRF rose to 5 without recycle, compared with 1.9 with recycle. The higher VRF results from adsorption of solute (pectin) molecules on the membrane surface, and the permeate flux consequently declined with VRF. With permeate recycle, permeate quality remained within acceptable limits. The fouled membrane was cleaned with different agents (deionized water, SDS, and EDTA solution), and cleaning was evaluated in terms of permeability recovery.

Improved Wavelet Neural Networks for Early Cancer Diagnosis Using Clustering Algorithms

Wavelet neural networks (WNNs) have emerged as a vital alternative to the extensively studied multilayer perceptrons (MLPs) since their first implementation. In this paper, we applied various clustering algorithms, namely K-means (KM), Fuzzy C-means (FCM), symmetry-based K-means (SBKM), symmetry-based Fuzzy C-means (SBFCM), and modified point symmetry-based K-means (MPKM), to choose the translation parameters of a WNN. These modified WNNs were then applied to heterogeneous cancer classification using benchmark microarray data and compared against a conventional WNN with random initialization. Experimental results showed that a WNN classifier with the MPKM algorithm is more accurate than the conventional WNN as well as the WNNs with the other clustering algorithms.
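The role of clustering in this design can be sketched in one dimension: the cluster centres found by K-means become the translation parameters of the hidden-layer wavelets. The data, wavelet choice (Mexican hat), and settings below are illustrative; the SBKM/MPKM variants and the microarray data are not reproduced:

```python
import math
import random

random.seed(0)

def kmeans_1d(data, k, iters=20):
    # plain K-means in one dimension; the centres it returns will be
    # used as wavelet translation parameters
    centres = random.sample(data, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            i = min(range(k), key=lambda j: abs(x - centres[j]))
            clusters[i].append(x)
        centres = [sum(c) / len(c) if c else centres[i]
                   for i, c in enumerate(clusters)]
    return sorted(centres)

def mexican_hat(x, t, d=1.0):
    # mother wavelet translated by t and dilated by d
    u = (x - t) / d
    return (1 - u * u) * math.exp(-u * u / 2)

# two well-separated groups of samples (illustrative)
data = [0.1, 0.2, 0.3, 5.1, 5.2, 5.3]
translations = kmeans_1d(data, 2)          # centres near 0.2 and 5.2

# hidden-layer outputs of the WNN for one input sample
hidden = [mexican_hat(0.2, t) for t in translations]
```

Placing the wavelet translations at cluster centres, rather than initializing them at random, aligns the hidden units with where the data actually lies, which is the motivation the paper gives for the clustering step.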

Propagation of Viscous Waves and Activation Energy of Hydrocarbon Fluids

Euler's equation of motion is extended to include the viscosity stress tensor, leading to the formulation of a Navier–Stokes type equation. The latter is linearized and applied to investigate rotational motion, or vorticity, in a viscous fluid. Relations for the velocity of viscous waves and the attenuation parameter are obtained in terms of the viscosity (μ) and the density (ρ) of the fluid. μ and ρ were measured experimentally as functions of temperature for two samples of light and heavy crude oil. These data were used to determine the activation energy, the velocity of the viscous wave, and the attenuation parameter. The shear wave velocity in heavy oil is found to be much larger than in light oil, whereas the attenuation parameter in heavy oil is quite low in comparison with the light one. The activation energy of heavy oil is three times larger than that of light oil.
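The activation energy mentioned above follows from the Arrhenius-type temperature dependence of viscosity, μ = A·exp(E/RT), so two viscosity measurements at different temperatures suffice to estimate E. The viscosity values below are hypothetical, not the paper's measurements:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def activation_energy(mu1, T1, mu2, T2):
    # From mu = A * exp(E / (R * T)):
    #   ln(mu1 / mu2) = (E / R) * (1/T1 - 1/T2)
    return R * math.log(mu1 / mu2) / (1.0 / T1 - 1.0 / T2)

# hypothetical viscosities (Pa.s) at two temperatures (K)
E = activation_energy(0.50, 300.0, 0.05, 350.0)   # ~40.2 kJ/mol
```

A heavier oil shows a steeper drop of viscosity with temperature, which is why its activation energy comes out larger, consistent with the comparison reported above.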

Symbolic Analysis of Large Circuits Using Discrete Wavelet Transform

Symbolic Circuit Analysis (SCA) is a technique used to generate the symbolic expression of a network, and it has become well established in circuit analysis and design. The symbolic expression of a network offers an excellent way to perform frequency response analysis, sensitivity computation, stability measurement, performance optimization, and fault diagnosis. Many approaches have been proposed in the area of SCA, offering different features and capabilities. Numerical interpolation methods are very common in this context, especially those using the Fast Fourier Transform (FFT). The aim of this paper is to present a method for SCA that uses the Wavelet Transform (WT) as a mathematical tool to generate the symbolic expression for large circuits while minimizing the analysis time by reducing the number of computations.

Developing and Implementing Successful Key Performance Indicators

Measurement, and the evaluation of performance that follows it, is an important part of management. This paper focuses on indicators as the basic elements of a performance measurement system. It emphasizes the necessity of establishing requirements for quality indicators so that they can become part of a useful system. It introduces criteria for systematically classifying indicators so that they provide the highest possible informative value as background sources for the search for, analysis, design, and use of indicators. It draws attention to the requirements for indicator quality and, at the same time, deals with some dangers that decrease an indicator's informative value. It proposes a draft set of questions that should be answered when constructing an indicator. Clearly, individual indicators need to be defined exactly in order to stimulate the desired behavior and attain the expected results. In the enclosure, a concrete example of an indicator defined for the specific conditions of a small firm is given. The authors point out that a quality indicator makes it possible to reach the root causes of a problem and to include the established facts in the company information system. At the same time, they emphasize that developing quality indicators is a prerequisite for using the measurement system in management.

Measurement of UHF Signal Strength Propagating from Road Surface with Vehicle Obstruction

Radio wave propagation along the road surface is a major problem in wireless sensor networks for traffic monitoring. In this paper, we compare the received signal strength in two scenarios: (1) an empty road and (2) a road with a vehicle. We investigate the effect of antenna polarization and antenna height on the received signal strength. The transmitting antenna is installed on the road surface, and the received signal is measured 360 degrees around it at a radius of 2.5 meters. Measurement results show fluctuation of the received signal around the transmitting antenna in both scenarios. A vertically polarized receiving antenna yields higher signal strength than a horizontally polarized one. The optimum antenna elevation is 1 meter for both horizontal and vertical polarizations with a vehicle on the road. On the empty road, the received signal level does not vary with elevation once the elevation exceeds 1.5 meters.

Comparison of Imputation Techniques for Efficient Prediction of Software Fault Proneness in Classes

Missing data is a persistent problem in almost all areas of empirical research. Missing data must be treated very carefully, as data plays a fundamental role in every analysis, and improper treatment can distort the analysis or generate biased results. In this paper, we compare and contrast various imputation techniques on data sets with missing values and evaluate them empirically so as to construct quality software models. Our empirical study is based on two public NASA data sets, KC4 and KC1, with 125 cases and 2107 cases respectively and no missing values in the originals. These data sets were used to create Missing at Random (MAR) data. Listwise Deletion (LD), Mean Substitution (MS), interpolation, regression with an error term, and Expectation-Maximization (EM) approaches were then used to compare the effects of the various techniques.
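Two of the simplest techniques compared in the study, listwise deletion and mean substitution, can be sketched as follows. The toy rows below stand in for the NASA data; the interpolation, regression, and EM approaches are not reproduced:

```python
# Toy dataset: rows of [metric1, metric2]; None marks a missing value.
rows = [[1.0, 2.0], [3.0, None], [5.0, 6.0], [None, 4.0]]

def listwise_deletion(data):
    # drop every row containing at least one missing value
    return [r for r in data if None not in r]

def mean_substitution(data):
    # replace each missing value by the column mean of the observed values
    n_cols = len(data[0])
    means = []
    for j in range(n_cols):
        observed = [r[j] for r in data if r[j] is not None]
        means.append(sum(observed) / len(observed))
    return [[means[j] if r[j] is None else r[j] for j in range(n_cols)]
            for r in data]

complete_ld = listwise_deletion(rows)   # only the 2 complete rows remain
complete_ms = mean_substitution(rows)   # 4 rows, gaps filled with means
```

The trade-off the study measures is visible even here: deletion discards half the rows, while mean substitution keeps them at the cost of shrinking the variance of the imputed columns.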

Nanocrystalline Na0.1V2O5.nH2O Xerogel Thin Film for Gas Sensing

A nanocrystalline thin film of Na0.1V2O5.nH2O xerogel obtained by sol-gel synthesis was used as a gas sensor. Its sensing properties for different gases, such as hydrogen and petroleum, and for humidity were investigated. By XRD and TEM, the size of the nanocrystals was found to be 7.5 nm. SEM shows a highly porous structure with submicrometer-sized voids present throughout the sample. FTIR measurements show the different chemical groups identifying the obtained series of gels. According to thermoelectric power and electrical conductivity measurements, the sample is an n-type semiconductor. The sensor response curves from 130 °C to 150 °C show a rapid increase in sensitivity for all gases injected, with low response values during the heating period and rapidly rising, high response values during the cooling period. This result suggests that the material can act as a gas sensor during both heating and cooling.

A Thermal-Shock Fatigue Design of Automotive Heat Exchangers

A method is presented for using thermo-mechanical fatigue analysis as a tool in the design of automotive heat exchangers. Use of infra-red thermography to measure the real thermal history in the heat exchanger reduces the time necessary for calculating design parameters and improves prediction accuracy. Thermal shocks are the primary cause of heat exchanger damage. Thermo-mechanical simulation is based on the mean behavior of the aluminum tubes used in the heat exchanger. An energetic fatigue criterion is used to detect critical zones.

New Hybrid Method to Correct for Wind Tunnel Wall- and Support Interference On-line

Because support interference corrections are not fully understood, engineers mostly rely on expensive dummy measurements or CFD calculations. This paper presents a hybrid method, based on uncorrected wind tunnel measurements combined with fast calculation techniques, to calculate wall interference, support interference, and residual interference (arising when, for example, a support member closely approaches the wind tunnel walls) for any type of wind tunnel and support configuration. The method provides a simple formula for the calculation of the interference gradient. This gradient is based on the uncorrected measurements and a successive calculation of the slopes of the interference-free aerodynamic coefficients. For the latter purpose, a new vortex-lattice routine is developed that corrects the slopes for viscous effects. A test case of a measurement on a wing demonstrates the value of this hybrid method, as trends and orders of magnitude of the interference are correctly determined.

Forest Growth Simulation: Tropical Rain Forest Stand Table Projection

This study describes tree growth for four species groups of commercial timber in Koh Kong province, in Cambodia's tropical rainforest. A simulation for these four groups was successfully developed at 5-year intervals through year 60. Data were obtained from twenty permanent sample plots over thirteen years. The aim of this study was to develop a stand table simulation system of tree growth by species group. Five steps were involved in developing the tree growth simulation: aggregating the tree species into meaningful groups using cluster analysis; allocating the trees to diameter classes by species group; observing the diameter movement of each species group; calculating the diameter growth rate, mortality rate, and recruitment rate using mathematical formulae; and creating the simulation equation by combining those parameters. The results show dissimilarity of diameter growth among the species groups.
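The projection step that these rates feed into can be sketched for a single species group over one 5-year interval. All class counts and rates below are hypothetical:

```python
# One 5-year stand-table projection step for one species group.
# upgrowth[i]: fraction of class-i trees moving up to class i+1 per period
# mortality[i]: fraction of class-i trees dying per period
# recruit: trees entering the smallest class per period (all hypothetical)

def project(stand, upgrowth, mortality, recruit):
    n = len(stand)
    nxt = [0.0] * n
    for i in range(n):
        # trees that neither die nor move up stay in class i
        nxt[i] += stand[i] * (1.0 - upgrowth[i] - mortality[i])
        if i + 1 < n:
            # trees whose diameter growth carries them into the next class
            nxt[i + 1] += stand[i] * upgrowth[i]
    nxt[0] += recruit          # recruitment enters the smallest class
    return nxt

stand     = [120.0, 60.0, 25.0, 8.0]   # trees/ha per diameter class
upgrowth  = [0.10, 0.08, 0.05, 0.0]
mortality = [0.05, 0.04, 0.03, 0.02]
recruit   = 15.0

after_5yr = project(stand, upgrowth, mortality, recruit)
```

Iterating this step twelve times carries the table through year 60, as in the simulation described above.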

Design, Analysis and Modeling of Dual Band Microstrip Loop Antenna Using Defected Ground Plane

Modern wireless communication demands compact, intelligent devices with multitasking capabilities at affordable cost. This paper focuses on a dual band antenna for wireless communication capable of operating in two frequency bands with the same structure. Two resonance frequencies are observed, with the second operating band at approximately 4.2 GHz, about three times the first resonance frequency of 1.5 GHz. The structure is a simple loop of microstrip line with a characteristic impedance of 50 ohms. The proposed antenna is designed using a defected ground structure (DGS) and shows a nearly one-third reduction in size compared with the design without DGS. The antenna was simulated with electromagnetic (EM) simulation software and fabricated using microwave integrated circuit techniques on an RT-Duroid dielectric substrate (εr = 2.22) of thickness H = 15 mils. The fabricated antenna was tested on an automatic network analyzer and shows good agreement with the simulated results. The proposed structure was also modeled as an equivalent electrical circuit and simulated on a circuit simulator, and a theoretical analysis was carried out. The simulated, measured, equivalent-circuit, and theoretical results show good agreement. The bands of operation suit many potential applications in today's wireless communication.

Quantity and Quality Aware Artificial Bee Colony Algorithm for Clustering

The Artificial Bee Colony (ABC) algorithm is a relatively new swarm intelligence technique for clustering. It produces higher quality clusters than other population-based algorithms, but suffers from poor energy efficiency, inconsistent cluster quality, and typically slower convergence. Inspired by the energy-saving foraging behavior of natural honey bees, this paper presents a Quality and Quantity Aware Artificial Bee Colony (Q2ABC) algorithm to improve the cluster quality, energy efficiency, and convergence speed of the original ABC. To evaluate the performance of the Q2ABC algorithm, experiments were conducted on a suite of ten benchmark UCI datasets. The results demonstrate that Q2ABC outperformed ABC and the K-means algorithm in the quality of the clusters delivered.

Emotion Classification for Students with Autism in Mathematics E-learning using Physiological and Facial Expression Measures

Avoiding learning failures caused by emotional problems in students with autism in mathematics e-learning environments has become an important topic in combining special education with information and communications technology. This study presents an adaptive emotional adjustment model for mathematics e-learning for students with autism, addressing the lack of emotional perception in mathematics e-learning systems. An emotion classifier for students with autism was developed by inducing emotions in mathematical learning environments and recording the changes in the students' physiological signals and facial expressions. Using these methods, 58 emotional features were obtained and processed with one-way ANOVA and information gain (IG). After reducing the feature dimension, support vector machines (SVM), k-nearest neighbors (KNN), and classification and regression trees (CART) were used to classify four emotional categories: baseline, happy, angry, and anxious. In testing and comparison, without feature selection the SVM classification accuracy reached 79.3%; after using IG to reduce the feature dimension to 28 features, SVM still achieved a classification accuracy of 78.2%. These results could enhance the effectiveness of e-learning in special education.
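The information gain used above for feature reduction can be sketched with toy binary features standing in for the 58 physiological and facial features (the SVM, KNN, and CART classifiers themselves are not reproduced):

```python
import math

def entropy(labels):
    # Shannon entropy H(Y) of a list of class labels, in bits
    total = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(feature, labels):
    # IG(Y; X) = H(Y) - sum_v p(X = v) * H(Y | X = v)
    total = len(labels)
    cond = 0.0
    for v in set(feature):
        subset = [y for x, y in zip(feature, labels) if x == v]
        cond += (len(subset) / total) * entropy(subset)
    return entropy(labels) - cond

# toy labels and binary features (illustrative, not the study's data)
labels = ["happy", "happy", "angry", "angry"]
feat_a = [1, 1, 0, 0]   # perfectly predicts the label
feat_b = [1, 0, 1, 0]   # independent of the label

ig_a = information_gain(feat_a, labels)  # 1.0 bit
ig_b = information_gain(feat_b, labels)  # 0.0 bits
```

Ranking the 58 features by this score and keeping the top 28 is the dimension-reduction step whose effect on SVM accuracy is reported above.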

Scene Adaptive Shadow Detection Algorithm

Robustness is one of the primary performance criteria for an Intelligent Video Surveillance (IVS) system, and one of the key factors in enhancing the robustness of dynamic video analysis is providing accurate and reliable shadow detection. If left undetected, shadow pixels may cause incorrect object tracking and classification, as shadows tend to distort localization and measurement information. Most of the algorithms proposed in the literature are computationally expensive, some approaching the computational cost of motion detection itself. In this paper, the homogeneity property of shadows is exploited in a novel way for shadow detection: analysis of an adaptive division image (which highlights the homogeneity property of shadows), followed by a relatively simple projection histogram analysis for penumbra suppression, is the key novelty of our approach.
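The division-image idea rests on shadows darkening a pixel multiplicatively, so the ratio of the current frame to the background stays in a narrow band below one. A minimal sketch follows, with fixed illustrative thresholds in place of the paper's adaptive selection and without the projection-histogram stage:

```python
# Division image: ratio of current frame to background model, per pixel.
# A shadow darkens a pixel multiplicatively, so the ratio stays in a
# narrow band below 1; a foreground object changes it arbitrarily.
# LOW/HIGH are illustrative, not the paper's adaptive values.

LOW, HIGH = 0.4, 0.9   # hypothetical shadow ratio band

def classify(background, current):
    out = []
    for b_row, c_row in zip(background, current):
        row = []
        for b, c in zip(b_row, c_row):
            ratio = c / b if b else 1.0
            if LOW <= ratio < HIGH:
                row.append("shadow")     # darker, but texture preserved
            elif ratio >= HIGH:
                row.append("background") # essentially unchanged
            else:
                row.append("object")     # too dark to be a cast shadow
        out.append(row)
    return out

bg  = [[100, 100, 100]]
cur = [[ 98,  60,  10]]   # unchanged, shadowed, occluded by dark object
mask = classify(bg, cur)
```

Because the ratio test is a single division and comparison per pixel, this stage stays far cheaper than the motion-detection pipelines the paper contrasts it with.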

Inspection of Geometrical Integrity of Work Piece and Measurement of Tool Wear by the Use of Photo Digitizing Method

Considering the complexity of modern products, new geometrical designs, and the tolerances required, measurement and dimensional control call for modern, more precise methods. The photo digitizing method, which uses two cameras to record images, together with the conventional "point cloud" technique and data analysis using ATOUS software, is known to be modern and efficient in this context. This paper puts forward the benefits of the photo digitizing method in evaluating samples from machining processes; for example, assessment of the geometrical integrity of surfaces in 5-axis milling and measurement of carbide tool wear in turning. Its advantages over conventional methods are also discussed.