Comparison of MFCC and Cepstral Coefficients as a Feature Set for PCG Biometric Systems

Heart sounds are acoustic signals, and many techniques used nowadays for human recognition tasks borrow from speech recognition. One popular choice for feature extraction from acoustic signals is the Mel Frequency Cepstral Coefficients (MFCC), which map the signal onto a non-linear Mel scale that mimics human hearing. However, the Mel scale is almost linear in the frequency region of heart sounds and thus should produce results similar to standard cepstral coefficients (CC). In this paper, MFCC is investigated to see whether it produces superior results for a PCG-based human identification system compared to CC. Results show that the MFCC system is still superior to CC despite the nearly linear filter banks in the lower frequency range, giving up to a 95% correct recognition rate for MFCC and 90% for CC. Further experiments show that the high recognition rate is due to the implementation of filter banks and not to the Mel scaling.
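
As a rough illustration of the near-linearity point (our own sketch in Python, not the paper's implementation; the frame and coefficient count are arbitrary assumptions), the fragment below contrasts the standard mel mapping with plain cepstral analysis:

    import numpy as np
    from scipy.fft import dct

    def hz_to_mel(f_hz):
        # Standard mel mapping; at heart-sound frequencies it is close to linear.
        return 2595.0 * np.log10(1.0 + f_hz / 700.0)

    def cepstral_coeffs(frame, n_coeffs=13):
        # Plain cepstral coefficients: DCT of the log magnitude spectrum.
        spectrum = np.abs(np.fft.rfft(frame)) + 1e-10   # avoid log(0)
        return dct(np.log(spectrum), norm='ortho')[:n_coeffs]

    # The mel scale deviates little from linearity in the PCG frequency range:
    freqs = np.array([50.0, 100.0, 200.0, 400.0])
    print(hz_to_mel(freqs) / hz_to_mel(50.0))   # stays fairly close to the linear ratios [1, 2, 4, 8]

For frequencies up to a few hundred hertz, where most PCG energy lies, the mel values grow roughly in proportion to frequency, which is why a mel filter bank and a linear filter bank cover this band similarly.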

Speech Enhancement Using Kalman Filter in Communication

In applications such as telecommunications, hands-free communication, and recording, which require at least one microphone, the signal is usually corrupted by noise and echo. An important application is speech enhancement, which aims to suppress the noise and echo picked up by the microphone alongside the desired speech. Accordingly, the microphone signal has to be cleaned using digital signal processing (DSP) tools before it is played out, transmitted, or stored. Engineers have so far tried different approaches to improving speech by recovering the desired speech signal from the noisy observations, especially in mobile communication. In this paper, the speech signal, observed in additive background noise, is reconstructed using the Kalman filter technique: the parameters of an autoregressive (AR) process are estimated in a state-space model, and the output speech signal is obtained in MATLAB. Accurate Kalman estimation enhances the speech and reduces the noise; the actual and estimated values that produce the reconstructed signals are then compared and discussed.
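
A minimal sketch of this kind of state-space Kalman recursion, written in Python rather than MATLAB and with the AR coefficients and noise variances assumed known (in the paper they are estimated), might look as follows:

    import numpy as np

    def kalman_ar_denoise(y, a, q, r):
        # y: noisy speech samples, a: AR coefficients [a1..ap],
        # q: process-noise variance, r: measurement-noise variance.
        p = len(a)
        F = np.vstack([a, np.eye(p - 1, p)])           # companion-form transition matrix
        H = np.zeros((1, p)); H[0, 0] = 1.0            # we observe the current sample plus noise
        Q = np.zeros((p, p)); Q[0, 0] = q
        x, P = np.zeros(p), np.eye(p)
        est = np.empty(len(y))
        for t, yt in enumerate(y):
            x = F @ x                                   # time update (prediction)
            P = F @ P @ F.T + Q
            S = H @ P @ H.T + r                         # innovation covariance
            K = P @ H.T / S                             # Kalman gain
            x = x + (K * (yt - H @ x)).ravel()          # measurement update
            P = (np.eye(p) - K @ H) @ P
            est[t] = x[0]                               # current estimate of the clean sample
        return est

In practice the AR coefficients and the noise variances would themselves be estimated from the noisy speech frames, which is the estimation step the abstract refers to.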

A Game-Theoretic Approach to Hedonic Housing Prices

A property's selling price is described as the result of sequential bargaining between a buyer and a seller in an environment of asymmetric information. Hedonic housing prices are estimated based upon 17,333 records of New Zealand residential properties sold during the years 2006 and 2007.

Biodiesel Production from Palm Oil using Heterogeneous Base Catalyst

In this study, the transesterification of palm oil with methanol for biodiesel production was studied using CaO–ZnO as a heterogeneous base catalyst prepared by incipient-wetness impregnation (IWI) and co-precipitation (CP) methods. The reaction parameters considered were the molar ratio of methanol to oil, the amount of catalyst, the reaction temperature, and the reaction time. The optimum conditions were found to be a 15:1 molar ratio of methanol to oil, a catalyst amount of 6 wt%, a reaction temperature of 60 °C, and a reaction time of 8 h. The effects of Ca loading, calcination temperature, and catalyst preparation on the catalytic performance were studied. The fresh and spent catalysts were characterized by several techniques, including XRD, TPR, and XRF.

SDVAR Algorithm for Detecting Fraud in Telecommunications

This paper presents a procedure for estimating a vector autoregressive (VAR) model using the Sequential Discounting VAR (SDVAR) algorithm for online model learning, in order to detect fraudulent acts in telecommunications call detail records (CDR). The volatility of the VAR is monitored, allowing for non-linearity, outliers, and change points, based on the work of [1]. This paper extends that procedure from univariate to multivariate time series. A simulation and a case study on detecting telecommunications fraud from CDR illustrate the use of the algorithm in the bivariate setting.
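
The exact SDVAR update equations follow [1] and are not reproduced here; purely as an illustration of the underlying idea, the Python sketch below shows an exponentially discounted recursive estimate of a VAR(1) model, with the one-step prediction error used as a simple change/fraud score (the discount rate and initialization are assumptions):

    import numpy as np

    class DiscountedVAR1:
        # Keeps exponentially discounted second moments and refits A = Cyx * Cxx^{-1}.
        def __init__(self, dim, discount=0.02):
            self.r = discount                          # discounting (forgetting) rate
            self.Cxx = np.eye(dim) * 1e-3              # discounted E[x_{t-1} x_{t-1}^T]
            self.Cyx = np.zeros((dim, dim))            # discounted E[x_t x_{t-1}^T]
            self.A = np.zeros((dim, dim))              # current VAR(1) coefficient estimate
            self.prev = None

        def update(self, x):
            x = np.asarray(x, dtype=float)
            score = 0.0
            if self.prev is not None:
                score = float(np.linalg.norm(x - self.A @ self.prev))   # one-step prediction error
                self.Cxx = (1 - self.r) * self.Cxx + self.r * np.outer(self.prev, self.prev)
                self.Cyx = (1 - self.r) * self.Cyx + self.r * np.outer(x, self.prev)
                self.A = self.Cyx @ np.linalg.inv(self.Cxx)
            self.prev = x
            return score                               # large values flag possible change points / fraud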

Motion Detection Techniques Using Optical Flow

Motion detection is very important in image processing. One way of detecting motion is to use optical flow. Optical flow cannot be computed locally, since only one independent measurement is available from the image sequence at a point, while the flow velocity has two components; a second constraint is needed. The method used for computing optical flow in this project assumes that the apparent velocity of the brightness pattern varies smoothly almost everywhere in the image. This technique is later used in developing motion detection software that can carry out four types of motion detection. The software can also highlight motion regions, measure the motion level, and count the number of objects. Many objects, such as vehicles and humans, can be recognized in video streams by applying the optical flow technique.
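
The smoothness assumption described here is the classical Horn-Schunck formulation; a compact Python sketch of that iterative scheme (illustration only; the averaging kernel, the regularization weight alpha, and the iteration count are assumptions) is:

    import numpy as np
    from scipy.ndimage import convolve

    def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
        im1, im2 = im1.astype(float), im2.astype(float)
        Iy, Ix = np.gradient(im1)                    # spatial brightness gradients
        It = im2 - im1                               # temporal brightness derivative
        avg = np.array([[1, 2, 1], [2, 0, 2], [1, 2, 1]]) / 12.0   # neighbourhood average
        u = np.zeros_like(im1)
        v = np.zeros_like(im1)
        for _ in range(n_iter):
            u_bar, v_bar = convolve(u, avg), convolve(v, avg)
            common = (Ix * u_bar + Iy * v_bar + It) / (alpha**2 + Ix**2 + Iy**2)
            u = u_bar - Ix * common                  # enforce brightness constancy ...
            v = v_bar - Iy * common                  # ... while keeping the flow field smooth
        return u, v

The returned (u, v) field can then be thresholded to highlight motion regions or summed to give a simple motion-level measure.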

Effects of pH, Temperature, Enzyme and Substrate Concentration on Xylooligosaccharides Production

Agricultural residues such as oil palm fronds (OPF) are cheap, widespread, and available throughout the year. Hemicelluloses extracted from OPF can be hydrolyzed to their monomers and used in the production of xylooligosaccharides (XOs). The objective of the present study was to optimize the enzymatic hydrolysis of OPF hemicellulose by varying pH, temperature, and enzyme and substrate concentrations for the production of XOs. Hemicellulose was extracted from OPF using 3 M potassium hydroxide (KOH) at 40 °C for 4 h with stirring at 400 rpm. The hemicellulose was then hydrolyzed using Trichoderma longibrachiatum xylanase at different pH values, temperatures, and enzyme and substrate concentrations. XOs were characterized based on reducing sugar determination. The optimum conditions to produce XOs from OPF hemicellulose were pH 4.6, a temperature of 40 °C, an enzyme concentration of 2 U/mL, and a 2% substrate concentration. The results establish the suitability of oil palm fronds as a raw material for the production of XOs.

Foot Anthropometry of Primary School Children in the South of Thailand

The objective of this research was to study the foot anthropometry of children aged 7-12 years in the South of Thailand. Thirty-three dimensions were measured on 305 male and 295 female subjects in three age ranges (7-12 years old). The instrumentation consisted of four devices: an anthropometer, a digital vernier caliper, a digital height gauge, and a measuring tape. The mean values and standard deviations of age, height, and weight of the male subjects were 9.52 (±1.70) years, 137.80 (±11.55) cm, and 37.57 (±11.65) kg; the corresponding values for the female subjects were 9.53 (±1.70) years, 137.88 (±11.55) cm, and 34.90 (±11.57) kg, respectively. Comparison of the 33 measured anthropometric dimensions between male and female subjects showed statistically significant sex differences in size in almost all dimensions.

UD Covariance Factorization for Unscented Kalman Filter using Sequential Measurements Update

The Extended Kalman Filter (EKF) is probably the most widely used estimation algorithm for nonlinear systems. However, not only does it have difficulties arising from linearization, but it also often becomes numerically unstable because of computer round-off errors that occur during its implementation. To overcome the limitations of linearization, the unscented transformation (UT) was developed as a method to propagate mean and covariance information through nonlinear transformations. A Kalman filter that uses the UT to calculate the first two statistical moments is called the Unscented Kalman Filter (UKF). The square-root form of the UKF (SR-UKF) was developed by Rudolph van der Merwe and Eric Wan to achieve numerical stability and guarantee positive semi-definiteness of the Kalman filter covariances. This paper develops another implementation of the SR-UKF for a sequential measurement-update equation, and also derives a new UD covariance factorization filter for the implementation of the UKF. This filter is equivalent to the UKF but is computationally more efficient.
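
For reference, a minimal Python sketch of the unscented transform in its standard textbook form is given below; the scaling parameters are conventional defaults, and the SR-UKF and UD-factorized variants developed in the paper are not reproduced here.

    import numpy as np

    def unscented_transform(f, mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
        n = len(mean)
        lam = alpha**2 * (n + kappa) - n
        S = np.linalg.cholesky((n + lam) * cov)              # matrix square root of the scaled covariance
        sigmas = np.vstack([mean, mean + S.T, mean - S.T])   # 2n + 1 sigma points
        Wm = np.full(2 * n + 1, 0.5 / (n + lam)); Wm[0] = lam / (n + lam)
        Wc = Wm.copy(); Wc[0] += 1.0 - alpha**2 + beta
        Y = np.array([f(s) for s in sigmas])                 # propagate through the nonlinearity
        y_mean = Wm @ Y                                      # weighted mean of transformed points
        dY = Y - y_mean
        y_cov = (Wc[:, None] * dY).T @ dY                    # weighted covariance of transformed points
        return y_mean, y_cov

The UKF applies this transform in both the time update and the measurement update instead of linearizing the system equations.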

Blind Low Frequency Watermarking Method

We present a low-frequency watermarking method that is adaptive to image content. The image content is analyzed and properties of the human visual system (HVS) are exploited to generate a visual mask of the same size as the approximation image. Using this mask, we embed the watermark in the approximation image without degrading the image quality. Watermark detection is performed without using the original image. Experimental results show that the proposed watermarking method is robust against the most common image processing operations, which can be easily implemented and usually do not degrade the image quality.
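
A minimal sketch of this style of embedding in Python with the PyWavelets library (an assumption; the paper does not specify an implementation, and the Haar wavelet, additive rule, and strength parameter are illustrative only) could look like this:

    import numpy as np
    import pywt

    def embed_ll_watermark(image, watermark_bits, mask, strength=2.0):
        # watermark_bits and mask must have the same shape as the LL approximation sub-band.
        LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), 'haar')
        w = np.where(watermark_bits > 0, 1.0, -1.0)          # bipolar watermark
        LL_marked = LL + strength * mask * w                 # mask attenuates embedding where it would be visible
        return pywt.idwt2((LL_marked, (LH, HL, HH)), 'haar')

Blind detection would then correlate the LL sub-band of a test image with the watermark pattern, without needing the original image.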

Balancing Neural Trees to Improve Classification Performance

In this paper, a neural tree (NT) classifier having a simple perceptron at each node is considered. A new concept for building a balanced tree is applied in the learning algorithm of the tree. At each node, if the perceptron's classification is inaccurate and unbalanced, it is replaced by a new perceptron that separates the training set in such a way that almost equal numbers of patterns fall into each class. Moreover, each perceptron is trained only on the classes present at its node, ignoring the other classes. Splitting nodes are introduced into the neural tree architecture to divide the training set when the current perceptron node repeats the classification of its parent node. A new error function based on the depth of the tree is introduced to reduce the computational time for training a perceptron. Experiments are performed to check the efficiency, and encouraging results are obtained in terms of accuracy and computational cost.

Multipurpose Cadastre, Essential for Urban Development Plans in Iran

The majority of research conducted on Iranian urban development plans indicates that they have been largely unsuccessful in terms of drafting, execution, and goal achievement. A lack or shortage of essential statistics and information can be listed as an important reason for the failure of these plans. The lack of figures and information has become an evident shortcoming of the country's statistics authorities. This problem has forced urban planners themselves to embark on physical surveys, including real estate and land pricing and population and economic censuses of the city. Apart from the problems facing urban developers, the possibility of errors is high in such surveys. In the present article, using the interview technique, it is argued that utilizing a multipurpose cadastre system as a land information system is essential for urban development plans in Iran; it can minimize or even remove the failures facing such plans.

Impact of Government Spending on Private Consumption and on the Economy: Case of Thailand

The recent global financial problem urges the government to play a role in stimulating the economy, given that the private sector has little purchasing ability during a recession. A key question is whether the increased government spending crowds out private consumption and whether it helps stimulate the economy. If the government spending policy is effective, private consumption is expected to increase and can compensate for the recent extra government expense. In this study, government spending is categorized into government consumption spending and government capital spending. The study first examines consumer consumption in line with the demand function in microeconomic theory. Three categories of private consumption are used in the study: food consumption, non-food consumption, and services consumption. A dynamic Almost Ideal Demand System of the three categories of private consumption is estimated using a vector error correction model. The estimated model indicates substitution effects (negative impacts) of government consumption spending on the budget share of private non-food consumption and of government capital spending on the budget share of private food consumption, respectively. Nevertheless, the results do not necessarily indicate that the negative effects on the budget shares of non-food and food consumption imply a fall in total private consumption. The microeconomic consumer demand analysis clearly indicates changes in the component structure of aggregate expenditure in the economy as a result of the government spending policy. The macroeconomic concept of aggregate demand, comprising consumption, investment, government spending (government consumption spending and government capital spending), exports, and imports, is then used to estimate their relationships, again with a vector error correction model. The macroeconomic analysis finds no effect of government capital spending on either private consumption or GDP growth, while government consumption spending has a negative effect on GDP growth. Therefore, no crowding-out effect of government spending on private consumption is found, but the spending is ineffective and even inefficient, as it is found to reduce GDP growth in the context of Thailand.
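
For orientation, the budget-share equation underlying the static Almost Ideal Demand System is the standard Deaton-Muellbauer form shown below; the dynamic and error-correction terms actually estimated in the paper are not reproduced here.

    w_{i,t} = \alpha_i + \sum_{j} \gamma_{ij} \ln p_{j,t} + \beta_i \ln\!\left(\frac{x_t}{P_t}\right),
    \qquad i \in \{\text{food},\ \text{non-food},\ \text{services}\}

Here w_{i,t} is the budget share of category i, p_{j,t} are the category prices, x_t is total private consumption expenditure, and P_t is a price index; the government spending variables enter the paper's dynamic specification as additional shifters of these shares.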

Effect of Catalyst Preparation on the Performance of CaO-ZnO Catalysts for Transesterification

In this research, CaO-ZnO catalysts (with Ca:Zn atomic ratios of 1:5, 1:3, 1:1, and 3:1) prepared by incipient-wetness impregnation (IWI) and co-precipitation (CP) methods were used as catalysts in the transesterification of palm oil with methanol for biodiesel production. The catalysts were characterized by several techniques, including the BET method, CO2-TPD, and Hammett indicators. The effects of precursor concentration and calcination temperature on the catalytic performance were studied under reaction conditions of a 15:1 methanol-to-oil molar ratio, 6 wt% catalyst, a reaction temperature of 60 °C, and a reaction time of 8 h. A Ca:Zn atomic ratio of 1:3 gave the highest FAME value, owing to the basic properties and surface area of the prepared catalyst.

Application of Neural Networks for 24-Hour-Ahead Load Forecasting

One of the most important requirements for the operation and planning activities of an electrical utility is the prediction of load from the next hour to several days out, known as short-term load forecasting. This paper presents the development of an artificial neural network based short-term load forecasting model. The model can forecast daily load profiles for the next 24 hours with a lead time of one day. In this method, the days of the year are divided into groups using average temperature, with the groups formed according to the linearity of the load curve, and the final forecast for each group is obtained by considering weekdays and weekends separately. The paper also investigates the effects of temperature and humidity on the consumption curve. For forecasting the load curve of holidays, the peak and valley are forecast first, and then the neural network forecast is re-shaped with these new data. The ANN-based load models are trained using hourly historical load data and daily historical max/min temperature and humidity data. The results of testing the system on data from the Yazd utility are reported.
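
As a minimal sketch of such a model (the feature set, network size, and training data below are placeholders and assumptions, not the paper's configuration), a multi-output feed-forward network mapping day-level features to a 24-hour load profile can be set up in Python as follows:

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # X: one row per historical day, e.g. [avg_temp, avg_humidity, is_weekend, is_holiday]
    # Y: one row per historical day with the 24 hourly loads of the following day
    X = np.random.rand(365, 4)           # placeholder for historical day-level features
    Y = np.random.rand(365, 24)          # placeholder for historical 24-hour load profiles

    model = MLPRegressor(hidden_layer_sizes=(40,), max_iter=2000)
    model.fit(X, Y)                       # multi-output regression: 24 hourly outputs at once
    next_day_profile = model.predict(X[-1:])   # 24-hour-ahead forecast for the most recent day

In a grouped scheme like the one described above, one such network would be trained per day-type group (weekday, weekend, holiday) rather than a single model for all days.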

An Application for Web Mining Systems with Services Oriented Architecture

Although the World Wide Web is considered the largest source of information in existence today, due to its inherently dynamic characteristics the task of finding useful, high-quality information can become a very frustrating experience. This study presents research on Web information mining systems and proposes an implementation of these systems by means of components built with Web services technology. This implies that they can encompass the features offered by a service-oriented architecture (SOA) and that specific components may be used by other tools, independently of platform or programming language. Hence, the main objective of this work is to provide an architecture for Web mining systems, divided into stages, where each stage is a component that incorporates the characteristics of SOA. The separation into stages was designed based upon the existing literature. Interesting results were obtained and are shown here.

A Generalized Framework for Working with Multiagent Systems

The present paper discusses the basic concepts and underlying principles of Multi-Agent Systems (MAS), along with an interdisciplinary exploitation of these principles. MAS have been utilized in a great deal of research and studies on various systems spanning diverse engineering and scientific realms, which shows the need to develop a proper generalized framework. Such a framework has been developed for Multi-Agent Systems and has been generalized keeping in mind the diverse areas where they find application. All the related aspects have been categorized and a general definition has been given wherever possible.

Delay and Energy Consumption Analysis of Conventional SRAM

The energy consumption and delay of the read/write operations of conventional SRAM are investigated analytically as well as by simulation. Explicit analytical expressions for the energy consumption and delay in read and write operations, as functions of device parameters and supply voltage, are derived. The expressions are useful for predicting the effect of parameter changes on energy consumption and speed, as well as for optimizing the design of conventional SRAM. HSPICE simulation in a standard 0.25 μm CMOS technology confirms the accuracy of the analytical expressions derived in this paper.
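
For orientation only, first-order textbook expressions of the kind the paper derives (not the paper's exact results) relate the bitline capacitance C_BL, the supply voltage V_DD, the bitline swing ΔV_BL, and the cell read current I_cell:

    E_{\mathrm{read}} \approx C_{BL}\, V_{DD}\, \Delta V_{BL}, \qquad
    t_{\mathrm{read}} \approx \frac{C_{BL}\, \Delta V_{BL}}{I_{\mathrm{cell}}}, \qquad
    E_{\mathrm{write}} \approx C_{BL}\, V_{DD}^{2}

Expressions of this form make explicit how energy and delay trade off against supply voltage and bitline loading.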

Burstiness Reduction of a Doubly Stochastic AR-Modeled Uniform Activity VBR Video

Stochastic modeling of network traffic is an area of significant research activity for current and future broadband communication networks. Multimedia traffic is statistically characterized by a bursty variable bit rate (VBR) profile. In this paper, we develop an improved model for uniform-activity-level video sources in ATM, using a doubly stochastic autoregressive model driven by an underlying spatial point process. We then examine a number of burstiness metrics, such as the peak-to-average ratio (PAR), the temporal autocovariance function (ACF), and the traffic measurement histogram, and find that the PAR is the most suitable for capturing the burstiness of single-scene video traffic. In the last phase of this work, we analyse the statistical multiplexing of several constant-scene video sources. As expected, this proves advantageous in reducing the burstiness of the traffic, as long as the sources are statistically independent. We observed that the burstiness diminished rapidly, with the largest gain occurring when only around five sources are multiplexed. The novel model used in this paper for characterizing uniform-activity video was thus found to be accurate.
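
As a toy illustration of the PAR metric and of the multiplexing gain discussed above (the AR(1) parameters are arbitrary assumptions, and this is not the paper's doubly stochastic model), consider the following Python sketch:

    import numpy as np

    def ar1_rate(n, phi=0.9, mean=1.0, sigma=0.2, seed=0):
        # Simple AR(1) bit-rate trace around a mean rate; clipped so rates stay non-negative.
        rng = np.random.default_rng(seed)
        x = np.empty(n); x[0] = mean
        for t in range(1, n):
            x[t] = mean + phi * (x[t - 1] - mean) + rng.normal(0.0, sigma)
        return np.clip(x, 0.0, None)

    def par(x):
        return x.max() / x.mean()              # peak-to-average ratio

    single = ar1_rate(10000)
    multiplex5 = sum(ar1_rate(10000, seed=k) for k in range(5))
    print(par(single), par(multiplex5))        # the PAR of the aggregate is noticeably lower

The drop in PAR when independent traces are summed mirrors the multiplexing gain reported in the paper.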

Real-time Interactive Ocean Wave Simulation using Multithread

This research simulates a natural phenomenon, the ocean wave. Our goal is to simulate the ocean wave at a real-time rate, with the water surface interacting with objects. The wave in this research is calm and smooth, caused by the force of the wind above the ocean surface. To make the wave simulation run in real time, GPU and multithreading techniques are used. Since new-generation CPUs for personal computers have multiple cores, they are well suited to multithreading, which utilizes more than one core at a time. The simulation is programmed in C with OpenGL. To make the simulated wave look more realistic, we apply an OpenGL technique called cube mapping (environment mapping) to make the water surface reflective and more realistic.
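
The paper's implementation is in C with OpenGL; purely as a language-neutral sketch of how the height-field computation can be split across threads (the grid size and wave parameters below are arbitrary assumptions, not the paper's), consider:

    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    N = 256                                          # grid resolution
    x = np.linspace(0.0, 100.0, N)
    waves = [(0.3, 0.15, 1.2), (0.2, 0.30, 2.0)]     # (amplitude, wavenumber, angular frequency)

    def height_rows(rows, t):
        # Compute the wave height for a horizontal strip of the grid at time t.
        z = np.zeros((len(rows), N))
        for i, y in enumerate(rows):
            for a, k, w in waves:
                z[i] += a * np.sin(k * (x + y) - w * t)   # simple sum-of-sines wave model
        return z

    def height_field(t, n_threads=4):
        rows = np.linspace(0.0, 100.0, N)
        chunks = np.array_split(rows, n_threads)     # one strip per worker thread
        with ThreadPoolExecutor(max_workers=n_threads) as pool:
            parts = list(pool.map(lambda c: height_rows(c, t), chunks))
        return np.vstack(parts)

    surface = height_field(t=0.5)                    # one frame of the animated surface

In the C implementation each core evaluates its strip of the grid in parallel every frame before the result is handed to OpenGL for rendering.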