Very-high-Precision Normalized Eigenfunctions for a Class of Schrödinger Type Equations

We demonstrate that it is possible to compute wave function normalization constants for a class of Schrödinger-type equations using an algorithm whose cost, measured in the number of eigenfunction evaluations, scales linearly with the desired precision P in decimal digits.
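For reference, the quantity being computed can be illustrated with a minimal sketch (this is not the paper's linear-scaling algorithm; the wave function below is a hypothetical example) in which the normalization constant N is evaluated to a chosen number of decimal digits with arbitrary-precision quadrature:

```python
# A minimal sketch, not the paper's algorithm: high-precision evaluation of a
# normalization constant N for a hypothetical unnormalized wave function
# psi(x) = x * exp(-x), so that the integral of (N*psi)^2 over [0, inf) is 1.
from mpmath import mp, quad, exp, sqrt

mp.dps = 50  # target precision P in decimal digits

def psi(x):
    # hypothetical unnormalized eigenfunction; replace with the solver's output
    return x * exp(-x)

norm_sq = quad(lambda x: psi(x) ** 2, [0, mp.inf])
N = 1 / sqrt(norm_sq)
print(N)  # exact value for this psi is 2, printed to 50 digits
```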

Player Number Localization and Recognition in Soccer Video using HSV Color Space and Internal Contours

Detection of player identity is a challenging task in sports video content analysis. In the case of soccer video, jersey number recognition is an effective and precise solution. Jersey numbers can be considered as scene text, and difficulties in localization and recognition arise from variations in orientation, size, illumination, motion, etc. This paper proposes a new method for player number localization and recognition. By observing the hue, saturation, and value of 50 different jersey examples, we noticed that a combination of low- and high-saturation pixels is most often used to separate the number from the jersey region. An image segmentation method based on this observation is introduced. Then, a novel method for player number localization based on internal contours is proposed. False number candidates are filtered using area and aspect ratio. Before OCR processing, the extracted numbers are enhanced using image smoothing and rotation normalization.
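A minimal OpenCV sketch of the general pipeline described above is given below; the saturation threshold, area limits, and aspect-ratio limits are illustrative assumptions, not the authors' values:

```python
# Sketch: saturation-based segmentation plus internal-contour number candidates,
# assuming an already-cropped player/jersey image; thresholds are illustrative.
import cv2

img = cv2.imread("player_crop.png")
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
sat = hsv[:, :, 1]

# separate low- and high-saturation pixels (number region vs. jersey region)
_, mask = cv2.threshold(sat, 128, 255, cv2.THRESH_BINARY)

# internal contours appear as children in the RETR_CCOMP hierarchy
contours, hierarchy = cv2.findContours(mask, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_SIMPLE)
candidates = []
for i, cnt in enumerate(contours):
    if hierarchy[0][i][3] == -1:          # no parent: external contour, skip
        continue
    x, y, w, h = cv2.boundingRect(cnt)
    area, aspect = w * h, h / float(w)
    if 100 < area < 5000 and 1.0 < aspect < 3.0:   # filter false candidates
        candidates.append((x, y, w, h))

# candidates would then be smoothed, rotation-normalized and passed to OCR
```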

Trajectory Guided Recognition of Hand Gestures having only Global Motions

One very interesting field of research in Pattern Recognition that has gained much attention in recent times is Gesture Recognition. In this paper, we consider a form of dynamic hand gestures that are characterized by total movement of the hand (arm) in space. For these types of gestures, the shape of the hand (palm) during gesturing does not bear any significance. In our work, we propose a model-based method for tracking hand motion in space, thereby estimating the hand motion trajectory. We employ the dynamic time warping (DTW) algorithm for time alignment and normalization of the spatio-temporal variations that exist among samples belonging to the same gesture class. During training, one template trajectory and one prototype feature vector are generated for every gesture class. The features used in our work include some static and dynamic motion trajectory features. Recognition is accomplished in two stages. In the first stage, all unlikely gesture classes are eliminated by comparing the input gesture trajectory to all the template trajectories. In the next stage, the feature vector extracted from the input gesture is compared to all the class prototype feature vectors using a distance classifier. Experimental results demonstrate that our proposed trajectory estimator and classifier are suitable for a Human-Computer Interaction (HCI) platform.
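The DTW alignment used to compare an input trajectory with a class template can be sketched briefly (a generic DTW implementation; the feature definitions and two-stage classifier described above are omitted):

```python
# Minimal dynamic time warping between two 2-D hand trajectories (sequences of
# (x, y) points); shown only to illustrate the alignment/distance step.
import numpy as np

def dtw_distance(traj_a, traj_b):
    a, b = np.asarray(traj_a, float), np.asarray(traj_b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# e.g. a gesture class is eliminated in the first stage when
# dtw_distance(input_traj, class_template) exceeds a per-class threshold
```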

A Content Based Image Watermarking Scheme Resilient to Geometric Attacks

Multimedia security is an incredibly significant area of concern. This paper discusses a robust image watermarking scheme that can withstand geometric attacks. The source image is first moment normalized in order to make it resilient to geometric attacks. The moment-normalized image is then wavelet transformed. The first-level wavelet-transformed image is segmented into blocks of size 8x8, and the product of the mean and standard deviation of each block is computed. The second-level wavelet-transformed image is likewise divided into 8x8 blocks, and the product of the block mean and standard deviation is computed. The difference between the products at the two levels forms the watermark. The watermark is inserted by modulating the coefficients of the mid frequencies. The modulated image is inverse wavelet transformed and inverse moment normalized to generate the watermarked image, which is then ready for transmission. The proposed scheme can be used to validate identification cards and financial instruments. The performance of this scheme has been evaluated using a set of parameters, and experimental results show its effectiveness.
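The block-feature construction can be pictured with a short sketch (PyWavelets; the wavelet choice, block handling, and use of the approximation subbands are assumptions, and the mid-frequency embedding step is not shown):

```python
# Sketch: per-block (mean * std) features at wavelet levels 1 and 2, whose
# difference forms the content-dependent watermark described above.
import numpy as np
import pywt

def block_features(band, block=8):
    h, w = (np.array(band.shape) // block) * block
    feats = []
    for i in range(0, h, block):
        for j in range(0, w, block):
            blk = band[i:i + block, j:j + block]
            feats.append(blk.mean() * blk.std())
    return np.array(feats)

image = np.random.rand(256, 256)        # stands in for the moment-normalized image
LL1, _ = pywt.dwt2(image, "haar")       # first-level approximation
LL2, _ = pywt.dwt2(LL1, "haar")         # second-level approximation

f1, f2 = block_features(LL1), block_features(LL2)
watermark = f1[:len(f2)] - f2           # difference of the two products forms the watermark
```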

Arterial Stiffness Detection Depending on Neural Network Classification of the Multi-Input Parameters

Diagnosis and detection of arterial stiffness are very important, since arterial stiffness indicates an associated increased risk of cardiovascular disease. To provide a cheap and easy general screening technique for avoiding future cardiovascular complications caused by rising arterial stiffness, an algorithm based on the photoplethysmogram is proposed. The photoplethysmograph signals are processed in MATLAB: the signal is filtered, baseline wander is removed, peaks and valleys are detected, and the signals are normalized. The area under the catacrotic phase of the photoplethysmogram pulse curve is calculated using the trapezoidal rule and then used, together with other parameters such as age, height, and blood pressure, as input to a neural network for arterial stiffness detection. The implemented neural network achieved a sensitivity of 80%, an accuracy of 85%, and a specificity of 90% on the patient data. It is concluded that a neural network can detect arterial stiffness based on risk-factor parameters.
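A sketch of the feature-plus-classifier idea follows (scikit-learn rather than MATLAB; the catacrotic-area computation, input features, labels, and sample values are simplified assumptions):

```python
# Sketch: trapezoidal area under the catacrotic (descending) part of a PPG pulse,
# combined with age/height/blood pressure as inputs to a small neural network.
import numpy as np
from sklearn.neural_network import MLPClassifier

def catacrotic_area(pulse, fs):
    pulse = (pulse - pulse.min()) / (pulse.max() - pulse.min())   # normalization
    peak = int(np.argmax(pulse))
    t = np.arange(len(pulse)) / fs
    return np.trapz(pulse[peak:], t[peak:])      # trapezoidal rule after the systolic peak

# hypothetical training rows: [catacrotic area, age, height (m), systolic BP]
X = np.array([[0.21, 64, 1.70, 150],
              [0.35, 25, 1.80, 115],
              [0.19, 70, 1.65, 160],
              [0.33, 30, 1.75, 118]])
y = np.array([1, 0, 1, 0])                       # 1 = stiff, 0 = normal (labels assumed)

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
print(clf.predict([[0.22, 60, 1.68, 148]]))
```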

Evaluation of Clustering Based on Preprocessing in Gene Expression Data

Microarrays have become effective, broadly used tools in biological and medical research addressing a wide range of problems, including the classification of disease subtypes and tumors. Many statistical methods are available for analyzing and systematizing these complex data into meaningful information, and one of the main goals in analyzing gene expression data is the detection of samples or genes with similar expression patterns. In this paper, we present and compare the performance of several clustering methods under different data preprocessing strategies, including normalization and noise removal. We also evaluate each of these clustering methods with validation measures for both simulated data and real gene expression data. We find that the clustering methods commonly used in microarray data analysis are affected by normalization and by the degree of noise in the datasets.
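One such comparison can be sketched as follows (simulated data, an assumed standardization step, k-means as the clustering method, and the silhouette score as the validation measure; these choices are illustrative, not the paper's exact protocol):

```python
# Sketch: compare clustering of simulated expression profiles with and without
# normalization, scored by a cluster-validation measure (silhouette).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# three simulated expression groups of 50 genes x 20 samples each
genes = np.vstack([rng.normal(m, 1.0, size=(50, 20)) for m in (0.0, 3.0, 6.0)])

for name, X in [("raw", genes), ("normalized", StandardScaler().fit_transform(genes))]:
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    print(name, round(silhouette_score(X, labels), 3))
```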

SMaTTS: Standard Malay Text to Speech System

This paper presents a rule-based text-to-speech (TTS) synthesis system for Standard Malay (SM), namely SMaTTS. The proposed system uses a sinusoidal method and some pre-recorded wave files to generate speech. The use of a phone database significantly decreases the amount of computer memory used, making the system very light and embeddable. The overall system comprises two phases. The first is Natural Language Processing (NLP), which consists of the high-level processing of text analysis, phonetic analysis, text normalization, and a morphophonemic module. This module was designed specially for SM to overcome several problems in defining rules for the SM orthography system before text can be passed to the DSP module. The second phase is Digital Signal Processing (DSP), which operates on the low-level process of speech waveform generation. An intelligible and adequately natural-sounding formant-based speech synthesis system with a light and user-friendly Graphical User Interface (GUI) is introduced. An SM phoneme set and an inclusive phone database have been constructed carefully for this phone-based speech synthesizer. By applying generative phonology, comprehensive letter-to-sound (LTS) rules and a pronunciation lexicon have been developed for SMaTTS. For the evaluation, a Diagnostic Rhyme Test (DRT) word list was compiled and several experiments were performed to evaluate the quality of the synthesized speech by analyzing the Mean Opinion Score (MOS) obtained. The overall performance of the system, as well as the room for improvement, is thoroughly discussed.

An Improved QRS Complex Detection for Online Medical Diagnosis

This paper presents work on signal discrimination, specifically for the Electrocardiogram (ECG) waveform. The ECG signal comprises P, QRS, and T waves in each normal heartbeat, describing a pattern of heart rhythm specific to an individual. Further medical diagnosis can then be performed to determine heart-related diseases from the ECG information. The emphasis on QRS complex classification is discussed further to illustrate its importance. The Pan-Tompkins algorithm, a widely known technique, has been adapted to realize the QRS complex classification process. There are eight steps involved, namely sampling, normalization, low-pass filtering, high-pass filtering (together forming a band-pass filter), differentiation, squaring, averaging, and lastly QRS detection. The simulation results obtained are presented in a Graphical User Interface (GUI) developed using MATLAB.
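A condensed sketch of those steps is shown below (SciPy rather than MATLAB; the filter order, cutoffs, window length, and thresholds are illustrative, not the exact values used in the paper):

```python
# Condensed Pan-Tompkins-style pipeline: normalize, band-pass, differentiate,
# square, moving-window integrate, then threshold to find candidate QRS peaks.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_qrs(ecg, fs):
    ecg = (ecg - np.mean(ecg)) / np.max(np.abs(ecg))          # normalization
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)                            # band-pass (LP + HP)
    diff = np.diff(filtered)                                  # differentiation
    squared = diff ** 2                                       # squaring
    window = int(0.150 * fs)                                  # ~150 ms averaging window
    integrated = np.convolve(squared, np.ones(window) / window, mode="same")
    peaks, _ = find_peaks(integrated,
                          height=0.3 * np.max(integrated),
                          distance=int(0.25 * fs))            # refractory spacing
    return peaks                                              # candidate QRS locations
```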

Protective Effect of Ethanolic Extract of Polyherbal Formulation on Carbon Tetrachloride Induced Liver Injury

The protective effect of an ethanolic extract of a polyherbal formulation (PHF) was studied on carbon tetrachloride-induced liver damage. Treatment of rats with 250 mg/kg body weight of the ethanolic extract of PHF protected them against carbon tetrachloride liver injury by significantly lowering 5'-nucleotidase (5'NT), gamma-glutamyl transferase (GGT), glutamate dehydrogenase (GDH), and succinate dehydrogenase (SDH) levels compared to control. Normalization of these enzyme levels indicates a strong hepatoprotective property of the PHF extract.

A Normalization-based Robust Image Watermarking Scheme Using SVD and DCT

Digital watermarking is one of the techniques for copyright protection. In this paper, a normalization-based robust image watermarking scheme which encompasses singular value decomposition (SVD) and discrete cosine transform (DCT) techniques is proposed. In the proposed scheme, the host image is first normalized to a standard form and divided into non-overlapping image blocks, and SVD is applied to each block. By concatenating the first singular values (SVs) of adjacent blocks of the normalized image, an SV block is obtained. DCT is then carried out on the SV blocks to produce SVD-DCT blocks. A watermark bit is embedded in the high-frequency band of an SVD-DCT block by imposing a particular relationship between two pseudo-randomly selected DCT coefficients. An adaptive frequency mask is used to adjust the local watermark embedding strength. Watermark extraction mainly involves the inverse process and is blind and efficient. Experimental results show that the quality degradation of the watermarked image caused by the embedded watermark is visually transparent. Results also show that the proposed scheme is robust against various image processing operations and geometric attacks.
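The block-SVD and SVD-DCT construction can be sketched as follows (the coefficient positions, margin, and embedding rule are simplified assumptions, and the adaptive frequency mask is omitted):

```python
# Sketch: first singular value of each 8x8 block of a (normalized) image, grouped
# into an SV block and DCT-transformed; one bit is embedded by forcing an order
# relation between two chosen high-frequency coefficients.
import numpy as np
from scipy.fftpack import dct, idct

def first_svs(img, block=8):
    h, w = (np.array(img.shape) // block) * block
    return np.array([np.linalg.svd(img[i:i+block, j:j+block], compute_uv=False)[0]
                     for i in range(0, h, block) for j in range(0, w, block)])

img = np.random.rand(256, 256)            # stands in for the normalized host image
sv = first_svs(img)
sv_block = sv[:64].reshape(8, 8)          # one SV block built from adjacent image blocks
coeffs = dct(dct(sv_block, axis=0, norm="ortho"), axis=1, norm="ortho")

# embed bit = 1 by imposing coeffs[a] >= coeffs[b] + margin at two chosen positions
a, b, margin, bit = (6, 7), (7, 6), 0.5, 1
if bit == 1 and coeffs[a] < coeffs[b] + margin:
    coeffs[a], coeffs[b] = coeffs[b] + margin, coeffs[a]
sv_block_marked = idct(idct(coeffs, axis=1, norm="ortho"), axis=0, norm="ortho")
```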

Normalization Discriminant Independent Component Analysis

In face recognition, feature extraction techniques attempt to find an appropriate representation of the data. However, when the feature dimension is larger than the sample size, performance degrades. Hence, we propose a method called Normalization Discriminant Independent Component Analysis (NDICA). The input data are regularized to obtain the most reliable features and then processed using Independent Component Analysis (ICA). The proposed method is evaluated on three face databases: Olivetti Research Ltd (ORL), Face Recognition Technology (FERET), and Face Recognition Grand Challenge (FRGC). NDICA shows its effectiveness compared with other unsupervised and supervised techniques.
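A rough sketch of the regularize-then-ICA idea follows (scikit-learn's FastICA stands in for the ICA step; the regularized normalization shown is an illustrative assumption, not the paper's exact formulation, and the discriminant stage is omitted):

```python
# Sketch: regularized normalization of face vectors followed by ICA feature extraction.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
faces = rng.random((40, 1024))          # 40 face images of 32x32 pixels, flattened

# simple regularized normalization: center, then scale by (std + epsilon)
eps = 1e-3
X = (faces - faces.mean(axis=0)) / (faces.std(axis=0) + eps)

ica = FastICA(n_components=20, random_state=0)
features = ica.fit_transform(X)         # per-image feature vectors for classification
print(features.shape)                   # (40, 20)
```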

A New High Speed Neural Model for Fast Character Recognition Using Cross Correlation and Matrix Decomposition

Neural processors have shown good results for detecting a certain character in a given input matrix. In this paper, a new idea to speed up the operation of neural processors for character detection is presented. Such processors are designed based on cross-correlation in the frequency domain between the input matrix and the weights of the neural networks. This approach is developed to reduce the computation steps required by these faster neural networks for the searching process. The principle of the divide-and-conquer strategy is applied through image decomposition: each image is divided into small sub-images, and each one is tested separately by a single faster neural processor. Furthermore, faster character detection is obtained by using parallel processing techniques to test the resulting sub-images at the same time with the same number of faster neural networks. In contrast to using only faster neural processors, the speed-up ratio increases with the size of the input image when faster neural processors are combined with image decomposition. Moreover, the problem of local sub-image normalization in the frequency domain is solved, and the effect of image normalization on the speed-up ratio of character detection is discussed. Simulation results show that local sub-image normalization through weight normalization is faster than sub-image normalization in the spatial domain. The overall speed-up ratio of the detection process is further increased because the normalization of weights is done off-line.
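The frequency-domain cross-correlation at the heart of such detectors, applied per sub-image, can be sketched briefly (the weight matrix, sub-image size, and detection threshold below are illustrative placeholders):

```python
# Sketch: correlate network weights with each sub-image via the FFT
# (cross-correlation in the frequency domain), the core operation being sped up.
import numpy as np

def fft_cross_correlate(sub_image, weights):
    # zero-pad weights to the sub-image size; correlation = product with conjugate
    h, w = sub_image.shape
    W = np.fft.fft2(weights, s=(h, w))
    S = np.fft.fft2(sub_image)
    return np.real(np.fft.ifft2(S * np.conj(W)))

image = np.random.rand(256, 256)
weights = np.random.rand(20, 20)          # stands in for trained hidden-layer weights

# divide and conquer: test each 64x64 sub-image separately (these tests could run in parallel)
for i in range(0, 256, 64):
    for j in range(0, 256, 64):
        response = fft_cross_correlate(image[i:i+64, j:j+64], weights)
        if response.max() > 50:           # illustrative detection threshold
            print("candidate character near sub-image", (i, j))
```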

Systematic Study of the p, d and 3He Elastic Scattering on 6Li

The elastic scattering of protons, deuterons, and 3He on 6Li at different incident energies has been analyzed in the framework of the optical model using the ECIS88 and SPI GENOA codes. The potential parameters were extracted from a phenomenological treatment of the angular distributions measured by us and of literature data. Good agreement between theoretical and experimental differential cross sections was obtained over the whole angular range. Parameters for the real part of the potential have also been calculated microscopically with single- and double-folding models for the p and for the d, 3He scattering, respectively, using the DFPOT code. For best agreement with experiment, the normalization factor N for the potential depth lies in the range 0.7-0.9.

Relative Radiometric Correction of Cloudy Multitemporal Satellite Imagery

Repeated observation of a given area over time yields potential for many forms of change detection analysis. Such repeated observations are confounded in terms of radiometric consistency due to changes in sensor calibration over time, differences in illumination and observation angles, and variation in atmospheric effects. This paper demonstrates the applicability of an empirical relative radiometric normalization method to a set of multitemporal cloudy images acquired by the Resourcesat-1 LISS III sensor. The objective of this study is to detect and remove cloud cover and to normalize the images radiometrically. Cloud detection is achieved using the Average Brightness Threshold (ABT) algorithm. The detected cloud is removed and replaced with data from other images of the same area. After cloud removal, the proposed normalization method is applied to reduce the radiometric influence caused by non-surface factors. This process identifies landscape elements whose reflectance values are nearly constant over time; that is, the subset of non-changing pixels is identified using a frequency-based correlation technique. The quality of the radiometric normalization is statistically assessed by the R2 value and mean square error (MSE) between each pair of analogous bands.
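The normalization step itself reduces to a per-band linear mapping fitted over the invariant pixels; a compact sketch follows (the invariant-pixel selection by frequency-based correlation is replaced here by an assumed boolean mask):

```python
# Sketch: relative radiometric normalization of a subject band to a reference band
# using a linear fit over pixels assumed invariant; R^2 and MSE assess the fit.
import numpy as np

def normalize_band(subject, reference, invariant_mask):
    x = subject[invariant_mask].astype(float)
    y = reference[invariant_mask].astype(float)
    gain, offset = np.polyfit(x, y, 1)          # least-squares linear mapping
    normalized = gain * subject + offset

    resid = y - (gain * x + offset)
    mse = np.mean(resid ** 2)
    r2 = 1 - resid.var() / y.var()
    return normalized, gain, offset, r2, mse

# usage with hypothetical arrays: subject/reference are co-registered bands from two
# dates, and invariant_mask marks the non-changing pixels found by the correlation step
```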

Rare Earth Elements in Soils of Jharia Coal Field

There are many sources through which soil becomes enriched in and contaminated with rare earth elements (REEs). The determination of REEs in environmental samples has been limited by the lack of sensitive analytical techniques. Soil samples were collected from four sites, including an open-cast coal mine, a natural coal burning site, a coal washery, and a control, in the coal field located in Dhanbad, India. Total concentrations of REEs were determined using inductively coupled plasma atomic absorption spectrometry in order to assess the enrichment status of the coal field. Results showed that the mean concentrations of La, Pr, Eu, Tb, Ho, and Tm in the open-cast mine and natural coal burning sites were elevated compared to the reference concentrations, while Ce, Nd, Sm, and Gd were elevated in the coal washery site. When compared to the reference soil, heavy REEs (HREEs) were enriched in soils affected by open-cast mining and natural coal burning, whereas the HREEs were depleted at the coal washery site. The chondrite-normalization diagram, however, showed significant enrichment of light REEs (LREEs) in all the soils. High concentrations of Pr, Eu, Tb, Ho, Tm, and Lu in the coal mining and coal burning sites may pose human health risks. Factor analysis showed that the distribution and relative abundance of REEs at the coal washery site are comparable with the control. Thus, washing or cleaning of coal could significantly decrease the emission of REEs from coal into the environment.
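For clarity, chondrite normalization simply divides each measured concentration by the corresponding chondrite reference value, element by element; a tiny sketch follows (the numbers shown are placeholders, not the study's measured values or its reference set):

```python
# Sketch of chondrite normalization: sample REE concentration divided by the
# chondrite reference value for each element; all numbers below are placeholders.
chondrite_ppm = {"La": 0.237, "Ce": 0.613, "Pr": 0.093, "Nd": 0.457}   # placeholder reference
sample_ppm   = {"La": 32.0,  "Ce": 68.0,  "Pr": 7.9,   "Nd": 29.5}     # placeholder soil values

normalized = {el: sample_ppm[el] / chondrite_ppm[el] for el in sample_ppm}
print(normalized)   # these ratios are then plotted in element order to form the diagram
```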

Systholic Boolean Orthonormalizer Network in Wavelet Domain for Microarray Denoising

We describe a novel method for removing noise of unknown variance from microarrays in the wavelet domain. The method is based on the following procedure: 1) apply the bidimensional Discrete Wavelet Transform (DWT-2D) to the noisy microarray; 2) apply scaling and rounding to the coefficients of the highest subbands (to obtain integer, positive coefficients); 3) apply bit-slicing to the new highest subbands (to obtain bit-planes); 4) apply the Systholic Boolean Orthonormalizer Network (SBON) to the input bit-plane set, obtaining two orthonormal output bit-plane sets (in a Boolean sense), and project one set onto the other by means of an AND operation; 5) re-assemble the bit-planes; 6) rescale; and finally, 7) apply the inverse DWT-2D and reconstruct the microarray from the modified wavelet coefficients. Denoising results compare favorably with most methods currently in use.
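The wavelet and bit-plane bookkeeping around that procedure can be sketched as follows (PyWavelets; only steps 1-3, 6, and 7 are shown on one detail subband, the wavelet choice is an assumption, and the SBON projection of step 4-5 is represented by a hypothetical placeholder):

```python
# Sketch of steps 1)-3), 6) and 7): DWT-2D, scaling/rounding, bit-slicing,
# rescaling and inverse DWT-2D; the SBON + AND projection is only a placeholder.
import numpy as np
import pywt

def bit_planes(band, bits=8):
    scale = (2 ** bits - 1) / band.max() if band.max() > 0 else 1.0
    q = np.rint(np.abs(band) * scale).astype(np.uint16)   # integer, positive coefficients
    return [(q >> b) & 1 for b in range(bits)], scale, np.sign(band)

microarray = np.random.rand(128, 128)                     # stands in for the noisy image
cA, (cH, cV, cD) = pywt.dwt2(microarray, "db2")           # 1) DWT-2D

planes, scale, signs = bit_planes(cD)                     # 2)-3) scale, round, bit-slice
# planes = sbon_project(planes)   <-- 4)-5) hypothetical SBON + AND projection step
q = sum(p.astype(np.uint16) << b for b, p in enumerate(planes))
cD_denoised = signs * (q / scale)                         # 6) rescale

denoised = pywt.idwt2((cA, (cH, cV, cD_denoised)), "db2") # 7) inverse DWT-2D
```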

Normalization and Constrained Optimization of Measures of Fuzzy Entropy

In the information theory literature there is a need to compare different measures of fuzzy entropy, which consequently gives rise to the need for normalized measures of fuzzy entropy. In this paper, we discuss this need and develop some normalized measures of fuzzy entropy. It is also desirable to maximize entropy and to minimize directed divergence or distance; keeping this idea in mind, we explain a method for optimizing different measures of fuzzy entropy under constraints.
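As one standard illustration of the normalization idea (not necessarily one of the specific measures developed in the paper), the De Luca-Termini fuzzy entropy of a fuzzy set A can be divided by its maximum value, attained when every membership value equals 1/2:

```latex
% De Luca-Termini fuzzy entropy of A with membership values \mu_A(x_i),
% and its normalized form obtained by dividing by the maximum value n\ln 2.
\[
  H(A) = -\sum_{i=1}^{n}\Bigl[\mu_A(x_i)\ln\mu_A(x_i) + \bigl(1-\mu_A(x_i)\bigr)\ln\bigl(1-\mu_A(x_i)\bigr)\Bigr],
  \qquad
  H_{\mathrm{norm}}(A) = \frac{H(A)}{n\ln 2}\in[0,1].
\]
```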

A Robust Image Watermarking Scheme using Image Moment Normalization

Multimedia security is an incredibly significant area of concern. A number of papers on robust digital watermarking have been presented, but no standards have been defined so far, so multimedia security remains an open problem. The aim of this paper is to design a robust image watermarking scheme which can withstand a diverse set of attacks. The proposed scheme provides a robust solution integrating image moment normalization, a content-dependent watermark, and the discrete wavelet transform. Moment normalization makes it possible to recover the watermark even under geometric attacks. Content-dependent watermarks are a powerful means of authentication, since the data is watermarked with its own features. The discrete wavelet transform is used because it describes image features well. The proposed scheme finds its place in validating identification cards and financial instruments.
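The moment-normalization idea that underpins the geometric robustness can be sketched briefly (a simplified translation-and-scale normalization built from OpenCV image moments; full schemes of this kind typically use a richer affine normalization, and the target spread value below is an assumption):

```python
# Sketch: translation- and scale-normalize an image using its geometric moments,
# so the same canonical form is reached before and after simple geometric distortions.
import cv2
import numpy as np

def moment_normalize(gray, out_size=256, beta=96.0):
    m = cv2.moments(gray)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]      # intensity centroid
    spread = np.sqrt((m["mu20"] + m["mu02"]) / m["m00"])   # radial spread of intensity
    scale = beta / spread                                   # map spread to a fixed value
    T = np.float32([[scale, 0, out_size / 2 - scale * cx],
                    [0, scale, out_size / 2 - scale * cy]])
    return cv2.warpAffine(gray, T, (out_size, out_size))

# usage: canonical = moment_normalize(cv2.imread("host.png", cv2.IMREAD_GRAYSCALE))
```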

Phenomenological and Semi-microscopic Analysis for Elastic Scattering of Protons on 6,7Li

An analysis of the elastic scattering of protons on 6,7Li nuclei has been carried out in the framework of the optical model at beam energies up to 50 MeV. Differential cross sections for 6,7Li + p scattering were measured over the proton laboratory-energy range from 400 to 1050 keV. The 6,7Li + p elastic scattering data at different incident proton energies have been analyzed using the single-folding model. In each case the real potential obtained from the folding model was supplemented by a phenomenological imaginary potential, and during the fitting process the real potential was normalized and the imaginary potential optimized. The normalization factor NR is found to lie in the range between 0.70 and 0.84.

Using Automated Database Reverse Engineering for Database Integration

One important problem in today's organizations is the existence of non-integrated information systems, with inconsistency and a lack of suitable correlation between legacy and modern systems. One main solution is to transfer the local databases into a global one. For this we need to extract the data structures from the legacy systems and integrate them with the new-technology systems. In legacy systems, huge amounts of data are stored in legacy databases; these require particular attention, since they need more effort to be normalized, reformatted, and moved to modern database environments. Designing the new integrated (global) database architecture and applying reverse engineering both require data normalization. This paper proposes the use of database reverse engineering in order to integrate legacy and modern databases in organizations. The suggested approach consists of methods and techniques for generating the data transformation rules needed for data structure normalization.