Abstract: Real-world Speaker Identification (SI) applications
differ from ideal or laboratory conditions: perturbations cause a
mismatch between the training and testing environments and degrade
performance drastically. Many strategies have been adopted to cope
with acoustic degradation; the wavelet-based Bayesian marginal model
is one of them. However, Bayesian marginal models cannot capture the
statistical dependencies between different wavelet scales. Simple
nonlinear estimators for wavelet-based denoising assume that the
wavelet coefficients in different scales are independent, whereas in
fact they exhibit significant inter-scale dependency. This paper
exploits this inter-scale dependency by modeling it with a Circularly
Symmetric Probability Density Function (CS-PDF) from the family of
Spherically Invariant Random Processes (SIRPs) in the Log Gabor
Wavelet (LGW) domain, and the corresponding joint shrinkage
estimator is derived via Maximum a Posteriori (MAP) estimation. On
this basis, a framework is proposed to denoise speech signals for
automatic speaker identification. The robustness of the proposed
framework is tested for text-independent speaker identification on
100 speakers of the POLYCOST database and 100 speakers of the
YOHO database in three different noise environments. Experimental
results show that the proposed estimator yields a higher improvement
in identification accuracy than other estimators with the popular
Gaussian Mixture Model (GMM) speaker model and Mel-Frequency
Cepstral Coefficient (MFCC) features.
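The joint-shrinkage idea can be illustrated with the well-known bivariate (parent-child) MAP shrinkage rule of the Şendur-Selesnick type, which, like the estimator described above, follows from a circularly symmetric non-Gaussian prior across two wavelet scales. This is a minimal numpy sketch, not the paper's LGW-domain estimator; the noise and signal deviations `sigma_n` and `sigma` are assumed known here.

```python
import numpy as np

def bivariate_shrink(child, parent, sigma_n, sigma):
    """Joint MAP shrinkage exploiting parent-child inter-scale dependency
    (Sendur-Selesnick style bivariate rule): the joint magnitude of a
    coefficient and its parent is soft-thresholded, so a weak child with a
    strong parent survives while isolated noise is suppressed."""
    mag = np.sqrt(child**2 + parent**2)
    thr = np.sqrt(3.0) * sigma_n**2 / sigma        # threshold grows with noise power
    gain = np.maximum(mag - thr, 0.0) / np.maximum(mag, 1e-12)
    return gain * child

# toy coefficients: indices 1 and 2 carry signal (strong parents),
# indices 0 and 3 are isolated noise
child  = np.array([0.05, 2.0, -1.5, 0.02])
parent = np.array([0.01, 1.8, -1.2, 0.01])
denoised = bivariate_shrink(child, parent, sigma_n=0.3, sigma=1.0)
```

Noise-only coefficients are zeroed while signal-bearing ones are only mildly shrunk.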
Abstract: A new combinatorial model for analyzing and
interpreting an electrocardiogram (ECG) is presented. One application
of the model is QRS peak detection, demonstrated here with an
online algorithm that is shown to be both space- and time-efficient.
Experimental results on the MIT-BIH Arrhythmia database show that
this novel approach is promising. Further uses for this approach are
discussed, such as taking advantage of its small memory requirements
and interpreting large amounts of pre-recorded ECG data.
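The paper's combinatorial model is not spelled out in the abstract, but the flavor of a small-memory online QRS peak picker can be sketched with a fixed-size buffer; the window length and threshold factor below are illustrative assumptions, not the paper's parameters.

```python
from collections import deque

def online_qrs_peaks(samples, win=8, k=2.0):
    """Online peak picker with O(win) memory: a sample is flagged as a peak
    when it is a local maximum that exceeds the running mean of the last
    `win` samples by a factor `k`. A generic sketch, not the paper's model."""
    buf = deque(maxlen=win)        # bounded history -> small memory footprint
    peaks, prev, rising = [], None, False
    for i, x in enumerate(samples):
        mean = sum(buf) / len(buf) if buf else 0.0
        if prev is not None:
            if x < prev and rising and prev > k * max(mean, 1e-9):
                peaks.append(i - 1)          # previous sample was a local max
            rising = x > prev
        buf.append(x)
        prev = x
    return peaks

sig = [0, 1, 0, 0, 10, 0, 0, 1, 0, 12, 1, 0]
peaks = online_qrs_peaks(sig)      # the two tall spikes at indices 4 and 9
```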
Abstract: In this paper, a novel corner detection method is
presented to stably extract geometrically important corners.
Intensity-based corner detectors such as the Harris detector can find
corners in noisy environments but localize them inaccurately and
miss corners with obtuse angles. Edge-based corner detectors such
as Curvature Scale Space can detect structural corners but are
unstable because edge detection is incomplete in noisy
environments. The proposed image-based direct curvature estimation
overcomes both the inaccurate corner localization of the
(intensity-based) Harris detector and the unstable corner
detection of Curvature Scale Space caused by incomplete edge
detection. Various experimental results validate the robustness of the
proposed method.
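For reference, the intensity-based Harris response that the abstract compares against can be sketched as follows; the 3×3 box window and k = 0.05 are conventional choices, not values from the paper.

```python
import numpy as np

def harris_response(img, k=0.05):
    """Harris corner response R = det(M) - k*trace(M)^2, where M is the
    window-averaged structure tensor built from image gradients.
    R peaks at corners, is negative along edges, and ~0 in flat regions."""
    img = img.astype(float)
    Ix = np.gradient(img, axis=1)
    Iy = np.gradient(img, axis=0)
    def box(a):  # 3x3 box filter via shifted sums (edge-padded)
        p = np.pad(a, 1, mode='edge')
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0
    Ixx, Iyy, Ixy = box(Ix * Ix), box(Iy * Iy), box(Ix * Iy)
    return Ixx * Iyy - Ixy ** 2 - k * (Ixx + Iyy) ** 2

# a white square on black: the response peaks near the square's corners
img = np.zeros((16, 16)); img[4:12, 4:12] = 1.0
R = harris_response(img)
```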
Abstract: The recognition of human faces, especially those with
different orientations, is a challenging and important problem in image
analysis and classification. This paper proposes an effective scheme
for rotation-invariant face recognition using combined Log-Polar
Transform and Discrete Cosine Transform features. Rotation-invariant
feature extraction for a given face image involves applying the
log-polar transform to eliminate the rotation effect, producing a
row-shifted log-polar image. The discrete cosine transform is then
applied to eliminate the row-shift effect and to generate a
low-dimensional feature vector. A PSO-based feature selection
algorithm searches the feature vector space for the optimal feature
subset. Evolution is driven by a fitness function defined in terms of
maximizing the between-class separation (scatter index).
Experimental results on the ORL face database, using test sets of
images with different orientations, show that the proposed system
outperforms other face recognition methods. The overall recognition
rate for the rotated test images is 97%, demonstrating that the
extracted feature vector is an effective rotation-invariant feature
set with a minimal number of selected features.
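The key property used above is that, in the log-polar domain, an image rotation becomes a circular shift of the rows, so any shift-invariant transform along that axis yields a rotation-invariant descriptor. The sketch below demonstrates this with the Fourier magnitude along the angular axis as a simple shift-invariant stand-in for the paper's DCT step, and simulates the log-polar image as an arbitrary matrix rather than computing the transform itself.

```python
import numpy as np

# rows = angle samples, cols = log-radius samples (simulated log-polar image)
rng = np.random.default_rng(0)
logpolar = rng.random((32, 16))
rotated = np.roll(logpolar, 5, axis=0)   # image rotation -> circular row shift

# Fourier magnitude along the angular axis is invariant to circular shifts,
# so both versions of the image map to the same descriptor
feat_a = np.abs(np.fft.fft(logpolar, axis=0))
feat_b = np.abs(np.fft.fft(rotated, axis=0))
```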
Abstract: Digital watermarking is one of the techniques for
copyright protection. In this paper, a normalization-based robust
image watermarking scheme which encompasses singular value
decomposition (SVD) and discrete cosine transform (DCT)
techniques is proposed. For the proposed scheme, the host image is
first normalized to a standard form and divided into non-overlapping
image blocks, and SVD is applied to each block. By concatenating the
first singular values (SVs) of adjacent blocks of the normalized image,
an SV block is obtained. DCT is then carried out on the SV blocks to
produce SVD-DCT blocks. A watermark bit is embedded in the
high-frequency band of an SVD-DCT block by imposing a particular
relationship between two pseudo-randomly selected DCT
coefficients. An adaptive frequency mask is used to adjust the local
watermark embedding strength. Watermark extraction is essentially
the inverse process, and the extraction method is blind
and efficient. Experimental results show that the quality degradation
caused by the embedded watermark is perceptually
transparent. Results also show that the proposed scheme is robust
against various image processing operations and geometric attacks.
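The coefficient-relationship idea can be sketched in isolation: one bit is embedded by forcing an order relation between two selected coefficients, and extracted blind by comparing them. This is a common blind-watermarking rule shown on a bare coefficient vector, not the paper's full SVD-DCT pipeline; the margin value and index choice are illustrative assumptions (the scheme selects the indices pseudo-randomly).

```python
import numpy as np

def embed_bit(coeffs, i, j, bit, margin=2.0):
    """Embed one watermark bit by imposing an order relationship between
    coefficients i and j: bit 1 -> c[i] > c[j], bit 0 -> c[i] < c[j].
    Both are moved symmetrically about their mean to limit distortion."""
    c = coeffs.copy()
    avg = (c[i] + c[j]) / 2.0
    if bit == 1:
        c[i], c[j] = avg + margin / 2.0, avg - margin / 2.0
    else:
        c[i], c[j] = avg - margin / 2.0, avg + margin / 2.0
    return c

def extract_bit(coeffs, i, j):
    """Blind extraction: only the imposed order relation is needed."""
    return 1 if coeffs[i] > coeffs[j] else 0

block = np.array([5.0, -1.2, 0.7, 3.3, -0.4])   # stand-in high-frequency band
wm = embed_bit(block, 1, 3, 1)
```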
Abstract: Document image processing has become an
increasingly important technology in the automation of office
documentation tasks. During document scanning, skew is inevitably
introduced into the incoming document image. Since algorithms
for layout analysis and character recognition are generally very
sensitive to page skew, skew detection and correction in
document images are critical steps before layout analysis. In this
paper, a novel skew detection method is presented for binary
document images. The method considers selected characters of the
text, which are subjected to thinning and the Hough transform to
estimate the skew angle accurately. Several experiments have been
conducted on various types of documents, including English text,
journals, textbooks, documents in different languages, documents
with different fonts, and documents with different resolutions, to
assess the robustness of the proposed method. The experimental
results show that the proposed method is accurate compared with
well-known existing methods.
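The principle behind angle-voting skew estimation can be sketched compactly: foreground pixels of a text line collapse onto (nearly) a single projection value exactly at the true skew angle. The version below scores each candidate angle by the spread of the projected coordinates, a simplified stand-in for the paper's thinning + Hough accumulator; the synthetic baseline and the angle grid are assumptions.

```python
import numpy as np

def estimate_skew(points, angles=np.arange(-15, 15.25, 0.25)):
    """Estimate document skew (degrees) from foreground pixel coordinates:
    for each candidate angle, project points onto the deskewed vertical
    axis and keep the angle that makes a text line collapse most tightly
    (minimum projection variance)."""
    ys, xs = points[:, 0].astype(float), points[:, 1].astype(float)
    best_angle, best_spread = 0.0, np.inf
    for deg in angles:
        a = np.deg2rad(deg)
        rho = ys * np.cos(a) - xs * np.sin(a)   # deskewed vertical coordinate
        if rho.var() < best_spread:
            best_spread, best_angle = rho.var(), deg
    return best_angle

# synthetic text baseline: 200 pixels along a line skewed by 10 degrees
t = np.arange(200)
skew = np.deg2rad(10)
pts = np.stack([np.round(t * np.sin(skew)), np.round(t * np.cos(skew))], axis=1)
```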
Abstract: A study of the obtainable watermark data rate for information hiding algorithms is presented in this paper. As the perceptual entropy of wideband monophonic audio signals is in the range of four to five bits per sample, a significant amount of additional information can be inserted into the signal without causing any perceptual distortion. Experimental results showed that transform-domain watermark embedding considerably outperforms watermark embedding in the time domain, and that signal decompositions with a high transform coding gain, such as the wavelet transform, are the most suitable for high-data-rate information hiding. Keywords—Digital watermarking, information hiding, audio watermarking, watermark data rate.
Abstract: The plastic forming of sheet plate plays an
important role in metal forming. The traditional tool-design
techniques for sheet forming operations used in industry are
experimental and expensive. Predicting the forming results and
determining the punch force, blank holder forces, and the
thickness distribution of the sheet metal will decrease the
production cost and time. In this paper, a multi-stage
deep drawing simulation of an industrial part is presented
using the finite element method. The entire production sequence,
including additional operations such as intermediate annealing and
springback, was simulated in ABAQUS under axisymmetric
conditions. Simulation results such as the sheet thickness
distribution, punch force, and residual stresses were extracted at
each stage, and the sheet thickness distribution was compared with
experimental results. The comparison shows that the
FE model is in close agreement with experiment.
Abstract: A background estimation approach using a small-
window median filter is presented on the basis of analyzing IR point
target, noise, and clutter models. After simplifying the two-dimensional
filter, a simple method using a one-dimensional median filter is
presented to estimate the background according to the
characteristics of the IR scanning system. An adaptive threshold is
then used to segment the background-cancelled image. Experimental
results show that the algorithm achieves good performance and
satisfies the real-time processing requirement for large images.
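The pipeline above (median background estimate, subtraction, adaptive threshold) can be sketched on one scan line; the window length, the robust MAD-based threshold, and the factor k are illustrative assumptions, not the paper's exact adaptive rule.

```python
import numpy as np

def detect_point_targets(row, win=5, k=4.0):
    """1-D median-filter background estimation for an IR scan line: the
    median over a small sliding window tracks background/clutter, and the
    residual is thresholded adaptively (median + k * 1.4826 * MAD, a
    robust stand-in for mean + k * std)."""
    half = win // 2
    padded = np.pad(row.astype(float), half, mode='edge')
    background = np.array([np.median(padded[i:i + win])
                           for i in range(len(row))])
    residual = row - background          # background-cancelled signal
    med = np.median(residual)
    mad = np.median(np.abs(residual - med))
    thr = med + k * 1.4826 * max(mad, 1e-9)
    return np.where(residual > thr)[0], background

# flat clutter around level ~10 with one point target at index 5
row = np.array([10, 11, 10, 12, 11, 60, 11, 10, 12, 11, 10, 9], dtype=float)
targets, bg = detect_point_targets(row)
```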
Abstract: Friction-stir welding has received huge interest in the last few years. The many advantages of this promising process have led researchers to present different theoretical and experimental explanations of the process. However, a way to quantitatively and qualitatively control the different parameters of the friction-stir welding process has yet to be established. In this study, a refined energy-based model that estimates the energy generated by friction and plastic deformation is presented. The effect of plastic deformation at low energy levels is significant, and hence a scale factor is introduced to control it. The predicted heat energy and the obtained maximum temperature using our model are compared with theoretical and experimental results available in the literature, and good agreement is obtained. The model is applied to the AA6000 and AA7000 series.
Abstract: Tumor classification is a key area of research in the
field of bioinformatics. Microarray technology is commonly used in
the study of disease diagnosis using gene expression levels. The
main drawback of gene expression data is that it contains thousands
of genes but very few samples. Feature selection methods are used
to select the informative genes from the microarray. These methods
considerably improve the classification accuracy. In the proposed
method, Genetic Algorithm (GA) is used for effective feature
selection. Informative genes are identified based on the T-Statistics,
Signal-to-Noise Ratio (SNR) and F-Test values. The initial candidate
solutions of GA are obtained from top-m informative genes. The
classification accuracy of k-Nearest Neighbor (kNN) method is used
as the fitness function for GA. In this work, kNN and Support Vector
Machine (SVM) are used as the classifiers. The experimental results
show that the proposed work is suitable for effective feature
selection. With the selected genes, the GA-kNN method
achieves 100% accuracy on 4 datasets, and the GA-SVM method
achieves it on 5 out of 10 datasets. GA with kNN and SVM
is thus demonstrated to be an accurate approach for microarray-based
tumor classification.
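Two ingredients of the method above can be sketched directly: the per-gene T-statistic used to rank informative genes (the top-m of which seed the GA population), and a leave-one-out kNN accuracy usable as the GA fitness. The toy data, separation strength, and k are illustrative assumptions; the full GA loop is omitted.

```python
import numpy as np

def t_statistic(X, y):
    """Per-gene two-class T-statistic: |mean0 - mean1| / pooled std. err."""
    a, b = X[y == 0], X[y == 1]
    num = np.abs(a.mean(axis=0) - b.mean(axis=0))
    den = np.sqrt(a.var(axis=0, ddof=1) / len(a)
                  + b.var(axis=0, ddof=1) / len(b)) + 1e-12
    return num / den

def knn_accuracy(X, y, k=3):
    """Leave-one-out kNN accuracy: the fitness function in the method."""
    correct = 0
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                      # exclude the held-out sample
        votes = y[np.argsort(d)[:k]]
        correct += int(np.bincount(votes).argmax() == y[i])
    return correct / len(X)

# toy data: gene 0 is informative (class-dependent shift), gene 1 is noise
rng = np.random.default_rng(1)
y = np.array([0] * 10 + [1] * 10)
X = rng.normal(size=(20, 2))
X[:, 0] += 5 * y
scores = t_statistic(X, y)
```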
Abstract: It has become crucial over the years for nations to
improve their credit scoring methods and techniques in light of the
increasing volatility of the global economy. Statistical methods or
tools have been the favoured means for this; however, artificial
intelligence and soft computing techniques are becoming
increasingly preferred due to their proficiency, precision, and
relative simplicity. This work presents a comparison between Support
Vector Machines and Artificial Neural Networks, two popular soft
computing models, when applied to credit scoring. Among the
different criteria that can be used for comparison, accuracy,
computational complexity, and processing time are selected to
evaluate both models. Furthermore, the German credit
scoring dataset, a real-world dataset, is used to train and test
both developed models. Experimental results obtained from our study
suggest that although both soft computing models could be used with
a high degree of accuracy, Artificial Neural Networks deliver better
results than Support Vector Machines.
Abstract: The geological origin and formation of the shallow
gas reservoir of Hangzhou Bay, northern Zhejiang Province,
eastern China, and the original occurrence characteristics of the gassy
sand are analyzed. Generally, gassy sand in such gas reservoirs is in
a state of residual moisture content, and the initial matric suction of
the sand ranges from about 0 kPa to 100 kPa.
Results of GDS triaxial tests show that the classical shear
strength formulas of unsaturated soil cannot effectively describe the
basic strength characteristics of gassy sand, whereas the relationship
between the apparent cohesion and matric suction of gassy sand agrees
well with a power function, which can reasonably be used to describe the
strength of gassy sand. In the stress path of gas release, the shear
strength of gassy sand increases, and experimental results show that the
formula proposed in this paper can effectively predict the strength
increment. When saturated strength indexes of the sand are used in
engineering design, a moderate reduction should be considered.
Abstract: We fabricated an oxide TFT with an inverted-staggered
etch-stopper structure and investigated its characteristics under
illumination with 400 nm wavelength light. Under 400 nm
illumination, the threshold voltage (Vth) decreased and the
subthreshold slope (SS) increased in the forward sweep, while Vth and SS
were unchanged under longer-wavelength light, such as 650 nm, 550
nm, and 450 nm. In the reverse sweep, the transfer curve
barely changed even under 400 nm light. Our experimental results
support that photo-induced hole carriers are captured by donor-like
interface traps, causing the decrease in Vth and increase in SS. We
also found that the interface trap density increases proportionally to
the photo-induced hole concentration in the active layer.
Abstract: Volume rendering is widely used in medical CT image
visualization. Applying 3D image visualization to diagnosis
requires accurate volume rendering at high
resolution. Interpolation is important in medical image processing
applications such as image compression and volume resampling.
However, it can distort the original image data through edge
blurring or blocking effects when image enhancement procedures
are applied. In this paper, we propose an adaptive tension control
method that exploits gradient information to achieve high-resolution
medical image enhancement in volume visualization, keeping restored
images as similar to the original images as possible. The
experimental results show that the proposed method improves
image quality, confirming the efficacy of the adaptive tension control.
Abstract: To provide a better understanding of the fair share policies supported by current production schedulers and their impact on scheduling performance, a relative fair share policy supported by four well-known production job schedulers is evaluated in this study. The experimental results show that fair share indeed prevents heavy-demand users from dominating the system resources. However, the detailed per-user performance analysis shows that some types of users may suffer unfairness under fair share, possibly due to the priority mechanisms used by current production schedulers. These users are typically not heavy-demand users, but they have a mixture of jobs that are not spread out.
Abstract: The objective of this paper is to apply the support vector machine (SVM) approach to the classification of cancerous and normal regions of prostate images. Three kinds of textural features are extracted and used for the analysis: parameters of the Gauss-Markov random field (GMRF), the correlation function, and relative entropy. Prostate images are acquired by a system consisting of a microscope, a video camera, and a digitizing board. Cross-validated classification over a database of 46 images is performed to evaluate performance. With SVM classification, sensitivity and specificity of 96.2% and 97.0%, respectively, are achieved for 32×32-pixel blocks, with an overall accuracy of 96.6%. Classification performance is compared with artificial neural network and k-nearest neighbor classifiers. Experimental results demonstrate that the SVM approach gives the best performance.
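The three figures of merit reported above follow directly from the confusion-matrix counts; the counts used below are hypothetical, chosen only to illustrate the formulas (they happen to reproduce values close to the reported 96.2% / 97.0% / 96.6%).

```python
def classification_metrics(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP),
    accuracy = (TP+TN)/all, as reported for the SVM classifier."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# hypothetical counts chosen only to illustrate the formulas
sens, spec, acc = classification_metrics(tp=50, fn=2, tn=97, fp=3)
```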
Abstract: Real-time measurement of applied forces, like tension, compression, torsion, and bending moment, identifies the transferred energies being applied to the bottomhole assembly (BHA). These forces are highly detrimental to measurement/logging-while-drilling tools and downhole equipment. Real-time measurement of the dynamic downhole behavior, including weight, torque, bending on bit, and vibration, establishes a real-time feedback loop between the downhole drilling system and the drilling team at the surface. This paper describes the numerical analysis of the strain data acquired by the measurement tool at different locations on the strain pockets. The strain values obtained by FEA for various loading conditions (tension, compression, torque, and bending moment) are compared against experimental results obtained from an identical experimental setup. Numerical analysis results agree with experimental data to within 8% and therefore substantiate and validate the FEA model. This FEA model can be used to analyze the combined loading conditions that reflect the actual drilling environment.
Abstract: The present study examines the
vortical structures generated by two inclined impinging jets through
experimental and numerical investigations. The jets issue with a
pitch angle α = 40° into a confined quiescent fluid. For the experimental
investigation, the flow patterns were visualized using olive particles
injected into the jets and illuminated by Nd:YAG laser light to reveal the
finer details of the confined jet interaction. It was observed that two
counter-rotating vortex pairs (CVPs) are generated in the near
region. A numerical investigation was also performed: first, the
numerical results were validated against the experimental results, and
then the numerical model was used to study the effect of the section
ratio on the evolution of the CVPs. Our results show promising agreement
with the experimental data and indicate that our model has the potential
to produce useful and accurate data on the evolution of CVPs.
Abstract: Since dealing with high-dimensional data is
computationally complex and sometimes even intractable, several
feature reduction methods have recently been developed to reduce
the dimensionality of the data and simplify the
analysis in applications such as text categorization, signal
processing, image retrieval, and gene expression analysis. Among
feature reduction techniques, feature selection is one of the most
popular methods because it preserves the original features.
In this paper, we propose a new unsupervised feature selection
method that removes redundant features from the original
feature space using the probability density functions of the
features. To show the effectiveness of the proposed method, popular
feature selection methods have been implemented and compared.
Experimental results on several datasets from the UCI
repository illustrate the effectiveness of our proposed
method, in comparison with the other methods, in terms of
both classification accuracy and the number of selected features.
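The density-based redundancy idea can be sketched greedily: estimate each feature's PDF with a normalized histogram and drop any feature whose PDF is nearly identical to one already kept. The Bhattacharyya coefficient, the bin count, and the 0.95 threshold are illustrative assumptions standing in for the paper's unspecified density-comparison rule.

```python
import numpy as np

def select_nonredundant(X, bins=10, threshold=0.95):
    """Greedy unsupervised selection: each feature's PDF is estimated with a
    normalized histogram; a feature is dropped when its Bhattacharyya
    coefficient with an already-kept feature exceeds `threshold`
    (i.e. the two distributions are nearly identical)."""
    pdfs = []
    for f in range(X.shape[1]):
        h, _ = np.histogram(X[:, f], bins=bins)
        pdfs.append(h / h.sum())
    kept = []
    for f in range(X.shape[1]):
        if all(np.sum(np.sqrt(pdfs[f] * pdfs[g])) < threshold for g in kept):
            kept.append(f)
    return kept

# feature 1 is a near-copy of feature 0; feature 2 has a different shape
rng = np.random.default_rng(2)
a = rng.normal(size=200)
X = np.column_stack([a,
                     a + 0.01 * rng.normal(size=200),
                     rng.exponential(size=200)])
kept = select_nonredundant(X)
```

The redundant near-copy is removed while the distributionally distinct feature survives.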