Abstract: Autonomous structural health monitoring (SHM) of structures and bridges has become a topic of paramount importance for maintenance and safety reasons. This paper proposes a set of machine learning (ML) tools to perform automatic feature selection and anomaly detection in a bridge from vibrational data, and compares different feature extraction schemes to increase accuracy and reduce the amount of data collected. As a case study, the Z-24 bridge is considered because of its extensive database of accelerometric data in both standard and damaged conditions. The proposed framework starts from the first four fundamental frequencies extracted through operational modal analysis (OMA) and clustering, followed by time-domain filtering (tracking). The extracted fundamental frequencies are then fed to a dimensionality reduction block implemented through two different approaches: feature selection (an intelligent multiplexer) that tries to estimate the most reliable frequencies based on the evaluation of some statistical features (i.e., entropy, variance, kurtosis), and feature extraction (an auto-associative neural network (ANN)) that combines the fundamental frequencies to extract new damage-sensitive features in a low-dimensional feature space. Finally, one-class classification (OCC) algorithms perform anomaly detection, trained with standard-condition points and tested with both normal and anomalous ones. In particular, principal component analysis (PCA), kernel principal component analysis (KPCA), and the auto-associative neural network (ANN) are presented and their performances are compared. It is also shown that, by evaluating the correct features, the anomaly can be detected with an accuracy and an F1 score greater than 95%.
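The PCA-based OCC step described above can be illustrated with a minimal sketch: fit PCA on normal-condition frequency vectors, retain the leading component, and flag points whose reconstruction error exceeds a threshold. The data, dimensions, frequency-shift magnitudes, and the 99th-percentile threshold below are hypothetical illustrations, not values from the Z-24 study.

```python
import numpy as np

def fit_pca(X, n_components):
    """Fit PCA on normal-condition training data via SVD."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_components]                  # mean and principal axes

def reconstruction_error(X, mu, W):
    """Squared residual after projecting onto the retained axes."""
    Z = (X - mu) @ W.T                            # low-dimensional scores
    return np.sum((X - (Z @ W + mu)) ** 2, axis=1)

# Synthetic "fundamental frequencies": a common environmental trend
# plus sensor noise; the damage case shifts the frequencies.
rng = np.random.default_rng(0)
t = rng.uniform(-1.0, 1.0, size=(200, 1))
base = np.array([4.0, 5.1, 9.8, 10.3])
sens = np.array([0.05, 0.06, 0.10, 0.11])
normal = base + t * sens + rng.normal(0.0, 0.005, size=(200, 4))
damaged = normal + np.array([-0.2, 0.1, -0.15, 0.05])

mu, W = fit_pca(normal, n_components=1)
tau = np.percentile(reconstruction_error(normal, mu, W), 99)   # threshold
flags = reconstruction_error(damaged, mu, W) > tau
```

In this toy setup every damaged point exceeds the normal-condition threshold; KPCA or the ANN would replace the linear projection with a non-linear one.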
Abstract: This paper presents wavelet-based classification of various heart diseases. Electrocardiogram signals of different heart patients have been studied, and the statistical nature of electrocardiogram signals for different heart diseases has been compared with that of electrocardiograms for normal persons. In this study, four different heart diseases have been considered: Myocardial Ischemia (MI), Congestive Heart Failure (CHF), Arrhythmia, and Sleep Apnea. The statistical nature of the electrocardiograms in each case has been characterized in terms of the kurtosis values of two types of wavelet coefficients: approximate and detail. Nine wavelet decomposition levels have been considered in each case, and the kurtosis corresponding to both the approximate and detail coefficients has been computed for decomposition levels one through nine. Based on significant differences, a few decomposition levels have been chosen and then used for classification.
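As a sketch of the level-wise kurtosis features described above, the following computes nine levels of a Haar wavelet decomposition (a stand-in mother wavelet, since the abstract does not name one) and the kurtosis of the approximation and detail coefficients at each level; the spiky test signal is synthetic, not ECG data.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT: approximation and detail coefficients."""
    x = x[: len(x) // 2 * 2]
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def kurtosis(x):
    """Pearson kurtosis (a Gaussian signal gives roughly 3)."""
    x = x - x.mean()
    return float(np.mean(x**4) / np.mean(x**2) ** 2)

def kurtosis_per_level(signal, levels=9):
    """(approximate, detail) kurtosis pairs for levels 1..levels."""
    feats, a = [], signal
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.append((kurtosis(a), kurtosis(d)))
    return feats

rng = np.random.default_rng(1)
sig = rng.normal(0.0, 1.0, 4096)
sig[::256] += 20.0        # sparse spikes mimic sharp QRS-like events
feats = kurtosis_per_level(sig)
```

Sparse, sharp events inflate the detail-coefficient kurtosis at the fine levels, which is the kind of level-wise difference the classification relies on.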
Abstract: The purpose of this paper is to investigate whether there are positive and significant correlations between the dimensions of Person-Environment Fit (Person-Job, Person-Organization, Person-Group and Person-Supervisor) at the “Best Companies to Work for” in Brazil in 2017. For that, a quantitative approach with a descriptive method was used, with the research sample defined as the "150 Best Companies to Work for", according to a database collected in 2017 and provided by the Fundação Instituto de Administração (FIA) of the University of São Paulo (USP). Regarding the data analysis procedures, asymmetry and kurtosis, factorial analysis, the Kaiser-Meyer-Olkin (KMO) test, Bartlett's sphericity test, and Cronbach's alpha were used for the 69 research variables, and Pearson's correlation analysis was performed as the statistical technique for testing the hypothesis. As the main result, we highlight that there was a positive and significant correlation between the dimensions of Person-Environment Fit, corroborating hypothesis H1 that there is a positive and significant correlation between Person-Job Fit, Person-Organization Fit, Person-Group Fit and Person-Supervisor Fit.
Abstract: The purpose of the present research is to equate two test forms as part of a study to evaluate the educational effectiveness of the ARTé: Mecenas art history learning game. The researcher applied Item Response Theory (IRT) procedures to calculate item, test, and mean-sigma equating parameters. With the sample size n=134, test parameters indicated “good” model fit but low Test Information Functions and more acute than expected equating parameters. Therefore, the researcher applied equipercentile equating and linear equating to raw scores and compared the equated form parameters and effect sizes from each method. Item scaling in IRT enables the researcher to select a subset of well-discriminating items. The mean-sigma step produces a mean-slope adjustment from the anchor items, which was used to scale the score on the new form (Form R) to the reference form (Form Q) scale. In equipercentile equating, scores are adjusted to align the proportion of scores in each quintile segment. Linear equating produces a mean-slope adjustment, which was applied to all core items on the new form. The study followed a quasi-experimental design with purposeful sampling of students enrolled in a college-level art history course (n=134) and a counterbalancing design to distribute both forms on the pre- and post-tests. The Experimental Group (n=82) was asked to play ARTé: Mecenas online and complete Level 4 of the game within a two-week period; 37 participants completed Level 4. Over the same period, the Control Group (n=52) did not play the game. The researcher examined between-group differences from post-test scores on test Form Q and Form R by full-factorial two-way ANOVA. The raw score analysis indicated a 1.29% direct effect of form, which was statistically non-significant but may be practically significant. The researcher repeated the between-group differences analysis with all three equating methods. For the IRT mean-sigma adjusted scores, form had a direct effect of 8.39%; mean-sigma equating with a small sample may have resulted in inaccurate equating parameters. Equipercentile equating aligned test means and standard deviations, but the resultant skewness and kurtosis worsened compared to raw score parameters, and form had a 3.18% direct effect. Linear equating produced the lowest form effect, approaching 0%. Using linearly equated scores, the researcher conducted an ANCOVA to examine the effect size in terms of prior knowledge. The between-group effect size for the Control Group versus the Experimental Group participants who completed the game was 14.39%, with a 4.77% effect size attributed to pre-test score. Playing and completing the game increased art history knowledge, and individuals with low prior knowledge tended to gain more from pre- to post-test. Ultimately, researchers should approach test equating based on their theoretical stance on Classical Test Theory and IRT and the respective assumptions. Regardless of the approach or method, test equating requires a representative sample of sufficient size. With small sample sizes, the application of a range of equating approaches can expose item and test features for review, inform interpretation, and identify paths for improving instruments for future study.
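The linear-equating step described above admits a compact formulation: a score x on the new form (Form R) is placed on the reference (Form Q) scale by matching the two score distributions' means and standard deviations. The score vectors below are invented for illustration, not the study's n=134 data.

```python
import numpy as np

def linear_equate(x, scores_new, scores_ref):
    """Linear equating: map score x on the new form onto the reference
    scale so the equated distribution matches the reference mean and SD."""
    mu_n, sd_n = np.mean(scores_new), np.std(scores_new, ddof=1)
    mu_r, sd_r = np.mean(scores_ref), np.std(scores_ref, ddof=1)
    return mu_r + (sd_r / sd_n) * (np.asarray(x, float) - mu_n)

rng = np.random.default_rng(2)
form_r = np.clip(rng.normal(21.0, 5.0, 134), 0, 40).round()   # hypothetical
form_q = np.clip(rng.normal(24.0, 6.0, 134), 0, 40).round()
equated = linear_equate(form_r, form_r, form_q)
```

By construction the equated scores reproduce the reference form's mean and standard deviation exactly, which is why this method aligns the first two moments but leaves skewness and kurtosis untouched.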
Abstract: A total of 20 bottom sediment samples were collected from the Lekki Lagoon during the wet and dry seasons. The study was carried out to determine the textural characteristics, sediment distribution pattern, and energy of transportation within the lagoon system. The sediment grain sizes and depth profile were analyzed using the dry sieving method, with a MATLAB algorithm for processing. The granulometric analysis reveals fine-grained sand for both the wet and dry seasons, with average mean values of 2.03 ϕ and 2.88 ϕ, respectively. The sediments were moderately sorted, with average inclusive standard deviations of 0.77 ϕ and 0.82 ϕ. The skewness varied from strongly coarse-skewed to near-symmetrical (−0.34 and 0.09, respectively), and the average kurtosis values were 0.87 (platykurtic) and 1.4 (leptokurtic). Overall, the bathymetry shows an average depth of 4.0 m; the deepest and shallowest areas have depths of 11.2 m and 0.5 m, respectively. A high concentration of fine sand was observed in the deep areas compared to the shallow areas during both the wet and dry seasons. The statistical parameter results show that the sediments are overall sorted and were deposited under low-energy conditions over a long distance. The sediment distribution and transport pattern of the Lekki Lagoon is controlled by a low-energy current, and the down-slope configuration of the bathymetry enhances the sorting and deposition rate in the lagoon.
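The textural parameters reported above (graphic mean, inclusive standard deviation, skewness, kurtosis) are conventionally computed from percentiles of the grain-size distribution in phi units. The sketch below uses the standard Folk and Ward formulas on a synthetic sample, since the paper's raw sieve data are not given.

```python
import numpy as np

def folk_ward(phi):
    """Folk and Ward graphic statistics from grain sizes in phi units."""
    p = {q: np.percentile(phi, q) for q in (5, 16, 25, 50, 75, 84, 95)}
    mean = (p[16] + p[50] + p[84]) / 3                      # graphic mean
    sorting = (p[84] - p[16]) / 4 + (p[95] - p[5]) / 6.6    # inclusive std dev
    skew = ((p[16] + p[84] - 2 * p[50]) / (2 * (p[84] - p[16]))
            + (p[5] + p[95] - 2 * p[50]) / (2 * (p[95] - p[5])))
    kurt = (p[95] - p[5]) / (2.44 * (p[75] - p[25]))        # graphic kurtosis
    return mean, sorting, skew, kurt

rng = np.random.default_rng(6)
sample = rng.normal(2.0, 0.5, 10000)    # synthetic fine sand centred at 2 phi
gmean, sorting, skew, kurt = folk_ward(sample)
```

For a symmetric Gaussian grain-size distribution the graphic skewness comes out near 0 and the graphic kurtosis near 1 (mesokurtic), so departures from those values are what classify a sample as skewed, platykurtic, or leptokurtic.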
Abstract: Hyperspectral images and remote sensing are important for many applications. A problem in the use of these images is the high volume of data to be processed, stored, and transferred, and dimensionality reduction techniques can be used to reduce this volume. In this paper, an approach to band selection based on clustering algorithms is presented. The proposed structure is based on the Fuzzy C-Means (or K-Means) and NWHFC algorithms. Attributes new in relation to other studies in the literature, such as kurtosis and low correlation, are also considered, and the results of the approach using Fuzzy C-Means and K-Means with different attributes are compared. Both algorithms show similarly good results on hyperspectral images, particularly when the variance and kurtosis attributes are used in the clustering process.
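A minimal sketch of clustering-based band selection: describe each band by its variance and kurtosis, cluster the bands with a plain k-means (standing in for the Fuzzy C-Means of the paper), and keep one representative band per cluster. The cube dimensions and random data are hypothetical.

```python
import numpy as np

def band_features(cube):
    """Per-band variance and kurtosis, standardized across bands."""
    X = cube.reshape(-1, cube.shape[-1]).astype(float)
    X = X - X.mean(axis=0)
    var = X.var(axis=0)
    kurt = (X**4).mean(axis=0) / var**2
    F = np.column_stack([var, kurt])
    return (F - F.mean(axis=0)) / F.std(axis=0)

def kmeans(F, k, iters=50, seed=0):
    """Plain k-means on the band features."""
    rng = np.random.default_rng(seed)
    C = F[rng.choice(len(F), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((F[:, None] - C) ** 2).sum(-1), axis=1)
        C = np.array([F[labels == j].mean(axis=0) if np.any(labels == j)
                      else C[j] for j in range(k)])
    return labels, C

def select_bands(cube, k):
    """Keep the band closest to each cluster centre."""
    F = band_features(cube)
    _, C = kmeans(F, k)
    return [int(np.argmin(((F - C[j]) ** 2).sum(-1))) for j in range(k)]

rng = np.random.default_rng(7)
cube = rng.normal(size=(16, 16, 30))    # hypothetical 30-band image cube
bands = select_bands(cube, 3)
```

Keeping one representative band per cluster is what reduces the data volume while preserving the spectral diversity the clusters encode.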
Abstract: In more complex systems, such as an automotive gearbox, a rigorous treatment of the data is necessary because there are several moving parts (gears, bearings, shafts, etc.) and, consequently, several possible sources of error and noise. The basic objective of this work is the detection of damage in automotive gearboxes. The detection methods used are the wavelet method, the bispectrum, advanced filtering techniques (selective filtering) of vibrational signals, and mathematical morphology. Vibration tests were performed on gearboxes (in good condition and with defects) from the production line of a large vehicle assembler. The vibration signals were obtained using five accelerometers at different positions on the sample. The results obtained using kurtosis, the bispectrum, wavelets, and mathematical morphology showed that it is possible to identify the existence of defects in automotive gearboxes.
Abstract: In this study, a computational fluid dynamics (CFD) model has been developed for studying the effect of the surface roughness profile on the elastohydrodynamic lubrication (EHL) problem. The cylinder contact geometry, meshing, and solution of the conservation of mass and momentum equations are carried out using the commercial software packages ICEM CFD and ANSYS Fluent. User-defined functions (UDFs) for the density, viscosity, and elastic deformation of the cylinders as functions of pressure and temperature are defined for the CFD model. Three different surface roughness profiles are created and incorporated into the CFD model. It is found that the developed CFD model can predict the characteristics of fluid flow and heat transfer in the EHL problem, including the main parameters such as the pressure distribution, minimum film thickness, viscosity, and density changes. The results show that the pressure profile at the center of the contact area is directly related to the roughness amplitude, and that a rough surface with a kurtosis value of more than 3 has a greater influence on the fluctuating shape of the pressure distribution than the other cases.
Abstract: This paper presents an optimal and unsupervised satellite image segmentation approach based on the Pearson system and k-means clustering initialization. The method can be considered original in that, on the one hand, it uses the k-means clustering algorithm for an optimal initialization of the number of image classes and, on the other hand, it exploits the Pearson system for an optimal assignment of a statistical distribution to each considered class. Satellite image exploitation requires the use of different approaches, especially those founded on the unsupervised statistical segmentation principle. Such approaches require the definition of several parameters, such as the number of image classes, the estimation of class variables, and generalized mixture distributions. The use of statistical image attributes gives convincing and promising results, provided the initialization step is optimal and the statistical distributions are appropriately assigned. The Pearson system, associated with a k-means clustering algorithm and the Stochastic Expectation-Maximization (SEM) algorithm, can be adapted to this problem. For each image class, the Pearson system assigns one distribution type according to different parameters, especially the skewness β1 and the kurtosis β2. The adapted algorithms (k-means clustering, SEM, and the Pearson system algorithm) are then applied to the satellite image segmentation problem. The efficiency of the combined algorithms was validated first with the Mean Quadratic Error (MQE) and second by visual inspection, through several comparisons of the resulting unsupervised image segmentations.
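The Pearson-system assignment hinges on the moment ratios β1 (squared skewness) and β2 (kurtosis) of each class; a brief sketch of their estimation from a class's pixel values (synthetic here) follows.

```python
import numpy as np

def pearson_betas(x):
    """Moment ratios beta1 (squared skewness) and beta2 (kurtosis) used by
    the Pearson system to assign a distribution type to an image class."""
    x = np.asarray(x, float)
    m = x - x.mean()
    m2, m3, m4 = (m**2).mean(), (m**3).mean(), (m**4).mean()
    return m3**2 / m2**3, m4 / m2**2

rng = np.random.default_rng(8)
b1, b2 = pearson_betas(rng.normal(120.0, 15.0, 100000))  # Gaussian-like class
```

A Gaussian class lands near the point (β1, β2) = (0, 3) of the Pearson diagram, while skewed or heavy-tailed classes fall in regions mapped to other Pearson types.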
Abstract: Research shows that the application of probability-statistical methods is unfounded at the early stage of diagnosing the technical condition of an aviation Gas Turbine Engine (GTE), when the flight information is fuzzy, limited, and uncertain. Hence, the efficiency of applying the new Soft Computing technology at these diagnosing stages, using Fuzzy Logic and Neural Network methods, is considered. To this end, fuzzy multiple linear and non-linear models (fuzzy regression equations) obtained from statistical fuzzy data are trained with high accuracy. To make the model of the GTE technical condition more adequate, the dynamics of changes in the skewness and kurtosis coefficients are analysed. Research on the changes of the skewness and kurtosis coefficient values shows that the distributions of GTE work parameters have a fuzzy character; hence, consideration of fuzzy skewness and kurtosis coefficients is expedient. Investigation of the dynamics of changes of the basic characteristics of GTE work parameters leads to the conclusion that Fuzzy Statistical Analysis is necessary for preliminary identification of the engines' technical condition. Research on the changes of correlation coefficient values also shows their fuzzy character; therefore, the application of Fuzzy Correlation Analysis results is proposed for model choice. When the information is sufficient, a recurrent algorithm for identifying the aviation GTE technical condition (using Hard Computing technology) is proposed, based on measurements of the input and output parameters of the multiple linear and non-linear generalised models in the presence of measurement noise (a new recursive Least Squares Method (LSM)). The developed GTE condition monitoring system provides stage-by-stage estimation of engine technical conditions. As an application of the given technique, the technical condition of a new operating aviation engine was estimated.
Abstract: Ice cover has a significant impact on rivers: melting ice can cause flooding, restrict navigation, and modify the ecosystem and microclimate. River ice is made up of different ice types with varying thickness, so surveillance of river ice plays an important role. River ice types are captured using an infrared imaging camera, which captures images even at night. In this paper, river ice infrared texture images are analysed using first-order and second-order statistical methods. The second-order statistical methods considered are the spatial gray level dependence method, the gray level run length method, and the gray level difference method. The performance of the feature extraction methods is evaluated using a Probabilistic Neural Network classifier, and it is found that the first-order and second-order statistical methods alone yield low accuracy. Therefore, the features extracted from the first-order and second-order statistical methods are combined, and it is observed that these combined features (first-order statistical method + gray level run length method) provide higher accuracy compared with the features from either the first-order or the second-order statistical method alone.
Abstract: Based on the one-bit-matching principle and by turning the de-mixing matrix into an orthogonal matrix via a certain normalization, Ma et al. proposed a one-bit-matching learning algorithm on the Stiefel manifold for independent component analysis [8]. However, this algorithm is not adaptive. In this paper, an algorithm that can extract the kurtosis and its sign for each independent source component directly from the observation data is first introduced. With this algorithm, the one-bit-matching learning algorithm is revised so that blind separation on the Stiefel manifold can be implemented completely in adaptive mode within the framework of the natural gradient.
Abstract: The effectiveness of Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers for fault diagnosis of rolling element bearings is presented in this paper. The characteristic features of vibration signals from a rotating driveline, run in its normal condition and with introduced faults, were used as input to the ANN and SVM classifiers. Simple statistical features, such as the standard deviation, skewness, and kurtosis of the time-domain vibration signal segments, along with the peaks of the signal and the peak of the power spectral density (PSD), are used as input features for the ANN and SVM classifiers. The effect of preprocessing the vibration signal by the Discrete Wavelet Transform (DWT) prior to feature extraction is also studied. The experimental results show that the performance of the SVM classifier in identifying the bearing condition is better than that of the ANN, and that pre-processing the vibration signal by DWT enhances the effectiveness of both the ANN and SVM classifiers.
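The statistical feature vector described above can be sketched as follows; the segment length, sampling rate, and synthetic impulsive signal are illustrative assumptions, and the periodogram is just one way to obtain the PSD peak.

```python
import numpy as np

def vibration_features(segment, fs):
    """Time-domain statistics plus signal peak and PSD peak for one
    vibration-signal segment, of the kind fed to the classifiers."""
    x = np.asarray(segment, float)
    z = x - x.mean()
    std = z.std()
    skew = (z**3).mean() / std**3
    kurt = (z**4).mean() / std**4
    peak = np.abs(x).max()
    psd = np.abs(np.fft.rfft(z)) ** 2 / (fs * len(z))   # periodogram PSD
    return np.array([std, skew, kurt, peak, psd[1:].max()])

fs = 1000
t = np.arange(0, 1, 1 / fs)
healthy = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.default_rng(3).normal(size=fs)
faulty = healthy.copy()
faulty[::100] += 5.0           # periodic impacts from a damaged race
f_h, f_f = vibration_features(healthy, fs), vibration_features(faulty, fs)
```

Periodic impacts drive the kurtosis and peak features well above their healthy-condition values, which is why these simple statistics discriminate bearing faults.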
Abstract: Statistical distributions are used to model the nature of various types of data sets. Although these distributions are mostly uni-modal, it is quite common to see multiple modes in the observed distribution of the underlying variables, which makes precise modeling unrealistic. The observed data may lack smoothness not necessarily due to randomness, but also due to non-randomness, resulting in zigzag curves, oscillations, humps, etc. The present paper argues that trigonometric functions, which have not been used in the probability functions of distributions so far, have the potential to take care of this if incorporated in the distribution appropriately. A simple distribution involving trigonometric functions (named the Sinoform Distribution) is illustrated in the paper with a data set. The paper demonstrates the importance of trigonometric functions, which have the characteristics to make statistical distributions exotic: it becomes possible to have multiple modes, oscillations, and zigzag curves in the density, which may be suitable for explaining the underlying nature of a given data set.
Abstract: In this paper, it is shown that the application of probability-statistical methods is unfounded at the early stage of diagnosing the technical condition of an aviation gas turbine engine (GTE), when the flight information is fuzzy, limited, and uncertain. Hence, the efficiency of applying the new Soft Computing technology at these diagnosing stages, using Fuzzy Logic and Neural Network methods, is considered. Fuzzy multiple linear and non-linear models (fuzzy regression equations) obtained from statistical fuzzy data are trained with high accuracy. To make the model of the GTE technical condition more adequate, the dynamics of changes in the skewness and kurtosis coefficients are analysed. Research on the changes of the skewness and kurtosis coefficient values shows that the distributions of GTE work parameters have a fuzzy character; hence, consideration of fuzzy skewness and kurtosis coefficients is expedient. Investigation of the dynamics of changes of the basic characteristics of GTE work parameters leads to the conclusion that Fuzzy Statistical Analysis is necessary for preliminary identification of the engines' technical condition. Research on the changes of correlation coefficient values also shows their fuzzy character; therefore, the application of Fuzzy Correlation Analysis results is proposed for model choice. To check model adequacy, the Fuzzy Multiple Correlation Coefficient of the Fuzzy Multiple Regression is considered. When the information is sufficient, a recurrent algorithm for identifying the aviation GTE technical condition (using Hard Computing technology) is proposed, based on measurements of the input and output parameters of the multiple linear and non-linear generalised models in the presence of measurement noise (a new recursive Least Squares Method (LSM)). The developed GTE condition monitoring system provides stage-by-stage estimation of engine technical conditions. As an application of the given technique, the temperature condition of a new operating aviation engine was estimated.
Abstract: Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to automatically remove artifacts, in particular from Electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for artifact extraction and on some higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper, we try to enhance this technique by proposing a new method based on Renyi's entropy. The performance of our method was tested and compared to that of the method in the literature, and the former proved to outperform the latter.
Abstract: The goal of this work is to improve the efficiency and reliability of automatic artifact rejection, in particular from Electroencephalographic (EEG) recordings. Artifact rejection is a key topic in signal processing. Artifacts are unwelcome signals that may occur during signal acquisition and that may alter the analysis of the signals themselves. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for artifact extraction and on some higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper, we enhance this technique by introducing Renyi's entropy. The performance of our method was tested by exploiting the Independent Component scalp maps, compared to that of the method in the literature, and shown to outperform it.
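The component-scoring idea shared by the two EEG abstracts above can be sketched as follows: after ICA, each component is scored by its kurtosis and a Renyi entropy estimate, since artifacts such as blinks tend to be spiky (high kurtosis) and concentrated (low entropy). The histogram-based entropy estimator and the synthetic components are our own illustrative choices, not the papers' implementation.

```python
import numpy as np

def kurtosis(c):
    """Excess kurtosis; roughly 0 for Gaussian activity."""
    c = (c - c.mean()) / c.std()
    return float((c**4).mean() - 3.0)

def renyi_entropy(c, bins=64, alpha=2):
    """Order-alpha Renyi entropy from a histogram density estimate."""
    p, _ = np.histogram(c, bins=bins)
    p = p[p > 0] / p.sum()
    return float(np.log(np.sum(p**alpha)) / (1 - alpha))

def artifact_scores(components):
    """(kurtosis, entropy) per component; spiky, concentrated components
    (e.g. eye blinks) score high kurtosis and low entropy."""
    return [(kurtosis(c), renyi_entropy(c)) for c in components]

rng = np.random.default_rng(4)
brain = rng.normal(size=5000)           # Gaussian-like brain activity
blink = 0.5 * rng.normal(size=5000)
blink[::250] += 8.0                     # sparse high-amplitude blink events
(k_brain, h_brain), (k_blink, h_blink) = artifact_scores([brain, blink])
```

Thresholding these two scores is what lets the rejection run automatically, without visual inspection of every component.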
Abstract: Atrial fibrillation is the most common sustained arrhythmia encountered by clinicians. Because the atrial activity waveform of atrial fibrillation is difficult for humans to discern, it is necessary to develop an automatic diagnosis system. The 12-lead ECG is now available in hospitals and is appropriate for using Independent Component Analysis (ICA) to estimate the atrial activity (AA) period. In this research, we also adopt a second-order blind identification approach to transform the sources extracted by ICA into more precise signals, and we then use a frequency-domain algorithm to perform the classification. In our experiments, we obtained significant results on clinical data.
Abstract: One of the primary uses of higher-order statistics in signal processing has been the detection and estimation of non-Gaussian signals in Gaussian noise of unknown covariance. This is motivated by the ability of higher-order statistics to suppress additive Gaussian noise. In this paper, several methods to test a given process for non-Gaussianity are presented. These methods include the histogram plot, the kurtosis test, and hypothesis testing using cumulants and the bispectrum of the available sequence. The hypothesis testing is performed by constructing a statistic to test whether the bispectrum of the given signal is non-zero. A zero bispectrum is not a proof of Gaussianity; hence, other tests, such as the kurtosis test, should also be employed. Examples are given to demonstrate the performance of the presented methods.
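The kurtosis test mentioned above can be sketched as a z-statistic: the sample excess kurtosis divided by its asymptotic standard error sqrt(24/N), which is approximately standard normal under the Gaussian hypothesis. The Laplace example and the sample size are illustrative assumptions.

```python
import numpy as np

def kurtosis_test(x):
    """z-statistic for excess kurtosis; |z| >> 2 suggests non-Gaussianity."""
    x = np.asarray(x, float)
    z = (x - x.mean()) / x.std()
    excess = np.mean(z**4) - 3.0
    return excess / np.sqrt(24.0 / len(x))

rng = np.random.default_rng(5)
z_gauss = kurtosis_test(rng.normal(size=20000))
z_laplace = kurtosis_test(rng.laplace(size=20000))  # excess kurtosis of 3
```

A Gaussian sequence yields a statistic of order one, while a heavy-tailed Laplace sequence yields a very large value; this complements the bispectrum test, since a symmetric non-Gaussian process can have a zero bispectrum yet fail the kurtosis test.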
Abstract: Surface metrology with image processing is a challenging task with wide applications in industry. Surface roughness can be evaluated using a texture classification approach, where an important aspect is the appropriate selection of features that characterize the surface. We propose an effective combination of features for multi-scale and multi-directional analysis of engineering surfaces. The features include the standard deviation, kurtosis, and the Canny edge detector. We apply the method by analyzing the surfaces with the Discrete Wavelet Transform (DWT) and the Dual-Tree Complex Wavelet Transform (DT-CWT), and we use the Canberra distance metric for similarity comparison between the surface classes. Our database includes surface textures manufactured by three machining processes, namely milling, casting, and shaping. The comparative study shows that the DT-CWT outperforms the DWT, giving a correct classification performance of 91.27% with the Canberra distance metric.