Abstract: In this paper, penalized power-divergence test statistics are defined, and their exact size properties for testing a nested sequence of log-linear models are compared with those of the ordinary power-divergence test statistics for various penalization parameters, λ values, and main-effect values. Since the ordinary and penalized power-divergence test statistics have the same asymptotic distribution, comparisons are made only for small and moderate sample sizes. Three-way contingency tables distributed according to a multinomial distribution are considered. Simulation results reveal that the penalized power-divergence test statistics perform much better than their ordinary counterparts.
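For context, the ordinary power-divergence family referred to above is the Cressie–Read statistic; with observed counts \(O_i\), expected counts \(E_i\) under the model, and index \(\lambda\), it is commonly written as:

\[
T_\lambda = \frac{2}{\lambda(\lambda+1)} \sum_i O_i \left[ \left( \frac{O_i}{E_i} \right)^{\lambda} - 1 \right],
\qquad \lambda \neq 0, -1,
\]

with the limits \(\lambda \to 0\) and \(\lambda \to -1\) giving the likelihood-ratio and modified likelihood-ratio statistics, respectively. The exact form of the penalization used in the paper is not reproduced here; this formula is included only as a reminder of the family being penalized.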
Abstract: Numerous divergence measures (spectral distance, cepstral distance, difference of the cepstral coefficients, Kullback-Leibler divergence, the distance given by the Generalized Likelihood Ratio, the distance defined by the Recursive Bayesian Changepoint Detector, and the Mahalanobis measure) are compared in this study. The measures are used to detect abrupt spectral changes in synthetic AR signals via the sliding-window algorithm. Two experiments are performed: the first focuses on the detection of a single boundary, while the second concentrates on the detection of a pair of boundaries. Detection accuracy is judged for each method, and the measures are compared according to the results of both experiments.
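The sliding-window idea described above can be illustrated with a minimal sketch: two adjacent windows slide over a synthetic AR(1) signal, an AR(1) model is fit to each window by Yule–Walker, and a divergence between the two fits is evaluated at every split point. The divergence used here (a coefficient gap plus a symmetric variance-ratio term) is only a crude stand-in for the measures compared in the paper, and the window length and signal parameters are illustrative assumptions, not values from the study.

```python
import numpy as np

def fit_ar1(x):
    """Yule-Walker estimate of the AR(1) coefficient and innovation variance."""
    x = x - x.mean()
    r0 = np.dot(x, x) / len(x)
    r1 = np.dot(x[:-1], x[1:]) / len(x)
    a = r1 / r0
    return a, r0 * (1.0 - a * a)

def sliding_divergence(x, w):
    """Divergence between AR(1) fits of the two windows adjacent to each split point."""
    d = np.full(len(x), np.nan)
    for t in range(w, len(x) - w):
        a1, s1 = fit_ar1(x[t - w:t])
        a2, s2 = fit_ar1(x[t:t + w])
        # coefficient gap plus a symmetric variance-ratio term
        # (a crude proxy for a spectral divergence)
        d[t] = (a1 - a2) ** 2 + 0.5 * (s1 / s2 + s2 / s1 - 2.0)
    return d

rng = np.random.default_rng(0)
# synthetic AR(1) signal with an abrupt coefficient change at t = 500
n, change = 1000, 500
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    a = 0.9 if t < change else -0.5
    x[t] = a * x[t - 1] + rng.standard_normal()

d = sliding_divergence(x, w=100)
est = int(np.nanargmax(d))
print(est)  # boundary estimate; expected near the true change point
```

The divergence curve peaks where the left window lies entirely in one regime and the right window in the other, which is why the argmax of the curve serves as the boundary estimate in the single-boundary experiment; detecting a pair of boundaries would require locating two well-separated peaks.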