Abstract: This paper presents two of the most widely used kernel
adaptive filtering (KAF) approaches, kernel least mean squares (KLMS)
and kernel recursive least squares (KRLS), for predicting the output
of nonlinear systems in signal processing. Both methods implement a
nonlinear transfer function using kernel methods in a particular space
called a reproducing kernel Hilbert space (RKHS), where the model is
a linear combination of kernel functions that transform the
observed data from the input space into a high-dimensional feature
space; this idea is known as the kernel trick. KAF thus amounts to
developing adaptive filters in an RKHS. We use two nonlinear signal
processing problems, Mackey-Glass chaotic time series prediction and
nonlinear channel equalization, to evaluate the performance of the
presented approaches and, finally, to determine which of them is
better suited.
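As a concrete illustration of the kernel trick in KAF, below is a minimal sketch of the KLMS update (numpy only). The Gaussian kernel width `sigma` and step size `eta` are illustrative assumptions, not values from the paper; it is a sketch of the technique, not the authors' implementation.

```python
import numpy as np

def gaussian_kernel(x, c, sigma=1.0):
    """Gaussian kernel between current input x and a stored center c."""
    return np.exp(-np.sum((x - c) ** 2) / (2 * sigma ** 2))

def klms(inputs, desired, eta=0.2, sigma=1.0):
    """Kernel least mean squares: grows a dictionary of centers and
    predicts with a linear combination of kernel evaluations (the
    linear-in-RKHS model described in the abstract)."""
    centers, coeffs, errors = [], [], []
    for x, d in zip(inputs, desired):
        y_hat = sum(a * gaussian_kernel(x, c, sigma)
                    for a, c in zip(coeffs, centers))
        e = d - y_hat              # instantaneous prediction error
        centers.append(x)          # new center = current input
        coeffs.append(eta * e)     # new coefficient = step size * error
        errors.append(e)
    return centers, coeffs, np.array(errors)
```

For a time-series task such as Mackey-Glass prediction, each input would be a short delay-embedded window of past samples and the desired output the next sample.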
Abstract: Support vector regression (SVR) is regarded
as a state-of-the-art method for approximation and regression. The
importance of the kernel function, the so-called admissible support
vector kernel (SV kernel) in SVR, has motivated many studies
on its composition. The Gaussian kernel (RBF) is regarded as the
"best" choice of SV kernel for non-experts in SVR, yet there is
no evidence to support this claim beyond its strong performance on
some practical applications. It is well known that a
reproducing kernel (R.K) is also an SV kernel and possesses many
important properties, e.g., positive definiteness, the reproducing
property, and the composition of complex R.Ks from simpler ones.
However, only a limited number of R.Ks have explicit forms, and
consequently few quantitative comparison studies exist in practice.
In this paper, two R.Ks, i.e., SV kernels, composed from the sum and
the product of a translation-invariant kernel in a Sobolev space are
proposed. An exploratory study of the performance of SVR based on a
general R.K is presented through a systematic comparison with RBF
using multiple criteria and synthetic problems. The results show that
the R.K is an SV kernel equivalent or even superior to RBF for
problems with more input variables (more than 5, and especially more
than 10) and higher nonlinearity.
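The paper's Sobolev-space kernel is not reproduced here; as a minimal sketch of the composition idea (sums and products of positive-definite kernels remain positive definite, hence admissible SV kernels), the example below builds a composed Gram matrix and feeds it to scikit-learn's SVR via its precomputed-kernel interface. The base kernel, its parameters, and the synthetic data are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVR

def base_kernel(X, Z, gamma=1.0):
    """Illustrative translation-invariant base kernel (Gaussian);
    the paper's Sobolev-space kernel would take its place."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def composed_kernel(X, Z):
    """Sum and product of admissible SV kernels are again admissible."""
    K1 = base_kernel(X, Z, gamma=0.5)
    K2 = base_kernel(X, Z, gamma=2.0)
    return K1 + K2 + K1 * K2

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 5))
y = np.sin(X.sum(axis=1)) + 0.05 * rng.standard_normal(200)

svr = SVR(kernel="precomputed", C=10.0, epsilon=0.01)
svr.fit(composed_kernel(X, X), y)          # Gram matrix: train vs train
pred = svr.predict(composed_kernel(X[:5], X))  # rows: test, cols: train
```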
Abstract: The kernel function, which allows the formulation of nonlinear variants of any algorithm that can be cast in terms of dot products, has made support vector machines (SVM) successful in many fields, e.g., classification and regression. The importance of the kernel has motivated many studies on its composition. It is well known that a reproducing kernel (R.K) is a useful kernel function that possesses many properties, e.g., positive definiteness, the reproducing property, and the composition of complex R.Ks by simple operations. There are two popular ways to compute an R.K in explicit form. One is to construct and solve a specific differential equation with boundary values; its drawback is that it cannot yield a unified form of the R.K. The other uses a piecewise integral of the Green's function associated with a differential operator L. The latter facilitates computing an R.K with a unified explicit form and supports theoretical analysis, but its studies are more recent and practical computations are fewer. In this paper, a new algorithm for computing an R.K is presented. It obtains the unified explicit form of the R.K in a general reproducing kernel Hilbert space, avoids constructing and solving complex differential equations by hand, and enables an automatic, flexible, and rigorous computation for more general RKHSs. To validate that the R.K computed by the algorithm performs well in SVM, some illustrative examples and a comparison between the R.K and the Gaussian kernel (RBF) in support vector regression are presented. The results show that the performance of the R.K is close to, or slightly better than, that of RBF.
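The proposed algorithm itself is not reproduced here. As a hedged illustration of plugging an explicitly known R.K into SVR, the sketch below uses the classical Sobolev-space kernel K(s, t) = 1 + min(s, t) (the standard closed-form R.K of W^1_2[0, 1] with inner product f(0)g(0) + ∫ f'g'), extended to several inputs by a tensor product, and compares it against RBF through scikit-learn's precomputed-kernel interface. The test function, data, and all parameter values are illustrative assumptions, not the paper's examples.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

def sobolev_rk(X, Z):
    """Tensor-product Sobolev R.K: K(x, z) = prod_d (1 + min(x_d, z_d)).
    Inputs are assumed scaled to [0, 1]."""
    return np.prod(1.0 + np.minimum(X[:, None, :], Z[None, :, :]), axis=-1)

rng = np.random.default_rng(1)
X_train = rng.uniform(0, 1, size=(150, 3))
X_test = rng.uniform(0, 1, size=(50, 3))
f = lambda X: np.sin(2 * np.pi * X[:, 0]) * X[:, 1] + X[:, 2] ** 2
y_train = f(X_train) + 0.05 * rng.standard_normal(len(X_train))

# SVR with the explicit R.K, supplied as a precomputed Gram matrix
rk_svr = SVR(kernel="precomputed", C=10.0, epsilon=0.01)
rk_svr.fit(sobolev_rk(X_train, X_train), y_train)
rk_mse = mean_squared_error(f(X_test),
                            rk_svr.predict(sobolev_rk(X_test, X_train)))

# Baseline SVR with the built-in RBF kernel
rbf_svr = SVR(kernel="rbf", gamma=1.0, C=10.0, epsilon=0.01)
rbf_svr.fit(X_train, y_train)
rbf_mse = mean_squared_error(f(X_test), rbf_svr.predict(X_test))
print(f"R.K MSE: {rk_mse:.4f}  RBF MSE: {rbf_mse:.4f}")
```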