A New Composition Method of Admissible Support Vector Kernel Based on Reproducing Kernel

The kernel function, which allows the formulation of nonlinear variants of any algorithm that can be cast in terms of dot products, is what enables Support Vector Machines (SVM) to be applied successfully in many fields, e.g. classification and regression. The importance of the kernel has motivated many studies on its composition. It is well known that the reproducing kernel (R.K) is a useful kernel function with many desirable properties, e.g. positive definiteness, the reproducing property, and the ability to compose complex R.Ks by simple operations. There are two popular ways to compute an R.K in explicit form. One is to construct and solve a specific differential equation with boundary values; its drawback is that it cannot produce a unified form of the R.K. The other uses a piecewise integral of the Green function associated with a differential operator L. The latter yields an R.K with a unified explicit form and lends itself to theoretical analysis, but it has been studied only relatively recently and practical computations with it remain few. In this paper, a new algorithm for computing an R.K is presented. It obtains the unified explicit form of the R.K in a general reproducing kernel Hilbert space (RKHS), avoids constructing and solving complex differential equations by hand, and enables automatic, flexible, and rigorous computation for more general RKHSs. To validate that the R.K computed by the algorithm works well in SVM, some illustrative examples and a comparison between the R.K and the Gaussian kernel (RBF) in support vector regression are presented. The results show that the performance of the R.K is close to, or slightly better than, that of the RBF kernel.
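As a concrete illustration of the Green-function route (a classical textbook example, not the paper's general algorithm), take the Sobolev space $W_2^1[0,1]$ with inner product $\langle u,v\rangle = \int_0^1 (uv + u'v')\,dx$. Integration by parts shows that its R.K must be the Green function of the operator $L = -\frac{d^2}{dx^2} + 1$ with Neumann boundary conditions,

\[ -K_{xx}(x,y) + K(x,y) = \delta(x-y), \qquad K_x(0,y) = K_x(1,y) = 0, \]

and solving this piecewise on $[0,y]$ and $[y,1]$ yields the unified explicit form

\[ K(x,y) = \frac{\cosh(\min(x,y))\,\cosh(1-\max(x,y))}{\sinh 1}. \]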
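For the R.K-versus-RBF comparison in support vector regression, the sketch below is an illustration under stated assumptions, not the paper's experiment: it uses scikit-learn, the classical $W_2^1[0,1]$ R.K above rather than the paper's general construction, a toy sinc target, and arbitrarily chosen values of C, epsilon, and gamma. It shows how an explicit R.K can be plugged into SVR as a callable kernel and scored against the Gaussian (RBF) kernel.

# Minimal sketch (illustrative assumptions only): compare the classical
# W_2^1[0,1] reproducing kernel with the Gaussian (RBF) kernel in SVR.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

def w21_kernel(X, Y):
    # R.K of W_2^1[0,1]: K(x,y) = cosh(min(x,y)) * cosh(1 - max(x,y)) / sinh(1).
    x = X.ravel()[:, None]   # column of inputs, shape (n_X, 1)
    y = Y.ravel()[None, :]   # row of inputs, shape (1, n_Y)
    return np.cosh(np.minimum(x, y)) * np.cosh(1.0 - np.maximum(x, y)) / np.sinh(1.0)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(80, 1))                 # toy training inputs in [0, 1]
y = np.sinc(4.0 * X).ravel() + 0.05 * rng.standard_normal(80)  # noisy sinc target (assumed)

Xt = np.linspace(0.0, 1.0, 200)[:, None]                # noise-free test grid
yt = np.sinc(4.0 * Xt).ravel()

# C, epsilon and gamma below are arbitrary illustrative choices, not tuned values.
for name, model in [("R.K", SVR(kernel=w21_kernel, C=10.0, epsilon=0.01)),
                    ("RBF", SVR(kernel="rbf", gamma=10.0, C=10.0, epsilon=0.01))]:
    model.fit(X, y)
    print(f"{name}: test MSE = {mean_squared_error(yt, model.predict(Xt)):.5f}")

Note that, because this R.K is the Green function of a fixed differential operator, it carries no bandwidth parameter analogous to gamma; only C and epsilon would need tuning in this sketch.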




