Support vector networks for classification and regression

Alex Smola

GMD First


The recent support vector algorithm for pattern recognition by Cortes and Vapnik (1995), based on the theory of structural risk minimization, is introduced. Support vector networks provide a common framework for neural networks, polynomial classifiers, and radial basis functions. The algorithm uses quadratic optimization techniques to find optimal separating hyperplanes in extremely high-dimensional feature spaces. It is nevertheless fast in learning and has so far been applied with great success to optical character recognition. We will also present results on support vector regression estimation.
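For reference, a minimal sketch of the optimization problem behind the separating hyperplane, in the standard soft-margin form of Cortes and Vapnik (1995); the notation (w, b, xi_i, C, Phi, K) is the conventional one and is assumed here rather than taken from the talk itself:

  \[
  \min_{w,\,b,\,\xi}\ \tfrac{1}{2}\|w\|^2 + C\sum_{i=1}^{\ell}\xi_i
  \quad\text{subject to}\quad
  y_i\,\bigl(w\cdot\Phi(x_i)+b\bigr)\ \ge\ 1-\xi_i,\qquad \xi_i\ \ge\ 0 .
  \]

The quadratic program actually solved is the Wolfe dual,

  \[
  \max_{\alpha}\ \sum_{i}\alpha_i \;-\; \tfrac{1}{2}\sum_{i,j}\alpha_i\alpha_j\,y_i y_j\,K(x_i,x_j)
  \quad\text{subject to}\quad
  0\le\alpha_i\le C,\qquad \sum_i \alpha_i y_i = 0 ,
  \]

in which the data enter only through the kernel K(x_i, x_j) = Phi(x_i) . Phi(x_j). Choosing a polynomial kernel, a Gaussian kernel, or a sigmoid kernel recovers polynomial classifiers, radial basis function networks, or two-layer neural networks, respectively, which is the sense in which support vector networks give these methods a common framework.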

