Journal Article

Lernen mit Kernen: Support-Vektor-Methoden zur Analyse hochdimensionaler Daten (Learning with Kernels: Support Vector Methods for the Analysis of High-Dimensional Data)


Schölkopf, B., Müller, K.-R., & Smola, A. (1999). Lernen mit Kernen: Support-Vektor-Methoden zur Analyse hochdimensionaler Daten. Informatik - Forschung und Entwicklung, 14(3), 154-163. doi:10.1007/s004500050135.

Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-E657-2
We describe recent developments and results of statistical learning theory. In the framework of learning from examples, two factors control generalization ability: how well the learning machine explains the training data, and the complexity (capacity) of the machine itself. We describe kernel algorithms in feature spaces as elegant and efficient methods of realizing such machines. Examples thereof are Support Vector Machines (SVM) and Kernel PCA (Principal Component Analysis). More important than any individual example of a kernel algorithm, however, is the insight that any algorithm that can be cast in terms of dot products can be generalized to a nonlinear setting by substituting a kernel for the dot product.
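The abstract's central insight can be illustrated with Kernel PCA: ordinary PCA is expressed via dot products between data points, so replacing those dot products with a kernel evaluation yields a nonlinear variant. The following is a minimal NumPy sketch (not code from the paper); the RBF kernel, `gamma` value, and data are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel: k(x, y) = exp(-gamma * ||x - y||^2),
    # computed from pairwise squared Euclidean distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    # Kernel matrix plays the role of the matrix of dot products.
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    one = np.ones((n, n)) / n
    # Center the data implicitly in feature space.
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecomposition of the centered kernel matrix
    # (eigh returns eigenvalues in ascending order).
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Projection of training point i onto component k is
    # sqrt(lambda_k) * v_k[i] for normalized eigenvectors v_k.
    return vecs * np.sqrt(np.maximum(vals, 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))   # 50 toy points in 5 dimensions
Z = kernel_pca(X, n_components=2, gamma=0.5)
print(Z.shape)  # (50, 2)
```

With a linear kernel `k(x, y) = x @ y` the same code reduces to standard PCA, which is precisely the "dot products to kernels" generalization the abstract emphasizes.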

Finally, we illustrate the significance of kernel algorithms by briefly describing industrial and academic applications, including ones where we obtained benchmark record results.