
Released

Journal Article

A comparative study of multi-class support vector machines in the unifying framework of large margin classifiers

MPS-Authors

Elisseeff,  A
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Guermeur, Y., Elisseeff, A., & Zelus, D. (2005). A comparative study of multi-class support vector machines in the unifying framework of large margin classifiers. Applied Stochastic Models in Business and Industry, 21(2), 199-214. doi:10.1002/asmb.534.


Cite as: https://hdl.handle.net/21.11116/0000-0004-DC40-C
Abstract
Vapnik's statistical learning theory has mainly been developed for two types of problems: pattern recognition (computation of dichotomies) and regression (estimation of real‐valued functions). Only in recent years has multi‐class discriminant analysis been studied independently. Extending several standard results, among which a famous theorem by Bartlett, we have derived distribution‐free uniform strong laws of large numbers devoted to multi‐class large margin discriminant models. The capacity measure appearing in the confidence interval, a covering number, has been bounded from above in terms of a new generalized VC dimension. In this paper, the aforementioned theorems are applied to the architecture shared by all the multi‐class SVMs proposed so far, which provides us with a simple theoretical framework to study them, compare their performance and design new machines.
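The "architecture shared by all the multi-class SVMs" mentioned in the abstract computes one score function per class and predicts by taking the argmax. As a hedged illustration (not the paper's own experiments), the sketch below trains a linear multi-class classifier with the Crammer–Singer multi-class hinge loss by subgradient descent on synthetic data; the data, step size, and epoch count are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: three well-separated Gaussian blobs.
centers = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
X = np.vstack([c + rng.normal(scale=0.5, size=(50, 2)) for c in centers])
y = np.repeat(np.arange(3), 50)

n, d, K = X.shape[0], X.shape[1], 3
W = np.zeros((K, d))   # one weight vector per class: g_k(x) = W[k] @ x + b[k]
b = np.zeros(K)

lam, lr = 1e-3, 0.1    # regularization and learning rate (illustrative values)
for epoch in range(200):
    scores = X @ W.T + b                              # (n, K) class scores
    # Crammer-Singer multi-class hinge: loss is positive when the margin
    # between the true-class score and the best competitor is below 1.
    margins = scores - scores[np.arange(n), y][:, None] + 1.0
    margins[np.arange(n), y] = 0.0
    r = np.argmax(margins, axis=1)                    # most-violating class
    viol = margins[np.arange(n), r] > 0
    grad_W = lam * W
    grad_b = np.zeros(K)
    for i in np.where(viol)[0]:
        grad_W[r[i]] += X[i] / n                      # push competitor down
        grad_W[y[i]] -= X[i] / n                      # push true class up
        grad_b[r[i]] += 1.0 / n
        grad_b[y[i]] -= 1.0 / n
    W -= lr * grad_W
    b -= lr * grad_b

# The shared architecture: predict the class with the largest score.
pred = np.argmax(X @ W.T + b, axis=1)
print(f"training accuracy: {np.mean(pred == y):.2f}")
```

The argmax decision rule over class-wise score functions is the common structure under which the paper's margin-based generalization bounds compare the different multi-class SVM formulations.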