
Record

Released

Journal Article

A comparative study of multi-class support vector machines in the unifying framework of large margin classifiers

MPG Authors

Elisseeff,  A
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
No freely accessible full texts are available in PuRe.
Supplementary Material (freely accessible)
No freely accessible supplementary materials are available.
Citation

Guermeur, Y., Elisseeff, A., & Zelus, D. (2005). A comparative study of multi-class support vector machines in the unifying framework of large margin classifiers. Applied Stochastic Models in Business and Industry, 21(2), 199-214. doi:10.1002/asmb.534.


Citation link: https://hdl.handle.net/21.11116/0000-0004-DC40-C
Abstract
Vapnik's statistical learning theory has mainly been developed for two types of problems: pattern recognition (computation of dichotomies) and regression (estimation of real‐valued functions). Only in recent years has multi‐class discriminant analysis been studied independently. Extending several standard results, among which a famous theorem by Bartlett, we have derived distribution‐free uniform strong laws of large numbers devoted to multi‐class large margin discriminant models. The capacity measure appearing in the confidence interval, a covering number, has been bounded from above in terms of a new generalized VC dimension. In this paper, the aforementioned theorems are applied to the architecture shared by all the multi‐class SVMs proposed so far, which provides us with a simple theoretical framework to study them, compare their performance and design new machines.
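
The abstract above is purely theoretical; as a loose illustration of the kind of comparison it describes, the following sketch contrasts two standard multi-class SVM formulations, the one-vs-rest decomposition and the Crammer-Singer single-machine model, using scikit-learn. The dataset, solver, and hyperparameters here are illustrative assumptions and are not taken from the paper.

    # Illustrative sketch (not the paper's experimental setup): compare two
    # common multi-class SVM formulations -- one-vs-rest and the Crammer-Singer
    # single optimisation problem -- on a small benchmark dataset.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import LinearSVC

    X, y = load_iris(return_X_y=True)

    for scheme in ("ovr", "crammer_singer"):
        # Both schemes fit linear large-margin classifiers; they differ in how
        # the multi-class margin constraints enter the training problem.
        clf = make_pipeline(
            StandardScaler(),
            LinearSVC(multi_class=scheme, C=1.0, max_iter=10000),
        )
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"{scheme:>15}: mean accuracy = {scores.mean():.3f}")

Both formulations predict the class whose score function is largest; the quantity controlled by large margin bounds is the gap between the score of the correct class and the best competing score.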