Keywords:
-
Abstract:
Vapnik's statistical learning theory has mainly been developed for two types of problems: pattern recognition (computation of dichotomies) and regression (estimation of real-valued functions). Only in recent years has multi-class discriminant analysis been studied in its own right. Extending several standard results, among them a famous theorem by Bartlett, we have derived distribution-free uniform strong laws of large numbers for multi-class large margin discriminant models. The capacity measure appearing in the confidence interval, a covering number, has been bounded from above in terms of a new generalized VC dimension. In this paper, these theorems are applied to the architecture shared by all the multi-class SVMs proposed so far, which provides a simple theoretical framework in which to study them, compare their performance, and design new machines.