
Released

Conference Paper

Training and Approximation of a Primal Multiclass Support Vector Machine

MPS-Authors

Zien, A
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;


Ong, CS
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Zien, A., De Bona, F., & Ong, C. S. (2007). Training and Approximation of a Primal Multiclass Support Vector Machine. In C. Skiadas (Ed.), XIIth International Conference on Applied Stochastic Models and Data Analysis (ASMDA 2007).


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-CD6D-7
Abstract
We revisit the multiclass support vector machine (SVM) and generalize the formulation to convex loss functions and joint feature maps. Motivated by recent work [Chapelle, 2006], we use the logistic loss and softmax to enable gradient-based primal optimization. Kernels are incorporated via kernel principal component analysis (KPCA), which naturally leads to approximation methods for large-scale problems. We investigate similarities to and differences from previous multiclass SVM approaches. Experimental comparisons to previous approaches and to the popular one-vs-rest SVM are presented on several datasets.
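The recipe the abstract outlines — map inputs through KPCA to get explicit features, then minimize a softmax/logistic multiclass loss directly in the primal by gradient descent — can be sketched in a few lines of NumPy. This is a minimal illustrative sketch, not the authors' implementation; the toy data, the RBF kernel width, the number of KPCA components, and all optimization parameters are assumptions chosen only to make the example run.

```python
# Illustrative sketch (assumptions throughout): primal multiclass training
# with a softmax/logistic loss on KPCA features, as described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-class data in 2D (assumed: three well-separated Gaussian blobs).
n_per, n_classes = 30, 3
X = np.vstack([rng.normal(loc=3.0 * c, scale=1.0, size=(n_per, 2))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per)
n = X.shape[0]

# RBF kernel matrix (gamma is an assumed hyperparameter).
gamma = 0.5
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-gamma * sq)

# Kernel PCA: center the kernel, keep the top-q eigenpairs, and form
# explicit n x q features Z, so the primal problem becomes linear in Z.
q = 10
H = np.eye(n) - np.ones((n, n)) / n
vals, vecs = np.linalg.eigh(H @ K @ H)
idx = np.argsort(vals)[::-1][:q]
Z = vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 1e-12))

# Primal optimization: softmax (multinomial logistic) loss plus an L2
# penalty, minimized by plain gradient descent on a q x n_classes weight
# matrix W. Learning rate and regularizer are assumed values.
W = np.zeros((q, n_classes))
lam, lr = 1e-2, 0.5
Y = np.eye(n_classes)[y]                      # one-hot labels
for _ in range(500):
    S = Z @ W                                  # class scores
    S -= S.max(axis=1, keepdims=True)          # numerical stability
    P = np.exp(S)
    P /= P.sum(axis=1, keepdims=True)          # softmax probabilities
    grad = Z.T @ (P - Y) / n + lam * W
    W -= lr * grad

acc = (np.argmax(Z @ W, axis=1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Because KPCA gives explicit features, truncating to q components is also where the abstract's approximation for large-scale problems enters: fewer components mean a smaller primal problem at the cost of a coarser kernel approximation.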