
Released

Journal Article

Efficient Margin Maximizing with Boosting

MPS-Authors

Rätsch, G
Rätsch Group, Friedrich Miescher Laboratory, Max Planck Society

Citation

Rätsch, G., & Warmuth, M. (2005). Efficient Margin Maximizing with Boosting. Journal of Machine Learning Research, 6, 2131-2152.


Cite as: https://hdl.handle.net/21.11116/0000-000A-DE46-E
Abstract
AdaBoost produces a linear combination of base hypotheses and predicts with the sign of this linear combination. The linear combination may be viewed as a hyperplane in feature space where the base hypotheses form the features. It has been observed that the generalization error of the algorithm continues to improve even after all examples are on the correct side of the current hyperplane. The improvement is attributed to the experimental observation that the distances (margins) of the examples to the separating hyperplane are increasing even after all examples are on the correct side. We introduce a new version of AdaBoost, called AdaBoost∗ν, that explicitly maximizes the minimum margin of the examples up to a given precision. The algorithm incorporates a current estimate of the achievable margin into its calculation of the linear coefficients of the base hypotheses. The bound on the number of iterations needed by the new algorithm is the same as the number needed by a known version of AdaBoost that must have an explicit estimate of the achievable margin as a parameter. We also illustrate experimentally that our algorithm requires considerably fewer iterations than other algorithms that aim to maximize the margin.
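The update described in the abstract can be sketched concretely: at each round the algorithm measures the edge γ_t of the new base hypothesis, maintains a margin estimate ρ_t = min_r γ_r − ν, and sets the coefficient from the difference of the two. The Python sketch below is a minimal illustration of this idea under stated assumptions; it uses decision stumps as base hypotheses, and the function names, fixed iteration count, and numerical clipping are illustrative choices, not the paper's reference implementation.

```python
# Minimal sketch of the AdaBoost*_nu coefficient update from the abstract.
# Assumptions (not from the paper): decision stumps as base learners,
# a fixed iteration budget, and clipping for numerical safety.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_star_nu(X, y, nu=0.1, n_iter=200, eps=1e-12):
    """y must take values in {-1, +1}; nu is the margin precision."""
    n = len(y)
    d = np.full(n, 1.0 / n)          # example distribution d_t
    hyps, alphas = [], []
    min_edge = 1.0                    # running minimum of the edges gamma_r
    for _ in range(n_iter):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=d)
        h = stump.predict(X)
        # Edge of the current hypothesis under d_t (clipped to keep logs finite)
        gamma = np.clip(np.dot(d, y * h), -1 + eps, 1 - eps)
        min_edge = min(min_edge, gamma)
        # Current estimate of the achievable margin, up to precision nu
        rho = np.clip(min_edge - nu, -1 + eps, 1 - eps)
        # Coefficient: edge term minus margin-estimate term
        alpha = 0.5 * (np.log((1 + gamma) / (1 - gamma))
                       - np.log((1 + rho) / (1 - rho)))
        hyps.append(stump)
        alphas.append(alpha)
        # Reweight examples and renormalize
        d *= np.exp(-alpha * y * h)
        d /= d.sum()
    return hyps, np.array(alphas)

def predict(hyps, alphas, X):
    # Predict with the sign of the linear combination of base hypotheses
    F = sum(a * h.predict(X) for h, a in zip(hyps, alphas))
    return np.sign(F)
```

Because ρ_t is always at least ν below every edge seen so far, each coefficient α_t stays positive, and shrinking the coefficient by the ρ_t term is what pushes the combination toward the maximum minimum margin rather than merely separating the data.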