Boosting, Margins and Beyond

MPS-Authors

Rätsch,  G
Rätsch Group, Friedrich Miescher Laboratory, Max Planck Society;

Citation

Rätsch, G. (2008). Boosting, Margins and Beyond. Talk presented at Annual Belgian-Dutch Machine Learning Conference (Benelearn 2008). Spa, Belgium. 2008-05-19 - 2008-05-20.


Cite as: https://hdl.handle.net/21.11116/0000-000A-DAF6-B
Abstract
This talk will survey recent work on understanding Boosting in the context of maximizing the margin of separation. Starting with a brief introduction to Boosting in general and AdaBoost in particular, I will illustrate the connection to von Neumann's Minimax theorem and discuss AdaBoost's strategy for achieving a large margin. This will be followed by a presentation of algorithms that provably maximize the margin, are considerably faster at maximizing the margin in practice, and implement the soft-margin idea to improve robustness against noise. In the second part, I will discuss how these techniques relate to other convex optimization methods and how they are connected to Support Vector Machines. Finally, I will talk about the effects of the different key ingredients of Boosting and the lessons learned from applying such algorithms to real-world problems.
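For readers unfamiliar with the reweighting scheme the abstract alludes to, a minimal AdaBoost sketch may help. The choice of decision stumps as weak learners and all variable names below are illustrative assumptions, not material from the talk itself:

```python
import numpy as np

def adaboost(X, y, n_rounds=20):
    """Illustrative AdaBoost with decision-stump weak learners.

    X: (n, d) feature matrix; y: labels in {-1, +1}.
    Returns a prediction function for the weighted ensemble.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)            # uniform example weights to start
    stumps, alphas = [], []
    for _ in range(n_rounds):
        # Weak learning step: pick the stump (feature, threshold, sign)
        # with the lowest weighted training error.
        best, best_err = None, np.inf
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, thr, s)
        best_err = max(best_err, 1e-12)            # guard against log(0)
        alpha = 0.5 * np.log((1 - best_err) / best_err)
        j, thr, s = best
        pred = s * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)             # up-weight mistakes
        w /= w.sum()                               # renormalize
        stumps.append(best)
        alphas.append(alpha)

    def predict(Xq):
        # Sign of the alpha-weighted vote; the (normalized) magnitude of
        # this vote is the margin the talk's algorithms aim to maximize.
        F = sum(a * (s * np.where(Xq[:, j] <= thr, 1, -1))
                for a, (j, thr, s) in zip(alphas, stumps))
        return np.sign(F)

    return predict
```

The normalized margin of an example under this ensemble is its label times the weighted vote, divided by the sum of the alphas; plain AdaBoost drives this quantity up implicitly, while the margin-maximizing variants discussed in the talk target it directly.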