Item Details


Released

Journal Article

Tracking a Small Set of Experts by Mixing Past Posteriors

MPS-Authors
There are no MPG authors available for this publication.
Fulltext (public)
There is no public fulltext available.
Supplementary Material (public)
There is no public supplementary material available.
Citation

Bousquet, O., & Warmuth, M. K. (2002). Tracking a Small Set of Experts by Mixing Past Posteriors. The Journal of Machine Learning Research, 3, 363-396.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-E0D7-3
Abstract
In this paper, we examine on-line learning problems in which the target
concept is allowed to change over time. In each trial a master algorithm
receives predictions from a large set of n experts. Its goal is to predict
almost as well as the best sequence of such experts chosen off-line by
partitioning the training sequence into k+1 sections and then choosing
the best expert for each section. We build on methods developed by
Herbster and Warmuth and consider an open problem posed by
Freund where the experts in the best partition are from a small
pool of size m.
Since k >> m, the best expert shifts back and forth
between the experts of the small pool.
We propose algorithms that solve
this open problem by mixing the past posteriors maintained by the master
algorithm. We relate the number of bits needed for encoding the best
partition to the loss bounds of the algorithms.
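The mixing-update idea can be sketched in a few lines. This is a minimal illustration, not the paper's exact scheme: it assumes an exponential-weights loss update and a uniform mixing distribution over all past posteriors, with illustrative values for the learning rate eta and mixing rate alpha.

```python
import math

def loss_update(w, losses, eta):
    # Exponential-weights loss update: v(i) is proportional to w(i) * exp(-eta * loss(i))
    v = [wi * math.exp(-eta * li) for wi, li in zip(w, losses)]
    z = sum(v)
    return [vi / z for vi in v]

def mixing_update(vs, alpha):
    # Mix the newest posterior v_t with past posteriors v_0 .. v_{t-1}:
    # w_{t+1} = (1 - alpha) * v_t + alpha * (uniform average of past posteriors)
    n = len(vs[0])
    v_t = vs[-1]
    past = vs[:-1] if len(vs) > 1 else vs  # v_0 is the uniform start vector
    past_mix = [sum(v[i] for v in past) / len(past) for i in range(n)]
    return [(1 - alpha) * v_t[i] + alpha * past_mix[i] for i in range(n)]

# Usage: 3 experts, uniform start vector, three trials of per-expert losses.
n = 3
vs = [[1.0 / n] * n]  # v_0 = start vector
w = vs[0]
for losses in [[0.0, 1.0, 1.0], [0.0, 1.0, 1.0], [1.0, 0.0, 1.0]]:
    v = loss_update(w, losses, eta=1.0)
    vs.append(v)
    w = mixing_update(vs, alpha=0.1)
```

Because the update mixes in old posteriors rather than a uniform vector, an expert that was good in an earlier section retains extra weight and can be "recovered" quickly when it becomes best again.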
Instead of paying log n for
choosing the best expert in each section we first pay log (n choose m)
bits in the bounds for identifying the pool of m experts
and then log m bits per new section.
In the bounds we also pay twice for encoding the
boundaries of the sections.
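The coding argument above can be made concrete with a small numeric check (the values of n, m, and k below are illustrative, and the boundary-encoding terms are omitted):

```python
import math

# n experts, a pool of m experts, k shifts (k + 1 sections).
n, m, k = 64, 4, 20

# Naive cost: pick the best expert for each section from all n experts.
naive_bits = (k + 1) * math.log2(n)

# Pool-based cost: pay log2(n choose m) once to identify the pool,
# then roughly log2(m) bits per new section to pick within the pool.
pool_bits = math.log2(math.comb(n, m)) + k * math.log2(m)
```

For these values the pool-based encoding needs roughly 59 bits versus 126 for the naive one, which mirrors how the algorithms' loss bounds improve when k >> m.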