
Item Details

  Incremental Local Gaussian Regression

Meier, F., Hennig, P., & Schaal, S. (2014). Incremental Local Gaussian Regression. In Z. Ghahramani, M. Welling, C. Cortes, N. D. Lawrence, & K. Q. Weinberger (Eds.), Advances in Neural Information Processing Systems 27 (NIPS 2014) (pp. 972-980). Curran Associates, Inc. Retrieved from http://papers.nips.cc/paper/5594-incremental-local-gaussian-regression.pdf.


Basic Information

Item type: Conference Paper

Files


Related URLs

Description: -
OA-Status:

Creators

Creators:
Meier, Franzi1, Author
Hennig, P.2, Author
Schaal, Stefan1, Author
Affiliations:
1Dept. Autonomous Motion, Max Planck Institute for Intelligent Systems, Max Planck Society, ou_1497646              
2Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497795              

Content

Keywords: Dept. Schaal; Dept. Schölkopf
Abstract: Locally weighted regression (LWR) was created as a nonparametric method that can approximate a wide range of functions, is computationally efficient, and can learn continually from very large amounts of incrementally collected data. As an interesting feature, LWR can regress on non-stationary functions, a beneficial property, for instance, in control problems. However, it does not provide a proper generative model for function values, and existing algorithms have a variety of manual tuning parameters that strongly influence bias, variance and learning speed of the results. Gaussian (process) regression, on the other hand, does provide a generative model with rather black-box automatic parameter tuning, but it has higher computational cost, especially for big data sets and if a non-stationary model is required. In this paper, we suggest a path from Gaussian (process) regression to locally weighted regression, where we retain the best of both approaches. Using a localizing function basis and approximate inference techniques, we build a Gaussian (process) regression algorithm of increasingly local nature and similar computational complexity to LWR. Empirical evaluations are performed on several synthetic and real robot datasets of increasing complexity and (big) data scale, and demonstrate that we consistently achieve on par or superior performance compared to current state-of-the-art methods while retaining a principled approach to fast incremental regression with minimal manual tuning parameters.
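As a rough illustration of the classic LWR baseline the abstract starts from (not the authors' incremental local Gaussian regression algorithm itself), a batch locally weighted linear regression at a single query point can be sketched as follows; the function name, bandwidth value, and test function are illustrative choices, and the bandwidth is exactly the kind of manual tuning parameter the abstract mentions:

```python
import numpy as np

def lwr_predict(X, y, x_query, bandwidth=0.3):
    """Predict f(x_query) by locally weighted linear regression.

    Minimal batch sketch: each query point gets its own weighted
    least-squares fit, with weights from a Gaussian kernel centred
    at the query (the "localizing" idea the paper builds on).
    """
    # Gaussian kernel weight for each training point
    w = np.exp(-0.5 * ((X - x_query) / bandwidth) ** 2)
    # Design matrix with a bias column for the local linear model
    A = np.column_stack([np.ones_like(X), X])
    W = np.diag(w)
    # Weighted normal equations: (A^T W A) beta = A^T W y
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return beta[0] + beta[1] * x_query

# Noisy samples of a smooth test function
rng = np.random.default_rng(0)
X = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(X) + 0.1 * rng.standard_normal(X.size)

pred = lwr_predict(X, y, np.pi / 2)  # true value is sin(pi/2) = 1
```

Each prediction here costs a fresh weighted least-squares solve, which hints at why incremental variants with automatic localization, as proposed in the paper, are attractive for large streaming datasets.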

Details

Language: -
Date: 2014-12
Publication status: Published online
Pages: -
Publishing info: -
Table of contents: -
Review: -
Identifiers (DOI, ISBN, etc.): BibTex Citekey: NIPS2014_5594
URI: http://papers.nips.cc/paper/5594-incremental-local-gaussian-regression.pdf
Degree: -

Related Event

Event name: 28th Annual Conference on Neural Information Processing Systems (NIPS 2014)
Venue: Montreal, CA
Start / end date: 2014-12-08 - 2014-12-13


Publication 1

Title: Advances in Neural Information Processing Systems 27 (NIPS 2014)
Type: Conference Proceedings
Authors / Editors:
Ghahramani, Z., Editor
Welling, M., Editor
Cortes, C., Editor
Lawrence, N.D., Editor
Weinberger, K.Q., Editor
Affiliations:
-
Publisher, place: Curran Associates, Inc.
Pages: -
Volume / issue: -
Sequence number: -
Start / end page: 972 - 980
Identifiers (ISBN, ISSN, DOI, etc.): -