  Regression by dependence minimization and its application to causal inference in additive noise models

Mooij, J., Janzing, D., Peters, J., & Schölkopf, B. (2009). Regression by dependence minimization and its application to causal inference in additive noise models. In A. Danyluk, L. Bottou, & M. Littman (Eds.), ICML '09: Proceedings of the 26th Annual International Conference on Machine Learning (pp. 745-752). New York, NY, USA: ACM Press.

Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0013-C4A3-8
Version Permalink: http://hdl.handle.net/21.11116/0000-0002-F920-1
Genre: Conference Paper


Creators

Creators:
Mooij, J. M.¹﹐², Author
Janzing, D.¹﹐², Author
Peters, J.¹﹐², Author
Schölkopf, B.¹﹐², Author
Affiliations:
1. Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497795
2. Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Free keywords: -
 Abstract: Motivated by causal inference problems, we propose a novel method for regression that minimizes the statistical dependence between regressors and residuals. The key advantage of this approach to regression is that it does not assume a particular distribution of the noise, i.e., it is non-parametric with respect to the noise distribution. We argue that the proposed regression method is well suited to the task of causal inference in additive noise models. A practical disadvantage is that the resulting optimization problem is generally non-convex and can be difficult to solve. Nevertheless, we report good results on one of the tasks of the NIPS 2008 Causality Challenge, where the goal is to distinguish causes from effects in pairs of statistically dependent variables. In addition, we propose an algorithm for efficiently inferring causal models from observational data for more than two variables. The required number of regressions and independence tests is quadratic in the number of variables, which is a significant improvement over the simple method that tests all possible DAGs.
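The regression idea summarized in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a biased empirical HSIC estimator with Gaussian kernels as the dependence measure, a polynomial model class, and a crude random search in place of the paper's optimizer; all function names and the synthetic data are illustrative.

```python
import numpy as np

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC with Gaussian kernels (an assumed dependence measure)."""
    n = len(x)
    def gram(v):
        return np.exp(-(v[:, None] - v[None, :]) ** 2 / (2 * sigma ** 2))
    H = np.eye(n) - np.ones((n, n)) / n            # centering matrix
    return np.trace(gram(x) @ H @ gram(y) @ H) / (n - 1) ** 2

def fit_by_dependence_minimization(x, y, degree=3, iters=150):
    """Pick polynomial coefficients minimizing HSIC(regressor, residuals).
    A crude random search stands in for the paper's (non-convex) optimization."""
    rng = np.random.default_rng(0)
    coeffs = np.polyfit(x, y, degree)              # least-squares warm start
    score = hsic(x, y - np.polyval(coeffs, x))
    for _ in range(iters):
        cand = coeffs + rng.normal(scale=0.1, size=coeffs.shape)
        s = hsic(x, y - np.polyval(cand, x))
        if s < score:                              # keep the less dependent fit
            coeffs, score = cand, s
    return coeffs, score

# Synthetic additive-noise pair x -> y with non-Gaussian (uniform) noise.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 100)
y = x + 0.3 * x ** 3 + 0.2 * rng.uniform(-1, 1, 100)

_, fwd = fit_by_dependence_minimization(x, y)      # residuals of y given x
_, bwd = fit_by_dependence_minimization(y, x)      # residuals of x given y
print("inferred direction:", "x -> y" if fwd < bwd else "y -> x")
```

In the additive-noise-model setting, the direction whose residuals are less dependent on the regressor (smaller HSIC score) is taken as the causal direction; the paper applies this kind of comparison to the cause-effect pairs of the NIPS 2008 Causality Challenge.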

Details

Language(s): -
Dates: 2009-06
Publication Status: Published in print
Pages: -
Publishing info: -
Table of Contents: -
Rev. Method: -
Identifiers: DOI: 10.1145/1553374.1553470
BibTex Citekey: 5869
Degree: -

Event

Title: 26th International Conference on Machine Learning (ICML 2009)
Place of Event: Montreal, Canada
Start-/End Date: 2009-06-14 - 2009-06-18

Source 1

Title: ICML '09: Proceedings of the 26th Annual International Conference on Machine Learning
Source Genre: Proceedings
Creator(s):
Danyluk, A., Editor
Bottou, L., Editor
Littman, M., Editor
Affiliations: -
Publ. Info: New York, NY, USA: ACM Press
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 745 - 752
Identifier: ISBN: 978-1-60558-516-1

Source 2

Title: ACM International Conference Proceeding Series
Source Genre: Series
Creator(s): -
Affiliations: -
Publ. Info: -
Pages: -
Volume / Issue: 382
Sequence Number: -
Start / End Page: -
Identifier: -