Record

  Regression by dependence minimization and its application to causal inference in additive noise models

Mooij, J., Janzing, D., Peters, J., & Schölkopf, B. (2009). Regression by dependence minimization and its application to causal inference in additive noise models. In A. Danyluk, L. Bottou, & M. Littman (Eds.), ICML '09: Proceedings of the 26th Annual International Conference on Machine Learning (pp. 745-752). New York, NY, USA: ACM Press.


External references

External reference:
https://dl.acm.org/citation.cfm?doid=1553374.1553470 (publisher version)
Description: -
OA status: -

Creators

Creators:
Mooij, JM (1, 2), Author
Janzing, D (1, 2), Author
Peters, J (1, 2), Author
Schölkopf, B (1, 2), Author
Affiliations:
1: Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497795
2: Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Keywords: -
Abstract: Motivated by causal inference problems, we propose a novel method for regression that minimizes the statistical dependence between regressors and residuals. The key advantage of this approach to regression is that it does not assume a particular distribution of the noise, i.e., it is non-parametric with respect to the noise distribution. We argue that the proposed regression method is well suited to the task of causal inference in additive noise models. A practical disadvantage is that the resulting optimization problem is generally non-convex and can be difficult to solve. Nevertheless, we report good results on one of the tasks of the NIPS 2008 Causality Challenge, where the goal is to distinguish causes from effects in pairs of statistically dependent variables. In addition, we propose an algorithm for efficiently inferring causal models from observational data for more than two variables. The required number of regressions and independence tests is quadratic in the number of variables, which is a significant improvement over the simple method that tests all possible DAGs.
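
Note: the abstract describes folding the dependence measure into the regression objective itself. As a rough illustration of the additive-noise-model decision rule it builds on, the Python sketch below uses a simpler two-step stand-in: ordinary Gaussian-process regression followed by an HSIC score on the residuals, accepting the direction whose residuals look less dependent on the input. The functions hsic and residual_dependence, the kernel and bandwidth choices, and the toy data are illustrative assumptions, not the authors' implementation.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel


    def hsic(x, y):
        # Biased HSIC estimate with Gaussian kernels and median-heuristic bandwidths.
        x = x.reshape(-1, 1)
        y = y.reshape(-1, 1)
        n = len(x)

        def gram(z):
            d2 = (z - z.T) ** 2                      # squared pairwise distances
            sigma2 = np.median(d2[d2 > 0])           # median heuristic
            return np.exp(-d2 / sigma2)

        K, L = gram(x), gram(y)
        H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
        return np.trace(K @ H @ L @ H) / (n - 1) ** 2


    def residual_dependence(cause, effect):
        # Fit effect = f(cause) + noise by GP regression, then score how dependent
        # the residuals remain on the hypothetical cause.
        gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
        gp.fit(cause.reshape(-1, 1), effect)
        residuals = effect - gp.predict(cause.reshape(-1, 1))
        return hsic(cause, residuals)


    # Toy data with ground truth x -> y; the direction whose residuals are less
    # dependent on the input is accepted as causal.
    rng = np.random.default_rng(0)
    x = rng.uniform(-2.0, 2.0, 300)
    y = x ** 3 + 0.5 * rng.standard_normal(300)

    score_xy = residual_dependence(x, y)             # model y = f(x) + noise
    score_yx = residual_dependence(y, x)             # model x = g(y) + noise
    print("inferred direction:", "x -> y" if score_xy < score_yx else "y -> x")

Replacing the two-step fit with a regression that minimizes the residual dependence directly, as proposed in the paper, would change only how the residuals inside residual_dependence are obtained.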

Details

Language(s): -
Date: 2009-06
Publication status: Published
Pages: -
Place, publisher, edition: -
Table of contents: -
Type of review: -
Identifiers: DOI: 10.1145/1553374.1553470
BibTeX cite key: 5869
Degree: -

Event

Title: 26th International Conference on Machine Learning (ICML 2009)
Venue: Montreal, Canada
Start/end date: 2009-06-14 - 2009-06-18


Source 1

Title: ICML '09: Proceedings of the 26th Annual International Conference on Machine Learning
Source genre: Proceedings
Creators:
Danyluk, A., Editor
Bottou, L., Editor
Littman, M., Editor
Affiliations: -
Place, publisher, edition: New York, NY, USA: ACM Press
Pages: -
Volume / Issue: -
Article number: -
Start / end page: 745 - 752
Identifier: ISBN: 978-1-60558-516-1

Source 2

Title: ACM International Conference Proceeding Series
Source genre: Series
Creators: -
Affiliations: -
Place, publisher, edition: -
Pages: -
Volume / Issue: 382
Article number: -
Start / end page: -
Identifier: -