  Causal Markov condition for submodular information measures

Steudel, B., Janzing, D., & Schölkopf, B. (2010). Causal Markov condition for submodular information measures. In A. Tauman Kalai & M. Mohri (Eds.), 23rd Annual Conference on Learning Theory (COLT 2010) (pp. 464-476). Madison, WI, USA: OmniPress.

Files:
COLT2010-Steudel_[0].pdf (Any fulltext), 142KB
Name: COLT2010-Steudel_[0].pdf
Description: -
OA-Status: -
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -
License: -

Creators:
Steudel, B., Author
Janzing, D.1, 2, Author
Schölkopf, B.1, 2, Author
Affiliations:
1 Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497795
2 Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Free keywords: -
Abstract: The causal Markov condition (CMC) is a postulate that links observations to causality. It describes the conditional independences among the observations that are entailed by a causal hypothesis in terms of a directed acyclic graph. In the conventional setting, the observations are random variables and the independence is a statistical one, i.e., the information content of observations is measured in terms of Shannon entropy. We formulate a generalized CMC for any kind of observations on which independence is defined via an arbitrary submodular information measure. Recently, this has been discussed for observations in terms of binary strings, where information is understood in the sense of Kolmogorov complexity. Our approach enables us to find computable alternatives to Kolmogorov complexity, e.g., the length of a text after applying existing data compression schemes. We show that our CMC is justified if one restricts the attention to a class of causal mechanisms that is adapted to the respective information measure. Our justification is similar to deriving the statistical CMC from functional models of causality, where every variable is a deterministic function of its observed causes and an unobserved noise term. Our experiments on real data demonstrate the performance of compression-based causal inference.
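As a concrete illustration of the compression-based information measure the abstract mentions, here is a minimal Python sketch (not the authors' code; the function names R and cond_info, and the choice of zlib as the compressor, are illustrative assumptions). It approximates the information content R of a byte string by its compressed length, a computable stand-in for Kolmogorov complexity, and scores conditional independence via I(x; y | z) = R(xz) + R(yz) - R(xyz) - R(z), which is non-negative exactly when R is submodular. Real compressors satisfy submodularity only approximately, so small negative scores can occur.

    import zlib

    def R(*parts: bytes) -> int:
        """Approximate the information content of concatenated observations
        by their zlib-compressed length (a computable stand-in for
        Kolmogorov complexity)."""
        return len(zlib.compress(b"".join(parts), 9))

    def cond_info(x: bytes, y: bytes, z: bytes = b"") -> int:
        # I(x; y | z) = R(xz) + R(yz) - R(xyz) - R(z).
        # Submodularity of R makes this (approximately) non-negative;
        # values near zero suggest conditional independence under the
        # compression-based measure.
        return R(x, z) + R(y, z) - R(x, y, z) - R(z)

    if __name__ == "__main__":
        # Two texts that share structure compress much better jointly
        # than separately, so their mutual information should be large.
        x = b"the quick brown fox jumps over the lazy dog " * 20
        y = b"the quick brown fox naps beside the lazy dog " * 20
        w = bytes(range(256)) * 4  # byte pattern unrelated to the texts
        print("I(x;y) =", cond_info(x, y))  # expected: clearly positive
        print("I(x;w) =", cond_info(x, w))  # expected: near zero

Under this sketch, a hypothesized DAG would be tested by checking that the conditional informations it entails as vanishing are indeed close to zero, mirroring the statistical CMC with compressed length in place of Shannon entropy.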

Details

Language(s): -
 Dates: 2010-06
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: BibTeX Citekey: 6772
 Degree: -

Event

Title: 23rd Annual Conference on Learning Theory (COLT 2010)
Place of Event: Haifa, Israel
Start-/End Date: 2010-06-27 - 2010-06-29

Source 1

Title: 23rd Annual Conference on Learning Theory (COLT 2010)
Source Genre: Proceedings
 Creator(s):
Tauman Kalai, A., Editor
Mohri, M., Editor
Affiliations:
-
Publ. Info: Madison, WI, USA : OmniPress
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 464 - 476
Identifier: ISBN: 978-0-9822529-2-5