Released

Conference Paper

Measuring Statistical Dependence via the Mutual Information Dimension

MPS-Authors
/persons/resource/persons118777

Sugiyama, M
Department Molecular Biology, Max Planck Institute for Developmental Biology, Max Planck Society

/persons/resource/persons75313

Borgwardt, KM
Department Molecular Biology, Max Planck Institute for Developmental Biology, Max Planck Society

Fulltext (public)
There are no public fulltexts stored in PuRe
Supplementary Material (public)
There is no public supplementary material available
Citation

Sugiyama, M., & Borgwardt, K. (2013). Measuring Statistical Dependence via the Mutual Information Dimension. In F. Rossi (Ed.), Twenty-Third International Joint Conference on Artificial Intelligence (IJCAI 2013 ) (pp. 1692-1698). Menlo Park, CA, USA: AAAI Press.


Cite as: https://hdl.handle.net/21.11116/0000-000A-B39A-E
Abstract
We propose to measure statistical dependence between two random variables by the mutual information dimension (MID), and present a scalable, parameter-free estimation method for this task. Grounded in sound dimension theory, our method offers an effective solution to the problem of detecting interesting relationships between variables in massive data, a topic that is heavily studied in many scientific disciplines. Unlike the classical Pearson correlation coefficient, MID is zero if and only if the two random variables are statistically independent, and it is invariant under translation and scaling. We experimentally show the superior performance of MID in detecting various types of relationships in the presence of noise. Moreover, we illustrate that MID can be effectively used for feature selection in regression.
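
For orientation, the following is a minimal sketch of how a mutual-information-style dimension can be written down, assuming the standard Rényi information dimension; the notation is illustrative and may differ from the exact estimator developed in the paper.

% Sketch under the assumption of the standard Renyi information dimension,
% with H the Shannon entropy (in bits) of the integer-quantized variable.
\[
  \dim X \;=\; \lim_{m \to \infty} \frac{H\!\big(\lfloor m X \rfloor\big)}{\log_2 m},
\qquad
  \mathrm{MID}(X; Y) \;=\; \dim X + \dim Y - \dim(X, Y).
\]

Under this reading, MID vanishes when X and Y are independent and is unaffected by translating or rescaling either variable, consistent with the properties stated in the abstract.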