
Item Details


Released

Conference Paper

The Kernel Mutual Information

MPS-Authors

Gretton, A.
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (restricted access)
Fulltext (public)
There are no publicly available full texts.
Supplementary Material (public)
There is no public supplementary material available.
Citation

Gretton, A., Herbrich, R., & Smola, A. (2003). The Kernel Mutual Information. In IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '03) (pp. 880-883). Piscataway, NJ, USA: IEEE.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-DCB9-6
Abstract
We introduce a new contrast function, the kernel mutual information (KMI), to measure the degree of independence of continuous random variables. This contrast function provides an approximate upper bound on the mutual information, as measured near independence, and is based on a kernel density estimate of the mutual information between discretised approximations of the continuous random variables. We show that the kernel generalised variance (KGV) of F. Bach and M. Jordan (JMLR, vol. 3, pp. 1-48, 2002) is also an upper bound on the same kernel density estimate, but a looser one. Finally, we suggest that the addition of a regularising term in the KGV causes it to approach the KMI, which motivates the introduction of this regularisation.
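For readers unfamiliar with these quantities, the KGV that the abstract compares against takes the form -1/2 * sum_i log(1 - rho_i^2), where the rho_i are regularised kernel canonical correlations computed from centred Gram matrices (Bach & Jordan, 2002). The following NumPy sketch of the KGV is illustrative only, not the authors' implementation: the Gaussian kernel, the width sigma, the regularisation constant kappa, and all function names are assumptions made for this example.

import numpy as np

def _centred_gram(x, sigma):
    # Gaussian Gram matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)),
    # centred in feature space: H K H with H = I - (1/n) 1 1^T.
    sq = np.sum(x * x, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * x @ x.T
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    n = K.shape[0]
    H = np.eye(n) - np.full((n, n), 1.0 / n)
    return H @ K @ H

def kgv(x, y, sigma=1.0, kappa=2e-2):
    # Regularised kernel canonical correlations rho_i are the singular
    # values of A @ B, where A = K (K + c I)^{-1}, B = L (L + c I)^{-1},
    # with regularisation c = n * kappa / 2 (following Bach & Jordan).
    n = x.shape[0]
    K = _centred_gram(x, sigma)
    L = _centred_gram(y, sigma)
    c = n * kappa / 2.0
    # K and (K + c I)^{-1} commute, so solve() yields the symmetric A.
    A = np.linalg.solve(K + c * np.eye(n), K)
    B = np.linalg.solve(L + c * np.eye(n), L)
    rho = np.linalg.svd(A @ B, compute_uv=False)
    rho = np.clip(rho, 0.0, 1.0 - 1e-12)  # numerical safety near rho = 1
    # KGV = -1/2 * sum_i log(1 - rho_i^2); near zero under independence.
    return -0.5 * np.sum(np.log1p(-rho ** 2))

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
print(kgv(x, rng.normal(size=(200, 1))))        # independent: small
print(kgv(x, x + 0.1 * rng.normal(size=(200, 1))))  # dependent: much larger

As the abstract notes, the KMI bounds the same kernel density estimate of the mutual information more tightly than the KGV, and adding a regularising term to the KGV brings it toward the KMI; the paper itself should be consulted for the exact KMI normalisation.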