  The mutual information: Detecting and evaluating dependencies between variables

Steuer, R., Kurths, J., Daub, C. O., Weise, J., & Selbig, J. (2002). The mutual information: Detecting and evaluating dependencies between variables. In European Conference on Computational Biology (ECCB 2002) (pp. S231-S240).

Basic
Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0014-2E79-B
Version Permalink: http://hdl.handle.net/11858/00-001M-0000-0014-2E7A-9
Genre: Conference Paper

Files

Steuer-2002-The mutual informati.pdf (Any fulltext), 524KB
 
File Permalink:
-
Name:
Steuer-2002-The mutual informati.pdf
Description:
-
Visibility:
Restricted (Max Planck Institute of Molecular Plant Physiology, MBMP)
MIME-Type / Checksum:
application/pdf
Technical Metadata:
Copyright Date:
-
Copyright Info:
-
License:
-


Creators

Creators:
Steuer, R.1, Author
Kurths, J.1, Author
Daub, C. O.2, Author
Weise, J.1, Author
Selbig, J.2, Author
Affiliations:
1 External Organizations
2 Bioinformatics CRG, Cooperative Research Groups, Max Planck Institute of Molecular Plant Physiology, Max Planck Society

Content

Free keywords: time-series; expression patterns; entropy
Abstract:
Motivation: Clustering co-expressed genes usually requires the definition of 'distance' or 'similarity' between measured datasets, the most common choices being Pearson correlation or Euclidean distance. With the size of available datasets steadily increasing, it has become feasible to consider other, more general definitions as well. One alternative, based on information theory, is the mutual information, which provides a general measure of dependencies between variables. While the use of mutual information in cluster analysis and visualization of large-scale gene expression data has been suggested previously, the earlier studies did not focus on comparing different algorithms to estimate the mutual information from finite data.
Results: Here we describe and review several approaches to estimating the mutual information from finite datasets. Our findings show that the algorithms used so far may be substantially improved upon. In particular, when dealing with small datasets, finite-sample effects and other sources of potentially misleading results have to be taken into account.
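The finite-sample issue the abstract raises can be illustrated with the simplest estimator it takes as a baseline: the naive (plug-in) binning estimator of the mutual information. The sketch below is an illustration of that general technique, not the paper's specific algorithms; the function name, bin count, and sample sizes are arbitrary choices for demonstration.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Naive binned (plug-in) estimate of I(X;Y) in nats:
    I = sum_ij p(i,j) * log( p(i,j) / (p(i) * p(j)) )."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()            # empirical joint distribution
    px = pxy.sum(axis=1)                 # marginal of X
    py = pxy.sum(axis=0)                 # marginal of Y
    nz = pxy > 0                         # skip empty bins (log 0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y_indep = rng.normal(size=200)           # independent of x: true MI = 0
y_dep = x + 0.5 * rng.normal(size=200)   # strongly dependent on x

mi_indep = mutual_information(x, y_indep)
mi_dep = mutual_information(x, y_dep)
```

Even for the independent pair, `mi_indep` comes out strictly positive: the plug-in estimator is a KL divergence of empirical distributions and is biased upward on finite samples. This is the kind of finite-sample effect the paper argues must be accounted for, especially with small datasets; the dependent pair should nonetheless score clearly higher.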

Details

Language(s): eng - English
 Dates: 2002
 Publication Status: Published in print
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
Identifiers: ISI: 000178836800032
URI: http://bioinformatics.oxfordjournals.org/content/18/suppl_2/S231.full.pdf
 Degree: -

Event

Title: European Conference on Computational Biology (ECCB 2002)
Place of Event: Saarbrücken, Germany
Start-/End Date: -


Source 1

Title: European Conference on Computational Biology (ECCB 2002)
Source Genre: Proceedings
Creator(s):
Affiliations:
Publ. Info: -
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: S231 - S240
Identifier: -