  Fast Kernel ICA using an Approximate Newton Method

Shen, H., Jegelka, S., & Gretton, A. (2007). Fast Kernel ICA using an Approximate Newton Method. In M. Meila & X. Shen (Eds.), Artificial Intelligence and Statistics, 21-24 March 2007, San Juan, Puerto Rico (pp. 476-483). Madison, WI, USA: International Machine Learning Society.

Basic

Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0013-CE7B-0
Version Permalink: http://hdl.handle.net/21.11116/0000-0003-E8BB-5
Genre: Conference Paper

Files


Locators

Description:
-

Creators

Creators:
Shen, H., Author
Jegelka, S. (1, 2), Author
Gretton, A. (1, 2), Author
Affiliations:
1. Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497795
2. Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Free keywords: -
 Abstract: Recent approaches to independent component analysis (ICA) have used kernel independence measures to obtain very good performance, particularly where classical methods experience difficulty (for instance, sources with near-zero kurtosis). We present Fast Kernel ICA (FastKICA), a novel optimisation technique for one such kernel independence measure, the Hilbert-Schmidt independence criterion (HSIC). Our search procedure uses an approximate Newton method on the special orthogonal group, where we estimate the Hessian locally about independence. We employ incomplete Cholesky decomposition to efficiently compute the gradient and approximate Hessian. FastKICA results in more accurate solutions at a given cost compared with gradient descent, and is relatively insensitive to local minima when initialised far from independence. These properties allow kernel approaches to be extended to problems with larger numbers of sources and observations. Our method is competitive with other modern and classical ICA approaches in both speed and accuracy.
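The HSIC measure that FastKICA optimises can be illustrated with a minimal biased empirical estimator. The sketch below assumes Gaussian kernels with bandwidth `sigma` and equal-length 1-D samples; it is a toy version for intuition only — the paper's implementation additionally uses incomplete Cholesky decomposition to compute the gradient and approximate Hessian efficiently, which this sketch omits.

```python
import numpy as np

def gaussian_kernel(x, sigma=1.0):
    """Gaussian (RBF) kernel matrix for a 1-D sample."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC: trace(K H L H) / (m-1)^2,
    where H is the centering matrix. Assumed toy estimator,
    not the paper's incomplete-Cholesky implementation."""
    m = x.shape[0]
    K = gaussian_kernel(x, sigma)
    L = gaussian_kernel(y, sigma)
    H = np.eye(m) - np.ones((m, m)) / m   # centering matrix
    return np.trace(K @ H @ L @ H) / (m - 1) ** 2

# Dependent signals should score markedly higher than independent ones,
# even when (as here) the dependent pair is uncorrelated.
rng = np.random.default_rng(0)
s = rng.uniform(-1, 1, size=300)   # source signal
n = rng.uniform(-1, 1, size=300)   # independent noise
print(hsic(s, s**2) > hsic(s, n))
```

Because HSIC detects any nonlinear dependence, `s` versus `s**2` (zero correlation, full dependence) scores well above the independent pair — the property that lets kernel ICA handle sources with near-zero kurtosis where classical contrast functions struggle.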

Details

Language(s): -
 Dates: 2007-03
 Publication Status: Published in print
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
Identifiers: BibTeX Citekey: 4295
 Degree: -

Event

Title: 11th International Conference on Artificial Intelligence and Statistics (AISTATS 2007)
Place of Event: San Juan, Puerto Rico
Start-/End Date: 2007-03-21 - 2007-03-24

Legal Case


Project information


Source 1

Title: Artificial Intelligence and Statistics, 21-24 March 2007, San Juan, Puerto Rico
Source Genre: Proceedings
 Creator(s):
Meila, M., Editor
Shen, X., Editor
Affiliations:
-
Publ. Info: Madison, WI, USA: International Machine Learning Society
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 476 - 483
Identifier: -

Source 2

Title: JMLR Workshop and Conference Proceedings
Source Genre: Series
 Creator(s):
Affiliations:
Publ. Info: -
Pages: -
Volume / Issue: 2
Sequence Number: -
Start / End Page: -
Identifier: -