
  Infinite Kernel Learning

Gehler, P., & Nowozin, S. (2008). Infinite Kernel Learning. In NIPS 2008 Workshop: Kernel Learning: Automatic Selection of Optimal Kernels (LK ASOK 2008) (pp. 1-4).


Basic

Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0013-C631-7 Version Permalink: http://hdl.handle.net/21.11116/0000-0003-A0E0-A
Genre: Conference Paper

Files

NIPS-LKASOK-2008-Gehler.pdf (Any fulltext), 163KB
Name: NIPS-LKASOK-2008-Gehler.pdf
Description: -
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata: -
Copyright Date: -
Copyright Info: -
License: -


Creators

Creators:
Gehler, P. V. (1, 2), Author
Nowozin, S. (1, 2), Author
Affiliations:
1. Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497795
2. Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Free keywords: -
Abstract: In this paper we build upon the Multiple Kernel Learning (MKL) framework, and in particular on [1], which generalized it to infinitely many kernels. We rewrite the problem in the standard MKL formulation, which leads to a Semi-Infinite Program. We devise a new algorithm, Infinite Kernel Learning (IKL), to solve it. The IKL algorithm is applicable to both the finite and the infinite case, and we find it to be faster and more stable than SimpleMKL [2]. Furthermore, we present the first large-scale comparison of SVMs to MKL on a variety of benchmark datasets, also comparing IKL. The results show two things: (a) for many datasets there is no benefit in using MKL/IKL instead of the SVM classifier, so the flexibility of using more than one kernel appears to be of no use; (b) on some datasets IKL yields massive increases in accuracy over SVM/MKL due to the possibility of using a greatly enlarged kernel set. For those cases parameter selection through cross-validation or MKL is not applicable.
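The MKL setup the abstract builds on can be illustrated with a minimal sketch (not taken from the paper; the function names and the choice of a Gaussian kernel family are illustrative assumptions): MKL learns a convex combination of base kernel matrices, and IKL extends the search from a finite grid of kernel parameters to a continuously parameterized family.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2).
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def combined_kernel(X, Y, gammas, betas):
    # MKL optimizes over convex combinations sum_k beta_k * K(gamma_k),
    # with beta_k >= 0 and sum_k beta_k = 1. In IKL the parameter gamma
    # may range over a continuum rather than a fixed finite set.
    assert all(b >= 0 for b in betas) and np.isclose(sum(betas), 1.0)
    return sum(b * rbf_kernel(X, Y, g) for b, g in zip(betas, gammas))

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 3))
K = combined_kernel(X, X, gammas=[0.1, 1.0, 10.0], betas=[0.5, 0.3, 0.2])
```

A nonnegative combination of positive semidefinite kernels is itself positive semidefinite, so `K` can be passed to any kernel machine (e.g. an SVM with a precomputed kernel); the learning problem is then the joint choice of the weights `betas` and, in IKL, the parameters `gammas`.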

Details

Language(s): -
Dates: 2008-12
Publication Status: Published in print
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: -
Identifiers: BibTex Citekey: 5657
Degree: -

Event

Title: NIPS 2008 Workshop: Kernel Learning: Automatic Selection of Optimal Kernels (LK ASOK 2008)
Place of Event: Whistler, BC, Canada
Start-/End Date: 2008-12-13



Source 1

Title: NIPS 2008 Workshop: Kernel Learning: Automatic Selection of Optimal Kernels (LK ASOK 2008)
Source Genre: Proceedings
Creator(s): -
Affiliations: -
Publ. Info: -
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 1 - 4
Identifier: -