  Let the Kernel Figure it Out: Principled Learning of Pre-processing for Kernel Classifiers

Gehler, P., & Nowozin, S. (2009). Let the Kernel Figure it Out: Principled Learning of Pre-processing for Kernel Classifiers. In 2009 IEEE Conference on Computer Vision and Pattern Recognition (pp. 2836-2843). Piscataway, NJ, USA: IEEE Service Center.

Basic
Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0013-C491-0
Version Permalink: http://hdl.handle.net/21.11116/0000-0002-F8FF-8
Genre: Conference Paper

Files


Locators

Description:
-

Creators:
Gehler, P. V. (1, 2), Author
Nowozin, S. (1, 2), Author

Affiliations:
(1) Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497795
(2) Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Free keywords: -
Abstract: Most modern computer vision systems for high-level tasks, such as image classification, object recognition and segmentation, are based on learning algorithms that are able to separate discriminative information from noise. In practice, however, the typical system consists of a long pipeline of pre-processing steps, such as extraction of different kinds of features, various kinds of normalizations, feature selection, and quantization into aggregated representations such as histograms. Along this pipeline, there are many parameters to set and choices to make, and their effect on the overall system performance is a priori unclear. In this work, we shorten the pipeline in a principled way. We move pre-processing steps into the learning system by means of kernel parameters, letting the learning algorithm decide upon suitable parameter values. Learning to optimize the pre-processing choices becomes learning the kernel parameters. We realize this paradigm by extending the recent Multiple Kernel Learning formulation from the finite case of having a fixed number of kernels which can be combined to the general infinite case where each possible parameter setting induces an associated kernel. We evaluate the new paradigm extensively on image classification and object classification tasks. We show that it is possible to learn optimal discriminative codebooks and optimal spatial pyramid schemes, consistently outperforming all previous state-of-the-art approaches.
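The finite Multiple Kernel Learning setting that the abstract generalizes combines a set of base kernels, one per pre-processing parameter value, with learned mixing weights. The following is a minimal illustrative sketch only, not the paper's infinite formulation: the RBF bandwidths `gammas` and the weights `betas` are assumptions fixed by hand here, where MKL training would optimize the weights jointly with the classifier.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Gaussian (RBF) kernel: exp(-gamma * ||x - y||^2) for all pairs
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def combined_kernel(X, Y, gammas, betas):
    # Convex combination of base kernels, one base kernel per
    # parameter setting; in MKL the betas would be learned.
    return sum(b * rbf_kernel(X, Y, g) for g, b in zip(gammas, betas))

# Toy data: three 1-D points
X = np.array([[0.0], [1.0], [2.0]])
gammas = [0.1, 1.0, 10.0]            # candidate pre-processing parameters
betas = np.array([0.2, 0.5, 0.3])    # hand-fixed here; sums to 1
K = combined_kernel(X, X, gammas, betas)
```

Since each base kernel is positive semi-definite and the weights are non-negative, the combination `K` is again a valid kernel and can be plugged into any kernel classifier; the paper's extension replaces the fixed finite grid `gammas` with the continuum of all parameter settings.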

Details

Language(s): -
Dates: 2009-06
Publication Status: Published in print
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: -
Identifiers: DOI: 10.1109/CVPRW.2009.5206592
BibTeX Citekey: 5829
Degree: -

Event

Title: IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Place of Event: Miami Beach, FL, USA
Start-/End Date: 2009-06-20 - 2009-06-25

Source 1

Title: 2009 IEEE Conference on Computer Vision and Pattern Recognition
Source Genre: Proceedings
Creator(s): -
Affiliations: -
Publ. Info: Piscataway, NJ, USA: IEEE Service Center
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 2836 - 2843
Identifier: ISBN: 978-1-4244-3991-1