  Optimization for Machine Learning

Sra, S., Nowozin, S., & Wright, S. (Eds.). (2011). Optimization for Machine Learning. Cambridge, MA, USA: MIT Press.

Basic
Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0013-B8AC-8
Version Permalink: http://hdl.handle.net/21.11116/0000-0002-9884-D
Genre: Book

Creators

Creators:
Sra, S., Editor (affiliations 1, 2)
Nowozin, S., Editor
Wright, S. J., Editor
Affiliations:
1Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497795              
2Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794              

Content

Free keywords: -
 Abstract: The interplay between optimization and machine learning is one of the most important developments in modern computational science. Optimization formulations and methods are proving to be vital in designing algorithms to extract essential knowledge from huge volumes of data. Machine learning, however, is not simply a consumer of optimization technology but a rapidly evolving field that is itself generating new optimization ideas. This book captures the state of the art of the interaction between optimization and machine learning in a way that is accessible to researchers in both fields. Optimization approaches have enjoyed prominence in machine learning because of their wide applicability and attractive theoretical properties. The increasing complexity, size, and variety of today's machine learning models call for the reassessment of existing assumptions. This book starts the process of reassessment. It describes the resurgence in novel contexts of established frameworks such as first-order methods, stochastic approximations, convex relaxations, interior-point methods, and proximal methods. It also devotes attention to newer themes such as regularized optimization, robust optimization, gradient and subgradient methods, splitting techniques, and second-order methods. Many of these techniques draw inspiration from other fields, including operations research, theoretical computer science, and subfields of optimization. The book will enrich the ongoing cross-fertilization between the machine learning community and these other fields, and within the broader optimization community.
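The abstract mentions first-order and proximal methods among the frameworks the book covers. As an illustrative sketch only (not taken from the book), the following shows a minimal proximal-gradient (ISTA) iteration for the lasso problem; all function names, the toy data, and the parameter choices are assumptions made for this example:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t*||.||_1: shrink each entry toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_proximal_gradient(A, b, lam, steps=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA)."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth gradient
    for _ in range(steps):
        grad = A.T @ (A @ x - b)             # gradient of the smooth quadratic part
        x = soft_threshold(x - grad / L, lam / L)  # gradient step, then proximal step
    return x

# Toy problem (assumed for illustration): sparse signal, noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = lasso_proximal_gradient(A, b, lam=0.5)
```

The split into a gradient step on the smooth term and a closed-form proximal step on the nonsmooth penalty is the pattern behind several of the proximal and splitting techniques the abstract refers to.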

Details

Language(s):
 Dates: 2011-12
 Publication Status: Published in print
 Pages: 494
 Publishing info: Cambridge, MA, USA : MIT Press
 Table of Contents: -
 Rev. Type: -
 Identifiers: ISBN: 978-0-262-01646-9
BibTeX Citekey: 6822
 Degree: -

Source 1

Title: Neural information processing series
Source Genre: Series
Creator(s): -
Affiliations: -
Publ. Info: Cambridge, MA, USA : MIT Press
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: -
Identifier: -