  A scalable trust-region algorithm with application to mixed-norm regression

Kim, D., Sra, S., & Dhillon, I. (2010). A scalable trust-region algorithm with application to mixed-norm regression. In J. Fürnkranz, & T. Joachims (Eds.), 27th International Conference on Machine Learning (ICML 2010) (pp. 519-526). Madison, WI, USA: Omnipress.

Files

ICML2010-Kim_6519[0].pdf (Any fulltext), 333KB
Name:
ICML2010-Kim_6519[0].pdf
Description:
-
OA-Status:
Visibility:
Public
MIME-Type / Checksum:
application/pdf / [MD5]
Technical Metadata:
Copyright Date:
-
Copyright Info:
-
License:
-

Creators

 Creators:
Kim, D., Author
Sra, S.1,2, Author
Dhillon, I., Author
Affiliations:
1 Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497795
2 Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Free keywords: -
Abstract: We present a new algorithm for minimizing a convex loss function subject to regularization. Our framework applies to numerous problems in machine learning and statistics; notably, for sparsity-promoting regularizers such as the ℓ1 or ℓ1,∞ norms, it enables efficient computation of sparse solutions. Our approach is based on the trust-region framework with nonsmooth objectives, which allows us to build on known results to provide a convergence analysis. We avoid the computational overhead associated with the conventional Hessian approximation used by trust-region methods by instead using a simple separable quadratic approximation. This approximation also enables the use of proximity operators for tackling nonsmooth regularizers. We illustrate the versatility of the resulting algorithm by specializing it to three mixed-norm regression problems: group lasso [36], group logistic regression [21], and multi-task lasso [19]. We experiment with both synthetic and real-world large-scale data; our method is competitive, robust, and scalable.
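The key idea in the abstract, replacing the Hessian-based trust-region model with a separable quadratic approximation so that each subproblem reduces to a proximity-operator evaluation, can be illustrated with a minimal sketch. This is not the authors' implementation; the ISTA-style update, the spectral-norm step size, and all function names are illustrative assumptions, shown here for the ℓ1 case and the blockwise (group-lasso-style) ℓ2,1 case:

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximity operator of tau * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def group_soft_threshold(x, tau, groups):
    """Prox of tau * sum_g ||x_g||_2: blockwise shrinkage, as used for group lasso."""
    out = x.copy()
    for g in groups:
        nrm = np.linalg.norm(x[g])
        out[g] = 0.0 if nrm <= tau else (1.0 - tau / nrm) * x[g]
    return out

def prox_gradient_lasso(A, b, lam, step=None, iters=200):
    """Minimize 0.5*||A w - b||^2 + lam*||w||_1 by pairing a separable
    quadratic model of the smooth loss with a prox step (illustrative sketch,
    not the paper's trust-region scheme)."""
    if step is None:
        # 1 / Lipschitz constant of the gradient of the smooth part
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    w = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ w - b)
        # Separable quadratic model => prox of the regularizer applied
        # coordinatewise to the gradient step.
        w = soft_threshold(w - step * grad, step * lam)
    return w
```

Because the quadratic model is separable, the prox step decouples across coordinates (or across groups for mixed norms), which is what makes this kind of update cheap at scale; the paper wraps such steps in a trust-region framework with a convergence analysis for nonsmooth objectives.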

Details

Language(s):
 Dates: 2010-06
 Publication Status: Published in print
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: BibTeX Citekey: 6519
 Degree: -

Event

Title: 27th International Conference on Machine Learning (ICML 2010)
Place of Event: Haifa, Israel
Start-/End Date: 2010-06-21 - 2010-06-24

Source 1

Title: 27th International Conference on Machine Learning (ICML 2010)
Source Genre: Proceedings
 Creator(s):
Fürnkranz, J, Editor
Joachims, T, Editor
Affiliations:
-
Publ. Info: Madison, WI, USA : Omnipress
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 519 - 526
Identifier: ISBN: 978-1-605-58907-7