Conference Paper

Convex Perturbations for Scalable Semidefinite Programming


Sra,  S
Department of Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society


Kulis, B., Sra, S., & Dhillon, I. (2009). Convex Perturbations for Scalable Semidefinite Programming. In D. van Dyk, & M. Welling (Eds.), Twelfth International Conference on Artificial Intelligence and Statistics (AIStats 2009) (pp. 296-303). Cambridge, MA, USA: MIT Press.

Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-C53F-6
Many important machine learning problems are modeled and solved via semidefinite programs; examples include metric learning, nonlinear embedding, and certain clustering problems. Often, off-the-shelf software is invoked for the associated optimization, which can be inappropriate due to excessive computational and storage requirements. In this paper, we introduce the use of convex perturbations for solving semidefinite programs (SDPs), and for a specific perturbation we derive an algorithm that has several advantages over existing techniques: a) it is simple, requiring only a few lines of Matlab, b) it is a first-order method, and thereby scalable, and c) it can easily exploit the structure of a given SDP (e.g., when the constraint matrices are low-rank, a situation common to several machine learning SDPs). A pleasant byproduct of our method is a fast, kernelized version of the large-margin nearest neighbor metric learning algorithm. We demonstrate that our algorithm is effective in finding fast approximations to large-scale SDPs arising in some machine learning applications.
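The paper's specific perturbation-based algorithm is not reproduced here. As a generic illustration of the first-order style it advocates (gradient steps plus a projection onto the positive semidefinite cone), the following sketch solves a standard-form SDP with a quadratic penalty on the linear constraints. All function names and parameters are our own illustrative choices, not the authors' method.

```python
import numpy as np

def project_psd(X):
    # Project a symmetric matrix onto the PSD cone by
    # zeroing out its negative eigenvalues.
    w, V = np.linalg.eigh((X + X.T) / 2)
    return (V * np.clip(w, 0, None)) @ V.T

def sdp_first_order(C, A_list, b, rho=10.0, step=0.01, iters=2000):
    # Approximately minimize <C, X> subject to <A_i, X> = b_i and
    # X PSD, using gradient descent on a quadratic-penalty objective
    # followed by projection onto the PSD cone at each step.
    # (Generic sketch only; rho, step, iters are illustrative.)
    n = C.shape[0]
    X = np.eye(n)
    for _ in range(iters):
        grad = C.copy()
        for A, bi in zip(A_list, b):
            grad += 2.0 * rho * (np.sum(A * X) - bi) * A
        X = project_psd(X - step * grad)
    return X
```

For example, minimizing <-I, X> subject to tr(X) = 1 drives the trace toward 1 (up to the penalty's constraint slack of roughly 1/(2*rho)) while keeping X PSD. Note that the full eigendecomposition in the projection is exactly the cost that structure-exploiting methods, such as the one in this paper for low-rank constraint matrices, aim to avoid.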