



Conference Paper

Learning low-rank output kernels


Dinuzzo, F.
Dept. Empirical Inference, Max Planck Institute for Intelligent Systems, Max Planck Society;


Dinuzzo, F., & Fukumizu, K. (2011). Learning low-rank output kernels. In C.-N. Hsu, & W. Lee (Eds.), Asian Conference on Machine Learning, 14-15 November 2011, South Garden Hotels and Resorts, Taoyuan, Taiwan (pp. 181-196). Cambridge, MA, USA: JMLR.

Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-B922-E
Output kernel learning techniques make it possible to simultaneously learn a vector-valued function and a positive semidefinite matrix that describes the relationships between the outputs. In this paper, we introduce a new formulation that imposes a low-rank constraint on the output kernel and operates directly on a factor of the kernel matrix. First, we investigate the connection between output kernel learning and a regularization problem for a two-layer architecture. Then, we show that a variety of methods, such as nuclear norm regularized regression, reduced-rank regression, principal component analysis, and low-rank matrix approximation, can be seen as special cases of the output kernel learning framework. Finally, we introduce a block coordinate descent strategy for learning low-rank output kernels.
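To illustrate the kind of block coordinate descent the abstract refers to, here is a minimal sketch (not the paper's actual algorithm) of alternating updates for a low-rank output kernel. It assumes a simplified objective ||Y - K A Bᵀ||²_F + λ||A||²_F with an orthonormal factor B, so that the output kernel is L = B Bᵀ with rank at most `rank`; under these assumptions the A-step is a ridge solve and the B-step is an orthogonal Procrustes problem. All names (`low_rank_okl`, `lam`, `rank`) are illustrative, not from the paper.

```python
import numpy as np

def low_rank_okl(K, Y, rank, lam=0.1, n_iter=50, seed=0):
    """Sketch of block coordinate descent for a low-rank output kernel.

    Predictions are K @ A @ B.T, and the learned output kernel is
    L = B @ B.T, whose rank is at most `rank` by construction.
    """
    n, m = Y.shape
    rng = np.random.default_rng(seed)
    # Initialize B (m x rank) with orthonormal columns.
    B, _ = np.linalg.qr(rng.standard_normal((m, rank)))
    K2 = K @ K
    for _ in range(n_iter):
        # A-step: with B fixed and B.T @ B = I, the ridge-regularized
        # least-squares solution is A = (K^2 + lam I)^{-1} K Y B.
        A = np.linalg.solve(K2 + lam * np.eye(n), K @ Y @ B)
        # B-step: maximize tr(B.T @ Y.T @ K @ A) over orthonormal B,
        # an orthogonal Procrustes problem solved via the SVD.
        U, _, Vt = np.linalg.svd(Y.T @ K @ A, full_matrices=False)
        B = U @ Vt
    L = B @ B.T  # low-rank, positive semidefinite output kernel
    return A, B, L
```

Because L is built as B Bᵀ with B of width `rank`, the low-rank and positive semidefinite constraints hold automatically at every iteration, which is the practical appeal of operating directly on a factor of the kernel matrix.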