Released

Conference Paper

Information-theoretic Metric Learning

MPG Authors
There are no MPG authors in this publication
External Resources
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
There are no freely accessible full texts available in PuRe
Supplementary material (freely accessible)
No freely accessible supplementary material is available
Citation

Davis, J., Kulis, B., Sra, S., & Dhillon, I. (2006). Information-theoretic Metric Learning. In NIPS 2006 Workshop on Learning to Compare Examples (pp. 1-5).


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-CF4B-7
Abstract
We formulate the metric learning problem as that of minimizing the differential relative entropy between two multivariate Gaussians under constraints on the Mahalanobis distance function. Via a surprising equivalence, we show that this problem can be solved as a low-rank kernel learning problem. Specifically, we minimize the Burg divergence of a low-rank kernel to an input kernel, subject to pairwise distance constraints. Our approach has several advantages over existing methods. First, we present a natural information-theoretic formulation for the problem. Second, the algorithm utilizes the methods developed by Kulis et al. [6], which do not involve any eigenvector computation; in particular, the running time of our method is faster than that of most existing techniques. Third, the formulation offers insights into connections between metric learning and kernel learning.
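
For orientation, the following is a hedged LaTeX sketch of the two optimization problems the abstract refers to. The symbol names (A and A_0 for Mahalanobis matrices, K and K_0 for kernels, S and D for the similar/dissimilar pair sets, u and ℓ for the distance bounds) are illustrative assumptions, not taken from this record.

% Metric-learning view described in the abstract: minimize the differential
% relative entropy between two multivariate Gaussians parameterized by
% Mahalanobis matrices A_0 (prior) and A, subject to pairwise distance
% constraints (symbol names are illustrative).
\begin{align*}
\min_{A \succeq 0} \quad & \mathrm{KL}\bigl(p(x; A_0) \,\|\, p(x; A)\bigr) \\
\text{s.t.} \quad & d_A(x_i, x_j) \le u, \quad (i,j) \in S \ \text{(similar pairs)}, \\
                  & d_A(x_i, x_j) \ge \ell, \quad (i,j) \in D \ \text{(dissimilar pairs)}, \\
\text{where} \quad & d_A(x, y) = (x - y)^{\top} A\, (x - y).
\end{align*}

% Kernel-learning view: minimize the Burg (LogDet) divergence of a low-rank
% kernel K to an input kernel K_0 under the same pairwise constraints,
% where for positive-definite n x n matrices the Burg divergence is
\begin{equation*}
D_{\mathrm{Burg}}(K, K_0) \;=\; \operatorname{tr}\bigl(K K_0^{-1}\bigr) \;-\; \log\det\bigl(K K_0^{-1}\bigr) \;-\; n .
\end{equation*}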