Released

Conference Paper

Semi-Supervised Laplacian Regularization of Kernel Canonical Correlation Analysis

MPS-Authors

Blaschko, MB
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;


Lampert, CH
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;


Gretton, A
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (public)
There are no public fulltexts stored in PuRe
Supplementary Material (public)
There is no public supplementary material available
Citation

Blaschko, M., Lampert, C., & Gretton, A. (2008). Semi-Supervised Laplacian Regularization of Kernel Canonical Correlation Analysis. In W. Daelemans, B. Goethals, & K. Morik (Eds.), Machine Learning and Knowledge Discovery in Databases: European Conference, ECML PKDD 2008, Antwerp, Belgium, September 15-19, 2008 (pp. 133-145). Berlin, Germany: Springer.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-C7D3-2
Abstract
Kernel canonical correlation analysis (KCCA) is a dimensionality reduction technique for paired data. By finding directions that maximize correlation, KCCA learns representations that are tied more closely to the underlying semantics of the data than to noise. However, meaningful directions are not only those with high correlation to another modality, but also those that capture the manifold structure of the data. We propose a method that simultaneously finds directions that are highly correlated across modalities and lie along high-variance directions of the data manifold. This is achieved through semi-supervised Laplacian regularization of KCCA. We show experimentally that Laplacian-regularized training improves class separation over KCCA with Tikhonov regularization alone, while causing no degradation in the correlation between modalities. We also propose a model selection criterion based on the Hilbert-Schmidt norm of the semi-supervised Laplacian-regularized cross-covariance operator, which we compute in closed form.
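
As an illustration of the construction described in the abstract, the numpy/scipy sketch below sets up Laplacian-regularized KCCA as a generalized eigenvalue problem. The specific regularizer weighting (eps for the Tikhonov term, gam for the Laplacian term), the unnormalized graph Laplacian, and all function names are assumptions made for this sketch, not the paper's exact formulation; a fully semi-supervised variant would additionally build the kernels and Laplacians over unpaired points.

import numpy as np
from scipy.linalg import eigh

def rbf_kernel(X, gamma=1.0):
    # Gaussian RBF kernel matrix on the rows of X
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def center(K):
    # double-center a kernel matrix, as is standard before (K)CCA
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def graph_laplacian(W):
    # unnormalized graph Laplacian L = D - W from an affinity matrix W
    W = W.copy()
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

def laplacian_kcca(Kx, Ky, Lx, Ly, eps=1e-2, gam=1e-1):
    n = Kx.shape[0]
    # cross-covariance blocks: the numerator of the correlation objective
    A = np.block([[np.zeros((n, n)), Kx @ Ky],
                  [Ky @ Kx, np.zeros((n, n))]])
    # within-modality blocks, regularized by a Tikhonov term (eps)
    # plus a Laplacian smoothness term (gam) along the data manifold
    Bx = Kx @ Kx + eps * Kx + gam * Kx @ Lx @ Kx
    By = Ky @ Ky + eps * Ky + gam * Ky @ Ly @ Ky
    B = np.block([[Bx, np.zeros((n, n))],
                  [np.zeros((n, n)), By]])
    B += 1e-8 * np.eye(2 * n)  # jitter so B is positive definite
    vals, vecs = eigh(A, B)    # generalized eigenproblem A v = lambda B v
    order = np.argsort(vals)[::-1]  # largest correlations first
    return vals[order], vecs[:n, order], vecs[n:, order]

# toy usage: two noisy views sharing a one-dimensional latent signal
rng = np.random.default_rng(0)
z = rng.normal(size=(60, 1))
X = np.hstack([z, rng.normal(size=(60, 3))])
Y = np.hstack([-z, rng.normal(size=(60, 3))])
Kx, Ky = center(rbf_kernel(X)), center(rbf_kernel(Y))
corrs, alpha, beta = laplacian_kcca(Kx, Ky,
                                    graph_laplacian(rbf_kernel(X)),
                                    graph_laplacian(rbf_kernel(Y)))

Setting gam to zero recovers ordinary Tikhonov-regularized KCCA, which makes the sketch a convenient baseline for the comparison the abstract reports.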