
Item Details


Released

Journal Article

Statistical Properties of Kernel Principal Component Analysis

MPS-Authors
There are no MPG authors available for this publication.
Fulltext (restricted access)
There are currently no full texts shared for your IP range.
Fulltext (public)
There is no public fulltext available.
Supplementary Material (public)
There is no public supplementary material available
Citation

Blanchard, G., Bousquet, O., & Zwald, L. (2006). Statistical Properties of Kernel Principal Component Analysis. Machine Learning, 66(2-3), 259-294. doi:10.1007/s10994-006-6895-9.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-D273-3
Abstract
We study the properties of the eigenvalues of Gram matrices in a non-asymptotic setting. Using local Rademacher averages, we provide data-dependent and tight bounds for their convergence towards the eigenvalues of the corresponding kernel operator. We perform these computations in a functional-analytic framework which allows us to deal implicitly with reproducing kernel Hilbert spaces of infinite dimension. This can have applications to various kernel algorithms, such as Support Vector Machines (SVM). We focus on Kernel Principal Component Analysis (KPCA) and, using such techniques, we obtain sharp excess risk bounds for the reconstruction error. In these bounds, the dependence on the decay of the spectrum and on the closeness of successive eigenvalues is made explicit.
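
The following is a minimal numerical sketch, not taken from the paper, illustrating the quantities the abstract refers to: the eigenvalues of the (centered) Gram matrix K/n as empirical estimates of the kernel operator's eigenvalues, and the empirical KPCA reconstruction error as the tail sum of those eigenvalues beyond the top d components. The RBF kernel, its bandwidth, and the Gaussian sample are illustrative assumptions, not choices made in the paper.

import numpy as np

def rbf_gram(X, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2) (assumed RBF kernel)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def empirical_spectrum(K):
    """Eigenvalues of the centered Gram matrix divided by n, in decreasing order.

    These are the empirical counterparts of the kernel operator's eigenvalues;
    the paper studies how fast they converge in a non-asymptotic sense."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering in feature space
    Kc = H @ K @ H
    evals = np.linalg.eigvalsh(Kc / n)
    return np.sort(evals)[::-1]

def kpca_reconstruction_error(K, d):
    """Empirical reconstruction error of d-component KPCA:
    the sum of the empirical eigenvalues beyond the top d
    (small negative values from round-off are clipped to zero)."""
    evals = empirical_spectrum(K)
    return float(np.sum(np.clip(evals[d:], 0.0, None)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))          # illustrative sample
    K = rbf_gram(X, gamma=0.5)
    lam = empirical_spectrum(K)
    print("top 5 empirical eigenvalues:", np.round(lam[:5], 4))
    for d in (1, 2, 5, 10):
        print(f"d={d:2d}  empirical reconstruction error = "
              f"{kpca_reconstruction_error(K, d):.4f}")

The printed tail sums decay with d at a rate governed by the spectrum's decay, which is exactly the dependence the paper's excess risk bounds make explicit.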