Conference Paper

Deterministic annealing for semi-supervised kernel machines


Chapelle, O.
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;


Sindhwani, V., Keerthi, S., & Chapelle, O. (2006). Deterministic annealing for semi-supervised kernel machines. In W. Cohen, & A. Moore (Eds.), ICML '06: Proceedings of the 23rd International Conference on Machine Learning (pp. 841-848). New York, NY, USA: ACM Press.

Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-D135-A
An intuitive approach to utilizing unlabeled data in kernel-based classification algorithms is to simply treat the unknown labels as additional optimization variables. For margin-based loss functions, one can view this approach as attempting to learn low-density separators. However, this is a hard optimization problem to solve in typical semi-supervised settings, where unlabeled data is abundant. The popular Transductive SVM algorithm is a label-switching-retraining procedure that is known to be susceptible to local minima. In this paper, we present a global optimization framework for semi-supervised kernel machines in which an easier problem is parametrically deformed to the original hard problem and minimizers are smoothly tracked. Our approach is motivated by deterministic annealing techniques and involves a sequence of convex optimization problems that are solved exactly and efficiently. We present empirical results on several synthetic and real-world datasets that demonstrate the effectiveness of our approach.
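The annealing idea described in the abstract can be illustrated with a small sketch. This is a hypothetical linear-model toy (plain gradient descent, squared hinge loss), not the paper's actual kernel solver: unknown labels are relaxed to probabilities p, an entropy term scaled by a temperature T smooths the objective, p has a closed-form sigmoid update at each temperature, and T is lowered geometrically so the minimizer is tracked toward the hard low-density-separation problem. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

def sq_hinge(m):
    """Squared hinge loss max(0, 1 - m)^2 and its derivative w.r.t. m."""
    h = np.maximum(0.0, 1.0 - m)
    return h ** 2, -2.0 * h

def da_semisup_fit(Xl, yl, Xu, T0=10.0, cool=0.5, T_min=1e-2,
                   lam=1e-2, steps=500, lr=0.02):
    """Toy deterministic-annealing loop for a linear semi-supervised
    classifier (illustrative sketch, not the paper's exact algorithm)."""
    w = np.zeros(Xl.shape[1])
    p = np.full(Xu.shape[0], 0.5)      # soft labels for unlabeled points
    T = T0
    while T > T_min:
        for _ in range(steps):          # descend on w at fixed (p, T)
            _, dl = sq_hinge(yl * (Xl @ w))
            g = lam * w + Xl.T @ (dl * yl) / len(yl)
            fu = Xu @ w
            _, dp_pos = sq_hinge(fu)    # treating x_j as class +1
            _, dp_neg = sq_hinge(-fu)   # treating x_j as class -1
            # unlabeled loss: p * loss(+1) + (1 - p) * loss(-1)
            g += Xu.T @ (p * dp_pos - (1 - p) * dp_neg) / Xu.shape[0]
            w -= lr * g
        # closed-form soft-label update from minimizing the entropy-
        # smoothed objective: p = sigmoid((L_minus - L_plus) / T)
        Lp, _ = sq_hinge(Xu @ w)
        Lm, _ = sq_hinge(-(Xu @ w))
        p = 1.0 / (1.0 + np.exp(np.clip((Lp - Lm) / T, -50.0, 50.0)))
        T *= cool                       # geometric cooling schedule
    return w, p
```

At high T the soft labels stay near 0.5 and the objective over w is smooth; as T cools, p hardens toward 0/1 and the objective approaches the hard combinatorial problem, which is the "parametric deformation with smoothly tracked minimizers" the abstract refers to.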