Released

Poster

Single-class Support Vector Machines

Citation

Schölkopf, B., Williamson, R., Smola, A., & Shawe-Taylor, J. (1999). Single-class Support Vector Machines. Poster presented at Dagstuhl-Seminar 99121: Unsupervised Learning, Dagstuhl, Germany.


Citation: https://hdl.handle.net/11858/00-001M-0000-0013-E79A-2

Abstract
Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a subset S of input space such that the probability that a test point drawn from P lies outside of S is bounded by some a priori specified 0 < ν ≤ 1.

We propose an algorithm to deal with this problem by trying to estimate a function f which is positive on S and negative on the complement of S. The functional form of f is given by a kernel expansion in terms of a potentially small subset of the training data; it is regularized by controlling the length of the weight vector in an associated feature space.

We can prove that ν upper bounds the fraction of outliers (training points outside of S) and lower bounds the fraction of support vectors. Asymptotically, under some mild condition on P, both become equalities.

The algorithm is a natural extension of the support vector algorithm to the case of unlabelled data.
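A minimal sketch of the idea described in the abstract, using only NumPy: we solve the single-class SVM dual, min (1/2) αᵀKα subject to 0 ≤ αᵢ ≤ 1/(νn) and Σαᵢ = 1, by projected gradient descent, then form the decision function f(x) = Σᵢ αᵢ k(xᵢ, x) − ρ. The toy Gaussian data, the RBF kernel with gamma = 0.5, and the projected-gradient solver are all illustrative assumptions of this sketch, not the optimization procedure proposed in the poster; the feasible set alone already forces the support-vector fraction to be at least ν, and outliers are the training points with f < 0.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unlabelled data: a 2-D Gaussian blob (illustrative only).
X = rng.normal(size=(80, 2))

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def project_capped_simplex(v, C):
    """Project v onto {0 <= a_i <= C, sum a_i = 1} by bisecting on a shift tau."""
    lo, hi = v.min() - 1.0, v.max()
    for _ in range(100):
        tau = 0.5 * (lo + hi)
        if np.clip(v - tau, 0.0, C).sum() > 1.0:
            lo = tau
        else:
            hi = tau
    return np.clip(v - 0.5 * (lo + hi), 0.0, C)

def one_class_svm(K, nu, iters=5000):
    """Solve min (1/2) a'Ka  s.t.  0 <= a_i <= 1/(nu*n), sum a_i = 1,
    by projected gradient descent (a plain sketch, not a production solver)."""
    n = K.shape[0]
    C = 1.0 / (nu * n)
    lr = 1.0 / np.linalg.eigvalsh(K).max()   # step size below the Lipschitz constant
    a = project_capped_simplex(np.full(n, 1.0 / n), C)
    for _ in range(iters):
        a = project_capped_simplex(a - lr * (K @ a), C)
    # Offset rho from margin support vectors (0 < a_i < C), which lie on f = 0.
    margin = (a > 1e-6) & (a < C - 1e-6)
    rho = (K @ a)[margin].mean() if margin.any() else (K @ a).max()
    return a, rho

nu = 0.2
K = rbf_kernel(X, X)
alpha, rho = one_class_svm(K, nu)

f_train = K @ alpha - rho                 # f(x) = sum_i alpha_i k(x_i, x) - rho
outlier_frac = float((f_train < 0).mean())
sv_frac = float((alpha > 1e-6).mean())
print(f"nu = {nu}, outlier fraction = {outlier_frac:.2f}, SV fraction = {sv_frac:.2f}")
```

Since each αᵢ is capped at 1/(νn) while the αᵢ sum to 1, at least νn of them must be nonzero, which is exactly the lower bound on the support-vector fraction stated in the abstract; the matching upper bound on the outlier fraction holds at the optimum of the dual.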