  Robust Learning under Strong Noise via SQs

Anagnostides, I., Gouleakis, T., & Marashian, A. (2020). Robust Learning under Strong Noise via SQs. Retrieved from https://arxiv.org/abs/2010.09106.

Basic

Genre: Paper
LaTeX: Robust Learning under Strong Noise via {SQs}

Files

arXiv:2010.09106.pdf (Preprint), 511KB
Name: arXiv:2010.09106.pdf
Description: File downloaded from arXiv at 2020-12-09 13:33
OA-Status: -
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -

Creators

Creators:
Anagnostides, Ioannis (1), Author
Gouleakis, Themis (2), Author
Marashian, Ali (1), Author
Affiliations:
(1) External Organizations, ou_persistent22
(2) Algorithms and Complexity, MPI for Informatics, Max Planck Society, ou_24019

Content

Free keywords: Statistics, Machine Learning, stat.ML, Computer Science, Learning, cs.LG
Abstract: This work provides several new insights into the robustness of Kearns' statistical query framework against challenging label-noise models. First, we build on a recent result by \cite{DBLP:journals/corr/abs-2006-04787} showing the noise tolerance of distribution-independently evolvable concept classes under Massart noise. Specifically, we extend their characterization to more general noise models, including the Tsybakov model, which considerably generalizes the Massart condition by allowing the flipping probability to be arbitrarily close to $\frac{1}{2}$ on a subset of the domain. As a corollary, we employ an evolutionary algorithm by \cite{DBLP:conf/colt/KanadeVV10} to obtain the first polynomial-time algorithm with arbitrarily small excess error for learning linear threshold functions over any spherically symmetric distribution in the presence of spherically symmetric Tsybakov noise. Moreover, we posit access to a stronger oracle in which, for every labeled example, we additionally obtain its flipping probability. In this model, we show that every SQ-learnable class admits an efficient learning algorithm with OPT + $\epsilon$ misclassification error for a broad class of noise models. This setting substantially generalizes the widely studied problem of classification under RCN with a known noise rate, and it corresponds to a non-convex optimization problem even when the noise function (i.e., the flipping probabilities of all points) is known in advance.
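
For reference, a sketch of the two noise conditions contrasted in the abstract, in one common parameterization from the learning-theory literature (the exact constants and exponent form here are our assumption, not reproduced from this record; $\eta(x)$ denotes the probability that the label of example $x$ is flipped):

% Massart noise with bound $\eta < \frac{1}{2}$: the flipping probability is
% uniformly bounded away from $\frac{1}{2}$ over the whole domain.
\[ \eta(x) \le \eta < \tfrac{1}{2} \quad \text{for all } x \in \mathcal{X}. \]

% $(\alpha, A)$-Tsybakov noise, $\alpha \in (0,1)$, $A \ge 1$: the mass of points
% whose flipping probability approaches $\frac{1}{2}$ decays polynomially in $t$.
\[ \Pr_{x \sim \mathcal{D}}\!\left[\eta(x) \ge \tfrac{1}{2} - t\right] \le A\, t^{\frac{\alpha}{1-\alpha}} \quad \text{for all } t \in \left(0, \tfrac{1}{2}\right]. \]

Under Massart noise the left-hand side is $0$ for every $t < \frac{1}{2} - \eta$, so the Massart condition is the degenerate case in which no probability mass sits near flipping probability $\frac{1}{2}$; Tsybakov noise relaxes exactly this restriction, which is the sense in which it "considerably generalizes" the Massart condition. Similarly, Kearns' SQ model referenced above replaces labeled examples with an oracle $\mathrm{STAT}(\tau)$ that, for any bounded query $\phi : \mathcal{X} \times \{\pm 1\} \to [-1, 1]$, returns some value $v$ with

\[ \left| v - \mathbb{E}_{(x, y)}\!\left[\phi(x, y)\right] \right| \le \tau. \]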

Details

Language(s): eng - English
Dates: 2020-10-18, 2020
 Publication Status: Published online
 Pages: 21 p.
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: arXiv: 2010.09106
BibTeX Citekey: Anagnostides_arXiv2010.09106
URI: https://arxiv.org/abs/2010.09106
 Degree: -
