09/07/2020

Approximate is Good Enough: Probabilistic Variants of Dimensional and Margin Complexity

Pritish Kamath, Omar Montasser, Nathan Srebro

Keywords: Kernel methods, PAC learning

Abstract: We present and study approximate notions of dimensional and margin complexity that require only approximating, rather than exactly representing, a given hypothesis class. We show that such notions are not only sufficient for learning using linear predictors or a kernel, but, unlike the exact variants, are also necessary. They are thus better suited for discussing the limitations of linear and kernel methods.
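As context for the abstract, the exact dimension complexity of a binary hypothesis class can be sketched as follows. This is the standard notion from the sign-rank/margin-complexity literature; the notation below is an assumption for illustration, not taken verbatim from the paper or talk:

```latex
% Sketch (assumed notation): exact dimension complexity of a class
% \mathcal{H} \subseteq \{\pm 1\}^{\mathcal{X}} is the smallest dimension d
% admitting an embedding that realizes every hypothesis exactly as a
% halfspace.
\mathrm{DC}(\mathcal{H}) = \min \Bigl\{ d \,:\, \exists\, \phi : \mathcal{X} \to \mathbb{R}^d
  \text{ s.t. } \forall h \in \mathcal{H} \;\exists\, w_h \in \mathbb{R}^d,\;
  \forall x \in \mathcal{X},\ \operatorname{sign}\bigl(\langle w_h, \phi(x) \rangle\bigr) = h(x) \Bigr\}
```

Margin complexity is defined analogously, with norm-bounded embeddings and the inverse of the best achievable margin in place of the dimension. The approximate variants studied in the paper relax the exact sign-agreement requirement to approximate agreement (roughly, agreement up to a small error), which is what makes them necessary, and not merely sufficient, for learnability with linear or kernel methods.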

The talk and the corresponding paper were published at the COLT 2020 virtual conference.

