Approximate is Good Enough: Probabilistic Variants of Dimensional and Margin Complexity

Pritish Kamath, Omar Montasser, Nathan Srebro

[Proceedings link] [PDF]

Subject areas: Kernel methods, PAC learning

Presented in: Session 2B, Session 2D


Abstract: We present and study approximate notions of dimensional and margin complexity, which require only approximating a given hypothesis class rather than representing it exactly. We show that such notions are not only sufficient for learning using linear predictors or a kernel, but, unlike the exact variants, are also necessary. Thus they are better suited for discussing the limitations of linear and kernel methods.
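For intuition, here is a sketch in standard notation (an illustration of the general idea, not necessarily the paper's exact formalization): the exact dimensional complexity of a class is the smallest dimension in which every hypothesis is realized exactly by a linear separator, while an approximate variant relaxes this to agreement up to an $\epsilon$-fraction of errors.

```latex
% Exact dimensional complexity: smallest dimension admitting an
% exact sign-representation of every hypothesis in the class.
\[
  \mathrm{dc}(\mathcal{H}) = \min\Bigl\{ d : \exists\, \phi : \mathcal{X} \to \mathbb{R}^d
    \ \forall h \in \mathcal{H}\ \exists\, w_h \in \mathbb{R}^d
    \ \forall x \in \mathcal{X},\ \operatorname{sign}\bigl(\langle w_h, \phi(x)\rangle\bigr) = h(x) \Bigr\}
\]
% An epsilon-approximate relaxation asks only that the linear predictor
% agree with h on all but an epsilon-fraction of points under a
% distribution D over the domain:
\[
  \Pr_{x \sim D}\Bigl[\operatorname{sign}\bigl(\langle w_h, \phi(x)\rangle\bigr) \neq h(x)\Bigr] \le \epsilon
\]
```

The abstract's point is that relaxations of this flavor (and the analogous relaxation of margin complexity) characterize learnability with linear or kernel predictors, whereas the exact quantities can be much larger and so only give one-sided bounds.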

Summary presentation

Full presentation