Improved Algorithms for Efficient Active Learning Halfspaces with Massart and Tsybakov noise

Chicheng Zhang, Yinan Li

[Proceedings link] [PDF]

Session: Generalization and PAC-Learning 2 (B)

Session Chair: Steve Hanneke

Poster: Poster Session 4

Abstract: We give a computationally efficient PAC active learning algorithm for $d$-dimensional homogeneous halfspaces that can tolerate Massart noise (Massart and Nédélec, 2006) and Tsybakov noise (Tsybakov, 2004). Specialized to the $\eta$-Massart noise setting, our algorithm achieves an information-theoretically near-optimal label complexity of $\tilde{O}\big(\frac{d}{(1-2\eta)^2}\,\mathrm{polylog}(\frac{1}{\epsilon})\big)$ under a wide range of unlabeled data distributions (specifically, the family of "structured distributions" defined in Diakonikolas et al., 2020). Under the more challenging Tsybakov noise condition, we identify two subfamilies of noise conditions under which our algorithm is the first to achieve computational efficiency, with label complexity guarantees strictly lower than those of passive learning algorithms.
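The Massart-noise label complexity bound in the abstract can be made concrete with a small numerical sketch. The snippet below evaluates $\frac{d}{(1-2\eta)^2}\,\mathrm{polylog}(\frac{1}{\epsilon})$ up to the hidden constant and exact polylog power (both placeholders here, not taken from the paper), to illustrate how the label budget grows as the Massart noise rate $\eta$ approaches $1/2$:

```python
import math

def massart_label_complexity(d, eta, eps, c=1.0, polylog_power=2):
    """Illustrative evaluation of the O~(d / (1 - 2*eta)^2 * polylog(1/eps))
    label complexity bound. The constant c and the polylog power are
    placeholders, not values stated in the paper."""
    assert 0 <= eta < 0.5, "Massart noise rate must be strictly below 1/2"
    return c * d / (1 - 2 * eta) ** 2 * math.log(1 / eps) ** polylog_power

# The bound blows up as eta -> 1/2, where labels become nearly random:
print(massart_label_complexity(d=100, eta=0.1, eps=0.01))
print(massart_label_complexity(d=100, eta=0.4, eps=0.01))
```

Note the key feature of the bound: the dependence on the target error $\epsilon$ is only polylogarithmic, which is what separates this active learning guarantee from the $\mathrm{poly}(1/\epsilon)$ label cost of passive learning.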

Summary presentation

Full presentation

Discussion