Article

Fast learning rates for plug-in classifiers

Journal

Annals of Statistics
Volume 35, Issue 2, Pages 608-633

Publisher

Institute of Mathematical Statistics
DOI: 10.1214/009053606000001217

Keywords

classification; statistical learning; fast rates of convergence; excess risk; plug-in classifiers; minimax lower bounds


Abstract

It has recently been shown that, under the margin (or low-noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than n^{-1/2}. The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order n^{-1}, and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that neither conjecture is correct. In particular, we construct plug-in classifiers that achieve not only fast but also super-fast rates, that is, rates faster than n^{-1}. We establish minimax lower bounds showing that the obtained rates cannot be improved.
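For readers new to the terminology, below is a minimal sketch of the plug-in principle the abstract refers to: estimate the regression function eta(x) = P(Y = 1 | X = x) nonparametrically, then classify by thresholding the estimate at 1/2. The kernel estimator, the bandwidth value, and all names here are illustrative assumptions; the paper itself analyzes local polynomial estimators of eta under margin and smoothness conditions.

import numpy as np

def plug_in_classifier(X_train, y_train, X_test, bandwidth=0.2):
    """Predict labels by plugging a nonparametric estimate of
    eta(x) = P(Y = 1 | X = x) into the Bayes rule 1{eta(x) >= 1/2}.

    eta is estimated here with a Nadaraya-Watson kernel regressor;
    the paper works with local polynomial estimators, and `bandwidth`
    is an arbitrary illustrative choice, not a tuned value.
    """
    preds = np.empty(len(X_test), dtype=int)
    for i, x in enumerate(X_test):
        # Gaussian kernel weights of the training points around x
        sq_dists = np.sum((X_train - x) ** 2, axis=1)
        weights = np.exp(-sq_dists / (2.0 * bandwidth ** 2))
        # Kernel estimate eta_hat(x) of the regression function
        eta_hat = weights @ y_train / max(weights.sum(), 1e-12)
        # The "plug-in" step: threshold the estimate at 1/2
        preds[i] = int(eta_hat >= 0.5)
    return preds

# Toy usage: synthetic data with a smooth eta, where thresholding
# a good estimate of eta gives a near-Bayes classifier.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 2))
eta = 1.0 / (1.0 + np.exp(-4.0 * X[:, 0]))      # true eta(x)
y = (rng.uniform(size=500) < eta).astype(float)
print(plug_in_classifier(X[:400], y[:400], X[400:]))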

Authors

Jean-Yves Audibert; Alexandre B. Tsybakov
