4.7 Article

The true sample complexity of active learning

Journal

MACHINE LEARNING
Volume 80, Issue 2-3, Pages 111-139

Publisher

SPRINGER
DOI: 10.1007/s10994-010-5174-y

Keywords

Active learning; Sample complexity; Selective sampling; Sequential design; Learning theory; Classification

We describe and explore a new perspective on the sample complexity of active learning. In many situations where it was generally believed that active learning does not help, we show that active learning does help in the limit, often with exponential improvements in sample complexity. This contrasts with the traditional analysis of active learning problems such as non-homogeneous linear separators or depth-limited decision trees, in which Ω(1/ε) lower bounds are common. Such lower bounds should be interpreted carefully; indeed, we prove that it is always possible to learn an ε-good classifier with a number of samples asymptotically smaller than this. These new insights arise from a subtle variation on the traditional definition of sample complexity, not previously recognized in the active learning literature.
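The gap between an Ω(1/ε) passive rate and the exponential improvements mentioned above is easiest to see on one-dimensional threshold classifiers. The sketch below is an illustrative toy example, not code from the paper: the target threshold, the pool size, and the value of ε are arbitrary choices, and the oracle is a noiseless threshold. With a pool of roughly 1/ε unlabeled points, binary search locates an ε-good threshold using only O(log(1/ε)) label queries, whereas a passive learner would need on the order of 1/ε labels.

import random

def oracle(x, true_threshold):
    """Noiseless label query: +1 if x lies at or above the threshold, else -1."""
    return 1 if x >= true_threshold else -1

def active_threshold_learner(pool, true_threshold):
    """Binary-search the sorted unlabeled pool for the leftmost positive point.
    Returns the learned threshold and the number of label queries used."""
    pts = sorted(pool)
    lo, hi = 0, len(pts) - 1
    queries = 0
    while lo < hi:
        mid = (lo + hi) // 2
        queries += 1
        if oracle(pts[mid], true_threshold) == 1:
            hi = mid          # decision boundary is at or to the left of mid
        else:
            lo = mid + 1      # decision boundary is to the right of mid
    return pts[lo], queries

if __name__ == "__main__":
    random.seed(0)
    epsilon = 1e-3
    true_threshold = 0.618            # hypothetical target for the demo
    # A pool of roughly 2/epsilon uniform points localizes the threshold to
    # within epsilon with high probability.
    pool = [random.random() for _ in range(int(2 / epsilon))]
    learned, n_queries = active_threshold_learner(pool, true_threshold)
    print(f"learned threshold ~ {learned:.4f} "
          f"using {n_queries} label queries on a pool of {len(pool)} points")

Running this uses about eleven label queries for a pool of two thousand points, the log-versus-linear gap in miniature. The improvements the abstract describes for harder classes arise from a different mechanism, namely the variation on the definition of sample complexity mentioned above.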

Authors

Maria-Florina Balcan, Steve Hanneke, Jennifer Wortman Vaughan
