Article

The synergy between PAV and AdaBoost

Journal

Machine Learning
Volume 61, Issue 1-3, Pages 71-103

Publisher

Springer
DOI: 10.1007/s10994-005-1123-6

Keywords

boosting; isotonic regression; convergence; document classification; k nearest neighbors

Funding

  1. Intramural NIH HHS [Z99 LM999999] Funding Source: Medline

Abstract

Schapire and Singer's improved version of AdaBoost for handling weak hypotheses with confidence-rated predictions represents an important advance in the theory and practice of boosting. Its success results from a more efficient use of the information in weak hypotheses during updating. Instead of simple binary voting, a weak hypothesis is allowed to vote for or against a classification with a variable strength or confidence. The Pool Adjacent Violators (PAV) algorithm is a method for converting a score into a probability. We show how PAV may be applied to a weak hypothesis to yield a new weak hypothesis which is, in a sense, an ideal confidence-rated prediction, and that this leads to an optimal updating for AdaBoost. The result is a new algorithm which we term PAV-AdaBoost. We give several examples illustrating problems for which this new algorithm provides advantages in performance.
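
For readers unfamiliar with PAV, the following is a minimal sketch of how the Pool Adjacent Violators algorithm converts classifier scores into probabilities (isotonic regression), as described in the abstract. The function name pav_calibrate, its interface, and the toy data are illustrative assumptions, not taken from the paper.

# Sketch of the Pool Adjacent Violators (PAV) algorithm: map scores to
# probabilities by fitting a nondecreasing step function (isotonic regression).
def pav_calibrate(scores, labels):
    """Fit a nondecreasing map from score to estimated probability.

    scores -- real-valued classifier outputs
    labels -- 0/1 outcomes, one per score
    Returns (score, probability) pairs; a new score gets the probability of
    the last pair whose score does not exceed it.
    """
    # Sort examples by score so monotonicity is enforced along the score axis.
    pairs = sorted(zip(scores, labels))
    # Each block is [sum of labels, example count, largest score in the block].
    blocks = []
    for score, label in pairs:
        blocks.append([float(label), 1, score])
        # Pool adjacent blocks while an earlier block's mean exceeds a later one's.
        while len(blocks) > 1 and blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            hi = blocks.pop()
            blocks[-1][0] += hi[0]
            blocks[-1][1] += hi[1]
            blocks[-1][2] = hi[2]
    # Block means are the calibrated, nondecreasing probability estimates.
    return [(b[2], b[0] / b[1]) for b in blocks]

if __name__ == "__main__":
    # Toy example: six scored examples with binary outcomes.
    scores = [0.1, 0.3, 0.35, 0.6, 0.8, 0.9]
    labels = [0, 1, 0, 0, 1, 1]
    print(pav_calibrate(scores, labels))
    # -> [(0.1, 0.0), (0.6, 1/3), (0.8, 1.0), (0.9, 1.0)]

In the abstract's terms, the scores would come from a weak hypothesis and the labels from the training examples, so the pooled block means play the role of the confidence-rated predictions; how PAV-AdaBoost then uses them in its updating is developed in the paper itself.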
