Article

Bundling classifiers by bagging trees

Journal

COMPUTATIONAL STATISTICS & DATA ANALYSIS
Volume 49, Issue 4, Pages 1068-1078

Publisher

ELSEVIER
DOI: 10.1016/j.csda.2004.06.019

Keywords

bagging; ensemble methods; method selection; error rate estimation

The quest of selecting the best classifier for a discriminant analysis problem is often rather difficult. A combination of different types of classifiers promises to lead to improved predictive models compared to selecting one of the competitors. An additional learning sample, for example the out-of-bag sample, is used for the training of arbitrary classifiers. Classification trees are employed to bundle their predictions for the bootstrap sample. Consequently, a combined classifier is developed. Benchmark experiments show that the combined classifier is superior to any of the single classifiers in many applications. (c) 2004 Elsevier B.V. All rights reserved.
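The bundling procedure described in the abstract can be sketched as follows. This is a minimal illustration in Python with scikit-learn, not the authors' implementation (their reference implementation is the R package `ipred`); the choice of auxiliary classifiers (LDA and k-nearest neighbours), the number of bootstrap replicates, and the dataset are assumptions made for the example. For each bootstrap replicate, auxiliary classifiers are trained on the out-of-bag sample, their predictions are appended as extra features to the bootstrap sample, and a classification tree is grown on the augmented data; new observations are classified by majority vote over the trees.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

def fit_bundle(X, y, n_bootstrap=25, seed=0):
    """Fit a bundled ensemble: trees on bootstrap samples whose features
    are augmented with predictions of classifiers trained out-of-bag."""
    rng = np.random.default_rng(seed)
    n = len(y)
    ensemble = []
    for _ in range(n_bootstrap):
        boot = rng.integers(0, n, size=n)           # bootstrap indices
        oob = np.setdiff1d(np.arange(n), boot)      # out-of-bag indices
        # Train arbitrary auxiliary classifiers on the out-of-bag sample.
        aux = [LinearDiscriminantAnalysis().fit(X[oob], y[oob]),
               KNeighborsClassifier(5).fit(X[oob], y[oob])]
        # Augment the bootstrap sample with the auxiliary predictions
        # and grow a classification tree on the combined features.
        extra = np.column_stack([c.predict(X[boot]) for c in aux])
        tree = DecisionTreeClassifier(random_state=0).fit(
            np.hstack([X[boot], extra]), y[boot])
        ensemble.append((aux, tree))
    return ensemble

def predict_bundle(ensemble, X):
    """Classify by majority vote over the bundled trees."""
    votes = []
    for aux, tree in ensemble:
        extra = np.column_stack([c.predict(X) for c in aux])
        votes.append(tree.predict(np.hstack([X, extra])))
    votes = np.asarray(votes, dtype=int)
    return np.array([np.bincount(col).argmax() for col in votes.T])

X, y = load_breast_cancer(return_X_y=True)
bundle = fit_bundle(X[:400], y[:400])
acc = (predict_bundle(bundle, X[400:]) == y[400:]).mean()
print(f"holdout accuracy: {acc:.2f}")
```

The key design point, as in the paper, is that the auxiliary classifiers never see the bootstrap observations they later generate features for, which keeps the tree from simply memorising overfitted predictions.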

