Article

A variant of Rotation Forest for constructing ensemble classifiers

Journal

PATTERN ANALYSIS AND APPLICATIONS
Volume 13, Issue 1, Pages 59-77

Publisher

SPRINGER
DOI: 10.1007/s10044-009-0168-8

Keywords

Ensemble classifier; Rotation Forest; Bagging; AdaBoost; Kappa-error diagram

Funding

  1. National Natural Science Foundation of China [10531030, 60675013]
  2. National Basic Research Program of China [2007CB311002]


Rotation Forest, an effective ensemble classifier generation technique, works by using principal component analysis (PCA) to rotate the original feature axes so that different training sets for learning base classifiers can be formed. This paper presents a variant of Rotation Forest that can be viewed as a combination of Bagging and Rotation Forest. Bagging is used to inject additional randomness into Rotation Forest in order to increase the diversity among the ensemble members. Experiments conducted on 33 benchmark classification data sets from the UCI repository, with a classification tree as the base learning algorithm, demonstrate that the proposed method generally produces ensemble classifiers with lower error than Bagging, AdaBoost and Rotation Forest. A bias-variance analysis of the error shows that the proposed method improves on the prediction error of a single classifier by reducing the variance term much more than the other ensemble procedures considered. Results on data sets with artificial classification noise further indicate that the new method is more robust to noise, and kappa-error diagrams are employed to investigate the diversity-accuracy patterns of the ensemble classifiers.
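The combination the abstract describes can be sketched as follows: for each base classifier, draw a bootstrap sample (the Bagging step), split the features into disjoint subsets, fit a PCA on each subset to build a block-diagonal rotation matrix (the Rotation Forest step), and train a decision tree on the rotated data. This is a minimal illustrative sketch, not the authors' exact algorithm; the function names, the number of feature subsets, and the use of plain PCA on the full bootstrap sample (rather than class-wise subsampling as in the original Rotation Forest) are simplifying assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def fit_bagged_rotation_forest(X, y, n_trees=10, n_subsets=3):
    """Illustrative sketch: Bagging combined with Rotation Forest.

    For each tree: bootstrap the training set, partition the features
    into disjoint random subsets, run PCA on each subset, assemble the
    per-subset loadings into one d-by-d rotation matrix, and fit a
    decision tree on the rotated bootstrap sample.
    """
    n, d = X.shape
    ensemble = []
    for _ in range(n_trees):
        # Bagging step: bootstrap sample of the training set.
        idx = rng.integers(0, n, size=n)
        Xb, yb = X[idx], y[idx]
        # Rotation step: random disjoint feature subsets, PCA per subset.
        perm = rng.permutation(d)
        subsets = np.array_split(perm, n_subsets)
        R = np.zeros((d, d))
        for s in subsets:
            pca = PCA().fit(Xb[:, s])
            R[np.ix_(s, s)] = pca.components_.T  # embed subset loadings
        tree = DecisionTreeClassifier(random_state=0).fit(Xb @ R, yb)
        ensemble.append((R, tree))
    return ensemble

def predict(ensemble, X):
    # Majority vote over the trees' predictions in their rotated spaces.
    votes = np.stack([tree.predict(X @ R) for R, tree in ensemble])
    return np.apply_along_axis(
        lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)
```

Because each tree sees both a different bootstrap sample and a different rotation of the feature space, the sketch injects randomness at two levels, which is the source of the extra diversity the paper attributes to the variant.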

