Article

Is combining classifiers with stacking better than selecting the best one?

Journal

MACHINE LEARNING
Volume 54, Issue 3, Pages 255-273

Publisher

SPRINGER
DOI: 10.1023/b:mach.0000015881.36452.6e

Keywords

multi-response model trees; stacking; combining classifiers; ensembles of classifiers; meta-learning

Abstract

We empirically evaluate several state-of-the-art methods for constructing ensembles of heterogeneous classifiers with stacking and show that they perform (at best) comparably to selecting the best classifier from the ensemble by cross validation. Among state-of-the-art stacking methods, stacking with probability distributions and multi-response linear regression performs best. We propose two extensions of this method, one using an extended set of meta-level features and the other using multi-response model trees to learn at the meta-level. We show that the latter extension performs better than existing stacking approaches and better than selecting the best classifier by cross validation.
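As a rough illustration of the comparison the abstract describes, the following scikit-learn sketch contrasts stacking heterogeneous classifiers on their predicted class-probability distributions with simply selecting the best single classifier by cross-validation. It is not the authors' original setup: LogisticRegression stands in for the paper's multi-response linear regression / multi-response model tree meta-learners, and the dataset and base learners are placeholder choices.

```python
# Sketch: stacking on class-probability distributions vs. selecting the best
# single classifier by cross-validation (illustrative, not the paper's setup).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)  # placeholder dataset

# Heterogeneous base-level classifiers.
base_learners = [
    ("tree", DecisionTreeClassifier(random_state=0)),
    ("nb", GaussianNB()),
    ("knn", KNeighborsClassifier()),
]

# Baseline: select the single best classifier by cross-validated accuracy.
cv_scores = {name: cross_val_score(clf, X, y, cv=10).mean()
             for name, clf in base_learners}
best_name, best_score = max(cv_scores.items(), key=lambda kv: kv[1])

# Stacking: meta-level features are the base classifiers' predicted
# probability distributions; a linear meta-learner combines them
# (a stand-in for multi-response linear regression / model trees).
stack = StackingClassifier(
    estimators=base_learners,
    final_estimator=LogisticRegression(max_iter=1000),
    stack_method="predict_proba",
    cv=10,
)
stack_score = cross_val_score(stack, X, y, cv=10).mean()

print(f"best single classifier: {best_name} ({best_score:.3f})")
print(f"stacking on probability distributions: {stack_score:.3f}")
```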

Authors

Saso Džeroski, Bernard Ženko
