Article

Elastic gradient boosting decision tree with adaptive iterations for concept drift adaptation

Journal

NEUROCOMPUTING
Volume 491, Pages 288-304

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2022.03.038

Keywords

Concept drift; Ensemble learning; Data stream; Gradient boosting

Funding

  1. Australian Research Council through the Discovery Project [DP190101733]


This paper proposes a novel adaptive iterations (AdIter) method that automatically selects the number of iterations based on the severity of concept drift, in order to improve the prediction accuracy of data streams under concept drift.
As an excellent ensemble algorithm, Gradient Boosting Decision Tree (GBDT) has been tested extensively on static data. However, real-world applications often involve dynamic data streams, which suffer from concept drift, where the data distribution changes over time. The performance of a GBDT model degrades when it is applied to predict data streams with concept drift. Although incremental learning can help to alleviate such degradation, finding a single learning rate (i.e., the number of iterations in GBDT) that suits all time periods, with their different drift severity levels, is difficult. In this paper, we convert the issue of determining an optimal learning rate into the issue of choosing the best adaptive iterations when tuning GBDT. We theoretically prove that drift severity is closely related to the convergence rate of the model. Accordingly, we propose a novel drift adaptation method, called adaptive iterations (AdIter), that automatically chooses the number of iterations for different drift severities to improve the prediction accuracy for data streams under concept drift. In a series of comprehensive tests against seven state-of-the-art drift adaptation methods on both synthetic and real-world data, AdIter yielded superior accuracy. (c) 2022 Elsevier B.V. All rights reserved.
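The core idea in the abstract, adding more boosting iterations when drift is more severe, can be illustrated with a minimal, hypothetical sketch. This is not the paper's AdIter algorithm; it simply uses scikit-learn's `warm_start` to grow a GBDT ensemble on each new batch, with the number of new trees scaled by an accuracy-drop proxy for drift severity (the `make_batch` generator, the severity measure, and the `10 + 50 * severity` schedule are all illustrative assumptions):

```python
# Illustrative sketch only -- NOT the AdIter algorithm from the paper.
# Grow a warm-started GBDT by more iterations when the estimated drift
# severity (accuracy drop on the newest batch) is larger.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

def make_batch(n, drift):
    """Synthetic stream batch: the decision boundary rotates with `drift`."""
    X = rng.normal(size=(n, 2))
    w = np.array([np.cos(drift), np.sin(drift)])
    y = (X @ w > 0).astype(int)
    return X, y

# Initial model trained on the first batch of the stream.
model = GradientBoostingClassifier(n_estimators=20, warm_start=True)
X, y = make_batch(200, 0.0)
model.fit(X, y)

for t in range(1, 4):
    X_new, y_new = make_batch(200, 0.5 * t)   # concept drifts over time
    acc = model.score(X_new, y_new)           # accuracy drop ~ drift severity
    severity = max(0.0, 1.0 - acc)
    extra = int(10 + 50 * severity)           # more iterations for severer drift
    model.set_params(n_estimators=model.n_estimators + extra)
    model.fit(X_new, y_new)                   # warm start: only new trees are added
```

With `warm_start=True`, each `fit` call keeps the existing trees and trains only the newly requested ones on the latest batch, so the ensemble adapts incrementally rather than being rebuilt from scratch.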

