Article

A comparative analysis of gradient boosting algorithms

Journal

ARTIFICIAL INTELLIGENCE REVIEW
Volume 54, Issue 3, Pages 1937-1967

Publisher

SPRINGER
DOI: 10.1007/s10462-020-09896-5

Keywords

XGBoost; LightGBM; CatBoost; Gradient boosting; Random forest; Ensembles of classifiers

Funding

  1. European Regional Development Fund
  2. Spanish Ministry of Economy, Industry, and Competitiveness-State Research Agency [TIN2016-76406-P, PID2019-106827GB-I00 / AEI / 10.13039/501100011033]

Abstract

The family of gradient boosting algorithms has recently been extended with several interesting proposals (namely XGBoost, LightGBM, and CatBoost) that focus on both speed and accuracy. XGBoost is a scalable ensemble technique that has proven to be a reliable and efficient solver of machine learning challenges. LightGBM is an accurate model focused on providing extremely fast training through selective sampling of high-gradient instances. CatBoost modifies the computation of gradients to avoid prediction shift and thereby improve model accuracy. This work presents a practical analysis of how these novel variants of gradient boosting behave in terms of training speed, generalization performance, and hyper-parameter setup. In addition, a comprehensive comparison between XGBoost, LightGBM, CatBoost, random forests, and gradient boosting has been performed using carefully tuned models as well as their default settings. The results indicate that CatBoost obtains the best generalization accuracy and AUC on the studied datasets, although the differences are small. LightGBM is the fastest of all methods but not the most accurate. XGBoost places second in both accuracy and training speed. Finally, an extensive analysis of the effect of hyper-parameter tuning in XGBoost, LightGBM, and CatBoost is carried out using two novel proposed tools.
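
The default-settings comparison described in the abstract can be illustrated with a minimal sketch (not the authors' experimental code): all three libraries are trained with their default settings on one synthetic dataset, and accuracy, AUC, and training time are reported. It assumes the xgboost, lightgbm, catboost, and scikit-learn Python packages are installed; the dataset and sizes are placeholders.

    import time

    from sklearn.datasets import make_classification
    from sklearn.metrics import accuracy_score, roc_auc_score
    from sklearn.model_selection import train_test_split

    from xgboost import XGBClassifier
    from lightgbm import LGBMClassifier
    from catboost import CatBoostClassifier

    # Synthetic binary classification task as a stand-in for the paper's datasets.
    X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Default settings for each library, as in the untuned part of the comparison.
    models = {
        "XGBoost": XGBClassifier(),
        "LightGBM": LGBMClassifier(),
        "CatBoost": CatBoostClassifier(verbose=0),  # silence per-iteration logging
    }

    for name, model in models.items():
        start = time.perf_counter()
        model.fit(X_train, y_train)
        elapsed = time.perf_counter() - start
        acc = accuracy_score(y_test, model.predict(X_test))
        auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
        print(f"{name}: acc={acc:.3f}  auc={auc:.3f}  train_time={elapsed:.2f}s")

A tuned comparison would wrap each model in a hyper-parameter search (e.g. scikit-learn's GridSearchCV) before timing and scoring, which is the other experimental condition the paper evaluates.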
