Article

Imbalance-XGBoost: leveraging weighted and focal losses for binary label-imbalanced classification with XGBoost

Journal

PATTERN RECOGNITION LETTERS
Volume 136, Pages 190-197

Publisher

ELSEVIER
DOI: 10.1016/j.patrec.2020.05.035

Keywords

Imbalanced classification; XGBoost; Python package

Funding

  1. National Natural Science Foundation of China [81872719, 81803337]
  2. Provincial Natural Science Foundation of Shandong Province [ZR201807090257]
  3. National Bureau of Statistics Foundation Project [2018LY79]

Abstract

The paper presents Imbalance-XGBoost, a Python package that combines the XGBoost software with weighted and focal losses to tackle binary label-imbalanced classification tasks. Though small in size, the package is, to the best of our knowledge, the first of its kind to provide an integrated implementation of the two loss functions on XGBoost, giving XGBoost a general-purpose extension for label-imbalanced scenarios. The design and usage of the package are discussed and illustrated with examples. Furthermore, since the first- and second-order derivatives of the loss functions are essential to the implementation, their algebraic derivation is discussed and can be regarded as a separate contribution. The performance of the methods implemented in the package is evaluated extensively on a Parkinson's disease classification dataset, with multiple competitive results presented via ROC and Precision-Recall (PR) curves. To further demonstrate the strengths of the methods, performance on four additional benchmark datasets from the UCI machine learning repository is also reported. Given the scalable nature of XGBoost, the package has great potential to be broadly applied to real-life binary classification tasks, which are often large-scale and label-imbalanced. (C) 2020 Elsevier B.V. All rights reserved.
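Since the abstract emphasizes that the first- and second-order derivatives of the losses are what the implementation needs, the following is a minimal sketch of the focal-loss gradient with respect to the raw margin, of the kind a custom XGBoost objective must return. The function names and the default γ = 2 are illustrative assumptions, not the package's actual API; the derivative follows from l = -(1-p)^γ log p for y = 1 (and symmetrically for y = 0) with p = σ(z), and the second-order derivative is obtained the same way by differentiating once more.

```python
import math

def sigmoid(z):
    # Logistic link mapping the raw margin z to a probability p.
    return 1.0 / (1.0 + math.exp(-z))

def focal_loss(z, y, gamma=2.0):
    # Per-sample focal loss on the raw margin z, y in {0, 1}.
    # For y = 1: -(1 - p)^gamma * log(p); for y = 0: -p^gamma * log(1 - p).
    p = sigmoid(z)
    if y == 1:
        return -((1.0 - p) ** gamma) * math.log(p)
    return -(p ** gamma) * math.log(1.0 - p)

def focal_grad(z, y, gamma=2.0):
    # First-order derivative d(loss)/dz, via the chain rule with dp/dz = p(1 - p).
    p = sigmoid(z)
    if y == 1:
        return gamma * p * (1.0 - p) ** gamma * math.log(p) - (1.0 - p) ** (gamma + 1)
    return -gamma * (p ** gamma) * (1.0 - p) * math.log(1.0 - p) + p ** (gamma + 1)
```

In XGBoost's custom-objective interface, a callback returns such per-sample (gradient, hessian) pairs, which the booster then uses in its second-order Taylor approximation of the loss.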

