Article

Entropy and Confidence-Based Undersampling Boosting Random Forests for Imbalanced Problems

Journal

IEEE Transactions on Neural Networks and Learning Systems

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNNLS.2020.2964585

Keywords

Entropy; Radio frequency; Boosting; Decision trees; Training; Random forests; Heuristic algorithms; Confidence; ensemble learning; entropy; imbalanced problems; random forests (RFs); undersampling

Funding

  1. Natural Science Foundation of China [61672227]
  2. Shanghai Education Development Foundation
  3. Shanghai Municipal Education Commission
  4. Natural Science Foundations of China [61806078]
  5. National Major Scientific and Technological Special Project for Significant New Drugs Development [2019ZX09201004]
  6. Special Fund Project for Shanghai Informatization Development in Big Data [201901043]
  7. National Key R&D Program of China [2018YFC0910500]

Abstract

In this article, we propose a novel entropy- and confidence-based undersampling boosting (ECUBoost) framework for imbalanced problems. A boosting-based ensemble is combined with a new undersampling method to improve generalization performance. To avoid discarding informative samples during the data preprocessing of the boosting-based ensemble, ECUBoost uses both confidence and entropy as benchmarks to preserve the validity and structural distribution of the majority samples during undersampling. Unlike other iterative dynamic resampling methods, the confidence-based ECUBoost can also be applied to algorithms without iterations, such as decision trees. Random forests are used as the base classifiers in ECUBoost. Experimental results on both artificial data sets and KEEL data sets demonstrate the effectiveness of the proposed method.
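The core preprocessing idea described above — ranking majority-class samples by both the ensemble's confidence and the entropy of its predictions, then keeping the highest-ranked samples — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the function name `ecub_undersample`, the weighted-sum combination `alpha * confidence + (1 - alpha) * entropy`, and the toy probabilities are all assumptions made for the example; the probabilities would in practice come from a random forest's class-probability estimates.

```python
import numpy as np

def ecub_undersample(proba_majority, n_keep, alpha=0.5):
    """Select n_keep majority-class samples by confidence and entropy.

    proba_majority: (n, 2) array of predicted class probabilities for the
    majority samples (column 0 = the majority class), e.g. from a random
    forest. The weighted combination below is a hypothetical ranking rule
    for illustration; the paper's exact formula may differ.
    """
    p = np.clip(proba_majority, 1e-12, 1.0)
    confidence = p[:, 0]                        # prob. of the true (majority) class
    entropy = -np.sum(p * np.log2(p), axis=1)   # uncertainty of the prediction
    # Favor samples the ensemble classifies validly (high confidence) while
    # also retaining uncertain, boundary-region samples (high entropy) so
    # the structural distribution of the majority class is preserved.
    score = alpha * confidence + (1 - alpha) * entropy
    return np.argsort(score)[::-1][:n_keep]     # indices of the top-scoring samples

# Toy usage: 5 majority samples, keep the 3 highest-scoring ones.
proba = np.array([[0.9, 0.1], [0.5, 0.5], [0.99, 0.01], [0.6, 0.4], [0.55, 0.45]])
idx = ecub_undersample(proba, n_keep=3)
print(sorted(idx.tolist()))
```

Note that the very confident sample `[0.99, 0.01]` is not selected here: its near-zero entropy pulls its combined score down, illustrating how the entropy term keeps boundary samples that a pure confidence ranking would favor less.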

