Article

A framework for feature selection through boosting

Journal

EXPERT SYSTEMS WITH APPLICATIONS
Volume 187, Issue -, Pages -

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2021.115895

Keywords

Feature selection; Boosting; Ensemble learning; XGBoost

Funding

  1. Netherlands Organisation for Scientific Research (NWO) [14295]
  2. Dutch Ministry of Economic Affairs (TKI Agri Food project) [12018]
  3. Breed4Food partners: Cobb Europe, CRV
  4. Hendrix Genetics
  5. Topigs Norsvin

Abstract

As dimensions of datasets in predictive modelling continue to grow, feature selection becomes increasingly practical. Datasets with complex feature interactions and high levels of redundancy still present a challenge to existing feature selection methods. We propose a novel framework for feature selection that relies on boosting, or sample re-weighting, to select sets of informative features in classification problems. The method uses as its basis the feature rankings derived from fast and scalable tree-boosting models, such as XGBoost. We compare the proposed method to standard feature selection algorithms on 9 benchmark datasets. We show that the proposed approach reaches higher accuracies with fewer features on most of the tested datasets, and that the selected features have lower redundancy.
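The abstract only outlines the approach, so the following Python sketch is an illustrative reconstruction rather than the authors' published algorithm: it alternates between fitting an XGBoost model, keeping its top-ranked features, and re-weighting misclassified samples so later rounds focus on harder examples. The number of rounds, the number of features kept per round, and the exponential re-weighting rule are all assumptions made for illustration, not values from the paper.

import numpy as np
from xgboost import XGBClassifier
from sklearn.datasets import make_classification

# Synthetic data standing in for one of the benchmark classification datasets.
X, y = make_classification(n_samples=500, n_features=50, n_informative=8,
                           random_state=0)

n_rounds = 5        # assumed number of selection rounds
k_per_round = 3     # assumed number of top-ranked features kept per round
weights = np.ones(len(y))   # start from uniform sample weights
selected = []

for _ in range(n_rounds):
    model = XGBClassifier(n_estimators=100, max_depth=3, eval_metric="logloss")
    model.fit(X, y, sample_weight=weights)

    # Rank features by the fitted model's importance scores and keep the
    # top-k that were not already selected in earlier rounds.
    ranking = np.argsort(model.feature_importances_)[::-1]
    selected.extend([f for f in ranking if f not in selected][:k_per_round])

    # Boosting-style re-weighting (illustrative rule): up-weight samples the
    # current model misclassifies so later rounds concentrate on them.
    errors = (model.predict(X) != y).astype(float)
    weights *= np.exp(errors)
    weights *= len(weights) / weights.sum()

print("Selected feature indices:", sorted(selected))

In this sketch the re-weighting plays the role the abstract assigns to boosting: each round's feature ranking is computed on a differently weighted view of the data, so features that matter only for hard-to-classify samples get a chance to surface.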
