Journal
IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING
Volume 33, Issue 5, Pages 1988-2001
Publisher
IEEE COMPUTER SOC
DOI: 10.1109/TKDE.2019.2951556
Keywords
Boosting; Bagging; Training; Machine learning algorithms; Measurement; Standards; Sampling methods; Class imbalance learning; oversampling; ensemble learning; missing data imputation
Funding
- Natural Sciences and Engineering Research Council of Canada (NSERC) [401226689]
Correct classification of rare samples is crucial, and this article proposes novel oversampling strategies based on imputation methods to address the problem. The techniques generate synthetic minority-class samples and outperform competing methods on performance metrics such as AUC, F-measure, and G-mean.
Correct classification of rare samples is a vital data mining task and of paramount importance in many research domains. This article focuses on the development of novel class-imbalance learning techniques that combine oversampling methods with bagging and boosting ensembles. Two novel oversampling strategies, based on single and multiple imputation methods, are proposed. The proposed techniques aim to create useful synthetic minority-class samples, similar to the original minority-class samples, by estimating missing values that have been deliberately induced in the minority-class samples. The re-balanced datasets are then used to train the base learners of the ensemble algorithms. The proposed techniques are compared with commonly used class-imbalance learning methods in terms of three performance metrics, AUC, F-measure, and G-mean, over several synthetic binary-class datasets. The empirical results show that the proposed multiple-imputation-based oversampling combined with bagging significantly outperforms the other competitors.
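The imputation-based oversampling idea described in the abstract can be sketched in a minimal form: copy minority-class samples, mask a random fraction of their feature values, and fill the gaps with an imputation estimate, yielding synthetic samples close to the originals. This is an illustrative sketch only, not the authors' implementation; it uses column-mean imputation as a stand-in for the paper's single-imputation method, and all function and variable names here are assumptions.

```python
import numpy as np

def imputation_oversample(X_min, n_new, mask_frac=0.3, seed=None):
    """Illustrative single-imputation-based oversampling.

    Draws minority samples with replacement, masks a random
    fraction of their feature values, and imputes the masked
    entries with the minority-class column means.
    """
    rng = np.random.default_rng(seed)
    col_means = X_min.mean(axis=0)            # simple imputation model
    idx = rng.integers(0, len(X_min), n_new)  # bootstrap minority rows
    X_new = X_min[idx].copy()
    mask = rng.random(X_new.shape) < mask_frac
    X_new[mask] = np.broadcast_to(col_means, X_new.shape)[mask]
    return X_new

# Rebalance a toy 90-vs-10 binary dataset before ensemble training.
rng = np.random.default_rng(0)
X_maj = rng.normal(0.0, 1.0, size=(90, 4))   # majority class
X_min = rng.normal(2.0, 1.0, size=(10, 4))   # minority class
X_syn = imputation_oversample(X_min, n_new=80, seed=1)
X_bal = np.vstack([X_maj, X_min, X_syn])
y_bal = np.array([0] * 90 + [1] * (10 + 80))
print(X_bal.shape, int(y_bal.sum()))          # both classes now have 90 samples
```

In the paper's pipeline the rebalanced `(X_bal, y_bal)` would then be fed to each base learner of a bagging or boosting ensemble; the multiple-imputation variant would instead pool several imputation draws per masked entry.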