Article

Grouped SMOTE With Noise Filtering Mechanism for Classifying Imbalanced Data

Journal

IEEE ACCESS
Volume 7, Issue -, Pages 170668-170681

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2019.2955086

Keywords

Noise measurement; Classification algorithms; Filtering algorithms; Safety; Filtering; Estimation; Data models; Sampling; class imbalance learning; SMOTE; Gaussian-Mixture model; probability density

Funding

  1. Natural Science Foundation of Jiangsu Province of China [BK20191457]
  2. Open Project of Artificial Intelligence Key Laboratory of Sichuan Province [2019RYJ02]
  3. National Natural Science Foundation of China [61305058, 61572242]
  4. China Postdoctoral Science Foundation [2013M540404, 2015T80481]
  5. Jiangsu Planned Projects for Postdoctoral Research Funds [1401037B]
  6. Qing Lan Project of Jiangsu Province of China

Abstract

SMOTE (Synthetic Minority Oversampling TEchnique) is one of the most popular and well-known sampling algorithms for addressing the class imbalance learning problem. Its main merit is that, in comparison with random oversampling, it alleviates overfitting to a large extent. However, two drawbacks of SMOTE have also been observed: 1) it tends to propagate noisy information during oversampling; 2) it assigns a single global neighborhood parameter $K$ and neglects local distribution characteristics. To address both problems simultaneously, a grouped SMOTE algorithm with a noise filtering mechanism (GSMOTE-NFM) is presented in this article. The algorithm first fits a Gaussian Mixture Model (GMM) to the majority class and another to the minority class to estimate their real distributions. Then, most noisy instances are removed by comparing each instance's probability densities under the two class models. Next, two new GMMs are constructed on the remaining majority and minority class instances, respectively. Based on the corresponding probability density information, all minority class instances are then divided into three groups: safety, boundary, and outlier. Finally, an individual parameter $K$ is assigned to the instances of each group to generate new instances. We tested the GSMOTE-NFM algorithm on 24 benchmark binary-class data sets with three popular classification models and compared it with several state-of-the-art oversampling algorithms. The results indicate that our algorithm is significantly superior to the original SMOTE algorithm and several SMOTE-based variants.
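
The abstract outlines a pipeline of per-class GMM fitting, density-based noise filtering, density-based grouping, and SMOTE-style interpolation with a group-specific K. The sketch below illustrates that pipeline under stated assumptions; it is not the authors' published implementation. The GMM component count, the tertile-based grouping rule, and the per-group K values (`k_by_group`) are illustrative placeholders, and scikit-learn is assumed for the GMM and nearest-neighbor utilities.

```python
# Illustrative sketch of a GSMOTE-NFM-style pipeline (not the paper's exact method).
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.neighbors import NearestNeighbors

def gsmote_nfm_sketch(X, y, minority_label=1, n_components=2,
                      k_by_group=(5, 3, 1), random_state=0):
    rng = np.random.default_rng(random_state)
    maj_label = y[y != minority_label][0]
    X_min, X_maj = X[y == minority_label], X[y != minority_label]

    # Step 1: fit one GMM per class to estimate class-conditional densities.
    gmm_min = GaussianMixture(n_components, random_state=random_state).fit(X_min)
    gmm_maj = GaussianMixture(n_components, random_state=random_state).fit(X_maj)

    # Step 2: noise filtering -- drop instances whose (log-)density under the
    # opposite class exceeds the density under their own class.
    keep_min = gmm_min.score_samples(X_min) >= gmm_maj.score_samples(X_min)
    keep_maj = gmm_maj.score_samples(X_maj) >= gmm_min.score_samples(X_maj)
    X_min, X_maj = X_min[keep_min], X_maj[keep_maj]

    # Step 3: refit the minority-class GMM on the filtered data.
    gmm_min = GaussianMixture(n_components, random_state=random_state).fit(X_min)
    log_dens = gmm_min.score_samples(X_min)

    # Step 4: split minority instances into safety / boundary / outlier groups
    # by density (tertiles here -- a stand-in for the paper's criterion).
    lo, hi = np.quantile(log_dens, [1 / 3, 2 / 3])
    groups = np.where(log_dens >= hi, 0, np.where(log_dens >= lo, 1, 2))

    # Step 5: SMOTE-style interpolation with a group-specific neighborhood K.
    k_max = min(max(k_by_group), len(X_min) - 1)
    nn = NearestNeighbors(n_neighbors=k_max + 1).fit(X_min)
    _, nbr_idx = nn.kneighbors(X_min)          # neighbor table, self in column 0
    synthetic = []
    for _ in range(max(len(X_maj) - len(X_min), 0)):
        i = rng.integers(len(X_min))
        k = min(k_by_group[groups[i]], k_max)  # group-specific K
        j = rng.choice(nbr_idx[i, 1:k + 1])    # random neighbor (skip self)
        gap = rng.random()
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))

    X_syn = np.array(synthetic).reshape(-1, X.shape[1])
    X_new = np.vstack([X_maj, X_min, X_syn])
    y_new = np.concatenate([np.full(len(X_maj), maj_label),
                            np.full(len(X_min) + len(X_syn), minority_label)])
    return X_new, y_new
```

The density comparison in Step 2 implements the abstract's idea of judging an instance as noise when the opposite class explains it better than its own class; the per-group K in Step 5 is what distinguishes the grouped variant from plain SMOTE's single global neighborhood size.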
