Article

OBGAN: Minority oversampling near borderline with generative adversarial networks

Journal

EXPERT SYSTEMS WITH APPLICATIONS
Volume 197

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2022.116694

Keywords

Class imbalance problem; Oversampling; Generative learning; Deep learning; Neural networks; Generative adversarial networks

Funding

  1. Institute of Information & communications Technology Planning & Evaluation (IITP)
  2. Korea government (MSIT) [2020-0-01441, 2019-0-01343]
  3. Artificial Intelligence Convergence Research Center (Chungnam National University)
  4. Training Key Talents in Industrial Convergence Security [2020R1F1A1075781]
  5. National Research Foundation of Korea (NRF)


In this study, a new oversampling method called OBGAN is proposed to address the class imbalance issue by considering the relationship between the minority and majority classes. OBGAN uses an independent discriminator for each class; the two discriminators competitively affect the training of the generator, allowing it to capture the regions of both classes while avoiding the mode collapse problem.
Class imbalance is a major issue that degrades the performance of machine learning classifiers in real-world problems. Oversampling methods have been widely used to overcome this issue by generating synthetic data from minority classes. However, conventional oversampling methods often focus only on the minority class and ignore its relationship with the majority class. In this study, we propose an oversampling method called minority oversampling near the borderline with a generative adversarial network (OBGAN). To consider both the minority and majority classes, OBGAN employs one independent discriminator for each class. Each discriminator competitively affects the generator, which is trained to capture the region of each class. However, the generator is more sensitive to the minority-class discriminator than to the majority-class discriminator. Hence, the generator learns the minority class with a focus near the borderline. In addition, the architecture and loss function of OBGAN are designed to avoid the mode collapse problem, which commonly occurs in GANs trained on relatively small datasets. Experimental results on 21 datasets against 6 benchmark methods reveal that OBGAN exhibits excellent performance and stability.
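
The following is a minimal PyTorch sketch of the two-discriminator idea described in the abstract: one discriminator per class, a generator trained against both, and a weighting factor that makes the generator more sensitive to the minority-class discriminator. The network sizes, the binary cross-entropy loss form, and the weight `lambda_maj` are illustrative assumptions for the sketch, not the paper's exact architecture or loss.

```python
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 8

# Generator and the two class-specific discriminators (sizes are illustrative).
G = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, data_dim))
D_min = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(), nn.Linear(64, 1))  # minority class
D_maj = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(), nn.Linear(64, 1))  # majority class

bce = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d_min = torch.optim.Adam(D_min.parameters(), lr=2e-4)
opt_d_maj = torch.optim.Adam(D_maj.parameters(), lr=2e-4)

lambda_maj = 0.3  # assumed weight: generator reacts less strongly to the majority discriminator


def train_step(x_min, x_maj):
    """One update with real minority (x_min) and majority (x_maj) minibatches."""
    z = torch.randn(x_min.size(0), latent_dim)
    fake = G(z)

    # Each discriminator learns to separate its own real class from generated samples.
    for D, x_real, opt in ((D_min, x_min, opt_d_min), (D_maj, x_maj, opt_d_maj)):
        opt.zero_grad()
        loss_d = bce(D(x_real), torch.ones(x_real.size(0), 1)) + \
                 bce(D(fake.detach()), torch.zeros(fake.size(0), 1))
        loss_d.backward()
        opt.step()

    # The generator tries to look "real" to both discriminators, but the minority term
    # dominates, so generated samples settle in the minority region near the borderline.
    opt_g.zero_grad()
    ones = torch.ones(fake.size(0), 1)
    loss_g = bce(D_min(fake), ones) + lambda_maj * bce(D_maj(fake), ones)
    loss_g.backward()
    opt_g.step()
    return loss_g.item()


# Example usage with random stand-in data (imbalanced: 32 minority vs. 128 majority samples).
x_minority, x_majority = torch.randn(32, data_dim), torch.randn(128, data_dim)
for _ in range(5):
    train_step(x_minority, x_majority)
synthetic_minority = G(torch.randn(100, latent_dim)).detach()  # oversampled minority data
```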
