4.7 Article

A survey of class-imbalanced semi-supervised learning

Journal

MACHINE LEARNING
Volume: -, Issue: -, Pages: -

Publisher

SPRINGER
DOI: 10.1007/s10994-023-06344-7

Keywords

Deep learning; Class-imbalanced supervised learning; Semi-supervised learning; Class-imbalanced semi-supervised learning


Semi-supervised learning (SSL) can substantially improve the performance of deep neural networks by utilizing unlabeled data when labeled data is scarce. State-of-the-art (SOTA) semi-supervised algorithms implicitly assume that the class distributions of the labeled and unlabeled datasets are balanced, meaning that the different classes have the same number of training samples. However, these algorithms can hardly perform well on minority classes when the class distribution of the training data is imbalanced. Recent work has found several ways to mitigate the degradation of semi-supervised learning models in class-imbalanced learning. In this article, we comprehensively review class-imbalanced semi-supervised learning (CISSL), starting with an introduction to the field, followed by a realistic evaluation of existing class-imbalanced semi-supervised learning algorithms and a brief summary of them.
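The abstract does not spell out an experimental protocol, but evaluations of CISSL methods are commonly run on long-tailed labeled/unlabeled splits carved out of balanced benchmarks such as CIFAR-10. The sketch below shows one such construction, assuming an exponential imbalance profile in which the head class keeps n_max samples and the tail class keeps n_max / gamma; the function names and parameter choices are illustrative assumptions, not details taken from the paper.

import numpy as np

def long_tailed_counts(n_max, num_classes, gamma):
    # Per-class sample counts for an exponentially long-tailed split:
    # class k keeps n_max * gamma ** (-k / (num_classes - 1)) samples,
    # so the head class has n_max samples and the tail class n_max / gamma.
    return [int(n_max * gamma ** (-k / (num_classes - 1)))
            for k in range(num_classes)]

def make_cissl_split(labels, n_labeled_max, n_unlabeled_max,
                     gamma_l, gamma_u, seed=0):
    # Sample class-imbalanced labeled and unlabeled index sets
    # from an originally balanced, fully labeled dataset.
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    num_classes = int(labels.max()) + 1
    labeled_counts = long_tailed_counts(n_labeled_max, num_classes, gamma_l)
    unlabeled_counts = long_tailed_counts(n_unlabeled_max, num_classes, gamma_u)

    labeled_idx, unlabeled_idx = [], []
    for k in range(num_classes):
        idx_k = rng.permutation(np.where(labels == k)[0])
        labeled_idx.extend(idx_k[:labeled_counts[k]])
        unlabeled_idx.extend(
            idx_k[labeled_counts[k]:labeled_counts[k] + unlabeled_counts[k]])
    return np.array(labeled_idx), np.array(unlabeled_idx)

# Example: a CIFAR-10-like setup with 10 classes, where the head class has
# 1500 labeled / 3000 unlabeled samples and the imbalance ratio is 100.
if __name__ == "__main__":
    fake_labels = np.repeat(np.arange(10), 5000)   # 10 balanced classes
    lab, unlab = make_cissl_split(fake_labels, 1500, 3000,
                                  gamma_l=100, gamma_u=100)
    print(len(lab), len(unlab))

Keeping separate imbalance ratios gamma_l and gamma_u for the labeled and unlabeled pools makes it easy to study the common setting where the unlabeled class distribution differs from the labeled one.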
