3.8 Proceedings Paper

Trimmed Robust Loss Function for Training Deep Neural Networks with Label Noise

Journal

ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, PT I
Volume 11508, Issue -, Pages 215-222

Publisher

SPRINGER INTERNATIONAL PUBLISHING AG
DOI: 10.1007/978-3-030-20912-4_21

Keywords

Neural networks; Deep learning; Robust learning; Label noise; Categorical cross-entropy

Deep neural networks now obtain outstanding results on many vision, speech recognition, and natural language processing tasks. Such deep architectures need to be trained on very large datasets, which makes annotating the data for supervised learning a particularly difficult and time-consuming task. Label noise may occur in supervised datasets, making the whole training process less reliable. In this paper we present a novel robust loss function based on categorical cross-entropy. We demonstrate its robustness for several levels of label noise on the popular MNIST and CIFAR-10 datasets.
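
The abstract names a trimmed robust loss based on categorical cross-entropy but gives no implementation details here. The sketch below shows one plausible reading, in which the largest per-sample cross-entropy values (those most likely to stem from mislabeled examples) are discarded before averaging; the function name, the `trim_fraction` parameter, and the overall design are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def trimmed_categorical_cross_entropy(probs, labels, trim_fraction=0.1, eps=1e-12):
    """Illustrative trimmed categorical cross-entropy (assumed formulation).

    probs:  (N, C) array of predicted class probabilities (rows sum to 1).
    labels: (N,)   integer class labels, possibly noisy.
    trim_fraction: fraction of the largest per-sample losses to discard,
        on the assumption that they correspond to mislabeled samples.
    """
    n = probs.shape[0]
    # Standard per-sample categorical cross-entropy.
    per_sample = -np.log(probs[np.arange(n), labels] + eps)
    # Keep only the smallest (1 - trim_fraction) share of the losses.
    keep = max(1, int(round((1.0 - trim_fraction) * n)))
    return np.sort(per_sample)[:keep].mean()

# Toy usage: the third sample looks mislabeled and is trimmed away.
probs = np.array([[0.9, 0.1], [0.8, 0.2], [0.05, 0.95]])
labels = np.array([0, 0, 0])
print(trimmed_categorical_cross_entropy(probs, labels, trim_fraction=1/3))
```

In mini-batch training such trimming would typically be applied per batch, so the set of retained samples can change from one update to the next.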
