Article

Large-Margin Regularized Softmax Cross-Entropy Loss

Journal

IEEE ACCESS
Volume 7, Pages 19572-19578

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2019.2897692

Keywords

Neural networks; cross-entropy loss; large-margin regularization

Funding

  1. National Natural Science Foundation of China [61563030, 61763028]
  2. Natural Science Foundation of Gansu Province, China [17JR5RA125]
  3. Hong-Liu Outstanding Youth Talents Foundation of the Lanzhou University of Technology

Abstract

Softmax cross-entropy loss with L2 regularization is widely used in the machine learning and neural network community. Because the traditional softmax cross-entropy loss focuses only on fitting or classifying the training data accurately and does not explicitly encourage a large decision margin, several loss functions have been proposed to improve generalization performance by addressing this limitation. However, these loss functions make model optimization more difficult. Inspired by regularized logistic regression, in which the regularization term adjusts the width of the decision margin and can be viewed as an approximation of the support vector machine, we propose a large-margin regularization method for the softmax cross-entropy loss. The proposed loss has two advantages: improved generalization performance and easy optimization. Experimental results on three small-sample datasets show that our regularization method achieves good performance and outperforms existing popular regularization methods for neural networks.
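The abstract does not state the exact form of the regularizer, so the sketch below is only a rough illustration of the general idea: a standard softmax cross-entropy term plus a margin-encouraging penalty. The hinge-style penalty on the gap between the target logit and the strongest competing logit, the function names, and the hyperparameters `lam` and `margin` are all illustrative assumptions, not the paper's formulation.

```python
# Minimal sketch (NOT the paper's exact method): softmax cross-entropy plus an
# assumed hinge-style margin penalty. All names and the penalty form are
# illustrative assumptions.
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def large_margin_regularized_ce(logits, labels, lam=0.1, margin=1.0):
    """logits: (N, C) class scores; labels: (N,) integer class ids."""
    n = logits.shape[0]
    probs = softmax(logits)
    # Standard softmax cross-entropy on the target classes.
    ce = -np.log(probs[np.arange(n), labels] + 1e-12).mean()

    # Assumed margin penalty: encourage the target logit to exceed the best
    # non-target logit by at least `margin`.
    target = logits[np.arange(n), labels]
    others = logits.copy()
    others[np.arange(n), labels] = -np.inf
    runner_up = others.max(axis=1)
    margin_penalty = np.maximum(0.0, margin - (target - runner_up)).mean()

    return ce + lam * margin_penalty

# Toy usage with random logits for 4 samples and 3 classes.
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 3))
labels = np.array([0, 2, 1, 0])
print(large_margin_regularized_ce(logits, labels))
```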
