Article

Shakedrop Regularization for Deep Residual Learning

Journal

IEEE ACCESS
Volume 7, Pages 186126-186136

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2019.2960566

Keywords

Computer vision; image classification; neural networks

Funding

  1. JST CREST [JPMJCR16E1]
  2. JSPS KAKENHI [JP25240028, JP17H01803, JP18J15255, JP18K19785]
  3. JST AIP PRISM [J18ZZ00418]
  4. Artificial Intelligence Research Promotion Foundation
  5. AWS Cloud Credits for Research Program

Abstract

Overfitting is a crucial problem in deep neural networks, even in the latest network architectures. In this paper, to relieve the overfitting effect of ResNet and its improvements (i.e., Wide ResNet, PyramidNet, and ResNeXt), we propose a new regularization method called ShakeDrop regularization. ShakeDrop is inspired by Shake-Shake, an effective regularization method that can, however, be applied only to ResNeXt. ShakeDrop is more effective than Shake-Shake and can be applied not only to ResNeXt but also to ResNet, Wide ResNet, and PyramidNet. A key requirement is training stability: because effective regularization often causes unstable training, we introduce a training stabilizer, which is an unusual use of an existing regularizer. Through experiments under various conditions, we demonstrate the conditions under which ShakeDrop works well.
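The regularization described in the abstract can be sketched in the forward pass as follows. This is an illustrative NumPy sketch, not the authors' implementation: it shows only the forward scaling rule y = x + (b_l + alpha - b_l*alpha) * F(x) with b_l ~ Bernoulli(p_l) and alpha ~ Uniform(-1, 1), and the linear-decay schedule for p_l borrowed from Stochastic Depth; the paper's full method also rescales gradients in the backward pass with a separate random factor, which this forward-only sketch omits. The function names are assumptions for illustration.

```python
import numpy as np

def shakedrop_forward(x, Fx, p_l=0.5, training=True, rng=None):
    """Illustrative ShakeDrop forward pass (forward scaling only).

    x   : identity-path input of the residual block
    Fx  : output of the residual function F(x)
    p_l : probability that the residual branch is kept (b_l = 1)
    """
    if training:
        rng = rng or np.random.default_rng()
        b = rng.binomial(1, p_l)        # Bernoulli gate b_l
        alpha = rng.uniform(-1.0, 1.0)  # sign-flipping perturbation
        # scale = 1 when b == 1 (branch kept), alpha when b == 0
        scale = b + alpha - b * alpha
        return x + scale * Fx
    # Inference: expected scale is p_l + (1 - p_l) * E[alpha] = p_l,
    # since E[alpha] = 0.
    return x + p_l * Fx

def linear_decay_pl(l, L, p_L=0.5):
    """Linear-decay rule for the keep probability of block l of L,
    as used in Stochastic Depth: p_l = 1 - (l / L) * (1 - p_L)."""
    return 1.0 - (l / L) * (1.0 - p_L)
```

At inference the residual branch is deterministically scaled by p_l, so no random numbers are drawn; during training, a negative alpha can flip the sign of the residual branch, which is why a training stabilizer is needed.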

