Article

Effect of Depth and Width on Local Minima in Deep Learning

Journal

NEURAL COMPUTATION
Volume 31, Issue 7, Pages 1462-1498

Publisher

MIT PRESS
DOI: 10.1162/neco_a_01195

Keywords

-

Funding

  1. NSF [1523767, 1723381]
  2. AFOSR [FA9550-17-1-0165]
  3. ONR [N00014-18-1-2847]
  4. Honda Research
  5. MIT-Sensetime Alliance on AI

Abstract

In this paper, we analyze the effects of depth and width on the quality of local minima, without the strong overparameterization and simplification assumptions used in the literature. Without any simplification assumption, for deep nonlinear neural networks with the squared loss, we theoretically show that the quality of local minima tends to improve toward the global minimum value as depth and width increase. Furthermore, with a locally induced structure on deep nonlinear neural networks, the values of local minima of neural networks are theoretically proven to be no worse than the globally optimal values of the corresponding classical machine learning models. We empirically support our theoretical observations with a synthetic data set as well as the MNIST, CIFAR-10, and SVHN data sets. Compared with previous studies that rely on strong overparameterization assumptions, the results in this letter do not require overparameterization and instead show the gradual effects of overparameterization as consequences of general results.
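As a rough illustration of the kind of empirical comparison the abstract describes, the sketch below trains one-hidden-layer ReLU networks of increasing width on synthetic data with the squared loss and compares the final training losses reached from several random initializations. This is a minimal sketch, not the paper's experiment: the widths, seeds, synthetic target, optimizer settings, and the restriction to a single hidden layer are all illustrative assumptions.

# Minimal sketch (assumptions, not the paper's setup): compare the spread of
# final squared-loss values reached by gradient descent as network width grows.
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_out = 200, 10, 1
X = rng.standard_normal((n, d_in))
Y = np.sin(X @ rng.standard_normal((d_in, d_out)))  # simple nonlinear target

def train(width, seed, steps=5000, lr=0.05):
    """Full-batch gradient descent on 0.5 * mean squared error."""
    r = np.random.default_rng(seed)
    W1 = r.standard_normal((d_in, width)) / np.sqrt(d_in)
    W2 = r.standard_normal((width, d_out)) / np.sqrt(width)
    for _ in range(steps):
        Z = X @ W1                      # pre-activations
        H = np.maximum(Z, 0.0)          # ReLU hidden layer
        Yhat = H @ W2
        dYhat = (Yhat - Y) / n          # gradient of 0.5/n * ||Yhat - Y||^2
        dW2 = H.T @ dYhat
        dH = dYhat @ W2.T
        dW1 = X.T @ (dH * (Z > 0))
        W1 -= lr * dW1
        W2 -= lr * dW2
    return 0.5 * np.mean((np.maximum(X @ W1, 0.0) @ W2 - Y) ** 2)

for width in [2, 8, 32, 128]:
    losses = [train(width, seed) for seed in range(5)]
    # If the quality of reachable minima improves with width, the worst-case
    # final loss and the spread across seeds should shrink as width grows.
    print(f"width={width:4d}  worst={max(losses):.4f}  best={min(losses):.4f}")

The printed worst-case versus best-case gap across seeds serves as a crude proxy for the quality of the local minima reached at each width; depth could be varied analogously by stacking additional hidden layers.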
