Article

Complexity control by gradient descent in deep networks

Journal

NATURE COMMUNICATIONS
Volume 11, Issue 1, Pages -

Publisher

NATURE PUBLISHING GROUP
DOI: 10.1038/s41467-020-14663-9

Keywords

-

Funding

  1. Center for Minds, Brains and Machines (CBMM) - NSF STC award [CCF-1231216]
  2. C-BRIC, one of six centers in JUMP, a Semiconductor Research Corporation (SRC) program - DARPA

Abstract

Overparametrized deep networks predict well, despite the lack of an explicit complexity control during training, such as an explicit regularization term. For exponential-type loss functions, we solve this puzzle by showing an effective regularization effect of gradient descent in terms of the normalized weights that are relevant for classification.
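The effect described in the abstract can be seen even in a linear model: with an exponential-type loss on separable data, plain gradient descent lets the unnormalized weight norm grow throughout training, while the normalized direction w/||w|| (the quantity that actually determines the classifier) converges. The sketch below is an illustrative toy, not the paper's code; the data, learning rate, and step counts are assumptions chosen only to make the two behaviors visible.

```python
import numpy as np

# Two linearly separable Gaussian clusters with labels +1 / -1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(+1.0, 0.3, (20, 2)),
               rng.normal(-1.0, 0.3, (20, 2))])
y = np.hstack([np.ones(20), -np.ones(20)])

w = np.zeros(2)
lr = 0.01
norms, dirs = [], []
for step in range(1, 5001):
    margins = y * (X @ w)
    # Gradient of the exponential loss  sum_i exp(-y_i * w.x_i).
    grad = -(np.exp(-margins) * y) @ X
    w -= lr * grad
    if step % 1000 == 0:
        norms.append(np.linalg.norm(w))
        dirs.append(w / norms[-1])

# ||w|| keeps increasing, while consecutive normalized directions
# become nearly identical (dot products approach 1).
print("||w|| at checkpoints:", np.round(norms, 2))
print("direction alignment:", [round(float(dirs[i] @ dirs[i + 1]), 6)
                               for i in range(len(dirs) - 1)])
```

The unbounded norm is why no explicit regularizer seems to be at work; the converging direction is the implicit complexity control the paper attributes to gradient descent.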
