
Why and When Can Deep-but Not Shallow-networks Avoid the Curse of Dimensionality: A Review

Journal

International Journal of Automation and Computing

Publisher

Springer Nature
DOI: 10.1007/s11633-017-1054-2

Keywords

Machine learning; neural networks; deep and shallow networks; convolutional neural networks; function approximation; deep learning

Funding

  1. Center for Brains, Minds and Machines (CBMM)
  2. NSF STC Award CCF-1231216
  3. ARO W911NF-15-1-0385

Abstract

The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which it can be exponentially better than shallow learning. A class of deep convolutional networks represents an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage. Implications of a few key theorems are discussed, together with new results, open problems and conjectures.
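The exponential advantage discussed in the abstract rests on target functions that are compositions of low-dimensional constituents. As a minimal sketch of that idea (the specific constituent function h below is a hypothetical choice, not taken from the paper), the following Python evaluates a binary-tree compositional function of the kind the review uses as its canonical example. Per the paper's main theorems, a deep network whose graph mirrors this tree can reach a fixed accuracy with a parameter count growing roughly linearly in the input dimension d, while a generic shallow approximant needs a count growing exponentially in d.

```python
# Minimal sketch of a compositional target function: a binary-tree
# composition of 2-ary constituent functions. The choice of h is
# hypothetical; only the tree structure matters for the argument.

import numpy as np

def h(a, b):
    """A smooth 2-ary constituent function (illustrative choice)."""
    return np.tanh(a + 2.0 * b)

def compositional_f(x):
    """Evaluate a binary-tree composition of h over the inputs.

    For d = 8 this computes
        f(x1,...,x8) = h(h(h(x1,x2), h(x3,x4)), h(h(x5,x6), h(x7,x8))),
    an 8-dimensional function built entirely from 2-dimensional pieces.
    """
    level = list(x)
    while len(level) > 1:
        # Combine adjacent pairs, halving the number of nodes per level.
        level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

x = np.random.randn(8)  # d must be a power of 2 in this simple sketch
print(compositional_f(x))
```

A deep network matching this structure needs only one small subnetwork per tree node, of which there are d - 1 in total; a shallow network sees only the flat d-dimensional function and cannot exploit the hierarchy.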

Authors

Tomaso Poggio, Hrushikesh Mhaskar, Lorenzo Rosasco, Brando Miranda, Qianli Liao
