Article

Forward Stability of ResNet and Its Variants

Journal

JOURNAL OF MATHEMATICAL IMAGING AND VISION
Volume 62, Issue 3, Pages 328-351

Publisher

SPRINGER
DOI: 10.1007/s10851-019-00922-y

Keywords

Deep feedforward neural networks; Residual neural networks; Stability; Differential inclusions; Optimal control problems

Funding

  1. AFOSR [FA9550-17-1-0125]
  2. NSF CAREER grant [1752116]
  3. Directorate For Mathematical & Physical Sciences
  4. Division Of Mathematical Sciences [1752116] Funding Source: National Science Foundation

Abstract

The residual neural network (ResNet) is a popular deep network architecture that achieves high accuracy on several image processing problems. To analyze the behavior and structure of ResNet, recent work has focused on establishing connections between ResNets and continuous-time optimal control problems. In this work, we show that the post-activation ResNet is related to an optimal control problem with differential inclusions, and we provide continuous-time stability results for the differential inclusion associated with ResNet. Motivated by the stability conditions, we show that alterations of either the architecture or the optimization problem can generate variants of ResNet that improve the theoretical stability bounds. In addition, we establish stability bounds for the full (discrete) network associated with two variants of ResNet, in particular, bounds on the growth of the features and a measure of the sensitivity of the features with respect to perturbations. These results also help to show the relationship between the depth, regularization, and stability of the feature space. Computational experiments on the proposed variants show that the accuracy of ResNet is preserved and that the accuracy appears to be monotone with respect to depth and various corruptions.
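To make the discrete dynamics referred to in the abstract concrete, the following is a minimal NumPy sketch (not the authors' implementation; the linear layer, step size h, and dimensions are hypothetical) of a post-activation residual step, x_{k+1} = sigma(x_k + h (W_k x_k + b_k)), where the nonlinearity is applied after the skip connection. Because ReLU is nonsmooth, the small-h continuous-time limit of such updates is naturally modeled by a differential inclusion rather than an ODE, which is the kind of limit the paper analyzes.

import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def post_activation_step(x, W, b, h=1.0):
    # Post-activation residual update: x_{k+1} = relu(x_k + h * (W @ x_k + b)).
    # The activation acts on the sum (skip connection plus residual map), as
    # opposed to the pre-activation form x_{k+1} = x_k + relu(W @ x_k + b).
    return relu(x + h * (W @ x + b))

# Toy forward pass through L residual layers (random weights, for illustration only).
rng = np.random.default_rng(0)
d, L, h = 4, 10, 0.1
x = rng.standard_normal(d)
for _ in range(L):
    W = 0.1 * rng.standard_normal((d, d))
    b = 0.1 * rng.standard_normal(d)
    x = post_activation_step(x, W, b, h)
print(x)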

