Journal
JOURNAL OF MATHEMATICAL IMAGING AND VISION
Volume 62, Issue 3, Pages 328-351
Publisher
SPRINGER
DOI: 10.1007/s10851-019-00922-y
Keywords
Deep feedforward neural networks; Residual neural networks; Stability; Differential inclusions; Optimal control problems
Funding
- AFOSR [FA9550-17-1-0125]
- NSF CAREER grant [1752116]
- Direct For Mathematical & Physical Scien
- Division Of Mathematical Sciences [1752116] Funding Source: National Science Foundation
Abstract
The residual neural network (ResNet) is a popular deep network architecture that achieves high accuracy on several image processing problems. To analyze the behavior and structure of ResNet, recent work has established connections between ResNets and continuous-time optimal control problems. In this work, we show that the post-activation ResNet is related to an optimal control problem with differential inclusions, and we provide continuous-time stability results for the differential inclusion associated with ResNet. Motivated by the stability conditions, we show that alterations of either the architecture or the optimization problem can generate variants of ResNet that improve the theoretical stability bounds. In addition, we establish stability bounds for the full (discrete) network associated with two variants of ResNet, in particular, bounds on the growth of the features and a measure of the sensitivity of the features with respect to perturbations. These results also help to show the relationship among depth, regularization, and stability of the feature space. Computational experiments on the proposed variants show that the accuracy of ResNet is preserved and that the accuracy appears to be monotone with respect to depth and various corruptions.
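The ResNet-to-continuous-time connection mentioned in the abstract rests on viewing a stack of residual blocks as a forward-Euler discretization of an ODE. A minimal sketch of this idea is below; the layer form, variable names, and step size are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np


def post_activation_residual_step(x, W, b, h=1.0):
    """One post-activation residual update: x_{k+1} = x_k + h * relu(W x_k + b).

    For small step size h, composing many such blocks is a forward-Euler
    discretization of the continuous-time flow x'(t) = relu(W(t) x(t) + b(t)).
    (Sketch only; the precise block structure is an assumption.)
    """
    relu = lambda z: np.maximum(z, 0.0)
    return x + h * relu(W @ x + b)


# Stacking `depth` blocks with step h = 1/depth approximates the continuous flow;
# the weights here are random placeholders, not trained parameters.
rng = np.random.default_rng(0)
d, depth = 4, 10
x = rng.standard_normal(d)
W = 0.1 * rng.standard_normal((d, d))
b = np.zeros(d)
for _ in range(depth):
    x = post_activation_residual_step(x, W, b, h=1.0 / depth)
```

Under this reading, stability questions about the network become stability questions about the associated differential inclusion, which is the perspective the abstract describes.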