Journal
PATTERN RECOGNITION (GCPR 2017)
Volume: 10496, Issue: -, Pages: 281-293
Publisher
SPRINGER INTERNATIONAL PUBLISHING AG
DOI: 10.1007/978-3-319-66709-6_23
Keywords
-
Funding
- Austrian Science Fund (FWF) under the START project BIVISION [Y729]
- European Research Council under the Horizon 2020 program, ERC starting grant HOMOVIS [640156]
Abstract
In this paper, we introduce variational networks (VNs) for image reconstruction. VNs are fully learned models based on the framework of incremental proximal gradient methods. They provide a natural transition between classical variational methods and state-of-the-art residual neural networks. Due to their incremental nature, VNs are very efficient, but only approximately minimize the underlying variational model. Surprisingly, in our numerical experiments on image reconstruction problems it turns out that giving up exact minimization leads to a consistent performance increase, in particular in the case of convex models.
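The abstract describes VNs as learned, unrolled variants of incremental proximal gradient methods. As background, the following is a minimal sketch of the classical (hand-crafted, not learned) proximal gradient iteration for an l1-regularized least-squares reconstruction; the function names, the toy problem, and all parameters are illustrative assumptions, not the paper's method. A VN would, roughly, unroll a fixed number of such steps and learn the filters and activation functions in place of the fixed gradient and soft-thresholding operations.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||x||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def proximal_gradient(A, y, lam, step, n_iter=500):
    """Classical proximal gradient (ISTA) for
        min_x 0.5 * ||A x - y||^2 + lam * ||x||_1.
    Illustrative sketch only: a VN replaces the hand-crafted gradient and
    prox with learned components and runs a fixed, small number of steps.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                          # gradient of data term
        x = soft_threshold(x - step * grad, step * lam)   # prox of regularizer
    return x
```

For convergence, the step size is typically chosen as 1/L with L the Lipschitz constant of the data-term gradient (the largest eigenvalue of A^T A); the VN perspective in the paper instead treats such step sizes and operators as trainable.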