Journal
SIAM JOURNAL ON SCIENTIFIC COMPUTING
Volume 43, Issue 5, Pages S269-S292
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/20M1348091
Keywords
inverse problems; optimization; machine learning; regularization; sparsity; total variation
Funding
- Delphi consortium
- Netherlands Organization for Scientific Research (NWO) [613.009.032]
The paper studies the SR3 algorithm for solving regularized least-squares problems, analyzing the conditioning of the relaxed problem, the error incurred by relaxation, and the method's behavior on numerical examples.
We consider regularized least-squares problems of the form $\min_x \frac{1}{2}\|Ax-b\|_2^2 + R(Lx)$. Recently, Zheng et al. [IEEE Access, 7 (2019), pp. 1404-1423] proposed an algorithm called Sparse Relaxed Regularized Regression (SR3) that employs a splitting strategy by introducing an auxiliary variable $y$ and solves $\min_{x,y} \frac{1}{2}\|Ax-b\|_2^2 + \frac{\kappa}{2}\|Lx-y\|_2^2 + R(y)$. By minimizing out the variable $x$, we obtain an equivalent optimization problem $\min_y \frac{1}{2}\|F_\kappa y - g_\kappa\|_2^2 + R(y)$. In our work, we view the SR3 method as a way to approximately solve the regularized problem. We analyze the conditioning of the relaxed problem in general and give an expression for the SVD of $F_\kappa$ as a function of $\kappa$. Furthermore, we relate the Pareto curve of the original problem to that of the relaxed problem, and we quantify the error incurred by relaxation in terms of $\kappa$. Finally, we propose an efficient iterative method for solving the relaxed problem with inexact inner iterations. Numerical examples illustrate the approach.
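The splitting described in the abstract can be sketched as an alternating scheme: with $y$ fixed, the $x$-update is a linear least-squares solve; with $x$ fixed, the $y$-update is a proximal step on $R$. The following is a minimal illustrative sketch, not the paper's implementation: it assumes $R = \lambda\|\cdot\|_1$ (so the prox is soft-thresholding), and the function name `sr3_sketch` and a dense normal-equations solve are choices made here for brevity.

```python
import numpy as np

def sr3_sketch(A, b, L, kappa, lam, iters=100):
    """Illustrative SR3-style alternating minimization for
    min_{x,y} 1/2||Ax-b||^2 + kappa/2||Lx-y||^2 + lam*||y||_1.
    A dense solve is used for clarity; large problems would use
    an iterative (inexact) inner solver instead."""
    p = L.shape[0]
    y = np.zeros(p)
    # Normal-equations matrix for the x-update: A^T A + kappa * L^T L
    H = A.T @ A + kappa * (L.T @ L)
    for _ in range(iters):
        # x-update: minimize 1/2||Ax-b||^2 + kappa/2||Lx-y||^2 over x
        x = np.linalg.solve(H, A.T @ b + kappa * (L.T @ y))
        # y-update: prox of (lam/kappa)*||.||_1 at Lx (soft-thresholding)
        z = L @ x
        y = np.sign(z) * np.maximum(np.abs(z) - lam / kappa, 0.0)
    return x, y
```

For small $\kappa$ the coupling term is weak and the relaxation is loose; as $\kappa$ grows, $y$ is driven toward $Lx$ and the relaxed problem approaches the original one, which is the trade-off the paper quantifies.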