Article

Convergence rates of general regularization methods for statistical inverse problems and applications

Journal

SIAM Journal on Numerical Analysis
Volume 45, Issue 6, Pages 2610-2636

Publisher

SIAM Publications
DOI: 10.1137/060651884

Keywords

statistical inverse problems; iterative regularization methods; Tikhonov regularization; nonparametric regression; minimax convergence rates; satellite gradiometry; Hilbert scales; boosting; errors-in-variables

Abstract

Previously, the convergence analysis for linear statistical inverse problems has mainly focused on spectral cut-off and Tikhonov-type estimators. Spectral cut-off estimators achieve minimax rates for a broad range of smoothness classes and operators, but their practical usefulness is limited by the fact that they require a complete spectral decomposition of the operator. Tikhonov estimators are simpler to compute but still involve the inversion of an operator and achieve minimax rates only in restricted smoothness classes. In this paper we introduce a unifying technique to study the mean square error of a large class of regularization methods (spectral methods) including the aforementioned estimators as well as many iterative methods, such as ν-methods and the Landweber iteration. The latter estimators converge at the same rate as spectral cut-off but require only matrix-vector products. Our results are applied to various problems; in particular we obtain precise convergence rates for satellite gradiometry, L2-boosting, and errors-in-variables problems.
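The contrast drawn in the abstract can be illustrated on a toy problem. The sketch below (not from the paper; all parameters are invented for illustration) compares Landweber iteration, which needs only matrix-vector products with A and its adjoint, against Tikhonov regularization, which inverts a perturbed operator, on a noisy linear inverse problem with a diagonal, ill-conditioned A.

```python
import numpy as np

# Toy statistical inverse problem y = A x + noise, with singular values
# of A decaying like 1/k. Parameters below are illustrative choices.
rng = np.random.default_rng(0)
n = 50
k = np.arange(1, n + 1)
A = np.diag(1.0 / k)          # ill-conditioned diagonal operator
x_true = 1.0 / k              # a "smooth" truth with decaying coefficients
y = A @ x_true + 1e-2 * rng.standard_normal(n)

# Naive inversion amplifies noise by the inverse singular values.
x_naive = np.linalg.solve(A, y)

# Landweber iteration: x_{j+1} = x_j + mu * A^T (y - A x_j), with step
# size 0 < mu < 2/||A||^2; only matrix-vector products are required,
# and the stopping index plays the role of the regularization parameter.
mu = 1.0 / np.linalg.norm(A, 2) ** 2
x_lw = np.zeros(n)
for _ in range(200):
    x_lw = x_lw + mu * (A.T @ (y - A @ x_lw))

# Tikhonov regularization: x_alpha = (A^T A + alpha I)^{-1} A^T y,
# which requires solving a linear system (inverting an operator).
alpha = 1e-2
x_tik = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

err_naive = np.linalg.norm(x_naive - x_true)
err_lw = np.linalg.norm(x_lw - x_true)
err_tik = np.linalg.norm(x_tik - x_true)
```

Both regularized estimators should beat naive inversion by a wide margin here; on large or matrix-free problems the Landweber variant is attractive precisely because it never forms or factors A.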

