Article

Faster randomized block sparse Kaczmarz by averaging

Journal

NUMERICAL ALGORITHMS
Volume -, Issue -, Pages -

Publisher

SPRINGER
DOI: 10.1007/s11075-022-01473-x

Keywords

Randomized Kaczmarz; Sparse solutions; Parallel methods

Funding

  1. Projekt DEAL
  2. ITN-ETN project TraDE-OPT -
  3. European Union [861137]
  4. Marie Curie Actions (MSCA) [861137]

Abstract

This study introduces a parallel (mini-batch) version of the randomized sparse Kaczmarz method that computes its iterates by averaging several Kaczmarz steps. When parallel computations are exploited, the method is proven to converge faster in expectation than the standard method. It can also be interpreted as a variant of several related algorithms, and for inconsistent systems the iterates are shown to converge up to an error on the order of the noise level.
The standard randomized sparse Kaczmarz (RSK) method is an algorithm for computing sparse solutions of linear systems of equations. It uses sequential updates and thus does not take advantage of parallel computations. In this work, we introduce a parallel (mini-batch) version of RSK based on averaging several Kaczmarz steps. Naturally, this method allows for parallelization, and we show that it can also leverage large overrelaxation parameters. We prove linear expected convergence and show that, when parallel computations can be exploited, the method provably converges faster than the standard method. The method can also be viewed as a variant of the linearized Bregman algorithm, a randomized dual block coordinate descent update, a stochastic mirror descent update, or a relaxed version of RSK, and we recover the standard RSK method when the batch size is equal to one. We also provide error estimates for inconsistent systems and show that the iterates converge up to an error on the order of the noise level. Finally, numerical examples illustrate the benefits of the new algorithm.
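
The abstract describes the averaged (mini-batch) sparse Kaczmarz update only in words; the following NumPy sketch shows how such an iteration can look. It is an illustrative outline under stated assumptions, not the paper's exact algorithm: the function names (averaged_sparse_kaczmarz, soft_threshold), the uniform row sampling, the plain 1/batch_size averaging weights, and the default values of lam (sparsity parameter) and eta (relaxation parameter) are choices made here for concreteness.

    import numpy as np

    def soft_threshold(z, lam):
        # Componentwise soft-thresholding (shrinkage) operator used to map
        # the dual iterate to a sparse primal iterate.
        return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

    def averaged_sparse_kaczmarz(A, b, lam=1.0, batch_size=10, eta=1.0,
                                 n_iters=1000, seed=0):
        # Sketch of a mini-batch randomized sparse Kaczmarz iteration:
        # sample a block of rows, average the single-row Kaczmarz steps,
        # apply the (possibly overrelaxed) averaged step to the dual
        # variable z, and recover x by soft-thresholding.
        # lam, eta and the sampling scheme are illustrative assumptions.
        rng = np.random.default_rng(seed)
        m, n = A.shape
        row_norms_sq = np.einsum('ij,ij->i', A, A)   # squared row norms ||a_i||^2
        z = np.zeros(n)                               # dual iterate
        x = soft_threshold(z, lam)                    # primal iterate
        for _ in range(n_iters):
            idx = rng.choice(m, size=batch_size, replace=False)
            residuals = A[idx] @ x - b[idx]
            # Average of the single-row Kaczmarz directions for the block.
            avg_step = (residuals / row_norms_sq[idx]) @ A[idx] / batch_size
            z -= eta * avg_step                       # eta > 1: overrelaxation
            x = soft_threshold(z, lam)
        return x

With batch_size equal to one and eta equal to one, the loop reduces to a standard RSK step, and choosing eta larger than one corresponds to the overrelaxation mentioned in the abstract; the individual steps within a batch are independent and could be evaluated in parallel.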

