Article

The Lawson-Hanson algorithm with deviation maximization: Finite convergence and sparse recovery

Journal

Publisher

WILEY
DOI: 10.1002/nla.2490

Keywords

block pivoting; nonnegative least squares; sparse recovery


Abstract

The Lawson-Hanson with Deviation Maximization (LHDM) method is a block algorithm for the solution of NonNegative Least Squares (NNLS) problems. In this work we devise an improved version of LHDM and we show that it terminates in a finite number of steps, unlike the previous version, originally developed for a special class of matrices. Moreover, we are concerned with finding sparse solutions of underdetermined linear systems by means of NNLS. An extensive campaign of experiments is performed in order to evaluate the performance gain with respect to the standard Lawson-Hanson algorithm. We also show the ability of LHDM to retrieve sparse solutions, comparing it against several ℓ1-minimization solvers in terms of solution quality and time-to-solution on a large set of dense instances.
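The idea of recovering a sparse solution of an underdetermined system via NNLS can be illustrated with a minimal sketch. Note this uses SciPy's `scipy.optimize.nnls`, which implements the classical Lawson-Hanson active-set algorithm, not the LHDM variant studied in the paper; the problem sizes and sparsity level below are illustrative choices, not the paper's benchmark setup.

```python
# Minimal sketch: sparse recovery of an underdetermined system via NNLS.
# Uses SciPy's classical Lawson-Hanson solver (not the paper's LHDM method).
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
m, n, k = 20, 50, 3                      # m < n: underdetermined; k-sparse truth

# Random dense matrix and a nonnegative k-sparse ground-truth solution.
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.uniform(1.0, 2.0, size=k)
b = A @ x_true                           # consistent right-hand side

# Solve min ||Ax - b||_2 subject to x >= 0.
x, rnorm = nnls(A, b)

# The active-set solution has at most m nonzeros; with a nonnegative
# sparse ground truth it is often much sparser than that.
print("residual norm:", rnorm)
print("nonzeros in NNLS solution:", np.count_nonzero(x > 1e-8))
```

Since the system is consistent with a nonnegative solution, the active-set method drives the residual to (numerically) zero, and the returned support is at most `m` columns; this is the mechanism the paper exploits when comparing NNLS-based recovery against ℓ1-minimization solvers.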
