Journal
IEEE ACCESS
Volume 6, Pages 69883-69906
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2018.2880454
Keywords
Sparse; low-rank; nonconvex; compressive sensing; regression; covariance matrix estimation; matrix completion; principal component analysis
Funding
- National Natural Science Foundation of China [61871265, 61401501, 61571296]
In the past decade, sparse and low-rank recovery has drawn much attention in areas such as signal/image processing, statistics, bioinformatics, and machine learning. To induce sparsity and/or low-rankness, the ℓ1 norm and the nuclear norm are among the most popular regularization penalties due to their convexity. While the ℓ1 and nuclear norms are convenient because the related convex optimization problems are usually tractable, it has been shown in many applications that a nonconvex penalty can yield significantly better performance. Recently, nonconvex regularization-based sparse and low-rank recovery has attracted considerable interest, and it has in fact been a main driver of recent progress in nonconvex and nonsmooth optimization. This paper gives an overview of this topic in various fields of signal processing, statistics, and machine learning, including compressive sensing, sparse regression and variable selection, sparse signal separation, sparse principal component analysis (PCA), estimation of large covariance and inverse covariance matrices, matrix completion, and robust PCA. We present recent developments of nonconvex regularization-based sparse and low-rank recovery in these fields, addressing the issues of penalty selection, applications, and the convergence of nonconvex algorithms. Code is available at https://github.com/FWen/nereg.git.
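To illustrate the bias issue the abstract alludes to, here is a minimal sketch (not from the paper's code) contrasting the proximal operator of the convex ℓ1 penalty (soft thresholding, which shrinks every coefficient) with hard thresholding, the simplest nonconvex alternative, which leaves large coefficients unbiased:

```python
import numpy as np

def soft_threshold(x, lam):
    # Proximal operator of the l1 norm: every nonzero entry is
    # shrunk toward zero by lam, biasing large coefficients.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def hard_threshold(x, t):
    # Hard thresholding at level t, a simple nonconvex rule:
    # small entries are zeroed out, large entries are kept exactly.
    return np.where(np.abs(x) > t, x, 0.0)

x = np.array([-3.0, -0.4, 0.2, 2.5])
print(soft_threshold(x, 1.0))   # large entries shrunk by 1.0
print(hard_threshold(x, 1.0))   # large entries kept unchanged
```

Both rules set the small entries to zero, but only soft thresholding distorts the surviving large entries; avoiding this shrinkage bias is one motivation for nonconvex penalties such as SCAD and MCP discussed in surveys of this area.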