Article

Sparse Signal Recovery via Generalized Entropy Functions Minimization

Journal

IEEE Transactions on Signal Processing
Volume 67, Issue 5, Pages 1322-1337

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TSP.2018.2889951

Keywords

Compressive sensing; entropy functions minimization; image recovery; sparse representation classification

Funding

  1. National Science Foundation [NSF-CCF-1117545, NSF-CCF-1422995, NSF-EECS-1443936]


Compressive sensing relies on a sparse prior imposed on the signal of interest to solve the ill-posed recovery problem in an under-determined linear system. The objective function used to enforce this prior should be both effective and easy to optimize. Motivated by the entropy concept from information theory, this paper proposes the generalized Shannon entropy function and the generalized Rényi entropy function of the signal as sparsity-promoting regularizers. Both entropy functions are nonconvex and non-separable, and their local minima occur only on the boundaries of the orthants of the Euclidean space. Compared with other popular objective functions, minimizing the generalized entropy functions adaptively promotes multiple high-energy coefficients while suppressing the remaining low-energy coefficients. The corresponding optimization problems can be recast into a series of reweighted ℓ1-norm minimization problems, which are then solved efficiently by adapting FISTA (the fast iterative shrinkage-thresholding algorithm). Sparse signal recovery experiments on both simulated and real data show that the proposed entropy-function minimization approaches outperform other popular approaches and achieve state-of-the-art performance.
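
The abstract does not reproduce the exact regularizers, but the idea it describes can be illustrated with a short sketch. The Python snippet below is a minimal illustration, not the authors' code: it assumes the generalized Shannon and Rényi entropy functions treat the normalized magnitudes |x_i|^p / ||x||_p^p as a probability mass and measure how concentrated that mass is, so that sparser vectors receive lower entropy values. The function names and the parameters p, alpha, and eps are illustrative choices, not taken from the paper.

```python
import numpy as np

def shannon_entropy_fn(x, p=1.0, eps=1e-12):
    """Generalized Shannon entropy of the normalized magnitudes |x_i|^p / ||x||_p^p.

    Lower values indicate a more concentrated (sparser) coefficient vector.
    """
    w = np.abs(x) ** p
    w = w / (w.sum() + eps)          # normalized "probability" weights
    return float(-np.sum(w * np.log(w + eps)))

def renyi_entropy_fn(x, p=1.0, alpha=0.5, eps=1e-12):
    """Generalized Renyi entropy of order alpha (alpha > 0, alpha != 1)."""
    w = np.abs(x) ** p
    w = w / (w.sum() + eps)
    return float(np.log(np.sum(w ** alpha) + eps) / (1.0 - alpha))

# Toy comparison: a 5-sparse vector versus a dense Gaussian vector of length 100.
rng = np.random.default_rng(0)
sparse = np.zeros(100)
sparse[:5] = rng.normal(size=5)
dense = rng.normal(size=100)

print("Shannon entropy, sparse:", shannon_entropy_fn(sparse))
print("Shannon entropy, dense: ", shannon_entropy_fn(dense))
print("Renyi entropy,  sparse:", renyi_entropy_fn(sparse))
print("Renyi entropy,  dense: ", renyi_entropy_fn(dense))
```

Under these assumptions, a signal with a few dominant coefficients scores a much lower entropy than a dense signal of the same length, which is the behavior the abstract attributes to the proposed regularizers; the paper itself minimizes such functions by recasting them as reweighted ℓ1-norm problems solved with an adapted FISTA.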

