Article

Sparse Legendre expansions via l1-minimization

Journal

JOURNAL OF APPROXIMATION THEORY
Volume 164, Issue 5, Pages 517-533

Publisher

ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.jat.2012.01.008

Keywords

Legendre polynomials; Sparse recovery; Compressive sensing; l1-minimization; Condition numbers; Random matrices; Orthogonal polynomials

Funding

  1. National Science Foundation
  2. Hausdorff Center for Mathematics
  3. WWTF project SPORTS [MA 07-004]
  4. Division Of Mathematical Sciences
  5. Directorate for Mathematical & Physical Sciences [0902720] Funding Source: National Science Foundation

Abstract

We consider the problem of recovering polynomials that are sparse with respect to the basis of Legendre polynomials from a small number of random samples. In particular, we show that a Legendre s-sparse polynomial of maximal degree N can be recovered from m ≍ s log^4(N) random samples that are chosen independently according to the Chebyshev probability measure dν(x) = π^{-1}(1 - x^2)^{-1/2} dx. As an efficient recovery method, l1-minimization can be used. We establish these results by verifying the restricted isometry property of a preconditioned random Legendre matrix. We then extend these results to a large class of orthogonal polynomial systems, including the Jacobi polynomials, of which the Legendre polynomials are a special case. Finally, we transpose these results into the setting of approximate recovery for functions in certain infinite-dimensional function spaces. © 2012 Elsevier Inc. All rights reserved.
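
The abstract outlines a concrete recovery pipeline: draw sample points from the Chebyshev measure, precondition the Legendre sampling matrix, and recover the sparse coefficient vector by l1-minimization. The sketch below is a minimal illustration of that pipeline and is not the authors' code; the problem sizes (N, s, m), the orthonormal normalization sqrt(2n+1) P_n(x), the preconditioning weight (π/2)^{1/2}(1 - x^2)^{1/4}, and the use of scipy.optimize.linprog to solve basis pursuit are illustrative assumptions.

```python
# Minimal sketch: sparse Legendre recovery from Chebyshev-distributed samples
# via preconditioning and l1-minimization (basis pursuit as a linear program).
import numpy as np
from numpy.polynomial.legendre import legvander
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, s, m = 80, 5, 60              # illustrative: max degree, sparsity, sample count

# Ground-truth s-sparse coefficient vector in the orthonormal Legendre basis
c_true = np.zeros(N + 1)
support = rng.choice(N + 1, size=s, replace=False)
c_true[support] = rng.standard_normal(s)

# Sample points from the Chebyshev measure dnu(x) = pi^{-1}(1 - x^2)^{-1/2} dx
x = np.cos(np.pi * rng.random(m))

# Legendre sampling matrix, column n = sqrt(2n+1) P_n(x_j) (orthonormal w.r.t. dx/2)
Phi = legvander(x, N) * np.sqrt(2 * np.arange(N + 1) + 1)
y = Phi @ c_true                  # noiseless samples of the sparse polynomial

# Precondition: multiply row j by w(x_j) = (pi/2)^{1/2} (1 - x_j^2)^{1/4}
w = np.sqrt(np.pi / 2) * (1 - x**2) ** 0.25
A, b = w[:, None] * Phi, w * y

# Basis pursuit  min ||c||_1  s.t.  A c = b,  written as an LP with c = c_plus - c_minus
n = N + 1
res = linprog(c=np.ones(2 * n),
              A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=[(0, None)] * (2 * n), method="highs")
c_hat = res.x[:n] - res.x[n:]
print("recovery error:", np.linalg.norm(c_hat - c_true))
```

The linear-programming reformulation (splitting the coefficient vector into nonnegative parts) is only one way to solve the l1 problem; any basis-pursuit solver could be substituted without changing the sampling or preconditioning steps.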
