Journal
JOURNAL OF APPROXIMATION THEORY
Volume 164, Issue 5, Pages 517-533
Publisher
ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.jat.2012.01.008
Keywords
Legendre polynomials; Sparse recovery; Compressive sensing; ℓ1-minimization; Condition numbers; Random matrices; Orthogonal polynomials
Funding
- National Science Foundation
- Hausdorff Center for Mathematics
- WWTF project SPORTS [MA 07-004]
- Division Of Mathematical Sciences
- Directorate for Mathematical & Physical Sciences [0902720] Funding Source: National Science Foundation
Abstract
We consider the problem of recovering polynomials that are sparse with respect to the basis of Legendre polynomials from a small number of random samples. In particular, we show that a Legendre s-sparse polynomial of maximal degree N can be recovered from m ≍ s log⁴(N) random samples that are chosen independently according to the Chebyshev probability measure dν(x) = π⁻¹(1 − x²)⁻¹/² dx. As an efficient recovery method, ℓ1-minimization can be used. We establish these results by verifying the restricted isometry property of a preconditioned random Legendre matrix. We then extend these results to a large class of orthogonal polynomial systems, including the Jacobi polynomials, of which the Legendre polynomials are a special case. Finally, we transpose these results into the setting of approximate recovery for functions in certain infinite-dimensional function spaces. © 2012 Elsevier Inc. All rights reserved.
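The recovery pipeline described in the abstract can be sketched in a few lines of numpy/scipy. This is an illustrative toy, not the authors' code: the problem sizes (N, s, m), the random seed, and the use of scipy's LP solver for basis pursuit are all choices made here for demonstration. The sampling transform x = cos(πu) and the row preconditioner Q(x) = √(π/2)(1 − x²)^{1/4} follow the Chebyshev measure and preconditioned-Legendre construction stated in the abstract.

```python
import numpy as np
from numpy.polynomial import legendre
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, s, m = 40, 5, 120  # maximal degree, sparsity, number of samples (illustrative sizes)

# Draw samples from the Chebyshev measure dν(x) = π⁻¹(1 − x²)^(−1/2) dx
# via the transform x = cos(πu), u uniform on [0, 1).
x = np.cos(np.pi * rng.random(m))

# Orthonormal Legendre system: L_n = sqrt(2n+1) P_n (orthonormal w.r.t. dx/2 on [−1, 1]).
Phi = legendre.legvander(x, N - 1) * np.sqrt(2 * np.arange(N) + 1)

# Precondition each row by Q(x) = sqrt(π/2) (1 − x²)^(1/4), so the rows behave
# like samples of a bounded orthonormal system; normalize by sqrt(m).
Q = np.sqrt(np.pi / 2) * (1 - x**2) ** 0.25
A = (Q[:, None] * Phi) / np.sqrt(m)

# Ground-truth Legendre s-sparse coefficient vector and its (preconditioned) samples.
c = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
c[support] = rng.standard_normal(s)
b = A @ c

# ℓ1-minimization (basis pursuit) as a linear program:
#   min 1ᵀ(c⁺ + c⁻)  s.t.  A(c⁺ − c⁻) = b,  c⁺, c⁻ ≥ 0.
res = linprog(np.ones(2 * N), A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=[(0, None)] * (2 * N), method="highs")
c_hat = res.x[:N] - res.x[N:]
print("max recovery error:", np.max(np.abs(c_hat - c)))
```

With m well above the s log⁴(N) sampling rate, the recovered coefficients should match the true sparse vector up to solver tolerance; shrinking m toward s makes recovery fail, which is one way to probe the sampling threshold empirically.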