Article

NUCLEAR-NORM PENALIZATION AND OPTIMAL RATES FOR NOISY LOW-RANK MATRIX COMPLETION

Journal

ANNALS OF STATISTICS
Volume 39, Issue 5, Pages 2302-2329

Publisher

INSTITUTE OF MATHEMATICAL STATISTICS
DOI: 10.1214/11-AOS894

Keywords

Matrix completion; low-rank matrix estimation; recovery of the rank; statistical learning; optimal rate of convergence; noncommutative Bernstein inequality; Lasso

Funding

  1. NSF [DMS-09-06880, CCF-0808863, DMS-11-06644]
  2. Simons Foundation [209842]
  3. ANR
  4. PASCAL-2 Network of Excellence

Abstract

This paper deals with the trace regression model where $n$ entries or linear combinations of entries of an unknown $m_1 \times m_2$ matrix $A_0$ corrupted by noise are observed. We propose a new nuclear-norm penalized estimator of $A_0$ and establish a general sharp oracle inequality for this estimator for arbitrary values of $n, m_1, m_2$ under the condition of isometry in expectation. Then this method is applied to the matrix completion problem. In this case, the estimator admits a simple explicit form, and we prove that it satisfies oracle inequalities with faster rates of convergence than in previous works. They are valid, in particular, in the high-dimensional setting $m_1 m_2 \gg n$. We show that the obtained rates are optimal up to logarithmic factors in a minimax sense and also derive, for any fixed matrix $A_0$, a nonminimax lower bound on the rate of convergence of our estimator, which coincides with the upper bound up to a constant factor. Finally, we show that our procedure provides an exact recovery of the rank of $A_0$ with probability close to 1. We also discuss the statistical learning setting where there is no underlying model determined by $A_0$, and the aim is to find the best trace regression model approximating the data. As a by-product, we show that, under the restricted eigenvalue condition, the usual vector Lasso estimator satisfies a sharp oracle inequality (i.e., an oracle inequality with leading constant 1).
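
To make the setup concrete, the model and estimator described in the abstract can be written as follows. This is a sketch consistent with the abstract, not the paper's exact display: the noise symbols $\xi_i$ and the tuning parameter $\lambda$ are our notation, and $\|A\|_*$ denotes the nuclear norm (the sum of the singular values of $A$).

```latex
% Trace regression model: n noisy linear measurements of A_0
Y_i = \operatorname{tr}(X_i^{\top} A_0) + \xi_i, \qquad i = 1, \dots, n.

% Nuclear-norm penalized least squares (generic form; the paper's variant
% replaces the empirical quadratic term by its expectation, which is what
% yields the explicit form in the matrix completion case)
\hat{A} \in \arg\min_{A \in \mathbb{R}^{m_1 \times m_2}}
\left\{ \frac{1}{n} \sum_{i=1}^{n} \bigl( Y_i - \operatorname{tr}(X_i^{\top} A) \bigr)^2
       + \lambda \, \|A\|_* \right\}.
```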
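For the matrix completion case, where the abstract notes that the estimator "admits a simple explicit form", that form is singular-value soft-thresholding of a rescaled surrogate matrix assembled from the observed entries. Below is a minimal NumPy sketch under a uniform-sampling assumption; the function names, the sampling interface, and the threshold value are ours, not the paper's.

```python
import numpy as np

def soft_threshold_svd(X, lam):
    """Shrink the singular values of X by lam, dropping those below lam."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

def complete_matrix(rows, cols, ys, m1, m2, lam):
    """Hypothetical explicit-form estimator under uniform sampling:
    form X = (m1*m2/n) * sum_i ys[i] * e_{rows[i]} e_{cols[i]}^T,
    then soft-threshold its singular values at level lam."""
    n = len(ys)
    X = np.zeros((m1, m2))
    for r, c, y in zip(rows, cols, ys):
        X[r, c] += y
    X *= m1 * m2 / n
    return soft_threshold_svd(X, lam)

# Toy usage: recover a rank-2 matrix from n noisy sampled entries.
rng = np.random.default_rng(0)
m1, m2, rank, n = 60, 50, 2, 1500
A0 = rng.standard_normal((m1, rank)) @ rng.standard_normal((rank, m2))
rows = rng.integers(0, m1, size=n)
cols = rng.integers(0, m2, size=n)
ys = A0[rows, cols] + 0.1 * rng.standard_normal(n)
A_hat = complete_matrix(rows, cols, ys, m1, m2, lam=2.0)
print(np.linalg.norm(A_hat - A0) / np.linalg.norm(A0))
```

The scaling by m1*m2/n makes the surrogate X an unbiased estimate of A_0 when each observed position is drawn uniformly at random and the noise has mean zero; the soft-thresholding step is what enforces low rank.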

Authors

Vladimir Koltchinskii, Karim Lounici, Alexandre B. Tsybakov
