Article

The Group Square-Root Lasso: Theoretical Properties and Fast Algorithms

Journal

IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 60, Issue 2, Pages 1313-1325

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2013.2290040

Keywords

Group square-root lasso; high dimensional regression; noise level; sparse regression; square-root lasso; tuning parameter

Funding

  1. NSF [DMS-10-07444, CCF-1116447]
  2. German-Swiss Research Group FOR916 [20PA20E-134495/1]
  3. Swiss National Science Foundation (SNF) [20PA20E-134495]
  4. NSF Directorate for Computer & Information Science & Engineering, Division of Computing and Communication Foundations [1116447]
  5. NSF Directorate for Mathematical & Physical Sciences, Division of Mathematical Sciences [1310119]
  6. NSF Directorate for Mathematical & Physical Sciences, Division of Mathematical Sciences [1212325]


We introduce and study the group square-root lasso (GSRL) method for estimation in high dimensional sparse regression models with group structure. The new estimator minimizes the square root of the residual sum of squares plus a penalty term proportional to the sum of the Euclidean norms of groups of the regression parameter vector. The key advantage of the method over existing group lasso-type procedures lies in the form of the proportionality factor used in the penalty term, which for GSRL is independent of the variance of the error terms. This is of crucial importance in models with more parameters than the sample size, where estimating the variance of the noise becomes as difficult as the original problem. We show that the GSRL estimator adapts to the unknown sparsity of the regression vector, and has the same optimal estimation and prediction accuracy as the group lasso (GL) estimators, under the same minimal conditions on the model. This extends the results recently established for the square-root lasso, for sparse regression without group structure. Moreover, as a new type of result for square-root lasso methods, with or without groups, we study correct pattern recovery, and show that it can be achieved under conditions similar to those needed by lasso or group-lasso-type methods, but with a simplified tuning strategy. We implement our method via a new algorithm, with proven convergence properties, which, unlike existing methods, scales well with the dimension of the problem. Our simulation studies strongly support our theoretical findings.
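The abstract defines the GSRL criterion as the square root of the residual sum of squares plus a penalty proportional to the sum of the Euclidean norms of coefficient groups. The sketch below illustrates that objective in NumPy; the function name, the group encoding (a list of index lists), and the single tuning parameter `lam` are illustrative choices, not the paper's implementation (the paper's algorithm and any group-size weighting are not reproduced here).

```python
import numpy as np

def gsrl_objective(y, X, beta, groups, lam):
    """Illustrative GSRL objective:
        ||y - X beta||_2  +  lam * sum_j ||beta_{G_j}||_2
    where each G_j is a list of coefficient indices forming one group.
    The first term is the square root of the residual sum of squares,
    so the tuning parameter lam does not need to scale with the noise
    standard deviation (the property highlighted in the abstract).
    """
    sqrt_rss = np.linalg.norm(y - X @ beta)                  # sqrt(RSS)
    penalty = sum(np.linalg.norm(beta[g]) for g in groups)   # group L2 norms
    return sqrt_rss + lam * penalty

# Toy usage: two coefficients in two singleton groups.
y = np.array([1.0, 2.0])
X = np.eye(2)
groups = [[0], [1]]
val_zero = gsrl_objective(y, X, np.zeros(2), groups, lam=0.5)  # sqrt(5)
val_fit = gsrl_objective(y, X, np.array([1.0, 2.0]), groups, lam=0.5)  # 1.5
```

Note that with `beta = 0` the objective is just the Euclidean norm of `y`, while a perfect fit pays only the penalty term; the estimator trades these off at a rate that, unlike the plain group lasso, does not require a noise-variance estimate in `lam`.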

