Article

On group-wise lp regularization: Theory and efficient algorithms

Journal

PATTERN RECOGNITION
Volume 48, Issue 11, Pages 3728-3738

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2015.05.009

Keywords

l(p) Regularization; Convex optimization algorithms; ADMM; FISTA; Algorithmic stability; Lasso; Group Lasso; Bridge regression; Group bridge regression; Splice detection


Following advances in compressed sensing and high-dimensional statistics, many pattern recognition methods have been developed with l(1) regularization, which promotes sparse solutions. In this work, we instead advocate the use of l(p) (1 < p <= 2) regularization in a group setting, which provides a better trade-off between sparsity and algorithmic stability. We focus on the simplest case with squared loss, known as group bridge regression. On the theoretical side, we prove that group bridge regression is uniformly stable and thus generalizes, an important property of a learning method. On the computational side, we make group bridge regression more practically attractive by deriving provably convergent and computationally efficient optimization algorithms. We show that for several values of p in (1, 2) the iterative update has a closed form, making the method suitable even for large-scale settings. We demonstrate the clear advantage of group bridge regression with the proposed algorithms over competitive alternatives on several datasets. As l(p) regularization allows flexible control over the sparseness/denseness of the solution, we hope that the algorithms will be useful for future applications of this regularization. (C) 2015 Elsevier Ltd. All rights reserved.
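To make the objective in the abstract concrete, the following is a minimal NumPy sketch of group bridge regression: squared loss plus a group-wise l_p penalty, minimized here by plain (sub)gradient descent. This is an illustrative reconstruction from the abstract, not the paper's ADMM/FISTA algorithms; all function names, the step size, and the regularization weight are assumptions for the example.

```python
import numpy as np

def group_lp_penalty(w, groups, p):
    """Group-wise l_p penalty: sum over groups g of ||w_g||_p."""
    return sum(np.linalg.norm(w[g], ord=p) for g in groups)

def group_bridge_objective(X, y, w, groups, lam, p):
    """Squared loss plus the group-wise l_p penalty (group bridge regression)."""
    return 0.5 * np.sum((X @ w - y) ** 2) + lam * group_lp_penalty(w, groups, p)

def fit_group_bridge(X, y, groups, lam=0.1, p=1.5, lr=1e-3, iters=3000):
    """Minimize the group bridge objective by (sub)gradient descent.

    For 1 < p <= 2 the penalty is differentiable wherever w_g != 0;
    at w_g = 0 we use the zero subgradient. (The paper instead derives
    provably convergent ADMM/FISTA schemes with closed-form updates.)
    """
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ w - y)  # gradient of the squared-loss term
        for g in groups:
            ng = np.linalg.norm(w[g], ord=p)
            if ng > 0:
                # d/dw_g ||w_g||_p = sign(w_g) |w_g|^(p-1) / ||w_g||_p^(p-1)
                grad[g] += lam * np.sign(w[g]) * np.abs(w[g]) ** (p - 1) / ng ** (p - 1)
        w -= lr * grad
    return w
```

Varying p interpolates between behaviors: p near 1 pushes whole groups toward zero (sparser, Lasso-like), while p = 2 recovers the group Lasso penalty shape (denser within selected groups), which is the sparsity/stability trade-off the abstract refers to.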
