Article

Classification With the Sparse Group Lasso

Journal

IEEE Transactions on Signal Processing
Volume 64, Issue 2, Pages 448-463

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TSP.2015.2488586

Keywords

Algorithms; statistical learning; structured sparsity; compressed sensing


Classification with a sparsity constraint on the solution plays a central role in many high-dimensional signal processing applications. In some cases, the features can be grouped together so that entire subsets of features can be selected or discarded. In many applications, however, this can be too restrictive. In this paper, we are interested in a less restrictive form of structured sparse feature selection: we assume that while features can be grouped according to some notion of similarity, not all features in a group need be selected for the task at hand. The Sparse Group Lasso (SGL) was proposed to solve problems of this form. The main contributions of this paper are a new procedure called the Sparse Overlapping Group (SOG) lasso, which extends the SGL to overlapping groups, together with theoretical sample complexity bounds for it. We establish model selection error bounds that specialize to many related settings. We experimentally validate the proposed method on both real and toy datasets.
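The SGL penalty described above combines an elementwise l1 term (within-group sparsity) with a group l2 term (group selection). As a minimal illustration of this structure, the following sketch implements the well-known proximal operator of the non-overlapping sparse group lasso penalty lam1 * ||x||_1 + lam2 * sum_g ||x_g||_2: elementwise soft-thresholding followed by group-level shrinkage. The function names and parameters are illustrative, not from the paper, and this does not implement the paper's SOG lasso for overlapping groups.

```python
import numpy as np

def soft_threshold(x, t):
    # Elementwise soft-thresholding: prox of t * ||x||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sgl_prox(x, groups, lam1, lam2):
    # Proximal operator of the sparse group lasso penalty
    #   lam1 * ||x||_1 + lam2 * sum_g ||x_g||_2,
    # assuming non-overlapping groups (each index in exactly one group).
    out = np.zeros_like(x)
    for g in groups:
        z = soft_threshold(x[g], lam1)        # within-group (elementwise) sparsity
        norm = np.linalg.norm(z)
        if norm > lam2:
            out[g] = (1.0 - lam2 / norm) * z  # shrink the surviving group
        # otherwise the entire group is zeroed out (group discarded)
    return out

# Example: the second group is discarded entirely, while within the
# first group only the large coefficient survives.
x = np.array([3.0, -0.2, 0.1, 0.2])
groups = [np.array([0, 1]), np.array([2, 3])]
result = sgl_prox(x, groups, lam1=0.4, lam2=0.5)
```

This two-step structure reflects the penalty's behavior noted in the abstract: whole groups can be discarded, yet a selected group need not retain all of its features.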

Authors

