Article

Model Selection and Minimax Estimation in Generalized Linear Models

Journal

IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 62, Issue 6, Pages 3721-3730

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TIT.2016.2555812

Keywords

Complexity penalty; generalized linear models; Kullback-Leibler risk; minimax estimator; model selection; sparsity

Funding

  1. Israel Science Foundation [ISF-820/13]

Abstract

We consider model selection in generalized linear models (GLMs) for high-dimensional data and propose a wide class of model selection criteria based on penalized maximum likelihood with a complexity penalty on the model size. We derive a general nonasymptotic upper bound for the Kullback-Leibler risk of the resulting estimators and establish the corresponding minimax lower bounds for sparse GLMs. For a properly chosen (nonlinear) penalty, the resulting penalized maximum likelihood estimator is shown to be asymptotically minimax and adaptive to the unknown sparsity. We also discuss possible extensions of the proposed approach to model selection in GLMs under additional structural constraints and to aggregation.
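To make the penalized-likelihood idea concrete, here is a minimal sketch in the Gaussian linear model (the simplest GLM, where maximum likelihood reduces to least squares). The exhaustive subset search, the RIC-type penalty k log p, and all function and variable names are illustrative assumptions for this toy example; they are not the paper's exact nonlinear penalty or algorithm.

```python
import itertools
import numpy as np

def penalized_model_selection(X, y, penalty):
    """Complexity-penalized ML model selection by exhaustive subset search
    in a Gaussian linear model with unit noise variance (a special GLM).
    `penalty(k)` is an assumed user-supplied penalty on the model size k."""
    n, p = X.shape
    best_crit, best_support = np.inf, ()
    for k in range(p + 1):
        for S in itertools.combinations(range(p), k):
            if S:
                beta, *_ = np.linalg.lstsq(X[:, list(S)], y, rcond=None)
                rss = np.sum((y - X[:, list(S)] @ beta) ** 2)
            else:
                rss = np.sum(y ** 2)  # null model: no predictors
            # negative log-likelihood (up to constants) plus complexity penalty
            crit = 0.5 * rss + penalty(k)
            if crit < best_crit:
                best_crit, best_support = crit, S
    return best_crit, best_support

# Toy example: a sparse signal with two active coefficients.
rng = np.random.default_rng(0)
n, p = 60, 8
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[[1, 4]] = [2.0, -1.5]
y = X @ beta_true + rng.standard_normal(n)

# RIC-type penalty proportional to k*log(p) (illustrative choice).
crit, support = penalized_model_selection(X, y, lambda k: 2 * k * np.log(p))
```

With a strong signal and this penalty, the selected support should recover the two active coefficients; the exhaustive search is only feasible for small p, which is why practical high-dimensional procedures replace it with greedy or convex relaxations.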
