Article

Elastic Net Regularization Paths for All Generalized Linear Models

Journal

JOURNAL OF STATISTICAL SOFTWARE
Volume 106, Issue 1, Pages 1-31

Publisher

Journal of Statistical Software
DOI: 10.18637/jss.v106.i01

Keywords

lasso; elastic net; ℓ1 penalty; regularization path; coordinate descent; generalized linear models; survival; Cox model

Abstract

The lasso and elastic net are popular regularized regression models for supervised learning. Friedman, Hastie, and Tibshirani (2010) introduced a computationally efficient algorithm for computing the elastic net regularization path for ordinary least squares regression, logistic regression, and multinomial logistic regression, while Simon, Friedman, Hastie, and Tibshirani (2011) extended this work to Cox models for right-censored data. We further extend the reach of elastic net-regularized regression to all generalized linear model families, Cox models with (start, stop] data and strata, and a simplified version of the relaxed lasso. We also discuss convenient utility functions for measuring the performance of these fitted models.
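For context, the objective behind these regularization paths can be sketched as the standard elastic net-penalized negative log-likelihood; the notation below is a generic summary of this well-known formulation, not text taken from the article:

\min_{\beta_0,\,\beta}\; -\frac{1}{n} \sum_{i=1}^{n} \ell\!\left(y_i,\; \beta_0 + x_i^\top \beta\right) \;+\; \lambda \left[ \frac{1-\alpha}{2} \|\beta\|_2^2 + \alpha \|\beta\|_1 \right]

Here \ell is the log-likelihood contribution of the chosen generalized linear model family, \alpha \in [0, 1] mixes the ridge (\alpha = 0) and lasso (\alpha = 1) penalties, and the regularization path is traced by solving this problem over a decreasing grid of \lambda values, typically by coordinate descent with warm starts.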
