Journal
JOURNAL OF STATISTICAL SOFTWARE
Volume 106, Issue 1, Pages 1-31
Publisher
JOURNAL STATISTICAL SOFTWARE
DOI: 10.18637/jss.v106.i01
Keywords
lasso; elastic net; ℓ1 penalty; regularization path; coordinate descent; generalized linear models; survival; Cox model
The lasso and elastic net are popular regularized regression models for supervised learning. Friedman, Hastie, and Tibshirani (2010) introduced a computationally efficient algorithm for computing the elastic net regularization path for ordinary least squares regression, logistic regression and multinomial logistic regression, while Simon, Friedman, Hastie, and Tibshirani (2011) extended this work to Cox models for right-censored data. We further extend the reach of the elastic net-regularized regression to all generalized linear model families, Cox models with (start, stop] data and strata, and a simplified version of the relaxed lasso. We also discuss convenient utility functions for measuring the performance of these fitted models.
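To make the key idea concrete, the sketch below illustrates the pathwise coordinate-descent strategy the abstract refers to, for the simplest case of a Gaussian (least-squares) elastic net. This is a minimal, self-contained illustration in Python/NumPy under the usual objective (1/(2n))||y − Xβ||² + λ(α||β||₁ + (1−α)/2 ||β||₂²); it is not the glmnet implementation itself, and the function names (`enet_coordinate_descent`, `enet_path`) are hypothetical. For brevity it also omits warm starts, which the actual algorithm uses when moving along the λ path.

```python
import numpy as np

def soft_threshold(z, g):
    # Soft-thresholding operator S(z, g) = sign(z) * max(|z| - g, 0)
    return np.sign(z) * np.maximum(np.abs(z) - g, 0.0)

def enet_coordinate_descent(X, y, lam, alpha=0.5, n_iter=200):
    """Cyclic coordinate descent for the elastic net objective
    (1/(2n))||y - Xb||^2 + lam * (alpha*||b||_1 + (1-alpha)/2 * ||b||_2^2)."""
    n, p = X.shape
    b = np.zeros(p)
    col_norm2 = (X ** 2).sum(axis=0) / n   # (1/n) * ||x_j||^2
    r = y - X @ b                           # current residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]             # partial residual: add x_j * b_j back
            z = X[:, j] @ r / n
            b[j] = soft_threshold(z, lam * alpha) / (col_norm2[j] + lam * (1 - alpha))
            r -= X[:, j] * b[j]
    return b

def enet_path(X, y, alpha=0.5, n_lambda=20, eps=1e-3):
    """Fit along a decreasing log-scale grid of lambdas, starting at the
    smallest lambda for which every coefficient is zero (warm starts omitted)."""
    n = X.shape[0]
    lam_max = np.max(np.abs(X.T @ y)) / (n * max(alpha, 1e-3))
    lambdas = np.logspace(np.log10(lam_max), np.log10(lam_max * eps), n_lambda)
    return lambdas, [enet_coordinate_descent(X, y, lam, alpha) for lam in lambdas]

# Illustrative usage on synthetic data with a sparse true coefficient vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([2.0, 0.0, 0.0, -1.0, 0.0]) + 0.1 * rng.normal(size=100)
lambdas, path = enet_path(X, y, alpha=1.0)  # alpha=1 is the lasso
```

At the largest λ every coefficient is zero, and as λ shrinks the active set grows; this cheap one-dimensional update per coordinate is what makes the pathwise approach fast, and it extends to other GLM families by iterating it inside a weighted-least-squares loop.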