Article

A new large-scale learning algorithm for generalized additive models

Journal

MACHINE LEARNING
Volume 112, Issue 9, Pages 3077-3104

Publisher

SPRINGER
DOI: 10.1007/s10994-023-06339-4

Keywords

Additive model; Doubly stochastic gradient; Wrapper algorithm

This paper proposes a new doubly stochastic optimization algorithm (DSGAM) for solving generalized additive models (GAM). The algorithm can scale up additive models in both sample size and dimensionality, and has been proven to have a fast convergence rate. Experimental results on large-scale benchmark datasets demonstrate the fast convergence and significant reduction in computational time compared with existing algorithms, while maintaining similar generalization performance.
The additive model plays an important role in machine learning due to the flexibility and interpretability of its prediction function. However, solving large-scale additive models is challenging, and scaling them up remains an open problem. To address this problem, we propose a new doubly stochastic optimization algorithm for solving generalized additive models (DSGAM). We first propose a generalized formulation of additive models without the orthogonality assumption on the basis functions. We then propose a wrapper algorithm to optimize the generalized additive models. Importantly, we introduce a doubly stochastic gradient algorithm (DSG) to solve an inner subproblem in the wrapper algorithm, which scales well in sample size and dimensionality simultaneously. Finally, we prove the fast convergence rate of our DSGAM algorithm. The experimental results on various large-scale benchmark datasets not only confirm the fast convergence of our DSGAM algorithm, but also show a substantial reduction in computational time compared with existing algorithms, while retaining similar generalization performance.
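The "doubly stochastic" idea described above can be illustrated with a minimal sketch: at each step, sample a random minibatch of examples (stochastic in sample size) and a random coordinate of the additive model (stochastic in dimensionality), then update only that component. This is an illustrative toy under squared loss with a fixed sine basis, not the authors' DSGAM implementation; all function and parameter names here (`dsg_additive_fit`, `n_basis`, etc.) are hypothetical.

```python
import numpy as np

def dsg_additive_fit(X, y, n_basis=10, lr=0.1, n_iters=4000, batch=32, seed=0):
    """Toy doubly stochastic gradient fit of an additive model
    f(x) = sum_j f_j(x_j), each f_j a linear combination of fixed
    sine basis functions. Each iteration samples BOTH a minibatch of
    examples and one coordinate j, so per-step cost is independent of
    the full sample size n and the dimensionality d."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    freqs = np.arange(1, n_basis + 1)   # basis: sin(k * x_j), k = 1..n_basis
    W = np.zeros((d, n_basis))          # coefficients of each component f_j

    def phi(xcol):
        # basis expansion of one coordinate: shape (m, n_basis)
        return np.sin(np.outer(xcol, freqs))

    def predict(Xm):
        return sum(phi(Xm[:, j]) @ W[j] for j in range(d))

    for _ in range(n_iters):
        idx = rng.integers(0, n, size=batch)    # stochastic over samples
        j = rng.integers(0, d)                  # stochastic over dimensions
        Xb, yb = X[idx], y[idx]
        resid = predict(Xb) - yb                # squared-loss residual
        grad = phi(Xb[:, j]).T @ resid / batch  # gradient w.r.t. component j
        W[j] -= lr * grad
    return W, predict
```

On a synthetic additive target such as y = sin(2·x0) + 0.5·sin(x1), the training error of this sketch drops well below the variance of y, illustrating how updating one randomly chosen component per step still drives the whole additive model toward the target.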
