4.4 Article

Bayesian Hierarchical Stacking: Some Models Are (Somewhere) Useful*

Journal

BAYESIAN ANALYSIS
Volume 17, Issue 4, Pages 1043-1071

Publisher

INT SOC BAYESIAN ANALYSIS
DOI: 10.1214/21-BA1287

Keywords

-

Funding

  1. National Science Foundation
  2. Institute of Education Sciences
  3. Office of Naval Research
  4. National Institutes of Health
  5. Sloan Foundation
  6. Schmidt Futures
  7. Academy of Finland Flagship programme: Finnish Center for Artificial Intelligence (FCAI)
  8. Slovenian Research Agency

Abstract

This article builds on stacking, a model averaging technique that attains asymptotically optimal predictions among linear averages of candidate models. It proposes an improvement based on a hierarchical model, extending stacking to Bayesian hierarchical stacking, and experiments show that the method achieves better predictive performance across different types of data.
Stacking is a widely used model averaging technique that asymptotically yields optimal predictions among linear averages. We show that stacking is most effective when model predictive performance is heterogeneous in inputs, and that the stacked mixture can be further improved with a hierarchical model. We generalize stacking to Bayesian hierarchical stacking: the model weights vary as a function of the data, are partially pooled, and are inferred using Bayesian inference. We further incorporate discrete and continuous inputs, other structured priors, and time series and longitudinal data. To verify the performance gain of the proposed method, we derive theoretical bounds and demonstrate the approach on several applied problems.
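As a rough illustration of how input-varying weights can be set up, the sketch below is a simplification, not the authors' implementation: the weight of each model at each data point is a softmax of a linear function of a single input feature, and the coefficients are fit by MAP optimization of the stacking log score on synthetic leave-one-out log predictive densities. The `lpd` matrix, the covariate `x`, and the `prior_scale` hyperparameter are hypothetical; the normal prior on the coefficients stands in for the partial pooling toward constant, input-independent weights, whereas the paper infers the weights with full Bayesian inference.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

# Hypothetical toy setup: n data points, K candidate models, one input feature x.
# lpd[i, k] holds the leave-one-out log predictive density of model k at point i
# (in practice these would come from cross-validation or PSIS-LOO).
rng = np.random.default_rng(0)
n, K = 200, 3
x = rng.uniform(-2, 2, size=n)
# Simulate heterogeneous model performance: which model fits best depends on x.
true_shift = np.stack([x, -x, 0.3 * np.ones(n)], axis=1)
lpd = -0.5 * (rng.normal(size=(n, K)) - true_shift) ** 2

def unpack(theta):
    # Weight of model k at point i: softmax over alpha_k + beta_k * x_i,
    # with model K pinned at 0 for identifiability.
    alpha = np.concatenate([theta[:K - 1], [0.0]])
    beta = np.concatenate([theta[K - 1:], [0.0]])
    return alpha, beta

def neg_log_posterior(theta, prior_scale=1.0):
    alpha, beta = unpack(theta)
    logits = alpha[None, :] + x[:, None] * beta[None, :]          # shape (n, K)
    log_w = logits - logsumexp(logits, axis=1, keepdims=True)      # log weights
    # Stacking objective: pointwise log score of the weighted mixture.
    log_score = logsumexp(log_w + lpd, axis=1).sum()
    # Normal prior on the varying coefficients acts as partial pooling
    # toward equal, input-independent weights.
    log_prior = -0.5 * np.sum(theta ** 2) / prior_scale ** 2
    return -(log_score + log_prior)

theta_hat = minimize(neg_log_posterior, np.zeros(2 * (K - 1)), method="L-BFGS-B").x
alpha_hat, beta_hat = unpack(theta_hat)
print("intercepts:", alpha_hat.round(2), "slopes:", beta_hat.round(2))
```

With this toy data, the fitted slopes shift weight toward whichever model performs better in each region of x, which is the heterogeneity the abstract refers to.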

