Article

Challenges in Markov Chain Monte Carlo for Bayesian Neural Networks

Journal

STATISTICAL SCIENCE
Volume 37, Issue 3, Pages 425-442

Publisher

INST MATHEMATICAL STATISTICS-IMS
DOI: 10.1214/21-STS840

Keywords

Bayesian inference; Bayesian neural networks; convergence diagnostics; Markov chain Monte Carlo; posterior predictive distribution

Funding

  1. Laboratory Directed Research and Development Program of Oak Ridge National Laboratory
  2. U.S. Department of Energy [DE-AC05-00OR22725]

Abstract

This paper reviews the challenges of using MCMC methods in Bayesian neural networks and shows that valuable posterior predictive distributions can be obtained through MCMC sampling.
Markov chain Monte Carlo (MCMC) methods have not been broadly adopted in Bayesian neural networks (BNNs). This paper initially reviews the main challenges in sampling from the parameter posterior of a neural network via MCMC. Such challenges culminate in a lack of convergence to the parameter posterior. Nevertheless, this paper shows that a nonconverged Markov chain, generated via MCMC sampling from the parameter space of a neural network, can yield via Bayesian marginalization a valuable posterior predictive distribution of the output of the neural network. Classification examples based on multilayer perceptrons showcase highly accurate posterior predictive distributions. The postulate of limited scope for MCMC developments in BNNs is partially valid; an asymptotically exact parameter posterior seems less plausible, yet an accurate posterior predictive distribution is a tenable research avenue.
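The Bayesian marginalization step described in the abstract averages the network's predictive distribution over MCMC draws of the parameters: p(y*|x*, D) ≈ (1/S) Σ_s p(y*|x*, θ_s). The following toy sketch illustrates the idea with a tiny multilayer perceptron and a random-walk Metropolis sampler; the network architecture, data, prior, and sampler settings are invented for illustration and are not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable binary classification data (hypothetical example).
X = rng.normal(size=(40, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def unpack(theta):
    # Single-hidden-layer MLP: 2 inputs -> 4 hidden units -> 1 output.
    W1 = theta[:8].reshape(2, 4)
    b1 = theta[8:12]
    W2 = theta[12:16].reshape(4, 1)
    b2 = theta[16]
    return W1, b1, W2, b2

def predict_prob(theta, X):
    # Forward pass returning P(y=1 | x, theta).
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1 + b1)
    logits = (h @ W2).ravel() + b2
    return 1.0 / (1.0 + np.exp(-logits))

def log_posterior(theta):
    # Bernoulli likelihood plus a standard normal prior on all parameters.
    p = predict_prob(theta, X)
    eps = 1e-12
    log_lik = np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    log_prior = -0.5 * np.sum(theta ** 2)
    return log_lik + log_prior

# Random-walk Metropolis over the 17 network parameters.
dim = 17
theta = 0.1 * rng.normal(size=dim)
lp = log_posterior(theta)
samples = []
for t in range(3000):
    prop = theta + 0.05 * rng.normal(size=dim)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        theta, lp = prop, lp_prop
    if t >= 1000 and t % 10 == 0:  # keep thinned, post-burn-in draws
        samples.append(theta.copy())

# Bayesian marginalization: average the predictive probability over draws.
x_star = np.array([[1.0, 1.0]])
post_pred = np.mean([predict_prob(s, x_star)[0] for s in samples])
print(f"posterior predictive P(y=1 | x*) = {post_pred:.2f}")
```

The key point, matching the abstract's argument, is that `post_pred` averages over the retained chain states rather than requiring the chain to have converged to the exact parameter posterior: even a nonconverged chain can yield a useful predictive distribution.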

