Article

Approximate blocked Gibbs sampling for Bayesian neural networks

Journal

STATISTICS AND COMPUTING
Volume 33, Issue 5, Pages -

Publisher

SPRINGER
DOI: 10.1007/s11222-023-10285-5

Keywords

Approximate MCMC; Bayesian inference; Bayesian neural networks; Blocked Gibbs sampling; Minibatch sampling; Posterior predictive distribution

Abstract

In this work, minibatch MCMC sampling for feedforward neural networks is made more feasible. To this end, it is proposed to sample subgroups of parameters via a blocked Gibbs sampling scheme. By partitioning the parameter space, sampling is possible irrespective of layer width. It is also possible to alleviate vanishing acceptance rates for increasing depth by reducing the proposal variance in deeper layers. Increasing the length of a non-convergent chain increases the predictive accuracy in classification tasks, so avoiding vanishing acceptance rates and consequently enabling longer chain runs have practical benefits. Moreover, non-convergent chain realizations aid in the quantification of predictive uncertainty. An open problem is how to perform minibatch MCMC sampling for feedforward neural networks in the presence of augmented data.
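The abstract's core idea can be illustrated with a minimal sketch: parameters are partitioned into per-layer blocks, each block is updated with a Metropolis step inside a Gibbs sweep, the likelihood is estimated on a minibatch (making the sampler approximate), and the proposal scale shrinks with depth to counter vanishing acceptance rates. This is not the paper's exact algorithm; the network size, prior, variance schedule `0.05 / (k + 1)`, and minibatch rescaling factor are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic binary-classification data (illustrative only).
X = rng.normal(size=(40, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def forward(X, layers):
    """Feedforward pass; `layers` is a list of (W, b) blocks."""
    h = X
    for W, b in layers[:-1]:
        h = np.tanh(h @ W + b)
    W, b = layers[-1]
    return 1.0 / (1.0 + np.exp(-(h @ W + b).ravel()))

def log_post(layers, X, y, idx, n_total):
    """Unnormalised log-posterior with a minibatch likelihood estimate
    (standard N(0, 1) prior on all weights and biases, assumed here)."""
    p = np.clip(forward(X[idx], layers), 1e-12, 1 - 1e-12)
    loglik = np.sum(y[idx] * np.log(p) + (1 - y[idx]) * np.log(1 - p))
    loglik *= n_total / len(idx)  # rescale minibatch to the full data set
    logprior = sum(-0.5 * (np.sum(W**2) + np.sum(b**2)) for W, b in layers)
    return loglik + logprior

# A 2-16-1 network: parameters grouped into two per-layer blocks.
layers = [(rng.normal(scale=0.5, size=(2, 16)), np.zeros(16)),
          (rng.normal(scale=0.5, size=(16, 1)), np.zeros(1))]

n_iters, batch = 200, 20
accepts = np.zeros(len(layers))
for _ in range(n_iters):
    idx = rng.choice(len(X), size=batch, replace=False)  # minibatch
    for k, (W, b) in enumerate(layers):
        # Depth-dependent proposal scale: smaller steps in deeper layers
        # to alleviate vanishing acceptance rates (schedule is assumed).
        step = 0.05 / (k + 1)
        prop = list(layers)
        prop[k] = (W + step * rng.normal(size=W.shape),
                   b + step * rng.normal(size=b.shape))
        delta = (log_post(prop, X, y, idx, len(X))
                 - log_post(layers, X, y, idx, len(X)))
        if np.log(rng.uniform()) < delta:  # Metropolis accept/reject
            layers, accepts[k] = prop, accepts[k] + 1

print("per-block acceptance rates:", accepts / n_iters)
```

Averaging `forward(X, layers)` over retained (possibly non-convergent) chain states would give the Monte Carlo approximation of the posterior predictive distribution used for classification and uncertainty quantification.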

