Journal
NEUROCOMPUTING
Volume 500, Pages 528-536
Publisher
ELSEVIER
DOI: 10.1016/j.neucom.2022.05.071
Keywords
Bayesian inference; Free energy; KL divergence; Generalization loss
In this study, by theoretical derivation, we investigate the asymptotic behaviors of the generalization loss and the free energy in Bayesian inference when there are multiple optimal probability distributions, revealing differences from conventional asymptotic analysis.
Bayesian inference is a widely used statistical method. The free energy and the generalization loss, which are used to estimate the accuracy of Bayesian inference, are known to be small in singular models that do not have a unique optimal parameter. However, their behavior has not yet been clarified when there are multiple optimal probability distributions. In this paper, we theoretically derive the asymptotic behaviors of the generalization loss and the free energy when the optimal probability distribution is not unique, and show that they contain terms that differ asymptotically from those of the conventional analysis. (C) 2022 Published by Elsevier B.V.
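For context, the "conventional asymptotic analysis" referred to in the abstract is the standard result of singular learning theory, in which the free energy and the expected generalization loss admit the following expansions (this is background from the literature, not a result of this paper; λ denotes the real log canonical threshold, m its multiplicity, S_n the empirical entropy, and L_0 the minimum achievable loss):

```latex
% Conventional asymptotics (singular learning theory background):
% free energy
F_n = n S_n + \lambda \log n - (m-1) \log\log n + O_p(1),
% expected generalization loss
\mathbb{E}[G_n] = L_0 + \frac{\lambda}{n} + o\!\left(\frac{1}{n}\right).
```

The paper's contribution is to show that when the optimal probability distribution is not unique, additional terms beyond these appear in the asymptotic expansions.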