Article

Extended variational inference for gamma mixture model in positive vectors modeling

Journal

NEUROCOMPUTING
Volume 432, Issue -, Pages 145-158

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2020.12.042

Keywords

Bayesian estimation; Gamma mixture model; Extended variational inference; Object detection; Image categorization

Funding

  1. Beijing Natural Science Foundation (BNSF) [4194076, 4172019, 4182018]
  2. Joint of Beijing Natural Science Foundation and Education Commission (JBNSFEC) [KZ20181000901]
  3. Shanghai Planning Office of Philosophy and Social Science [2019EGL018]
  4. Key Technologies R&D Program of He'nan Province [212102210084]
  5. National Natural Science Foundation of China (NSFC) [61802094, 71942003, 72004139]

Abstract

Bayesian estimation of the finite Gamma mixture model (GaMM) has attracted considerable attention recently due to its capability of modeling positive data. Under conventional variational inference (VI) frameworks, no analytically tractable solution for the variational posterior can be derived, since the expectation of the joint distribution of all the random variables cannot be estimated in closed form. Therefore, numerical techniques are commonly utilized to simulate the posterior distribution; however, the optimization process of these methods can be prohibitively slow for practical applications. To obtain closed-form solutions, lower-bound approximations are introduced into the evidence lower bound (ELBO), following the recently proposed extended variational inference (EVI) framework, which overcomes the need for numerical simulation. In this paper, we address the Bayesian estimation of the finite GaMM under the EVI framework in a flexible way. Moreover, the optimal number of mixture components is determined automatically from the observed data, avoiding the over-fitting problem associated with conventional expectation-maximization (EM). We demonstrate the excellent performance of the proposed method in evaluations on both synthesized and real data. In the real data evaluation, we compare the proposed method with reference methods on object detection and image categorization tasks and find statistically significant improvements in accuracy and runtime. © 2020 Elsevier B.V. All rights reserved.
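To make the modeling task concrete, the sketch below fits a finite gamma mixture to positive data with a plain EM loop using moment-matched M-step updates. This is an illustrative baseline only, not the paper's EVI algorithm: the function names, the moment-matching update for the shape/scale parameters, and the chunk-based initialization are all assumptions for the example, and unlike the proposed method it does not select the number of components automatically.

```python
import math
import random

def gamma_pdf(x, k, theta):
    """Gamma density with shape k and scale theta, computed in log space."""
    return math.exp((k - 1) * math.log(x) - x / theta
                    - k * math.log(theta) - math.lgamma(k))

def fit_gamma_mixture(data, n_components=2, n_iter=50):
    """Crude EM for a gamma mixture; M-step uses moment matching
    (shape = mean^2 / var, scale = var / mean) instead of exact ML."""
    data = sorted(data)
    n = len(data)
    # Initialize each component from an equal-sized chunk of the sorted data.
    chunks = [data[i * n // n_components:(i + 1) * n // n_components]
              for i in range(n_components)]
    weights = [1.0 / n_components] * n_components
    params = []
    for c in chunks:
        m = sum(c) / len(c)
        v = sum((x - m) ** 2 for x in c) / len(c) or 1e-6
        params.append((m * m / v, v / m))
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each point.
        resp = []
        for x in data:
            p = [w * gamma_pdf(x, k, th) for w, (k, th) in zip(weights, params)]
            s = sum(p) or 1e-300
            resp.append([pi / s for pi in p])
        # M-step: weighted moment matching per component.
        new_params, new_weights = [], []
        for j in range(n_components):
            rj = [r[j] for r in resp]
            nj = sum(rj)
            m = sum(r * x for r, x in zip(rj, data)) / nj
            v = sum(r * (x - m) ** 2 for r, x in zip(rj, data)) / nj or 1e-6
            new_params.append((m * m / v, v / m))
            new_weights.append(nj / n)
        params, weights = new_params, new_weights
    return weights, params

# Usage: recover two well-separated gamma components.
random.seed(0)
data = ([random.gammavariate(2.0, 1.0) for _ in range(500)]     # mean 2
        + [random.gammavariate(20.0, 1.0) for _ in range(500)])  # mean 20
weights, params = fit_gamma_mixture(data)
means = sorted(k * th for k, th in params)
```

With components this well separated, the fitted component means land near 2 and 20 and the mixture weights near 0.5 each; for overlapping components or an unknown component count, the Bayesian EVI treatment described in the abstract is precisely what this naive EM lacks.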

