Article

Extended variational inference for gamma mixture model in positive vectors modeling

Journal

NEUROCOMPUTING
Volume 432, Pages 145-158

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2020.12.042

Keywords

Bayesian estimation; Gamma mixture model; Extended variational inference; Object detection; Image categorization

Funding

  1. Beijing Natural Science Foundation (BNSF) [4194076, 4172019, 4182018]
  2. Joint of Beijing Natural Science Foundation and Education Commission (JBNSFEC) [KZ20181000901]
  3. Shanghai Planning Office of Philosophy and Social Science [2019EGL018]
  4. Key Technologies R&D Program of Henan Province [212102210084]
  5. National Natural Science Foundation of China (NSFC) [61802094, 71942003, 72004139]

Abstract
Bayesian estimation of the finite Gamma mixture model (GaMM) has attracted considerable attention recently due to its capability of modeling positive data. Under conventional variational inference (VI) frameworks, an analytically tractable solution for the variational posterior cannot be derived, because the expectation of the joint distribution of all the random variables cannot be evaluated in closed form. Numerical techniques are therefore commonly used to simulate the posterior distribution, but their optimization can be prohibitively slow for practical applications. To obtain closed-form solutions, lower-bound approximations are introduced into the evidence lower bound (ELBO), following the recently proposed extended variational inference (EVI) framework, which removes the need for numerical simulation. In this paper, we address the Bayesian estimation of the finite GaMM under the EVI framework in a flexible way. Moreover, the optimal number of mixture components is determined automatically from the observed data, which overcomes the over-fitting problem associated with conventional expectation-maximization (EM). We demonstrate the excellent performance of the proposed method in evaluations on synthesized and real data. In the real-data evaluation, we compare the proposed method with reference methods on object detection and image categorization tasks and find statistically significant improvements in accuracy and runtime. © 2020 Elsevier B.V. All rights reserved.
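To make the modeling setting concrete, the sketch below fits a two-component Gamma mixture to synthetic positive data using a conventional EM loop with a moment-matching M-step (the Gamma shape parameter has no closed-form maximum-likelihood update). This is a generic baseline of the kind the paper contrasts with, not the proposed EVI algorithm; all parameter values, component counts, and variable names here are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)
# Synthetic positive data drawn from two Gamma components (illustrative only)
x = np.concatenate([
    rng.gamma(shape=2.0, scale=1.0, size=500),
    rng.gamma(shape=9.0, scale=0.5, size=500),
])

K = 2
pi = np.full(K, 1.0 / K)          # mixing weights
shape = np.array([1.0, 5.0])       # initial Gamma shape parameters
scale = np.array([1.0, 1.0])       # initial Gamma scale parameters

for _ in range(200):
    # E-step: responsibilities r[i, k] = p(component k | x_i)
    dens = np.stack(
        [pi[k] * gamma.pdf(x, a=shape[k], scale=scale[k]) for k in range(K)],
        axis=1,
    )
    r = dens / dens.sum(axis=1, keepdims=True)

    # M-step: update weights, then match each component's Gamma parameters
    # to the responsibility-weighted mean and variance (moment matching)
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mean = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mean) ** 2).sum(axis=0) / nk
    shape = mean ** 2 / var
    scale = var / mean

print("weights:", pi, "shapes:", shape, "scales:", scale)
```

Note that in this baseline the number of components K is fixed in advance and must be selected by hand; automatically inferring it from the data, as the paper's EVI-based method does, is precisely what this kind of EM estimator lacks.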

