Article

Variational inference and sparsity in high-dimensional deep Gaussian mixture models

Journal

Statistics and Computing
Volume 32, Issue 5

Publisher

Springer
DOI: 10.1007/s11222-022-10132-z

Keywords

Deep clustering; High-dimensional clustering; Horseshoe prior; Mixtures of factor analyzers; Natural gradient; Variational approximation

Funding

  1. NUS/BER Research Partnership Grant by the National University of Singapore
  2. Berlin University Alliance
  3. German Research Foundation (DFG) through the Emmy Noether Grant [KL 3037/1-1]
  4. Volkswagenstiftung [96932]

Abstract

This research introduces a new clustering method that combines mixtures of factor analyzers, sparse priors, and Bayesian inference to handle high-dimensional problems. Experiments demonstrate the effectiveness of the proposed method on both simulated and real data.
Gaussian mixture models are a popular tool for model-based clustering, and mixtures of factor analyzers are Gaussian mixture models with a parsimonious factor covariance structure for the mixture components. Several recent extensions replace the Gaussian model for the latent factors with another mixture of factor analyzers, and this construction can be iterated to obtain a deep model with many layers. These deep models are challenging to fit, and we consider Bayesian inference using sparsity priors to further regularize the estimation. A scalable natural gradient variational inference algorithm is developed for fitting the model, and we suggest computationally efficient approaches to the architecture choice using overfitted mixtures, in which unnecessary components drop out during estimation. In a number of simulated examples and two real examples, we demonstrate the versatility of our approach for high-dimensional problems and show that sparsity-inducing priors can improve clustering results.
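To make the parsimonious covariance structure concrete, the following is a minimal sketch of a single-layer mixture of factor analyzers, where each component k has covariance Sigma_k = Lambda_k Lambda_k^T + Psi_k (loadings Lambda_k, diagonal noise Psi_k). All parameter values and names here are hypothetical illustrations; this shows only the shallow MFA density and E-step-style cluster responsibilities, not the paper's deep Bayesian model or its natural gradient variational inference algorithm.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

p, q, K = 10, 2, 2  # observed dimension, latent factor dimension, components

# Hypothetical MFA parameters: mixing weights pi, component means mu_k,
# factor loadings Lambda_k (p x q), and diagonal noise covariances Psi_k.
pi = np.array([0.5, 0.5])
mu = np.stack([np.zeros(p), 3.0 * np.ones(p)])
Lam = rng.normal(size=(K, p, q))
Psi = np.stack([np.diag(rng.uniform(0.5, 1.0, size=p)) for _ in range(K)])

def responsibilities(X):
    """Posterior cluster probabilities under the MFA mixture density."""
    # Each component's covariance has the parsimonious form
    # Sigma_k = Lambda_k Lambda_k^T + Psi_k (only p*q + p free parameters).
    logp = np.stack([
        np.log(pi[k])
        + multivariate_normal(mu[k], Lam[k] @ Lam[k].T + Psi[k]).logpdf(X)
        for k in range(K)
    ], axis=1)
    logp -= logp.max(axis=1, keepdims=True)  # stabilize before exponentiating
    r = np.exp(logp)
    return r / r.sum(axis=1, keepdims=True)

# Simulate data from component 1 via its latent factors and inspect assignments.
z = rng.normal(size=(5, q))
X = mu[1] + z @ Lam[1].T + rng.normal(size=(5, p)) * np.sqrt(np.diag(Psi[1]))
print(responsibilities(X).argmax(axis=1))
```

In the deep extension described above, the Gaussian factor distribution for z would itself be replaced by another mixture of factor analyzers, and the parameters would be given sparsity priors and estimated by variational inference rather than fixed as here.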

