Article

Fast Online EM for Big Topic Modeling

Journal

IEEE Transactions on Knowledge and Data Engineering

Publisher

IEEE Computer Society
DOI: 10.1109/TKDE.2015.2492565

Keywords

Latent Dirichlet allocation; online expectation-maximization; big data; big model; lifelong topic modeling

Funding

  1. NSFC [61373092, 61033013, 61272449, 61572339]
  2. Natural Science Foundation of Jiangsu Higher Education Institutions of China [12KJA520004]
  3. Innovative Research Team in Soochow University [SDT2012B02]
  4. GRF grant from RGC UGC Hong Kong (GRF) [9041574]
  5. City University of Hong Kong [7008026]

Abstract

The expectation-maximization (EM) algorithm can compute the maximum-likelihood (ML) or maximum a posteriori (MAP) point estimate of mixture models or latent variable models such as latent Dirichlet allocation (LDA), which has been one of the most popular probabilistic topic modeling methods in the past decade. However, batch EM has high time and space complexities when learning big LDA models from big data streams. In this paper, we present a fast online EM (FOEM) algorithm that infers the topic distribution of previously unseen documents incrementally with constant memory requirements. Within the stochastic approximation framework, we show that FOEM converges to a local stationary point of the LDA likelihood function. Through dynamic scheduling for speed and parameter streaming for low memory usage, FOEM is more efficient for some lifelong topic modeling tasks than state-of-the-art online LDA algorithms in handling both big data and big models (a.k.a. big topic modeling) on just a PC.
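To make the incremental idea concrete, below is a minimal sketch of a generic online-EM update for an LDA-style topic model: an E-step estimates per-word topic responsibilities for one incoming document, and an M-step blends the resulting sufficient statistics into the topic-word matrix with a step size `rho`. This is an illustration of the stochastic-approximation scheme in general, not the authors' FOEM implementation; all names (`online_em_lda_step`, `phi`, `rho`, `n_inner`) are placeholders introduced here.

```python
import numpy as np

def online_em_lda_step(phi, doc, alpha, rho, n_inner=20):
    """One online E/M update from a single document (illustrative sketch).

    phi   : (K, V) topic-word probabilities, each row summing to 1
    doc   : list of (word_id, count) pairs
    alpha : symmetric Dirichlet prior on topic proportions
    rho   : step size in (0, 1], e.g. rho_t = (tau0 + t) ** -kappa
    """
    K, V = phi.shape
    words = np.array([w for w, _ in doc])
    counts = np.array([c for _, c in doc], dtype=float)

    # E-step: fixed-point iteration for per-word topic responsibilities
    theta = np.full(K, 1.0 / K)                      # doc topic proportions
    for _ in range(n_inner):
        gamma = theta[:, None] * phi[:, words]       # (K, n_unique_words)
        gamma /= gamma.sum(axis=0, keepdims=True)
        theta = (gamma * counts).sum(axis=1) + alpha
        theta /= theta.sum()

    # M-step: blend normalized expected counts into phi (stochastic approximation)
    ss = np.zeros_like(phi)
    np.add.at(ss.T, words, (gamma * counts).T)       # expected topic-word counts
    ss /= ss.sum(axis=1, keepdims=True) + 1e-12      # normalize per topic
    return (1.0 - rho) * phi + rho * ss
```

Because each document's statistics are folded into `phi` and then discarded, memory stays constant in the number of documents, which is the property the abstract highlights; a decaying schedule such as `rho_t = (tau0 + t) ** -kappa` with `kappa` in (0.5, 1] is the usual choice for convergence in this framework.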
