Article

Fast Online EM for Big Topic Modeling

Journal

IEEE Transactions on Knowledge and Data Engineering

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TKDE.2015.2492565

Keywords

Latent Dirichlet allocation; online expectation-maximization; big data; big model; lifelong topic modeling

Funding

  1. NSFC [61373092, 61033013, 61272449, 61572339]
  2. Natural Science Foundation of Jiangsu Higher Education Institutions of China [12KJA520004]
  3. Innovative Research Team in Soochow University [SDT2012B02]
  4. GRF grant from RGC UGC Hong Kong (GRF) [9041574]
  5. City University of Hong Kong [7008026]

Abstract

The expectation-maximization (EM) algorithm can compute the maximum-likelihood (ML) or maximum a posteriori (MAP) point estimate of mixture models or latent variable models such as latent Dirichlet allocation (LDA), which has been one of the most popular probabilistic topic modeling methods over the past decade. However, batch EM has high time and space complexities when learning big LDA models from big data streams. In this paper, we present a fast online EM (FOEM) algorithm that infers topic distributions from previously unseen documents incrementally with a constant memory requirement. Within the stochastic approximation framework, we show that FOEM converges to a local stationary point of LDA's likelihood function. Through dynamic scheduling for high speed and parameter streaming for low memory usage, FOEM handles both big data and big models (i.e., big topic modeling) on just a PC, and is more efficient on some lifelong topic modeling tasks than state-of-the-art online LDA algorithms.
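To make the abstract's idea concrete, the following is a minimal, illustrative sketch of online EM for an LDA-style model, not the paper's FOEM algorithm: each document triggers a local E-step (fixed-point responsibility updates) followed by a stochastic M-step that blends the new sufficient statistics into the global word-topic matrix with a decaying step size, so memory stays constant in the number of documents. All function and parameter names here (`online_em_lda`, `tau`, `kappa`, etc.) are hypothetical.

```python
import numpy as np

def online_em_lda(docs, V, K, alpha=0.1, beta=0.01, tau=1.0, kappa=0.6, seed=0):
    """Illustrative online EM for an LDA-style model (NOT the paper's FOEM).

    docs: list of documents, each a list of word ids in [0, V).
    The global word-topic statistics are updated with a decaying step size
    rho_t = (t + tau)**(-kappa), as in stochastic-approximation EM.
    """
    rng = np.random.default_rng(seed)
    # Global expected word-topic counts (the only state kept across documents).
    nwk = rng.gamma(1.0, 1.0, size=(V, K))
    for t, doc in enumerate(docs, start=1):
        rho = (t + tau) ** (-kappa)                    # decaying step size
        phi = (nwk + beta) / (nwk.sum(0) + V * beta)   # topic-word probabilities
        theta = np.full(K, alpha)                      # doc-topic pseudo-counts
        # Local E-step: a few fixed-point passes over this document only.
        for _ in range(5):
            resp = phi[doc] * theta                    # (len(doc), K) responsibilities
            resp /= resp.sum(1, keepdims=True)
            theta = alpha + resp.sum(0)
        # Stochastic M-step: blend this document's statistics into nwk.
        stat = np.zeros((V, K))
        np.add.at(stat, doc, resp)                     # scatter-add per word id
        nwk = (1 - rho) * nwk + rho * len(docs) * stat # rescale one doc to corpus size
    return nwk

docs = [[0, 1, 2, 1], [3, 4, 3, 4], [0, 2, 1, 0]]
nwk = online_em_lda(docs, V=5, K=2)
print(nwk.shape)  # (5, 2)
```

Because only `nwk` persists between documents, the memory footprint is O(VK) regardless of stream length, which is the property the abstract refers to as constant memory; the paper's actual dynamic scheduling and parameter streaming are additional mechanisms not shown here.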
