Journal
MACHINE LEARNING
Volume 95, Issue 3, Pages 423-469
Publisher
SPRINGER
DOI: 10.1007/s10994-013-5413-0
Keywords
Topic models; Latent Dirichlet Allocation; Feedback; Interactive topic modeling; Online learning; Gibbs sampling
Funding
- National Science Foundation, Division of Computing and Communication Foundations [1018625]
- National Science Foundation, Division of Information & Intelligent Systems [0705832]
- Army Research Laboratory [W911NF-09-2-0072]
Topic models are a useful and ubiquitous tool for understanding large corpora. However, topic models are not perfect, and for many users in computational social science, digital humanities, and information studies, who are not machine learning experts, existing models and frameworks are often a take-it-or-leave-it proposition. This paper presents a mechanism for giving users a voice by encoding their feedback to topic models as correlations between words. This framework, interactive topic modeling (ITM), allows untrained users to encode their feedback easily and iteratively into topic models. Because low latency is crucial in interactive systems, we develop more efficient inference algorithms for tree-based topic models. We validate the framework with both simulated and real users.
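The abstract refers to Gibbs-sampling inference for topic models, which ITM extends to tree-based priors that encode word correlations. As background, the sketch below shows a minimal collapsed Gibbs sampler for plain LDA; it is an illustrative toy, not the authors' implementation, and the function name, parameters, and corpus format (documents as lists of word-index tokens) are all assumptions made for this example.

```python
import numpy as np

def gibbs_lda(docs, vocab_size, n_topics, n_iters=50, alpha=0.1, beta=0.01, seed=0):
    """Toy collapsed Gibbs sampler for vanilla LDA (illustrative sketch only).

    docs: list of documents, each a list of integer word ids in [0, vocab_size).
    Returns the topic-word and document-topic count matrices.
    """
    rng = np.random.default_rng(seed)
    n_tw = np.zeros((n_topics, vocab_size))  # topic-word counts
    n_dt = np.zeros((len(docs), n_topics))   # document-topic counts
    n_t = np.zeros(n_topics)                 # tokens per topic

    # Randomly assign an initial topic to every token and tally the counts.
    z = []
    for d, doc in enumerate(docs):
        zd = rng.integers(n_topics, size=len(doc))
        z.append(zd)
        for w, t in zip(doc, zd):
            n_tw[t, w] += 1
            n_dt[d, t] += 1
            n_t[t] += 1

    for _ in range(n_iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]
                # Remove the token's current assignment from the counts.
                n_tw[t, w] -= 1; n_dt[d, t] -= 1; n_t[t] -= 1
                # Collapsed conditional:
                # p(z=t | rest) ∝ (n_dt + alpha) * (n_tw + beta) / (n_t + V*beta)
                p = (n_dt[d] + alpha) * (n_tw[:, w] + beta) / (n_t + vocab_size * beta)
                t = rng.choice(n_topics, p=p / p.sum())
                z[d][i] = t
                n_tw[t, w] += 1; n_dt[d, t] += 1; n_t[t] += 1
    return n_tw, n_dt
```

In ITM, the flat `(n_tw + beta)` term would be replaced by a tree-structured prior whose internal nodes group correlated words, which is what makes the paper's faster inference for tree-based models necessary.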