Article

Topic Modeling for Interpretable Text Classification From EHRs

Journal

FRONTIERS IN BIG DATA
Volume 5, Issue -, Pages -

Publisher

FRONTIERS MEDIA SA
DOI: 10.3389/fdata.2022.846930

Keywords

text classification; topic modeling; explainability; interpretability; electronic health records; psychiatry; natural language processing; information extraction

Funding

  1. TU/e
  2. WUR
  3. UU
  4. UMC Utrecht

The article discusses the use of topic models for text classification of clinical notes in predictive tasks and how to select a suitable topic model. The study finds no correlation between interpretability and predictive performance: the proposed fuzzy topic modeling algorithm (FLSA-W) shows the best interpretability, while two state-of-the-art methods (ProdLDA and LSI) perform best in predictive performance.
Clinical notes in electronic health records offer many possibilities for predictive text classification tasks. The interpretability of these classification models is critical for decision making in the clinical domain. Using topic models for text classification of electronic health records allows the topics to serve as features, making the classification more interpretable. However, selecting the most effective topic model is not trivial. In this work, we propose considerations for selecting a suitable topic model based on predictive performance and an interpretability measure for text classification. We compare 17 different topic models in terms of both interpretability and predictive performance on an inpatient violence prediction task using clinical notes. We find no correlation between interpretability and predictive performance. In addition, our results show that although no model outperforms the others on both variables, our proposed fuzzy topic modeling algorithm (FLSA-W) performs best in most settings for interpretability, whereas two state-of-the-art methods (ProdLDA and LSI) achieve the best predictive performance.
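To illustrate the topics-as-features idea described in the abstract, the sketch below trains a topic model on tokenized notes and feeds the per-document topic proportions into a linear classifier whose coefficients map back to topics. This is a minimal illustration, not the authors' pipeline: it assumes gensim and scikit-learn, uses plain LDA with two topics as a stand-in for the 17 models the paper compares (including FLSA-W, ProdLDA, and LSI), and the toy documents and labels are placeholders for real clinical notes.

```python
# Minimal sketch (not the authors' implementation): topic proportions as
# interpretable features for a downstream classifier.
from gensim import corpora, models
from sklearn.linear_model import LogisticRegression

# Toy tokenized notes and binary labels (placeholders for real EHR data).
docs = [["patient", "agitated", "shouting"],
        ["calm", "cooperative", "discharged"],
        ["threatened", "staff", "restrained"],
        ["slept", "well", "no", "incidents"]]
labels = [1, 0, 1, 0]

dictionary = corpora.Dictionary(docs)
bow_corpus = [dictionary.doc2bow(doc) for doc in docs]

# Train a topic model; LDA here stands in for whichever model is selected.
num_topics = 2
lda = models.LdaModel(bow_corpus, num_topics=num_topics,
                      id2word=dictionary, random_state=0)

def topic_features(bow):
    # Represent a note by its full topic distribution.
    dist = dict(lda.get_document_topics(bow, minimum_probability=0.0))
    return [dist.get(k, 0.0) for k in range(num_topics)]

X = [topic_features(bow) for bow in bow_corpus]

# A linear classifier over topic proportions keeps each coefficient
# attributable to a human-readable topic.
clf = LogisticRegression().fit(X, labels)
print(clf.coef_)
```

Because the feature space is the set of topics rather than raw word counts, each classifier weight can be read alongside the top words of its topic, which is the sense in which such a pipeline is more interpretable than a bag-of-words model.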
