Proceedings Paper

ClinicalRadioBERT: Knowledge-Infused Few Shot Learning for Clinical Notes Named Entity Recognition

Journal

MACHINE LEARNING IN MEDICAL IMAGING, MLMI 2022
Volume 13583, Pages 269-278

Publisher

SPRINGER INTERNATIONAL PUBLISHING AG
DOI: 10.1007/978-3-031-21014-3_28

Keywords

Biomedical language models; Natural language processing; Named entity recognition


This paper presents ClinicalRadioBERT, a BERT-based model for analyzing clinical notes. By pretraining and fine-tuning on radiotherapy literature, as well as incorporating knowledge-infused few-shot learning, the model demonstrates superior performance in few-shot named entity recognition.
Transformer-based language models such as BERT have been widely applied to many domains through model pretraining and fine-tuning. However, in low-resource scenarios such as clinical cases, customizing a BERT-based language model is still a challenging task. In this paper, we focus on the radiotherapy domain and train a ClinicalRadioBERT model for analyzing clinical notes through a two-step procedure. First, we fine-tune a BioBERT model on the full texts of radiotherapy literature and name this model RadioBERT. Second, we propose a knowledge-infused few-shot learning (KI-FSL) approach that leverages domain knowledge and trains the ClinicalRadioBERT model for understanding radiotherapy clinical notes. We evaluate ClinicalRadioBERT on a newly collected clinical notes dataset and demonstrate its superiority over baselines on few-shot named entity recognition. We will apply ClinicalRadioBERT to link BERT and medical imaging for radiotherapy.
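The record does not spell out how the few-shot NER comparison against baselines is scored, but NER systems of this kind are conventionally evaluated at the entity level over BIO-tagged sequences. The sketch below (function names and tag types are illustrative, not taken from the paper) shows one standard way to extract entity spans from BIO tags and compute micro-averaged precision, recall, and F1:

```python
def extract_spans(tags):
    """Collect (start, end, type) entity spans from a BIO tag sequence."""
    spans, start, etype = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last open span
        boundary = tag == "O" or tag.startswith("B-") or (
            tag.startswith("I-") and tag[2:] != etype
        )
        if boundary and start is not None:
            spans.append((start, i, etype))
            start, etype = None, None
        if tag.startswith("B-"):
            start, etype = i, tag[2:]
        elif tag.startswith("I-") and start is None:
            start, etype = i, tag[2:]  # tolerate an I- tag without a preceding B-
    return spans

def entity_f1(gold, pred):
    """Micro-averaged entity-level precision/recall/F1 over lists of tag sequences."""
    g, p = set(), set()
    for k, (gt, pt) in enumerate(zip(gold, pred)):
        g |= {(k,) + s for s in extract_spans(gt)}
        p |= {(k,) + s for s in extract_spans(pt)}
    tp = len(g & p)
    prec = tp / len(p) if p else 0.0
    rec = tp / len(g) if g else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return prec, rec, f1
```

A prediction counts as correct only when its boundaries and entity type both match the gold span, which is the stricter of the common NER scoring conventions.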

