Proceedings Paper

ClinicalRadioBERT: Knowledge-Infused Few Shot Learning for Clinical Notes Named Entity Recognition

Journal

MACHINE LEARNING IN MEDICAL IMAGING, MLMI 2022
Volume 13583, Pages 269-278

Publisher

SPRINGER INTERNATIONAL PUBLISHING AG
DOI: 10.1007/978-3-031-21014-3_28

Keywords

Biomedical language models; Natural language processing; Named entity recognition


This paper presents ClinicalRadioBERT, a BERT-based model for analyzing clinical notes. By pretraining and fine-tuning on radiotherapy literature, as well as incorporating knowledge-infused few-shot learning, the model demonstrates superior performance in few-shot named entity recognition.
Transformer-based language models such as BERT have been widely applied to many domains through model pretraining and fine-tuning. However, in low-resource scenarios such as clinical cases, customizing a BERT-based language model is still a challenging task. In this paper, we focus on the radiotherapy domain and train a ClinicalRadioBERT model for analyzing clinical notes through a two-step procedure. First, we fine-tune a BioBERT model on the full texts of radiotherapy literature and name this model RadioBERT. Second, we propose a knowledge-infused few-shot learning (KI-FSL) approach that leverages domain knowledge and trains the ClinicalRadioBERT model for understanding radiotherapy clinical notes. We evaluate ClinicalRadioBERT on a newly collected clinical notes dataset and demonstrate its superiority over baselines on few-shot named entity recognition. We will apply ClinicalRadioBERT to link BERT and medical imaging for radiotherapy.
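The abstract does not include code, but the few-shot NER setting it describes can be illustrated with a minimal sketch: assembling a K-shot support set of BIO-labelled sentences, on which a model such as RadioBERT would then be fine-tuned. The toy corpus, entity types, and greedy sampling strategy below are illustrative assumptions, not the paper's actual KI-FSL method or dataset.

```python
import random

# Hypothetical toy corpus of BIO-labelled radiotherapy sentences.
# Tokens, labels, and entity types are made up for illustration.
CORPUS = [
    (["Patient", "received", "50", "Gy"], ["O", "O", "B-DOSE", "I-DOSE"]),
    (["IMRT", "was", "delivered"], ["B-TREATMENT", "O", "O"]),
    (["Boost", "dose", "of", "10", "Gy"], ["O", "O", "O", "B-DOSE", "I-DOSE"]),
    (["Proton", "therapy", "planned"], ["B-TREATMENT", "I-TREATMENT", "O"]),
    (["No", "acute", "toxicity"], ["O", "O", "O"]),
]

def entity_types(labels):
    """Entity types present in a BIO label sequence, e.g. {'DOSE'}."""
    return {lab.split("-", 1)[1] for lab in labels if lab != "O"}

def sample_support(corpus, types, k_shot, rng):
    """Greedily collect sentences until every entity type is supported
    by at least k_shot sentences (a simple K-shot episode builder)."""
    counts = {t: 0 for t in types}
    support = []
    pool = list(corpus)
    rng.shuffle(pool)
    for tokens, labels in pool:
        present = entity_types(labels) & set(types)
        if any(counts[t] < k_shot for t in present):
            support.append((tokens, labels))
            for t in present:
                counts[t] += 1
        if all(c >= k_shot for c in counts.values()):
            break
    return support

# Build a 2-shot support set over two entity types.
support = sample_support(CORPUS, ["DOSE", "TREATMENT"], k_shot=2,
                         rng=random.Random(0))
```

In a full pipeline, the resulting `support` set would serve as the handful of labelled examples available for fine-tuning a token-classification head; the knowledge-infusion step of KI-FSL is beyond this sketch.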

