4.7 Article

BertMCN: Mapping colloquial phrases to standard medical concepts using BERT and highway network

Journal

ARTIFICIAL INTELLIGENCE IN MEDICINE
Volume 112, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.artmed.2021.102008

Keywords

Medical Concept Normalization; Clinical Natural Language Processing; BERT; Highway Network


In this study, a medical concept normalization system based on BERT and a highway layer is proposed, and experimental results show that it outperforms existing methods on two standard datasets. The impact of different learning rates and batch sizes, of noise, and of freezing encoder layers on the model is also investigated through a series of experiments.
In the last few years, people have started to share large amounts of health-related information in the form of tweets, reviews and blog posts. All of this user-generated clinical text can be mined to produce useful insights. However, automatic analysis of clinical text requires the identification of standard medical concepts. Most existing deep learning-based medical concept normalization systems are built on CNNs or RNNs, and their performance is limited because they have to be trained from scratch (except for the embeddings). In this work, we propose a medical concept normalization system based on BERT and a highway layer. BERT, a pre-trained context-sensitive deep language representation model, has advanced the state of the art in many NLP tasks, and the gating mechanism in the highway layer helps the model select only the important information. Experimental results show that our model outperforms all existing methods on two standard datasets. Further, we conduct a series of experiments to study the impact of different learning rates and batch sizes, noise, and freezing the encoder layers on our model.
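The abstract only describes the architecture at a high level: a BERT encoder whose output is passed through a highway layer (y = T(x) ⊙ H(x) + (1 − T(x)) ⊙ x) before classification over the set of standard medical concepts. The sketch below illustrates one way such a model can be assembled; the choice of bert-base-uncased, [CLS] pooling, a single highway layer, the label-set size, and the example phrase are illustrative assumptions and not the authors' exact implementation.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class HighwayLayer(nn.Module):
    """Highway layer: y = T(x) * H(x) + (1 - T(x)) * x.
    The transform gate T decides how much of the non-linear
    transform H to keep versus the untouched input x."""
    def __init__(self, dim):
        super().__init__()
        self.transform = nn.Linear(dim, dim)  # H(x)
        self.gate = nn.Linear(dim, dim)       # T(x)

    def forward(self, x):
        h = torch.relu(self.transform(x))
        t = torch.sigmoid(self.gate(x))
        return t * h + (1.0 - t) * x


class BertMCNSketch(nn.Module):
    """BERT encoder followed by a highway layer and a softmax
    classifier over candidate medical concept IDs (a sketch,
    not the paper's exact configuration)."""
    def __init__(self, num_concepts, model_name="bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.highway = HighwayLayer(hidden)
        self.classifier = nn.Linear(hidden, num_concepts)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] vector as the phrase representation
        return self.classifier(self.highway(cls))


# Example: map a colloquial phrase to a concept ID.
# num_concepts is a placeholder for the dataset's label-set size.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = BertMCNSketch(num_concepts=2500)
batch = tokenizer(["head spinning a little"], return_tensors="pt", padding=True)
logits = model(batch["input_ids"], batch["attention_mask"])
predicted_concept = logits.argmax(dim=-1)
```

Treating normalization as classification over a fixed concept inventory keeps inference to a single forward pass; the highway gate lets the model pass the BERT representation through unchanged when the non-linear transform is not helpful.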
