Journal
APPLIED SCIENCES-BASEL
Volume 12, Issue 12
Publisher
MDPI
DOI: 10.3390/app12126231
Keywords
named entity recognition; relation extraction; semantic role labeling; Bi-directional long short-term memory network; pre-trained model
Funding
- National Key R&D Program of China [2019YFC1521202]
Summary
This paper proposes a Joint Entity and Relation Extraction Network with Enhanced Explicit and Implicit Semantic Information (EINET). By introducing explicit semantics from Semantic Role Labeling and using separate Bi-directional Long Short-Term Memory networks to encode entities and local contexts, the model achieves competitive results in entity and relation extraction.
Abstract
The main purpose of joint entity and relation extraction is to extract entities from unstructured text and, at the same time, extract the relations between the labeled entities. At present, most existing joint entity and relation extraction networks ignore explicit semantic information and exploit implicit semantic information insufficiently. In this paper, we propose a Joint Entity and Relation Extraction Network with Enhanced Explicit and Implicit Semantic Information (EINET). First, on top of a pre-trained model, we introduce explicit semantics from Semantic Role Labeling (SRL), which carries rich semantic features about entity types and the relations between entities. Then, to enhance the implicit semantic information and extract richer features of entities and their local contexts, we adopt separate Bi-directional Long Short-Term Memory (Bi-LSTM) networks to encode entities and local contexts, respectively. In addition, we integrate global semantic information and a representation of the local context length into relation extraction to further improve performance. Our model achieves competitive results on three publicly available datasets. Compared with the baseline model on CoNLL04, EINET improves F1 by 2.37% for named entity recognition and by 3.43% for relation extraction.
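To make the architecture described in the abstract concrete, below is a minimal, illustrative PyTorch sketch. It assumes BERT-style token vectors and SRL tag IDs arrive as inputs (so the sketch stays self-contained), and all module names, dimensions, and the max-pooling choice are assumptions for illustration, not the authors' released code.

```python
import torch
import torch.nn as nn

class EINETSketch(nn.Module):
    """Illustrative sketch of an EINET-style joint extractor.

    Inputs are token vectors from a pre-trained encoder (e.g. BERT)
    plus SRL tag IDs; names and sizes here are assumptions."""

    def __init__(self, tok_dim=768, n_srl=30, srl_dim=32, hid=256,
                 n_ent=5, n_rel=6, max_ctx_len=64):
        super().__init__()
        # Explicit semantics: embed SRL tags and concatenate to token vectors.
        self.srl_emb = nn.Embedding(n_srl, srl_dim)
        d = tok_dim + srl_dim
        # Separate Bi-LSTMs encode entity spans vs. their local contexts.
        self.ent_lstm = nn.LSTM(d, hid, bidirectional=True, batch_first=True)
        self.ctx_lstm = nn.LSTM(d, hid, bidirectional=True, batch_first=True)
        # Local-context length enters the relation head as a learned embedding.
        self.len_emb = nn.Embedding(max_ctx_len, srl_dim)
        self.ent_clf = nn.Linear(2 * hid, n_ent)
        # Relation head sees: head span, tail span, local context,
        # a global ([CLS]-style) sentence vector, and the length embedding.
        self.rel_clf = nn.Linear(3 * 2 * hid + tok_dim + srl_dim, n_rel)

    def pool(self, lstm, toks, srl):
        x = torch.cat([toks, self.srl_emb(srl)], dim=-1)
        out, _ = lstm(x)                 # (B, T, 2*hid)
        return out.max(dim=1).values     # max-pool over time

    def forward(self, head, head_srl, tail, tail_srl, ctx, ctx_srl,
                global_vec, ctx_len):
        h = self.pool(self.ent_lstm, head, head_srl)
        t = self.pool(self.ent_lstm, tail, tail_srl)
        c = self.pool(self.ctx_lstm, ctx, ctx_srl)
        ent_logits = (self.ent_clf(h), self.ent_clf(t))
        rel_in = torch.cat([h, t, c, global_vec, self.len_emb(ctx_len)], dim=-1)
        return ent_logits, self.rel_clf(rel_in)

# Toy usage with random tensors (batch of 2, spans of 7 tokens, context of 12).
B, T = 2, 7
m = EINETSketch()
ent_logits, rel_logits = m(
    torch.randn(B, T, 768), torch.randint(0, 30, (B, T)),
    torch.randn(B, T, 768), torch.randint(0, 30, (B, T)),
    torch.randn(B, 12, 768), torch.randint(0, 30, (B, 12)),
    torch.randn(B, 768), torch.randint(0, 64, (B,)))
```

Keeping separate Bi-LSTMs for entity spans and local contexts mirrors the abstract's point that entities and contexts need their own encoders, while the global sentence vector and context-length embedding feed only the relation head, as described above.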