Article

Syntax-Informed Self-Attention Network for Span-Based Joint Entity and Relation Extraction

Journal

APPLIED SCIENCES-BASEL
Volume 11, Issue 4, Pages: -

Publisher

MDPI
DOI: 10.3390/app11041480

Keywords

entity recognition; relation extraction; dependency tree; self-attention

Funding

  1. National Natural Science Foundation of China (NSFC) [61871046]


The study proposes a framework that incorporates syntax knowledge and a local focus mechanism into entity and relation extraction, significantly enhancing model performance by fusing syntactic and semantic features and by obtaining richer contextual features from the local context.
Current state-of-the-art joint entity and relation extraction frameworks are based on span-level entity classification and relation identification between pairs of entity mentions. However, while maintaining an efficient exhaustive search over spans, they do not take syntactic features into consideration. As a result, a relation between two entities may be predicted on the basis of their entity types even though the entities are not actually related in the sentence. In addition, although previous work has shown that extracting the local context is beneficial for the task, in-depth learning of contextual features within the local context is still lacking. In this paper, we propose to incorporate syntax knowledge into multi-head self-attention by employing part of the heads to focus on the syntactic parent of each token, obtained from pruned dependency trees, and we use this mechanism to model the global context and fuse syntactic and semantic features. In addition, to obtain richer contextual features from the local context, we apply a local focus mechanism to entity pairs and their corresponding context. Applying these two strategies, we perform joint entity and relation extraction at the span level. Experimental results show that our model achieves significant improvements over strong competitors on both the CoNLL04 and SciERC datasets.
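The core idea of the abstract — reserving part of the attention heads to attend to each token's syntactic parent from a dependency tree — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `syntax_informed_attention`, the additive masking scheme, the random projection matrices, and the choice to let syntax heads also attend to the token itself are all assumptions for the sake of a runnable example.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def syntax_informed_attention(X, dep_heads, num_heads=4, num_syntax_heads=2, seed=0):
    """Multi-head self-attention over token embeddings X (n x d) where the
    first `num_syntax_heads` heads are restricted to each token's dependency
    parent (plus the token itself); the remaining heads attend freely.
    `dep_heads[i]` is the index of token i's syntactic parent, -1 for root.
    """
    n, d = X.shape
    dh = d // num_heads
    rng = np.random.default_rng(seed)
    # Random projections stand in for learned Q/K/V weights (an assumption).
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv

    # Additive mask: a syntax head may only attend to a token's parent and
    # to the token itself; all other positions get a large negative score.
    syn_mask = np.full((n, n), -1e9)
    for i, parent in enumerate(dep_heads):
        syn_mask[i, i] = 0.0
        if parent >= 0:
            syn_mask[i, parent] = 0.0

    outs = []
    for h in range(num_heads):
        q = Q[:, h * dh:(h + 1) * dh]
        k = K[:, h * dh:(h + 1) * dh]
        v = V[:, h * dh:(h + 1) * dh]
        scores = q @ k.T / np.sqrt(dh)
        if h < num_syntax_heads:          # syntax-informed heads
            scores = scores + syn_mask
        outs.append(softmax(scores) @ v)
    return np.concatenate(outs, axis=-1)

# Toy usage: 5 tokens, embedding size 8, parents from a small dependency tree.
X = np.random.default_rng(1).standard_normal((5, 8))
dep = np.array([1, -1, 1, 2, 2])          # token 1 is the root
out = syntax_informed_attention(X, dep, num_heads=4, num_syntax_heads=2)
print(out.shape)
```

In a full model the masked heads would share the layer's learned projections, and the mask could be widened to cover a pruned subtree rather than only the immediate parent, as the abstract's mention of pruned dependency trees suggests.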
