Article

DWIE: An entity-centric dataset for multi-task document-level information extraction

Journal

Information Processing & Management

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.ipm.2021.102563

Keywords

Named entity recognition; Entity linking; Relation extraction; Coreference resolution; Joint models; Graph Neural Networks

Funding

  1. European Union [761488]
  2. Flemish Government, Belgium


DWIE is a newly created multi-task dataset that combines four main Information Extraction (IE) annotation subtasks, focusing on entity-centric descriptions of the interactions and properties of conceptual entities. Building and evaluating IE models on DWIE raises two challenges: the need for a new entity-driven metric that avoids measurements being dominated by frequently mentioned entities, and the need to transfer information between entity mentions located in different parts of the document and across different tasks. Incorporating neural graph propagation into a joint model yields an improvement of up to 5.5 F-1 percentage points, showcasing DWIE's potential to stimulate research in graph neural networks for multi-task IE.
This paper presents DWIE, the 'Deutsche Welle corpus for Information Extraction', a newly created multi-task dataset that combines four main Information Extraction (IE) annotation subtasks: (i) Named Entity Recognition (NER), (ii) Coreference Resolution, (iii) Relation Extraction (RE), and (iv) Entity Linking. DWIE is conceived as an entity-centric dataset that describes interactions and properties of conceptual entities on the level of the complete document. This contrasts with currently dominant mention-driven approaches that start from the detection and classification of named entity mentions in individual sentences. Further, DWIE presents two main challenges for building and evaluating IE models. First, the use of traditional mention-level evaluation metrics for NER and RE tasks on the entity-centric DWIE dataset can result in measurements dominated by predictions on more frequently mentioned entities. We tackle this issue by proposing a new entity-driven metric that takes into account the number of mentions that compose each of the predicted and ground truth entities. Second, the document-level multi-task annotations require the models to transfer information between entity mentions located in different parts of the document, as well as between different tasks, in a joint learning setting. To realize this, we propose to use graph-based neural message passing techniques between document-level mention spans. Our experiments show an improvement of up to 5.5 F-1 percentage points when incorporating neural graph propagation into our joint model. This demonstrates DWIE's potential to stimulate further research in graph neural networks for representation learning in multi-task IE. We make DWIE publicly available at https://github.com/klimzaporojets/DWIE.
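As a rough illustration of the entity-driven evaluation idea described above (a minimal sketch; the function name, entity representation, and exact matching scheme here are assumptions for illustration, not the paper's definition), an entity-level F1 can score each entity cluster once, however many mentions it contains:

```python
def entity_f1(pred_entities, gold_entities):
    """Entity-level F1: each entity (a label plus the frozenset of its
    mention spans) counts once, regardless of how many mentions it has.
    This contrasts with mention-level F1, which frequently mentioned
    entities would dominate. Hypothetical sketch, not DWIE's exact metric."""
    pred = set(pred_entities)
    gold = set(gold_entities)
    tp = len(pred & gold)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Entities as (label, frozenset of character-offset mention spans):
gold = [("PER", frozenset({(0, 5), (40, 45), (90, 95)})),
        ("ORG", frozenset({(10, 18)}))]
pred = [("PER", frozenset({(0, 5), (40, 45), (90, 95)})),
        ("LOC", frozenset({(10, 18)}))]  # wrong label on second entity
print(entity_f1(pred, gold))  # 0.5
```

Note that the frequently mentioned PER entity (three mentions) and the single-mention second entity each contribute equally to the score, which is the point of an entity-driven metric.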
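The graph-based message passing between document-level mention spans can likewise be sketched in a simplified form (a hypothetical mean-aggregation scheme over a mention graph; the paper's actual GNN architecture, edge types, learned weights, and gating are not reproduced here):

```python
import numpy as np

def propagate(span_reprs, edges, steps=2):
    """Simplified message passing over document-level mention spans:
    each step, every span's representation becomes the mean of its
    neighbours' representations (including a self-loop).
    span_reprs: (n_spans, dim) array; edges: list of (i, j) index pairs.
    Hypothetical sketch of neural graph propagation, not DWIE's model."""
    h = span_reprs.astype(float).copy()
    n = h.shape[0]
    adj = np.eye(n)               # self-loops keep each span's own signal
    for i, j in edges:
        adj[i, j] = adj[j, i] = 1.0
    adj /= adj.sum(axis=1, keepdims=True)  # row-normalise to a mean
    for _ in range(steps):
        h = adj @ h               # aggregate over the neighbourhood
    return h
```

After a few steps, mentions connected through the graph (e.g. coreferent spans in distant parts of the document) share information, which is the mechanism the abstract credits for the joint model's F-1 gains.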

