Article

Dispatched attention with multi-task learning for nested mention recognition

Journal

INFORMATION SCIENCES
Volume 513, Pages 241-251

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2019.10.065

Keywords

Nested mention; Named entity recognition; Attention mechanism; Neural network; Multi-task learning; Conditional random field

Funding

  1. National Natural Science Foundation of China [61702121, 61772378]
  2. Research Foundation of Ministry of Education of China [18JZD015]
  3. Major Projects of the National Social Science Foundation of China [11ZD189]
  4. Science and Technology Project of Guangzhou [201704030002]
  5. Bidding Project of GDUFS Laboratory of Language Engineering and Computing [LEC2018ZBKT004]

Abstract

In the task of named entity recognition (NER), entity mentions often contain other mentions, and such nested entities pose a challenge to the task. Existing methods fail to sufficiently capture the boundary information between nested entities, which limits performance. In this paper, we propose a dispatched attention neural model with multi-task learning for the task. In particular, given an input sentence, a bi-directional Long Short-Term Memory (BiLSTM) network encodes it into a common contextualized hidden representation. Position and syntax information are then incorporated into an attention network to capture mention-span features. The attention representation of each task is dispatched to the subsequent task to exchange boundary information about nested mentions. Finally, Conditional Random Fields (CRFs) extract nested mentions in an inside-out order for each task. Results on the ACE2005 and GENIA datasets show that the proposed model outperforms state-of-the-art systems, demonstrating its effectiveness in detecting nested mentions. (C) 2019 Elsevier Inc. All rights reserved.
