4.6 Article

Entity disambiguation with memory network

Journal

NEUROCOMPUTING
Volume 275, Issue -, Pages 2367-2373

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2017.11.013

Keywords

Entity disambiguation; Memory network; Natural language processing; Deep learning


We develop a computational approach based on memory networks for entity disambiguation. The approach automatically finds important clues about a mention from its surrounding context with an attention mechanism, and leverages these clues to facilitate entity disambiguation. Unlike existing feature-based methods, this approach does not rely on any manually designed features. Unlike existing neural models such as recurrent or convolutional neural networks, this approach leverages the importance of context words in an explicit way. The model can be trained easily with back-propagation. To effectively learn the model parameters, we automatically collect large-scale mention-entity pairs from Wikipedia as training data. We verify the effectiveness of the proposed approach on a benchmark dataset from the TAC-KBP 2010 evaluation. Experimental results demonstrate that our approach empirically surpasses strong feature-based and neural-network-based methods. Model analysis further reveals that our approach has the capacity to discover important clues from contexts. (C) 2017 Elsevier B.V. All rights reserved.
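The abstract only describes the architecture at a high level. The snippet below is a minimal, hypothetical sketch of the general idea it names: treating context words as memory slots, attending over them to weight "clues" for a mention, and scoring candidate entities against the attended representation. It is not the authors' exact model; the class name MemoryAttentionDisambiguator, the embedding dimension, the single attention hop, and the mean-pooled mention query are assumptions for illustration (PyTorch).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MemoryAttentionDisambiguator(nn.Module):
    """Hypothetical single-hop memory/attention scorer for entity disambiguation."""

    def __init__(self, vocab_size, num_entities, dim=100):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, dim)      # context words act as memory slots
        self.entity_emb = nn.Embedding(num_entities, dim)  # candidate entity vectors

    def forward(self, context_ids, mention_ids, candidate_ids):
        # context_ids: (B, L), mention_ids: (B, M), candidate_ids: (B, C)
        memory = self.word_emb(context_ids)                     # (B, L, d)
        query = self.word_emb(mention_ids).mean(dim=1)          # (B, d) mention representation
        # Attention: an explicit importance weight for every context word.
        att = torch.bmm(memory, query.unsqueeze(2)).squeeze(2)  # (B, L)
        alpha = F.softmax(att, dim=1)
        clue = torch.bmm(alpha.unsqueeze(1), memory).squeeze(1) # (B, d) weighted clue vector
        rep = query + clue                                       # mention plus attended context
        cands = self.entity_emb(candidate_ids)                   # (B, C, d)
        return torch.bmm(cands, rep.unsqueeze(2)).squeeze(2)     # (B, C) candidate scores

# Example: score 5 candidate entities for one mention in a 20-word context.
model = MemoryAttentionDisambiguator(vocab_size=10000, num_entities=500)
scores = model(torch.randint(0, 10000, (1, 20)),
               torch.randint(0, 10000, (1, 2)),
               torch.randint(0, 500, (1, 5)))
```

In a setup like this, the scores would be trained with a cross-entropy loss over the candidate set using the Wikipedia mention-entity pairs described above, with back-propagation updating both embedding tables, consistent with the training procedure the abstract outlines.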

