4.6 Article

Boosting implicit discourse relation recognition with connective-based word embeddings

Journal

NEUROCOMPUTING
Volume 369, Pages 39-49

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2019.08.081

Keywords

Connective-based word embeddings; Implicit discourse relation recognition; Connective classification; Neural network

Funding

  1. National Natural Science Foundation of China [61866012]
  2. Natural Science Foundation of Jiangxi Province [20181BAB202012]
  3. Science and Technology Research Project of Education Department of Jiangxi Province [GJJ180329]

Abstract

Implicit discourse relation recognition is the performance bottleneck of discourse structure analysis. To alleviate the shortage of training data, previous methods usually use explicit discourse data, which are naturally labeled by connectives, as additional training data. However, these methods often find it difficult to integrate large amounts of explicit discourse data because of noise. In this paper, we propose a simple and effective method to leverage massive explicit discourse data. Specifically, we learn connective-based word embeddings (CBWE) by performing connective classification on explicit discourse data. The learned CBWE captures discourse relationships between words and can be used as pre-trained word embeddings for implicit discourse relation recognition. On both the English PDTB and Chinese CDTB data sets, using CBWE achieves significant improvements over baselines with general word embeddings and outperforms baselines that integrate explicit discourse data. By combining CBWE with a strong baseline, we achieve state-of-the-art performance. (C) 2019 Elsevier B.V. All rights reserved.
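The abstract does not give implementation details, so the following is only a minimal sketch of the general idea: train a connective classifier on explicit discourse argument pairs and keep its embedding layer as the connective-based word embeddings (CBWE). The toy vocabulary, synthetic examples, averaged-embedding architecture, and hyperparameters below are illustrative assumptions, not the authors' setup.

# Minimal sketch (not the authors' implementation) of learning CBWE
# via connective classification on explicit discourse data.
import torch
import torch.nn as nn

# Toy vocabulary and connective inventory (assumptions for illustration).
VOCAB = {"<pad>": 0, "it": 1, "rained": 2, "we": 3, "stayed": 4,
         "home": 5, "left": 6, "early": 7}
CONNECTIVES = {"because": 0, "but": 1, "so": 2}

class ConnectiveClassifier(nn.Module):
    def __init__(self, vocab_size, num_connectives, dim=50):
        super().__init__()
        # The embedding table trained here plays the role of CBWE.
        self.embed = nn.Embedding(vocab_size, dim, padding_idx=0)
        self.fc = nn.Linear(2 * dim, num_connectives)

    def forward(self, arg1, arg2):
        # Average the word embeddings of each argument (padding included
        # for simplicity), concatenate, and predict the connective.
        a1 = self.embed(arg1).mean(dim=1)
        a2 = self.embed(arg2).mean(dim=1)
        return self.fc(torch.cat([a1, a2], dim=-1))

def encode(tokens, max_len=4):
    ids = [VOCAB[t] for t in tokens][:max_len]
    return ids + [0] * (max_len - len(ids))

# Synthetic explicit-discourse examples: (arg1, arg2, connective).
data = [(["it", "rained"], ["we", "stayed", "home"], "so"),
        (["we", "stayed", "home"], ["it", "rained"], "because"),
        (["it", "rained"], ["we", "left", "early"], "but")]

model = ConnectiveClassifier(len(VOCAB), len(CONNECTIVES))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(50):
    a1 = torch.tensor([encode(x[0]) for x in data])
    a2 = torch.tensor([encode(x[1]) for x in data])
    y = torch.tensor([CONNECTIVES[x[2]] for x in data])
    opt.zero_grad()
    loss = loss_fn(model(a1, a2), y)
    loss.backward()
    opt.step()

# The trained embedding matrix stands in for the CBWE that would
# initialise an implicit discourse relation classifier.
cbwe = model.embed.weight.detach()

In this sketch, cbwe would be used to initialise the word embedding layer of an implicit discourse relation recognition model, which is the role the pre-trained CBWE play in the paper.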

