Article

An Efficient Framework for Sentence Similarity Modeling

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TASLP.2019.2899494

Keywords

Sentence similarity; word embedding; attention weight; syntactic structure

Funding

  1. National Key R&D Program of China [2018YFB0204100, 2018YFB1004400]
  2. National Natural Science Foundation of China [61472124, 61472453, 61602166, 61702320, U1401256, U1501252, U1611264, U1711261, U1711262, U61811264]

Abstract

Sentence similarity modeling lies at the core of many natural language processing applications and has therefore received much attention. Owing to the success of word embeddings, popular neural network methods have recently been applied to sentence embedding. Most of them focus on learning semantic information and modeling a sentence as a continuous vector, while the syntactic information of sentences has not been fully exploited. On the other hand, prior work has shown the benefits of structured trees that encode syntactic information, yet few methods in this branch exploit the advantages of word embeddings and another powerful technique, the attention weight mechanism. This paper proposes to absorb both advantages by merging these techniques into a unified structure, dubbed the attention constituency vector tree (ACVT). This paper also develops a new tree kernel, the ACVT kernel, tailored to sentence similarity measurement based on the proposed structure. Experimental results on 19 widely used semantic textual similarity datasets demonstrate that our model is effective and competitive compared with state-of-the-art models. Additionally, the experiments validate that many attention weight mechanisms and word embedding techniques can be seamlessly integrated into our model, demonstrating its robustness and universality.
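The abstract describes combining word embeddings with attention weights to compare sentences. The paper's actual method (the ACVT structure and its tree kernel) is not detailed here, so the sketch below shows only a much simpler related idea under stated assumptions: an attention-weighted average of word vectors, compared by cosine similarity. The toy embeddings and the norm-based weighting are hypothetical stand-ins, not the authors' model.

```python
import math

# Hypothetical toy 2-D embeddings; a real system would use pretrained
# vectors (e.g., word2vec or GloVe) with hundreds of dimensions.
EMB = {
    "cats": [0.9, 0.1], "dogs": [0.8, 0.2],
    "sleep": [0.1, 0.9], "nap": [0.2, 0.8],
}

def attention_weights(tokens):
    """Stand-in for a learned attention mechanism: weight each token by
    its embedding norm, normalized so the weights sum to 1."""
    norms = [math.hypot(*EMB[t]) for t in tokens]
    total = sum(norms)
    return [n / total for n in norms]

def sentence_vector(tokens):
    """Attention-weighted average of the word embeddings."""
    w = attention_weights(tokens)
    dim = len(next(iter(EMB.values())))
    return [sum(wi * EMB[t][d] for wi, t in zip(w, tokens))
            for d in range(dim)]

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

s1 = sentence_vector(["cats", "sleep"])
s2 = sentence_vector(["dogs", "nap"])
print(cosine(s1, s2))  # high similarity for these paraphrase-like sentences
```

The paper goes further by attaching such attention-weighted vectors to the nodes of a constituency parse tree and comparing trees with a dedicated kernel, so syntax contributes to the score rather than only a flat weighted average.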
