Article

Text information aggregation with centrality attention

Journal

SCIENCE CHINA-INFORMATION SCIENCES
Volume 64, Issue 12

Publisher

SCIENCE PRESS
DOI: 10.1007/s11432-019-1519-6

Keywords

information aggregation; eigen centrality; text classification; natural language processing; deep learning

Funding

  1. National Natural Science Foundation of China [61751201, 61672162]
  2. Shanghai Municipal Science and Technology Major Project [2018SHZDZX01]


In this study, a new self-attention mechanism called eigen-centrality self-attention is proposed to incorporate higher-order relationships among words in text sequence encoding, yielding better results on multiple tasks than baseline models. The power method is adopted to compute the dominant eigenvector of the word graph, and an iterative approach is derived to reduce memory consumption and computational cost during this process.
Many natural language processing problems require encoding a text sequence as a fixed-length vector, which usually involves aggregating the representations of all the words, e.g., by pooling or self-attention. However, these widely used aggregation approaches do not take higher-order relationships among the words into consideration. Hence, we propose a new way of obtaining aggregation weights, called eigen-centrality self-attention. More specifically, we build a fully connected graph over all the words in a sentence and compute the eigen-centrality of each word as its attention score. Explicitly modeling the relationships as a graph captures higher-order dependencies among words, which helps us achieve better results on five text classification tasks and the SNLI task than baseline models such as pooling, self-attention, and dynamic routing. In addition, we adopt the power method to compute the dominant eigenvector of the graph, which yields the eigen-centrality measure. Moreover, we derive an iterative approach to computing the gradient of the power method, reducing both memory consumption and computational cost.
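The core idea in the abstract can be sketched numerically: build a nonnegative affinity matrix over the word representations, run power iteration to obtain the dominant eigenvector (the eigen-centrality scores), normalize those scores into attention weights, and aggregate. The following NumPy sketch is illustrative only; the affinity function (a softplus of dot products), the iteration count, and all variable names are assumptions, not the paper's actual formulation.

```python
import numpy as np

def eigen_centrality_attention(H, num_iters=50):
    """Illustrative sketch of eigen-centrality attention.
    H: (n_words, dim) word representations.
    Returns (attention weights, aggregated sentence vector)."""
    # Fully connected word graph: nonnegative edge weights between all pairs.
    # Softplus of dot products is an assumed choice that keeps entries positive.
    A = np.log1p(np.exp(H @ H.T))
    # Power method: repeatedly apply A and renormalize; the iterate converges
    # to the dominant eigenvector of a nonnegative matrix (Perron-Frobenius).
    v = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(num_iters):
        v = A @ v
        v = v / np.linalg.norm(v)
    # Normalize eigen-centrality scores into attention weights, then aggregate.
    alpha = v / v.sum()
    return alpha, alpha @ H

H = np.random.randn(6, 8)  # toy example: 6 words, 8-dim representations
alpha, encoding = eigen_centrality_attention(H)
```

Note that the paper's contribution of an iterative, memory-efficient gradient for the power method is not reproduced here; automatic differentiation through the loop above would store every iterate.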
