4.5 Article

Hierarchical multi-attention networks for document classification

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s13042-020-01260-x

Keywords

Document classification; Hierarchical network; Bi-GRU; Attention mechanism

Funding

  1. National Statistical Science Research Project of China [2016LY98]
  2. Science and Technology Department of Guangdong Province in China [2016A010101020, 2016A010101021, 2016A010101022]
  3. Characteristic Innovation Projects of Guangdong Colleges and Universities [2018KTSCX049, 2018GKTSCX069]
  4. Bidding Project of Laboratory of Language Engineering and Computing of Guangdong University of Foreign Studies [LEC2019ZBKT005]

Abstract

This paper introduces a method for document classification that applies different attention strategies at multiple levels of the document hierarchy and reports higher accuracy than competing approaches.
Research on document classification increasingly employs attention-based deep learning algorithms and achieves impressive results. Owing to the complexity of documents, classical models, as well as single-attention mechanisms, fail to meet the demand for high-accuracy classification. This paper proposes a method that classifies documents via hierarchical multi-attention networks, describing each document at the word-sentence level and the sentence-document level. Furthermore, different attention strategies are applied at the different levels, which enables the attention weights to be assigned accurately. Specifically, a soft attention mechanism is applied at the word-sentence level and a CNN-attention at the sentence-document level. Owing to this design, the proposed method delivers the highest accuracy among the compared state-of-the-art methods. In addition, visualizations of the attention weights demonstrate the effectiveness of the attention mechanisms in distinguishing the importance of words and sentences.
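The abstract outlines the architecture (Bi-GRU encoders with soft attention over words and a CNN-attention over sentences) but gives no implementation details. The sketch below is only an illustrative reading of that description in PyTorch; the layer sizes, the additive form of the soft attention, and the 1-D-convolution scoring used for the CNN-attention are assumptions, not the authors' code.

```python
# Hypothetical sketch of a hierarchical multi-attention classifier, assuming:
# Bi-GRU encoders at both levels, additive ("soft") attention over words,
# and a 1-D-convolution-based attention over sentence vectors.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftAttention(nn.Module):
    """Additive attention: score each timestep, softmax, weighted sum."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, h):                                # h: (batch, steps, hidden)
        u = torch.tanh(self.proj(h))
        alpha = F.softmax(self.context(u), dim=1)        # (batch, steps, 1)
        return (alpha * h).sum(dim=1)                    # (batch, hidden)

class CNNAttention(nn.Module):
    """Convolution-based attention: a 1-D conv scores each sentence vector."""
    def __init__(self, hidden_dim, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(hidden_dim, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, h):                                # h: (batch, sents, hidden)
        scores = self.conv(h.transpose(1, 2))            # (batch, 1, sents)
        alpha = F.softmax(scores.transpose(1, 2), dim=1)
        return (alpha * h).sum(dim=1)                    # (batch, hidden)

class HierarchicalMultiAttention(nn.Module):
    def __init__(self, vocab_size, emb_dim=200, gru_dim=50, num_classes=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.word_gru = nn.GRU(emb_dim, gru_dim, bidirectional=True, batch_first=True)
        self.word_attn = SoftAttention(2 * gru_dim)      # words -> sentence vector
        self.sent_gru = nn.GRU(2 * gru_dim, gru_dim, bidirectional=True, batch_first=True)
        self.sent_attn = CNNAttention(2 * gru_dim)       # sentences -> document vector
        self.classifier = nn.Linear(2 * gru_dim, num_classes)

    def forward(self, docs):                             # docs: (batch, sents, words) token ids
        b, s, w = docs.shape
        words = self.embed(docs.view(b * s, w))          # encode every sentence independently
        h_w, _ = self.word_gru(words)
        sent_vecs = self.word_attn(h_w).view(b, s, -1)   # one vector per sentence
        h_s, _ = self.sent_gru(sent_vecs)
        doc_vec = self.sent_attn(h_s)                    # one vector per document
        return self.classifier(doc_vec)

# Example: a batch of 2 documents, 4 sentences each, 10 tokens per sentence.
model = HierarchicalMultiAttention(vocab_size=10000)
logits = model(torch.randint(1, 10000, (2, 4, 10)))
print(logits.shape)                                      # torch.Size([2, 5])
```

The two attention modules differ only in how the per-step scores are produced, which is the point of the paper's level-specific attention strategies: a learned context vector for words, a convolution over neighbouring sentence vectors for sentences.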
