Article

Hierarchical Hybrid Neural Networks With Multi-Head Attention for Document Classification

Publisher

IGI GLOBAL
DOI: 10.4018/IJDWM.303673

Keywords

BiGRU; CNN; Document Level; Hierarchical Characteristics; Hybrid Attention Network; Multi-Head Self-Attention Mechanism; NLP; Text Classification


Abstract
Document classification aims to predict the overall sentiment polarity of a text, and research on it has advanced rapidly with the advent of deep neural networks. Current studies employ a variety of deep learning algorithms to improve classification performance. To this end, this paper proposes a hierarchical hybrid neural network with multi-head attention (HHNN-MHA) for document classification. The model contains two layers that handle word-to-sentence and sentence-to-document modeling, respectively. In the first layer, a CNN is integrated with a Bi-GRU, and a multi-head attention mechanism is employed to exploit both local and global features. In the second layer, a Bi-GRU and an attention mechanism are applied to compose sentence representations into a document representation for classification. Experiments on four datasets demonstrate the effectiveness of the proposed method: it achieves results competitive with state-of-the-art methods on document classification.
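The two-layer design described above can be sketched in PyTorch. This is a minimal illustration, not the authors' exact configuration: the hidden sizes, the kernel width of the CNN branch, the way local (CNN) and global (Bi-GRU) features are combined, and the mean/additive pooling choices are all assumptions made for the sketch.

```python
import torch
import torch.nn as nn

class HHNNMHA(nn.Module):
    """Sketch of a hierarchical hybrid network with multi-head attention.
    Layer 1: word-level Bi-GRU + CNN + multi-head self-attention.
    Layer 2: sentence-level Bi-GRU + attention pooling.
    All sizes and wiring choices here are illustrative assumptions."""

    def __init__(self, vocab_size, emb_dim=100, hid=64, heads=4, num_classes=2):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # word-sentence level: Bi-GRU for global context, CNN for local n-gram features
        self.word_gru = nn.GRU(emb_dim, hid, bidirectional=True, batch_first=True)
        self.word_cnn = nn.Conv1d(emb_dim, 2 * hid, kernel_size=3, padding=1)
        self.word_mha = nn.MultiheadAttention(2 * hid, heads, batch_first=True)
        # sentence-document level: Bi-GRU with additive attention pooling
        self.sent_gru = nn.GRU(2 * hid, hid, bidirectional=True, batch_first=True)
        self.sent_attn = nn.Linear(2 * hid, 1)
        self.fc = nn.Linear(2 * hid, num_classes)

    def forward(self, docs):
        # docs: (batch, n_sents, n_words) of token ids
        b, s, w = docs.shape
        x = self.emb(docs.view(b * s, w))                     # (b*s, w, emb)
        g, _ = self.word_gru(x)                               # global features
        c = self.word_cnn(x.transpose(1, 2)).transpose(1, 2)  # local features
        h, _ = self.word_mha(g + c, g + c, g + c)             # multi-head self-attention
        sent_vecs = h.mean(dim=1).view(b, s, -1)              # one vector per sentence
        d, _ = self.sent_gru(sent_vecs)                       # sentence-level context
        a = torch.softmax(self.sent_attn(d), dim=1)           # attention over sentences
        doc_vec = (a * d).sum(dim=1)                          # document representation
        return self.fc(doc_vec)                               # class logits

# toy forward pass: 2 documents, 5 sentences each, 12 tokens per sentence
model = HHNNMHA(vocab_size=1000)
logits = model(torch.randint(1, 1000, (2, 5, 12)))
```

The key design point mirrored from the paper is that word-level features flow through both a recurrent path (Bi-GRU) and a convolutional path (CNN) before multi-head self-attention fuses them, and only then are sentence vectors composed into a document vector.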

