Article

Co-attention network with label embedding for text classification

Journal

NEUROCOMPUTING
Volume 471, Issue -, Pages 61-69

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2021.10.099

Keywords

Text-label co-attention; Label embedding; Text classification; Natural language processing; Deep learning

Abstract

Most existing methods for text classification focus on extracting a highly discriminative text representation, which, however, is typically computationally inefficient. To alleviate this issue, label embedding frameworks have been proposed that adopt label-to-text attention, directly using label information to construct the text representation for more efficient text classification. Although these label embedding methods have achieved promising results, there is still much room to explore how to use the label information more effectively. In this paper, we seek to exploit the label information by further constructing a text-attended label representation with text-to-label attention. To this end, we propose a Co-attention Network with Label Embedding (CNLE) that jointly encodes the text and labels into mutually attended representations, so that the model is able to attend to the relevant parts of both. Experiments show that our approach achieves competitive results compared with previous state-of-the-art methods on 7 multi-class classification benchmarks and 2 multi-label classification benchmarks. (c) 2021 Elsevier B.V. All rights reserved.
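The record itself contains no code; the following is a minimal, hypothetical PyTorch sketch of the text-label co-attention idea described in the abstract. The class name, dimensions, dot-product affinity, and max-pooled attention weights are all assumptions for illustration, not the authors' CNLE implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TextLabelCoAttention(nn.Module):
    """Illustrative text-label co-attention (not the authors' CNLE code).

    Given encoded text tokens (batch, seq_len, d) and learned label embeddings
    (num_labels, d), it builds a label-attended text representation
    (label-to-text attention) and a text-attended label representation
    (text-to-label attention), then fuses both for classification.
    """

    def __init__(self, hidden_dim: int, num_labels: int):
        super().__init__()
        self.label_emb = nn.Embedding(num_labels, hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, text_hidden: torch.Tensor) -> torch.Tensor:
        # text_hidden: (batch, seq_len, hidden_dim), e.g. from an LSTM/BERT encoder
        batch = text_hidden.size(0)
        labels = self.label_emb.weight.unsqueeze(0).expand(batch, -1, -1)  # (batch, num_labels, hidden_dim)

        # Affinity between every token and every label: (batch, seq_len, num_labels)
        affinity = torch.bmm(text_hidden, labels.transpose(1, 2))

        # Label-to-text attention: weight tokens by their strongest label affinity.
        text_weights = F.softmax(affinity.max(dim=2).values, dim=1)                         # (batch, seq_len)
        label_attended_text = torch.bmm(text_weights.unsqueeze(1), text_hidden).squeeze(1)  # (batch, hidden_dim)

        # Text-to-label attention: weight labels by their strongest token affinity.
        label_weights = F.softmax(affinity.max(dim=1).values, dim=1)                        # (batch, num_labels)
        text_attended_labels = torch.bmm(label_weights.unsqueeze(1), labels).squeeze(1)     # (batch, hidden_dim)

        # Fuse the mutually attended representations and classify.
        fused = torch.cat([label_attended_text, text_attended_labels], dim=-1)
        return self.classifier(fused)


if __name__ == "__main__":
    model = TextLabelCoAttention(hidden_dim=128, num_labels=10)
    dummy_text = torch.randn(4, 32, 128)  # (batch, seq_len, hidden_dim)
    logits = model(dummy_text)
    print(logits.shape)  # torch.Size([4, 10])
```

The two pooled vectors correspond to the mutually attended text and label views mentioned in the abstract; a full model would pair this with a trained text encoder and task-specific fusion and output layers.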
