Article

Co-attention network with label embedding for text classification

Journal

NEUROCOMPUTING
Volume 471, Pages 61-69

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2021.10.099

Keywords

Text-label co-attention; Label embedding; Text classification; Natural language processing; Deep learning


Abstract

Most existing methods for text classification focus on extracting a highly discriminative text representation, which, however, is typically computationally inefficient. To alleviate this issue, label embedding frameworks adopt label-to-text attention, which directly uses label information to construct the text representation for more efficient text classification. Although these label embedding methods have achieved promising results, there is still much room for using the label information more effectively. In this paper, we seek to exploit the label information further by constructing a text-attended label representation with text-to-label attention. To this end, we propose a Co-attention Network with Label Embedding (CNLE) that jointly encodes the text and labels into their mutually attended representations, so that the model can attend to the relevant parts of both. Experiments show that our approach achieves competitive results compared with previous state-of-the-art methods on 7 multi-class classification benchmarks and 2 multi-label classification benchmarks. (c) 2021 Elsevier B.V. All rights reserved.
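The core idea described in the abstract is a pair of attention maps computed from a single text-label affinity matrix: label-to-text attention weights text positions to form a label-aware text representation, while text-to-label attention weights labels to form a text-attended label representation. The paper's actual CNLE architecture is not reproduced here; the following is only a minimal NumPy sketch of that mutual-attention idea, with all function and variable names being illustrative assumptions rather than the authors' code.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(text, labels):
    """Illustrative text-label co-attention (not the paper's exact model).

    text:   (L, d) array of token/text features, L positions.
    labels: (K, d) array of label embeddings, K classes.
    Returns a label-attended text representation (L, d) and a
    text-attended label representation (K, d).
    """
    # Shared affinity matrix between every text position and every label.
    S = text @ labels.T                      # (L, K)

    # Text-to-label attention: for each label, a distribution over positions.
    a_t2l = softmax(S, axis=0)               # columns sum to 1
    text_attended_labels = a_t2l.T @ text    # (K, d)

    # Label-to-text attention: for each position, a distribution over labels.
    a_l2t = softmax(S, axis=1)               # rows sum to 1
    label_attended_text = a_l2t @ labels     # (L, d)

    return label_attended_text, text_attended_labels
```

In a full classifier, both attended representations would be fused (e.g. concatenated or pooled) before the final prediction layer; this sketch stops at the two representations themselves.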


