Article

Label Attention Network for Structured Prediction

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TASLP.2022.3145311

Keywords

Labeling; Task analysis; Tagging; Artificial neural networks; Machine translation; Natural language processing; Encoding; Label attention; label dependency; sequence labeling

Funding

  1. National Science Foundation of China [61976180]


Abstract

Sequence labeling, which assigns a label to each token in a sequence, is a fundamental problem in natural language processing (NLP). Many NLP tasks, including part-of-speech (POS) tagging and named entity recognition (NER), can be cast as sequence labeling problems, and other tasks such as constituency parsing and non-autoregressive machine translation can also be transformed into sequence labeling. Neural models have proven powerful for sequence labeling by employing multi-layer sequence encoding networks. The conditional random field (CRF) has been proposed to enrich information over label sequences, yet it suffers from high computational complexity and over-reliance on the Markov assumption. To this end, we propose the label attention network (LAN), which hierarchically refines representations of marginal label distributions bottom-up, enabling higher layers to learn a more informed label sequence distribution based on information from lower layers. We demonstrate the effectiveness of LAN through extensive experiments on various NLP tasks, including POS tagging, NER, CCG supertagging, constituency parsing, and non-autoregressive machine translation. Empirical results show that LAN not only improves the overall tagging accuracy with a similar number of parameters, but also significantly speeds up training and testing compared to CRF.
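The listing does not include an implementation, but the mechanism described in the abstract can be sketched roughly as follows. The block below is a minimal PyTorch sketch, not the authors' released code: each layer encodes the sequence (here with a BiLSTM, one common choice), computes attention from token representations to a shared label embedding table, treats the attention weights as a marginal label distribution, and concatenates the attended label vectors onto the token representations before passing them to the next layer. Class names, dimensions, and the choice of encoder are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LabelAttentionLayer(nn.Module):
    """One encoder + label-attention block (illustrative sketch).

    Token representations attend over a shared label embedding table; the
    attention weights act as a marginal label distribution, and the attended
    label vectors are concatenated back onto the token representations so the
    next layer can refine the distribution.
    """

    def __init__(self, input_dim, hidden_dim, num_labels, label_dim):
        super().__init__()
        self.encoder = nn.LSTM(input_dim, hidden_dim // 2,
                               batch_first=True, bidirectional=True)
        self.label_emb = nn.Embedding(num_labels, label_dim)
        self.query = nn.Linear(hidden_dim, label_dim)

    def forward(self, x):
        h, _ = self.encoder(x)                           # (B, T, hidden_dim)
        q = self.query(h)                                # (B, T, label_dim)
        scores = q @ self.label_emb.weight.t()           # (B, T, num_labels)
        label_dist = F.softmax(scores, dim=-1)           # marginal label distribution
        label_ctx = label_dist @ self.label_emb.weight   # (B, T, label_dim)
        return torch.cat([h, label_ctx], dim=-1), label_dist


class LAN(nn.Module):
    """Stack of label-attention layers; the top layer's label distribution is
    used directly for prediction, with no CRF decoding."""

    def __init__(self, emb_dim, hidden_dim, num_labels, label_dim, num_layers=3):
        super().__init__()
        dims = [emb_dim] + [hidden_dim + label_dim] * (num_layers - 1)
        self.layers = nn.ModuleList(
            [LabelAttentionLayer(d, hidden_dim, num_labels, label_dim) for d in dims])

    def forward(self, word_embs):
        h, dist = word_embs, None
        for layer in self.layers:
            h, dist = layer(h)
        return dist  # (B, T, num_labels); trained with token-level cross-entropy
```

Under this reading, per-token prediction is the argmax of the top layer's attention distribution, which is what allows decoding without the CRF's Viterbi step.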

