Journal
NEURAL NETWORKS
Volume 138, Pages 68-77
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2020.12.027
Keywords
Named entity recognition; Unsupervised cross-domain; Adversarial training; Entity-aware attention
Funding
- National Natural Science Foundation of China [62076100]
- National Key Research and Development Program of China
- Fundamental Research Funds for the Central Universities, SCUT [2017ZD048, D2182480]
- Science and Technology Planning Project of Guangdong Province [2017B050506004]
- Science and Technology Programs of Guangzhou [201704030076, 201802010027, 201902010046]
- Hong Kong Research Grants Council [C1031-18G]
This study proposes an unsupervised cross-domain model that leverages labeled data from the source domain to predict entities in the target domain, applying adversarial training and an entity-aware attention module to reduce the feature discrepancy between domains.
The success of neural network based methods in named entity recognition (NER) relies heavily on abundant manually labeled data. However, these NER methods are unavailable when the data in a new domain is fully unlabeled. To address this problem, we propose an unsupervised cross-domain model that leverages labeled data from the source domain to predict entities in the unlabeled target domain. To relieve the distribution divergence when transferring knowledge from the source to the target domain, we apply adversarial training. Furthermore, we design an entity-aware attention module that guides the adversarial training to reduce the discrepancy of entity features between domains. Experimental results demonstrate that our model outperforms other methods and achieves state-of-the-art performance. (c) 2020 Published by Elsevier Ltd.
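The adversarial training mentioned in the abstract is commonly implemented with a gradient reversal layer (GRL): features pass through unchanged in the forward pass, while the backward pass flips and scales the gradient, so the feature extractor is pushed toward domain-invariant representations as the domain discriminator tries to separate source from target. The paper's exact architecture is not reproduced on this record page, so the sketch below only illustrates that standard mechanism; the class name and the `lam` coefficient are illustrative assumptions.

```python
import numpy as np

class GradientReversal:
    """Minimal sketch of a gradient reversal layer for adversarial
    domain adaptation (an assumption about the implementation, not
    the paper's verified code)."""

    def __init__(self, lam=1.0):
        # lam trades off the domain-adversarial signal against the
        # main NER loss; it is often annealed during training.
        self.lam = lam

    def forward(self, x):
        # Identity in the forward pass: features reach the domain
        # discriminator unchanged.
        return x

    def backward(self, grad_output):
        # Flip and scale the gradient flowing back into the feature
        # extractor, so it learns to *confuse* the discriminator.
        return -self.lam * grad_output

grl = GradientReversal(lam=0.5)
x = np.array([1.0, -2.0, 3.0])
print(grl.forward(x))            # unchanged features
print(grl.backward(np.array([0.1, 0.2, -0.3])))  # reversed, scaled gradient
```

With this layer between the shared encoder and the domain discriminator, minimizing the discriminator's loss simultaneously maximizes domain confusion for the encoder, which is the "reduce feature discrepancy between domains" effect described above.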