Article

Chinese Named Entity Recognition Model Based on Multi-Task Learning

Journal

APPLIED SCIENCES-BASEL
Volume 13, Issue 8

Publisher

MDPI
DOI: 10.3390/app13084770

Keywords

multi-task learning; Chinese named entity recognition; joint learning; feature interaction; bi-directional encoder (BERT)


Abstract
Compared to English, Chinese named entity recognition achieves lower performance because entity boundaries in Chinese text are more ambiguous, making boundary prediction more difficult. While traditional models have attempted to sharpen the definition of Chinese entity boundaries by incorporating external features such as lexicons or glyphs, they have rarely disentangled the boundary prediction problem for separate study. To leverage entity boundary information, the named entity recognition task is decomposed into two subtasks, boundary annotation and type annotation, and a multi-task learning network (MTL-BERT) is proposed that combines them with a bidirectional encoder (BERT) model. This network performs joint encoding and task-specific decoding of the subtasks, enhancing the model's feature extraction ability by reinforcing the feature associations between subtasks. Multiple sets of experiments conducted on the Weibo NER, MSRA, and OntoNotes 4.0 public datasets show that the F1 scores of MTL-BERT reach 73.8%, 96.5%, and 86.7%, respectively, effectively improving the performance and efficiency of Chinese named entity recognition.
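The decomposition described in the abstract can be illustrated with standard BIO tags: a joint tag such as "B-PER" carries both a boundary signal (B) and a type signal (PER), and the two subtasks each supervise one of these. The sketch below shows one plausible way to derive the two label sequences; the function name and label scheme are illustrative assumptions, not the paper's exact formulation.

```python
def decompose_bio_tags(tags):
    """Split joint BIO tags (e.g. "B-PER") into the two label sequences
    for a boundary-annotation head and a type-annotation head.

    Illustrative sketch only; the paper's actual label scheme may differ.
    """
    boundary_labels = []  # boundary subtask: B / I / O only
    type_labels = []      # type subtask: entity type or O
    for tag in tags:
        if tag == "O":
            boundary_labels.append("O")
            type_labels.append("O")
        else:
            prefix, etype = tag.split("-", 1)
            boundary_labels.append(prefix)
            type_labels.append(etype)
    return boundary_labels, type_labels


# Example: a 4-token sequence with one PER entity and one LOC entity.
tags = ["B-PER", "I-PER", "O", "B-LOC"]
boundaries, types = decompose_bio_tags(tags)
print(boundaries)  # ['B', 'I', 'O', 'B']
print(types)       # ['PER', 'PER', 'O', 'LOC']
```

In a multi-task setup of this kind, the two heads share one BERT encoder and each predicts its own sequence, so the boundary head can specialize in segmentation while the type head specializes in classification.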

