Article

ParsBERT: Transformer-based Model for Persian Language Understanding

Journal

NEURAL PROCESSING LETTERS
Volume 53, Issue 6, Pages 3831-3847

Publisher

SPRINGER
DOI: 10.1007/s11063-021-10528-4

Keywords

Persian; Transformers; BERT; Language Models; NLP; NLU

Abstract

The surge of pre-trained language models has opened a new era in Natural Language Processing (NLP) by allowing us to build powerful language models. Among these, Transformer-based models such as BERT have become increasingly popular due to their state-of-the-art performance. However, such models are usually focused on English, leaving other languages to multilingual models with limited resources. This paper proposes a monolingual BERT for the Persian language (ParsBERT) and shows that it achieves state-of-the-art performance compared with other architectures and with multilingual models. In addition, since the amount of data available for Persian NLP tasks is very limited, a massive dataset is composed both for several downstream NLP tasks and for pre-training the model. ParsBERT obtains higher scores on all datasets, both pre-existing and newly gathered ones, and improves the state of the art by outperforming multilingual BERT and prior works on Sentiment Analysis, Text Classification, and Named Entity Recognition tasks.
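
As a usage sketch (not taken from the paper or this record): a released monolingual BERT checkpoint such as ParsBERT can typically be loaded through the Hugging Face `transformers` library for feature extraction or fine-tuning on the downstream tasks listed above. The hub identifier used below is an assumption and may differ from the officially published checkpoint.

```python
# Minimal sketch: loading a monolingual Persian BERT such as ParsBERT with the
# Hugging Face `transformers` library and extracting contextual embeddings.
# The hub identifier below is an assumption, not confirmed by this record.
from transformers import AutoTokenizer, AutoModel

model_name = "HooshvareLab/bert-base-parsbert-uncased"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Tokenize a short Persian sentence and run it through the encoder.
inputs = tokenizer("زبان فارسی زیباست", return_tensors="pt")
outputs = model(**inputs)

# One contextual vector per token; the [CLS] vector is commonly used as input
# to downstream classifiers, e.g. for sentiment analysis or text classification.
print(outputs.last_hidden_state.shape)
```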
