Journal
HUMAN LANGUAGE TECHNOLOGIES - THE BALTIC PERSPECTIVE (HLT 2020)
Volume 328, Pages 111-115
Publisher
IOS PRESS
DOI: 10.3233/FAIA200610
Keywords
Transformers; BERT; language models; Latvian
Funding
- Latvian Council of Science, project Latvian Language Understanding and Generation in Human-Computer Interaction [lzp-2018/2-0216]
This paper presents LVBERT, the first publicly available monolingual language model pre-trained for Latvian. We show that LVBERT improves the state of the art on three Latvian NLP tasks: Part-of-Speech tagging, Named Entity Recognition, and Universal Dependency parsing. We release LVBERT to facilitate future research and downstream applications for Latvian NLP.