Proceedings Paper

LVBERT: Transformer-Based Model for Latvian Language Understanding

Publisher

IOS Press
DOI: 10.3233/FAIA200610

Keywords

Transformers; BERT; language models; Latvian

Funding

  1. Latvian Council of Science, project Latvian Language Understanding and Generation in Human-Computer Interaction [lzp-2018/2-0216]

Abstract

This paper presents LVBERT, the first publicly available monolingual language model pre-trained for Latvian. We show that LVBERT improves the state of the art on three Latvian NLP tasks: Part-of-Speech tagging, Named Entity Recognition, and Universal Dependency parsing. We release LVBERT to facilitate future research and downstream applications for Latvian NLP.
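
Because LVBERT is a standard BERT-style checkpoint, it can in principle be loaded with common transformer tooling. The sketch below is for illustration only and shows one way to set the model up for a token-level task such as POS tagging or NER using the Hugging Face transformers library; the model identifier "AiLab-IMCS-UL/lvbert", the label count, and the example sentence are assumptions, not details taken from the paper.

    # Minimal sketch: loading a Latvian BERT checkpoint for token classification
    # (POS tagging / NER). Model id and num_labels are assumptions; consult the
    # authors' release for the actual checkpoint name, format, and tagset.
    from transformers import AutoTokenizer, AutoModelForTokenClassification

    MODEL_ID = "AiLab-IMCS-UL/lvbert"  # assumed Hugging Face identifier

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # A token-classification head fits POS tagging or NER; the number of labels
    # depends on the chosen tagset (17 here is just a placeholder).
    model = AutoModelForTokenClassification.from_pretrained(MODEL_ID, num_labels=17)

    # Tokenize a Latvian sentence and run a forward pass.
    inputs = tokenizer("Rīga ir Latvijas galvaspilsēta.", return_tensors="pt")
    outputs = model(**inputs)
    predictions = outputs.logits.argmax(dim=-1)
    print(predictions)

For Universal Dependency parsing, the same encoder would instead be combined with a task-specific parsing head and fine-tuned on labelled Latvian treebank data.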
