Article

ITALIAN-LEGAL-BERT models for improving natural language processing tasks in the Italian legal domain

Journal

COMPUTER LAW & SECURITY REVIEW
Volume 52

Publisher

ELSEVIER ADVANCED TECHNOLOGY
DOI: 10.1016/j.clsr.2023.105908

Keywords

Italian Legal NLP; Legal AI; Pre-trained Language Models; Italian Legal BERT


Legal-BERT models, based on the BERT architecture, have been developed specifically for the legal domain and achieved state-of-the-art performance in complex legal tasks. This paper proposes four versions of Legal-BERT models pre-trained on the Italian legal domain, showing their superiority in several domain-specific tasks.
Legal-BERT models are based on the BERT architecture (or its variants) and have been developed specifically for the legal domain. They have reached the state of the art in complex legal tasks such as legal research, document synthesis, contract analysis, argument extraction, and legal prediction. In this paper, we propose four versions of Legal-BERT models pre-trained on the Italian legal domain, aimed at improving NLP applications in the Italian legal context. We show that they outperform the Italian general-purpose BERT in several domain-specific tasks, such as named entity recognition, sentence classification, semantic similarity with bi-encoders, and document classification.
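The semantic-similarity task mentioned in the abstract uses bi-encoders, which embed each sentence independently and compare the resulting vectors, typically by cosine similarity. A minimal sketch of the scoring step, with toy vectors standing in for the sentence embeddings a model like Italian Legal BERT would produce (the vectors and variable names below are illustrative, not taken from the paper):

```python
import numpy as np

def cosine_similarity(a, b):
    # Bi-encoders embed each sentence independently; similarity is
    # the cosine of the angle between the two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional embeddings standing in for real sentence vectors.
emb_query = np.array([0.1, 0.9, 0.2, 0.4])
emb_doc_a = np.array([0.1, 0.8, 0.3, 0.5])   # semantically close to the query
emb_doc_b = np.array([0.9, 0.1, 0.0, 0.1])   # unrelated to the query

print(cosine_similarity(emb_query, emb_doc_a))  # near 1.0
print(cosine_similarity(emb_query, emb_doc_b))  # much lower
```

Because each sentence is encoded once and reused, bi-encoders scale to large document collections: similarity between any pair reduces to this cheap vector operation rather than a full forward pass per pair.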


