Article

Towards zero-shot cross-lingual named entity disambiguation

Journal

EXPERT SYSTEMS WITH APPLICATIONS
Volume 184

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2021.115542

Keywords

Cross-lingual named entity disambiguation; Cross-lingual entity linking; Zero-shot learning; Transfer learning; Pre-trained language models; Low-resource languages

Funding

  1. Basque Government [IT1343-19]
  2. Project BigKnowledge (Ayudas Fundacion BBVA a equipos de investigacion cientifica 2018)
  3. IARPA BETTER Program [2019-19051600006]
  4. UPV/EHU [ESPDOC18/101]
  5. NVIDIA Corporation

In Cross-lingual Named Entity Disambiguation (XNED), the task is to link named entity mentions in text in some native language to English entities in a knowledge graph. XNED systems usually require training data for each native language, limiting their applicability to low-resource languages with small amounts of training data. Prior work has proposed so-called zero-shot transfer systems, which are trained only on English data but require native prior probabilities of entities with respect to mentions; these priors had to be estimated from native training examples, limiting the systems' practical interest. In this work we present a zero-shot XNED architecture where, instead of a single disambiguation model, we have a model for each possible mention string, thus eliminating the need for native prior probabilities. Our system improves over prior work on XNED datasets in Spanish and Chinese by 32 and 27 points, respectively, and matches systems that do require native prior information. We experiment with different multilingual transfer strategies, showing that better results are obtained with a purpose-built multilingual pre-training method than with state-of-the-art generic multilingual models such as XLM-R. We also find, surprisingly, that English is not necessarily the most effective zero-shot training language for XNED into English. For instance, Spanish is more effective when training a zero-shot XNED system that disambiguates Basque mentions with respect to an English knowledge graph.
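The core idea of the abstract's architecture can be pictured as one independent disambiguation model per mention string, each ranking only that mention's own candidate entities, so no corpus-wide prior probabilities are needed. The following is a minimal toy sketch of that idea; all names (the toy candidate table, the word-overlap scorer) are illustrative assumptions standing in for the paper's neural components, not the authors' implementation.

```python
# Toy "knowledge graph": mention string -> candidate English entities,
# each with a bag-of-words description (stand-in for entity representations).
CANDIDATES = {
    "Georgia": {
        "Georgia_(country)": {"country", "caucasus", "tbilisi"},
        "Georgia_(U.S._state)": {"state", "atlanta", "usa"},
    },
}

def score(context_words, description_words):
    # Simple word-overlap score standing in for a neural similarity model.
    return len(context_words & description_words)

class PerMentionDisambiguator:
    """One model per mention string: each model only ranks that mention's
    own candidates, so no native prior probabilities are required."""

    def __init__(self, candidates_for_mention):
        self.candidates = candidates_for_mention

    def disambiguate(self, context):
        context_words = set(context.lower().split())
        # Pick the candidate whose description best matches the context.
        return max(self.candidates,
                   key=lambda e: score(context_words, self.candidates[e]))

# Build one model per mention string, mirroring the per-mention architecture.
models = {m: PerMentionDisambiguator(c) for m, c in CANDIDATES.items()}

print(models["Georgia"].disambiguate("the capital tbilisi lies in the caucasus"))
# -> Georgia_(country)
```

In the paper's actual setting the overlap scorer would be replaced by a multilingually pre-trained encoder, which is what lets a model trained on one language's contexts disambiguate mentions in another.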
