Empirical Evaluation of Pretraining Strategies for Supervised Entity Linking

Thibault Févry, Nicholas FitzGerald, Livio Baldini Soares, Tom Kwiatkowski

Keywords: Entity linking, Pre-training, Wikification

Abstract: In this work, we present an entity linking model which combines a Transformer architecture with large-scale pretraining from Wikipedia links. Our model achieves state-of-the-art results on two commonly used entity linking datasets: 96.7% on CoNLL and 94.9% on TAC-KBP. We present detailed analyses to understand which design choices are important for entity linking, including the choice of negative entity candidates, the Transformer architecture, and input perturbations. Lastly, we present promising results in more challenging settings such as end-to-end entity linking and entity linking without in-domain training data.
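The abstract's core setup, scoring a mention against a set of candidate entities that includes negatives, can be illustrated with a minimal sketch. The function, embeddings, and candidate labels below are hypothetical simplifications for illustration, not the authors' implementation:

```python
import math

def score_candidates(mention_emb, candidate_embs):
    """Dot-product scores of one mention embedding against each
    candidate entity embedding, normalized with a softmax.
    A toy stand-in for a learned Transformer encoder's output."""
    logits = [sum(m * c for m, c in zip(mention_emb, cand))
              for cand in candidate_embs]
    peak = max(logits)  # subtract max for numerical stability
    exps = [math.exp(l - peak) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Toy example: the gold entity plus two negative candidates.
mention = [0.5, -0.2, 0.1]
candidates = [
    [0.6, -0.1, 0.2],   # gold entity
    [-0.4, 0.3, 0.0],   # random negative
    [0.5, 0.5, -0.5],   # harder negative
]
probs = score_candidates(mention, candidates)
```

In training, the model would be optimized so the gold candidate receives the highest probability; the paper's analyses concern, among other things, how such negative candidates are chosen.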

The talk and the respective paper are published at the AKBC 2020 virtual conference.


