04/07/2020

Negated and Misprimed Probes for Pretrained Language Models: Birds Can Talk, But Cannot Fly

Nora Kassner, Hinrich Schütze

Keywords: Pretrained Language Models, Probing Tasks, Negation

Abstract: Building on Petroni et al. (2019), we propose two new probing tasks analyzing factual knowledge stored in Pretrained Language Models (PLMs). (1) Negation. We find that PLMs do not distinguish between negated ("Birds cannot [MASK]") and non-negated ("Birds can [MASK]") cloze questions. (2) Mispriming. Inspired by priming methods in human psychology, we add "misprimes" to cloze questions ("Talk? Birds can [MASK]"). We find that PLMs are easily distracted by misprimes. These results suggest that PLMs still have a long way to go to adequately learn human-like factual knowledge.
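To make the probing setup concrete, below is a minimal sketch of the three kinds of cloze queries described in the abstract: the original, the negated, and the misprimed variant. This is not the authors' code; it assumes the HuggingFace transformers library and the bert-base-cased checkpoint, and simply inspects the top fill-mask predictions for each query.

```python
# Minimal sketch of negated and misprimed cloze probes (not the authors' code).
# Assumes the HuggingFace `transformers` library and the `bert-base-cased`
# checkpoint; any BERT-style masked language model would work similarly.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-cased")

queries = {
    "original":  "Birds can [MASK].",
    "negated":   "Birds cannot [MASK].",
    "misprimed": "Talk? Birds can [MASK].",
}

for name, text in queries.items():
    # Top-5 predictions for the masked token, ranked by model score.
    predictions = fill_mask(text, top_k=5)
    tokens = [p["token_str"] for p in predictions]
    print(f"{name:>10}: {tokens}")

# If the negated and non-negated queries return nearly identical top tokens,
# the model is ignoring the negation; if the misprimed query surfaces "talk",
# the model has been distracted by the prime.
```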

The talk video and the paper were presented at the ACL 2020 virtual conference.