Abstract
Pre-trained language models (LMs) may perpetuate biases originating in their training corpus to downstream models. We focus on artifacts associated with the representation of given names (e.g., Donald), which, depending on the corpus, may be associated with specific entities, as indicated by next-token prediction (e.g., Trump). While such grounding is helpful in some contexts, it also occurs in underspecified or inappropriate contexts. For example, endings generated for 'Donald is a' differ substantially from those generated for other names, and often carry more negative sentiment than average. We demonstrate the potential effect on downstream tasks with reading comprehension probes in which perturbing the name changes the model's answers. As a silver lining, our experiments suggest that additional pre-training on different corpora may mitigate this bias.
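As a rough illustration of the kind of probing the abstract describes (not the authors' released code), the sketch below compares next-token predictions of an off-the-shelf causal LM for prompts of the form "&lt;name&gt; is a". The model choice (GPT-2), the name list, and the prompt template are illustrative assumptions.

```python
# Hypothetical sketch, not the paper's code: compare next-token predictions
# of an off-the-shelf causal LM for name-based prompts such as "Donald is a".
# Model, names, and prompt template are assumptions made for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

names = ["Donald", "John", "Emma", "Barack"]  # illustrative given names
for name in names:
    prompt = f"{name} is a"
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(input_ids).logits  # shape: (1, seq_len, vocab_size)
    # Distribution over the token that would immediately follow the prompt.
    next_token_probs = torch.softmax(logits[0, -1], dim=-1)
    top = torch.topk(next_token_probs, k=5)
    continuations = [tokenizer.decode([i]).strip() for i in top.indices.tolist()]
    print(f"{prompt!r}: top-5 next tokens: {continuations}")
```

The snippet only shows the prompting step; the paper additionally analyzes the generated endings (e.g., for sentiment) and perturbs names in reading comprehension inputs to test whether the model's answers change.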
| Original language | English |
|---|---|
| Title of host publication | EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 6850-6861 |
| Number of pages | 12 |
| ISBN (Electronic) | 9781952148606 |
| DOIs | |
| State | Published - 2020 |
| Externally published | Yes |
| Event | 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020 - Virtual, Online; Duration: 16 Nov 2020 → 20 Nov 2020 |
Publication series

| Name | EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference |
|---|---|
Conference

| Conference | 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020 |
|---|---|
| City | Virtual, Online |
| Period | 16/11/20 → 20/11/20 |
Bibliographical note
Publisher Copyright: © 2020 Association for Computational Linguistics
Funding
This research was supported in part by NSF (IIS-1524371, IIS-1714566), DARPA under the CwC program through the ARO (W911NF-15-1-0543), and DARPA under the MCS program through NIWC Pacific (N66001-19-2-4031).
| Funders | Funder number |
|---|---|
| National Science Foundation | IIS-1524371, IIS-1714566 |
| Army Research Office | W911NF-15-1-0543 |
| Defense Advanced Research Projects Agency | |
| Naval Information Warfare Center Pacific | N66001-19-2-4031 |