Abstract
Despite their popularity in non-English NLP, multilingual language models often underperform monolingual ones due to inter-language competition for model parameters. We propose Cross-lingual Expert Language Models (X-ELM), which mitigate this competition by independently training language models on subsets of the multilingual corpus. This process specializes X-ELMs to different languages while remaining effective as a multilingual ensemble. Our experiments show that when given the same compute budget, X-ELM outperforms jointly trained multilingual models across all 16 considered languages and that these gains transfer to downstream tasks. X-ELM provides additional benefits beyond these performance improvements: new experts can be added iteratively, adapting X-ELM to new languages without catastrophic forgetting. Furthermore, training is asynchronous, reducing the hardware requirements for multilingual training and democratizing multilingual modeling.
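To make the ensembling idea concrete, below is a minimal sketch (not the authors' released code) of how independently trained expert LMs could be combined at inference time: each expert produces a next-token distribution, and the ensemble is a weighted mixture of those distributions. The HuggingFace-style model interface, the `ensemble_next_token_logprobs` helper, and the default uniform weighting are illustrative assumptions, not details taken from the paper.

```python
# Sketch: probability-level ensembling of expert language models.
# Assumes each expert is a causal LM exposing `.logits` on its output
# (e.g., a HuggingFace-style model) and that all experts share one vocabulary.
import torch
import torch.nn.functional as F


def ensemble_next_token_logprobs(experts, input_ids, weights=None):
    """Mix next-token distributions from several expert LMs.

    experts:   list of causal LMs sharing one tokenizer/vocabulary
    input_ids: LongTensor of shape (batch, seq_len)
    weights:   per-expert mixture weights summing to 1; uniform if None
    """
    if weights is None:
        # Assumption: uniform mixture; the paper explores weighting schemes.
        weights = [1.0 / len(experts)] * len(experts)

    mixed_probs = None
    for expert, w in zip(experts, weights):
        with torch.no_grad():
            # Logits for the token following the last position in the input.
            logits = expert(input_ids).logits[:, -1, :]
        probs = F.softmax(logits, dim=-1)
        mixed_probs = w * probs if mixed_probs is None else mixed_probs + w * probs

    # Log of the mixture distribution over the shared vocabulary.
    return mixed_probs.log()
```

Because each expert is trained independently, experts can be added or dropped from the `experts` list without retraining the others, which is what enables the iterative-adaptation and asynchronous-training properties the abstract describes.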
| Original language | English |
|---|---|
| Title of host publication | EMNLP 2024 - 2024 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference |
| Editors | Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 10822-10837 |
| Number of pages | 16 |
| ISBN (Electronic) | 9798891761643 |
| DOIs | |
| State | Published - 2024 |
| Externally published | Yes |
| Event | 2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024 - Hybrid, Miami, United States. Duration: 12 Nov 2024 → 16 Nov 2024 |
Publication series
| Name | EMNLP 2024 - 2024 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference |
|---|---|
Conference
| Conference | 2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024 |
|---|---|
| Country/Territory | United States |
| City | Hybrid, Miami |
| Period | 12/11/24 → 16/11/24 |
Bibliographical note
Publisher Copyright: © 2024 Association for Computational Linguistics.