Abstract
Context representations are central to various NLP tasks, such as word sense disambiguation, named entity recognition, co-reference resolution, and many more. In this work, we present a neural model for efficiently learning a generic context embedding function from large corpora, using a bidirectional LSTM. With a very simple application of our context representations, we manage to surpass or nearly reach state-of-the-art results on sentence completion, lexical substitution, and word sense disambiguation tasks, while substantially outperforming the popular context representation of averaged word embeddings. We release our code and pre-trained models, suggesting they could be useful in a wide variety of NLP tasks.
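To illustrate the idea described in the abstract, below is a minimal PyTorch sketch (not the authors' released code) of producing a context embedding for a target position with a bidirectional LSTM, alongside the averaged-word-embedding baseline the abstract compares against. The class name, dimensions, projection layer, and boundary handling are illustrative assumptions, not the paper's exact architecture.

```python
# Hypothetical sketch of a BiLSTM context encoder vs. an averaged-embedding baseline.
import torch
import torch.nn as nn

class BiLSTMContextEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # One LSTM reads the sentence left-to-right, another right-to-left.
        self.fwd_lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.bwd_lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Combine the two directional context summaries into one context vector.
        self.proj = nn.Linear(2 * hidden_dim, emb_dim)

    def forward(self, token_ids, target_pos):
        # token_ids: (1, seq_len); target_pos: index of the slot to represent.
        embs = self.embed(token_ids)
        fwd_out, _ = self.fwd_lstm(embs)                         # left-to-right states
        bwd_out, _ = self.bwd_lstm(torch.flip(embs, dims=[1]))   # right-to-left states
        bwd_out = torch.flip(bwd_out, dims=[1])
        seq_len = token_ids.size(1)
        # Summarize the left context (state at the word before the target) and the
        # right context (state at the word after it); zeros at sentence boundaries.
        left = fwd_out[:, target_pos - 1] if target_pos > 0 else torch.zeros_like(fwd_out[:, 0])
        right = bwd_out[:, target_pos + 1] if target_pos < seq_len - 1 else torch.zeros_like(bwd_out[:, 0])
        return self.proj(torch.cat([left, right], dim=-1))

    def averaged_context(self, token_ids, target_pos):
        # Baseline: average the embeddings of all context words around the target.
        embs = self.embed(token_ids).squeeze(0)
        mask = torch.ones(embs.size(0), dtype=torch.bool)
        mask[target_pos] = False
        return embs[mask].mean(dim=0, keepdim=True)

# Usage: embed the context around position 2 in a toy 5-token sentence.
encoder = BiLSTMContextEncoder(vocab_size=1000)
tokens = torch.tensor([[4, 17, 0, 256, 9]])  # 0 marks the target slot
context_vec = encoder(tokens, target_pos=2)
avg_vec = encoder.averaged_context(tokens, target_pos=2)
print(context_vec.shape, avg_vec.shape)  # torch.Size([1, 100]) torch.Size([1, 100])
```

The key contrast is that the BiLSTM summary is order- and position-aware, whereas the averaged baseline treats the context as a bag of words; both produce a vector in the word-embedding space that can be compared to candidate target words.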
| Original language | English |
|---|---|
| Title of host publication | CoNLL 2016 - 20th SIGNLL Conference on Computational Natural Language Learning, Proceedings |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 51-61 |
| Number of pages | 11 |
| ISBN (Electronic) | 9781945626197 |
| DOIs | |
| State | Published - 2016 |
| Event | 20th SIGNLL Conference on Computational Natural Language Learning, CoNLL 2016 - Berlin, Germany. Duration: 11 Aug 2016 → 12 Aug 2016 |
Publication series
| Name | CoNLL 2016 - 20th SIGNLL Conference on Computational Natural Language Learning, Proceedings |
|---|---|
Conference
| Conference | 20th SIGNLL Conference on Computational Natural Language Learning, CoNLL 2016 |
|---|---|
| Country/Territory | Germany |
| City | Berlin |
| Period | 11/08/16 → 12/08/16 |
Bibliographical note
Publisher Copyright: © 2016 Association for Computational Linguistics.