Context representations are central to many NLP tasks, such as word sense disambiguation, named entity recognition, and co-reference resolution. In this work we present a neural model for efficiently learning a generic context embedding function from large corpora, using a bidirectional LSTM. With a very simple application of our context representations, we surpass or nearly reach state-of-the-art results on sentence completion, lexical substitution, and word sense disambiguation tasks, while substantially outperforming the popular context representation of averaged word embeddings. We release our code and pre-trained models, which we expect to be useful in a wide variety of NLP tasks.
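To make the contrast in the abstract concrete, the sketch below is a minimal illustration (in PyTorch, with hypothetical names and layer sizes such as `BiLSTMContextEncoder`) of the two context representations being compared: a bidirectional-LSTM encoder that reads the left context forward and the right context backward around a target slot, versus the averaged-word-embedding baseline. It is a rough sketch under those assumptions, not the paper's released implementation, whose training objective and exact layers differ.

```python
# Minimal sketch (not the paper's code): a bidirectional-LSTM context
# encoder vs. the averaged-word-embedding baseline. All sizes and names
# here are hypothetical, chosen only for illustration.
import torch
import torch.nn as nn

VOCAB, EMB, HID = 10_000, 300, 300  # toy vocabulary and dimensions


class BiLSTMContextEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        # One LSTM reads the left context left-to-right, another reads
        # the right context right-to-left; their final states describe
        # the (empty) target slot between them.
        self.l2r = nn.LSTM(EMB, HID, batch_first=True)
        self.r2l = nn.LSTM(EMB, HID, batch_first=True)
        # A small MLP merges the two directions into one context vector.
        self.mlp = nn.Sequential(
            nn.Linear(2 * HID, HID), nn.ReLU(), nn.Linear(HID, EMB)
        )

    def forward(self, left_ids, right_ids):
        # left_ids:  (B, L) word ids preceding the target, in order
        # right_ids: (B, R) word ids following the target, reversed
        _, (h_left, _) = self.l2r(self.emb(left_ids))
        _, (h_right, _) = self.r2l(self.emb(right_ids))
        return self.mlp(torch.cat([h_left[-1], h_right[-1]], dim=-1))


def averaged_context(emb: nn.Embedding, context_ids: torch.Tensor) -> torch.Tensor:
    # Baseline: the mean of the context words' embeddings.
    return emb(context_ids).mean(dim=1)


enc = BiLSTMContextEncoder()
left = torch.randint(0, VOCAB, (2, 5))   # toy batch: 5 left-context words
right = torch.randint(0, VOCAB, (2, 4))  # 4 right-context words (reversed)
print(enc(left, right).shape)            # torch.Size([2, 300])
print(averaged_context(enc.emb, torch.cat([left, right], dim=1)).shape)
```

Unlike the averaging baseline, the LSTM-based encoder is order-sensitive and can weight nearby words differently from distant ones, which is one plausible reading of why it substantially outperforms averaged embeddings in the reported experiments.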
Title of host publication: CoNLL 2016 - 20th SIGNLL Conference on Computational Natural Language Learning, Proceedings
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 11
State: Published - 2016
Event: 20th SIGNLL Conference on Computational Natural Language Learning, CoNLL 2016, Berlin, Germany
Duration: 11 Aug 2016 → 12 Aug 2016
Bibliographical note (funding information):
We thank our anonymous reviewers for their useful comments. This work was partially supported by the Israel Science Foundation grant 880/12 and the German Research Foundation through the German-Israeli Project Cooperation (DIP, grant DA 1600/1-1).
© 2016 Association for Computational Linguistics.