Dependency-based word embeddings

Omer Levy, Yoav Goldberg

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

903 Scopus citations

Abstract

While continuous word embeddings are gaining popularity, current models are based solely on linear contexts. In this work, we generalize the skip-gram model with negative sampling introduced by Mikolov et al. to include arbitrary contexts. In particular, we perform experiments with dependency-based contexts, and show that they produce markedly different embeddings. The dependency-based embeddings are less topical and exhibit more functional similarity than the original skip-gram embeddings.
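The abstract's key move is to replace linear bag-of-words contexts with contexts drawn from a dependency parse: each word's contexts are its syntactic modifiers annotated with the relation label, plus its head annotated with the inverse label. The sketch below illustrates this pair extraction on a pre-parsed sentence; the function name, input format, and label notation are illustrative assumptions, and the paper's preposition-collapsing step is omitted for brevity.

```python
def dependency_contexts(tokens, heads, labels):
    """Generate (word, context) pairs from a dependency parse.

    tokens: list of word strings
    heads:  head index of each token (-1 for the root)
    labels: dependency label of each token (relation to its head)

    Each word gets its modifiers as contexts, marked with the relation
    label, and its head as a context, marked with the inverse relation.
    (Preposition collapsing from the paper is not implemented here.)
    """
    pairs = []
    for i, (head, label) in enumerate(zip(heads, labels)):
        if head < 0:  # skip the root token, which has no head
            continue
        # modifier sees its head through the inverse relation
        pairs.append((tokens[i], f"{tokens[head]}/{label}-1"))
        # head sees its modifier through the plain relation
        pairs.append((tokens[head], f"{tokens[i]}/{label}"))
    return pairs

# Parse of "Australian scientist discovers star":
#   Australian <-amod- scientist, scientist <-nsubj- discovers,
#   star <-dobj- discovers
toks = ["Australian", "scientist", "discovers", "star"]
heads = [1, 2, -1, 2]
labs = ["amod", "nsubj", "root", "dobj"]
pairs = dependency_contexts(toks, heads, labs)
```

The resulting pairs (e.g. `("discovers", "star/dobj")` and `("star", "discovers/dobj-1")`) can then be fed to any skip-gram trainer that accepts arbitrary (word, context) pairs instead of a sliding window.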

Original language: English
Title of host publication: Long Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 302-308
Number of pages: 7
ISBN (Print): 9781937284732
DOIs
State: Published - 2014
Event: 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014 - Baltimore, MD, United States
Duration: 22 Jun 2014 - 27 Jun 2014

Publication series

Name: 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014 - Proceedings of the Conference
Volume: 2

Conference

Conference: 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014
Country/Territory: United States
City: Baltimore, MD
Period: 22/06/14 - 27/06/14
