Feature vector quality and distributional similarity

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


We suggest a new goal and evaluation criterion for word similarity measures. The new criterion - meaning-entailing substitutability - fits the needs of semantic-oriented NLP applications and can be evaluated directly (independently of an application) at a good level of human agreement. Motivated by this semantic criterion, we analyze the empirical quality of distributional word feature vectors and their impact on word similarity results, proposing an objective measure for evaluating feature vector quality. Finally, a novel feature weighting and selection function is presented, which yields superior feature vectors and better word similarity performance.
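To make the setting concrete, the sketch below shows a common baseline for distributional similarity: each word is represented by a vector of co-occurring context features, weighted by positive pointwise mutual information (PMI), and words are compared by cosine similarity. This is an illustration of the general technique the abstract discusses, not the paper's own weighting or selection function; the toy counts are invented for the example.

```python
import math
from collections import Counter

# Toy co-occurrence counts (word -> context-feature counts).
# Purely illustrative; not data from the paper.
cooc = {
    "dog": Counter({"bark": 4, "pet": 3, "run": 2}),
    "cat": Counter({"meow": 4, "pet": 3, "run": 1}),
    "car": Counter({"drive": 5, "run": 1}),
}

total = sum(sum(c.values()) for c in cooc.values())
word_tot = {w: sum(c.values()) for w, c in cooc.items()}
feat_tot = Counter()
for c in cooc.values():
    feat_tot.update(c)

def pmi_vector(word):
    """Weight a word's context features by positive PMI."""
    vec = {}
    for f, n in cooc[word].items():
        pmi = math.log((n * total) / (word_tot[word] * feat_tot[f]))
        if pmi > 0:  # keep only positively associated features
            vec[f] = pmi
    return vec

def cosine(u, v):
    """Cosine similarity between two sparse feature vectors."""
    dot = sum(w * v.get(f, 0.0) for f, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

sim_dog_cat = cosine(pmi_vector("dog"), pmi_vector("cat"))
sim_dog_car = cosine(pmi_vector("dog"), pmi_vector("car"))
```

With these counts, "dog" and "cat" share the positively weighted feature "pet" and so come out more similar than "dog" and "car"; the paper's contribution concerns how the quality of such feature vectors, and the choice of weighting/selection function, affects results like this.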
Original language: American English
Title of host publication: The 20th International Conference on Computational Linguistics, COLING 2004
State: Published - 2004

Bibliographical note

Place of conference: Geneva, Switzerland


