Natural language as the basis for meaning representation and inference

I. Dagan, R. Bar-Haim, I. Szpektor, I. Greental, E. Shnarch

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

Semantic inference is an important component in many natural language understanding applications. Classical approaches to semantic inference rely on logical representations for meaning, which may be viewed as being “external” to the natural language itself. However, practical applications usually adopt shallower lexical or lexical-syntactic representations, which correspond closely to language structure. In many cases, such approaches lack a principled meaning representation and inference framework. We describe a generic semantic inference framework that operates directly on language-based structures, particularly syntactic trees. New trees are inferred by applying entailment rules, which provide a unified representation for varying types of inferences. Rules were generated by manual and automatic methods, covering generic linguistic structures as well as specific lexical-based inferences. Initial empirical evaluation in a Relation Extraction setting supports the validity and potential of our approach. Additionally, such inference is shown to improve the critical step of unsupervised learning of entailment rules, which in turn enhances the scope of the inference system.
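The abstract describes inference as rewriting syntactic trees by applying entailment rules. The following Python sketch illustrates that general idea under simplified assumptions: toy dependency trees, single-letter rule variables, and a hypothetical "X purchase Y → X own Y" rule. It is only a minimal illustration, not the framework or rule format described in the chapter.

```python
# Minimal illustrative sketch (not the authors' implementation): applying a
# lexical-syntactic entailment rule to a toy dependency tree. The node
# representation, rule format, and matching logic are simplified assumptions.

from dataclasses import dataclass, field


@dataclass
class Node:
    word: str                               # lexical item, or a variable like "X"
    children: list = field(default_factory=list)


def is_var(label: str) -> bool:
    """Rule variables are single uppercase letters, e.g. X, Y (an assumption)."""
    return len(label) == 1 and label.isupper()


def match(pattern: Node, tree: Node, bindings: dict) -> bool:
    """Match a rule's left-hand-side pattern against a tree,
    binding variables to whole subtrees."""
    if is_var(pattern.word):
        bindings[pattern.word] = tree
        return True
    if pattern.word != tree.word or len(pattern.children) != len(tree.children):
        return False
    return all(match(p, t, bindings)
               for p, t in zip(pattern.children, tree.children))


def instantiate(template: Node, bindings: dict) -> Node:
    """Build the entailed tree from the rule's right-hand side,
    substituting bound subtrees for variables."""
    if is_var(template.word):
        return bindings[template.word]
    return Node(template.word,
                [instantiate(c, bindings) for c in template.children])


# Hypothetical entailment rule: "X purchase Y" entails "X own Y"
lhs = Node("purchase", [Node("X"), Node("Y")])
rhs = Node("own", [Node("X"), Node("Y")])

# Source tree for "IBM purchased Cognos"
source = Node("purchase", [Node("IBM"), Node("Cognos")])

bindings: dict = {}
if match(lhs, source, bindings):
    entailed = instantiate(rhs, bindings)
    print(entailed.word, [c.word for c in entailed.children])  # own ['IBM', 'Cognos']
```

In this toy setting, the inferred tree is itself a language-based structure, so further rules (generic syntactic ones or lexical-based ones) could be applied to it in the same way.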
Original language: American English
Title of host publication: Computational Linguistics and Intelligent Text Processing
Editors: Alexander Gelbukh
Publisher: Springer Berlin Heidelberg
Pages: 151-170
ISBN (Print): 978-3-540-78135-6
State: Published - 2008

Publication series

Name: Lecture Notes in Computer Science
Volume: 4919
