Teach the Rules, Provide the Facts: Targeted Relational-knowledge Enhancement for Textual Inference

Ohad Rozen, Shmuel Amar, Vered Shwartz, Ido Dagan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

We present InferBert, a method to enhance transformer-based inference models with relevant relational knowledge. Our approach facilitates learning generic inference patterns that require relational knowledge (e.g. inferences related to hypernymy) during training, while injecting the relevant relational facts (e.g. pangolin is an animal) on demand at test time. We apply InferBert to the NLI task over a diverse set of inference types (hypernymy, location, color, and country of origin), for which we collected challenge datasets. In this setting, InferBert succeeds in learning general inference patterns from a relatively small number of training instances, while not hurting performance on the original NLI data and substantially outperforming prior knowledge-enhancement models on the challenge data. It further applies its inferences successfully at test time to previously unobserved entities. InferBert is also computationally more efficient than most prior methods, in terms of number of parameters, memory consumption, and training time.

Original language: English
Title of host publication: *SEM 2021 - 10th Conference on Lexical and Computational Semantics, Proceedings of the Conference
Editors: Lun-Wei Ku, Vivi Nastase, Ivan Vulic
Publisher: Association for Computational Linguistics (ACL)
Pages: 89-98
Number of pages: 10
ISBN (Electronic): 9781954085770
State: Published - 2021
Event: 10th Conference on Lexical and Computational Semantics, *SEM 2021 - Virtual, Bangkok, Thailand
Duration: 5 Aug 2021 - 6 Aug 2021

Publication series

Name: *SEM 2021 - 10th Conference on Lexical and Computational Semantics, Proceedings of the Conference

Conference

Conference: 10th Conference on Lexical and Computational Semantics, *SEM 2021
Country/Territory: Thailand
City: Virtual, Bangkok
Period: 5/08/21 - 6/08/21

Bibliographical note

Publisher Copyright:
© 2021 Lexical and Computational Semantics

Funding

The work described herein was supported in part by grants from Intel Labs, Facebook, the Israel Science Foundation grant 1951/17, the Israeli Ministry of Science and Technology and the German Research Foundation through the German-Israeli Project Cooperation (DIP, grant DA 1600/1-1).

Funders / Funder number:
- DIP: DA 1600/1-1
- German-Israeli Project Cooperation
- Intel Labs
- Deutsche Forschungsgemeinschaft
- Israel Science Foundation: 1951/17
- Ministry of Science and Technology, Israel
