Semantics-aware Attention Improves Neural Machine Translation

Aviv Slobodkin, Leshem Choshen, Omri Abend

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

7 Scopus citations

Abstract

The integration of syntactic structures into Transformer machine translation has shown positive results, but to our knowledge, no work has attempted to do so with semantic structures. In this work we propose two novel parameter-free methods for injecting semantic information into Transformers, both of which rely on semantics-aware masking of (some of) the attention heads. One method operates on the encoder, through a Scene-Aware Self-Attention (SASA) head; the other operates on the decoder, through a Scene-Aware Cross-Attention (SACrA) head. We show consistent improvements over the vanilla Transformer and over syntax-aware models across four language pairs. For some language pairs, we further show an additional gain when using both semantic and syntactic structures.
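The methods described in the abstract are parameter-free attention masks. As a rough illustration only (not the authors' released code), the sketch below shows one way a scene-based mask could be applied inside a single self-attention head, assuming each source token already carries a semantic scene id (e.g., from a UCCA-style parse). The names scene_mask and scene_aware_attention, and the hard "attend only within the same scene" rule, are assumptions made for illustration; the paper's SASA/SACrA heads may define the masking differently.

```python
import torch
import torch.nn.functional as F


def scene_mask(scene_ids: torch.Tensor) -> torch.Tensor:
    """Build a boolean mask (batch, len, len) that is True where the query
    and key tokens belong to the same semantic scene.

    scene_ids: (batch, len) integer scene label per token (hypothetical input
    produced by an external semantic parser).
    """
    return scene_ids.unsqueeze(2) == scene_ids.unsqueeze(1)


def scene_aware_attention(q, k, v, scene_ids):
    """Single attention head with a parameter-free, scene-aware mask.

    q, k, v: (batch, len, d); scene_ids: (batch, len).
    Positions outside the query token's scene are masked out before softmax.
    Each token always shares a scene with itself, so no row is fully masked.
    """
    d = q.size(-1)
    scores = (q @ k.transpose(-2, -1)) / d ** 0.5          # (batch, len, len)
    scores = scores.masked_fill(~scene_mask(scene_ids), float("-inf"))
    return F.softmax(scores, dim=-1) @ v                    # (batch, len, d)


# Toy usage: 1 sentence of 5 tokens split into two scenes [0, 0, 0, 1, 1].
q = k = v = torch.randn(1, 5, 8)
out = scene_aware_attention(q, k, v, torch.tensor([[0, 0, 0, 1, 1]]))
print(out.shape)  # torch.Size([1, 5, 8])
```

A cross-attention variant (in the spirit of SACrA) would apply an analogous mask between decoder queries and encoder keys; since the mask adds no learned weights, the head count and parameter budget of the Transformer are unchanged.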

Original language: English
Title of host publication: *SEM 2022 - 11th Joint Conference on Lexical and Computational Semantics, Proceedings of the Conference
Editors: Vivi Nastase, Ellie Pavlick, Mohammad Taher Pilehvar, Jose Camacho-Collados, Alessandro Raganato
Publisher: Association for Computational Linguistics (ACL)
Pages: 28-43
Number of pages: 16
ISBN (Electronic): 9781955917988
DOIs
State: Published - 2022
Externally published: Yes
Event: 11th Joint Conference on Lexical and Computational Semantics, *SEM 2022 - Seattle, United States
Duration: 14 Jul 2022 - 15 Jul 2022

Publication series

Name: *SEM 2022 - 11th Joint Conference on Lexical and Computational Semantics, Proceedings of the Conference

Conference

Conference: 11th Joint Conference on Lexical and Computational Semantics, *SEM 2022
Country/Territory: United States
City: Seattle
Period: 14/07/22 - 15/07/22

Bibliographical note

Publisher Copyright:
© 2022 Association for Computational Linguistics.

Funding

This work was supported in part by the Israel Science Foundation (grant no. 2424/21), and by the Applied Research in Academia Program of the Israel Innovation Authority.

Funders: Israel Innovation Authority; Israel Science Foundation (grant no. 2424/21)
