Abstract
Data-to-text generation can be conceptually divided into two parts: ordering and structuring the information (planning), and generating fluent language describing the information (realization). Modern neural generation systems conflate these two steps into a single end-to-end differentiable system. We propose to split the generation process into a symbolic text-planning stage that is faithful to the input, followed by a neural generation stage that focuses only on realization. For training a plan-to-text generator, we present a method for matching reference texts to their corresponding text plans. For inference, we describe a method for selecting high-quality text plans for new inputs. We implement and evaluate our approach on the WebNLG benchmark. Our results demonstrate that decoupling text planning from neural realization indeed improves the system's reliability and adequacy while maintaining fluent output. We observe improvements both in BLEU scores and in manual evaluations. Another benefit of our approach is the ability to output diverse realizations of the same input, paving the way to explicit control over the generated text structure.
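As a rough illustration of the two-stage architecture the abstract describes, the sketch below separates a symbolic planning step (ordering and grouping input triples into a text plan) from a realization step (verbalizing the plan). The function names, the triple format, and the template-based realizer stub are assumptions made for illustration only; in the paper the realizer is a neural model trained on matched (plan, reference) pairs, and plan selection at inference is more involved than shown here.

```python
# Minimal sketch of a decoupled plan-then-realize pipeline (illustrative,
# not the authors' actual implementation or API).
from typing import List, Tuple

Triple = Tuple[str, str, str]  # (subject, relation, object), WebNLG-style


def symbolic_planner(triples: List[Triple]) -> List[List[Triple]]:
    """Order and group input triples into a text plan.

    Placeholder heuristic: one sentence per triple, input order preserved.
    """
    return [[t] for t in triples]


def neural_realizer(plan: List[List[Triple]]) -> str:
    """Verbalize a text plan into text.

    Stubbed with a template; the paper uses a neural sequence-to-sequence
    model that only has to handle realization, not content ordering.
    """
    sentences = [
        f"{subj} {rel.replace('_', ' ')} {obj}."
        for group in plan
        for subj, rel, obj in group
    ]
    return " ".join(sentences)


if __name__ == "__main__":
    data = [("John_Doe", "birthPlace", "London"),
            ("John_Doe", "occupation", "engineer")]
    text_plan = symbolic_planner(data)   # planning: symbolic, faithful to input
    print(neural_realizer(text_plan))    # realization: fluency only
```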
Original language | English |
---|---|
Title of host publication | Long and Short Papers |
Publisher | Association for Computational Linguistics (ACL) |
Pages | 2267-2277 |
Number of pages | 11 |
ISBN (Electronic) | 9781950737130 |
State | Published - 2019 |
Event | 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL HLT 2019 - Minneapolis, United States. Duration: 2 Jun 2019 → 7 Jun 2019 |
Publication series

Name | NAACL HLT 2019 - 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies - Proceedings of the Conference |
---|---|
Volume | 1 |
Conference

Conference | 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL HLT 2019 |
---|---|
Country/Territory | United States |
City | Minneapolis |
Period | 2/06/19 → 7/06/19 |
Bibliographical note
Publisher Copyright: © 2019 Association for Computational Linguistics
Funding

This research was supported in part by the German Research Foundation through the German-Israeli Project Cooperation (DIP, grant DA 1600/1-1) and by a grant from Theo Hoffenberg and Reverso.

Funders | Funder number |
---|---|
DIP | DA 1600/1-1 |
German-Israeli Project Cooperation | |
Deutsche Forschungsgemeinschaft | |