Gosia Lazuka, Andreea Simona Anghel, et al.
SC 2024
Abstract Meaning Representations (AMRs) are broad-coverage sentence-level semantic graphs. Existing approaches to generating text from AMR have focused on training sequence-to-sequence or graph-to-sequence models only on AMR-annotated data. In this paper, we propose an alternative approach that combines a strong pre-trained language model with cycle-consistency-based re-scoring. Despite the simplicity of the approach, our experimental results show that these models outperform all previous techniques, including the recent use of transformer architectures. In addition to the standard evaluation metrics, we provide human evaluation experiments that further substantiate the strength of our approach.
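The abstract describes the method only at a high level; as an illustrative reading aid, the following is a minimal Python sketch of cycle-consistency re-scoring under stated assumptions. The generate, parse, and similarity callables stand in for a pre-trained AMR-to-text generator, a text-to-AMR parser, and an AMR similarity metric (e.g., Smatch); they are hypothetical placeholders, not components named by the paper.

from typing import Callable, List

def rescore_by_cycle_consistency(
    amr: str,
    generate: Callable[[str, int], List[str]],  # assumed: AMR -> n candidate sentences
    parse: Callable[[str], str],                # assumed: sentence -> AMR graph
    similarity: Callable[[str, str], float],    # assumed: AMR-vs-AMR score, e.g. Smatch
    n_candidates: int = 10,
) -> str:
    # Generate candidate realizations of the input graph, re-parse each one,
    # and keep the sentence whose re-parsed AMR best matches the original.
    candidates = generate(amr, n_candidates)
    return max(candidates, key=lambda sentence: similarity(amr, parse(sentence)))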
Robert Farrell, Rajarshi Das, et al.
AAAI-SS 2010
Ben Fei, Jinbai Liu
IEEE Transactions on Neural Networks
Michael Glass, Alfio Gliozzo, et al.
ACL 2020