Publication
NAACL 2021
Conference paper
AMR Parsing with Action-Pointer Transformer
Abstract
Abstract Meaning Representation parsing belongs to a category of sentence-to-graph prediction tasks where the target graph is not explicitly linked to the sentence tokens. However, nodes or subgraphs are semantically related to subsets of the sentence tokens, and locality between words and related nodes is often preserved. Transition-based approaches have recently shown great progress in capturing these inductive biases but still suffer from limited expressiveness. In this work, we propose a transition-based system that combines hard attention over sentences with a target-side action-pointer mechanism to decouple source tokens from node representations. We model the transitions as well as the pointer mechanism using a single Transformer model. Parser state and graph structure information is efficiently encoded using attention heads. We show that our approach leads to increased expressiveness while capitalizing on inductive biases, and attains new state-of-the-art Smatch scores on AMR 1.0 (78.5) and AMR 2.0 (81.8).
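To make the target-side action-pointer idea concrete, below is a minimal PyTorch sketch of a single decoder step: the model scores the next transition action and, in parallel, scores a pointer over previously generated actions (e.g., to pick the node an arc attaches to). This is not the authors' implementation; the class name, dimensions, and scoring functions are illustrative assumptions.

```python
# Illustrative sketch of a target-side action-pointer decoding step.
# Assumption: all module/variable names here are hypothetical, not from the paper's code.
import torch
import torch.nn as nn


class ActionPointerStep(nn.Module):
    """One decoder step: predict the next action and a pointer over
    past decoder positions (the action history) for arc attachment."""

    def __init__(self, d_model: int, num_actions: int):
        super().__init__()
        self.action_head = nn.Linear(d_model, num_actions)
        # Dot-product pointer: project the current state (query) and the
        # past decoder states (keys), then score each past position.
        self.pointer_query = nn.Linear(d_model, d_model)
        self.pointer_key = nn.Linear(d_model, d_model)

    def forward(self, state: torch.Tensor, history: torch.Tensor):
        # state:   (batch, d_model)      decoder state at the current step
        # history: (batch, t, d_model)   decoder states of the t past actions
        action_logits = self.action_head(state)            # (batch, num_actions)
        q = self.pointer_query(state).unsqueeze(1)         # (batch, 1, d_model)
        k = self.pointer_key(history)                      # (batch, t, d_model)
        pointer_logits = (q * k).sum(dim=-1)               # (batch, t)
        return action_logits, pointer_logits


# Usage: score actions and pointer targets for a batch of 2 at step t = 5.
step = ActionPointerStep(d_model=256, num_actions=100)
state = torch.randn(2, 256)
history = torch.randn(2, 5, 256)
action_logits, pointer_logits = step(state, history)
print(action_logits.shape, pointer_logits.shape)  # (2, 100) and (2, 5)
```

Because the pointer ranges over decoder positions rather than source tokens, node references are decoupled from the input sentence, which is the property the abstract credits for the added expressiveness.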