Shivashankar Subramanian, Ioana Baldini, et al.
IAAI 2020
Relation linking is a crucial component of Knowledge Base Question Answering systems. Existing systems use a wide variety of heuristics, or ensembles of multiple systems, relying heavily on the surface text of the question. However, the explicit semantic parse of the question is a rich source of relation information that remains largely unexploited. We propose a simple transformer-based neural model for relation linking that leverages the AMR semantic parse of a sentence. Our system significantly outperforms the state of the art on four popular benchmark datasets based on either DBpedia or Wikidata, demonstrating that our approach is effective across knowledge graphs.
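To make the idea concrete, the sketch below shows one way a transformer could consume a question together with its linearized AMR parse and score candidate KB relations. This is only an illustration of the general setup, not the authors' model: the model name, relation label set, and the hand-written AMR string are all placeholders.

```python
# Hypothetical sketch: encode a question and its linearized AMR parse as a
# sentence pair, then classify over a small set of candidate KB relations.
# The pretrained checkpoint, label set, and AMR string are illustrative only.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

RELATIONS = ["dbo:author", "dbo:birthPlace", "dbo:spouse"]  # placeholder labels

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(RELATIONS)
)

question = "Who wrote The Hobbit?"
# Linearized AMR for the question; in practice this would come from an AMR parser.
amr = '(w / write-01 :ARG0 (p / person :amr-unknown) :ARG1 (b / book :name "The Hobbit"))'

# Sentence-pair encoding lets the model attend jointly to surface text and parse.
inputs = tokenizer(question, amr, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(RELATIONS[logits.argmax(dim=-1).item()])  # arbitrary until fine-tuned
```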
Kevin Gu, Eva Tuecke, et al.
ICML 2024
Gabriele Picco, Lam Thanh Hoang, et al.
EMNLP 2021
Sara Rosenthal, Pepa Atanasova, et al.
ACL-IJCNLP 2021