Publication
NeurIPS 2020
Workshop paper
Entity and Relation Linking is All You Need for KGQA
Abstract
Ever since the advent of the Semantic Web, Knowledge Graph Question Answering (KGQA) has been a small but growing subfield within AI and NLP. The KGQA task consists of answering a natural language question using the facts present in a given RDF-based Knowledge Graph (KG) such as DBpedia. Entity/relation linking is a crucial sub-step in designing any KGQA solution. Many off-the-shelf entity/relation linkers are available, for example Falcon, but the majority of successful KGQA solutions avoid them because the precision and recall of these linkers are not very high. Instead, such solutions follow semantic-parsing style approaches: they use custom-designed linkers to translate the given question into an intermediate logical-form representation, which is then converted into SPARQL and executed on the KG to obtain the answer. In this work, we undertake a systematic study of whether it is possible to design an effective modular approach combining an off-the-shelf linker with seq2seq-style deep neural models. Our aim is to train a deep-net based model that can predict the SPARQL query, and hence the answer, for a given test question.
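The modular pipeline the abstract describes (off-the-shelf linker, then a seq2seq model that emits SPARQL) can be sketched as follows. This is an illustrative mock, not the paper's implementation: the function names (`link_entities_relations`, `build_seq2seq_input`, `predict_sparql`) are hypothetical, and both the linker and the trained seq2seq model are stubbed out for a single example question.

```python
# Hypothetical sketch of a modular KGQA pipeline: linker output is
# serialized alongside the question, and a seq2seq model predicts SPARQL.
# The linker and model below are mocks for one example; a real system
# would call a service like Falcon and a trained neural model.

def link_entities_relations(question: str):
    """Stand-in for an off-the-shelf linker such as Falcon:
    returns (entity URIs, relation URIs) mentioned in the question."""
    if "capital" in question and "Germany" in question:
        return (["dbr:Germany"], ["dbo:capital"])
    return ([], [])

def build_seq2seq_input(question: str, entities, relations) -> str:
    """Serialize question + linker output into one string, the kind of
    input a seq2seq model could consume to generate a SPARQL query."""
    return f"{question} [ENT] {' '.join(entities)} [REL] {' '.join(relations)}"

def predict_sparql(seq2seq_input: str) -> str:
    """Mock of the trained seq2seq model's output for the example below.
    In practice this would be a neural decoder's prediction."""
    return "SELECT ?x WHERE { dbr:Germany dbo:capital ?x }"

question = "What is the capital of Germany?"
ents, rels = link_entities_relations(question)
query = predict_sparql(build_seq2seq_input(question, ents, rels))
print(query)  # the predicted SPARQL would then be executed on the KG
```

The design point is the clean interface between the two stages: because the linker's output is simply concatenated into the model input, either component can be swapped independently.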