PAC Generalization via Invariant Representations
Advait Parulekar, Karthikeyan Shanmugam, et al.
ICML 2023
In recent years, a number of keyphrase generation (KPG) approaches have been proposed, consisting of complex model architectures, dedicated training paradigms, and decoding strategies. In this work, we opt for simplicity and show how a commonly used seq2seq language model, BART, can be easily adapted to generate keyphrases from text in a single batch computation using a simple training procedure. Empirical results on five benchmarks show that our approach is on par with existing state-of-the-art KPG systems while using a much simpler and easier-to-deploy framework.
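The abstract describes the approach only at a high level. As a rough illustration, the Python sketch below shows how a generic BART checkpoint from the Hugging Face transformers library might be fine-tuned to emit all keyphrases for a document as a single delimiter-joined target sequence and then decode them in one generate call. The checkpoint name, the ";" delimiter, and the helper functions are assumptions made for illustration, not the paper's actual configuration.

```python
# Illustrative sketch only: an assumed way to adapt a generic BART seq2seq model
# for keyphrase generation (all keyphrases as one delimiter-joined target string).
# This is not the authors' exact training recipe.
import torch
from transformers import BartTokenizer, BartForConditionalGeneration

MODEL_NAME = "facebook/bart-base"  # assumed checkpoint for the sketch
tokenizer = BartTokenizer.from_pretrained(MODEL_NAME)
model = BartForConditionalGeneration.from_pretrained(MODEL_NAME)

def train_step(document: str, keyphrases: list[str]) -> torch.Tensor:
    """One standard seq2seq training step: document -> ';'-joined keyphrases."""
    target = " ; ".join(keyphrases)  # assumed delimiter format
    inputs = tokenizer(document, return_tensors="pt",
                       truncation=True, max_length=512)
    labels = tokenizer(target, return_tensors="pt", truncation=True).input_ids
    # Standard cross-entropy loss over the concatenated keyphrase sequence.
    outputs = model(**inputs, labels=labels)
    return outputs.loss

def generate_keyphrases(document: str) -> list[str]:
    """Decode all keyphrases for a document in a single generate call."""
    inputs = tokenizer(document, return_tensors="pt",
                       truncation=True, max_length=512)
    ids = model.generate(**inputs, num_beams=4, max_length=64)
    decoded = tokenizer.decode(ids[0], skip_special_tokens=True)
    return [kp.strip() for kp in decoded.split(";") if kp.strip()]
```

In this framing, deployment reduces to a single fine-tuned seq2seq checkpoint plus a string split on the delimiter, which is the kind of simplicity the abstract emphasizes.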
Erik Altman, Jovan Blanusa, et al.
NeurIPS 2023
Trang H. Tran, Lam Nguyen, et al.
INFORMS 2022
Raúl Fernández Díaz, Lam Thanh Hoang, et al.
ACS Fall 2024