Publication
ICML 2022
Workshop paper
PGT: a prompt based generative transformer for the patent domain
Abstract
Patents are a valuable source of knowledge, but drafting them is a time-consuming and expensive task. Methods that assist patent generation can provide a two-fold improvement: they can speed up the drafting process and suggest ideas and claims to the inventor. Herein, influenced by recent advances in language modeling via multitask learning and prompt engineering, we present the Patent Generative Transformer (PGT), a transformer-based language model trained to facilitate patent drafting. Specifically, the model supports three tasks: part-of-patent generation, text infilling, and patent coherence evaluation. Taking advantage of its multitasking nature, PGT complements inventors and ensures a fast and successful transition from their input to a coherent patent disclosure. We show that the model outperforms a collection of task-specific baselines on relevant metrics. We further test the quality of the generated text via blind evaluation by subject matter experts. Finally, we explore a zero-shot extension of the model, showing how PGT can be used to generate domain-specific abstracts.
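To make the prompt-based multitask setup concrete, the sketch below illustrates how the three tasks named in the abstract could be driven through task-specific prompts against a single causal language model. It is a minimal illustration only: the checkpoint name ("gpt2"), the prompt templates, and the example patent text are placeholders introduced here, not the prompts, data, or checkpoint used by PGT.

```python
# Hypothetical sketch of prompt-based multitask generation with a generic
# causal LM from Hugging Face transformers. Prompt formats are illustrative
# placeholders, not PGT's actual training prompts.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "gpt2"  # placeholder checkpoint, not the PGT model

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a continuation for a task-specific prompt."""
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


# Illustrative prompts for the three tasks described in the abstract.
abstract_text = "A system for storing renewable energy using molten salt ..."

# 1) Part-of-patent generation: condition on one part, generate another.
part_of_patent_prompt = f"Abstract: {abstract_text}\nClaim 1:"

# 2) Text infilling: ask the model to complete a gap in a draft claim.
infilling_prompt = "Claim 1: A system comprising a thermal reservoir, [MASK], and a pump."

# 3) Coherence evaluation: frame the check as a prompted yes/no continuation.
coherence_prompt = f"Abstract: {abstract_text}\nClaim 1: A method for ...\nCoherent:"

print(generate(part_of_patent_prompt))
```

In such a setup, a single set of weights serves all three tasks and the prompt alone selects the behavior, which is the general idea behind the multitask, prompt-based design the abstract describes.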