Topological Data Analysis on Noisy Quantum Computers
Ismail Akhalwaya, Shashanka Ubaru, et al.
ICLR 2024
Recent work in natural language processing (NLP) has yielded appealing results from scaling model parameters and training data; however, using only scale to improve performance means that resource consumption also grows. Such resources include data, time, storage, and energy, all of which are naturally limited and unevenly distributed. This motivates research into efficient methods that require fewer resources to achieve similar results. This survey synthesizes and relates current methods and findings in efficient NLP. We aim both to provide guidance for conducting NLP under limited resources and to point toward promising research directions for developing more efficient methods.
Merve Unuvar, Yurdaer Doganata, et al.
CLOUD 2014
Salvatore Certo, Anh Pham, et al.
Quantum Machine Intelligence
Amy Lin, Sujit Roy, et al.
AGU 2024