Publication
NeurIPS 2023
Conference paper

On the role of entanglement and statistics in learning

Abstract

In this work we make progress in understanding the relationship between learning models when given access to entangled measurements, separable measurements, and statistical measurements in the quantum statistical query (QSQ) model. To this end, we show the following results.

\textbf{Entanglement versus separable measurements.} The goal here is to learn an unknown $f$ from the concept class $\mathcal{C} \subseteq \{f : \{0,1\}^n \rightarrow [k]\}$ given copies of $\frac{1}{\sqrt{2^n}} \sum_x \ket{x, f(x)}$. We show that, if $T$ copies suffice to learn $f$ using entangled measurements, then $O(nT^2)$ copies suffice to learn $f$ using just separable measurements. Additionally, we exhibit a concept class $\mathcal{C}$ for which, in order to learn some \emph{property} of $f$, the sample complexity of learning using entangled measurements is exponentially smaller than with separable measurements.

\textbf{Entangled versus statistical measurements.} The goal here is to learn a function $f \in \mathcal{C}$ given access to separable measurements and statistical measurements. We exhibit a concept class $\mathcal{C}$ based on degree-$2$ functions that gives an exponential separation between QSQ learning and quantum learning with entangled measurements (even in the presence of noise). This proves the ``quantum analogue'' of the seminal result of Blum et al.~\cite{blum2003noise} that separates classical SQ learning from classical PAC learning with classification noise.

\textbf{QSQ lower bounds for learning states.} The main technical contribution is to introduce a quantum statistical query dimension (QSDA), which we use to give lower bounds on the QSQ complexity of learning. Using this, we prove exponential QSQ lower bounds for testing purity of quantum states, learning CCHL states, coset states of Abelian groups, degree-$2$ functions, planted bi-clique states, and learning output states of Clifford circuits of depth $\mathrm{polylog}(n)$.

\textbf{Further applications.} Using our QSQ lower bounds, we give an \emph{unconditional} separation between weak and strong error mitigation and prove lower bounds for learning distributions in the QSQ model. Prior works by Quek et al.~\cite{quek2022exponentially}, Hinsche et al.~\cite{hinsche2022single}, and Nietner et al.~\cite{sweke23} proved the analogous results \emph{assuming} diagonal measurements, and our work removes this assumption.
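For concreteness, the sketch below (ours, not from the paper) illustrates the two access models in NumPy: a learner with sample access receives copies of the quantum example state $\frac{1}{\sqrt{2^n}} \sum_x \ket{x, f(x)}$, whereas a QSQ learner only receives estimates of expectation values $\mathrm{tr}(M \ket{\psi_f}\bra{\psi_f})$ up to a tolerance $\tau$. The names `quantum_example_state` and `qsq_oracle` are our own, and uniform noise stands in for what is, in the model, an adversarial error within $\pm\tau$.

```python
import numpy as np

def quantum_example_state(f, n, k):
    """Amplitude vector of |psi_f> = 2^{-n/2} * sum_x |x, f(x)>,
    laid out on the composite index (x, label)."""
    psi = np.zeros((2 ** n) * k)
    for x in range(2 ** n):
        psi[x * k + f(x)] = 1.0
    return psi / np.sqrt(2 ** n)

def qsq_oracle(psi, M, tau, rng):
    """One quantum statistical query: tr(M |psi><psi|) revealed only up to
    tolerance tau. (The QSQ answer may be adversarial within +/- tau;
    uniform noise is a stand-in.) M should be Hermitian with ||M|| <= 1."""
    exact = float(np.real(np.vdot(psi, M @ psi)))
    return exact + rng.uniform(-tau, tau)

# Toy run: parity f(x) = XOR of the bits of x, on n = 2 bits with k = 2 labels.
n, k = 2, 2
f = lambda x: bin(x).count("1") % 2
psi = quantum_example_state(f, n, k)

# Observable: Pauli Z on the label register, M = I_{2^n} (tensor) diag(1, -1).
M = np.kron(np.eye(2 ** n), np.diag([1.0, -1.0]))
rng = np.random.default_rng(0)
print(qsq_oracle(psi, M, tau=0.01, rng=rng))  # close to 0 for a balanced f
```

In this picture, an entangled measurement acts jointly on all $T$ copies $\ket{\psi_f}^{\otimes T}$, a separable measurement acts on one copy at a time, and a QSQ learner never holds a state at all, only tolerance-limited statistics; the abstract's separations are between these three levels of access.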
