Abstract
Quantum machine learning algorithms based on parameterized quantum circuits are promising candidates for near-term quantum advantage. Although these algorithms are compatible with the current generation of quantum processors, device noise limits their performance, for example by inducing an exponential flattening of loss landscapes. Error suppression schemes such as dynamical decoupling and Pauli twirling alleviate this issue by reducing noise at the hardware level. A recent addition to this toolbox of techniques is pulse-efficient transpilation, which reduces circuit schedule duration by exploiting the hardware-native cross-resonance interaction. In this work, we investigate the impact of pulse-efficient circuits on near-term algorithms for quantum machine learning. We report results for two standard experiments: binary classification on a synthetic dataset with quantum neural networks and handwritten digit recognition with quantum kernel estimation. In both cases, we find that pulse-efficient transpilation vastly reduces average circuit durations and, as a result, significantly improves classification accuracy. We conclude by applying pulse-efficient transpilation to the Hamiltonian Variational Ansatz and show that it delays the onset of noise-induced barren plateaus.