Publication
APS March Meeting 2020
Talk
A Hybrid Quantum-Classical Algorithm for Training Quantum Boltzmann Machines
Abstract
A Boltzmann Machine is a machine learning model based on a measure from statistical mechanics, the Boltzmann distribution. The same concepts can also be applied on quantum computers, which leads to Quantum Boltzmann Machines. These have the potential to outperform classical algorithms in a variety of learning problems and to enable classically intractable tasks, such as discriminative learning with quantum data and generative modeling of classically inaccessible structures. Several implementations have been suggested, but they all face practical difficulties. First, it is often challenging to generate the quantum Gibbs state representing the Boltzmann distribution. Second, the training performance is usually impaired because the problem formulation requires training with an upper bound on the loss rather than the actual loss function. In this work, we use variational quantum Gibbs state preparation to enable gate-based Quantum Boltzmann Machines that can be trained with the true loss function. Moreover, the algorithm relies on a hybrid quantum-classical optimization scheme and is thus compatible with near-term quantum computers. The applicability of this approach is demonstrated on illustrative examples.
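The core idea behind variational Gibbs state preparation is that the Gibbs (Boltzmann) state is the unique minimizer of the free energy F = ⟨H⟩ − T·S. The classical sketch below (an illustrative analog, not the quantum algorithm from the talk; the parametrization and optimizer are assumptions chosen for simplicity) minimizes the free energy of a parametrized two-outcome distribution by gradient descent and recovers the exact Boltzmann distribution e^(−E/T)/Z.

```python
import math

def free_energy(theta, energies, T):
    """Free energy F(theta) = <E> - T*S of a parametrized two-outcome distribution.

    The distribution is parametrized as p = (sigmoid(theta), 1 - sigmoid(theta));
    this choice is an illustrative assumption, standing in for a variational ansatz.
    """
    p = 1.0 / (1.0 + math.exp(-theta))
    probs = [p, 1.0 - p]
    avg_e = sum(pi * ei for pi, ei in zip(probs, energies))
    entropy = -sum(pi * math.log(pi) for pi in probs if pi > 0.0)
    return avg_e - T * entropy

def prepare_gibbs(energies, T, lr=0.5, steps=500, eps=1e-5):
    """Minimize the free energy by gradient descent (finite-difference gradient)."""
    theta = 0.0
    for _ in range(steps):
        grad = (free_energy(theta + eps, energies, T)
                - free_energy(theta - eps, energies, T)) / (2.0 * eps)
        theta -= lr * grad
    p = 1.0 / (1.0 + math.exp(-theta))
    return [p, 1.0 - p]

# Exact Boltzmann distribution for comparison.
energies, T = [0.0, 1.0], 0.5
Z = sum(math.exp(-e / T) for e in energies)
exact = [math.exp(-e / T) / Z for e in energies]
learned = prepare_gibbs(energies, T)
```

In the quantum setting the same minimization runs over the parameters of a quantum circuit, with ⟨H⟩ and the entropy estimated on the device and the parameter updates performed classically — the hybrid loop the abstract refers to.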