Variational Bayes (VB) is a critical method in machine learning and
statistics, underpinning the recent success of Bayesian deep learning. The
natural gradient is an essential component of efficient VB estimation, but it
is prohibitively expensive to compute in high dimensions. We propose a
hybrid quantum-classical algorithm to improve the scaling properties of natural
gradient computation and make VB a truly computationally efficient method for
Bayesian inference in high-dimensional settings. The algorithm leverages matrix
inversion from the linear systems algorithm by Harrow, Hassidim, and Lloyd
[Phys. Rev. Lett. 103, 150502 (2009)] (HHL). We demonstrate that the matrix to be
inverted is sparse and the classical-quantum-classical handoffs are
sufficiently economical to preserve computational efficiency, making
natural-gradient computation for VB an ideal application of HHL. We prove that,
under standard conditions, the VB algorithm with quantum natural gradient is
guaranteed to converge. Our regression-based natural gradient formulation is
also highly useful for classical VB.
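
As a purely classical illustration of the bottleneck described above, the sketch below computes a natural-gradient step by solving the Fisher linear system F g = ∇L directly. The function name `natural_gradient` and the `damping` parameter are illustrative choices, not part of the paper's method; the cubic cost of this dense solve is exactly what an HHL-style quantum linear-systems solver is intended to reduce when F is sparse.

```python
import numpy as np

def natural_gradient(fisher, grad, damping=1e-6):
    """Classical natural-gradient step: solve F g_nat = grad.

    The O(d^3) cost of this dense solve is the scaling bottleneck
    that a quantum linear-systems solver (HHL) targets for sparse F.
    `damping` is an illustrative regularizer, not from the paper.
    """
    d = fisher.shape[0]
    # Damping keeps the symmetric PSD Fisher matrix well conditioned.
    return np.linalg.solve(fisher + damping * np.eye(d), grad)

# Toy example: a random PSD "Fisher" matrix and a gradient vector.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
fisher = A @ A.T            # symmetric positive semi-definite
grad = rng.standard_normal(5)
print(natural_gradient(fisher, grad))
```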