Exponential Error Convergence in Data Classification with Optimized Random Features: Acceleration by Quantum Machine Learning
Random features are a central technique for scalable learning algorithms
based on kernel methods. Recent work has shown that a quantum machine learning
(QML) algorithm can exponentially speed up the sampling of optimized random
features, without imposing the restrictive sparsity and low-rankness
assumptions on matrices that had limited the applicability of conventional QML
algorithms; this QML algorithm makes it possible to significantly reduce, and
provably minimize, the number of features required for regression tasks.
However, a major question in the field of QML is how widely the advantages of
quantum computation can be exploited, beyond regression tasks. Here we
construct a QML algorithm for a classification task accelerated by optimized
random features. We prove that the QML algorithm for sampling optimized random
features, combined with stochastic gradient descent (SGD), achieves a
state-of-the-art exponential convergence rate in reducing the classification
error under a low-noise condition; at the same time, the optimized random
features significantly reduce the number of features required, accelerating
each SGD iteration and the evaluation of the resulting classifier. These
results reveal a promising application of QML to the significant acceleration
of a leading kernel-based classification algorithm, without sacrificing its
applicability to a practical class of data sets or its exponential
error-convergence rate.

Comment: 28 pages, no figures
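For orientation, here is a minimal classical sketch of the random-features-plus-SGD pipeline the abstract builds on: a Gaussian kernel is approximated by random Fourier features, and a linear classifier is trained on those features by SGD on the hinge loss. The quantum sampling of *optimized* features, which is the paper's actual contribution, is replaced here by plain i.i.d. sampling; all names and parameters (n_features, gamma, the toy data) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(X, n_features=200, gamma=1.0):
    """Map X to features phi(x) with phi(x).phi(y) ~ exp(-gamma ||x - y||^2).

    Plain i.i.d. sampling of frequencies; the paper instead samples an
    *optimized* feature distribution (via a QML subroutine) to provably
    minimize the number of features needed.
    """
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Toy two-class data (two Gaussian blobs), labels in {-1, +1}.
n = 400
X = np.vstack([rng.normal(-1, 1, (n // 2, 2)), rng.normal(+1, 1, (n // 2, 2))])
y = np.concatenate([-np.ones(n // 2), np.ones(n // 2)])

Phi = random_fourier_features(X)

# SGD on the hinge loss over the feature weights: one random sample per step.
w = np.zeros(Phi.shape[1])
lr = 0.1
for step in range(2000):
    i = rng.integers(n)
    margin = y[i] * (Phi[i] @ w)
    if margin < 1:                # subgradient of max(0, 1 - y * w.phi)
        w += lr * y[i] * Phi[i]

pred = np.sign(Phi @ w)
print("training error:", np.mean(pred != y))
```

The point of the sketch is the cost structure: each SGD step and each evaluation of the learned classifier scales with the number of features, which is why the reduction in required features claimed in the abstract directly accelerates both.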