Quantum machine learning: a classical perspective
Recently, increased computational power and data availability, as well as
algorithmic advances, have led machine learning techniques to impressive
results in regression, classification, data-generation and reinforcement
learning tasks. Despite these successes, the proximity to the physical limits
of chip fabrication alongside the increasing size of datasets are motivating a
growing number of researchers to explore the possibility of harnessing the
power of quantum computation to speed-up classical machine learning algorithms.
Here we review the literature in quantum machine learning and discuss
perspectives for a mixed readership of classical machine learning and quantum
computation experts. Particular emphasis will be placed on clarifying the
limitations of quantum algorithms, how they compare with their best classical
counterparts and why quantum resources are expected to provide advantages for
learning problems. Learning in the presence of noise and certain
computationally hard problems in machine learning are identified as promising
directions for the field. Practical questions, like how to upload classical
data into quantum form, will also be addressed.
Comment: v3, 33 pages; typos corrected and references added
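The "uploading classical data into quantum form" question raised above has a concrete minimal instance in amplitude encoding. The following plain-Python sketch (illustrative only, not taken from the paper) shows the classical preprocessing step: padding a vector to a power-of-two length and normalizing it so it can serve as the amplitude vector of an n-qubit state.

```python
import math

def amplitude_encode(x):
    """Prepare a classical vector for amplitude encoding.

    Amplitude encoding stores a length-2^n vector in the amplitudes of
    an n-qubit state, so the input is zero-padded to a power of two and
    rescaled to unit Euclidean norm (quantum states must be normalized).
    """
    n_qubits = max(1, math.ceil(math.log2(len(x))))
    padded = list(x) + [0.0] * (2 ** n_qubits - len(x))
    norm = math.sqrt(sum(v * v for v in padded))
    if norm == 0.0:
        raise ValueError("cannot encode the zero vector")
    return [v / norm for v in padded]

# [3, 4] becomes the 1-qubit state 0.6|0> + 0.8|1>.
print(amplitude_encode([3.0, 4.0]))
```

Note this sketch only covers the classical arithmetic; actually loading these amplitudes into a quantum register is itself a nontrivial step, which is precisely the practical question the abstract flags.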
Quantum Machine Learning For Classical Data
In this dissertation, we study the intersection of quantum computing and supervised machine learning algorithms, which means that we investigate quantum algorithms for supervised machine learning that operate on classical data. This area of research falls under the umbrella of quantum machine learning, a research area of computer science which has recently received wide attention. In particular, we investigate to what extent quantum computers can be used to accelerate supervised machine learning algorithms. The aim of this is to develop a clear understanding of the promises and limitations of the current state-of-the-art of quantum algorithms for supervised machine learning, but also to define directions for future research in this exciting field. We start by looking at supervised quantum machine learning (QML) algorithms through the lens of statistical learning theory. In this framework, we derive novel bounds on the computational complexities of a large set of supervised QML algorithms under the requirement of optimal learning rates. Next, we give a new bound for Hamiltonian simulation of dense Hamiltonians, a major subroutine of most known supervised QML algorithms, and then derive a classical algorithm with nearly the same complexity. We then draw the parallels to recent ‘quantum-inspired’ results, and explain the implications of these results for quantum machine learning applications. Looking for areas which might bear larger advantages for QML algorithms, we finally propose a novel algorithm for Quantum Boltzmann machines, and argue that quantum algorithms for quantum data are one of the most promising applications for QML, with potentially exponential advantage over classical approaches.
Towards provably efficient quantum algorithms for large-scale machine-learning models
Large machine learning models are revolutionary technologies of artificial
intelligence whose bottlenecks include huge computational expenses, power, and
time used both in the pre-training and fine-tuning process. In this work, we
show that fault-tolerant quantum computing could possibly provide provably
efficient resolutions for generic (stochastic) gradient descent algorithms,
scaling as O(T^2 × polylog(n)), where n is the size
of the models and T is the number of iterations in the training, as long as
the models are both sufficiently dissipative and sparse, with small learning
rates. Based on earlier efficient quantum algorithms for dissipative
differential equations, we find and prove that similar algorithms work for
(stochastic) gradient descent, the primary algorithm for machine learning. In
practice, we benchmark instances of large machine learning models from 7
million to 103 million parameters. We find that, in the context of sparse
training, a quantum enhancement is possible at the early stage of learning
after model pruning, motivating a sparse parameter download and re-upload
scheme. Our work shows solidly that fault-tolerant quantum algorithms could
potentially contribute to most state-of-the-art, large-scale machine-learning
problems.
Comment: 7+30 pages, 3+5 figures
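As a purely classical illustration of the sparse-training setting described above (this is not the paper's quantum algorithm, and the pruning rule and toy objective are made up for the example), the following sketch combines magnitude pruning with gradient-descent updates restricted to the surviving parameters:

```python
def prune(params, keep_fraction):
    """Magnitude pruning: zero out all but the largest-magnitude weights."""
    k = max(1, int(len(params) * keep_fraction))
    threshold = sorted((abs(p) for p in params), reverse=True)[k - 1]
    return [p if abs(p) >= threshold else 0.0 for p in params]

def sparse_sgd_step(params, grads, lr):
    """Gradient step that touches only the surviving (non-zero) weights."""
    return [p - lr * g if p != 0.0 else 0.0 for p, g in zip(params, grads)]

# Toy quadratic objective L(w) = 0.5 * sum((w_i - t_i)^2); gradient is w - t.
target = [1.0, -2.0, 0.1, 0.05]
w = prune([0.5, -1.0, 0.2, 0.3], keep_fraction=0.5)  # keep 2 of 4 weights
for _ in range(200):
    grads = [wi - ti for wi, ti in zip(w, target)]
    w = sparse_sgd_step(w, grads, lr=0.1)
print(w)  # pruned coordinates stay exactly zero throughout training
```

The point of the sketch is structural: after pruning, each iteration only reads and writes the sparse support, which mirrors the "sparse parameter download and re-upload" scheme the abstract motivates.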
UR-363 Quantum Machine Learning Applied to Cybersecurity
We propose the development of a system that uses the TensorFlow Quantum and PennyLane packages to apply quantum machine learning (QML) algorithms to various security and malicious data sets, and compares the performance with classical machine learning (CML) algorithms. Cybersecurity is one of the most important applications of QML. The project begins with research into quantum computing and machine learning, followed by the development of the system itself. The data sets for our modules include DDoS prevention, malware detection, user behavior anomaly detection, and spam email filtering. We provide detailed instructions for program implementation on our project website to make quantum programming more accessible and to encourage others to explore quantum algorithms.
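The variational-classifier idea underlying such QML pipelines can be illustrated with a self-contained sketch. This is plain Python simulating a single Ry-rotation qubit analytically, not actual TensorFlow Quantum or PennyLane code, and the data, labels, and hyperparameters are invented for the example:

```python
import math

def qubit_expectation(theta, x):
    """<Z> of the state Ry(theta + x)|0>, which equals cos(theta + x)."""
    return math.cos(theta + x)

def predict(theta, x):
    """Classify by the sign of the measured expectation value."""
    return 1 if qubit_expectation(theta, x) >= 0.0 else -1

# Toy labeled data: small rotation angles are "benign" (+1),
# angles near pi are "malicious" (-1).
data = [(0.1, 1), (0.4, 1), (2.8, -1), (3.0, -1)]

theta, lr = 0.0, 0.1
for _ in range(100):
    for x, y in data:
        # Squared loss (cos(theta + x) - y)^2; its gradient in theta
        # is -2 * (cos(theta + x) - y) * sin(theta + x).
        err = qubit_expectation(theta, x) - y
        theta -= lr * (-2.0 * err * math.sin(theta + x))

print([predict(theta, x) for x, y in data])
```

In a real QML experiment the analytic cosine would be replaced by repeated circuit executions and measurement averages on a device or simulator; the training loop over a circuit parameter, however, has the same shape.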