Revisiting Quantum Algorithms for Linear Regressions: Quadratic Speedups without Data-Dependent Parameters

Abstract

Linear regression is one of the most fundamental linear algebra problems. Given a dense matrix $A \in \mathbb{R}^{n \times d}$ and a vector $b$, the goal is to find $x'$ such that $\| Ax' - b \|_2^2 \leq (1+\epsilon) \min_{x} \| A x - b \|_2^2$. The best classical algorithm takes $O(nd) + \mathrm{poly}(d/\epsilon)$ time [Clarkson and Woodruff STOC 2013, Nelson and Nguyen FOCS 2013]. On the other hand, quantum linear regression algorithms can achieve exponential quantum speedups, as shown in [Wang Phys. Rev. A 96, 012335, Kerenidis and Prakash ITCS 2017, Chakraborty, Gily{\'e}n and Jeffery ICALP 2019]. However, the running times of these algorithms depend on quantum linear algebra-related parameters, such as $\kappa(A)$, the condition number of $A$. In this work, we develop a quantum algorithm that runs in $\widetilde{O}(\epsilon^{-1}\sqrt{n}d^{1.5}) + \mathrm{poly}(d/\epsilon)$ time. It provides a quadratic quantum speedup in $n$ over the classical lower bound without any dependence on data-dependent parameters. In addition, we show that our result generalizes to multiple regression and ridge linear regression.
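To make the $(1+\epsilon)$ approximation guarantee concrete, here is a minimal sketch of the classical sketch-and-solve baseline that the abstract's $O(nd) + \mathrm{poly}(d/\epsilon)$ bound refers to. This is not the paper's quantum algorithm; it uses a Gaussian sketch for simplicity (the cited Clarkson-Woodruff result uses CountSketch to achieve input-sparsity time), and the sketch size $m \approx 20\,d/\epsilon$ is an illustrative choice, not a tight constant.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, eps = 2000, 10, 0.5

# Dense overdetermined instance: A in R^{n x d}, target vector b in R^n.
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

# Exact least-squares solution and its optimal residual value.
x_opt, *_ = np.linalg.lstsq(A, b, rcond=None)
opt = np.linalg.norm(A @ x_opt - b) ** 2

# Gaussian sketch with m = O(d / eps) rows: compress the n x d problem
# to an m x d problem, then solve the small problem exactly.
m = int(20 * d / eps)
S = rng.standard_normal((m, n)) / np.sqrt(m)
x_sk, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
res = np.linalg.norm(A @ x_sk - b) ** 2

# With high probability the sketched solution x_sk satisfies the
# relative-error guarantee from the abstract:
print(res <= (1 + eps) * opt)
```

The expensive step is forming $SA$; with a CountSketch-style $S$ this takes time proportional to the number of nonzeros of $A$, which is where the classical $O(nd)$ term comes from for dense inputs.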
