Linear regression is one of the most fundamental linear algebra problems.
Given a dense matrix $A \in \mathbb{R}^{n \times d}$ and a vector $b \in \mathbb{R}^{n}$, the goal
is to find $x'$ such that
$\|Ax' - b\|_2^2 \le (1+\epsilon) \min_{x} \|Ax - b\|_2^2$. The best
classical algorithm takes $O(nd) + \mathrm{poly}(d/\epsilon)$ time [Clarkson
and Woodruff, STOC 2013; Nelson and Nguyen, FOCS 2013].
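The classical bound above comes from sketch-and-solve: compress $(A, b)$ with a sparse subspace embedding in input-sparsity time, then solve the compressed problem exactly. The following Python fragment is a minimal illustrative sketch of that idea using a CountSketch embedding; the function name, the sketch size $m$, and the constants are our own expository choices, not the cited algorithm's exact parameters.

```python
import numpy as np

def countsketch_solve(A, b, eps=0.5, seed=0):
    """Sketch-and-solve least squares (in the spirit of Clarkson-Woodruff).

    Hashes the n rows of (A, b) into m = O(d^2 / eps^2) buckets with
    random signs (CountSketch), which takes O(nd) time, then solves the
    small m x d problem exactly. With constant probability the result
    x' satisfies ||A x' - b||_2^2 <= (1 + eps) min_x ||A x - b||_2^2.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    m = max(int(np.ceil(d * d / eps ** 2)), d + 1)  # sketch size; constants omitted
    buckets = rng.integers(0, m, size=n)            # hash each row to a bucket
    signs = rng.choice([-1.0, 1.0], size=n)         # random +/-1 sign per row
    SA = np.zeros((m, d))
    Sb = np.zeros(m)
    np.add.at(SA, buckets, signs[:, None] * A)      # SA = S @ A in O(nd) time
    np.add.at(Sb, buckets, signs * b)               # Sb = S @ b in O(n) time
    x, *_ = np.linalg.lstsq(SA, Sb, rcond=None)     # exact solve on the sketch
    return x

# Sanity check: the sketched residual should be within (1 + eps) of optimal.
rng = np.random.default_rng(1)
A = rng.standard_normal((2000, 10))
b = rng.standard_normal(2000)
x_opt, *_ = np.linalg.lstsq(A, b, rcond=None)
x_sk = countsketch_solve(A, b, eps=0.5)
print(np.linalg.norm(A @ x_sk - b) / np.linalg.norm(A @ x_opt - b))
```

The printed residual ratio is typically close to 1, matching the $(1+\epsilon)$ guarantee.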
On the other hand, quantum linear regression algorithms can achieve exponential speedups,
as shown in [Wang, Phys. Rev. A 96, 012335 (2017); Kerenidis and Prakash, ITCS 2017;
Chakraborty, Gilyén, and Jeffery, ICALP 2019]. However, the running times of
these algorithms depend on data-dependent parameters arising from quantum linear algebra, such
as $\kappa(A)$, the condition number of $A$.
In this work, we develop a quantum algorithm that runs in $\widetilde{O}(\epsilon^{-1}\sqrt{n}\,d^{1.5}) + \mathrm{poly}(d/\epsilon)$ time. It provides a quadratic quantum speedup in $n$
over the classical lower bound, without any dependence on data-dependent
parameters. In addition, we show that our result generalizes to multiple
regression and ridge linear regression.
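For concreteness, these generalizations refer to the standard objectives below; the right-hand-side matrix $B \in \mathbb{R}^{n \times m}$ and the regularization parameter $\lambda > 0$ are the usual notation, assumed here rather than fixed by the text:
\[
\min_{X \in \mathbb{R}^{d \times m}} \|AX - B\|_F^2
\qquad\text{and}\qquad
\min_{x \in \mathbb{R}^{d}} \|Ax - b\|_2^2 + \lambda \|x\|_2^2 .
\]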