    Quantum-inspired low-rank stochastic regression with logarithmic dependence on the dimension

    We construct an efficient classical analogue of the quantum matrix inversion algorithm (HHL) for low-rank matrices. Inspired by recent work of Tang, assuming length-square sampling access to input data, we implement the pseudo-inverse of a low-rank matrix and sample from the solution to the problem Ax = b using fast sampling techniques. We implement the pseudo-inverse by finding an approximate singular value decomposition of A via subsampling, then inverting the singular values. In principle, the approach can also be used to apply any desired "smooth" function to the singular values. Since many quantum algorithms can be expressed as a singular value transformation problem, our result suggests that more low-rank quantum algorithms can be effectively "dequantised" into classical length-square sampling algorithms. Comment: 10 pages
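
    The length-square sampling access assumed above is straightforward to emulate classically when the data is held explicitly. A minimal NumPy sketch of such a sampler (function names are ours, for illustration only):

        import numpy as np

        def sample_row_index(A, rng):
            # Sample row i with probability ||A_i||^2 / ||A||_F^2
            # (length-square / l2-norm importance sampling over rows).
            weights = np.sum(np.abs(A) ** 2, axis=1)
            return rng.choice(A.shape[0], p=weights / weights.sum())

        def sample_entry_index(v, rng):
            # Sample index j with probability |v_j|^2 / ||v||^2.
            weights = np.abs(v) ** 2
            return rng.choice(v.shape[0], p=weights / weights.sum())

        rng = np.random.default_rng(0)
        A = rng.standard_normal((100, 50))
        i = sample_row_index(A, rng)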

    An improved quantum-inspired algorithm for linear regression

    We give a classical algorithm for linear regression analogous to the quantum matrix inversion algorithm [Harrow, Hassidim, and Lloyd, Physical Review Letters'09] for low-rank matrices [Wossnig et al., Physical Review Letters'18], when the input matrix $A$ is stored in a data structure applicable for QRAM-based state preparation. Namely, given an $A \in \mathbb{C}^{m\times n}$ with minimum singular value $\sigma$, which supports certain efficient $\ell_2$-norm importance sampling queries, along with a $b \in \mathbb{C}^m$, we can output a description of an $x \in \mathbb{C}^n$ such that $\|x - A^+b\| \leq \varepsilon\|A^+b\|$ in $\tilde{\mathcal{O}}\big(\frac{\|A\|_{\mathrm{F}}^6\|A\|^2}{\sigma^8\varepsilon^4}\big)$ time, improving on previous "quantum-inspired" algorithms in this line of research by a factor of $\frac{\|A\|^{14}}{\sigma^{14}\varepsilon^2}$ [Chia et al., STOC'20]. The algorithm is stochastic gradient descent, and the analysis bears similarities to those of optimization algorithms for regression in the usual setting [Gupta and Sidford, NeurIPS'18]. Unlike earlier works, this is a promising avenue that could lead to feasible implementations of classical regression in a quantum-inspired setting, for comparison against future quantum computers. Comment: 16 pages, bug fixes
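
    Since the algorithm described here is stochastic gradient descent with length-square row sampling, its basic iteration can be pictured with a randomized-Kaczmarz-style update. The sketch below (real-valued, for a consistent system Ax = b, and not the paper's exact algorithm or analysis) conveys the idea:

        import numpy as np

        def sgd_regression(A, b, num_iters=10_000, seed=0):
            # Illustrative solver for Ax = b: sample a row with probability
            # proportional to its squared norm (length-square sampling),
            # then project x onto the sampled row's solution hyperplane.
            rng = np.random.default_rng(seed)
            row_norms_sq = np.sum(A ** 2, axis=1)
            probs = row_norms_sq / row_norms_sq.sum()
            x = np.zeros(A.shape[1])
            for _ in range(num_iters):
                i = rng.choice(A.shape[0], p=probs)
                x += (b[i] - A[i] @ x) / row_norms_sq[i] * A[i]
            return x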

    Quantum machine learning: a classical perspective

    Recently, increased computational power and data availability, as well as algorithmic advances, have led machine learning techniques to impressive results in regression, classification, data-generation and reinforcement learning tasks. Despite these successes, the proximity to the physical limits of chip fabrication alongside the increasing size of datasets is motivating a growing number of researchers to explore the possibility of harnessing the power of quantum computation to speed up classical machine learning algorithms. Here we review the literature in quantum machine learning and discuss perspectives for a mixed readership of classical machine learning and quantum computation experts. Particular emphasis will be placed on clarifying the limitations of quantum algorithms, how they compare with their best classical counterparts and why quantum resources are expected to provide advantages for learning problems. Learning in the presence of noise and certain computationally hard problems in machine learning are identified as promising directions for the field. Practical questions, like how to upload classical data into quantum form, will also be addressed. Comment: v3 33 pages; typos corrected and references added

    Quantum-Inspired Algorithms for Solving Low-Rank Linear Equation Systems with Logarithmic Dependence on the Dimension

    We present two efficient classical analogues of the quantum matrix inversion algorithm [16] for low-rank matrices. Inspired by recent work of Tang [27], assuming length-square sampling access to input data, we implement the pseudo-inverse of a low-rank matrix, allowing us to sample from the solution to the problem Ax = b using fast sampling techniques. We construct implicit descriptions of the pseudo-inverse by finding an approximate singular value decomposition of A via subsampling, then inverting the singular values. In principle, our approaches can also be used to apply any desired “smooth” function to the singular values. Since many quantum algorithms can be expressed as a singular value transformation problem [15], our results indicate that more low-rank quantum algorithms can be effectively “dequantised” into classical length-square sampling algorithms.
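
    The subsampling step can be pictured as follows: sample and rescale rows of A so that the small matrix S satisfies S^T S ≈ A^T A, recover approximate right singular vectors of A from S, and then apply the desired function to the singular values (here, inversion). A rough NumPy sketch, with hypothetical parameters and none of the papers' error analysis:

        import numpy as np

        def approx_pinv_apply(A, b, num_rows=200, tol=1e-6, seed=0):
            # Sample rows of A by length-square probabilities and rescale
            # so that S^T S approximates A^T A in expectation.
            rng = np.random.default_rng(seed)
            row_norms_sq = np.sum(A ** 2, axis=1)
            probs = row_norms_sq / row_norms_sq.sum()
            idx = rng.choice(A.shape[0], size=num_rows, p=probs)
            S = A[idx] / np.sqrt(num_rows * probs[idx, None])
            # SVD of the small matrix yields approximate right singular
            # vectors V and singular values sigma of A.
            _, sigma, Vt = np.linalg.svd(S, full_matrices=False)
            keep = sigma > tol                  # drop negligible values
            V, sigma = Vt[keep].T, sigma[keep]
            # Invert the singular values: A^+ b ~= V diag(sigma^-2) V^T A^T b,
            # since A^T A ~= V diag(sigma^2) V^T.
            return V @ ((V.T @ (A.T @ b)) / sigma ** 2)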

    Quantum computing for finance

    Quantum computers are expected to surpass the computational capabilities of classical computers and have a transformative impact on numerous industry sectors. We present a comprehensive summary of the state of the art of quantum computing for financial applications, with particular emphasis on stochastic modeling, optimization, and machine learning. This Review is aimed at physicists, so it outlines the classical techniques used by the financial industry and discusses the potential advantages and limitations of quantum techniques. Finally, we look at the challenges that physicists could help tackle.

    Quantum-Inspired Support Vector Machine

    Support vector machine (SVM) is a particularly powerful and flexible supervised learning model that analyzes data for both classification and regression, whose usual algorithm complexity scales polynomially with the dimension of the data space and the number of data points. To tackle the big-data challenge, a quantum SVM algorithm was proposed, which is claimed to achieve an exponential speedup for least-squares SVM (LS-SVM). Here, inspired by the quantum SVM algorithm, we present a quantum-inspired classical algorithm for LS-SVM. In our approach, an improved fast sampling technique, namely indirect sampling, is proposed for sampling the kernel matrix and performing classification. We first consider the LS-SVM with a linear kernel, and then discuss the generalization of our method to non-linear kernels. Theoretical analysis shows that our algorithm can perform classification with arbitrary success probability in runtime logarithmic in both the dimension of the data space and the number of data points, for low-rank, low-condition-number, high-dimensional data matrices, matching the runtime of the quantum SVM.
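
    For reference, the LS-SVM problem that the indirect-sampling algorithm approximates reduces, in the linear-kernel case, to a single linear system in the dual variables. A direct (non-sampling) NumPy sketch of that baseline, with illustrative names and regularization parameter:

        import numpy as np

        def ls_svm_fit(X, y, gamma=1.0):
            # Solve the standard LS-SVM dual system for a linear kernel:
            # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y],  K = X X^T.
            m = X.shape[0]
            K = X @ X.T
            top = np.concatenate(([0.0], np.ones(m)))
            bottom = np.hstack((np.ones((m, 1)), K + np.eye(m) / gamma))
            sol = np.linalg.solve(np.vstack((top, bottom)),
                                  np.concatenate(([0.0], y)))
            return sol[1:], sol[0]          # alpha, bias

        def ls_svm_predict(X, alpha, bias, x_new):
            # Decision rule: sign(sum_i alpha_i <x_i, x_new> + bias).
            return np.sign(alpha @ (X @ x_new) + bias)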