
    Quantum query complexity of entropy estimation

    Estimation of Shannon and Rényi entropies of unknown discrete distributions is a fundamental problem in statistical property testing and an active research topic in both theoretical computer science and information theory. Tight bounds on the number of samples needed to estimate these entropies have been established in the classical setting, while little is known about their quantum counterparts. In this paper, we give the first quantum algorithms for estimating α-Rényi entropies (Shannon entropy being the 1-Rényi entropy). In particular, we demonstrate a quadratic quantum speedup for Shannon entropy estimation and a generic quantum speedup for α-Rényi entropy estimation for all α ≥ 0, including a tight bound for the collision entropy (2-Rényi entropy). We also provide quantum upper bounds for extreme cases such as the Hartley entropy (i.e., the logarithm of the support size of a distribution, corresponding to α = 0) and the min-entropy case (i.e., α = +∞), as well as for the Kullback-Leibler divergence between two distributions. Moreover, we complement our results with quantum lower bounds on α-Rényi entropy estimation for all α ≥ 0. Comment: 43 pages, 1 figure.
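
    For reference, the Rényi entropies the abstract refers to have the standard textbook definition below; this is general background, not material quoted from the paper:

        % Rényi entropy of order \alpha of p = (p_1, \dots, p_N), for \alpha \ge 0, \alpha \ne 1:
        H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_{i=1}^{N} p_i^{\alpha}
        % The extreme cases named above arise as special values or limits:
        \lim_{\alpha \to 1} H_\alpha(p) = -\sum_{i} p_i \log p_i        % Shannon entropy
        H_0(p) = \log \lvert \{ i : p_i > 0 \} \rvert                   % Hartley entropy
        \lim_{\alpha \to \infty} H_\alpha(p) = -\log \max_{i} p_i       % min-entropy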

    Distributional Property Testing in a Quantum World

    A fundamental problem in statistics and learning theory is to test properties of distributions. We show that quantum computers can solve such problems with significant speed-ups. We also introduce a novel access model for quantum distributions, enabling the coherent preparation of quantum samples, and propose a general framework that can naturally handle both classical and quantum distributions in a unified manner. Our framework generalizes and improves previous quantum algorithms for testing closeness between unknown distributions, testing independence between two distributions, and estimating the Shannon / von Neumann entropy of distributions. For classical distributions our algorithms significantly improve the precision dependence of some earlier results. We also show that in our framework procedures for classical distributions can be directly lifted to the more general case of quantum distributions, and thus obtain the first speed-ups for testing properties of density operators that can be accessed coherently rather than only via sampling.
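
    For context on what the entropy-estimation speed-ups above are measured against, here is a minimal sketch of the classical plug-in (empirical) Shannon entropy estimator. It is illustrative background only, not the paper's quantum algorithm, and the function name is ours:

        import math
        from collections import Counter

        def plugin_shannon_entropy(samples):
            """Empirical plug-in estimate of Shannon entropy, in nats."""
            counts = Counter(samples)
            n = len(samples)
            # H(p_hat) = -sum_i q_i * ln q_i over the empirical frequencies q_i.
            return -sum((c / n) * math.log(c / n) for c in counts.values())

        # Example: samples from a 75/25 biased coin; true entropy ≈ 0.562 nats.
        print(plugin_shannon_entropy(["a"] * 75 + ["b"] * 25))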

    Quantum and Classical Tradeoffs

    We propose an approach for quantifying a quantum circuit's quantumness as a means to understand the nature of quantum algorithmic speedups. Since quantum gates that do not preserve the computational basis are necessary for achieving quantum speedups, it appears natural to define the quantumness of a quantum circuit as the number of such gates. Intuitively, a reduction in quantumness requires an increase in the amount of classical computation, hence giving a "quantum and classical tradeoff". In this paper we present two results in this direction. The first gives an asymptotic answer to the question: "what is the minimum number of non-basis-preserving gates required to generate a good approximation to a given state?" This question is the quantum analogue of the classical question "how many fair coins are needed to generate a given probability distribution?", which was studied and resolved by Knuth and Yao in 1976. Our second result shows that any quantum algorithm that solves Grover's problem of size n using k queries and l levels of non-basis-preserving gates must have k · l = Ω(n).
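
    The Knuth-Yao question cited above — generating a target distribution from fair coin flips — can be illustrated with a short interval-refinement sampler. This is a simplified sketch in the spirit of their construction (the expected number of flips is close to the entropy of the distribution), not the paper's algorithm; all names are ours:

        import random
        from itertools import accumulate

        def fair_coin_sample(probs):
            """Sample index i with probability probs[i] using only fair coin flips.

            Refines a uniform point in [0, 1) one bit at a time and stops as soon
            as the current dyadic interval fits inside one cell of the CDF partition.
            """
            cdf = [0.0] + list(accumulate(probs))
            low, width = 0.0, 1.0
            while True:
                bit = random.getrandbits(1)   # one fair coin flip
                width /= 2
                low += bit * width
                high = low + width
                # If [low, high) sits inside a single CDF cell, output that cell.
                for i in range(len(probs)):
                    if cdf[i] <= low and high <= cdf[i + 1]:
                        return i

        # Example: empirical frequencies should approach the 60/30/10 target.
        counts = [0, 0, 0]
        for _ in range(10000):
            counts[fair_coin_sample([0.6, 0.3, 0.1])] += 1
        print(counts)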

    Quantum algorithms for testing properties of distributions

    Suppose one has access to oracles generating samples from two unknown probability distributions P and Q on some N-element set. How many samples does one need to test whether the two distributions are close or far from each other in the L_1-norm? This and related questions have been extensively studied in recent years in the field of property testing. In the present paper we study quantum algorithms for testing properties of distributions. It is shown that the L_1-distance between P and Q can be estimated to constant precision using approximately N^{1/2} queries in the quantum setting, whereas classical computers need Ω(N) queries. We also describe quantum algorithms for testing Uniformity and Orthogonality with query complexity O(N^{1/3}). The classical query complexity of these problems is known to be Ω(N^{1/2}). Comment: 20 pages.
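
    For contrast with the quantum testers above, the standard classical uniformity test (which needs on the order of N^{1/2} samples) counts pairwise collisions: the uniform distribution minimizes the expected collision rate at exactly 1/N, so a markedly higher rate witnesses non-uniformity. A rough sketch with an illustrative acceptance threshold; names and constants are ours:

        import random
        from itertools import combinations

        def collision_uniformity_test(samples, domain_size, slack=2.0):
            """Accept 'uniform' iff the empirical collision rate is below slack/N."""
            pairs = list(combinations(samples, 2))
            rate = sum(a == b for a, b in pairs) / len(pairs)
            return rate <= slack / domain_size

        N = 1000
        uniform_samples = [random.randrange(N) for _ in range(200)]
        skewed_samples = [random.randrange(10) for _ in range(200)]  # support of size 10
        print(collision_uniformity_test(uniform_samples, N))  # likely True (rate ≈ 1/N)
        print(collision_uniformity_test(skewed_samples, N))   # likely False (rate ≈ 1/10)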

    Exponential Quantum Speed-ups are Generic

    A central problem in quantum computation is to understand which quantum circuits are useful for exponential speed-ups over classical computation. We address this question in the setting of query complexity and show that for almost any sufficiently long quantum circuit one can construct a black-box problem which is solved by the circuit with a constant number of quantum queries, but which requires exponentially many classical queries, even if the classical machine has the ability to postselect. We prove the result in two steps. In the first, we show that almost any element of an approximate unitary 3-design is useful for solving a certain black-box problem efficiently. The problem is based on a recent oracle construction of Aaronson and gives an exponential separation between bounded-error quantum query complexity and classical bounded-error query complexity with postselection. In the second step, which may be of independent interest, we prove that linear-sized random quantum circuits give an approximate unitary 3-design. The key ingredient in the proof is a technique from quantum many-body theory to lower-bound the spectral gap of local quantum Hamiltonians. Comment: 24 pages; v2: minor corrections.
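
    The approximate unitary 3-design used in the second step has a standard definition via moment (twirling) channels; the following is general background rather than a quotation from the paper:

        % A measure \nu on U(d) is an \epsilon-approximate unitary t-design if its
        % t-copy twirling channel is \epsilon-close to the Haar twirl, e.g. in diamond norm:
        \Lambda_t^{\nu}(\rho) = \mathbb{E}_{U \sim \nu}\!\left[ U^{\otimes t} \rho \, (U^{\dagger})^{\otimes t} \right],
        \qquad \bigl\| \Lambda_t^{\nu} - \Lambda_t^{\mathrm{Haar}} \bigr\|_{\diamond} \le \epsilon .
        % An exact design has \epsilon = 0; the result above concerns t = 3.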

    New summing algorithm using ensemble computing

    We propose an ensemble algorithm, which provides a new approach for evaluating and summing up a set of function samples. The proposed algorithm is not a quantum algorithm, insofar as it does not involve quantum entanglement. The query complexity of the algorithm depends only on the scaling of the measurement sensitivity with the number of distinct spin sub-ensembles. From a practical point of view, the proposed algorithm may result in an exponential speedup compared to known quantum and classical summing algorithms. However, in general this advantage exists only if the total number of function samples is below a threshold value which depends on the measurement sensitivity. Comment: 13 pages, 0 figures, VIth International Conference on Quantum Communication, Measurement and Computing (Boston, 2002).