6,548 research outputs found

    Pseudorandomness and the Minimum Circuit Size Problem

    The Road to Quantum Computational Supremacy

    We present an idiosyncratic view of the race for quantum computational supremacy. Google's approach and IBM's challenge are examined. An unexpected side-effect of the race is the significant progress in designing fast classical algorithms. Quantum supremacy, if achieved, won't make classical computing obsolete. Comment: 15 pages, 1 figure

    Direct certification of a class of quantum simulations

    One of the main challenges in the field of quantum simulation and computation is to identify ways to certify the correct functioning of a device when an efficient classical simulation is not available. Important cases are situations in which one cannot classically calculate local expectation values of state preparations efficiently. In this work, we develop weak-membership formulations of the certification of ground state preparations. We provide a non-interactive protocol for certifying ground states of frustration-free Hamiltonians based on simple energy measurements of local Hamiltonian terms. This certification protocol can be applied to classically intractable analog quantum simulations: for example, using Feynman-Kitaev Hamiltonians, one can encode universal quantum computation in such ground states. Moreover, our certification protocol is applicable to ground-state encodings of IQP circuits for demonstrations of quantum supremacy. These can be certified efficiently when the error is polynomially bounded. Comment: 10 pages, corrected a small error in Eqs. (2) and (5)
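
    The core idea of an energy-based test can be pictured with a short sketch. This is a minimal illustration under stated assumptions, not the paper's construction: it assumes a frustration-free Hamiltonian $H = \sum_i h_i$ with ground energy zero, models each local-term measurement as a noisy estimate, and accepts when the total estimated energy falls below a threshold. The function names, shot-noise model, and threshold rule are all illustrative.

```python
import numpy as np

# Illustrative energy-test certifier for a frustration-free Hamiltonian
# H = sum_i h_i with ground energy 0. The names, the shot-noise model,
# and the acceptance threshold are assumptions of this sketch.

def estimate_local_energy(state, h_term, shots=1000, rng=None):
    """Noisy estimate of <state|h_term|state> from a finite number of shots."""
    rng = rng if rng is not None else np.random.default_rng()
    exact = np.real(np.vdot(state, h_term @ state))
    return exact + rng.normal(0.0, 1.0 / np.sqrt(shots))  # crude shot-noise model

def certify_ground_state(state, local_terms, epsilon, shots=1000):
    """Accept iff the estimated total energy is below epsilon / 2."""
    total = sum(estimate_local_energy(state, h, shots) for h in local_terms)
    return total < epsilon / 2

# Toy usage: a 1-qubit "Hamiltonian" |1><1| whose ground state is |0>.
h = np.array([[0.0, 0.0], [0.0, 1.0]])
print(certify_ground_state(np.array([1.0, 0.0]), [h], epsilon=0.1, shots=10_000))
```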

    The Limits of Post-Selection Generalization

    While statistics and machine learning offer numerous methods for ensuring generalization, these methods often fail in the presence of adaptivity: the common practice in which the choice of analysis depends on previous interactions with the same dataset. A recent line of work has introduced powerful, general-purpose algorithms that ensure post hoc generalization (also called robust or post-selection generalization), which says that, given the output of the algorithm, it is hard to find any statistic for which the data differs significantly from the population it came from. In this work we show several limitations on the power of algorithms satisfying post hoc generalization. First, we show a tight lower bound on the error of any algorithm that satisfies post hoc generalization and answers adaptively chosen statistical queries, showing a strong barrier to progress in post-selection data analysis. Second, we show that post hoc generalization is not closed under composition, despite many examples of such algorithms exhibiting strong composition properties.
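
    To make the adaptive statistical-query setting of the first result concrete, here is a minimal sketch of how such an interaction looks. The dataset, the Gaussian-noise answering rule (one standard mechanism known to satisfy post hoc generalization), and the toy analyst are assumptions of this illustration, not the paper's constructions.

```python
import numpy as np

# Toy adaptive statistical-query interaction. The mechanism answers each
# query with the empirical mean plus Gaussian noise; the analyst chooses
# the next query based on earlier answers. All parameters are illustrative.

rng = np.random.default_rng(0)
data = rng.binomial(1, 0.5, size=(1000, 20))  # n = 1000 samples of 20 bits

def answer_query(query, sigma=0.05):
    """Noisy empirical mean of a [0, 1]-valued query over the dataset."""
    return float(np.mean([query(x) for x in data])) + rng.normal(0.0, sigma)

# Adaptivity: each round's query is built from the previous (data-dependent) answer.
ans = answer_query(lambda x: x[0])
for j in range(1, 5):
    t = ans  # the analyst reuses the last answer to pick the next query
    ans = answer_query(lambda x, j=j, t=t: float(x[j] >= t))
print(ans)
```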

    An Atypical Survey of Typical-Case Heuristic Algorithms

    Heuristic approaches often do so well that they seem to pretty much always give the right answer. How close can heuristic algorithms get to always giving the right answer, without inducing seismic complexity-theoretic consequences? This article first discusses how a series of results by Berman, Buhrman, Hartmanis, Homer, Longpr\'{e}, Ogiwara, Sch\"{o}ning, and Watanabe, from the early 1970s through the early 1990s, explicitly or implicitly limited how well heuristic algorithms can do on NP-hard problems. In particular, many desirable levels of heuristic success cannot be obtained unless severe, highly unlikely complexity class collapses occur. Second, we survey work initiated by Goldreich and Wigderson, who showed how, under plausible assumptions, deterministic heuristics for randomized computation can achieve a very high frequency of correctness. Finally, we consider formal ways in which theory can help explain the effectiveness of heuristics that solve NP-hard problems in practice. Comment: This article is currently scheduled to appear in the December 2012 issue of SIGACT News

    On Statistical Query Sampling and NMR Quantum Computing

    We introduce a ``Statistical Query Sampling'' model, in which the goal of an algorithm is to produce an element of a hidden set $S \subseteq \{0,1\}^n$ with reasonable probability. The algorithm gains information about $S$ through oracle calls (statistical queries), where the algorithm submits a query function $g(\cdot)$ and receives an approximation to $\Pr_{x \in S}[g(x)=1]$. We show how this model is related to NMR quantum computing, in which only statistical properties of an ensemble of quantum systems can be measured, and in particular to the question of whether one can translate standard quantum algorithms to the NMR setting without putting all of their classical post-processing into the quantum system. Using Fourier analysis techniques developed in the related context of statistical query learning, we prove a number of lower bounds (both information-theoretic and cryptographic) on the ability of algorithms to produce an $x \in S$, even when the set $S$ is fairly simple. These lower bounds point out a difficulty in efficiently applying NMR quantum computing to algorithms such as Shor's and Simon's that involve significant classical post-processing. We also explicitly relate the notion of statistical query sampling to that of statistical query learning. An extended abstract appeared in the 18th Annual IEEE Conference on Computational Complexity (CCC 2003), 2003. Keywords: statistical query, NMR quantum computing, lower bound. Comment: 17 pages, no figures. Appeared in the 18th Annual IEEE Conference on Computational Complexity (CCC 2003)
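
    As a concrete picture of the model's oracle interface (not of the lower-bound constructions), the following sketch implements the interaction described above. The hidden set, the tolerance tau, and the example query are illustrative assumptions.

```python
import numpy as np

# Illustrative statistical-query sampling oracle: the algorithm never sees
# elements of the hidden set S directly, only noisy estimates of
# Pr_{x in S}[g(x) = 1]. The hidden set, tolerance tau, and sample query
# are assumptions of this sketch.

rng = np.random.default_rng(1)
n = 10
S = {tuple(rng.integers(0, 2, n)) for _ in range(32)}  # hidden subset of {0,1}^n

def sq_oracle(g, tau=0.05):
    """Return Pr_{x in S}[g(x) = 1], up to additive error tau."""
    p = np.mean([float(g(np.array(x))) for x in S])
    return p + rng.uniform(-tau, tau)

# Example query: how often is the first bit of a hidden element equal to 1?
print(sq_oracle(lambda x: x[0] == 1))
```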

    Learning hard quantum distributions with variational autoencoders

    Studying general quantum many-body systems is one of the major challenges in modern physics because it requires an amount of computational resources that scales exponentially with the size of the system.Simulating the evolution of a state, or even storing its description, rapidly becomes intractable for exact classical algorithms. Recently, machine learning techniques, in the form of restricted Boltzmann machines, have been proposed as a way to efficiently represent certain quantum states with applications in state tomography and ground state estimation. Here, we introduce a new representation of states based on variational autoencoders. Variational autoencoders are a type of generative model in the form of a neural network. We probe the power of this representation by encoding probability distributions associated with states from different classes. Our simulations show that deep networks give a better representation for states that are hard to sample from, while providing no benefit for random states. This suggests that the probability distributions associated to hard quantum states might have a compositional structure that can be exploited by layered neural networks. Specifically, we consider the learnability of a class of quantum states introduced by Fefferman and Umans. Such states are provably hard to sample for classical computers, but not for quantum ones, under plausible computational complexity assumptions. The good level of compression achieved for hard states suggests these methods can be suitable for characterising states of the size expected in first generation quantum hardware.Comment: v2: 9 pages, 3 figures, journal version with major edits with respect to v1 (rewriting of section "hard and easy quantum states", extended discussion on comparison with tensor networks
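
    For a sense of the kind of model involved, here is a minimal variational autoencoder over n-bit samples, written in PyTorch. The layer sizes, latent dimension, and Bernoulli reconstruction loss are illustrative choices for this sketch, not the architecture used in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Minimal VAE over binary strings: samples from the target distribution
# are treated as n-bit training vectors. Sizes and depth are illustrative.

class VAE(nn.Module):
    def __init__(self, n_bits=16, latent=4, hidden=64):
        super().__init__()
        self.enc = nn.Linear(n_bits, hidden)
        self.mu = nn.Linear(hidden, latent)
        self.logvar = nn.Linear(hidden, latent)
        self.dec1 = nn.Linear(latent, hidden)
        self.dec2 = nn.Linear(hidden, n_bits)

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterisation trick
        logits = self.dec2(F.relu(self.dec1(z)))
        return logits, mu, logvar

def vae_loss(logits, x, mu, logvar):
    # Bernoulli reconstruction term plus KL divergence to the N(0, I) prior.
    rec = F.binary_cross_entropy_with_logits(logits, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kl
```

    A trained model of this form can be sampled by drawing z from N(0, I) and sampling bits from the decoder's Bernoulli probabilities, which is one way the compression achieved on a learned distribution can be probed.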