
    No imminent quantum supremacy by boson sampling

    It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of photons in linear optics, which has sparked interest as a rapid way to demonstrate this quantum supremacy. Photon statistics are governed by intractable matrix functions known as permanents, which suggests that sampling from the distribution obtained by injecting photons into a linear-optical network could be solved more quickly by a photonic experiment than by a classical computer. The contrast between the apparently awesome challenge faced by any classical sampling algorithm and the apparently near-term experimental resources required for a large boson sampling experiment has raised expectations that quantum supremacy by boson sampling is on the horizon. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. While the largest boson sampling experiments reported so far are with 5 photons, our classical algorithm, based on Metropolised independence sampling (MIS), allowed the boson sampling problem to be solved for 30 photons with standard computing hardware. We argue that the impact of experimental photon losses means that demonstrating quantum supremacy by boson sampling would require a step change in technology. Comment: 25 pages, 9 figures. Comments welcome
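
    To make the MIS idea concrete, here is a minimal, generic Metropolised independence sampler in Python. This is a sketch, not the paper's code: the names `target_prob`, `propose`, and `propose_prob` are placeholders. In the boson sampling setting, `target_prob` would be the (unnormalised) |permanent|^2 output probability and the proposal an easily sampled classical distribution.

```python
import random

def metropolised_independence_sampler(target_prob, propose, propose_prob,
                                      x0, n_samples, burn_in=100):
    """Generic Metropolised independence sampling (MIS): candidates are
    drawn independently of the current state, and a Metropolis-Hastings
    accept/reject step corrects the proposal so the chain's stationary
    distribution is target_prob (known only up to normalisation).
    Assumes the proposal has full support on the target's domain."""
    x = x0
    w_x = target_prob(x) / propose_prob(x)   # importance weight of current state
    out = []
    for t in range(burn_in + n_samples):
        y = propose()                        # proposal ignores the current state
        w_y = target_prob(y) / propose_prob(y)
        if random.random() * w_x <= w_y:     # accept with prob min(1, w_y / w_x)
            x, w_x = y, w_y
        if t >= burn_in:
            out.append(x)
    return out
```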

    The Classical Complexity of Boson Sampling

    We study the classical complexity of the exact Boson Sampling problem where the objective is to produce provably correct random samples from a particular quantum mechanical distribution. The computational framework was proposed by Aaronson and Arkhipov in 2011 as an attainable demonstration of `quantum supremacy', that is, a practical quantum computing experiment able to produce output at a speed beyond the reach of classical (that is, non-quantum) computer hardware. Since its introduction, Boson Sampling has been the subject of intense international research in the world of quantum computing. On the face of it, the problem is challenging for classical computation. Aaronson and Arkhipov show that exact Boson Sampling is not efficiently solvable by a classical computer unless $P^{\#P} = BPP^{NP}$ and the polynomial hierarchy collapses to the third level. The fastest known exact classical algorithm for the standard Boson Sampling problem takes $O\left(\binom{m + n - 1}{n} n 2^n\right)$ time to produce samples for a system with input size $n$ and $m$ output modes, making it infeasible for anything but the smallest values of $n$ and $m$. We give an algorithm that is much faster, running in $O(n 2^n + \operatorname{poly}(m, n))$ time and $O(m)$ additional space. The algorithm is simple to implement and has low constant factor overheads. As a consequence our classical algorithm is able to solve the exact Boson Sampling problem for system sizes far beyond current photonic quantum computing experimentation, thereby significantly reducing the likelihood of achieving near-term quantum supremacy in the context of Boson Sampling. Comment: 15 pages. To appear in SODA '18
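
    The $O(n 2^n)$ term in that running time matches the cost of a fast exact permanent evaluation. As an illustration of where that cost comes from (not the authors' sampling algorithm itself), here is a sketch of Ryser's classic inclusion-exclusion formula with Gray-code updates.

```python
from math import prod

def permanent_ryser(A):
    """Ryser's inclusion-exclusion formula with Gray-code updates,
    costing O(n * 2^n) arithmetic operations for an n x n matrix:
    perm(A) = sum over nonempty column subsets S of
              (-1)^(n - |S|) * prod_i sum_{j in S} A[i][j]."""
    n = len(A)
    row_sums = [0] * n       # per-row sums over the current column subset
    total = 0
    prev_gray = 0
    for k in range(1, 1 << n):
        gray = k ^ (k >> 1)                        # Gray code: one bit flips per step
        j = (gray ^ prev_gray).bit_length() - 1    # index of the flipped column
        sign_j = 1 if (gray >> j) & 1 else -1      # column j enters or leaves S
        for i in range(n):
            row_sums[i] += sign_j * A[i][j]
        total += (-1) ** (n - bin(gray).count("1")) * prod(row_sums)
        prev_gray = gray
    return total
```

    For example, `permanent_ryser([[1, 2], [3, 4]])` returns 10 (= 1*4 + 2*3); the Gray-code ordering means each subset update touches only one column, which is what brings the naive $O(n^2 2^n)$ cost down to $O(n 2^n)$.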

    A faster hafnian formula for complex matrices and its benchmarking on a supercomputer

    We introduce new and simple algorithms for the calculation of the number of perfect matchings of complex weighted, undirected graphs with and without loops. Our compact formulas for the hafnian and loop hafnian of $n \times n$ complex matrices run in $O(n^3 2^{n/2})$ time, are embarrassingly parallelizable and, to the best of our knowledge, are the fastest exact algorithms to compute these quantities. Even with this highly optimized algorithm, numerical benchmarks on the Titan supercomputer with matrices up to size $56 \times 56$ indicate that one would require the 288000 CPUs of this machine for about a month and a half to compute the hafnian of a $100 \times 100$ matrix. Comment: 11 pages, 7 figures. The source code of the library is available at https://github.com/XanaduAI/hafnian . Accepted for publication in the Journal of Experimental Algorithmics
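
    For context, the hafnian sums the weights of all perfect matchings of a graph given by its adjacency matrix. Below is a brute-force reference implementation (a sketch for intuition and testing, not the paper's $O(n^3 2^{n/2})$ algorithm); its doubly-factorial cost makes it usable only on tiny matrices.

```python
def hafnian_naive(A):
    """Reference hafnian by direct recursion over perfect matchings:
    haf(A) = sum_{j>0} A[0][j] * haf(A with rows/columns 0 and j deleted).
    This enumerates (n-1)!! matchings, so it is only feasible for small n;
    it serves as a correctness check for fast hafnian codes.
    The diagonal is ignored (loopless hafnian)."""
    n = len(A)
    if n == 0:
        return 1
    if n % 2:
        return 0   # odd-dimensional matrices have no perfect matching
    total = 0
    for j in range(1, n):
        keep = [k for k in range(1, n) if k != j]   # drop rows/cols 0 and j
        sub = [[A[r][c] for c in keep] for r in keep]
        total += A[0][j] * hafnian_naive(sub)
    return total
```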

    Validating multi-photon quantum interference with finite data

    Multi-particle interference is a key resource for quantum information processing, as exemplified by Boson Sampling. Hence, given its fragile nature, an essential desideratum is a solid and reliable framework for its validation. However, while several protocols have been introduced to this end, the approach is still fragmented and fails to build a big picture for future developments. In this work, we propose an operational approach to validation that encompasses and strengthens the state of the art for these protocols. To this end, we consider Bayesian hypothesis testing and the statistical benchmark as the most favorable protocols for small- and large-scale applications, respectively. We numerically investigate their operation with finite sample size, extending previous tests to larger dimensions, and against two adversarial algorithms for classical simulation: the mean-field sampler and the metropolized independent sampler. To evidence the actual need for refined validation techniques, we show how the assessment of numerically simulated data depends on the available sample size, as well as on the internal hyper-parameters and other practically relevant constraints. Our analyses provide general insights into the challenge of validation, and can inspire the design of algorithms with a measurable quantum advantage. Comment: 10 pages, 7 figures
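
    The core of the Bayesian protocol on finite data is a running likelihood-ratio update. Here is a minimal sketch under simplifying assumptions: `prob_quantum` and `prob_adversary` are placeholder likelihood functions (assumed strictly positive), and the protocol's hyper-parameters are not modelled.

```python
import math

def bayesian_validation(samples, prob_quantum, prob_adversary, prior=0.5):
    """Bayesian hypothesis testing on a finite sample: accumulate the
    log-likelihood ratio between the 'genuine quantum device' hypothesis
    and a classical adversary (e.g. a mean-field sampler), then return
    the posterior probability of the quantum hypothesis."""
    log_odds = math.log(prior / (1 - prior))
    for x in samples:
        log_odds += math.log(prob_quantum(x)) - math.log(prob_adversary(x))
    return 1 / (1 + math.exp(-log_odds))   # posterior P(quantum | samples)
```

    Because the update is a sum of per-sample log-ratios, the verdict sharpens (or fails to sharpen) as the sample size grows, which is exactly the finite-data dependence the paper investigates.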

    High performance Boson Sampling simulation via data-flow engines

    In this work, we generalize the Balasubramanian-Bax-Franklin-Glynn (BB/FG) permanent formula to account for row multiplicities during the permanent evaluation and reduce the complexity of permanent evaluation in scenarios where such multiplicities occur. This is achieved by incorporating n-ary Gray code ordering of the addends during the evaluation. We implemented the designed algorithm on FPGA-based data-flow engines and used the resulting accelerator to speed up boson sampling simulations up to 40 photons, drawing samples from a 60-mode interferometer at an average rate of $\sim 80$ seconds per sample using 4 FPGA chips. We also show that the performance of our BS simulator is in line with the theoretical estimation of Clifford \& Clifford \cite{clifford2020faster}, providing a way to define a single parameter to characterize the performance of the BS simulator in a portable way. The developed design can be used to simulate both ideal and lossy boson sampling experiments. Comment: 25 pages
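
    As a point of reference for that generalization, here is a sketch of the baseline BB/FG (Glynn) permanent formula with an ordinary binary Gray code, i.e. without the row-multiplicity extension or the FPGA data-flow mapping described in the paper.

```python
from math import prod

def permanent_bbfg(A):
    """Baseline BB/FG (Glynn) permanent formula with a binary Gray code:
    perm(A) = 2^{1-n} * sum over delta in {+1,-1}^n with delta[0] = +1 of
              (prod_k delta_k) * prod_j (sum_i delta_i * A[i][j]).
    Cost is O(n * 2^n), since each Gray-code step flips a single delta."""
    n = len(A)
    col_sums = [sum(A[i][j] for i in range(n)) for j in range(n)]  # delta = all +1
    total = prod(col_sums)
    sign = 1
    prev_gray = 0
    for k in range(1, 1 << (n - 1)):
        gray = k ^ (k >> 1)                         # flip one delta per step
        bit = (gray ^ prev_gray).bit_length() - 1   # bit b toggles delta[b + 1]
        new_delta = -1 if (gray >> bit) & 1 else 1
        for j in range(n):
            col_sums[j] += 2 * new_delta * A[bit + 1][j]
        sign = -sign                                # prod_k delta_k flips sign
        total += sign * prod(col_sums)
        prev_gray = gray
    return total / 2 ** (n - 1)
```

    The n-ary Gray code in the paper plays the same role as the binary one here: it orders the addends so that consecutive terms differ minimally, letting each column sum be updated incrementally instead of recomputed.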
