
    Spontaneous parametric down-conversion photon sources are scalable in the asymptotic limit for boson sampling

    Boson sampling has emerged as a promising avenue towards postclassical optical quantum computation, and numerous elementary demonstrations have recently been performed. Spontaneous parametric down-conversion (SPDC) is the mainstay for single-photon state preparation, the technique employed in most optical quantum information processing implementations to date. Here we present a simple architecture for boson sampling based on multiplexed SPDC sources and demonstrate that the architecture is limited only by the postselected detection efficiency, assuming that other errors, such as spectral impurity, dark counts, and interferometric instability, are negligible. For any given number of input photons, there exists a minimum detector efficiency that allows postselection. If this efficiency is achieved, photon-number errors in the SPDC sources are sufficiently low to guarantee correct boson sampling most of the time. In this scheme, the required detector efficiency must increase exponentially in the photon number. Thus, we show that idealized SPDC sources will not present a bottleneck for future boson-sampling implementations. Rather, photodetection efficiency is the limiting factor, and thus future implementations may continue to employ SPDC sources. © 2013 American Physical Society
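
    The scaling claim above is easy to illustrate: if each of the n photons must be detected independently, the postselection rate falls off as a power of the per-photon efficiency, which fixes a minimum efficiency for any target rate. The sketch below assumes this simple independent-detection model with an illustrative 50% target rate; it does not reproduce the paper's exact bound.

```python
# Minimal sketch: postselection success versus detector efficiency for n
# photons, assuming each photon is detected independently with efficiency
# eta, so the n-photon postselection rate scales as eta**n. The 50% target
# rate is an illustrative assumption, not the paper's bound.

def postselection_rate(eta: float, n: int) -> float:
    """Probability that all n photons are detected, each with efficiency eta."""
    return eta ** n

def min_efficiency(n: int, target_rate: float = 0.5) -> float:
    """Smallest per-photon efficiency achieving at least target_rate."""
    return target_rate ** (1.0 / n)

if __name__ == "__main__":
    for n in (2, 5, 10, 20, 50):
        print(f"n={n:2d}: minimum eta for 50% postselection = {min_efficiency(n):.4f}")
```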

    An introduction to Boson-sampling

    © 2010 by World Scientific Publishing Co. Pte. Ltd. All rights reserved. Boson-sampling is a simplified model for quantum computing that may hold the key to implementing the first ever post-classical quantum computer. Boson-sampling is a non-universal quantum computer that is significantly more straightforward to build than any universal quantum computer proposed so far. We begin this chapter by motivating boson-sampling and discussing the history of linear optics quantum computing. We then summarize the boson-sampling formalism, discuss what a sampling problem is, explain why boson-sampling is easier than linear optics quantum computing, and discuss the Extended Church-Turing thesis. Next, sampling with other classes of quantum optical states is analyzed. Finally, we discuss the feasibility of building a boson-sampling device using existing technology.
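
    At the core of the formalism summarized above is the fact that n-photon output probabilities are given by matrix permanents: for a collision-free outcome S, P(S) is proportional to |Per(A_S)|^2, where A_S is an n x n submatrix of the interferometer unitary. As a sketch of why this is classically expensive, here is Ryser's formula, the fastest known general method at O(2^n n^2); the demo matrix is arbitrary.

```python
# Minimal sketch: permanent of an n x n matrix via Ryser's formula, the
# quantity |Per(A_S)|^2 that sets boson-sampling output probabilities.
# Runtime is O(2**n * n**2), which is why sampling is classically hard.
from itertools import combinations

def permanent(a):
    """Permanent of a square matrix (list of lists), real or complex."""
    n = len(a)
    total = 0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1
            for row in a:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

if __name__ == "__main__":
    print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10
```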

    Linear Optical Quantum Metrology with Single Photons: Exploiting Spontaneously Generated Entanglement to Beat the Shot-Noise Limit

    © 2015 American Physical Society. Quantum number-path entanglement is a resource for supersensitive quantum metrology and in particular provides for sub-shot-noise or even Heisenberg-limited sensitivity. However, such number-path entanglement has been thought to be resource intensive to create in the first place - typically requiring either very strong nonlinearities, or nondeterministic preparation schemes with feedforward, which are difficult to implement. Very recently, arising from the study of quantum random walks with multiphoton walkers, as well as the study of the computational complexity of passive linear optical interferometers fed with single-photon inputs, it has been shown that such passive linear optical devices generate a superexponentially large amount of number-path entanglement. A logical question to ask is whether this entanglement may be exploited for quantum metrology. We answer that question here in the affirmative by showing that a simple, passive, linear-optical interferometer - fed with only uncorrelated, single-photon inputs, coupled with simple, single-mode, disjoint photodetection - is capable of significantly beating the shot-noise limit. Our result implies a pathway forward to practical quantum metrology with readily available technology.
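
    The benchmark referred to above is the shot-noise limit, a phase uncertainty of 1/sqrt(N) for N photons, with the Heisenberg limit 1/N as the ultimate floor; "beating the shot-noise limit" means landing between the two. A minimal numeric comparison of the two textbook scalings:

```python
# Minimal sketch: shot-noise limit (1/sqrt(N)) versus Heisenberg limit (1/N)
# for phase estimation with N photons. Any scheme whose uncertainty falls
# between the two curves is sub-shot-noise; these are textbook scalings.
import math

def shot_noise_limit(n: int) -> float:
    return 1.0 / math.sqrt(n)

def heisenberg_limit(n: int) -> float:
    return 1.0 / n

if __name__ == "__main__":
    for n in (1, 10, 100, 1000):
        print(f"N={n:4d}  shot-noise={shot_noise_limit(n):.4f}  "
              f"Heisenberg={heisenberg_limit(n):.5f}")
```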

    Linear optical quantum metrology with single photons: Experimental errors, resource counting, and quantum Cramér-Rao bounds

    © 2017 American Physical Society. Quantum number-path entanglement is a resource for supersensitive quantum metrology and in particular provides for sub-shot-noise or even Heisenberg-limited sensitivity. However, such number-path entanglement has been thought to be resource intensive to create in the first place, typically requiring either very strong nonlinearities or nondeterministic preparation schemes with feedforward, which are difficult to implement. Recently [K. R. Motes et al., Phys. Rev. Lett. 114, 170802 (2015)], it was shown that number-path entanglement from a BosonSampling-inspired interferometer can be used to beat the shot-noise limit. In this paper we compare and contrast different interferometric schemes, discuss resource counting, calculate exact quantum Cramér-Rao bounds, and study details of experimental errors.
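
    For resource counting, the relevant bound is the quantum Cramér-Rao bound: the phase uncertainty obeys dphi >= 1/sqrt(nu * F_Q) for nu repetitions and quantum Fisher information F_Q. The sketch below evaluates it for the two textbook benchmarks (F_Q = N for uncorrelated photons, F_Q = N^2 for a NOON state); the paper's interferometer-specific bounds are not reproduced.

```python
# Minimal sketch: quantum Cramer-Rao bound dphi >= 1/sqrt(nu * F_Q).
# F_Q = N (uncorrelated photons) and F_Q = N**2 (NOON-state benchmark) are
# textbook values used for comparison, not the paper's derived bounds.
import math

def qcrb(fisher_information: float, repetitions: int = 1) -> float:
    """Lower bound on the phase uncertainty from the quantum CRB."""
    return 1.0 / math.sqrt(repetitions * fisher_information)

if __name__ == "__main__":
    n = 16  # photons per shot (illustrative)
    print("uncorrelated photons:", qcrb(n))       # 1/sqrt(N) = 0.25
    print("NOON-state benchmark:", qcrb(n ** 2))  # 1/N = 0.0625
```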

    Efficient recycling strategies for preparing large Fock states from single-photon sources: Applications to quantum metrology

    © 2016 American Physical Society. Fock states are a fundamental resource for many quantum technologies such as quantum metrology. While much progress has been made in single-photon source technologies, preparing Fock states with a large photon number remains challenging. We present and analyze a bootstrapped approach for nondeterministically preparing large photon-number Fock states by iteratively fusing smaller Fock states on a beamsplitter. We show that by employing state recycling we are able to exponentially improve the preparation rate over conventional schemes, allowing the efficient preparation of large Fock states. The scheme requires single-photon sources, beamsplitters, number-resolved photodetectors, fast feedforward, and an optical quantum memory.
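
    The fusion primitive behind this scheme can be sketched directly: send |n> and |m> into a 50:50 beamsplitter, count q photons at one output, and herald |n+m-q> at the other. The amplitudes below follow from expanding the beamsplitter action on creation operators; the recycling of failed attempts, on which the paper's rate analysis depends, is not modeled.

```python
# Minimal sketch: heralded fusion of Fock states |n> and |m> on a 50:50
# beamsplitter. Detecting q photons at one output heralds |n+m-q> at the
# other. Amplitudes come from a+ -> (c+ + d+)/sqrt(2), b+ -> (c+ - d+)/sqrt(2).
# State recycling, the key to the paper's rate improvement, is not modeled.
from math import comb, factorial, sqrt

def herald_probability(n: int, m: int, q: int) -> float:
    """Probability of counting q photons at the monitored port for |n>|m>."""
    p = n + m - q  # photons heralded at the kept port
    if p < 0:
        return 0.0
    amp = sum(
        comb(n, p - j) * comb(m, j) * (-1) ** (m - j)
        for j in range(m + 1)
        if 0 <= p - j <= n
    )
    amp *= sqrt(factorial(p) * factorial(q) / (factorial(n) * factorial(m)))
    amp /= 2 ** ((n + m) / 2)
    return amp ** 2

if __name__ == "__main__":
    # Hong-Ou-Mandel check: |1>|1> never yields one photon at each port.
    print(herald_probability(1, 1, 1))  # 0.0
    # Probability of fusing |2>|2> into |4> by heralding zero photons.
    print(herald_probability(2, 2, 0))  # 0.375
```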

    No imminent quantum supremacy by boson sampling

    It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of photons in linear optics, which has sparked interest as a rapid way to demonstrate such quantum supremacy. Photon statistics are governed by intractable matrix functions known as permanents, which suggests that sampling from the distribution obtained by injecting photons into a linear-optical network could be solved more quickly by a photonic experiment than by a classical computer. The contrast between the apparently awesome challenge faced by any classical sampling algorithm and the apparently near-term experimental resources required for a large boson sampling experiment has raised expectations that quantum supremacy by boson sampling is on the horizon. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. While the largest boson sampling experiments reported so far used 5 photons, our classical algorithm, based on Metropolised independence sampling (MIS), allowed the boson sampling problem to be solved for 30 photons with standard computing hardware. We argue that the impact of experimental photon losses means that demonstrating quantum supremacy by boson sampling would require a step change in technology.
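
    The MIS idea is simple to sketch: propose output patterns from a distribution that is easy to sample (distinguishable photons, whose pattern weights are permanents of the elementwise |u_ij|^2 matrix) and apply a Metropolis correction toward the true boson-sampling weights |Per(A_S)|^2. The toy below uses a naive O(n!) permanent and a Fourier-matrix unitary for the demo, so it only illustrates the structure of the algorithm, not its reported 30-photon performance.

```python
# Toy sketch of Metropolised independence sampling (MIS) for boson sampling:
# proposals are drawn from the distinguishable-photon distribution, and the
# Metropolis test corrects toward the true weights |Per(A_S)|^2. The naive
# O(n!) permanent and the Fourier-matrix demo unitary are illustrative only.
import cmath
import random
from itertools import permutations

def permanent(a):
    """Naive O(n!) permanent; adequate for the small n in this demo."""
    n = len(a)
    total = 0
    for perm in permutations(range(n)):
        prod = 1
        for i in range(n):
            prod *= a[i][perm[i]]
        total += prod
    return total

def sample_distinguishable(u, n):
    """Output pattern as if the n photons (in input modes 0..n-1) were classical."""
    m = len(u)
    pattern = []
    for i in range(n):
        weights = [abs(u[i][j]) ** 2 for j in range(m)]
        pattern.append(random.choices(range(m), weights=weights)[0])
    return sorted(pattern)

def weight_ratio(u, pattern):
    """Target weight |Per(A_S)|^2 divided by the distinguishable proposal weight."""
    sub = [[u[i][j] for j in pattern] for i in range(len(pattern))]
    sub_cl = [[abs(x) ** 2 for x in row] for row in sub]
    return abs(permanent(sub)) ** 2 / permanent(sub_cl)

def mis_sample(u, n, steps=200):
    cur = sample_distinguishable(u, n)
    w_cur = weight_ratio(u, cur)
    for _ in range(steps):
        cand = sample_distinguishable(u, n)
        w_cand = weight_ratio(u, cand)
        if w_cur == 0 or random.random() < min(1.0, w_cand / w_cur):
            cur, w_cur = cand, w_cand
    return cur

if __name__ == "__main__":
    m = 4  # modes; the discrete Fourier transform matrix is a valid unitary
    u = [[cmath.exp(2j * cmath.pi * r * c / m) / m ** 0.5 for c in range(m)]
         for r in range(m)]
    print(mis_sample(u, n=2))  # one approximate sample, e.g. [0, 2]
```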

    The evolution of primate short-term memory

    Short-term memory is implicated in a range of cognitive abilities and is critical for understanding primate cognitive evolution. To investigate the effects of phylogeny, ecology and sociality on short-term memory, we tested the largest and most diverse primate sample to date (421 non-human primates across 41 species) in an experimental delayed-response task. Our results confirm previous findings that longer delays decrease memory performance across species and taxa. Our analyses demonstrate a considerable contribution of phylogeny over ecological and social factors to the distribution of short-term memory performance in primates; closely related species had more similar short-term memory abilities. Overall, individuals in the branch of Hominoidea performed better than Cercopithecoidea, who in turn performed above Platyrrhini and Strepsirrhini. Interdependencies between the phylogeny and socioecology of a given species presented an obstacle to disentangling the effects of each of these factors on the evolution of short-term memory capacity. However, this study offers an important step forward in understanding interspecies and individual variation in short-term memory ability by providing the first phylogenetic reconstruction of this trait’s evolutionary history. The dataset constitutes a unique resource for studying the evolution of primate cognition and the role of short-term memory in other cognitive abilities.

    Will boson-sampling ever disprove the Extended Church-Turing thesis?

    Boson-sampling is a highly simplified, but non-universal, approach to implementing optical quantum computation. It was shown by Aaronson and Arkhipov that this protocol cannot be efficiently classically simulated unless the polynomial hierarchy collapses, which would be a shocking result in computational complexity theory. Based on this, numerous authors have made the claim that experimental boson-sampling would provide evidence against, or disprove, the Extended Church-Turing thesis -- that any physically realisable system can be efficiently simulated on a Turing machine. We argue against this claim on the basis that, under a general, physically realistic independent error model, boson-sampling does not implement a provably hard computational problem in the asymptotic limit of large systems.
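
    The independent error model at the heart of this argument has a simple consequence worth stating: if each photon independently suffers an error with probability eps, the chance that an n-photon run is error-free is (1 - eps)^n, which vanishes exponentially in n. A sketch of that scaling, with illustrative error rates:

```python
# Minimal sketch: under an independent per-photon error model (loss,
# dephasing, mode mismatch) with error probability eps, the probability
# that an n-photon boson-sampling run is error-free is (1 - eps)**n.
# The eps values below are illustrative, not measured figures.
def error_free_probability(eps: float, n: int) -> float:
    return (1.0 - eps) ** n

if __name__ == "__main__":
    for n in (10, 50, 100):
        for eps in (0.01, 0.05):
            print(f"n={n:3d}, eps={eps:.2f}: P(no error) = "
                  f"{error_free_probability(eps, n):.3e}")
```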

    Implementing BosonSampling with time-bin encoding: Analysis of loss, mode mismatch, and time jitter

    © 2015 American Physical Society. It was recently shown by Motes, Gilchrist, Dowling, and Rohde [Phys. Rev. Lett. 113, 120501 (2014)] that a time-bin encoded fiber-loop architecture can implement an arbitrary passive linear optics transformation. This was shown for an ideal scheme in which the architecture has no sources of error. In any realistic implementation, however, physical errors are present, which corrupt the output of the transformation. We investigate the dominant sources of error in this architecture - loss and mode mismatch - and consider how they affect the BosonSampling protocol, a key application for passive linear optics. For our loss analysis we consider two major components that contribute to loss - fiber and switches - and calculate how they affect the success probability and fidelity of the device. Interestingly, we find that errors due to loss are not uniform (a feature unique to time-bin encoding), which asymmetrically biases the implemented unitary. Thus loss necessarily limits the class of unitaries that may be implemented, and future implementations must therefore prioritize minimizing loss rates if arbitrary unitaries are to be implemented. Our formalism for mode mismatch is general enough to account for various phenomena that may cause mode mismatch, but we focus on two - errors in fiber-loop lengths and time jitter of the photon source. These results provide a guideline for how well future experimental implementations might perform in light of these error mechanisms.
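
    The asymmetry of the loss is the key point: in a fiber-loop architecture, a photon in time-bin i circulates the loop a bin-dependent number of times, so survival probabilities differ across bins and the effective unitary is biased. The sketch below assumes a simple model in which the photon in bin i makes i round trips, each costing one loop pass and two switch traversals; the loop-count rule and the loss figures are illustrative assumptions, not the paper's architecture.

```python
# Minimal sketch of non-uniform, time-bin-dependent loss. Assumes the photon
# in bin i makes i round trips, each passing the fiber loop once (eta_loop)
# and a switch twice (eta_switch); these counts and figures are illustrative.
def bin_transmission(i: int, eta_loop: float = 0.95,
                     eta_switch: float = 0.99) -> float:
    """Survival probability for a photon in time-bin i."""
    return (eta_loop * eta_switch ** 2) ** i

def run_success(n_bins: int) -> float:
    """Probability that one photon per bin all survive (postselected run)."""
    p = 1.0
    for i in range(n_bins):
        p *= bin_transmission(i)
    return p

if __name__ == "__main__":
    print([round(bin_transmission(i), 3) for i in range(5)])  # non-uniform
    print("5-photon run success:", round(run_success(5), 4))
```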