
    Long-Time Correlations in Single-Neutron Interferometry Data

    We present a detailed analysis of the time series of time-stamped neutron counts obtained by single-neutron interferometry. The mean of the neutron counts displays the usual Poissonian behavior, but their variance does not. Instead, the variance is found to exhibit a dependence on the phase-shifter setting which can be explained by a probabilistic model that accounts for fluctuations of the phase shift. The time series of the detection events exhibit long-time correlations with amplitudes that also depend on the phase-shifter setting. These correlations appear as damped oscillations with a period of about 2.8 s. By simulation, we show that the correlations of the time differences observed in the experiment can be reproduced by assuming that, for a fixed setting of the phase shifter, the phase shift experienced by the neutrons varies periodically in time with a period of 2.8 s. The same simulations also reproduce the behavior of the variance. Our analysis of the experimental data suggests that time-stamped data of single-particle interference experiments may exhibit transient features that require a description in terms of non-stationary processes, going beyond the standard quantum model of independent random events.
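The proposed mechanism, a phase shift that oscillates with a 2.8 s period at fixed phase-shifter setting, can be illustrated with a minimal simulation. All numerical values below (bin width, count rate, modulation depth, nominal phase) are hypothetical choices, not the experiment's parameters; the point is only that a periodically modulated rate makes the pooled counts super-Poissonian, i.e. variance larger than mean:

```python
import numpy as np

rng = np.random.default_rng(1)

T = 2.8            # modulation period (s), as reported above
dt = 0.01          # detection time bin (s); hypothetical
n_bins = 200_000
t = np.arange(n_bins) * dt

chi0 = 0.3         # nominal phase-shifter setting (hypothetical)
delta = 0.8        # depth of the periodic phase-shift fluctuation (hypothetical)
rate0 = 50.0       # mean counts per bin (hypothetical)

# Interference signal 1 + cos(chi(t)) with a phase that oscillates with period T
chi = chi0 + delta * np.sin(2 * np.pi * t / T)
rate = rate0 * (1.0 + np.cos(chi))
counts = rng.poisson(rate)

# A stationary Poisson process would give variance == mean; the periodic
# modulation of the rate makes the pooled counts super-Poissonian.
print(counts.mean(), counts.var())
```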

    Large-Scale Simulation of Shor's Quantum Factoring Algorithm

    Shor's factoring algorithm is one of the most anticipated applications of quantum computing. However, the limited capabilities of today's quantum computers only permit a study of Shor's algorithm for very small numbers. Here we show how large GPU-based supercomputers can be used to assess the performance of Shor's algorithm for numbers that are out of reach for current and near-term quantum hardware. First, we study Shor's original factoring algorithm. While theoretical bounds suggest success probabilities of only 3-4%, we find average success probabilities above 50%, due to a high frequency of "lucky" cases, defined as successful factorizations despite unmet sufficient conditions. Second, we investigate a powerful post-processing procedure, by which the success probability can be brought arbitrarily close to one, with only a single run of Shor's quantum algorithm. Finally, we study the effectiveness of this post-processing procedure in the presence of typical errors in quantum processing hardware. We find that the quantum factoring algorithm exhibits a particular form of universality and resilience against the different types of errors. The largest semiprime that we have factored by executing Shor's algorithm on a GPU-based supercomputer, without exploiting prior knowledge of the solution, is 549755813701 = 712321 * 771781. We put forward the challenge of factoring, without oversimplification, a non-trivial semiprime larger than this number on any quantum computing device.
    Comment: differs from the published version in formatting and style; open source code available at https://jugit.fz-juelich.de/qip/shorgp
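The quantum part of Shor's algorithm finds the order r of a base a modulo N; the factors then follow classically from gcd(a^{r/2} ± 1, N). A minimal sketch of this classical reduction, with the order computed by brute force in place of the quantum subroutine:

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a mod n, computed by brute force here;
    this is the step Shor's algorithm delegates to the quantum computer."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def classical_reduction(n, a):
    """From the order r of a mod n to a factorization of n."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # lucky: a already shares a factor with n
    r = order(a, n)
    if r % 2 == 1:
        return None               # unlucky base: odd order
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # unlucky base: trivial square root of 1
    p = gcd(y - 1, n)
    return p, n // p

print(classical_reduction(21, 2))          # (7, 3)
print(712321 * 771781 == 549755813701)     # the paper's record factorization
```

The "unlucky" branches returning `None` correspond to the unmet sufficient conditions mentioned above; the paper's observation is that in practice these cases are rarer, or more often recoverable, than the theoretical bounds suggest.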

    Fragility of gate-error metrics in simulation models of flux-tunable transmon quantum computers

    Constructing a quantum computer requires immensely precise control over a quantum system. A lack of precision is often quantified by gate-error metrics, such as the average infidelity or the diamond distance. However, usually such gate-error metrics are only considered for individual gates and not for the errors that accumulate over consecutive gates. Furthermore, it is not well known how susceptible the metrics are to the assumptions that make up the model. Here we investigate these issues using realistic simulation models of quantum computers with flux-tunable transmons and coupling resonators. Our main findings reveal that (i) gate-error metrics are indeed affected by the many assumptions of the model, (ii) consecutive gate errors do not accumulate linearly, and (iii) gate-error metrics are poor predictors for the performance of consecutive gates. Additionally, we discuss a potential limitation in the scalability of the studied device architecture.
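For a purely unitary error, the average gate fidelity between a target unitary U and an implemented unitary V is F_avg = (|Tr(U†V)|² + d)/(d² + d). The sketch below, with a hypothetical single-qubit over-rotation as the error model, illustrates finding (ii): the infidelity of two consecutive gates is roughly four times, not two times, that of a single gate, because coherent errors add in amplitude rather than in probability:

```python
import numpy as np

def avg_gate_fidelity(U, V):
    """Average gate fidelity between target U and implemented V,
    valid when the error is purely unitary:
        F_avg = (|Tr(U^dag V)|^2 + d) / (d^2 + d)
    """
    d = U.shape[0]
    overlap = abs(np.trace(U.conj().T @ V)) ** 2
    return (overlap + d) / (d * d + d)

def rx(theta):
    """Single-qubit rotation about the x axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

eps = 0.02                              # hypothetical coherent over-rotation
single = avg_gate_fidelity(rx(np.pi), rx(np.pi + eps))
double = avg_gate_fidelity(rx(np.pi) @ rx(np.pi),
                           rx(np.pi + eps) @ rx(np.pi + eps))

# Coherent errors add in amplitude, so the infidelity of two consecutive
# gates is ~4x (not 2x) that of one gate.
print(1 - single, 1 - double)
```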

    New multiplexing scheme for monitoring fiber optic Bragg grating sensors in the coherence domain

    A new multiplexing scheme for monitoring fiber optic Bragg gratings in the coherence domain has been developed. Grating pairs with different grating distances are distributed along a fiber line, and interference between their reflections is monitored with a scanning Michelson interferometer. The Bragg wavelength of the individual sensor elements is determined from the interference signal frequency.
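A rough numerical sketch of the read-out principle, under the standard Michelson relation that moving the scan mirror by x changes the optical path difference by 2x, so the interferogram oscillates with spatial frequency 2/λ_B; all numbers (wavelength, step size, scan length) are illustrative, not from the reported setup:

```python
import numpy as np

lambda_b = 1550e-9        # Bragg wavelength to recover (illustrative)
dx = 100e-9               # mirror step of the scanning Michelson (illustrative)
x = np.arange(8192) * dx  # mirror positions during one scan

# Moving the mirror by x changes the optical path difference by 2x, so the
# interferogram oscillates with spatial frequency 2 / lambda_b in x.
interferogram = np.cos(4 * np.pi * x / lambda_b)

spectrum = np.abs(np.fft.rfft(interferogram))
freqs = np.fft.rfftfreq(len(x), d=dx)
f_peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

print(2 / f_peak)   # recovered Bragg wavelength (m)
```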

    Gate-error analysis in simulations of quantum computers with transmon qubits

    In the model of gate-based quantum computation, the qubits are controlled by a sequence of quantum gates. In superconducting qubit systems, these gates can be implemented by voltage pulses. The success of implementing a particular gate can be expressed by various metrics such as the average gate fidelity, the diamond distance, and the unitarity. We analyze these metrics of gate pulses for a system of two superconducting transmon qubits coupled by a resonator, a system inspired by the architecture of the IBM Quantum Experience. The metrics are obtained by numerical solution of the time-dependent Schrödinger equation of the transmon system. We find that the metrics reflect systematic errors that are most pronounced for echoed cross-resonance gates, but that none of the studied metrics can reliably predict the performance of a gate when used repeatedly in a quantum algorithm.

    Random State Technology

    We review and extend, in a self-contained way, the mathematical foundations of numerical simulation methods that are based on the use of random states. The power and versatility of this simulation technology is illustrated by calculations of physically relevant properties such as the density of states of large single-particle systems, the specific heat, current-current correlations, density-density correlations, and electron spin resonance spectra of many-body systems. We explore a new field of applications of the random state technology by showing that it can be used to analyze numerical simulations and experiments that aim to realize quantum supremacy on a noisy intermediate-scale quantum processor. Additionally, we show that concepts of the random state technology prove useful in quantum information theory.
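The core identity behind random state methods is that, for a random vector |φ⟩ whose components are i.i.d. with zero mean and unit variance, E[⟨φ|A|φ⟩] = Tr A, so traces of large operators (and hence densities of states, specific heats, and correlation functions) can be estimated without diagonalization. A minimal sketch, with a random symmetric matrix standing in for a physical Hamiltonian:

```python
import numpy as np

rng = np.random.default_rng(7)

def random_state_trace(A, n_samples=200):
    """Estimate Tr(A) from random states: E[<phi|A|phi>] = Tr(A) when the
    components of |phi> are i.i.d. with zero mean and unit variance."""
    total = 0.0
    for _ in range(n_samples):
        phi = rng.standard_normal(A.shape[0])   # real Gaussian random state
        total += phi @ A @ phi
    return total / n_samples

# A random symmetric matrix stands in for a many-body Hamiltonian.
d = 400
H = rng.standard_normal((d, d))
H = (H + H.T) / 2

print(random_state_trace(H), np.trace(H))
```

The statistical error of the estimate shrinks with the number of samples, and, for the physical many-body systems targeted by the review, a single random state often suffices because the relative error decreases with the Hilbert-space dimension.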

    Benchmarking gate-based quantum computers

    With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable, and sensitive to gate errors, and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
    Comment: Accepted for publication in Computer Physics Communications
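The identity-circuit idea can be sketched as follows: repeat a gate pair that should compose to the identity (here X followed by X, with a hypothetical coherent over-rotation on every gate) and watch the survival probability of |0⟩ decay. Small per-gate errors accumulate over the repetitions and become clearly visible:

```python
import numpy as np

def rx(theta):
    """Single-qubit rotation about the x axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def survival(n_pairs, eps):
    """Probability of reading |0> after n_pairs repetitions of X followed
    by X (an identity circuit), with a coherent over-rotation eps per gate."""
    gate = rx(np.pi + eps)
    state = np.array([1.0 + 0j, 0.0])
    for _ in range(2 * n_pairs):
        state = gate @ state
    return abs(state[0]) ** 2

eps = 0.02   # hypothetical per-gate error
for n in (1, 10, 50):
    print(n, survival(n, eps))
```

With zero error the survival probability stays at one for any depth, which is what makes the benchmark scalable: any decay is attributable to gate errors.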

    Benchmarking Advantage and D-Wave 2000Q quantum annealers with exact cover problems

    We benchmark the quantum processing units of the largest quantum annealers to date, the 5000+ qubit quantum annealer Advantage and its 2000+ qubit predecessor D-Wave 2000Q, using tail assignment and exact cover problems from aircraft scheduling scenarios. The benchmark set contains small, intermediate, and large problems with both sparsely connected and almost fully connected instances. We find that Advantage outperforms D-Wave 2000Q for almost all problems, with a notable increase in success rate and problem size. In particular, Advantage is also able to solve the largest problems with 120 logical qubits, which D-Wave 2000Q can no longer solve. Furthermore, problems that can still be solved by D-Wave 2000Q are solved faster by Advantage. We find, however, that D-Wave 2000Q can achieve better success rates for sparsely connected problems that do not require the many new couplers present on Advantage, so improving the connectivity of a quantum annealer does not per se improve its performance.
    Comment: new experiments to test the conjecture about unused couplers (appendix B)
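An exact cover instance is handed to a quantum annealer as a quadratic penalty that vanishes exactly when every element is covered exactly once. A toy instance (the subsets below are illustrative, not from the benchmark set), with a brute-force minimizer standing in for the annealer's sampling:

```python
import itertools

# A toy exact cover instance: elements 0..3 and four candidate subsets.
subsets = [{0, 1}, {2, 3}, {1, 2}, {0, 3}]
elements = {0, 1, 2, 3}

def energy(x):
    """Quadratic penalty of a selection x (one bit per subset); it is zero
    iff every element is covered exactly once -- the cost function that a
    quantum annealer minimizes for exact cover."""
    return sum((1 - sum(x[i] for i, s in enumerate(subsets) if e in s)) ** 2
               for e in elements)

# Exhaustive search over all selections, standing in for the annealer.
best = min(itertools.product([0, 1], repeat=len(subsets)), key=energy)
print(best, energy(best))
```

Expanding the squared penalty yields the linear and quadratic (coupler) coefficients of the QUBO; the density of the quadratic terms is what distinguishes the sparsely connected from the almost fully connected instances in the benchmark set.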

    Hybrid Quantum Classical Simulations

    We report on two major hybrid applications of quantum computing, namely, the quantum approximate optimisation algorithm (QAOA) and the variational quantum eigensolver (VQE). Both are hybrid quantum-classical algorithms, as they require incremental communication between a classical central processing unit and a quantum processing unit to solve a problem. We find that the QAOA scales much better to larger problems than random guessing, but requires significant computational resources. In contrast, a coarsely discretised version of quantum annealing called approximate quantum annealing (AQA) can reach the same promising scaling behaviour using much less computational resources. For the VQE, we find reasonable results in approximating the ground state energy of the Heisenberg model when suitable choices of initial states and parameters are used. Our design and implementation of a general quasi-dynamical evolution further improves these results.
    Comment: This article is a book contribution. The book is freely available at http://hdl.handle.net/2128/3184
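The hybrid structure of QAOA can be sketched on the smallest MaxCut instance, a single edge, where depth-1 QAOA reaches the exact optimum; the grid search below plays the role of the classical outer loop, and the statevector arithmetic plays the role of the quantum processing unit:

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

# MaxCut cost for a single edge: C = (1 - Z0 Z1) / 2, diagonal in the
# computational basis with eigenvalues [0, 1, 1, 0].
cost = np.diag((np.eye(4) - np.kron(Z, Z)) / 2)

def mixer(beta):
    """exp(-i beta (X0 + X1)) acts as the same 2x2 rotation on each qubit."""
    r = np.cos(beta) * I2 - 1j * np.sin(beta) * X
    return np.kron(r, r)

def qaoa_expectation(gamma, beta):
    """<C> for depth-1 QAOA: mixer after phase separation, applied to |++>."""
    psi = np.full(4, 0.5, dtype=complex)      # |++>
    psi = np.exp(-1j * gamma * cost) * psi    # e^{-i gamma C}, C is diagonal
    psi = mixer(beta) @ psi
    return float(np.real(psi.conj() @ (cost * psi)))

# Classical outer loop: a coarse grid search over the angles (gamma, beta).
grid = np.linspace(0, np.pi, 61)
g_best, b_best = max(((g, b) for g in grid for b in grid),
                     key=lambda p: qaoa_expectation(*p))
print(g_best, b_best, qaoa_expectation(g_best, b_best))
```

The incremental communication mentioned above is exactly this loop: the classical optimizer proposes angles, the quantum device returns the measured cost, and the optimizer updates its proposal.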

    Model-free inequality for data of Einstein-Podolsky-Rosen-Bohm experiments

    We present a new inequality constraining correlations obtained when performing Einstein-Podolsky-Rosen-Bohm experiments. The proof does not rely on mathematical models that are imagined to have produced the data and is therefore ``model-free''. The new inequality contains the model-free version of the well-known Bell-CHSH inequality as a special case. A violation of the latter implies that not all the data pairs in four data sets can be reshuffled to create quadruples. This conclusion provides a new perspective on the implications of the violation of Bell-type inequalities by experimental data.
    Comment: Extended version of Annals of Physics, Volume 453, 169314, 2023 (https://doi.org/10.1016/j.aop.2023.169314)
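A hypothetical illustration (not the paper's model-free inequality itself) of the two situations the Bell-CHSH combination distinguishes: singlet-state correlations reach |S| = 2√2, while data that can be arranged into quadruples, as in any local model where all four outcomes coexist in every trial, obey |S| ≤ 2:

```python
import numpy as np

rng = np.random.default_rng(3)
a, a2 = 0.0, np.pi / 2            # Alice's two settings
b, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two settings

def E_singlet(x, y):
    """Quantum prediction for the singlet-state correlation."""
    return -np.cos(x - y)

S_quantum = (E_singlet(a, b) - E_singlet(a, b2)
             + E_singlet(a2, b) + E_singlet(a2, b2))

# A local model: all four outcomes exist for every hidden variable lam,
# so the simulated data can trivially be arranged into quadruples.
lam = rng.uniform(0.0, 2.0 * np.pi, 100_000)

def A(x):
    return np.sign(np.cos(x - lam))

def B(y):
    return -np.sign(np.cos(y - lam))

S_local = np.mean(A(a) * B(b) - A(a) * B(b2)
                  + A(a2) * B(b) + A(a2) * B(b2))

print(abs(S_quantum), abs(S_local))   # 2*sqrt(2) versus at most 2
```

The bound on the second case is algebraic, not statistical: per trial, A(a)(B(b) - B(b2)) + A(a2)(B(b) + B(b2)) is ±2, so the average cannot exceed 2 in magnitude whenever quadruples exist.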