    When quantum tomography goes wrong: drift of quantum sources and other errors

    The principle behind quantum tomography is that a large set of observations—many samples from a 'quorum' of distinct observables—can all be explained satisfactorily as measurements on a single underlying quantum state or process. Unfortunately, this principle may not hold. When it fails, any standard tomographic estimate should be viewed skeptically. Here we propose a simple way to test for this kind of failure using the Akaike information criterion. We point out that the application of this criterion in a quantum context, while still powerful, is not as straightforward as it is in classical physics. This is especially the case when future observables differ from those constituting the quorum.
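
    The idea can be sketched numerically. In this minimal toy example (the Bernoulli model and the counts are illustrative assumptions, not the paper's construction), a static-source model and a drifting-source model are fit to two halves of a data set and their AIC scores compared:

        import numpy as np

        def aic(log_likelihood, n_params):
            # Akaike information criterion: 2k - 2 ln L; lower is better.
            return 2 * n_params - 2 * log_likelihood

        def bernoulli_loglik(k, n, p):
            p = np.clip(p, 1e-12, 1 - 1e-12)  # guard against log(0)
            return k * np.log(p) + (n - k) * np.log(1 - p)

        # Illustrative counts from two halves of a data run (assumed numbers).
        k1, n1 = 480, 1000
        k2, n2 = 540, 1000

        # Model A: one static outcome probability. Model B: one per half (drift).
        p_all = (k1 + k2) / (n1 + n2)
        ll_static = bernoulli_loglik(k1, n1, p_all) + bernoulli_loglik(k2, n2, p_all)
        ll_drift = bernoulli_loglik(k1, n1, k1 / n1) + bernoulli_loglik(k2, n2, k2 / n2)

        print("AIC, static model:", aic(ll_static, 1))
        print("AIC, drift model :", aic(ll_drift, 2))
        # A lower AIC for the drift model flags a failure of the single-state
        # assumption on which standard tomography rests.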

    Entanglement verification with finite data

    Suppose an experimentalist wishes to verify that his apparatus produces entangled quantum states. A finite amount of data cannot conclusively demonstrate entanglement, so drawing conclusions from real-world data requires statistical reasoning. We propose a reliable method to quantify the weight of evidence for (or against) entanglement, based on a likelihood ratio test. Our method is universal in that it can be applied to any sort of measurement. We demonstrate the method by applying it to two simulated experiments on two qubits. The first measures a single entanglement witness, while the second performs a tomographically complete measurement.
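
    A toy sketch of a likelihood ratio test in this spirit (the witness threshold of 0.5 and the counts below are illustrative assumptions, not the paper's construction): suppose a dichotomic witness measurement whose "+1" probability is at most 0.5 for every separable state.

        import numpy as np

        def loglik(k, n, p):
            # Binomial log-likelihood for k "+1" outcomes in n shots.
            p = np.clip(p, 1e-12, 1 - 1e-12)
            return k * np.log(p) + (n - k) * np.log(1 - p)

        k, n = 620, 1000  # assumed witness-measurement record

        ll_free = loglik(k, n, k / n)           # MLE over all states
        ll_sep = loglik(k, n, min(k / n, 0.5))  # MLE over the separable (null) set

        # Twice the log-likelihood ratio: large values weigh against separability.
        print("lambda =", 2 * (ll_free - ll_sep))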

    Quantum Darwinism in quantum Brownian motion: the vacuum as a witness

    We study quantum Darwinism -- the redundant recording of information about a decohering system by its environment -- in zero-temperature quantum Brownian motion. An initially nonlocal quantum state leaves a record whose redundancy increases rapidly with its spatial extent. Significant delocalization (e.g., a Schrödinger's Cat state) causes high redundancy: many observers can measure the system's position without perturbing it. This explains the objective (i.e., classical) existence of einselected, decoherence-resistant pointer states of macroscopic objects.
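
    A toy illustration of redundancy (a GHZ-like branching state standing in for the paper's Brownian-motion model): the system-fragment mutual information plateaus at one bit for every nonempty fragment short of the whole environment, so many observers can read the record independently.

        import numpy as np

        def entropy(rho):
            # von Neumann entropy in bits.
            w = np.linalg.eigvalsh(rho)
            w = w[w > 1e-12]
            return float(-(w * np.log2(w)).sum())

        def reduced(rho, keep, n):
            # Partial trace over all qubits not listed in `keep` (n qubits total).
            t = rho.reshape([2] * (2 * n))
            nq = n
            for q in sorted((q for q in range(n) if q not in keep), reverse=True):
                t = np.trace(t, axis1=q, axis2=q + nq)
                nq -= 1
            d = 2 ** len(keep)
            return np.asarray(t).reshape(d, d)

        # System qubit perfectly copied into N environment qubits:
        # (|0>|00...0> + |1>|11...1>)/sqrt(2).
        N = 6
        psi = np.zeros(2 ** (N + 1))
        psi[0] = psi[-1] = 1 / np.sqrt(2)
        rho = np.outer(psi, psi)

        rho_s = reduced(rho, [0], N + 1)
        for f in range(N + 1):
            frag = list(range(1, f + 1))
            I = (entropy(rho_s) + entropy(reduced(rho, frag, N + 1))
                 - entropy(reduced(rho, [0] + frag, N + 1)))
            print(f"fragment of {f} qubits: I(S:F) = {I:.2f} bits")
        # Plateau at 1 bit for 0 < f < N: the record is highly redundant.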

    Optimal, reliable estimation of quantum states

    Accurately inferring the state of a quantum device from the results of measurements is a crucial task in building quantum information processing hardware. The predominant state estimation procedure, maximum likelihood estimation (MLE), generally reports an estimate with zero eigenvalues. These cannot be justified. Furthermore, the MLE estimate is incompatible with error bars, so conclusions drawn from it are suspect. I propose an alternative procedure, Bayesian mean estimation (BME). BME never yields zero eigenvalues, its eigenvalues provide a bound on their own uncertainties, and it is the most accurate procedure possible. I show how to implement BME numerically, and how to obtain natural error bars that are compatible with the estimate. Finally, I briefly discuss the differences between Bayesian and frequentist estimation techniques.
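
    A minimal sketch of the posterior-mean idea for one qubit (the flat Bloch-ball prior and the Pauli counts are illustrative assumptions, not the paper's exact numerics): the estimate is the posterior mean, which stays off the boundary by construction.

        import numpy as np

        rng = np.random.default_rng(0)

        def sample_bloch():
            # Uniform prior over the Bloch ball (Hilbert-Schmidt measure for a qubit).
            v = rng.normal(size=3)
            return v * rng.random() ** (1 / 3) / np.linalg.norm(v)

        def loglik(bloch, counts):
            # counts[i] = (# of +1, # of -1) outcomes for Pauli axis i in (X, Y, Z).
            ll = 0.0
            for r, (plus, minus) in zip(bloch, counts):
                p = np.clip((1 + r) / 2, 1e-12, 1 - 1e-12)
                ll += plus * np.log(p) + minus * np.log(1 - p)
            return ll

        counts = [(60, 40), (45, 55), (90, 10)]  # assumed measurement record

        samples = np.array([sample_bloch() for _ in range(20000)])
        logw = np.array([loglik(s, counts) for s in samples])
        w = np.exp(logw - logw.max())
        w /= w.sum()

        bme = w @ samples  # posterior-mean Bloch vector (never on the boundary)
        print("BME estimate, Bloch vector:", bme)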

    The structure of preserved information in quantum processes

    We introduce a general operational characterization of information-preserving structures (IPS) -- encompassing noiseless subsystems, decoherence-free subspaces, pointer bases, and error-correcting codes -- by demonstrating that they are isometric to fixed points of unital quantum processes. Using this, we show that every IPS is a matrix algebra. We further establish a structure theorem for the fixed states and observables of an arbitrary process, which unifies the Schrödinger and Heisenberg pictures, places restrictions on physically allowed kinds of information, and provides an efficient algorithm for finding all noiseless and unitarily noiseless subsystems of the process.
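
    A small numerical sketch of the fixed-point characterization (the dephasing example and the vec convention are choices made here for illustration, not the paper's algorithm): fixed points of the channel's superoperator with eigenvalue 1 span a matrix algebra.

        import numpy as np

        def superoperator(kraus_ops):
            # Row-stacking vec convention: rho -> sum_k K rho K^dag becomes
            # vec(rho) -> S vec(rho) with S = sum_k kron(K, conj(K)).
            d = kraus_ops[0].shape[0]
            S = np.zeros((d * d, d * d), dtype=complex)
            for K in kraus_ops:
                S += np.kron(K, K.conj())
            return S

        # Example unital process: phase damping in the Z basis.
        p = 0.3
        kraus = [np.sqrt(1 - p) * np.eye(2), np.sqrt(p) * np.diag([1.0, -1.0])]

        evals, evecs = np.linalg.eig(superoperator(kraus))
        fixed = [evecs[:, i].reshape(2, 2) for i in np.where(np.isclose(evals, 1.0))[0]]
        for F in fixed:
            print(np.round(F, 3))
        # For dephasing the fixed points span the diagonal matrices: a
        # commutative matrix algebra, i.e. a classical pointer basis.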

    Exponential speed-up with a single bit of quantum information: Testing the quantum butterfly effect

    We present an efficient quantum algorithm to measure the average fidelity decay of a quantum map under perturbation using a single bit of quantum information. Our algorithm scales only with the complexity of the map under investigation, so for those maps admitting an efficient gate decomposition, it provides an exponential speed-up over known classical procedures. Fidelity decay is important in the study of complex dynamical systems, where it is conjectured to be a signature of quantum chaos. Our result also illustrates the role of chaos in the process of decoherence.
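
    For small dimensions the figure of merit can be computed directly; the sketch below (the dimension, maps, and perturbation are all illustrative assumptions) evaluates the trace quantity that a one-clean-qubit circuit would estimate from the polarization of its single clean qubit.

        import numpy as np

        rng = np.random.default_rng(1)

        def random_unitary(d):
            # Haar-distributed unitary via QR of a complex Gaussian matrix.
            z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
            q, r = np.linalg.qr(z)
            return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

        d = 16                   # e.g. four qubits
        U0 = random_unitary(d)   # unperturbed map

        # Perturbed map: U_eps = exp(-i * eps * H) @ U0 for a random Hermitian H.
        H = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
        H = (H + H.conj().T) / 2
        w, V = np.linalg.eigh(H)
        Ueps = (V * np.exp(-1j * 0.05 * w)) @ V.conj().T @ U0

        # |Tr(U0^n^dag Ueps^n)|^2 / d^2: the average fidelity decay after n steps.
        A, B = np.eye(d, dtype=complex), np.eye(d, dtype=complex)
        for n in range(1, 11):
            A, B = U0 @ A, Ueps @ B
            print(f"step {n}: F = {abs(np.trace(A.conj().T @ B) / d) ** 2:.4f}")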

    Two-Qubit Gate Set Tomography with Fewer Circuits

    Gate set tomography (GST) is a self-consistent and highly accurate method for the tomographic reconstruction of a quantum information processor's quantum logic operations, including gates, state preparations, and measurements. However, GST's experimental cost grows exponentially with qubit number. For characterizing even just two qubits, a standard GST experiment may have tens of thousands of circuits, making it prohibitively expensive for many platforms. We show that, because GST experiments are massively overcomplete, many circuits can be discarded. This dramatically reduces GST's experimental cost while still maintaining its Heisenberg-like scaling in accuracy. We show how to exploit the structure of GST circuits to determine which ones are superfluous. We confirm the efficacy of the resulting experiment designs both through numerical simulations and via the Fisher information of those designs. We also explore the impact of these techniques on the prospects of three-qubit GST.
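
    A toy sketch of the pruning logic (the random Jacobian and the greedy rank test are stand-ins, not the paper's actual experiment-design algorithm): drop circuits as long as the Fisher information of the remaining design stays full rank and well conditioned.

        import numpy as np

        rng = np.random.default_rng(2)

        # Stand-in Jacobian: row i is the derivative of circuit i's outcome
        # probability with respect to the gate-set parameters.
        n_circuits, n_params = 200, 30
        J = rng.normal(size=(n_circuits, n_params))

        def fisher(rows):
            Jsub = J[sorted(rows)]
            return Jsub.T @ Jsub

        # Greedy pruning: discard a circuit whenever the design remains
        # informationally complete (full-rank, well-conditioned Fisher matrix).
        keep = set(range(n_circuits))
        for c in range(n_circuits):
            trial = keep - {c}
            F = fisher(trial)
            if np.linalg.matrix_rank(F) == n_params and np.linalg.cond(F) < 1e6:
                keep = trial

        print(f"kept {len(keep)} of {n_circuits} circuits")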