
    Quantum leakage detection using a model-independent dimension witness

    Users of quantum computers must be able to confirm they are indeed functioning as intended, even when the devices are remotely accessed. In particular, if the Hilbert space dimensions of the components are not as advertised -- for instance, if the qubits suffer leakage -- errors can ensue and protocols may be rendered insecure. We refine the method of delayed vectors, adapted from classical chaos theory to quantum systems, and apply it remotely on the IBMQ platform -- a quantum computer composed of transmon qubits. The method witnesses, in a model-independent fashion, dynamical signatures of higher-dimensional processes. We present evidence, under mild assumptions, that the IBMQ transmons suffer state leakage, with a p-value no larger than 5×10⁻⁴ under a single-qubit operation. We also estimate the number of shots necessary for revealing leakage in a two-qubit system.
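
    For intuition, a minimal sketch of the delayed-vector idea (our own toy construction and function names, not the paper's implementation or its statistical analysis): a signal generated by genuinely low-dimensional linear dynamics obeys a low-order recurrence, so its delay vectors span a small subspace, whereas leakage adds frequencies and raises the rank.

```python
# Toy sketch of the delay-vector witness. A qubit observable evolving under a
# fixed gate satisfies a low-order linear recurrence, so its delay vectors are
# confined to a low-dimensional subspace; leakage to an extra level adds a
# second frequency and lifts the smallest singular value of the delay matrix.
import numpy as np

def delay_matrix(series, k):
    """Rows are the length-k delay vectors of the time series."""
    return np.array([series[i:i + k] for i in range(len(series) - k + 1)])

def dimension_witness(series, k):
    """Smallest singular value of the delay matrix: compatible with zero means
    the data are consistent with an effective dimension below k."""
    return np.linalg.svd(delay_matrix(series, k), compute_uv=False)[-1]

t = np.arange(60)
qubit = np.cos(0.9 * t)                          # one frequency: rank-2 delay vectors
leaky = np.cos(0.9 * t) + 0.3 * np.cos(2.3 * t)  # a leaked level adds a second frequency

print(dimension_witness(qubit, 4))   # ~1e-15: no sign of dimension >= 4
print(dimension_witness(leaky, 4))   # O(1): witnesses higher-dimensional dynamics
```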

    Coarse-graining in retrodictive quantum state tomography

    Quantum state tomography often operates in the highly idealised scenario of assuming perfect measurements. The errors implied by such an approach are entwined with other imperfections relating to the information processing protocol or application of interest. We consider the problem of retrodicting the quantum state of a system as it existed prior to the application of random but known phase errors, allowing those errors to be separated and removed. The continuously random nature of the errors implies that there is only one click per measurement outcome -- a feature having a drastically adverse effect on data-processing times. We provide a thorough analysis of coarse-graining under various reconstruction algorithms, finding dramatic increases in speed for only modest sacrifices in fidelity.
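
    As a hypothetical illustration of the coarse-graining step for a single qubit (the phase convention, bin count and helper names are ours, not the paper's pipeline), single-click records tagged with known random phases can be binned, with each bin summarised by an averaged effective POVM element and an aggregated count that standard iterative reconstructions can digest quickly.

```python
# Hypothetical coarse-graining sketch: each click carries a known random phase,
# so naively every measurement outcome is unique. Binning the phases yields a
# small set of effective POVM elements with aggregated counts.
import numpy as np

def projector(phi, outcome):
    """Projector onto (|0> + outcome * e^{i phi} |1>)/sqrt(2), outcome = +1 or -1."""
    v = np.array([1.0, outcome * np.exp(1j * phi)]) / np.sqrt(2)
    return np.outer(v, v.conj())

def coarse_grain(records, n_bins):
    """records: iterable of (phi, outcome) single-click events.
    Returns (effective POVM element, aggregated count) pairs, one per bin/outcome;
    for reconstruction the elements should afterwards be rescaled so they sum to I."""
    bins = {}
    for phi, outcome in records:
        key = (int(phi / (2 * np.pi) * n_bins) % n_bins, outcome)
        entry = bins.setdefault(key, [np.zeros((2, 2), complex), 0])
        entry[0] += projector(phi, outcome)   # running sum of projectors in this bin
        entry[1] += 1                         # clicks landing in this bin
    return [(op / n, n) for op, n in bins.values()]

rng = np.random.default_rng(0)
records = [(rng.uniform(0, 2 * np.pi), rng.choice([1, -1])) for _ in range(10_000)]
effective = coarse_grain(records, n_bins=16)
print(len(records), "clicks ->", len(effective), "effective outcomes")
```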

    Towards optimal experimental tests on the reality of the quantum state

    The Barrett–Cavalcanti–Lal–Maroney (BCLM) argument stands as the most effective means of demonstrating the reality of the quantum state. Its advantages include being derived from very few assumptions, and a robustness to experimental error. Finding the best way to implement the argument experimentally is an open problem, however, and involves a clever choice of sets of states and measurements. I show that techniques from convex optimisation theory can be leveraged to numerically search for these sets, which then form a recipe for experiments that allow the strongest statements about the ontology of the wavefunction to be made. The optimisation approach presented is versatile, efficient and can take account of the finite errors present in any real experiment. I find significantly improved low-cardinality sets which are guaranteed to be partially optimal for a BCLM test in low Hilbert space dimension. I further show that mixed states can outperform pure states.
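
    One convex sub-problem relevant to BCLM-style tests is, for a fixed set of states, finding the measurement that best antidistinguishes them; this is a semidefinite program. The sketch below (cvxpy is our choice of tooling, and the paper's search additionally optimises over the states themselves) recovers the textbook fact that the symmetric qubit trine is exactly antidistinguishable.

```python
# Hedged sketch of one SDP sub-problem behind BCLM-style tests: for fixed
# states {rho_i}, find the POVM minimising the average probability of
# outcome i firing on state rho_i (the antidistinguishing error).
import numpy as np
import cvxpy as cp

def antidistinguish(states):
    """Return (optimal average error, POVM) for antidistinguishing the given states."""
    d, n = states[0].shape[0], len(states)
    M = [cp.Variable((d, d), hermitian=True) for _ in range(n)]
    constraints = [m >> 0 for m in M] + [sum(M) == np.eye(d)]
    error = sum(cp.real(cp.trace(M[i] @ states[i])) for i in range(n)) / n
    problem = cp.Problem(cp.Minimize(error), constraints)
    problem.solve()
    return problem.value, [m.value for m in M]

# The symmetric qubit "trine" (Bloch vectors 120 degrees apart) is a textbook
# example of an exactly antidistinguishable set, so the optimal error is ~0.
kets = [np.array([np.cos(a / 2), np.sin(a / 2)]) for a in (0, 2 * np.pi / 3, 4 * np.pi / 3)]
states = [np.outer(k, k.conj()) for k in kets]
error, povm = antidistinguish(states)
print("antidistinguishing error:", error)
```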

    Subtleties of witnessing quantum coherence in non-isolated systems

    Identifying non-classicality unambiguously and inexpensively is a long-standing open challenge in physics. The No-Signalling-In-Time protocol was developed as an experimental test for macroscopic realism, and serves as a witness of quantum coherence in isolated quantum systems by comparing the quantum state to its completely dephased counterpart. We show that it provides a lower bound on a certain resource-theoretic coherence monotone. We go on to generalise the protocol to the case where the system of interest is coupled to an environment. Depending on the manner of the generalisation, the resulting witness either reports on system coherence alone, or on a disjunction of system coherence with either (i) the existence of non-classical system-environment correlations or (ii) non-negligible dynamics in the environment. These are distinct failure modes of the Born approximation in non-isolated systems.
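
    An illustrative single-qubit sketch of the comparison at the heart of the witness (the state, dephasing basis and final measurement are our own toy choices; the paper's bound concerns a specific monotone): the statistics of one measurement on the state and on its completely dephased copy differ only if the state is coherent, and for a qubit the difference can never exceed the l1-norm coherence.

```python
# Toy witness: compare one measurement's statistics on rho and on its
# completely dephased counterpart; any difference certifies Z-basis coherence.
import numpy as np

rho = np.array([[0.7, 0.3], [0.3, 0.3]], dtype=complex)  # state with Z-basis coherence
dephased = np.diag(np.diag(rho))                          # completely dephased counterpart

plus = np.array([1.0, 1.0]) / np.sqrt(2)                  # final measurement: X basis, outcome +
p_coh = np.real(plus.conj() @ rho @ plus)
p_deph = np.real(plus.conj() @ dephased @ plus)

witness = abs(p_coh - p_deph)      # 0.3: nonzero, so rho is coherent in the Z basis
c_l1 = 2 * abs(rho[0, 1])          # 0.6: l1-norm coherence of rho
print(witness, "<=", c_l1)
```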

    Quantum process tomography via completely positive and trace-preserving projection

    We present an algorithm for projecting superoperators onto the set of completely positive, trace-preserving maps. When combined with gradient descent of a cost function, the procedure results in an algorithm for quantum process tomography: finding the quantum process that best fits a set of sufficient observations. We compare the performance of our algorithm to the diluted iterative algorithm, as well as to second-order solvers interfaced with the popular CVX package for MATLAB, and find it to be significantly faster and more accurate while guaranteeing a physical estimate.
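
    For intuition only, a naive alternating-projection sketch of the CPTP projection acting on the Choi matrix (the paper's algorithm composes the two projections more carefully and embeds them in gradient descent on the tomographic cost; the plain alternation and function names below are ours):

```python
# Convention: Choi matrix J = sum_{ij} |i><j| (x) E(|i><j|);
# complete positivity means J >= 0, trace preservation means Tr_out(J) = I.
import numpy as np

def project_cp(J):
    """Project onto positive semidefinite matrices by clipping negative eigenvalues."""
    vals, vecs = np.linalg.eigh(J)
    return (vecs * np.clip(vals, 0, None)) @ vecs.conj().T

def project_tp(J, d):
    """Orthogonal projection onto the affine set Tr_out(J) = I."""
    Jr = J.reshape(d, d, d, d)                 # indices: (in, out, in', out')
    tr_out = np.trace(Jr, axis1=1, axis2=3)    # partial trace over the output
    return J - np.kron(tr_out - np.eye(d), np.eye(d)) / d

def project_cptp(J, d, iters=300):
    """Plain alternation between the two sets; returns a matrix that is exactly
    positive and trace preserving to good approximation (Dykstra-style
    corrections would give the true closest CPTP point)."""
    for _ in range(iters):
        J = project_cp(project_tp(J, d))
    return J

# Example: repair a noisy estimate of the identity channel's Choi matrix.
d = 2
phi = np.zeros(d * d, dtype=complex)
phi[[0, 3]] = 1                                           # |00> + |11>
rng = np.random.default_rng(1)
noisy = np.outer(phi, phi.conj()) + 0.1 * rng.standard_normal((d * d, d * d))
noisy = (noisy + noisy.conj().T) / 2                      # keep the estimate Hermitian
J = project_cptp(noisy, d)
print(np.linalg.eigvalsh(J).min())                        # >= 0
print(np.trace(J.reshape(d, d, d, d), axis1=1, axis2=3))  # ~ identity
```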

    Quantum sensors based on weak-value amplification cannot overcome decoherence

    Sensors that harness exclusively quantum phenomena (such as entanglement) can achieve superior performance compared to those employing only classical principles. Recently, a technique based on postselected, weakly-performed measurements has emerged as a method of overcoming technical noise in the detection and estimation of small interaction parameters, particularly in optical systems. The question of which other types of noise may be combatted remains open. We here analyze whether the effect can overcome decoherence in a typical field-sensing scenario. Benchmarking a weak, postselected measurement strategy against a strong, direct strategy, we conclude that no advantage is achievable, and that even a small amount of decoherence proves catastrophic to the weak-value amplification technique.
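
    A toy numerical sketch of why decoherence is so damaging (our own minimal model, not the paper's estimation-theoretic benchmarking): the amplified pointer shift scales with the generalised weak value A_w = Tr(Π_f A ρ)/Tr(Π_f ρ), which collapses once the pre-selected state is even slightly dephased.

```python
# Toy model: pre-select |+>, post-select nearly orthogonal to it, weakly couple
# sigma_z to a pointer. Dephasing the pre-selected state before post-selection
# destroys the near-orthogonality that makes the weak value (and hence the
# amplification) large.
import numpy as np

sz = np.diag([1.0, -1.0])
psi_i = np.array([1.0, 1.0]) / np.sqrt(2)               # pre-selection |+>
theta = np.pi / 4 - 0.05                                # post-selection nearly orthogonal to |+>
psi_f = np.array([np.cos(theta), -np.sin(theta)])
P_f = np.outer(psi_f, psi_f.conj())

def weak_value(gamma):
    """Generalised weak value of sigma_z when the pre-selected state is
    dephased with strength gamma (gamma = 0: pure, gamma = 1: fully dephased)."""
    rho = (1 - gamma) * np.outer(psi_i, psi_i.conj()) + gamma * np.eye(2) / 2
    return np.trace(P_f @ sz @ rho) / np.trace(P_f @ rho)

for gamma in (0.0, 0.01, 0.1, 1.0):
    # even 1% dephasing slashes the amplification by a large factor
    print(f"gamma = {gamma:4}:  A_w = {weak_value(gamma).real:8.2f}")
```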