
    Exact and Approximate Unitary 2-Designs: Constructions and Applications

    We consider an extension of the concept of spherical t-designs to the unitary group in order to develop a unified framework for analyzing the resource requirements of randomized quantum algorithms. We show that certain protocols based on twirling require a unitary 2-design. We describe an efficient construction for an exact unitary 2-design based on the Clifford group, and then develop a method for generating an epsilon-approximate unitary 2-design that requires only O(n log(1/epsilon)) gates, where n is the number of qubits and epsilon is an appropriate measure of precision. These results lead to a protocol with exponential resource savings over existing experimental methods for estimating the characteristic fidelities of physical quantum processes.

    Black holes as mirrors: quantum information in random subsystems

    We study information retrieval from evaporating black holes, assuming that the internal dynamics of a black hole is unitary and rapidly mixing, and assuming that the retriever has unlimited control over the emitted Hawking radiation. If the evaporation of the black hole has already proceeded past the "half-way" point, where half of the initial entropy has been radiated away, then additional quantum information deposited in the black hole is revealed in the Hawking radiation very rapidly. Information deposited prior to the half-way point remains concealed until the half-way point, and then emerges quickly. These conclusions hold because typical local quantum circuits are efficient encoders for quantum error-correcting codes that nearly achieve the capacity of the quantum erasure channel. Our estimate of a black hole's information retention time, based on speculative dynamical assumptions, is just barely compatible with the black hole complementarity hypothesis. Comment: 18 pages, 2 figures. (v2): discussion of decoding complexity clarified.

    Randomized benchmarking of single and multi-qubit control in liquid-state NMR quantum information processing

    Being able to quantify the level of coherent control in a proposed device implementing a quantum information processor (QIP) is an important task both for comparing different devices and for assessing a device's prospects with regard to achieving fault-tolerant quantum control. We implement in a liquid-state nuclear magnetic resonance QIP the randomized benchmarking protocol presented by Knill et al (PRA 77: 012307 (2008)). We report an error per randomized pi/2 pulse of (1.3 ± 0.1) × 10^{-4} with a single-qubit QIP, and show an experimentally relevant error model where randomized benchmarking gives a signature fidelity decay which cannot be interpreted as a single error per gate. We explore and experimentally investigate multi-qubit extensions of this protocol and report an average error rate for one- and two-qubit gates of (4.7 ± 0.3) × 10^{-3} for a three-qubit QIP. We estimate that these error rates are still not decoherence-limited and thus can be improved with modifications to the control hardware and software. Comment: 10 pages, 6 figures, submitted version.
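    The quantity extracted by randomized benchmarking is the decay constant of the survival probability versus sequence length. A minimal numerical sketch of the zeroth-order model F(m) = A p^m + B (with A = B = 1/2 for depolarizing noise and ideal state preparation and measurement) on synthetic data; the parameter values below are illustrative, not the paper's measured ones.

```python
import numpy as np

rng = np.random.default_rng(0)

# Zeroth-order randomized-benchmarking model for a single qubit:
# survival probability F(m) = A p^m + B, with A = B = 1/2 for
# depolarizing noise and perfect state preparation/measurement.
p_true = 0.98            # depolarizing parameter per gate (assumed value)
lengths = np.arange(1, 50, 4)
F = 0.5 * p_true ** lengths + 0.5
F = F + rng.normal(0, 1e-4, size=F.shape)   # small synthetic measurement noise

# With B = 1/2 known, log(2F - 1) = m log(p) is linear in m
slope, _ = np.polyfit(lengths, np.log(2 * F - 1), 1)
p_est = np.exp(slope)
r_est = (1 - p_est) / 2   # average error per gate for a qubit

print(f"estimated p = {p_est:.4f}, error per gate r = {r_est:.2e}")
```

    The point of the abstract's single-qubit result is that a single exponential of this form fits the data well; the error model they highlight is one where the decay visibly deviates from this single-exponential shape.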

    Quantum non-malleability and authentication

    In encryption, non-malleability is a highly desirable property: it ensures that adversaries cannot manipulate the plaintext by acting on the ciphertext. Ambainis, Bouda and Winter gave a definition of non-malleability for the encryption of quantum data. In this work, we show that this definition is too weak, as it allows adversaries to "inject" plaintexts of their choice into the ciphertext. We give a new definition of quantum non-malleability which resolves this problem. Our definition is expressed in terms of entropic quantities, considers stronger adversaries, and does not assume secrecy. Rather, we prove that quantum non-malleability implies secrecy; this is in stark contrast to the classical setting, where the two properties are completely independent. For unitary schemes, our notion of non-malleability is equivalent to encryption with a two-design (and hence also to the definition of Ambainis et al.). Our techniques also yield new results regarding the closely-related task of quantum authentication. We show that "total authentication" (a notion recently proposed by Garg, Yuen and Zhandry) can be satisfied with two-designs, a significant improvement over the eight-design construction of Garg et al. We also show that, under a mild adaptation of the rejection procedure, both total authentication and our notion of non-malleability yield quantum authentication as defined by Dupuis, Nielsen and Salvail. Comment: 20+13 pages, one figure. v2: published version plus extra material. v3: references added and updated.

    Underreporting of meningococcal disease incidence in the Netherlands: results from a capture-recapture analysis based on three registration sources with correction for false positive diagnoses.

    In order to come to a reliable evaluation of the effectiveness of the chosen vaccination policy regarding meningococcal disease, the completeness of registrations on meningococcal disease in the Netherlands was estimated with the capture-recapture method. Data over 1993-1998 were collected from (A) mandatory notifications (n = 2926); (B) hospital registration (n = 3968); (C) laboratory surveillance (n = 3484). As the standard capture-recapture method does not take into account false positive diagnoses, we developed a model to adjust for the lack of specificity of our sources. We estimated that 1363 cases were not registered in any of the three sources in the period of study. The completeness of the three sources was therefore estimated at 49% for source A, 67% for source B and 58% for source C. After adjustment for false positive diagnoses, the completeness of sources A, B, and C was estimated at 52%, 70% and 62%, respectively. The capture-recapture method offers an attractive approach to estimating the completeness of surveillance sources and hence contributes to a more accurate estimate of the disease burden under study. However, the method does not account for higher-order interactions or for the presence of false positive diagnoses. Even with these limitations, the capture-recapture method elucidates the (in)completeness of sources and gives a rough estimate of this (in)completeness. This makes more accurate monitoring of disease incidence possible and hence contributes to a more reliable foundation for the design and evaluation of health interventions such as vaccination programs.
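    For intuition about how capture-recapture yields a completeness estimate, here is a minimal two-source Lincoln-Petersen sketch. The source sizes n_A and n_B are taken from the abstract, but the overlap count m_AB is a hypothetical value invented for illustration; the study itself uses three sources and a log-linear model, which this toy estimator does not reproduce.

```python
# Two-source capture-recapture (Lincoln-Petersen) sketch.
# n_A, n_B come from the abstract; m_AB is a hypothetical overlap.
n_A = 2926     # cases in source A (mandatory notifications)
n_B = 3968     # cases in source B (hospital registration)
m_AB = 2000    # hypothetical number of cases found in both A and B

# Lincoln-Petersen estimate of the total number of cases:
# assumes independent sources and no false positives.
N_hat = n_A * n_B / m_AB
completeness_A = n_A / N_hat
print(f"estimated total N = {N_hat:.0f}, completeness of A = {completeness_A:.0%}")
```

    The intuition: if source B recaptures a fraction m_AB / n_A of A's cases, that same fraction is taken as B's coverage of the whole population, giving the total N_hat. False positives inflate n_A and n_B without inflating the overlap proportionally, which is why the authors needed an adjusted model.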

    Quantum authentication with key recycling

    We show that a family of quantum authentication protocols introduced in [Barnum et al., FOCS 2002] can be used to construct a secure quantum channel and additionally recycle all of the secret key if the message is successfully authenticated, and recycle part of the key if tampering is detected. We give a full security proof that constructs the secure channel given only insecure noisy channels and a shared secret key. We also prove that the number of recycled key bits is optimal for this family of protocols, i.e., there exists an adversarial strategy to obtain all non-recycled bits. Previous works recycled less key and only gave partial security proofs, since they did not consider all possible distinguishers (environments) that may be used to distinguish the real setting from the ideal secure quantum channel and secret key resource. Comment: 38+17 pages, 13 figures. v2: constructed ideal secure channel and secret key resource have been slightly redefined; also added a proof in the appendix for quantum authentication without key recycling that has better parameters and only requires a weak purity testing code.

    Europe under Pressure

    The past years have been characterized by a massive influx of migrants crossing the Union’s external borders seeking asylum. Illegal migration, exploitation of social welfare systems, foreign infiltration and the instrumentalization of religion, condensed in terror attacks, determine today’s changed attitude towards foreigners, refugees and migrants, and therefore strongly impact the current European political agenda. Angelika C. Dankert describes the development of the EU and provides information on the events that led to the outbreak and spill-over of the Arab Spring. The roots and origins of Jihadist ideology as well as the goals of religiously motivated terrorism are illustrated, and European standards on morals and values are critically questioned. Through an investigation of current matters in the fields of law, security and interculturality, this book reveals the biggest geopolitical challenge of the 21st century.

    Tight informationally complete quantum measurements

    We introduce a class of informationally complete positive-operator-valued measures which are, in analogy with a tight frame, "as close as possible" to orthonormal bases for the space of quantum states. These measures are distinguished by an exceptionally simple state-reconstruction formula which allows "painless" quantum state tomography. Complete sets of mutually unbiased bases and symmetric informationally complete positive-operator-valued measures are both members of this class, the latter being the unique minimal rank-one members. Recast as ensembles of pure quantum states, the rank-one members are in fact equivalent to weighted 2-designs in complex projective space. These measures are shown to be optimal for quantum cloning and linear quantum state tomography. Comment: 20 pages. Final version.
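    The simplicity of the state-reconstruction formula is easy to see in the smallest case. The sketch below uses the standard qubit SIC-POVM (the Bloch-sphere tetrahedron) and the textbook linear-inversion formula rho = sum_k [(d+1) p_k - 1/d] Pi_k, where E_k = Pi_k / d are the POVM elements; this is a generic illustration of the reconstruction, not code from the paper.

```python
import numpy as np

# State reconstruction from a qubit SIC-POVM (a tight rank-one IC-POVM).
# Formula: rho = sum_k [(d+1) p_k - 1/d] Pi_k, with SIC elements E_k = Pi_k / d.
d = 2
paulis = [np.array([[0, 1], [1, 0]], dtype=complex),
          np.array([[0, -1j], [1j, 0]]),
          np.diag([1.0 + 0j, -1.0])]

# Tetrahedron Bloch vectors define the four SIC projectors Pi_k
a = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
Pi = [0.5 * (np.eye(2) + sum(v[i] * paulis[i] for i in range(3))) for v in a]

# A test state with an arbitrary Bloch vector inside the ball
r = np.array([0.3, -0.5, 0.4])
rho = 0.5 * (np.eye(2) + sum(r[i] * paulis[i] for i in range(3)))

# Measurement probabilities p_k = tr(rho Pi_k) / d, then linear reconstruction
p = np.array([np.trace(rho @ P).real / d for P in Pi])
rho_rec = sum(((d + 1) * p[k] - 1 / d) * Pi[k] for k in range(4))

assert np.allclose(rho_rec, rho, atol=1e-12)
```

    No matrix inversion or optimization is needed: the outcome probabilities plug directly into a fixed linear combination of the POVM projectors, which is the "painless" tomography the abstract refers to.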

    Efficient and feasible state tomography of quantum many-body systems

    We present a novel method to perform quantum state tomography for many-particle systems, particularly suitable for estimating states in lattice systems such as ultra-cold atoms in optical lattices. We show that the need for measuring a tomographically complete set of observables can be overcome by letting the state evolve under suitably chosen random circuits followed by the measurement of a single observable. We generalize known results about the approximation of unitary 2-designs, i.e., certain classes of random unitary matrices, by random quantum circuits and connect our findings to the theory of quantum compressed sensing. We show that for ultra-cold atoms in optical lattices, established techniques like optical super-lattices, laser speckles, and time-of-flight measurements are sufficient to perform fully certified, assumption-free tomography. Combining our approach with tensor network methods - in particular the theory of matrix-product states - we identify situations where the effort of reconstruction is even constant in the number of lattice sites, allowing in principle to perform tomography on large-scale systems readily available in present experiments. Comment: 10 pages, 3 figures, minor corrections, discussion added, emphasizing that no single-site addressing is needed at any stage of the scheme when implemented in optical lattice systems.
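    A toy single-qubit version of the central idea: randomize the evolution, then measure one fixed observable. Each Haar-random unitary U turns the fixed observable Z into a new effective observable U^dag Z U, and a handful of such settings determines the state by least squares. This is only a sketch of the principle; the paper's actual scheme uses random circuits on lattices, certification, and compressed sensing, none of which appear here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Evolve under random unitaries, then measure one fixed observable (Z).
# For a qubit, <U^dag Z U> is linear in the Bloch vector, so a few random
# settings determine the state by least squares.
paulis = [np.array([[0, 1], [1, 0]], dtype=complex),
          np.array([[0, -1j], [1j, 0]]),
          np.diag([1.0 + 0j, -1.0])]
Z = paulis[2]

def haar_unitary(d, rng):
    """Haar-random unitary via QR of a complex Gaussian matrix."""
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    Q, R = np.linalg.qr(A)
    return Q * (np.diag(R) / np.abs(np.diag(R)))

r_true = np.array([0.2, 0.6, -0.3])   # arbitrary test state (Bloch vector)
rho = 0.5 * (np.eye(2) + sum(r_true[i] * paulis[i] for i in range(3)))

# Each random setting contributes one row of a linear system B @ r = m
B, m = [], []
for _ in range(12):
    U = haar_unitary(2, rng)
    O = U.conj().T @ Z @ U                     # effective rotated observable
    B.append([np.trace(O @ P).real / 2 for P in paulis])
    m.append(np.trace(O @ rho).real)
r_est, *_ = np.linalg.lstsq(np.array(B), np.array(m), rcond=None)

assert np.allclose(r_est, r_true, atol=1e-8)
```

    The same measurement apparatus is used in every run; all the tomographic information comes from the randomized evolutions, which is what removes the need for a tomographically complete measurement set.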

    Decoupling with unitary approximate two-designs

    Consider a bipartite system, of which one subsystem, A, undergoes a physical evolution separated from the other subsystem, R. One may ask under which conditions this evolution destroys all initial correlations between the subsystems A and R, i.e. decouples the subsystems. A quantitative answer to this question is provided by decoupling theorems, which have been developed recently in the area of quantum information theory. This paper builds on preceding work, which shows that decoupling is achieved if the evolution on A consists of a typical unitary, chosen with respect to the Haar measure, followed by a process that adds sufficient decoherence. Here, we prove a generalized decoupling theorem for the case where the unitary is chosen from an approximate two-design. A main implication of this result is that decoupling is physical, in the sense that it occurs already for short sequences of random two-body interactions, which can be modeled as efficient circuits. Our decoupling result is independent of the dimension of the R system, which shows that approximate 2-designs are appropriate for decoupling even if the dimension of this system is large. Comment: Published version.
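    The decoupling effect itself is easy to observe numerically. In the sketch below (dimensions chosen arbitrarily for illustration, and with a Haar-random unitary rather than an approximate two-design), one qubit of a 7-qubit system A starts maximally entangled with a reference qubit R; after a random unitary on A, discarding most of A leaves the kept qubit nearly uncorrelated with R.

```python
import numpy as np

rng = np.random.default_rng(2)

# Decoupling sketch: qubit A0 of a 7-qubit system A is maximally entangled
# with a reference qubit R. A Haar-random unitary acts on A, then all but
# one qubit of A (called A1) is discarded. The joint state of A1 and R
# should be close to maximally mixed, i.e. decoupled.
n = 7
dA = 2 ** n

def haar_unitary(d, rng):
    """Haar-random unitary via QR of a complex Gaussian matrix."""
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    Q, R = np.linalg.qr(A)
    return Q * (np.diag(R) / np.abs(np.diag(R)))

# |psi> = (|0...0>_A |0>_R + |10...0>_A |1>_R) / sqrt(2), indexed as (A, R)
psi = np.zeros((dA, 2), dtype=complex)
psi[0, 0] = psi[dA // 2, 1] = 1 / np.sqrt(2)

def trace_distance_to_mixed(U):
    out = (U @ psi).reshape(2, dA // 2, 2)        # axes: (A1, A2, R)
    rho = np.einsum('iar,jas->irjs', out, out.conj()).reshape(4, 4)
    eig = np.linalg.eigvalsh(rho - np.eye(4) / 4)  # vs pi_A1 x rho_R = I/4
    return 0.5 * np.abs(eig).sum()

dists = [trace_distance_to_mixed(haar_unitary(dA, rng)) for _ in range(4)]
print("mean trace distance to pi_A1 x rho_R:", round(np.mean(dists), 3))
```

    The trace distance shrinks as the discarded subsystem A2 grows, in line with the decoupling theorems the paper generalizes; the paper's contribution is that an approximate two-design, implementable by an efficient circuit, already suffices in place of a full Haar-random unitary.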