
    Error-tolerant oblivious transfer in the noisy-storage model

    The noisy-storage model of quantum cryptography allows for information-theoretically secure two-party computation based on the assumption that a cheating user has access to at most an imperfect, noisy quantum memory, whereas the honest users do not need a quantum memory at all. In general, the noisier the quantum memory of the cheating user, the more secure the implementation of oblivious transfer, a primitive that enables universal secure two-party and multi-party computation. For experimental implementations of oblivious transfer, one also has to account for the fact that the devices held by the honest users are lossy and noisy, and error correction needs to be applied to correct these trusted errors. The latter are expected to reduce the security of the protocol, since a cheating user may hide behind the trusted noise. Here we leverage entropic uncertainty relations to derive tight bounds on the security of oblivious transfer with trusted and untrusted noise. In particular, we discuss noisy storage and bounded storage, with independent and correlated noise. Comment: v2: improved presentation, results added on bounded memory and correlated noise.
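
    A minimal sketch of the ideal 1-out-of-2 oblivious-transfer functionality may help fix ideas: it shows only the classical input/output behaviour that such protocols aim to realize, not the quantum protocol of the paper, and the names (ot_ideal, s0, s1, c) are illustrative placeholders.

        # Ideal 1-out-of-2 oblivious transfer: the sender holds two strings,
        # the receiver holds a choice bit c, obtains the chosen string, and
        # learns nothing about the other; the sender learns nothing about c.
        def ot_ideal(s0: bytes, s1: bytes, c: int) -> bytes:
            assert c in (0, 1)
            return s1 if c else s0

        # Example: a receiver with choice bit 1 obtains only s1.
        print(ot_ideal(b"secret-0", b"secret-1", 1))  # b'secret-1'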

    Commitment and Oblivious Transfer in the Bounded Storage Model with Errors

    The bounded storage model restricts the memory of an adversary in a cryptographic protocol, rather than restricting its computational power, making information-theoretically secure protocols feasible. We present the first protocols for commitment and oblivious transfer in the bounded storage model with errors, i.e., the model where the public random sources available to the two parties are not exactly the same, but are only required to be within a small Hamming distance of one another. Commitment and oblivious transfer protocols were previously known only for the error-free variant of the bounded storage model, which is harder to realize.
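
    As a hedged illustration of the "with errors" setting described above (the parameter names n and delta are assumptions introduced here, not the paper's notation): the two parties hold copies of the public random source that agree except in a small fraction of positions, measured by Hamming distance.

        import random

        def noisy_copy(x: list[int], delta: float) -> list[int]:
            # Flip each bit independently with probability delta, so the
            # relative Hamming distance from x is about delta.
            return [b ^ (random.random() < delta) for b in x]

        def hamming(x: list[int], y: list[int]) -> int:
            return sum(a != b for a, b in zip(x, y))

        n = 10_000
        x = [random.randint(0, 1) for _ in range(n)]  # party A's view of the source
        y = noisy_copy(x, delta=0.01)                 # party B's noisy view
        print(hamming(x, y) / n)                      # roughly 0.01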

    Limitations of Passive Protection of Quantum Information

    The ability to protect quantum information from the effect of noise is one of the major goals of quantum information processing. In this article, we study limitations on the asymptotic stability of quantum information stored in passive N-qubit systems. We consider the effect of small imperfections in the implementation of the protecting Hamiltonian, in the form of perturbations or of weak coupling to a ground-state environment. We prove that, regardless of the protecting Hamiltonian, there exists a perturbed evolution that necessitates a final error-correcting step when the state of the memory is read. Such an error-correction step is shown to require a finite error threshold, the lack thereof being exemplified by the 3D compass model. We go on to present explicit weak Hamiltonian perturbations which destroy the logical information stored in the 2D toric code in a time O(log(N)). Comment: 17 pages and appendices.

    A Tight High-Order Entropic Quantum Uncertainty Relation With Applications

    We derive a new entropic quantum uncertainty relation involving min-entropy. The relation is tight and can be applied in various quantum-cryptographic settings. Protocols for quantum 1-out-of-2 Oblivious Transfer and quantum Bit Commitment are presented, and the uncertainty relation is used to prove the security of these protocols in the bounded quantum-storage model according to new strong security definitions. As another application, we consider the realistic setting of Quantum Key Distribution (QKD) against quantum-memory-bounded eavesdroppers. The uncertainty relation allows us to prove the security of QKD protocols in this setting while tolerating considerably higher error rates compared to the standard model with unbounded adversaries. For instance, for the six-state protocol with one-way communication, a bit-flip error rate of up to 17% can be tolerated (compared to 13% in the standard model). Our uncertainty relation also yields a lower bound on the min-entropy key uncertainty against known-plaintext attacks when quantum ciphers are composed. Previously, the key uncertainty of these ciphers was only known with respect to Shannon entropy. Comment: 21 pages; editorial changes, additional application.
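
    To make the final contrast concrete, here is a small, self-contained illustration of min-entropy versus Shannon entropy (the distribution is made up for the example; nothing here is taken from the paper's protocols):

        from math import log2

        def min_entropy(p: list[float]) -> float:
            # H_min(X) = -log2 max_x P(x): set by the single most likely outcome.
            return -log2(max(p))

        def shannon_entropy(p: list[float]) -> float:
            # H(X) = -sum_x P(x) log2 P(x): the average-case measure.
            return -sum(q * log2(q) for q in p if q > 0)

        p = [0.5, 0.25, 0.125, 0.125]
        print(min_entropy(p), shannon_entropy(p))  # 1.0 vs 1.75 bits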

    Codes for Asymmetric Limited-Magnitude Errors With Application to Multilevel Flash Memories

    Several physical effects that limit the reliability and performance of multilevel flash memories induce errors that have low magnitudes and are dominantly asymmetric. This paper studies block codes for asymmetric limited-magnitude errors over q-ary channels. We propose code constructions and bounds for such channels when the number of errors is bounded by t and the error magnitudes are bounded by ℓ. The constructions utilize known codes for symmetric errors, over small alphabets, to protect large-alphabet symbols from asymmetric limited-magnitude errors. The encoding and decoding of these codes are performed over the small alphabet, whose size depends only on the maximum error magnitude and is independent of the alphabet size of the outer code. Moreover, the size of the codes is shown to exceed the sizes of known codes (for related error models), and asymptotic rate-optimality results are proved. Extensions of the construction are proposed to accommodate variations on the error model and to include systematic codes as a benefit to practical implementation.
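
    The following is a minimal sketch of the reduction principle suggested by the abstract, under the assumption that errors are upward shifts of magnitude at most l: such an error is uniquely determined by the received symbol modulo (l + 1), so correction can be carried out over the small alphabet {0, ..., l}. It is a single-symbol illustration with placeholder names, not the paper's full construction.

        def correct_symbol(received: int, sent_mod: int, l: int) -> int:
            # Recover the sent q-ary symbol from `received`, given the sent
            # symbol's residue mod (l + 1) (which a small-alphabet code would
            # supply), assuming the error e satisfies 0 <= e <= l.
            e = (received - sent_mod) % (l + 1)  # the residue pins down e exactly
            return received - e

        # Example: symbol 37, error +3, magnitude bound l = 4.
        sent, e, l = 37, 3, 4
        assert correct_symbol(sent + e, sent % (l + 1), l) == sent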

    Reconstruction Codes for DNA Sequences with Uniform Tandem-Duplication Errors

    DNA as a data storage medium has several advantages, including far greater data density compared to electronic media. We propose that schemes for data storage in the DNA of living organisms may benefit from studying the reconstruction problem, which is applicable whenever multiple reads of noisy data are available. This strategy is uniquely suited to the medium, which inherently replicates stored data into multiple distinct copies that differ due to mutations. We consider noise introduced solely by uniform tandem-duplication, and utilize the relation to constant-weight integer codes in the Manhattan metric. By bounding the intersection of the cross-polytope with hyperplanes, we prove the existence of reconstruction codes with greater capacity than known error-correcting codes, and we can determine this capacity analytically for any set of parameters. Comment: 11 pages, 2 figures, LaTeX; version accepted for publication.
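
    For readers unfamiliar with the error model, a uniform tandem duplication copies a fixed-length window of the sequence in place (u v w becomes u v v w with |v| = k). The sketch below only illustrates this definition; the function name and parameters are placeholders, not the paper's notation.

        def tandem_duplicate(seq: str, i: int, k: int) -> str:
            # Duplicate the length-k substring starting at position i.
            assert 0 <= i and i + k <= len(seq)
            return seq[:i + k] + seq[i:i + k] + seq[i + k:]

        # With k = 3, one duplication of "GTC" inside "AGTCA":
        print(tandem_duplicate("AGTCA", 1, 3))  # "AGTCGTCA"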

    Update-Efficiency and Local Repairability Limits for Capacity Approaching Codes

    Motivated by distributed storage applications, we investigate the degree to which capacity-achieving encodings can be efficiently updated when a single information bit changes, and the degree to which such encodings can be efficiently (i.e., locally) repaired when a single encoded bit is lost. Specifically, we first develop conditions under which optimal error correction and update efficiency are possible, and establish that the number of encoded bits that must change in response to a change in a single information bit must scale logarithmically in the block length of the code if we are to achieve any nontrivial rate with vanishing probability of error over the binary erasure or binary symmetric channels. Moreover, we show there exist capacity-achieving codes with this scaling. With respect to local repairability, we develop tight upper and lower bounds on the number of remaining encoded bits that are needed to recover a single lost bit of the encoding. In particular, we show that if the code rate is $\epsilon$ less than the capacity, then for optimal codes, the maximum number of codeword symbols required to recover one lost symbol must scale as $\log(1/\epsilon)$. Several variations on and extensions of these results are also developed. Comment: Accepted to appear in JSAC.
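
    In compact form, the two scaling results stated in the abstract read as follows (the symbols $u(n)$ for update cost and $r(\epsilon)$ for repair locality are introduced here only for illustration and are not the paper's notation):

        u(n) = \Theta(\log n) \ \text{over the BEC/BSC}, \qquad r(\epsilon) = \Theta\!\left(\log \tfrac{1}{\epsilon}\right).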