    Leftover Hashing Against Quantum Side Information

    The Leftover Hash Lemma states that the output of a two-universal hash function applied to an input with sufficiently high entropy is almost uniformly random. In its standard formulation, the lemma refers to a notion of randomness that is (usually implicitly) defined with respect to classical side information. Here, we prove a (strictly) more general version of the Leftover Hash Lemma that is valid even if side information is represented by the state of a quantum system. Furthermore, our result applies to arbitrary δ-almost two-universal families of hash functions. The generalized Leftover Hash Lemma has applications in cryptography, e.g., for key agreement in the presence of an adversary who is not restricted to classical information processing.
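
    For orientation, a common statement of the quantum Leftover Hash Lemma (for an exactly two-universal family F of hash functions mapping X to ℓ-bit strings; the δ-almost case adds a correction term depending on δ, and constants vary slightly between formulations) bounds the trace distance of the hashed output from uniform, conditioned on the quantum side information E:

        \[
          \frac{1}{2}\,\bigl\|\, \rho_{F(X)FE} - \tfrac{\mathbb{1}}{2^{\ell}} \otimes \rho_{FE} \,\bigr\|_1
          \;\le\; \frac{1}{2}\, 2^{-\frac{1}{2}\left(H_{\min}(X|E) - \ell\right)}
        \]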

    Quantum Hashing for Finite Abelian Groups

    We propose a generalization of the quantum hashing technique based on the notion of small-bias sets. These sets have proved useful in different areas of computer science, and here their properties yield an optimal construction for a succinct quantum representation of the elements of any finite abelian group, which can be used in various computational and cryptographic scenarios. The known quantum fingerprinting schemes turn out to be special cases of the proposed quantum hashing for the corresponding abelian group.
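
    To sketch the flavor of this construction for the cyclic group Z_q (the notation here is ours; the general abelian case and the parameter choices follow the small-bias machinery of the paper): given an ε-biased set S = {s_1, ..., s_d} in Z_q, an element a ∈ Z_q is encoded as the quantum hash

        \[
          |\psi_S(a)\rangle \;=\; \frac{1}{\sqrt{d}} \sum_{j=1}^{d} e^{2\pi i\, a\, s_j / q}\, |j\rangle
        \]

    The ε-bias of S guarantees |⟨ψ_S(a)|ψ_S(b)⟩| ≤ ε for a ≠ b, so distinct group elements map to nearly orthogonal states on only ⌈log d⌉ qubits.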

    Postprocessing for quantum random number generators: entropy evaluation and randomness extraction

    Quantum random-number generators (QRNGs) can, in principle, offer a means to generate information-theoretically provable random numbers. In practice, unfortunately, the quantum randomness is inevitably mixed with classical randomness due to classical noise. To distill this quantum randomness, one needs to quantify the randomness of the source and apply a randomness extractor. Here, we propose a generic framework for evaluating the quantum randomness of real-life QRNGs via min-entropy, and apply it to two different existing quantum random-number systems in the literature. Moreover, we provide a guideline for QRNG data postprocessing, for which we implement two information-theoretically provable randomness extractors: the Toeplitz-hashing extractor and Trevisan's extractor.
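
    As an illustration of the first of these, here is a minimal sketch of a Toeplitz-hashing extractor (the function and variable names are ours; in practice the output length must be chosen below the evaluated min-entropy of the raw data, minus a security margin):

        def toeplitz_extract(raw_bits, seed_bits, out_len):
            """Multiply raw_bits by a random out_len x n Toeplitz matrix
            over GF(2). A Toeplitz matrix is constant along each diagonal,
            so it is fully determined by n + out_len - 1 seed bits."""
            n = len(raw_bits)
            assert len(seed_bits) == n + out_len - 1
            out = []
            for i in range(out_len):
                acc = 0
                for j in range(n):
                    # Entry (i, j) of the matrix is seed bit i - j + n - 1.
                    acc ^= seed_bits[i - j + n - 1] & raw_bits[j]
                out.append(acc)
            return out

    A production implementation would use FFT-based multiplication for speed, but the GF(2) matrix-vector product above is the whole idea.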

    Unconditional security from noisy quantum storage

    We consider the implementation of two-party cryptographic primitives based on the sole assumption that no large-scale reliable quantum storage is available to the cheating party. We construct novel protocols for oblivious transfer and bit commitment, and prove that realistic noise levels provide security even against the most general attack. Such unconditional results were previously known only in the so-called bounded-storage model, which is a special case of our setting. Our protocols can be implemented with present-day hardware used for quantum key distribution. In particular, no quantum storage is required for the honest parties. A companion paper (arXiv:0911.2302) addresses aspects of a practical implementation using block sampling.

    On Quantum Fingerprinting and Quantum Cryptographic Hashing

    Fingerprinting and cryptographic hashing have quite different uses in computer science, but they have similar properties. The interpretation of these properties is determined by the area of usage: fingerprinting methods are used to construct efficient randomized and quantum algorithms for computational problems, whereas hashing methods are among the central cryptographic primitives. Fingerprinting and hashing methods have been developed since the middle of the previous century, whereas quantum fingerprinting and quantum hashing have a short history. In this chapter, we investigate quantum fingerprinting and quantum hashing. We present computational aspects of quantum fingerprinting and quantum hashing, and discuss cryptographic properties of quantum hashing.

    Sampling of min-entropy relative to quantum knowledge

    Let X_1, ..., X_n be a sequence of n classical random variables, and consider a sample of r positions selected at random. Then, except with (exponentially in r) small probability, the min-entropy of the sample is not smaller than, roughly, a fraction r/n of the total min-entropy of all positions X_1, ..., X_n, which is optimal. Here, we show that this statement, originally proven by Vadhan [LNCS, vol. 2729, Springer, 2003] for the purely classical case, is still true if the min-entropy is measured relative to a quantum system. Because min-entropy quantifies the amount of randomness that can be extracted from a given random variable, our result can be used to prove the soundness of locally computable extractors in a context where side information might be quantum-mechanical. In particular, it implies that key agreement in the bounded-storage model (using a standard sample-and-hash protocol) is fully secure against quantum adversaries, thus solving a long-standing open problem.
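
    Schematically (the exact error terms and the quantum-proof formalization are in the paper; this only records the shape of the claim): for a uniformly random subset S of r out of the n positions,

        \[
          H_{\min}(X_S \mid E) \;\gtrsim\; \frac{r}{n}\, H_{\min}(X_1 \dots X_n \mid E)
          \quad \text{except with probability } 2^{-\Omega(r)}.
        \]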

    New security notions and feasibility results for authentication of quantum data

    We give a new class of security definitions for authentication in the quantum setting. These definitions capture and strengthen existing definitions of security against quantum adversaries for both classical message authentication codes (MACs) and full quantum state authentication schemes. The main feature of our definitions is that they precisely characterize the effective behavior of any adversary when the authentication protocol accepts, including correlations with the key. Our definitions readily yield a host of desirable properties and interesting consequences; for example, our security definition for full quantum state authentication implies that the entire secret key can be re-used if the authentication protocol succeeds. Next, we present several protocols satisfying our security definitions. We show that the classical Wegman-Carter authentication scheme with 3-universal hashing is secure against superposition attacks, as well as against adversaries with quantum side information. We then present conceptually simple constructions of full quantum state authentication. Finally, we prove a lifting theorem which shows that, as long as a protocol can securely authenticate the maximally entangled state, it can securely authenticate any state, even those that are entangled with the adversary. Thus, protocols satisfying a fairly weak form of authentication security automatically satisfy a stronger notion of security (in particular, the definition of Dupuis et al. (2012)).
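
    For orientation, the classical Wegman-Carter paradigm computes a tag as h_k(m) masked by a one-time pad. Below is a simplified sketch over a prime field using a polynomial (almost-universal) hash; this is our own illustrative instantiation, not the 3-universal construction analyzed in the paper:

        import secrets

        P = 2**61 - 1  # Mersenne prime; the field size controls the forgery probability

        def keygen():
            # k selects the hash function; r is a one-time pad for a single message.
            return secrets.randbelow(P), secrets.randbelow(P)

        def poly_hash(k, msg_blocks):
            # h_k(m) = sum_i m_i * k^(i+1) mod P, evaluated by Horner's rule.
            acc = 0
            for block in reversed(msg_blocks):
                acc = (acc + block) * k % P
            return acc

        def tag(key, msg_blocks):
            k, r = key
            return (poly_hash(k, msg_blocks) + r) % P  # hash, then pad

        def verify(key, msg_blocks, t):
            return tag(key, msg_blocks) == t

    Re-using the pad r across messages breaks security; the quantum analyses in the paper concern what an adversary with superposition access or quantum side information can do against schemes of this kind.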

    Algebra in Computational Complexity

    At its core, much of Computational Complexity is concerned with combinatorial objects and structures. But it has often proven true that the best way to prove things about these combinatorial objects is by establishing a connection to a more well-behaved algebraic setting. Indeed, many of the deepest and most powerful results in Computational Complexity rely on algebraic proof techniques. The Razborov-Smolensky polynomial-approximation method for proving constant-depth circuit lower bounds, the PCP characterization of NP, and the Agrawal-Kayal-Saxena polynomial-time primality test are some of the most prominent examples. The algebraic theme continues in some of the most exciting recent progress in computational complexity. There have been significant recent advances in algebraic circuit lower bounds, and the so-called "chasm at depth 4" suggests that the restricted models now being considered are not so far from ones that would lead to a general result. There have been similar successes concerning the related problems of polynomial identity testing and circuit reconstruction in the algebraic model, and these are tied to central questions regarding the power of randomness in computation. Representation theory has emerged as an important tool in three separate lines of work: the "Geometric Complexity Theory" approach to P vs. NP and circuit lower bounds, the effort to resolve the complexity of matrix multiplication, and a framework for constructing locally testable codes. Coding theory has seen several algebraic innovations in recent years, including multiplicity codes and new lower bounds. This seminar brought together researchers who are using a diverse array of algebraic methods in a variety of settings, and it played an important role in educating a diverse community about the latest techniques, spurring further progress.
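
    As one concrete instance of the randomness-in-computation theme mentioned above, polynomial identity testing has a classic randomized solution via the Schwartz-Zippel lemma (a textbook sketch, not tied to any specific seminar result):

        import random

        def probably_identical(p, q, num_vars, trials=20, field_size=2**61 - 1):
            """Schwartz-Zippel identity test for black-box polynomials p and q
            over a prime field: if p != q, a uniformly random evaluation point
            exposes the difference with probability >= 1 - deg/field_size."""
            for _ in range(trials):
                point = [random.randrange(field_size) for _ in range(num_vars)]
                if p(point) % field_size != q(point) % field_size:
                    return False  # found a witness: the polynomials differ
            return True  # equal with high probability (one-sided error)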