
    Two-Round Man-in-the-Middle Security from LPN

    Secret-key authentication protocols have recently received a considerable amount of attention, and a long line of research has been devoted to devising efficient protocols whose security is based on the hardness of the learning parity with noise (LPN) problem, with the goal of achieving low communication and round complexity as well as the strongest possible security guarantees. In this paper, we construct 2-round authentication protocols that are secure against sequential man-in-the-middle (MIM) attacks with tight reductions to LPN, Field-LPN, or other problems. The best prior protocols either had loose reductions and required 3 rounds (Lyubashevsky and Masny, CRYPTO'13) or had a much larger key (Kiltz et al., EUROCRYPT'11 and Dodis et al., EUROCRYPT'12). Our constructions follow from a new generic, deterministic, and round-preserving transformation that enhances actively secure protocols of a special form to be sequentially MIM-secure while adding only a limited amount of key material and computation.
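    As background for the LPN-based constructions surveyed here, the sketch below shows how a single LPN sample (a, <a, s> XOR e) is generated; the dimension and noise rate are illustrative placeholders, not parameters taken from the paper.

```python
# Minimal sketch of an LPN sample (a, <a, s> XOR e); illustrative only,
# not a secure or optimized implementation, and not the paper's parameters.
import secrets

def lpn_sample(s, tau=0.125):
    """Return one LPN sample: a random vector a and the noisy inner product."""
    n = len(s)
    a = [secrets.randbelow(2) for _ in range(n)]
    noiseless = sum(ai & si for ai, si in zip(a, s)) % 2
    e = 1 if secrets.randbelow(1000) < int(tau * 1000) else 0  # Bernoulli(tau) noise
    return a, noiseless ^ e

n = 512                                        # dimension (illustrative)
s = [secrets.randbelow(2) for _ in range(n)]   # secret key
a, z = lpn_sample(s)
print(a[:8], z)
```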

    Trusted-HB: a low-cost version of HB+ secure against Man-in-The-Middle attacks

    Since the introduction at CRYPTO'05 by Juels and Weis of the protocol HB+, a lightweight protocol secure against active attacks but only in a detection-based model, many works have tried to enhance its security. We propose here a new approach to achieve resistance against man-in-the-middle attacks. Our requirements, in terms of extra communication and hardware, are surprisingly low. (Submitted to IEEE Transactions on Information Theory.)
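    For context, here is a hedged sketch of a single HB+ round as it is usually described (blinding vector from the tag, challenge from the reader, noisy response); the key length and noise rate are illustrative, and the final acceptance threshold over all rounds is omitted.

```python
# Hedged sketch of one HB+ round as commonly described; toy parameters,
# not the values proposed in Trusted-HB or the original HB+ paper.
import secrets

K = 224        # key length per secret (illustrative)
ETA = 0.125    # Bernoulli noise rate (illustrative)

def rand_vec(k):
    return [secrets.randbelow(2) for _ in range(k)]

def dot(u, v):
    return sum(a & b for a, b in zip(u, v)) % 2

def hbplus_round(x, y):
    """Simulate one round between an honest tag (secrets x, y) and a reader."""
    b = rand_vec(K)                            # tag -> reader: blinding vector
    a = rand_vec(K)                            # reader -> tag: challenge
    noise = 1 if secrets.randbelow(1000) < int(ETA * 1000) else 0
    z = dot(a, x) ^ dot(b, y) ^ noise          # tag -> reader: noisy response
    return z == (dot(a, x) ^ dot(b, y))        # reader's per-round check

x, y = rand_vec(K), rand_vec(K)                # shared secrets
passed = sum(hbplus_round(x, y) for _ in range(100))
print("rounds passed:", passed, "/ 100")       # accept if failures stay below a threshold
```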

    A New Algorithm for Solving Ring-LPN with a Reducible Polynomial

    The LPN (Learning Parity with Noise) problem has recently proved to be of great importance in cryptology. A special and very useful case is the Ring-LPN problem, which typically provides improved efficiency in the constructed cryptographic primitive. We present a new algorithm for solving the Ring-LPN problem in the case when the polynomial used is reducible, and it greatly outperforms previous algorithms for this problem. Using the algorithm, we can break the Lapin authentication protocol for the proposed instance that uses a reducible polynomial in about 2^70 bit operations.
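    To make the setting concrete, the following toy sketch generates a Ring-LPN sample a*s + e over GF(2)[x]/(f(x)); the modulus here is a small placeholder rather than Lapin's proposed instance, and the relevance of reducibility is that a reducible f lets samples be projected (via the CRT) onto smaller factor rings.

```python
# Toy sketch of a Ring-LPN sample a*s + e over GF(2)[x]/(f(x)), with polynomials
# packed into integers.  The modulus and noise rate are placeholders, not Lapin's.
import secrets

DEG = 16
F = (1 << DEG) | 0b1011          # x^16 + x^3 + x + 1, a reducible toy modulus

def poly_mulmod(a, b, f=F, deg=DEG):
    """Carry-less multiply of bit-packed GF(2) polynomials a, b modulo f."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if (a >> deg) & 1:       # reduce once degree deg is reached
            a ^= f
    return r

def sparse_noise(deg=DEG, tau=0.06):
    """Error polynomial: each coefficient set independently with probability tau."""
    return sum((1 << i) for i in range(deg) if secrets.randbelow(1000) < int(tau * 1000))

s = secrets.randbits(DEG)        # secret ring element
a = secrets.randbits(DEG)        # public random ring element
sample = (a, poly_mulmod(a, s) ^ sparse_noise())
print(sample)
```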

    A Fault Analytic Method against HB+

    The search for lightweight authentication protocols suitable for low-cost RFID tags constitutes an active and challenging research area. In this context, a family of protocols based on the LPN problem has been proposed: the so-called HB family. Despite the rich literature on the cryptanalysis of these protocols, there are no published results about the impact of fault analysis on them. The purpose of this paper is to fill this gap by presenting a fault analytic method against a prominent member of the HB family, the HB+ protocol. We demonstrate that the fault analysis model can lead to a flexible and effective attack against HB-like protocols, posing a serious threat to them.

    Cryptographic Primitives from Physical Variables

    In this dissertation we explore a new paradigm emerging from the subtleties of cryptographic implementations and relating to theoretical aspects of cryptography. This new paradigm, namely physical variables (PVs), simply describes properties of physical objects that are designed to be identical but are not, due to manufacturing variability.

    In the first part of this dissertation, we focus our attention on scenarios which require the unique identification of physical objects, and we show how Gaussian PVs can be used to fulfill such a requirement. Using this framework we present and analyze a new technique for fingerprinting compact discs (CDs) using the manufacturing variability found in the length of the CDs' lands and pits. Although the variability measured is on the order of 20 nm, the technique does not require the use of microscopes or any advanced equipment. Instead, the electrical signal produced by the photo-detector inside the CD reader is sufficient to measure the desired variability. We thoroughly investigate the new technique by analyzing data collected from 100 identical CDs and show how to extract a unique fingerprint for each CD.

    In the second part, we shift our attention to physically parameterized functions (PPFs). Although all the constructions we provide are centered around delay-based physically unclonable functions (PUFs), we stress that the use of the term PUF could be misleading, as most circuits labeled with the term PUF are in reality clonable at the protocol level. We argue that using a term like PPFs to describe functions parameterized by a PV is a more accurate description. Herein, we thoroughly analyze delay PUFs and use a mathematical framework to construct two authentication protocols, labeled PUF-HB and HB+PUF. Both these protocols merge the known HB authentication family with delay-based PUFs. The new protocols enjoy the security reduction put forth by the HB portion of the protocol and at the same time maintain a level of hardware security provided by the use of PUFs. We present a proof-of-concept implementation for HB+PUF which takes advantage of the PUF circuit in order to produce the random bits typically needed for an HB-based authentication scheme. The overall circuit is shown to occupy a few thousand gates. Finally, we present a new authentication protocol that uses 2-level PUF circuits and enables a security reduction which, unlike the previous two protocols, stems naturally from the usage of PVs.

    Adaptive learning and cryptography

    Significant links exist between cryptography and computational learning theory. Cryptographic functions are the usual method of demonstrating significant intractability results in computational learning theory, as they can demonstrate that certain problems are hard in a representation-independent sense. On the other hand, hard learning problems have been used to create efficient cryptographic protocols such as authentication schemes, pseudo-random permutations and functions, and even public key encryption schemes.

    Learning theory and coding theory also impact cryptography in that they enable cryptographic primitives to deal with the issues of noise or bias in their inputs. Several different constructions of fuzzy primitives exist, a fuzzy primitive being a primitive which functions correctly even in the presence of noisy or non-uniform inputs. Some examples of these primitives include error-correcting blockciphers, fuzzy identity-based cryptosystems, fuzzy extractors, and fuzzy sketches. Error-correcting blockciphers combine both encryption and error correction in a single function, which results in increased efficiency. Fuzzy identity-based encryption allows the decryption of any ciphertext that was encrypted under a close enough identity. Fuzzy extractors and sketches are methods of reliably (re)producing a uniformly random secret key given an imperfectly reproducible string from a biased source, through a public string that is called the sketch.

    While hard learning problems have many qualities which make them useful in constructing cryptographic protocols, such as their inherent error tolerance and simple algebraic structure, it is often difficult to utilize them to construct very secure protocols due to assumptions they make on the learning algorithm. Due to these assumptions, the resulting protocols often do not have security against various types of adaptive adversaries. To help deal with this issue, we further examine the inter-relationships between cryptography and learning theory by introducing the concept of adaptive learning. Adaptive learning is a rather weak form of learning in which the learner is not expected to closely approximate the concept function in its entirety; rather, it is only expected to answer a query of the learner's choice about the target. Adaptive learning allows for a much weaker learner than in the standard model, while maintaining the positive properties of many learning problems in the standard model, a fact which we feel makes problems that are hard to adaptively learn more useful than standard-model learning problems in the design of cryptographic protocols. We argue that learning parity with noise is hard to do adaptively and use that assumption to construct a related-key secure, efficient MAC as well as an efficient authentication scheme. In addition, we examine the security properties of fuzzy sketches and extractors and demonstrate how these properties can be combined by using our related-key secure MAC. We go on to demonstrate that our extractor can allow a form of related-key hardening for protocols in that, by changing how the key for a primitive is stored, it renders that protocol immune to related-key attacks.
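    As a concrete illustration of the code-offset idea behind the fuzzy sketches and extractors described above, here is a minimal toy example using a 3x repetition code; real constructions use much stronger error-correcting codes and an additional randomness-extraction step.

```python
# Toy code-offset sketch: the public value w XOR Enc(r) lets a reading w' that is
# close to w reproduce r.  Uses a 3x repetition code; illustrative only.
import secrets

def encode(bits):                    # repetition-3 encoding
    return [b for b in bits for _ in range(3)]

def decode(codeword):                # majority decoding
    return [int(sum(codeword[3 * i:3 * i + 3]) >= 2) for i in range(len(codeword) // 3)]

def make_sketch(w, r):
    """Public sketch: w XOR Enc(r)."""
    return [wi ^ ci for wi, ci in zip(w, encode(r))]

def recover(w_noisy, pub):
    """Reproduce r from a noisy reading w' close to w, using the public sketch."""
    return decode([wi ^ pi for wi, pi in zip(w_noisy, pub)])

r = [secrets.randbelow(2) for _ in range(8)]      # secret to reproduce
w = [secrets.randbelow(2) for _ in range(24)]     # imperfectly reproducible source reading
pub = make_sketch(w, r)
w_noisy = list(w); w_noisy[5] ^= 1                # one flipped bit of noise
assert recover(w_noisy, pub) == r
print("recovered r:", recover(w_noisy, pub))
```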

    HB^N: An HB-like protocol secure against man-in-the-middle attacks

    We construct a simple authentication protocol, secure against man-in-the-middle attacks, whose security is based solely on the Learning Parity with Noise (LPN) problem. Our protocol is suitable for RFID devices, whose limited circuit size and power constraints rule out more heavyweight operations such as modular exponentiation. The protocol is extremely simple: both parties compute a noisy bilinear function of their inputs. The proof, however, is quite technical, and we believe that some of our technical tools may be of independent interest.
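    The abstract names the core operation only as a noisy bilinear function; the sketch below shows a generic noisy bilinear form over GF(2), z = x^T M y XOR e, purely as an illustration of that kind of operation, not as the paper's actual construction.

```python
# Generic noisy bilinear function over GF(2): z = x^T M y XOR e, with M a shared
# secret matrix.  Illustrative only; not the exact HB^N construction.
import secrets

K = 64                                                            # toy dimension
M = [[secrets.randbelow(2) for _ in range(K)] for _ in range(K)]  # shared secret matrix

def noisy_bilinear(x, y, tau=0.125):
    acc = 0
    for i in range(K):
        if x[i]:
            for j in range(K):
                acc ^= M[i][j] & y[j]
    e = 1 if secrets.randbelow(1000) < int(tau * 1000) else 0     # Bernoulli(tau) noise
    return acc ^ e

x = [secrets.randbelow(2) for _ in range(K)]
y = [secrets.randbelow(2) for _ in range(K)]
print(noisy_bilinear(x, y))
```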

    Post-Quantum Provably-Secure Authentication and MAC from Mersenne Primes

    This paper presents novel yet efficient secret-key authentication and MAC schemes with a post-quantum security promise, whose security reduces to the conjectured quantum-safe hardness of the Mersenne Low Hamming Combination (MERS) assumption recently introduced by Aggarwal, Joux, Prakash, and Santha (CRYPTO 2018). Our protocols are well suited to weak devices such as smart cards and RFID tags.
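    For readers unfamiliar with the assumption, here is a hedged sketch of a MERS-style sample (a, a*b + c mod p) with p = 2^n - 1 a Mersenne prime and b, c of low Hamming weight; the exponent and weight below are illustrative, not the parameters proposed in the paper.

```python
# Hedged sketch of a MERS-style sample (a, a*b + c mod p), p = 2^n - 1 a Mersenne
# prime, b and c of low Hamming weight.  Toy parameters, not the paper's.
import secrets

def low_weight(n, h):
    """Random n-bit integer with Hamming weight exactly h."""
    positions = set()
    while len(positions) < h:
        positions.add(secrets.randbelow(n))
    return sum(1 << i for i in positions)

n, h = 521, 16                 # 2^521 - 1 is a Mersenne prime; h is illustrative
p = (1 << n) - 1
a = secrets.randbelow(p)       # uniform public element
b, c = low_weight(n, h), low_weight(n, h)
sample = (a, (a * b + c) % p)
print(sample[1].bit_length())
```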