    On the Complexity of Decomposable Randomized Encodings, Or: How Friendly Can a Garbling-Friendly PRF Be?


    Post-quantum cryptography

    Cryptography is essential for the security of online communication, cars, and implanted medical devices. However, many commonly used cryptosystems will be completely broken once large quantum computers exist. Post-quantum cryptography is cryptography under the assumption that the attacker has a large quantum computer; post-quantum cryptosystems strive to remain secure even in this scenario. This relatively young research area has seen some successes in identifying mathematical operations for which quantum algorithms offer little advantage in speed, and then building cryptographic systems around those. The central challenge in post-quantum cryptography is to meet demands for cryptographic usability and flexibility without sacrificing confidence.
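
    As background for the claim that some cryptosystems break completely while others survive (standard facts about quantum algorithms, not results from this abstract): Shor's algorithm solves factoring and discrete logarithms in polynomial time, breaking RSA and elliptic-curve schemes outright, whereas Grover's search gives only a quadratic speedup against symmetric keys, which can be countered by doubling the key length.

        % Standard quantum-cryptanalysis facts (illustrative summary):
        \[
          T_{\mathrm{Shor}}(n) = \mathrm{poly}(n)
          \qquad\text{vs.}\qquad
          T_{\mathrm{Grover}}(k) = O\!\left(2^{k/2}\right),
        \]
        % so a k-bit symmetric key retains roughly k/2 bits of quantum security,
        % while n-bit RSA/discrete-log keys fall to a polynomial-time attack.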

    Lattice-Inspired Broadcast Encryption and Succinct Ciphertext-Policy ABE

    Broadcast encryption remains one of the few central cryptographic primitives that are not yet known to be achievable under a standard cryptographic assumption (excluding obfuscation-based constructions, see below). Furthermore, prior to this work, there were no known direct candidates for post-quantum-secure broadcast encryption. We propose a candidate ciphertext-policy attribute-based encryption (CP-ABE) scheme for circuits, where the ciphertext size depends only on the depth of the policy circuit (and not its size). This, in particular, gives us a Broadcast Encryption (BE) scheme where the sizes of the keys and ciphertexts have a poly-logarithmic dependence on the number of users. This goal was previously only known to be achievable assuming ideal multilinear maps (Boneh, Waters and Zhandry, Crypto 2014) or indistinguishability obfuscation (Boneh and Zhandry, Crypto 2014), and, in a concurrent work, from generic bilinear groups and the learning with errors (LWE) assumption (Agrawal and Yamada, Eurocrypt 2020). Our construction relies on techniques from lattice-based (and in particular LWE-based) cryptography. We analyze some attempts at cryptanalysis, but we are unable to provide a security proof.
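
    For context, a standard statement of the decisional LWE assumption underlying such lattice-based constructions is sketched below. The parameters n, m, q and the error distribution \chi are generic placeholders, not the paper's concrete choices.

        % Decisional LWE (standard formulation, generic parameters):
        \[
          (A,\; A s + e) \;\approx_c\; (A,\; u),
          \qquad
          A \leftarrow \mathbb{Z}_q^{m \times n},\;
          s \leftarrow \mathbb{Z}_q^{n},\;
          e \leftarrow \chi^{m},\;
          u \leftarrow \mathbb{Z}_q^{m}.
        \]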

    Cryptographic Assumptions: A Position Paper

    The mission of theoretical cryptography is to define and construct provably secure cryptographic protocols and schemes. Without proofs of security, cryptographic constructs offer no guarantees whatsoever and no basis for evaluation and comparison. As most security proofs necessarily come in the form of a reduction between the security claim and an intractability assumption, such proofs are ultimately only as good as the assumptions they are based on. Thus, the complexity implications of every assumption we utilize should be of significant substance, and serve as the yardstick for the value of our proposals. Lately, the field of cryptography has seen a sharp increase in the number of new assumptions that are often complex to define and difficult to interpret. At times, these assumptions are hard to untangle from the constructions which utilize them. We believe that the lack of standards for what is accepted as a reasonable cryptographic assumption can be harmful to the credibility of our field. Therefore, there is a great need for measures according to which we classify and compare assumptions, as to which are safe and which are not. In this paper, we propose such a classification and review recently suggested assumptions in this light. This follows in the footsteps of Naor (Crypto 2003). Our governing principle is relying on hardness assumptions that are independent of the cryptographic constructions.

    Algebraic Restriction Codes and Their Applications

    Consider the following problem: You have a device that is supposed to compute a linear combination of its inputs, which are taken from some finite field. However, the device may be faulty and compute arbitrary functions of its inputs. Is it possible to encode the inputs in such a way that only linear functions can be evaluated over the encodings? I.e., learning an arbitrary function of the encodings will not reveal more information about the inputs than a linear combination. In this work, we introduce the notion of algebraic restriction codes (AR codes), which constrain adversaries who might compute any function to computing a linear function. Our main result is an information-theoretic construction of AR codes that restricts any class of functions with a bounded number of output bits to linear functions. Our construction relies on a seed which is not provided to the adversary. While interesting and natural in its own right, we also show an application of this notion in cryptography. In particular, we show that AR codes lead to the first construction of rate-1 oblivious transfer with statistical sender security from the Decisional Diffie-Hellman assumption, and the first such construction that makes black-box use of cryptography. Previously, such protocols were known only from the LWE assumption, using non-black-box cryptographic techniques. We expect our new notion of AR codes to find further applications, e.g., in the context of non-malleability, in the future.
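
    To make the syntax concrete, here is a minimal Python sketch of the encode/evaluate/decode interface using plain additive masking over a prime field. All names (encode, eval_linear, decode) and the field choice are illustrative assumptions; this toy shows only how linear evaluation commutes with encoding, and does not achieve the paper's restriction property against arbitrary functions.

        # Toy sketch of the AR-code interface via additive masking over F_P.
        # Illustrative only: it does NOT restrict adversaries to linear functions.
        import secrets

        P = 2**61 - 1  # a Mersenne prime; illustrative field choice

        def encode(xs, seed):
            # Mask each input x_i with a seed element s_i: c_i = x_i + s_i (mod P).
            return [(x + s) % P for x, s in zip(xs, seed)]

        def eval_linear(coeffs, values):
            # Evaluate the linear function sum_i a_i * v_i over F_P.
            return sum(a * v for a, v in zip(coeffs, values)) % P

        def decode(coeffs, masked_result, seed):
            # Subtract the seed's known linear contribution to recover L(x).
            return (masked_result - eval_linear(coeffs, seed)) % P

        xs = [3, 14, 15]
        seed = [secrets.randbelow(P) for _ in xs]  # kept hidden from the adversary
        coeffs = [2, 7, 1]
        c = encode(xs, seed)
        assert decode(coeffs, eval_linear(coeffs, c), seed) == eval_linear(coeffs, xs)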

    A Secure Oblivious Transfer Protocol from Indistinguishability Obfuscation

    We propose a new secure oblivious transfer protocol from indistinguishability obfuscation in this paper. Our main technical tools are the candidate indistinguishability obfuscation introduced in [1] and a dual-mode cryptosystem proposed in [2]. Following their steps, we present a new k-out-of-l oblivious transfer protocol and describe its realization from DDH, in which we combine indistinguishability obfuscation with the dual-mode cryptosystem. The security of our scheme mainly relies on the indistinguishability of the obf-branches (corresponding to the two modes of the dual-mode model). Our paper explores a new direction for the application of indistinguishability obfuscation.
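
    To illustrate the functionality being realized (though not the paper's iO-based construction), the sketch below implements the classic 1-out-of-2 oblivious transfer pattern from public-key encryption, here with toy ElGamal: the receiver publishes a real key in its chosen slot and a key with unknown secret in the other, so the sender learns nothing about the choice and the receiver decrypts only one message. The group parameters are toy-sized assumptions, not secure choices.

        # Hedged toy: 1-out-of-2 OT from public-key encryption (toy ElGamal).
        # NOT the paper's k-out-of-l construction from iO + dual-mode encryption.
        import secrets

        P = 2**127 - 1  # Mersenne prime modulus (toy parameter, not secure)
        G = 3           # assumed generator (toy parameter)

        def keygen():
            sk = secrets.randbelow(P - 1)
            return sk, pow(G, sk, P)

        def enc(pk, m):
            r = secrets.randbelow(P - 1)
            return pow(G, r, P), (m * pow(pk, r, P)) % P

        def dec(sk, ct):
            c1, c2 = ct
            return (c2 * pow(c1, P - 1 - sk, P)) % P  # c2 / c1^sk via Fermat

        b = 1                                   # receiver's choice bit
        sk, pk_real = keygen()
        pk_fake = secrets.randbelow(P - 1) + 1  # random element, unknown dlog
        pks = [pk_fake, pk_real] if b == 1 else [pk_real, pk_fake]

        m0, m1 = 42, 99                         # sender's messages
        cts = [enc(pks[0], m0), enc(pks[1], m1)]

        assert dec(sk, cts[b]) == (m0, m1)[b]   # receiver learns only slot b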

    Optimal Forgeries Against Polynomial-Based MACs and GCM

    Polynomial-based authentication algorithms, such as GCM and Poly1305, have seen widespread adoption in practice. Due to their importance, a significant amount of attention has been given to understanding and improving both proofs and attacks against such schemes. At EUROCRYPT 2005, Bernstein published the best known analysis of the schemes when instantiated with PRPs, thereby establishing the most lenient limits on the amount of data the schemes can process per key. A long line of work, initiated by Handschuh and Preneel at CRYPTO 2008, finds the best known attacks, advancing our understanding of the fragility of the schemes. Yet surprisingly, no known attacks perform as well as the predicted worst-case attacks allowed by Bernstein's analysis, nor has there been any advancement in proofs improving Bernstein's bounds, and the gap between attacks and analysis is significant. We settle the issue by finding a novel attack against polynomial-based authentication algorithms using PRPs, and combine it with new analysis, to show that Bernstein's bound, and our attacks, are optimal.
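
    The structure such attacks exploit can be seen in a few lines: a polynomial-based MAC hashes the message blocks as coefficients of a polynomial evaluated at the secret key, so any two messages with colliding hashes yield a nonzero polynomial with the key among its roots. The sketch below is a minimal Poly1305-style polynomial hash in Python; the prime is Poly1305's, but the block encoding, key clamping, and the PRP-based tag masking used in GCM and Poly1305 are all omitted.

        # Minimal polynomial-evaluation hash (Poly1305-style, illustrative only).
        P = 2**130 - 5  # the Poly1305 prime

        def poly_hash(key, blocks):
            # Horner evaluation: b_1*k^n + b_2*k^(n-1) + ... + b_n*k  (mod P)
            acc = 0
            for b in blocks:
                acc = (acc + b) * key % P
            return acc

        # If poly_hash(key, m) == poly_hash(key, m') for m != m', then key is a
        # root of the difference polynomial -- the fact that forgeries in the
        # Handschuh-Preneel line of attacks exploit.
        key = 123456789
        assert poly_hash(key, [1, 2, 3]) != poly_hash(key, [1, 2, 4])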