
    Post-Quantum Security of Key Encapsulation Mechanism against CCA Attacks with a Single Decapsulation Query

    Recently, in the migration to post-quantum cryptography, it has been shown that an IND-1-CCA-secure key encapsulation mechanism (KEM) is required to replace ephemeral Diffie-Hellman (DH) in widely used protocols, e.g., TLS, Signal, and Noise. IND-1-CCA security is a notion similar to traditional IND-CCA security, except that the adversary is restricted to a single decapsulation query. At EUROCRYPT 2022, Huguenin-Dumittan and Vaudenay presented two IND-1-CCA KEM constructions built from CPA-secure public-key encryption (PKE), called $T_{CH}$ and $T_H$, which are much more efficient than the widely used IND-CCA-secure Fujisaki-Okamoto (FO) KEMs. The security of $T_{CH}$ was proved in both the random oracle model (ROM) and the quantum random oracle model (QROM); however, the QROM proof of $T_{CH}$ relies on an additional ciphertext expansion. The security of $T_H$, in contrast, was only proved in the ROM, and the QROM proof was left open. In this paper, we prove the security of $T_H$ and $T_{RH}$ (an implicit variant of $T_H$) in both the ROM and the QROM, with much tighter reductions than in Huguenin-Dumittan and Vaudenay's work. In particular, our QROM proof does not lead to ciphertext expansion. Moreover, for $T_{RH}$, $T_H$ and $T_{CH}$, we also show that an $O(1/q)$ (resp. $O(1/q^2)$) reduction loss is unavoidable in the ROM (resp. QROM), and thus claim that our ROM proof is optimal in tightness. Finally, we make a comprehensive comparison among the relative strengths of IND-1-CCA and IND-CCA in the ROM and QROM.
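
    To make the shape of such transforms concrete, below is a minimal sketch in the spirit of $T_H$: the session key is derived by hashing the PKE plaintext together with the ciphertext, with no FO-style re-encryption check. The textbook-RSA placeholder PKE and all parameters are our own illustrative assumptions, not the paper's construction.

```python
# Minimal T_H-style KEM sketch: K = H(m, c) over a CPA-style PKE.
# The textbook-RSA "PKE" below is NOT secure; it only makes the sketch
# self-contained. Any CPA-secure PKE would take its place.
import hashlib
import secrets

P, Q = 1000003, 1000033              # toy primes (illustrative only)
N, E = P * Q, 65537
D = pow(E, -1, (P - 1) * (Q - 1))    # private exponent

def pke_enc(m: int) -> int:          # Enc(pk, m)
    return pow(m, E, N)

def pke_dec(c: int) -> int:          # Dec(sk, c)
    return pow(c, D, N)

def encaps():
    m = secrets.randbelow(N - 2) + 2             # random plaintext
    c = pke_enc(m)
    k = hashlib.sha256(f"{m}|{c}".encode()).digest()
    return c, k                                  # ciphertext, session key

def decaps(c: int) -> bytes:
    m = pke_dec(c)
    # On a decryption failure a real T_H would reject explicitly; the
    # implicit variant T_RH would instead return a pseudorandom key H(s, c).
    return hashlib.sha256(f"{m}|{c}".encode()).digest()

c, k_sender = encaps()
assert decaps(c) == k_sender
```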

    Research Philosophy of Modern Cryptography

    Proposing novel cryptographic schemes (e.g., encryption, signatures, and protocols) is one of the main research goals in modern cryptography. In this paper, based on more than 800 research papers since 1976 that we have surveyed, we introduce the research philosophy of cryptography behind these papers. We use "benefits" and "novelty" as the keywords to introduce the research philosophy of proposing new schemes, assuming that there is already one scheme proposed for a cryptographic notion. Next, we introduce how benefits were explored in the literature, categorizing the methodology into 3 ways of exploring benefits, 6 types of benefits, and 17 benefit areas. As examples, we introduce 40 research strategies within these benefit areas that were invented in the literature. The introduced research strategies cover most cryptographic schemes published in top-tier cryptography conferences.

    Efficient NIZKs for Algebraic Sets

    Significantly extending the framework of Couteau and Hartmann (Crypto 2020), we propose a general methodology to construct NIZKs for showing that an encrypted vector $\vec{\chi}$ belongs to an algebraic set, i.e., is in the zero locus of an ideal $\mathscr{I}$ of a polynomial ring. In the case where $\mathscr{I}$ is principal, i.e., generated by a single polynomial $F$, we first construct a matrix that is a "quasideterminantal representation" of $F$ and then a NIZK argument to show that $F(\vec{\chi}) = 0$. This leads to compact NIZKs for general computational structures, such as polynomial-size algebraic branching programs. We extend the framework to the case where $\mathscr{I}$ is non-principal, obtaining efficient NIZKs for R1CS, arithmetic constraint satisfaction systems, and thus for $\mathsf{NP}$. As an independent result, we explicitly describe the corresponding language of ciphertexts as an algebraic language, with smaller parameters than in previous constructions that were based on the disjunction of algebraic languages. This results in an efficient GL-SPHF for algebraic branching programs.
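
    As a toy illustration of the determinantal idea (our own example, not one from the paper): the polynomial $F(x_1, x_2) = x_1 x_2 - 1$ admits a small matrix, with entries affine in the variables, whose determinant equals $F$; proving $F(\vec{\chi}) = 0$ then reduces to proving that the matrix evaluated at $\vec{\chi}$ is singular.

```latex
% Toy determinantal representation of F(x_1, x_2) = x_1 x_2 - 1:
% the entries are affine in x_1, x_2 and the determinant equals F, so
% F(\vec{\chi}) = 0 iff the matrix evaluated at \vec{\chi} is singular.
\[
  F(x_1, x_2) \;=\; x_1 x_2 - 1 \;=\;
  \det \begin{pmatrix} x_1 & 1 \\ 1 & x_2 \end{pmatrix}.
\]
```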

    On Efficient Zero-Knowledge Arguments


    On the (In)Security of the BUFF Transform

    The BUFF transform is a generic transformation for digital signature schemes, with the purpose of obtaining additional security properties beyond standard unforgeability, e.g., exclusive ownership and non-resignability. In the call for additional post-quantum signatures, these were explicitly mentioned by NIST as "additional desirable security properties", and some of the submissions indeed refer to the BUFF transform with the purpose of achieving them, while other submissions follow the design of the BUFF transform without mentioning it explicitly. In this work, we show the following negative results regarding the non-resignability property in general, and the BUFF transform in particular. In the plain model, we observe by means of a simple attack that any signature scheme for which the message has high entropy given the signature does not satisfy the non-resignability property (while non-resignability is trivially not satisfied if the message can be efficiently computed from its signature). Since for the BUFF transform the message retains high entropy given the signature, it follows that the BUFF transform does not achieve non-resignability whenever the random oracle is instantiated with a hash function, no matter which hash function. When considering the random oracle model (ROM), the matter becomes slightly more delicate, since prior works did not rigorously define the non-resignability property in the ROM. For the natural extension of the definition to the ROM, we observe that our impossibility result still holds, despite there having been positive claims about the non-resignability of the BUFF transform in the ROM. Indeed, prior claims of the non-resignability of the BUFF transform rely on faulty argumentation. On the positive side, we prove that a salted version of the BUFF transform satisfies a slightly weaker variant of non-resignability in the ROM, covering both classical and quantum attacks, if the entropy requirement in the (weakened) definition of non-resignability is statistical; for the computational variant, we show yet another negative result.
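
    For concreteness, here is a sketch of the BUFF design pattern instantiated over Ed25519: one signs $h = H(m, pk)$ instead of $m$ and attaches $h$ to the signature, binding the signature to the signer's public key. It assumes the third-party pyca/cryptography package; the hash choice and encodings are our illustrative assumptions, not part of the BUFF specification.

```python
# Sketch of the BUFF transform over Ed25519: sign h = H(m, pk) and attach h.
# Assumes the pyca/cryptography package; hash and encodings are illustrative.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

def buff_sign(sk: Ed25519PrivateKey, m: bytes):
    pk_raw = sk.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
    h = hashlib.sha256(m + pk_raw).digest()   # binds message to signer key
    return sk.sign(h), h                      # BUFF signature is (sigma, h)

def buff_verify(pk, m: bytes, sig) -> bool:
    sigma, h = sig
    pk_raw = pk.public_bytes(Encoding.Raw, PublicFormat.Raw)
    if h != hashlib.sha256(m + pk_raw).digest():
        return False
    try:
        pk.verify(sigma, h)
        return True
    except InvalidSignature:
        return False

sk = Ed25519PrivateKey.generate()
sig = buff_sign(sk, b"example message")
assert buff_verify(sk.public_key(), b"example message", sig)
```

    Note that $(\sigma, h)$ reveals essentially nothing about a high-entropy $m$, which is precisely the entropy property the plain-model attack above exploits.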

    Decryption Failure Attacks on Post-Quantum Cryptography

    This dissertation mainly discusses new cryptanalytic results related to securely implementing the next generation of asymmetric cryptography, or public-key cryptography (PKC). PKC, as deployed to date, depends heavily on the integer factorization and discrete logarithm problems. Unfortunately, it has been well known since the mid-90s that these mathematical problems can be solved in polynomial time on a quantum computer, due to Peter Shor's algorithm. The recently accelerated pace of R&D towards quantum computers, eventually of sufficient size and power to threaten cryptography, has led the crypto research community towards a major shift of focus. A project towards standardization of post-quantum cryptography (PQC) was launched by the US-based standardization organization NIST. PQC is the name given to algorithms designed to run on classical hardware/software while being resistant to attacks from quantum computers, making PQC well suited to replace the current asymmetric schemes. A primary motivation for the project is to guide publicly available research toward the singular goal of finding weaknesses in the proposed next generation of PKC. For public-key encryption (PKE) or digital signature (DS) schemes to be considered secure, they must be shown to rely on well-known mathematical problems, with theoretical proofs of security under established models such as indistinguishability under chosen-ciphertext attack (IND-CCA). They must also withstand serious attack attempts by well-renowned cryptographers, concerning both theoretical security and the actual software/hardware instantiations. It is well known that security models such as IND-CCA are not designed to capture the intricacies of inner-state leakages. Such leakages are called side-channels, currently a major topic of interest in the NIST PQC project. This dissertation focuses on two questions: 1) how does the low but non-zero probability of decryption failures affect the cryptanalysis of these new PQC candidates? And 2) how might side-channel vulnerabilities inadvertently be introduced when going from theory to practice in software/hardware implementations? Of main concern are PQC algorithms based on lattice theory and coding theory. The primary contributions are the discovery of novel decryption-failure side-channel attacks, improvements on existing attacks, an alternative implementation of part of a PQC scheme, and some more theoretical cryptanalytic results.
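
    As a toy numerical illustration of the first question (our own model, not taken from the dissertation): in lattice-based schemes, decryption succeeds only while an accumulated noise term stays below a threshold of roughly $q/4$, leaving a small but non-zero failure probability; observed failures are correlated with the secret-dependent noise, which is what decryption-failure attacks exploit.

```python
# Toy model of decryption failures in an LWE-like scheme: the decrypted bit
# is correct only while the accumulated noise stays below q/4. All parameters
# are arbitrary toy values chosen so failures are rare but observable; real
# schemes target far smaller failure rates (e.g., on the order of 2^-128).
import random

q, n, bound = 3329, 256, 6           # toy modulus, dimension, noise bound

def decryption_ok() -> bool:
    # total noise ~ inner product of two fresh small-noise vectors
    noise = sum(random.randint(-bound, bound) * random.randint(-bound, bound)
                for _ in range(n))
    return abs(noise) < q // 4       # failure iff noise crosses the threshold

trials = 100_000
failures = sum(not decryption_ok() for _ in range(trials))
print(f"empirical failure rate: {failures / trials:.2e}")
```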

    Aggregate Signatures with Versatile Randomization and Issuer-Hiding Multi-Authority Anonymous Credentials

    Anonymous credentials (AC) have emerged as a promising privacy-preserving solution for user-centric identity management. They allow users to authenticate in an anonymous and unlinkable way such that only the required information (i.e., attributes) from their credentials is revealed. With the increasing push towards decentralized systems and identity, e.g., self-sovereign identity (SSI) and the concept of verifiable credentials, suitable AC systems become a necessity. For instance, with existing AC systems, obtaining credentials from different issuers requires the presentation of independent credentials, which can become cumbersome. Consequently, it is desirable for AC systems to support the so-called multi-authority (MA) feature, which allows a compact and efficient showing of multiple credentials from different issuers. Another important property is called issuer hiding (IH): when showing a set of credentials, it is not revealed which issuer has issued which credential, but only whether a verifier-defined policy on the acceptable set of issuers is satisfied. This issue becomes particularly acute in the context of MA, where a user could be uniquely identified by the combination of issuers in their showing. Unfortunately, there are no AC schemes that satisfy both these properties simultaneously. To close this gap, we introduce the concept of Issuer-Hiding Multi-Authority Anonymous Credentials (IhMA). Our proposed solution involves the development of two new signature primitives with versatile randomization features, which are of independent interest: 1) Aggregate Signatures with Randomizable Tags and Public Keys (AtoSa) and 2) Aggregate Mercurial Signatures (ATMS), which extend the functionality of AtoSa to additionally support the randomization of messages and yield the first instance of an aggregate (equivalence-class) structure-preserving signature. These primitives can be elegantly used to obtain IhMA with different trade-offs, but have applications beyond. We formalize all notions and provide rigorous security definitions for our proposed primitives. We present provably secure and efficient instantiations of the two primitives as well as corresponding IhMA systems. Finally, we provide benchmarks based on an implementation to demonstrate the practical efficiency of our construction.
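
    As background for the aggregation aspect only (standard BLS signature aggregation, not the paper's AtoSa or ATMS primitives, which additionally randomize tags, keys, and messages; assumes the third-party py_ecc library): aggregation is what makes a multi-issuer showing compact, since many issuers' signatures compress into a single element.

```python
# Background only: plain BLS aggregation via py_ecc -- many issuers' signatures
# on distinct attribute messages compress into one short aggregate signature.
from py_ecc.bls import G2Basic as bls

# three independent issuers, each certifying one credential attribute
sks = [bls.KeyGen(i.to_bytes(32, "big")) for i in (1, 2, 3)]
pks = [bls.SkToPk(sk) for sk in sks]
msgs = [b"attr:age>=18", b"attr:citizenship=AT", b"attr:student"]

sigs = [bls.Sign(sk, m) for sk, m in zip(sks, msgs)]
agg = bls.Aggregate(sigs)            # one signature covering all three

# the verifier checks all issuer/attribute pairs against the single aggregate
assert bls.AggregateVerify(pks, msgs, agg)
```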

    Privacy-Preserving Blueprints

    In a world where everyone uses anonymous credentials for all access control needs, it is impossible to trace wrongdoers, by design. This makes legitimate controls, such as tracing illicit trade and terror suspects, impossible to carry out. Here, we propose a privacy-preserving blueprint capability that allows an auditor to publish an encoding $pk_A$ of the function $f(x,\cdot)$ for a publicly known function $f$ and a secret input $x$. For example, $x$ may be a secret watchlist, and $f(x,y)$ may return $y$ if $y \in x$. On input her data $y$ and the auditor's $pk_A$, a user can compute an escrow $Z$ such that anyone can verify that $Z$ was computed correctly from the user's credential attributes, and moreover, the auditor can recover $f(x,y)$ from $Z$. Our contributions are:
    * We define secure $f$-blueprint systems; our definition is designed to provide a modular extension to anonymous credential systems.
    * We show that secure $f$-blueprint systems can be constructed for all functions $f$ from fully homomorphic encryption and NIZK proof systems, or from non-interactive secure computation and NIZK. These results are of theoretical interest but are not efficient enough for practical use.
    * We realize an optimal blueprint system under the DDH assumption in the random-oracle model for the watchlist function.
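
    To make the syntax concrete, here is a deliberately insecure toy realization of the watchlist blueprint interface (our own sketch, not the paper's DDH construction): the escrow is a salted hash of $y$, so the auditor, who knows the watchlist, recovers $y$ exactly when $y \in x$. It omits the NIZK link to credential attributes and, unlike a real scheme, leaks low-entropy $y$ to a dictionary attack.

```python
# Deliberately insecure toy realizing only the *interface* of a watchlist
# f-blueprint: f(x, y) = y if y is on the watchlist x, nothing otherwise.
# A real construction hides x, proves Z consistent with credential
# attributes, and leaks nothing about y when y is off the list.
import hashlib
import secrets

def keygen(watchlist: set[bytes]):
    pk_a = secrets.token_bytes(16)    # published encoding (toy: just a salt)
    sk_a = watchlist                  # auditor keeps the watchlist x secret
    return pk_a, sk_a

def escrow(pk_a: bytes, y: bytes) -> bytes:
    return hashlib.sha256(pk_a + y).digest()    # Z: salted hash of user's y

def audit(pk_a: bytes, sk_a, z: bytes):
    for w in sk_a:                    # recover f(x, y) from Z
        if hashlib.sha256(pk_a + w).digest() == z:
            return w                  # y was on the watchlist
    return None                       # y off-list (toy: the hash still leaks
                                      # low-entropy y; a real scheme would not)

pk_a, sk_a = keygen({b"alice", b"bob"})
assert audit(pk_a, sk_a, escrow(pk_a, b"bob")) == b"bob"
assert audit(pk_a, sk_a, escrow(pk_a, b"carol")) is None
```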

    Imbalanced Cryptographic Protocols

    Efficiency is paramount when designing cryptographic protocols: heavy mathematical operations often increase computation time, even on modern computers, and they produce large amounts of data that need to be sent over (often limited) network connections. Therefore, many research efforts are invested in improving efficiency, sometimes leading to imbalanced cryptographic protocols. We define three types of imbalance: computationally, communicationally, and functionally imbalanced protocols. Computationally imbalanced cryptographic protocols arise when optimizing a protocol for one party that has significantly more computing power. In communicationally imbalanced cryptographic protocols, the messages mainly flow from one party to the others. Finally, in functionally imbalanced cryptographic protocols, the functional requirements of one party strongly differ from those of the other parties. We start our study by looking into laconic cryptography, which fits both the computational and the communicational category. The emerging area of laconic cryptography involves the design of two-party protocols between a sender and a receiver, where the receiver's input is large. The key efficiency requirement is that the protocol's communication complexity be independent of the receiver's input size. We show a new way to build laconic OT based on the new notion of Set Membership Encryption (SME), a new member in the area of laconic cryptography. SME allows a sender to encrypt to one recipient from a universe of receivers, using only a small digest of a large subset of receivers; a recipient is able to decrypt the message if and only if it is part of that subset (the digest idea is illustrated in the sketch below). As another example of a communicationally imbalanced protocol, we look at NIZKs: we consider the problem of proving in zero-knowledge the existence of exploits in executables compiled to run on real-world processors. Finally, we investigate the problem of constructing law-enforcement access systems that mitigate the possibility of unauthorized surveillance, as a functionally imbalanced cryptographic protocol. We present two main constructions. The first enables prospective access, allowing surveillance only if encryption occurs after a warrant has been issued and activated. The second allows retrospective access to communications that occurred prior to a warrant's issuance.
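
    The following sketch illustrates only the laconic "short digest of a large input" idea referenced above, using a standard Merkle tree rather than the SME construction itself: the receiver compresses a large set into one 32-byte root, and statements about membership then cost only a logarithmic-size path, independent of the set size.

```python
# Standard Merkle-tree sketch of the laconic idea: a large receiver input is
# compressed to a 32-byte digest; membership of one element is then checked
# with a log-size path. Background only, not the paper's SME construction.
import hashlib

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_root(leaves):
    layer = [H(x) for x in leaves]
    while len(layer) > 1:
        if len(layer) % 2:
            layer.append(layer[-1])                 # duplicate odd tail
        layer = [H(layer[i] + layer[i + 1]) for i in range(0, len(layer), 2)]
    return layer[0]

def merkle_path(leaves, idx):
    layer, path = [H(x) for x in leaves], []
    while len(layer) > 1:
        if len(layer) % 2:
            layer.append(layer[-1])
        path.append((layer[idx ^ 1], idx & 1))      # (sibling, am-I-right-child)
        layer = [H(layer[i] + layer[i + 1]) for i in range(0, len(layer), 2)]
        idx //= 2
    return path

def verify(root, leaf, path):
    node = H(leaf)
    for sibling, is_right in path:
        node = H(sibling + node) if is_right else H(node + sibling)
    return node == root

big_set = [f"member-{i}".encode() for i in range(1024)]  # "large" receiver input
digest = merkle_root(big_set)                            # 32 bytes, size-independent
assert verify(digest, big_set[7], merkle_path(big_set, 7))
```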