
    Random Oracles in a Quantum World

    The interest in post-quantum cryptography - classical systems that remain secure in the presence of a quantum adversary - has generated elegant proposals for new cryptosystems. Some of these systems are set in the random oracle model and are proven secure relative to adversaries that have classical access to the random oracle. We argue that to prove post-quantum security one needs to prove security in the quantum-accessible random oracle model, where the adversary can query the random oracle with quantum states. We begin by separating the classical and quantum-accessible random oracle models: we present a scheme that is secure when the adversary is given classical access to the random oracle, but insecure when the adversary can make quantum oracle queries. We then set out to develop generic conditions under which a classical random oracle proof implies security in the quantum-accessible random oracle model. We introduce the concept of a history-free reduction, a class of classical random oracle reductions that determine oracle answers independently of the history of previous queries, and we prove that such reductions imply security in the quantum model. We then show that certain post-quantum proposals, including ones based on lattices, can be proven secure using history-free reductions and are therefore post-quantum secure. We conclude with a rich set of open problems in this area. (Comment: 38 pages; v2: many substantial changes and extensions, merged with a related paper by Boneh and Zhandry.)
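
    To make the separation concrete, the sketch below (ours, not from the paper; the toy sizes N_BITS and M_BITS and the use of SHA-256 to instantiate the oracle are assumptions for illustration) models the standard quantum-accessible oracle as the unitary |x, y⟩ → |x, y ⊕ H(x)⟩. A classical query reveals H at one point; a single quantum query applied to a uniform superposition entangles the registers with H at every domain point at once.

    ```python
    import hashlib

    N_BITS, M_BITS = 3, 3  # toy domain/range bit-lengths (assumed for illustration)

    def H(x: int) -> int:
        # The "random oracle", instantiated here with SHA-256 for the toy.
        return int.from_bytes(hashlib.sha256(bytes([x])).digest(), "big") % (1 << M_BITS)

    # Classical access: one query reveals the oracle at a single point.
    print("classical query:", H(5))

    # Quantum access: the oracle acts as the unitary |x, y> -> |x, y XOR H(x)>.
    DIM = 1 << (N_BITS + M_BITS)
    state = [0.0] * DIM
    for x in range(1 << N_BITS):                 # uniform superposition over |x, 0>
        state[x << M_BITS] = 1.0 / (1 << N_BITS) ** 0.5

    def quantum_query(psi):
        out = [0.0] * DIM
        for idx, amp in enumerate(psi):
            if amp:
                x, y = idx >> M_BITS, idx & ((1 << M_BITS) - 1)
                out[(x << M_BITS) | (y ^ H(x))] += amp  # permute basis states
        return out

    state = quantum_query(state)                 # ONE query touches all 2^n points
    for idx, amp in enumerate(state):
        if amp:
            print(f"|x={idx >> M_BITS}, y={idx & ((1 << M_BITS) - 1)}>  amp={amp:.3f}")
    ```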

    Public Key Encryption Supporting Plaintext Equality Test and User-Specified Authorization

    In this paper we investigate a category of public key encryption schemes that support plaintext equality tests with user-specified authorization. With this new primitive, two users, each holding their own public/private key pair, can issue token(s) to a proxy to authorize it to perform plaintext equality tests on their ciphertexts. We provide a formal formulation for this primitive and present a construction with provable security in our security model. To mitigate the risks posed by semi-trusted proxies, we enhance the proposed cryptosystem by integrating the concept of computational client puzzles. As a showcase, we construct a secure personal health record application based on this primitive.
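
    As a rough illustration of the interface only (a minimal sketch under our own assumptions, not the paper's construction: a tiny safe-prime group, an owner who encrypts their own records, and an all-or-nothing token x⁻¹ mod q), each ciphertext carries a deterministic comparison tag H(m)^x next to an ElGamal component, and the token lets the proxy unblind the tag for equality testing:

    ```python
    import hashlib, secrets

    # Toy group: safe prime p = 2q + 1; G = 4 generates the order-q subgroup.
    # (Assumption: tiny parameters for readability; never use such sizes.)
    P, Q, G = 2039, 1019, 4

    def hash_to_group(m: bytes) -> int:
        return pow(G, int.from_bytes(hashlib.sha256(m).digest(), "big") % Q, P)

    def keygen():
        x = secrets.randbelow(Q - 1) + 1
        return x, pow(G, x, P)                  # (private x, public g^x)

    def encrypt(pk: int, x: int, m: bytes):
        # In this toy the key owner encrypts their own data (as in a personal
        # health record), since computing the tag H(m)^x needs the secret x.
        r = secrets.randbelow(Q - 1) + 1
        elgamal = (pow(G, r, P), hash_to_group(m) * pow(pk, r, P) % P)
        tag = pow(hash_to_group(m), x, P)       # deterministic comparison tag
        return elgamal, tag

    def authorize(x: int) -> int:
        return pow(x, -1, Q)                    # token handed to the proxy

    def proxy_test(tag_a, tok_a, tag_b, tok_b) -> bool:
        # Unblinding each tag yields H(m); equal plaintexts give equal elements.
        return pow(tag_a, tok_a, P) == pow(tag_b, tok_b, P)

    xa, pka = keygen(); xb, pkb = keygen()
    (_, ta), (_, tb) = encrypt(pka, xa, b"record"), encrypt(pkb, xb, b"record")
    assert proxy_test(ta, authorize(xa), tb, authorize(xb))
    ```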

    Naor-Yung paradigm with shared randomness and applications

    The Naor-Yung paradigm (Naor and Yung, STOC’90) allows one to generically boost security under chosen-plaintext attacks (CPA) to security against chosen-ciphertext attacks (CCA) for public-key encryption (PKE) schemes. The main idea is to encrypt the plaintext twice (under independent public keys) and to append a non-interactive zero-knowledge (NIZK) proof that the two ciphertexts indeed encrypt the same message. Later work by Camenisch, Chandran, and Shoup (Eurocrypt’09) and Naor and Segev (Crypto’09 and SIAM J. Comput.’12) established that the very same techniques can also be used in the settings of key-dependent message (KDM) and key-leakage attacks, respectively. In this paper we study the conditions under which the two ciphertexts in the Naor-Yung construction can share the same random coins. We find that this is possible, provided that the underlying PKE scheme meets an additional simple property. The motivation for re-using the same random coins is that doing so allows for the design of much more efficient NIZK proofs. We showcase such an improvement in the random oracle model, under standard complexity assumptions including Decisional Diffie-Hellman, Quadratic Residuosity, and Subset Sum. The length of the resulting ciphertexts is reduced by 50%, yielding truly efficient PKE schemes achieving CCA security under KDM and key-leakage attacks. As an additional contribution, we design the first PKE scheme whose CPA security under KDM attacks can be directly reduced to (low-density instances of) the Subset Sum assumption. The scheme supports key-dependent messages computed via any affine function of the secret key.
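
    To illustrate why shared coins enable cheaper proofs, here is a minimal sketch (our own, with assumed toy parameters, not the paper's scheme): with ElGamal under two keys and one shared r, plaintext equality of the two ciphertexts collapses to a Chaum-Pedersen discrete-log-equality proof, made non-interactive via Fiat-Shamir.

    ```python
    import hashlib, secrets

    # Toy DDH group (assumption: tiny safe prime for readability only).
    P, Q, G = 2039, 1019, 4          # p = 2q + 1; G generates the order-q subgroup

    def keygen():
        x = secrets.randbelow(Q - 1) + 1
        return x, pow(G, x, P)

    def fs_challenge(*elems) -> int:  # Fiat-Shamir hash of the transcript
        return int.from_bytes(
            hashlib.sha256(b"|".join(str(e).encode() for e in elems)).digest(),
            "big") % Q

    def ny_encrypt(pk1, pk2, m):
        """Naor-Yung with shared coins: one r for both ElGamal ciphertexts,
        plus a Chaum-Pedersen NIZK that they encrypt the same message."""
        r = secrets.randbelow(Q - 1) + 1
        u = pow(G, r, P)
        v1 = m * pow(pk1, r, P) % P
        v2 = m * pow(pk2, r, P) % P
        # Equality of plaintexts <=> v1/v2 = (pk1/pk2)^r, i.e. log_G u = log_h w.
        h = pk1 * pow(pk2, -1, P) % P
        w = v1 * pow(v2, -1, P) % P
        k = secrets.randbelow(Q - 1) + 1
        a1, a2 = pow(G, k, P), pow(h, k, P)
        e = fs_challenge(G, h, u, w, a1, a2)
        z = (k + e * r) % Q
        return u, v1, v2, (a1, a2, z)

    def ny_verify(pk1, pk2, ct) -> bool:
        u, v1, v2, (a1, a2, z) = ct
        h = pk1 * pow(pk2, -1, P) % P
        w = v1 * pow(v2, -1, P) % P
        e = fs_challenge(G, h, u, w, a1, a2)
        return (pow(G, z, P) == a1 * pow(u, e, P) % P and
                pow(h, z, P) == a2 * pow(w, e, P) % P)

    def decrypt(x1, ct):
        u, v1, _, _ = ct
        return v1 * pow(u, -x1, P) % P          # m = v1 / u^{x1}

    x1, pk1 = keygen(); x2, pk2 = keygen()
    m = pow(G, 42, P)                # messages encoded as subgroup elements
    ct = ny_encrypt(pk1, pk2, m)
    assert ny_verify(pk1, pk2, ct) and decrypt(x1, ct) == m
    ```

    Note the saving relative to independent coins: both ciphertexts share the single group element u = g^r, and the equality proof is one Chaum-Pedersen transcript rather than a proof over two unrelated randomizers.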

    Design and Analysis of Opaque Signatures

    Digital signatures were introduced to guarantee the authenticity and integrity of the underlying messages. A digital signature scheme comprises the key generation, the signature, and the verification algorithms. The key generation algorithm creates the signing and the verifying keys, also called the signer’s private and public keys respectively. The signature algorithm, which is run by the signer, produces a signature on the input message. Finally, the verification algorithm, run by anyone who knows the signer’s public key, checks whether a purported signature on some message is valid or not. The last property, namely the universal verification of digital signatures, is undesirable in situations where the signed data is commercially or personally sensitive. Therefore, mechanisms which share most properties with digital signatures, except for universal verification, were invented to respond to this need; we call such mechanisms “opaque signatures”. In this thesis, we study signatures whose verification cannot be achieved without the cooperation of a specific entity, namely the signer in the case of undeniable signatures, or the confirmer in the case of confirmer signatures; we make three main contributions.

    We first study the relationship between two security properties important for public key encryption, namely data privacy and key privacy. Our study is motivated by the fact that opaque signatures always involve an encryption layer that ensures their opacity. The properties required for this encryption vary according to whether we want to protect the identity (i.e. the key) of the signer or hide the validity of the signature. Therefore, it would be convenient to use existing work about the encryption scheme in order to derive one notion from the other.

    Next, we delve into the generic constructions of confirmer signatures from basic cryptographic primitives, e.g. digital signatures, encryption, or commitment schemes. In fact, generic constructions give easy-to-understand and easy-to-prove schemes; however, this convenience is often achieved at the expense of efficiency. In this contribution, which constitutes the core of this thesis, we first analyze the existing constructions; our study concludes that the popular generic constructions of confirmer signatures necessitate strong security assumptions on the building blocks, which negatively impacts the efficiency of the resulting signatures. Next, we show that a small change in these constructions makes these assumptions drop drastically, allowing as a result constructions with instantiations that compete with the dedicated realizations of these signatures.

    Finally, we revisit two early undeniable signatures which were proposed with conjectural security. We disprove the claimed security of the first scheme, and we provide a fix to it in order to achieve strong security properties. Next, we upgrade the second scheme so that it supports a desirable feature, and we provide a formal security treatment of the new scheme: we prove that it is secure assuming new reasonable assumptions on the underlying constituents.
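
    For contrast with the opaque setting, the three-algorithm interface and its universal-verification property can be seen in a deliberately insecure textbook-RSA sketch (toy parameters from the classic n = 3233 example; an assumption for illustration, not any scheme from the thesis): verification needs only the public key, so anyone can check a signature.

    ```python
    import hashlib

    def keygen():
        p, q, e = 61, 53, 17                 # classic toy RSA values (insecure)
        n = p * q                            # n = 3233
        d = pow(e, -1, (p - 1) * (q - 1))    # d = 2753
        return (n, d), (n, e)                # (signing key, verification key)

    def sign(sk, msg: bytes) -> int:
        n, d = sk
        return pow(int.from_bytes(hashlib.sha256(msg).digest(), "big") % n, d, n)

    def verify(vk, msg: bytes, sig: int) -> bool:
        n, e = vk                            # needs only the PUBLIC key
        return pow(sig, e, n) == int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

    sk, vk = keygen()
    sig = sign(sk, b"commercially sensitive data")
    assert verify(vk, b"commercially sensitive data", sig)  # universal verification
    ```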

    New-Age Cryptography

    We introduce new and general complexity theoretic hardness assumptions. These assumptions abstract out concrete properties of a random oracle and are significantly stronger than traditional cryptographic hardness assumptions; however, assuming their validity we can resolve a number of longstanding open problems in cryptography.