Practical Dual-Receiver Encryption---Soundness, Complete Non-Malleability, and Applications
We reformalize and recast dual-receiver encryption (DRE), proposed at CCS '04, a public-key encryption (PKE) scheme for encrypting to two independent recipients in one shot. We start by defining the crucial soundness property for DRE, which ensures that the two recipients will get the same decryption result. While conceptually simple, DRE with soundness turns out to be a powerful primitive for various goals for PKE, such as complete non-malleability (CNM) and plaintext-awareness (PA). We then construct practical DRE schemes without random oracles under the Bilinear Decisional Diffie-Hellman assumption, while prior approaches rely on random oracles or inefficient non-interactive zero-knowledge proofs. Finally, we investigate further applications or extensions of DRE, including DRE with CNM, combined use of DRE and PKE, strengthening two types of PKE schemes with plaintext equality test, off-the-record messaging with a stronger notion of deniability, etc.
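A rough sketch of the DRE interface and the soundness check it enables may help. All names below are illustrative, and the toy "scheme" simply ships one pad-encrypted body to both recipients; it is in no way secure and is not the paper's pairing-based construction. It only shows what soundness asserts.

```python
import secrets

# Toy illustration of the DRE interface and its soundness property.
# NOT secure, and not the paper's BDDH scheme: the one-time pad is
# shipped in the clear so both "recipients" can recover it.

def keygen():
    k = secrets.token_bytes(16)
    return k, k                        # (public key, secret key), collapsed

def encrypt(pk1, pk2, msg: bytes):
    pad = secrets.token_bytes(len(msg))
    body = bytes(a ^ b for a, b in zip(msg, pad))
    # A real DRE would wrap `pad` under pk1 and pk2 separately.
    return (body, pad, pad)

def decrypt(sk, ct, slot: int) -> bytes:
    body, pad1, pad2 = ct
    pad = pad1 if slot == 1 else pad2
    return bytes(a ^ b for a, b in zip(body, pad))

def sound(ct, sk1, sk2) -> bool:
    # Soundness: no ciphertext may decrypt to different plaintexts
    # for the two recipients.
    return decrypt(sk1, ct, 1) == decrypt(sk2, ct, 2)

pk1, sk1 = keygen()
pk2, sk2 = keygen()
ct = encrypt(pk1, pk2, b"same message for both")
assert sound(ct, sk1, sk2)
```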
Post-Quantum Multi-Party Computation
We initiate the study of multi-party computation for classical functionalities (in the plain model) with security against malicious polynomial-time quantum adversaries. We observe that existing techniques readily give a polynomial-round protocol, but our main result is a construction of constant-round post-quantum multi-party computation. We assume mildly super-polynomial quantum hardness of learning with errors (LWE), and polynomial quantum hardness of an LWE-based circular security assumption. Along the way, we develop the following cryptographic primitives that may be of independent interest:
- A spooky encryption scheme for relations computable by quantum circuits, from the quantum hardness of an LWE-based circular security assumption. This yields the first quantum multi-key fully-homomorphic encryption scheme with classical keys.
- Constant-round zero-knowledge secure against multiple parallel quantum verifiers from spooky encryption for relations computable by quantum circuits. To enable this, we develop a new straight-line non-black-box simulation technique against parallel verifiers that does not clone the adversary's state. This forms the heart of our technical contribution and may also be relevant to the classical setting.
- A constant-round post-quantum non-malleable commitment scheme, from the mildly super-polynomial quantum hardness of LWE.
Round and computational efficiency of two-party protocols
2016 - 2017
A cryptographic protocol is defined by the behaviour of the involved parties and the messages that those parties send to each other. Besides the functionality and the security that a cryptographic protocol provides, it is also important that the protocol is efficient. In this thesis we focus on the efficiency parameters of a cryptographic protocol related to its computational and round complexity. That is, we are interested in the computational cost that the parties involved in the protocol have to pay and how many interactions between the parties are required to securely implement the functionality which we are interested in. Another important aspect of a cryptographic protocol is related to the computational assumptions required to prove that the protocol is secure. The aim of this thesis is to improve the state of the art with respect to some cryptographic functionalities where two parties are involved, by providing new techniques to construct more efficient cryptographic protocols whose security can be proven by relying on better cryptographic assumptions.
The thesis is divided into three parts. In the first part we consider Secure Two-Party Computation (2PC), a cryptographic technique that allows two parties to compute a functionality in a secure way. More precisely, there are two parties, Alice and Bob, willing to compute the output of a function f given x and y as input. The values x and y represent the inputs of Alice and Bob respectively. Moreover, each party wants to keep its input secret while allowing the other party to correctly compute f(x, y). As a first result, we show the first secure 2PC protocol with black-box simulation, secure under standard and generic assumptions, with optimal round complexity in the simultaneous message exchange model. In the simultaneous message exchange model both parties can send a message in each round; in the rest of this thesis we assume that in each round only one party can send a message.
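The security goal of 2PC is usually phrased relative to an ideal world in which a trusted third party computes f. A minimal sketch of that ideal world, using the Set-Membership functionality from later in this part as the example (all names are illustrative):

```python
# Ideal-world view of secure two-party computation: a trusted third
# party receives both private inputs and hands back only f(x, y).
# Real 2PC protocols emulate this without any trusted party; this
# sketch only fixes the target, it is not a protocol.

def ideal_2pc(f, x, y):
    out = f(x, y)          # neither party sees the other's input
    return out, out        # (Alice's view, Bob's view)

# The Set-Membership functionality: does Alice's element x belong
# to Bob's set y?
def set_membership(x, y):
    return x in y

alice_view, bob_view = ideal_2pc(set_membership, 7, {1, 7, 9})
assert alice_view is True and bob_view is True
```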
We advance the state of the art in secure 2PC also in a relaxed setting. More precisely, in this setting a malicious party that attacks the protocol in order to learn the secret input of the honest party is forced to follow the protocol description. Moreover, we consider the case in which the parties want to compute the Set-Membership functionality in a secure way. Such a functionality allows the parties to check whether an element belongs to a set or not. The proposed protocol improves the state of the art both in terms of performance and generality. In the second part of the thesis
we show the first 4-round concurrent non-malleable commitment under one-way functions. A commitment scheme allows the sender to send an encrypted message, called a commitment, in such a way that the message inside the commitment cannot be opened until opening information is provided by the sender. Moreover, there is a unique way in which the commitment can be opened. In this thesis we consider the case in which the sender sends the commitment (e.g. through a computer network) in a way that can be eavesdropped by an adversary. In this setting the adversary can catch the commitment C and modify it, thus obtaining a new commitment C' that contains a message related to the content of C. A non-malleable commitment scheme prevents such an attack, and our scheme can be proved secure even in the case that the adversary can eavesdrop multiple commitments and, in turn, compute and send multiple commitments.
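The commit/open interface described above can be sketched with a generic hash-based commitment (hiding and binding under standard assumptions on the hash). This is a textbook construction for illustration only, not the thesis's 4-round non-malleable scheme.

```python
import hashlib
import secrets

# Minimal hash-based commitment: commit hides the message until the
# sender releases the opening information (r, msg), and collision
# resistance makes the opening (computationally) unique.

def commit(msg: bytes):
    r = secrets.token_bytes(32)                  # opening information
    c = hashlib.sha256(r + msg).digest()         # the commitment
    return c, r

def open_commitment(c: bytes, r: bytes, msg: bytes) -> bool:
    # Binding: accepting a different message would require a
    # SHA-256 collision.
    return hashlib.sha256(r + msg).digest() == c

c, r = commit(b"bid: 100")
assert open_commitment(c, r, b"bid: 100")
assert not open_commitment(c, r, b"bid: 999")
```

Non-malleability is a stronger requirement on top of this interface: an eavesdropper who sees c must be unable to produce a commitment to a related message.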
The last part of the thesis concerns proof systems. Let us consider an NP-language, like the language of graph Hamiltonicity. A proof system allows an entity called the prover to prove to another entity called the verifier that a certain graph (instance) contains a Hamiltonian cycle (witness). A proof system can be easily instantiated in one round by letting the prover send the cycle to the verifier. What we actually want, though, is a protocol in which the prover is able to convince the verifier that a certain graph belongs to the language of graph Hamiltonicity, but in such a way that no information about the cycle is leaked to the verifier. Proof systems of this kind are called Zero Knowledge. In this thesis we show a non-interactive Zero-Knowledge proof system, under the assumption that both prover and verifier have access to some honestly generated common reference string (CRS). The provided construction improves the state of the art both in terms of efficiency and generality. We also consider the scenario in which prover and verifier do not have access to some honestly generated information and study the notion of Witness Indistinguishability. This notion considers instances that admit more than one witness, e.g. graphs that admit two distinct Hamiltonian cycles (as with the notion of Zero Knowledge, the notion of Witness Indistinguishability makes sense for all the languages in NP, but for ease of exposition we keep focusing our attention on the language of graph Hamiltonicity). The security notion of Witness Indistinguishability ensures that a verifier, upon receiving a proof from a prover, is not able to figure out which one of the two Hamiltonian cycles has been used by the prover to compute the proof. Even though the notion of Witness Indistinguishability is weaker than the notion of Zero Knowledge, Witness Indistinguishability is widely used in many cryptographic applications. Moreover, given that a Witness-Indistinguishable protocol can be constructed using just three rounds of communication, compared to the four rounds required to obtain Zero Knowledge (with black-box simulation), the use of Zero Knowledge as a building block to construct a protocol with an optimal number of rounds is sometimes prohibitive. Furthermore, in order to provide a good building block for constructing more complicated cryptographic protocols with good round complexity, a useful property is the so-called Delayed-Input property. This property allows the prover to compute all but the last round of the protocol without knowing the instance or the witness. Also, the Delayed-Input property allows the verifier to interact with the prover without knowing the instance at all (i.e. the verifier needs the instance only to decide whether or not to accept the proof received from the prover). In this thesis we provide the first efficient Delayed-Input Witness-Indistinguishable proof system that consists of just three rounds of communication. [edited by author]
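The trivial one-round proof system mentioned above amounts to the verifier's check below: the prover sends the claimed Hamiltonian cycle and the verifier accepts if it really is one. The sketch makes plain why this proof is not zero knowledge: the witness travels in the clear.

```python
# Verifier's check for the trivial one-round Hamiltonicity proof.
# The prover reveals the cycle (the witness) entirely, which is
# exactly what a Zero-Knowledge proof must avoid.

def verify_hamiltonian_cycle(n, edges, cycle):
    # `cycle` is a claimed ordering of all n vertices.
    if sorted(cycle) != list(range(n)):
        return False  # must visit every vertex exactly once
    edge_set = {frozenset(e) for e in edges}
    return all(
        frozenset((cycle[i], cycle[(i + 1) % n])) in edge_set
        for i in range(n)
    )

# A 4-cycle graph 0-1-2-3-0: the cycle itself is the only witness.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
assert verify_hamiltonian_cycle(4, edges, [0, 1, 2, 3])
assert not verify_hamiltonian_cycle(4, edges, [0, 2, 1, 3])
```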
Chosen-Ciphertext Secure Dual-Receiver Encryption in the Standard Model Based on Post-Quantum Assumptions
Dual-receiver encryption (DRE) is a special form of public key encryption (PKE) that allows a sender to encrypt a message for two recipients. Without further properties, the difference between DRE and PKE is only syntactical. One such important property is soundness, which requires that no ciphertext can be constructed such that the recipients decrypt to different plaintexts. Many applications rely on this property in order to realize more complex protocols or primitives. In addition, many of these applications explicitly avoid the usage of the random oracle, which poses an additional requirement on a DRE construction. We show that all of the IND-CCA2 secure standard model DRE constructions based on post-quantum assumptions fall short of augmenting the constructions with soundness and describe attacks thereon.
We then give an overview of all applications of IND-CCA2 secure DRE, group them into generic (i.e., applications using DRE as a black box) and non-generic applications, and demonstrate that all generic ones require either soundness or public verifiability.
Conclusively, we identify the gap of sound and IND-CCA2 secure DRE constructions based on post-quantum assumptions in the standard model.
In order to fill this gap we provide two IND-CCA2 secure DRE constructions based on the standard post-quantum assumptions Normal Form Learning With Errors (NLWE) and Learning Parity with Noise (LPN).
Sender-binding Key Encapsulation
Secure communication is gained by combining encryption with authentication. In real-world applications encryption commonly takes the form of KEM-DEM hybrid encryption, which is combined with ideal authentication. The pivotal question is how weak the employed key encapsulation mechanism (KEM) is allowed to be to still yield universally composable (UC) secure communication when paired with symmetric encryption and ideal authentication. This question has so far been addressed for public-key encryption (PKE) only, showing that encryption does not need to be stronger than sender-binding CPA, which binds the CPA secure ciphertext non-malleably to the sender ID. For hybrid encryption, prior research unanimously reaches for CCA2 secure encryption which is unnecessarily strong. Answering this research question is vital to develop more efficient and feasible protocols for real-world secure communication and thus enable more communication to be conducted securely.
In this paper we use ideas from the PKE setting to develop new answers for hybrid encryption. We develop a new and significantly weaker security notion, sender-binding CPA for KEMs, which is still strong enough for secure communication. By using game-based notions as building blocks, we attain secure communication in the form of ideal functionalities with proofs in the UC framework. Secure communication is reached in both the classic and the session context by adding authentication and one-time/replayable CCA secure symmetric encryption, respectively. We furthermore provide an efficient post-quantum secure LWE-based construction in the standard model, giving an indication of the real-world benefit resulting from our new security notion. Overall we manage to make significant progress on discovering the minimal security requirements for hybrid encryption components to facilitate secure communication.
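KEM-DEM hybrid encryption, the setting of this paper, composes as sketched below: a KEM encapsulates a fresh symmetric key, and a DEM encrypts the payload under it. The toy Diffie-Hellman KEM over a tiny demo group and the hash-counter DEM are illustrative stand-ins only; they are neither secure nor the paper's LWE-based sender-binding construction.

```python
import hashlib
import secrets

# KEM-DEM hybrid encryption in miniature. Demo parameters only.

P = 2**127 - 1   # small Mersenne prime as demo group modulus (NOT secure)
G = 3

def kem_keygen():
    sk = secrets.randbelow(P - 2) + 1
    return pow(G, sk, P), sk                 # (public key, secret key)

def kem_encap(pk):
    eph = secrets.randbelow(P - 2) + 1
    shared = pow(pk, eph, P)
    key = hashlib.sha256(shared.to_bytes(16, "big")).digest()
    return pow(G, eph, P), key               # (encapsulation, session key)

def kem_decap(sk, encap):
    shared = pow(encap, sk, P)
    return hashlib.sha256(shared.to_bytes(16, "big")).digest()

def dem(key: bytes, msg: bytes) -> bytes:
    # XOR with a hash-counter keystream; encryption == decryption.
    stream = b"".join(
        hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        for i in range(len(msg) // 32 + 1)
    )
    return bytes(a ^ b for a, b in zip(msg, stream))

pk, sk = kem_keygen()
encap, key = kem_encap(pk)                   # sender side
ciphertext = dem(key, b"hybrid hello")
assert dem(kem_decap(sk, encap), ciphertext) == b"hybrid hello"
```

The paper's question is, in effect, how weak `kem_encap` may be made while the composition above still yields UC-secure communication.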
Hierarchical Integrated Signature and Encryption
In this work, we introduce the notion of hierarchical integrated signature and encryption (HISE), wherein a single public key is used for both signature and encryption, and one can derive a secret key used only for decryption from the signing key,
which enables secure delegation of decryption capability. HISE enjoys the benefit of key reuse, and admits individual key escrow. We present two generic constructions of HISE. One is from (constrained) identity-based encryption. The other is from uniform one-way function, public-key encryption, and general-purpose public-coin zero-knowledge proof of knowledge. To further attain global key escrow, we take a little detour to revisit global escrow PKE, an object both of independent interest and with many applications. We formalize the syntax and security model of global escrow PKE, and provide two generic constructions. The first embodies a generic approach to compile any PKE into one with global escrow property. The second establishes a connection between three-party non-interactive key exchange and global escrow PKE.
Combining the results developed above, we obtain HISE schemes that support both individual and global key escrow.
We instantiate our generic constructions of (global escrow) HISE and implement all the resulting concrete schemes for 128-bit security. Our schemes have performance that is comparable to the best Cartesian product combined public-key scheme, and exhibit advantages in terms of richer functionality and public key reuse. As a byproduct, we obtain a new global escrow PKE scheme that is faster than the best prior work, which might be of independent interest.
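The key hierarchy at the heart of HISE can be sketched in miniature: one master (signing) key, and a decryption-only key derived from it one-way, so that handing out the derived key delegates decryption without delegating signing. The HMAC "signatures", XOR-pad "encryption", and all names below are illustrative placeholders for the paper's public-key machinery.

```python
import hashlib
import hmac
import secrets

# Toy HISE-style key hierarchy: dec_key is derived one-way from the
# master signing key, so a delegatee holding only dec_key can decrypt
# but cannot sign. Symmetric stand-ins replace public-key primitives.

master_sk = secrets.token_bytes(32)
dec_key = hashlib.sha256(b"decrypt" + master_sk).digest()  # one-way derivation

def sign(msg: bytes) -> bytes:
    return hmac.new(master_sk, msg, hashlib.sha256).digest()

def encrypt(msg: bytes) -> bytes:
    # Toy one-block cipher: XOR against a pad derived from dec_key.
    pad = hashlib.sha256(dec_key + b"pad").digest()
    return bytes(a ^ b for a, b in zip(msg, pad))

decrypt = encrypt  # the XOR pad is its own inverse

ct = encrypt(b"escrow me")
assert decrypt(ct) == b"escrow me"
# master_sk is not recoverable from the one-way-derived dec_key,
# which is what makes delegating dec_key safe for signing.
```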
Pursuing the Limits of Cryptography
Modern cryptography has gone beyond traditional notions of encryption, allowing for new applications such as digital signatures and software obfuscation, among others. While cryptography might seem like a magical tool for one's privacy needs, there are mathematical limitations to what cryptography can achieve. In this thesis we focus on understanding what lies at the boundary of what cryptography enables. In particular, we focus on three specific aspects that we elaborate on below.
Necessity of Randomness in Zero-Knowledge Protocols: A Zero-Knowledge protocol consists of an interaction between two parties, designated prover and verifier, where the prover is trying to convince the verifier of the validity of a statement without revealing anything beyond the validity. We study the necessity of randomness, a scarce resource, in such protocols. Prior works have shown that for most settings, the prover necessarily *requires* randomness to run any such protocol. We show, somewhat surprisingly, one can design protocols where a prover requires *no* randomness.
Minimizing Interaction in Secure Computation Protocols: The next part of the thesis focuses on one of the most general notions in cryptography, that of *secure computation*. It allows mutually distrusting parties to jointly compute a function over a network without revealing anything but the output of the computation. Considering that these protocols are going to be run on high-latency networks such as the internet, it is imperative that we design protocols to minimize the interaction between participants of the protocol. Prior works have established lower bounds on the amount of interaction, and in our work we show that these lower bounds are tight by constructing new protocols that are also optimal in their assumptions.
Circumventing Impossibilities with Blockchains: In some cases, there are desired usages of secure computation protocols that are provably impossible on the (regular) Internet, i.e. existing protocols can no longer be proven secure when multiple concurrent instances of the protocol are executed. We show that by assuming the existence of a secure blockchain, a minimal additional trust assumption, we can push past the boundaries of what is cryptographically possible by constructing *new* protocols that are provably secure on the Internet.
On Black-Box Complexity and Adaptive, Universal Composability of Cryptographic Tasks
Two main goals of modern cryptography are to identify the minimal assumptions necessary to construct secure cryptographic primitives as well as to construct secure protocols in strong and realistic adversarial models. In this thesis, we address both of these fundamental questions. In the first part of this thesis, we present results on the black-box complexity of two basic cryptographic primitives: non-malleable encryption and optimally-fair coin tossing. Black-box reductions are reductions in which both the underlying primitive as well as the adversary are accessed only in an input-output (or black-box) manner. Most known cryptographic reductions are black-box. Moreover, black-box reductions are typically more efficient than non-black-box reductions. Thus, the black-box complexity of cryptographic primitives is a meaningful and important area of study which allows us to gain insight into the primitive. We study the black box complexity of non-malleable encryption and optimally-fair coin tossing, showing a positive result for the former and a negative one for the latter. Non-malleable encryption is a strong security notion for public-key encryption, guaranteeing that it is impossible to "maul" a ciphertext of a message m into a ciphertext of a related message. This security guarantee is essential for many applications such as auctions. We show how to transform, in a black-box manner, any public-key encryption scheme satisfying a weak form of security, semantic security, to a scheme satisfying non-malleability. Coin tossing is perhaps the most basic cryptographic primitive, allowing two distrustful parties to flip a coin whose outcome is 0 or 1 with probability 1/2. A fair coin tossing protocol is one in which the outputted bit is unbiased, even in the case where one of the parties may abort early. However, in the setting where parties may abort early, there is always a strategy for one of the parties to impose bias of Omega(1/r) in an r-round protocol. 
Thus, achieving bias of O(1/r) in r rounds is optimal, and it was recently shown that optimally-fair coin tossing can be achieved via a black-box reduction to oblivious transfer. We show that it cannot be achieved via a black-box reduction to one-way functions, unless the number of rounds is at least Omega(n/log n), where n is the input/output length of the one-way function. In the second part of this thesis, we present protocols for multiparty computation (MPC) in the Universal Composability (UC) model that are secure against malicious, adaptive adversaries. In the standard model, security is only guaranteed in a stand-alone setting; however, nothing is guaranteed when multiple protocols are arbitrarily composed. In contrast, the UC model, introduced by Canetti (2000), considers the execution of an unbounded number of concurrent protocols in an arbitrary, and adversarially controlled, network environment. Another drawback of the standard model is that the adversary must decide which parties to corrupt before the execution of the protocol commences. A more realistic model allows the adversary to adaptively choose which parties to corrupt based on its evolving view during the protocol. In our work we consider the adaptive UC model, which combines these two security requirements by allowing both arbitrary composition of protocols and adaptive corruption of parties. In our first result, we introduce an improved, efficient construction of non-committing encryption (NCE) with optimal round complexity, from a weaker primitive we introduce called trapdoor-simulatable public key encryption (PKE). NCE is a basic primitive necessary to construct protocols secure under adaptive corruptions and, in particular, is used to construct oblivious transfer (OT) protocols secure against semi-honest, adaptive adversaries.
Additionally, we show how to realize trapdoor-simulatable PKE from hardness of factoring Blum integers, thus achieving the first construction of NCE from hardness of factoring. In our second result, we present a compiler for transforming an OT protocol secure against a semi-honest, adaptive adversary into one that is secure against a malicious, adaptive adversary. Our compiler achieves security in the UC model, assuming access to an ideal commitment functionality, and improves over previous work achieving the same security guarantee in two ways: it uses black-box access to the underlying protocol and achieves a constant multiplicative overhead in the round complexity. Combining our two results with the work of Ishai et al. (2008), we obtain the first black-box construction of UC and adaptively secure MPC from trapdoor-simulatable PKE and the ideal commitment functionality.
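The coin-tossing functionality studied in the first part of this abstract can be illustrated by Blum's classic commit-then-reveal protocol. The sketch below ignores early aborts, which is precisely where the Omega(1/r) bias barrier comes from: an aborting party who dislikes the outcome can refuse to open.

```python
import hashlib
import secrets

# Blum's commit-then-reveal coin toss (toy, abort-free version).

def commit(bit: int):
    r = secrets.token_bytes(32)              # opening information
    return hashlib.sha256(r + bytes([bit])).digest(), r

# Round 1: Alice commits to a random bit.
a = secrets.randbelow(2)
c, r = commit(a)
# Round 2: Bob answers with his own random bit in the clear.
b = secrets.randbelow(2)
# Reveal: Alice opens her commitment; both parties output the XOR.
assert hashlib.sha256(r + bytes([a])).digest() == c
coin = a ^ b
assert coin in (0, 1)
```

If either party is honest and the commitment is hiding and binding, `coin` is unbiased; the fairness problem only appears once a party may abort after seeing enough to predict the outcome.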