14 research outputs found
Composability in quantum cryptography
In this article, we review several aspects of composability in the context of
quantum cryptography. The first part is devoted to key distribution. We discuss
the security criteria that a quantum key distribution protocol must fulfill to
allow its safe use within a larger security application (e.g., for secure
message transmission). To illustrate the practical use of composability, we
show how to generate a continuous key stream by sequentially composing rounds
of a quantum key distribution protocol. In the second part, we take a more
general point of view, which is necessary for the study of cryptographic
situations involving, for example, mutually distrustful parties. We explain the
universal composability framework and state the composition theorem which
guarantees that secure protocols can be securely composed into larger
applications.
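As a rough illustration of this key-stream construction, here is a minimal sketch in which a hypothetical run_qkd_round stand-in replaces the actual quantum protocol (quantum communication plus classical post-processing) with fresh randomness; the composability guarantee is what justifies concatenating the per-round keys.

```python
import secrets
from typing import Iterator, Optional

def run_qkd_round(round_len: int = 32) -> Optional[bytes]:
    """Hypothetical stand-in for one round of a QKD protocol.

    A real round involves quantum communication and classical
    post-processing; here we simply return fresh random bytes, or None
    to model an abort (e.g., an estimated error rate that is too high).
    """
    return secrets.token_bytes(round_len)

def key_stream(round_len: int = 32) -> Iterator[bytes]:
    """Continuous key stream from sequential composition of QKD rounds.

    If each round is secure in a composable sense, the concatenation of
    the per-round keys remains secure, so consumers may start using
    early key blocks while later rounds are still running.
    """
    while True:
        block = run_qkd_round(round_len)
        if block is None:  # a round aborted: stop the stream
            return
        yield block

# Consume the first two 32-byte blocks of the stream.
stream = key_stream()
k1, k2 = next(stream), next(stream)
assert k1 != k2
```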
Robust Quantum Public-Key Encryption with Applications to Quantum Key Distribution
Quantum key distribution (QKD) allows Alice and Bob to agree on a shared
secret key, while communicating over a public (untrusted) quantum channel.
Compared to classical key exchange, it has two main advantages: (i) the key is
unconditionally hidden from any attacker, and (ii) its security
assumes only the existence of authenticated classical channels which, in
practice, can be realized using Minicrypt assumptions, such as the existence of
digital signatures. On the flip side, QKD protocols typically require multiple
rounds of interaction, whereas classical key exchange can be realized with a
minimum of two messages using public-key encryption. A long-standing
open question is whether QKD requires more rounds of interaction than classical
key exchange. In this work, we propose a two-message QKD protocol that
satisfies everlasting security, assuming only the existence of quantum-secure
one-way functions. That is, the shared key is unconditionally hidden, provided
computational assumptions hold during the protocol execution. Our result
follows from a new construction of quantum public-key encryption (QPKE) whose
security, much like its classical counterpart, only relies on authenticated
classical channels.
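The two-message structure of the protocol can be sketched as follows. The ToyQPKE class is a purely classical, hypothetical placeholder for the paper's quantum public-key encryption, and its XOR-based cipher is illustrative only; the point is the message pattern, not the construction.

```python
import secrets

class ToyQPKE:
    """Classical placeholder for quantum public-key encryption (QPKE)."""

    def keygen(self):
        sk = secrets.token_bytes(32)
        pk = sk  # stand-in: a real scheme would publish quantum public keys
        return pk, sk

    def enc(self, pk: bytes, m: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(pk, m))  # toy XOR "encryption"

    def dec(self, sk: bytes, ct: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(sk, ct))

qpke = ToyQPKE()

# Message 1 (Alice -> Bob, over an authenticated channel): the public key.
pk, sk = qpke.keygen()

# Message 2 (Bob -> Alice): an encryption of a freshly sampled key.
k_bob = secrets.token_bytes(32)
ct = qpke.enc(pk, k_bob)

# Alice recovers the shared key; two messages total.
k_alice = qpke.dec(sk, ct)
assert k_alice == k_bob
```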
Non-Interactive Quantum Key Distribution
Quantum key distribution (QKD) allows Alice and Bob to agree on a shared secret key, while communicating over a public (untrusted) quantum channel.
Compared to classical key exchange, it has two main advantages:
(i) The key is unconditionally hidden from any attacker, and
(ii) its security assumes only the existence of authenticated classical channels which, in practice, can be realized using Minicrypt assumptions, such as the existence of digital signatures.
On the flip side, QKD protocols typically require multiple rounds of interaction, whereas classical key exchange can be realized with a minimum of two messages. A long-standing open question is whether QKD requires more rounds of interaction than classical key exchange.
In this work, we propose a two-message QKD protocol that satisfies everlasting security, assuming only the existence of quantum-secure one-way functions.
That is, the shared key is unconditionally hidden, provided computational assumptions hold during the protocol execution. Our result follows from a new quantum cryptographic primitive that we introduce in this work: the quantum-public-key one-time pad, a public-key analogue of the well-known one-time pad.
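For reference, the classical one-time pad that the new primitive mirrors is sketched below; in the quantum-public-key analogue, the pre-shared pad is in effect replaced by a (quantum) public key, so that anyone can encrypt without prior key agreement.

```python
import secrets

def otp_encrypt(pad: bytes, m: bytes) -> bytes:
    """Classical one-time pad: perfectly hiding when the pad is uniform,
    used only once, and as long as the message."""
    assert len(pad) == len(m)
    return bytes(p ^ x for p, x in zip(pad, m))

def otp_decrypt(pad: bytes, ct: bytes) -> bytes:
    return otp_encrypt(pad, ct)  # XOR is its own inverse

msg = b"attack at dawn!!"
pad = secrets.token_bytes(len(msg))
ct = otp_encrypt(pad, msg)
assert otp_decrypt(pad, ct) == msg
```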
Publicly Verifiable Deletion from Minimal Assumptions
We present a general compiler that adds the publicly verifiable deletion property to various cryptographic primitives, including public-key encryption, attribute-based encryption, and quantum fully homomorphic encryption. Our compiler uses only one-way functions, or more generally hard quantum planted problems for NP, which are implied by one-way functions. It relies on minimal assumptions and lets us add publicly verifiable deletion to the above primitives with no additional assumption. Previously, such compilers needed additional assumptions, such as injective trapdoor one-way functions or pseudorandom group actions [Bartusek-Khurana-Poremba, ePrint:2023/370]. Technically, we upgrade an existing compiler for privately verifiable deletion [Bartusek-Khurana, ePrint:2022/1178] to achieve publicly verifiable deletion by using digital signatures.
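A toy sketch of the publicly verifiable deletion syntax may help: encryption additionally outputs a public verification key, deletion yields a certificate, and anyone can check the certificate. The hash-preimage certificate below is a hypothetical stand-in that only illustrates the interface; the actual compiler binds the certificate to the destruction of quantum information, which no classical snippet can capture.

```python
import hashlib
import secrets

def enc_with_pvd(message: bytes):
    """Toy PVD encryption: also outputs a public verification key."""
    witness = secrets.token_bytes(32)            # planted secret
    vk = hashlib.sha256(witness).digest()        # public verification key
    ciphertext = (message, witness)              # toy: witness rides along
    return ciphertext, vk

def delete(ciphertext):
    """Toy deletion: surrender the planted witness as the certificate."""
    _, witness = ciphertext
    return witness

def verify(vk: bytes, cert: bytes) -> bool:
    """Publicly checkable: no secret key is needed to verify deletion."""
    return hashlib.sha256(cert).digest() == vk

ct, vk = enc_with_pvd(b"secret")
cert = delete(ct)
assert verify(vk, cert)
```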
Unconditionally Secure and Universally Composable Commitments from Physical Assumptions
We present a constant-round unconditional black-box compiler that transforms any ideal (i.e., statistically hiding and statistically binding) straight-line extractable commitment scheme into an extractable and equivocal commitment scheme, thereby yielding UC-security [9]. We exemplify the usefulness of our compiler by providing two (constant-round) instantiations of ideal straight-line extractable commitments based on (malicious) PUFs [36] and stateless tamper-proof hardware tokens [26], thereby achieving the first unconditionally UC-secure commitments with malicious PUFs and stateless tokens, respectively. Our constructions are secure against adversaries creating arbitrarily malicious stateful PUFs/tokens.
Previous results with malicious PUFs used either computational assumptions to achieve UC-secure commitments or were unconditionally secure but only in the indistinguishability sense [36]. Similarly, with stateless tokens, UC-secure commitments are known only under computational assumptions [13, 24, 15], while the (non-UC) unconditional commitment scheme of [23] is secure only in a weaker model in which the adversary is not allowed to create stateful tokens.
Besides allowing us to prove the feasibility of unconditional UC-security with (malicious) PUFs and stateless tokens, our compiler can be instantiated with any ideal straight-line extractable commitment scheme, thus allowing the use of various setup assumptions that may better fit the application or the technology available.
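The commitment syntax the compiler operates on can be sketched as follows, with a hash-based commitment standing in for an ideal scheme. Note the stand-in is only computationally hiding, whereas the paper's instantiations from PUFs and stateless tokens are statistically hiding and binding; the sketch fixes the interface, nothing more.

```python
import hashlib
import secrets

def commit(message: bytes):
    """Commit phase: send `com` now, keep `opening` secret until later."""
    opening = secrets.token_bytes(32)
    com = hashlib.sha256(opening + message).digest()
    return com, opening

def open_commitment(com: bytes, opening: bytes, message: bytes) -> bool:
    """Open phase: reveal the message and opening; receiver re-checks."""
    return hashlib.sha256(opening + message).digest() == com

com, op = commit(b"bid: 42")
assert open_commitment(com, op, b"bid: 42")
```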
Modeling Computational Security in Long-Lived Systems, Version 2
For many cryptographic protocols, security relies on the assumption that adversarial entities have limited computational power. This type of security degrades progressively over the lifetime of a protocol. However, some cryptographic services, such as timestamping services or digital archives, are long-lived in nature; they are expected to be secure and operational for a very long time (i.e., super-polynomial). In such cases, security cannot be guaranteed in the traditional sense: a computationally secure protocol may become insecure if the attacker has a super-polynomial number of interactions with the protocol.

This paper proposes a new paradigm for the analysis of long-lived security protocols. We allow entities to be active for a potentially unbounded amount of real time, provided they perform only a polynomial amount of work per unit of real time. Moreover, the space used by these entities is allocated dynamically and must be polynomially bounded. We propose a new notion of long-term implementation, which is an adaptation of computational indistinguishability to the long-lived setting. We show that long-term implementation is preserved under polynomial parallel composition and exponential sequential composition. We illustrate the use of this new paradigm by analyzing some security properties of the long-lived timestamping protocol of Haber and Kamat.
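A toy sketch of the renewal idea behind Haber-Kamat-style long-lived timestamping: each stamp is refreshed under a newer hash function and certifies both the document and all earlier stamps, so security never rests on a single primitive for super-polynomial time. The three-step chain below is illustrative, not the analyzed protocol.

```python
import hashlib
import time

def stamp(doc: bytes, prev_stamps: list, hash_name: str) -> dict:
    """Timestamp `doc` under `hash_name`, chaining in earlier evidence."""
    h = hashlib.new(hash_name)
    h.update(doc)
    for s in prev_stamps:  # each renewal covers all prior stamps
        h.update(s["digest"])
    return {"hash": hash_name, "time": time.time(), "digest": h.digest()}

doc = b"archived record"
stamps = [stamp(doc, [], "sha1")]            # initial stamp, aging primitive
stamps.append(stamp(doc, stamps, "sha256"))  # renewal under a newer one
stamps.append(stamp(doc, stamps, "sha3_256"))

# Breaking sha1 later does not retroactively forge the chain, because the
# sha256 and sha3_256 stamps already certify the sha1-era evidence.
assert len({s["hash"] for s in stamps}) == 3
```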
The Communication Complexity of Threshold Private Set Intersection
Threshold private set intersection enables Alice and Bob, who hold sets S_A and S_B of size n, to compute the intersection S_A ∩ S_B if the sets do not differ by more than some threshold parameter t.
In this work, we investigate the communication complexity of this problem and we establish the first upper and lower bounds.
We show that any protocol has to have a communication complexity of Ω(t).
We show that an almost matching upper bound of Õ(t) can be obtained via fully homomorphic encryption.
We present a computationally more efficient protocol based on weaker assumptions, namely additively homomorphic encryption, with a communication complexity of Õ(t²).
We show how our protocols can be extended to the multiparty setting.
For applications like biometric authentication, where a given fingerprint has to have a large intersection with a fingerprint from a database, our protocols may result in significant communication savings.
Prior to this work, all previous protocols had a communication complexity of Ω(n).
Our protocols are the first ones with communication complexities that mainly depend on the threshold parameter t and only logarithmically on the set size n.
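To pin down the functionality, here is a plain, non-private reference implementation of threshold private set intersection. The protocols above realize this input/output behavior with communication that scales with the threshold t rather than the set size n; this sketch offers no privacy at all and serves only to fix the semantics.

```python
def threshold_psi(A: set, B: set, t: int):
    """Output the intersection only if the sets differ in at most t elements."""
    if len(A ^ B) > t:  # symmetric difference exceeds the threshold
        return None     # the parties learn only that the sets are far apart
    return A & B

A = {1, 2, 3, 4, 5}
B = {2, 3, 4, 5, 6}
assert threshold_psi(A, B, t=2) == {2, 3, 4, 5}
assert threshold_psi(A, B, t=1) is None
```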
Resettable Statistical Zero-Knowledge for NP
Resettable statistical zero-knowledge [Garg-Ostrovsky-Visconti-Wadia, TCC 2012] is a strong privacy notion that guarantees statistical zero-knowledge even when the prover uses the same randomness in multiple proofs.
In this paper, we show an equivalence between resettable statistical zero-knowledge arguments for NP and witness encryption schemes for NP.
- Positive result: For any language L in NP, a resettable statistical zero-knowledge argument for L can be constructed from a witness encryption scheme for L under the assumption of the existence of one-way functions.
- Negative result: The existence of even resettable statistically witness-indistinguishable arguments for NP implies the existence of witness encryption schemes for NP under the assumption of the existence of one-way functions.
The positive result is obtained by naturally extending existing techniques (and is likely to be already well-known among experts). The negative result is our main technical contribution.
To explore workarounds for the negative result, we also consider resettable security in a model where the honest party's randomness is only reused with fixed inputs. We show that resettable statistically hiding commitment schemes are impossible even in this model.
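A toy sketch of the witness-encryption syntax that the equivalence refers to: one encrypts with respect to an NP statement, and any valid witness for the statement decrypts. The hash-preimage relation below is hypothetical, and the toy "ciphertext" does not actually hide the message; it illustrates the interface only.

```python
import hashlib

def we_encrypt(statement: bytes, message: bytes):
    """Toy WE for the statement "I know w with sha256(w) = statement"."""
    return (statement, message)  # a real scheme would hide the message

def we_decrypt(ciphertext, witness: bytes):
    """Decryption succeeds exactly when the witness satisfies the relation."""
    statement, message = ciphertext
    if hashlib.sha256(witness).digest() != statement:
        return None  # invalid witness: learn nothing
    return message

w = b"the witness"
stmt = hashlib.sha256(w).digest()
ct = we_encrypt(stmt, b"hello")
assert we_decrypt(ct, w) == b"hello"
assert we_decrypt(ct, b"wrong") is None
```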
Modeling Computational Security in Long-Lived Systems
For many cryptographic protocols, security relies on the assumption that adversarial entities have limited computational power. This type of security degrades progressively over the lifetime of a protocol. However, some cryptographic services, such as timestamping services or digital archives, are long-lived in nature; they are expected to be secure and operational for a very long time (i.e., super-polynomial). In such cases, security cannot be guaranteed in the traditional sense: a computationally secure protocol may become insecure if the attacker has a super-polynomial number of interactions with the protocol.
This paper proposes a new paradigm for the analysis of long-lived security protocols. We allow entities to be active for a potentially unbounded amount of real time, provided they perform only a polynomial amount of work per unit of real time. Moreover, the space used by these entities is allocated dynamically and must be polynomially bounded. We propose a new notion of long-term implementation, which is an adaptation of computational indistinguishability to the long-lived setting. We show that long-term implementation is preserved under polynomial parallel composition and exponential sequential composition. We illustrate the use of this new paradigm by analyzing some security properties of the long-lived timestamping protocol of Haber and Kamat.