
    Placing Conditional Disclosure of Secrets in the Communication Complexity Universe

    In the conditional disclosure of secrets (CDS) problem (Gertner et al., J. Comput. Syst. Sci., 2000), Alice and Bob, who hold n-bit inputs x and y respectively, wish to release a common secret z to Carol (who knows both x and y) if and only if the input (x,y) satisfies some predefined predicate f. Alice and Bob are allowed to send a single message to Carol which may depend on their inputs and some shared randomness, and the goal is to minimize the communication complexity while providing information-theoretic security. Despite the growing interest in this model, very few lower bounds are known. In this paper, we relate the CDS complexity of a predicate f to its communication complexity under various communication games. For several basic predicates our results yield tight, or almost tight, lower bounds of Ω(n) or Ω(n^{1-ε}), providing an exponential improvement over previous logarithmic lower bounds. We also define new communication complexity classes that correspond to different variants of the CDS model and study the relations between them and their complements. Notably, we show that allowing for imperfect correctness can significantly reduce communication - a seemingly new phenomenon in the context of information-theoretic cryptography. Finally, our results show that proving explicit super-logarithmic lower bounds for imperfect CDS protocols is a necessary step towards proving explicit lower bounds against the class AM, or even AM ∩ coAM - a well-known open problem in the theory of communication complexity. Thus imperfect CDS forms a new minimal class which is placed just beyond the boundaries of the "civilized" part of the communication complexity world for which explicit lower bounds are known.
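
    For intuition, the sketch below (an illustrative toy, not a protocol from the paper) shows a classic one-message CDS protocol for the equality predicate f(x,y) = 1 iff x = y, working over a prime field: each of Alice and Bob sends a single linear combination of the shared randomness, and Carol's difference of the two messages reveals z exactly when x = y. The prime P and all function names are assumptions made for this example.

```python
import secrets

P = 2**61 - 1  # illustrative prime; any prime larger than the inputs works


def cds_equality_messages(x: int, y: int, z: int) -> tuple[int, int]:
    """Toy CDS for the equality predicate over GF(P).

    Alice holds x, Bob holds y, both know the common secret z and the
    shared randomness (r, s); each sends one message to Carol.
    """
    r = secrets.randbelow(P)           # shared randomness, unknown to Carol
    s = secrets.randbelow(P)
    msg_alice = (r * x + s + z) % P    # Alice's single message
    msg_bob = (r * y + s) % P          # Bob's single message
    return msg_alice, msg_bob


def carol_reconstruct(msg_alice: int, msg_bob: int) -> int:
    # Carol (knowing x and y) computes m_A - m_B = r*(x - y) + z.
    # If x == y this equals z; otherwise r*(x - y) is uniformly random
    # and information-theoretically hides z.
    return (msg_alice - msg_bob) % P


if __name__ == "__main__":
    z = 12345
    a, b = cds_equality_messages(x=7, y=7, z=z)
    assert carol_reconstruct(a, b) == z    # predicate satisfied: secret released
    a, b = cds_equality_messages(x=7, y=8, z=z)
    print(carol_reconstruct(a, b))         # predicate violated: uniform value, z hidden
```

    The communication here is one field element per party, i.e. on the order of n bits when P exceeds the n-bit inputs; the lower bounds discussed above concern how far such costs can be reduced for general predicates.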

    Separating Two-Round Secure Computation From Oblivious Transfer

    We consider the question of minimizing the round complexity of protocols for secure multiparty computation (MPC) with security against an arbitrary number of semi-honest parties. Very recently, Garg and Srinivasan (Eurocrypt 2018) and Benhamouda and Lin (Eurocrypt 2018) constructed such 2-round MPC protocols from minimal assumptions. This was done by showing a round-preserving reduction to the task of secure 2-party computation of the oblivious transfer functionality (OT). These constructions made a novel non-black-box use of the underlying OT protocol. The question remained whether this can be done by only making black-box use of 2-round OT. This is of theoretical and potentially also practical value, as black-box use of primitives tends to lead to more efficient constructions. Our main result proves that such a black-box construction is impossible, namely that non-black-box use of OT is necessary. As a corollary, a similar separation holds when starting with any 2-party functionality other than OT. As a secondary contribution, we prove several additional results that further clarify the landscape of black-box MPC with minimal interaction. In particular, we complement the separation from 2-party functionalities by presenting a complete 4-party functionality, give evidence for the difficulty of ruling out a complete 3-party functionality and for the difficulty of ruling out black-box constructions of 3-round MPC from 2-round OT, and separate a relaxed "non-compact" variant of 2-party homomorphic secret sharing from 2-round OT.
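
    As background, a minimal sketch (illustrative only, not the constructions discussed above) of the 1-out-of-2 oblivious transfer functionality: the sender holds two messages, the receiver holds a choice bit and learns only the chosen message, and the sender learns nothing about the choice. A black-box construction may invoke a 2-round OT protocol only through such an interface, whereas the compilers of Garg-Srinivasan and Benhamouda-Lin also use the code of the OT algorithms; the round names below are hypothetical.

```python
def ideal_ot(m0: bytes, m1: bytes, choice_bit: int) -> bytes:
    """Ideal 1-out-of-2 OT functionality (trusted-party formulation).

    The receiver obtains m_b and learns nothing about the other message;
    the sender gets no output and learns nothing about the choice bit b.
    """
    return m1 if choice_bit else m0


# Black-box use of a 2-round OT protocol touches only its input/output
# interface (hypothetical function names, for illustration):
#
#   ot_msg1 = ot_receiver_round1(choice_bit)        # receiver -> sender
#   ot_msg2 = ot_sender_round2(ot_msg1, m0, m1)     # sender   -> receiver
#   m_chosen = ot_receiver_output(ot_msg2)
#
# Non-black-box use additionally depends on the circuit descriptions of
# ot_receiver_round1 / ot_sender_round2 / ot_receiver_output themselves.
```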

    Statistically Sender-Private OT from LPN and Derandomization

    We construct a two-message oblivious transfer protocol with statistical sender privacy (SSP OT) based on the Learning Parity with Noise (LPN) Assumption and a standard Nisan-Wigderson-style derandomization assumption. Beyond being of interest on their own, SSP OT protocols have proven to be a powerful tool toward minimizing the round complexity in a wide array of cryptographic applications, from proof systems, through secure computation protocols, to hard problems in statistical zero knowledge (SZK). The protocol is plausibly post-quantum secure. The only other constructions with plausible post-quantum security are based on the Learning with Errors (LWE) Assumption. Lacking the geometric structure of LWE, our construction and analysis rely on a different set of techniques. Technically, we first construct an SSP OT protocol in the common random string model from LPN alone, and then derandomize the common random string. Most of the technical difficulty lies in the first step. Here we prove a robustness property of the inner-product randomness extractor to a certain type of linear splitting attacks. A caveat of our construction is that it relies on the so-called low-noise regime of LPN. This aligns with our current complexity-theoretic understanding of LPN, which is known to imply hardness in SZK only in the low-noise regime.
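
    For reference, a minimal sketch (toy parameters, not the paper's protocol) of the LPN assumption underlying the construction: samples (a, <a, s> + e mod 2), with uniform a and a sparse Bernoulli error e, should be indistinguishable from uniform. The dimension n and noise rate tau below are illustrative; the low-noise regime mentioned above refers to a noise rate that vanishes as n grows.

```python
import secrets


def bernoulli_bit(tau: float) -> int:
    # Error bit equal to 1 with probability tau (toy sampler).
    return 1 if secrets.randbelow(10**9) < int(tau * 10**9) else 0


def lpn_sample(s: list[int], tau: float) -> tuple[list[int], int]:
    """One LPN sample (a, <a, s> + e mod 2) for the secret s in {0,1}^n.

    The LPN assumption says that polynomially many such samples are
    computationally indistinguishable from pairs (uniform a, uniform bit).
    """
    a = [secrets.randbelow(2) for _ in range(len(s))]
    noiseless = sum(ai & si for ai, si in zip(a, s)) % 2
    return a, (noiseless + bernoulli_bit(tau)) % 2


if __name__ == "__main__":
    n = 512                      # toy dimension
    tau = 1 / (n ** 0.5)         # an illustrative "low noise" rate
    secret = [secrets.randbelow(2) for _ in range(n)]
    samples = [lpn_sample(secret, tau) for _ in range(1024)]
    print(sum(label for _, label in samples))   # labels look roughly balanced
```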

    Replacing Probability Distributions in Security Games via Hellinger Distance

    Security of cryptographic primitives is usually proved by assuming "ideal" probability distributions. We need to replace them with approximated "real" distributions in real-world systems without losing the security level. We demonstrate that the Hellinger distance is useful for this problem, while the statistical distance is mainly used in the cryptographic literature. First, we show that for preserving λ-bit security of a given security game, closeness of 2^{-λ/2} to the ideal distribution is sufficient for the Hellinger distance, whereas 2^{-λ} is generally required for the statistical distance. The result can be applied to both search and decision primitives through the bit security framework of Micciancio and Walter (Eurocrypt 2018). We also show that the Hellinger distance gives a tighter evaluation of closeness than the max-log distance when the distance is small. Finally, we show that the leftover hash lemma can be strengthened to the Hellinger distance. Namely, a universal family of hash functions gives a strong randomness extractor with optimal entropy loss for the Hellinger distance. Based on these results, a λ-bit entropy loss in randomness extractors is sufficient for preserving λ-bit security. The current understanding based on the statistical distance is that a 2λ-bit entropy loss is necessary.
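
    To make the comparison concrete, here is a small worked sketch (not from the paper) of the two distance measures on discrete distributions: the statistical (total variation) distance is (1/2) * sum_x |P(x) - Q(x)|, and the Hellinger distance is sqrt((1/2) * sum_x (sqrt(P(x)) - sqrt(Q(x)))^2). They always satisfy HD^2 <= SD <= sqrt(2) * HD.

```python
from math import sqrt


def statistical_distance(p: dict, q: dict) -> float:
    """Statistical (total variation) distance between discrete distributions."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)


def hellinger_distance(p: dict, q: dict) -> float:
    """Hellinger distance between discrete distributions."""
    support = set(p) | set(q)
    return sqrt(0.5 * sum((sqrt(p.get(x, 0.0)) - sqrt(q.get(x, 0.0))) ** 2
                          for x in support))


if __name__ == "__main__":
    # An "ideal" uniform bit vs. a slightly biased "real" implementation of it.
    ideal = {0: 0.5, 1: 0.5}
    real = {0: 0.5 + 1e-4, 1: 0.5 - 1e-4}
    sd = statistical_distance(ideal, real)
    hd = hellinger_distance(ideal, real)
    assert hd ** 2 <= sd <= sqrt(2) * hd   # standard relation between the two
    print(f"SD = {sd:.3e}, HD = {hd:.3e}")
```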

    Rational Broadcast Protocols against Timid Adversaries

    We present a constant-round deterministic broadcast protocol against timid adversaries in the synchronous authenticated setting. A timid adversary is a game-theoretically rational adversary who tries to attack the protocol but prefers its actions to remain undetected. Our protocol is secure against such an adversary corrupting t out of n parties for any t < n. The round complexity is 5 for timid adversaries and is at most t + 5 for general malicious adversaries. Our results demonstrate that game-theoretic rationality enables us to circumvent the impossibility of constructing constant-round deterministic broadcast protocols for t = ω(1).

    On Round Optimal Statistical Zero Knowledge Arguments

    We construct the first three-message statistical zero-knowledge arguments for all of NP, matching the known lower bound. We do so based on keyless multi-collision-resistant hash functions and other standard primitives (based on the Learning with Errors assumption) --- the same assumptions used to obtain round-optimal computational zero knowledge. The main component in our constructions is a statistically witness-indistinguishable argument of knowledge based on a new notion of statistically hiding commitments with subset opening.

    The Gap Is Sensitive to Size of Preimages: Collapsing Property Doesn't Go Beyond Quantum Collision-Resistance for Preimages Bounded Hash Functions

    As an enhancement of quantum collision-resistance, the collapsing property of hash functions proposed by Unruh (EUROCRYPT 2016) emphasizes the hardness of distinguishing a superposition state of a hash value from a collapsed one. The collapsing property trivially implies quantum collision-resistance. However, it remains unknown whether there is a reduction from collapsing hash functions to quantum collision-resistant hash functions. In this paper, we further study the relations between these two properties and derive two intriguing results as follows: Firstly, when the size of preimages of each hash value is bounded by some polynomial, we demonstrate that the collapsing property and the collision-resistance must hold simultaneously. This result is proved in a semi-black-box manner by taking advantage of the invertibility of a unitary quantum circuit. Next, we further consider the relations between these two properties in the exponential-sized preimages case. By giving a construction of polynomially bounded hash functions which preserves quantum collision-resistance, we show that the existence of collapsing hash functions is implied by quantum collision-resistant hash functions when the size of preimages is not too large relative to the expected value. Our results indicate that the gap between these two properties is sensitive to the size of preimages. As a corollary, our results also reveal the non-existence of polynomially bounded equivocal collision-resistant hash functions.

    More Efficient Commitments from Structured Lattice Assumptions

    We present a practical construction of an additively homomorphic commitment scheme based on structured lattice assumptions, together with a zero-knowledge proof of knowledge of the opening. Our scheme is a design improvement over the previous work of Benhamouda et al. in that it is not restricted to being statistically binding. While it is possible to instantiate our scheme to be statistically binding or statistically hiding, it is most efficient when both the hiding and binding properties are only computational. This results in approximately a factor of 4 reduction in the size of the proof and a factor of 6 reduction in the size of the commitment over the aforementioned scheme.
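
    To illustrate what "additively homomorphic" means here, the sketch below (a toy over plain Z_q with made-up dimensions, not the paper's ring-based scheme, and with no concrete security) commits to a vector as c = (A1*r, A2*r + m) and checks that the componentwise sum of two commitments opens to the sum of the messages under the sum of the randomness.

```python
import numpy as np

q = 3329                 # toy modulus
n, k, ell = 8, 16, 4     # toy dimensions: first part in Z_q^n, randomness in Z_q^k, message in Z_q^ell

rng = np.random.default_rng(0)
A1 = rng.integers(0, q, size=(n, k))     # public commitment key
A2 = rng.integers(0, q, size=(ell, k))


def commit(m, r):
    """Commit to message vector m with (small) randomness vector r."""
    return ((A1 @ r) % q, (A2 @ r + m) % q)


def verify_opening(c, m, r):
    c1, c2 = c
    return np.array_equal(c1, (A1 @ r) % q) and np.array_equal(c2, (A2 @ r + m) % q)


if __name__ == "__main__":
    m1, m2 = rng.integers(0, q, size=ell), rng.integers(0, q, size=ell)
    r1, r2 = rng.integers(0, 2, size=k), rng.integers(0, 2, size=k)   # small randomness
    ca, cb = commit(m1, r1), commit(m2, r2)

    # Additive homomorphism: the sum of two commitments is a commitment to
    # m1 + m2 under randomness r1 + r2 (which must stay small in a real scheme).
    c_sum = ((ca[0] + cb[0]) % q, (ca[1] + cb[1]) % q)
    assert verify_opening(c_sum, (m1 + m2) % q, r1 + r2)
```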