7 research outputs found

    Simple Schemes in the Bounded Storage Model

    The bounded storage model promises unconditional security proofs against computationally unbounded adversaries, so long as the adversary’s space is bounded. In this work, we develop simple new constructions of two-party key agreement, bit commitment, and oblivious transfer in this model. In addition to simplicity, our constructions have several advantages over prior work, including an improved number of rounds and enhanced correctness. Our schemes are based on Raz’s lower bound for learning parities.
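
    For intuition, the sampling-based template that bounded-storage key agreement is usually built on can be sketched in a few lines of Python. This is a hedged toy under illustrative parameters (STREAM_LEN and NUM_SAMPLES are ours), not the paper's actual Raz-based construction: a long public random stream passes by once, each party stores the bits at a few secretly chosen positions, and the key is read off the positions the two parties happen to share, which a space-bounded adversary is unlikely to have stored.

        import secrets

        STREAM_LEN = 1 << 20    # length of the public random stream (illustrative)
        NUM_SAMPLES = 1 << 13   # bits each honest party can afford to store

        rng = secrets.SystemRandom()

        def sample_positions():
            # Each party secretly picks which stream positions to remember.
            return set(rng.sample(range(STREAM_LEN), NUM_SAMPLES))

        alice_pos, bob_pos = sample_positions(), sample_positions()
        alice_mem, bob_mem = {}, {}

        # The stream goes by bit by bit and is then gone forever; each party
        # keeps only the bits at its own positions (bounded honest storage).
        for i in range(STREAM_LEN):
            bit = secrets.randbits(1)
            if i in alice_pos:
                alice_mem[i] = bit
            if i in bob_pos:
                bob_mem[i] = bit

        # Afterwards the parties publish their position sets (never the bits)
        # and keep the bits at the common positions as the shared key.
        common = sorted(alice_pos & bob_pos)
        key_a = [alice_mem[i] for i in common]
        key_b = [bob_mem[i] for i in common]
        assert key_a == key_b   # expected key length: NUM_SAMPLES^2 / STREAM_LEN = 64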

    Memory-Sample Lower Bounds for Learning Parity with Noise

    In this work, we show, for the well-studied problem of learning parity under noise, where a learner tries to learn $x=(x_1,\ldots,x_n) \in \{0,1\}^n$ from a stream of random linear equations over $\mathbb{F}_2$ that are correct with probability $\frac{1}{2}+\varepsilon$ and flipped with probability $\frac{1}{2}-\varepsilon$, that any learning algorithm requires either a memory of size $\Omega(n^2/\varepsilon)$ or an exponential number of samples. In fact, we study memory-sample lower bounds for a large class of learning problems, as characterized by [GRT'18], when the samples are noisy. A matrix $M: A \times X \rightarrow \{-1,1\}$ corresponds to the following learning problem with error parameter $\varepsilon$: an unknown element $x \in X$ is chosen uniformly at random, and a learner tries to learn $x$ from a stream of samples $(a_1, b_1), (a_2, b_2), \ldots$, where for every $i$, $a_i \in A$ is chosen uniformly at random and $b_i = M(a_i,x)$ with probability $\frac{1}{2}+\varepsilon$ and $b_i = -M(a_i,x)$ with probability $\frac{1}{2}-\varepsilon$ (for $0<\varepsilon<\frac{1}{2}$). Assume that $k, \ell, r$ are such that any submatrix of $M$ with at least $2^{-k} \cdot |A|$ rows and at least $2^{-\ell} \cdot |X|$ columns has a bias of at most $2^{-r}$. We show that any learning algorithm for the learning problem corresponding to $M$, with error parameter $\varepsilon$, requires either a memory of size at least $\Omega\left(\frac{k \cdot \ell}{\varepsilon}\right)$ or at least $2^{\Omega(r)}$ samples. In particular, this shows that for a large class of learning problems, the same as those in [GRT'18], any learning algorithm requires either a memory of size at least $\Omega\left(\frac{(\log |X|) \cdot (\log |A|)}{\varepsilon}\right)$ or an exponential number of noisy samples. Our proof is based on adapting the arguments in [Raz'17, GRT'18] to the noisy case.
    (19 pages. To appear in RANDOM 2021. arXiv admin note: substantial text overlap with arXiv:1708.0263.)
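
    The sample distribution defined above transcribes directly into Python. The sketch below (with example values for n and eps) generates the noisy equation stream a learner would see; the theorem says any algorithm reading this stream needs either about $n^2/\varepsilon$ bits of memory or exponentially many samples.

        import random

        n, eps = 32, 0.1   # example parameters: secret length and bias

        x = [random.randrange(2) for _ in range(n)]   # unknown secret in {0,1}^n

        def noisy_parity_sample():
            # One sample (a_i, b_i): a random linear equation over F_2 whose
            # right-hand side is correct with probability 1/2 + eps.
            a = [random.randrange(2) for _ in range(n)]
            b = sum(ai * xi for ai, xi in zip(a, x)) % 2   # true parity <a, x>
            if random.random() >= 0.5 + eps:               # flipped w.p. 1/2 - eps
                b ^= 1
            return a, b

        def sample_stream():
            # The learner processes samples one at a time and cannot revisit them.
            while True:
                yield noisy_parity_sample()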

    Secure Multiparty Computation in the Bounded Storage Model

    Most cryptography is based on assumptions such as factoring and discrete log, which assume an adversary has bounded computational power. With the recent developments in quantum computing, as well as concerns about everlasting security, there is an interest in coming up with information-theoretic constructions in the bounded storage model. In this model, an adversary is computationally unbounded but has limited space. Past works have constructed schemes such as key exchange and bit commitment in this model. In this work, we expand the functionalities further by building a semi-honest MPC protocol in the bounded storage model. We use the hardness of the parity learning problem (recently shown by Ran Raz (FOCS '16) without any cryptographic assumptions) to prove the security of our construction, following the work of Guan and Zhandry (EUROCRYPT '19).
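
    The protocol itself is left to the paper, but the semi-honest MPC functionality being targeted is easy to make concrete. As generic background only, here is the textbook additive-sharing pattern for secure addition (not the paper's bounded-storage protocol; MOD and the party count are illustrative):

        import secrets

        MOD = 2**61 - 1   # toy modulus; any sufficiently large modulus works
        PARTIES = 3

        def share(secret):
            # Additive secret sharing: the shares sum to the secret mod MOD,
            # and any proper subset of shares is uniformly distributed.
            shares = [secrets.randbelow(MOD) for _ in range(PARTIES - 1)]
            shares.append((secret - sum(shares)) % MOD)
            return shares

        def reconstruct(shares):
            return sum(shares) % MOD

        # Secure addition: each party adds its two shares locally, so no party
        # ever sees another party's input in the clear (semi-honest security).
        x_shares, y_shares = share(123), share(456)
        z_shares = [(xs + ys) % MOD for xs, ys in zip(x_shares, y_shares)]
        assert reconstruct(z_shares) == (123 + 456) % MOD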

    Speak Much, Remember Little: Cryptography in the Bounded Storage Model, Revisited

    The goal of the bounded storage model (BSM) is to construct unconditionally secure cryptographic protocols, by only restricting the storage capacity of the adversary, but otherwise giving it unbounded computational power. Here, we consider a streaming variant of the BSM, where honest parties can stream huge amounts of data to each other so as to overwhelm the adversary's storage, even while their own storage capacity is significantly smaller than that of the adversary. Prior works showed several impressive results in this model, including key agreement and oblivious transfer, but only as long as the adversary's storage $m = O(n^2)$ is at most quadratically larger than the honest user storage $n$. Moreover, the work of Dziembowski and Maurer (DM) also gave a seemingly matching lower bound, showing that key agreement in the BSM is impossible when $m > n^2$. In this work, we observe that the DM lower bound only applies to a significantly more restricted version of the BSM, and does not apply to the streaming variant. Surprisingly, we show that it is possible to construct key agreement and oblivious transfer protocols in the streaming BSM, where the adversary's storage can be significantly larger, and even exponential, $m = 2^{O(n)}$. The only price of accommodating larger values of $m$ is that the round and communication complexities of our protocols grow accordingly, and we provide lower bounds to show that an increase in rounds and communication is necessary. As an added benefit of our work, we also show that our oblivious transfer (OT) protocol in the BSM satisfies a simulation-based notion of security. In contrast, even for the restricted case of $m = O(n^2)$, prior solutions only satisfied a weaker indistinguishability-based definition. As an application of our OT protocol, we get general multiparty computation (MPC) in the BSM that allows for up to exponentially large gaps between $m$ and $n$, while also achieving simulation-based security.
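
    The streaming requirement on the honest parties is itself easy to make concrete. In the hedged sketch below (the parameters and the keyed-hash selector are our illustrative stand-ins, not the paper's combinatorics), a party holding only an n-bit key and O(n) bits of working state processes a stream far longer than its own memory, which is what lets honest parties outlast an adversary whose storage $m$ is much larger than $n$:

        import hashlib, secrets

        n = 128                  # honest party's memory budget in bits (illustrative)
        STREAM_BLOCKS = 1 << 16  # the stream dwarfs every party's memory

        key = secrets.token_bytes(n // 8)   # short secret held by the honest party
        acc = 0                             # n-bit running state: all we ever store

        def selected(key, index):
            # Deterministically decide whether block `index` is used, via a keyed
            # hash (a stand-in for the structured selection a real protocol uses).
            digest = hashlib.blake2b(index.to_bytes(8, "big"), key=key, digest_size=1)
            return digest.digest()[0] & 1

        for i in range(STREAM_BLOCKS):
            block = secrets.randbits(n)   # the next n stream bits fly by, then vanish
            if selected(key, i):
                acc ^= block              # constant-memory update

        # `acc` now depends on roughly half of a 2^16-block stream, although the
        # party never stored more than n bits of that stream at any moment.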

    Authentication in the Bounded Storage Model

    We consider the streaming variant of the Bounded Storage Model (BSM), where the honest parties can stream large amounts of data to each other, while only maintaining a small memory of size $n$. The adversary also operates as a streaming algorithm, but has a much larger memory size $m \gg n$. The goal is to construct unconditionally secure cryptographic schemes in the BSM, and prior works did so for symmetric-key encryption, key agreement, oblivious transfer and multiparty computation. In this work, we construct message authentication and signatures in the BSM. First, we consider the symmetric-key setting, where Alice and Bob share a small secret key. Alice can authenticate arbitrarily many messages to Bob by streaming long authentication tags of size $k \gg m$, while ensuring that the tags can be generated and verified using only $n$ bits of memory. We show a solution using local extractors (Vadhan; JoC '04), which allows for up to exponentially large adversarial memory $m = 2^{O(n)}$, and has tags of size $k = O(m)$. Second, we consider the same setting as above, but now additionally require each individual tag to be small, of size $k \leq n$. We show that a solution is still possible when the adversary's memory is $m = O(n^2)$, which is optimal. Our solution relies on a space lower bound for learning parities (Raz; FOCS '16). Third, we consider the public-key signature setting. A signer Alice initially streams a long verification key over an authentic channel, while only keeping a short signing key in her memory. A verifier Bob receives the streamed verification key and generates a short verification digest that he keeps in his memory. Later, Alice can sign arbitrarily many messages using her signing key by streaming the signatures to Bob, who can verify them using his verification digest. We show a solution for $m = O(n^2)$, which we show to be optimal. Our solution relies on a novel entropy lemma, of independent interest. We show that, if a sequence of blocks has sufficiently high min-entropy, then a large fraction of individual blocks must have high min-entropy. Naive versions of this lemma are false, but we show how to patch it to make it hold.
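
    To see why naive versions of such an entropy lemma fail, consider the following standard-style counterexample (our illustration, not the paper's lemma or its patch). Take $t$ blocks $X_1, \ldots, X_t$ of length $L$ each, pick an index $B$ uniformly from $[t]$, set $X_B = 0^L$, and let all other blocks be uniform and independent. The joint min-entropy is high, yet every single block has low min-entropy:

        H_\infty(X_1, \ldots, X_t) = (t-1) \cdot L,
        \qquad
        \Pr[X_i = 0^L] \ge \tfrac{1}{t}
        \;\Longrightarrow\;
        H_\infty(X_i) \le \log t \quad \text{for every } i.

    So, without further conditioning, "a large fraction of individual blocks have high min-entropy" is simply false, which is exactly the kind of failure the patched lemma has to rule out.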

    Rate-1 Incompressible Encryption from Standard Assumptions

    Incompressible encryption, recently proposed by Guan, Wichs and Zhandry (EUROCRYPT '22), is a novel encryption paradigm geared towards providing strong long-term security guarantees against adversaries with bounded long-term memory. Given that the adversary forgets just a small fraction of a ciphertext, this notion provides strong security for the message encrypted therein, even if, at some point in the future, the entire secret key is exposed. This comes at the price of having potentially very large ciphertexts. Thus, an important efficiency measure for incompressible encryption is the message-to-ciphertext ratio (also called the rate). Guan et al. provided a low-rate instantiation of this notion from standard assumptions and a rate-1 instantiation from indistinguishability obfuscation (iO). In this work, we propose a simple framework to build rate-1 incompressible encryption from standard assumptions. Our construction can be realized from, e.g., the DDH assumption together with either the DCR or the LWE assumption.
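
    For orientation, the rate deficiency that this work removes is visible in the classic low-rate template, sketched below as a hedged Python toy (a keyed hash stands in for a provable randomness extractor, and all sizes are illustrative; this shows the paradigm's shape, not any paper's actual scheme): the ciphertext carries a huge random string R next to a short masked message, so the rate MSG_LEN / (R_LEN + MSG_LEN) is tiny, whereas a rate-1 scheme makes the ciphertext barely longer than the message.

        import hashlib, secrets

        MSG_LEN = 32       # bytes of plaintext
        R_LEN = 1 << 20    # huge incompressible part of the ciphertext (bytes)

        def encrypt(seed, msg):
            # Toy low-rate template: ct = (R, msg XOR Ext_seed(R)).
            R = secrets.token_bytes(R_LEN)
            pad = hashlib.blake2b(R, key=seed, digest_size=len(msg)).digest()
            return R, bytes(m ^ p for m, p in zip(msg, pad))

        def decrypt(seed, ct):
            R, body = ct
            pad = hashlib.blake2b(R, key=seed, digest_size=len(body)).digest()
            return bytes(c ^ p for c, p in zip(body, pad))

        seed = secrets.token_bytes(32)
        msg = secrets.token_bytes(MSG_LEN)
        assert decrypt(seed, encrypt(seed, msg)) == msg
        # Intuition: an adversary that kept only part of R cannot recompute the
        # pad later, even if `seed` is eventually exposed in full.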

    Flexible Long-Term Secure Archiving

    Privacy and data protection have always been basic human needs in any society that makes use of written language. From simple personal correspondence over military communication to trade secrets or medical information, confidentiality has been of utmost importance. The implications of a leak of such sensitive information may prove devastating, as the previous examples illustrate perfectly. Furthermore, reliability, that is, the integrity and authenticity of information, is critical, with risks ranging from annoying to lethal, as can again be seen in the previous examples. This need for data protection has carried over seamlessly from the analogue to the digital age, with the amount of data being generated, transmitted and stored increasing steadily and containing ever more personal details. In view of the developments in computational technology in recent years, such as the ongoing improvements in quantum computing as well as cryptanalytic advances, the capabilities of attackers against the security of private information have never been greater. Thus, the need for privacy and data protection has rarely been more dire.