Simple Schemes in the Bounded Storage Model
The bounded storage model promises unconditional security proofs against computationally unbounded adversaries, so long as the adversary's space is bounded. In this work, we develop simple new constructions of two-party key agreement, bit commitment, and oblivious transfer in this model. In addition to simplicity, our constructions have several advantages over prior work, including an improved number of rounds and enhanced correctness. Our schemes are based on Raz's lower bound for learning parities.
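As a rough illustration of how Raz's lower bound yields unconditional security (a toy sketch, not one of the paper's actual protocols): with a shared key x ∈ {0,1}^n, a bit m can be encrypted as a fresh random vector a together with ⟨a, x⟩ ⊕ m. Raz's result implies that an adversary whose memory is much smaller than n² cannot learn x, even after observing many such pairs. A minimal Python simulation:

```python
import secrets

def keygen(n):
    # shared secret key x in {0,1}^n
    return [secrets.randbits(1) for _ in range(n)]

def inner(a, x):
    # inner product over GF(2)
    return sum(ai & xi for ai, xi in zip(a, x)) & 1

def encrypt_bit(x, m):
    # fresh random vector a per bit; ciphertext is (a, <a,x> XOR m)
    a = [secrets.randbits(1) for _ in range(len(x))]
    return a, inner(a, x) ^ m

def decrypt_bit(x, ct):
    a, b = ct
    return b ^ inner(a, x)

# round-trip check with a toy key length (real security needs
# adversary memory much smaller than n^2)
x = keygen(16)
assert all(decrypt_bit(x, encrypt_bit(x, m)) == m for m in (0, 1))
```

The function names here are illustrative; security against a space-bounded adversary follows because recovering x from the stream of (a, ⟨a, x⟩) pairs is exactly the parity learning problem.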
Memory-Sample Lower Bounds for Learning Parity with Noise
In this work, we show, for the well-studied problem of learning parity under noise, where a learner tries to learn x = (x₁, …, xₙ) ∈ {0,1}ⁿ from a stream of random linear equations over 𝔽₂ that are correct with probability 1/2 + ε and flipped with probability 1/2 − ε (0 < ε < 1/2), that any learning algorithm requires either a memory of size Ω(n²/ε) or an exponential number of samples.
In fact, we study memory-sample lower bounds for a large class of learning problems, as characterized by [GRT'18], when the samples are noisy. A matrix M : A × X → {−1, 1} corresponds to the following learning problem with error parameter ε: an unknown element x ∈ X is chosen uniformly at random. A learner tries to learn x from a stream of samples, (a₁, b₁), (a₂, b₂), …, where for every i, aᵢ ∈ A is chosen uniformly at random and bᵢ = M(aᵢ, x) with probability 1/2 + ε and bᵢ = −M(aᵢ, x) with probability 1/2 − ε (0 < ε < 1/2). Assume that k, ℓ, r are such that any submatrix of M of at least 2^(−k) · |A| rows and at least 2^(−ℓ) · |X| columns has a bias of at most 2^(−r). We show that any learning algorithm for the learning problem corresponding to M, with error, requires either a memory of size at least Ω(k · ℓ / ε), or at least 2^(Ω(r)) samples. In particular, this shows that for a large class of learning problems, the same as those in [GRT'18], any learning algorithm requires either a memory of size at least Ω((log |X|) · (log |A|) / ε) or an exponential number of noisy samples.
Our proof is based on adapting the arguments in [Raz'17, GRT'18] to the noisy case. Comment: 19 pages. To appear in RANDOM 2021. arXiv admin note: substantial text overlap with arXiv:1708.0263.
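To make the sample model concrete, the noisy stream described above can be simulated as follows (an illustrative sketch in our own notation, not code from the paper; the theorem is about the memory of the learner consuming such a stream, not about this generator):

```python
import random

def noisy_parity_stream(x, eps, num_samples, rng=random):
    # Stream of noisy linear equations over GF(2):
    # each sample (a_i, b_i) has a_i chosen uniformly at random, and
    # b_i = <a_i, x> with prob. 1/2 + eps, flipped with prob. 1/2 - eps.
    n = len(x)
    for _ in range(num_samples):
        a = [rng.randrange(2) for _ in range(n)]
        b = sum(ai & xi for ai, xi in zip(a, x)) & 1
        if rng.random() < 0.5 - eps:
            b ^= 1  # label flipped by noise
        yield a, b
```

A learner with unbounded memory could recover x from such a stream by collecting many equations and majority-voting; the lower bound says that any learner with memory much below n²/ε instead needs exponentially many samples.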
Secure Multiparty Computation in the Bounded Storage Model
Most cryptography is based on assumptions such as factoring and discrete log, which assume an adversary has bounded computational power. With the recent developments in quantum computing, as well as the concern with everlasting security, there is interest in coming up with information-theoretic constructions in the bounded storage model.
In this model, an adversary is computationally unbounded but has limited space. Past works have constructed schemes such as key exchange and bit commitment in this model. In this work, we expand the functionalities further by building a semi-honest MPC protocol in the bounded storage model. We use the hardness of the parity learning problem (recently shown by Ran Raz (FOCS 16) without any cryptographic assumptions) to prove the security of our construction, following the work by Guan and Zhandry (EUROCRYPT 19).
Speak Much, Remember Little: Cryptography in the Bounded Storage Model, Revisited
The goal of the bounded storage model (BSM) is to construct unconditionally secure cryptographic protocols by only restricting the storage capacity of the adversary, while otherwise giving it unbounded computational power. Here, we consider a streaming variant of the BSM, where honest parties can stream huge amounts of data to each other so as to overwhelm the adversary's storage, even while their own storage capacity is significantly smaller than that of the adversary. Prior works showed several impressive results in this model, including key agreement and oblivious transfer, but only as long as the adversary's storage k is at most quadratically larger than the honest user storage n. Moreover, the work of Dziembowski and Maurer (DM) also gave a seemingly matching lower bound, showing that key agreement in the BSM is impossible when k ≥ n².
In this work, we observe that the DM lower bound only applies to a significantly more restricted version of the BSM, and does not apply to the streaming variant. Surprisingly, we show that it is possible to construct key agreement and oblivious transfer protocols in the streaming BSM, where the adversary's storage k can be significantly larger, and even exponential, k = 2^(O(n)). The only price of accommodating larger values of k is that the round and communication complexities of our protocols grow accordingly, and we provide lower bounds to show that an increase in rounds and communication is necessary.
As an added benefit of our work, we also show that our oblivious transfer (OT) protocol in the BSM satisfies a simulation-based notion of security. In contrast, even for the restricted case of k = O(n²), prior solutions only satisfied a weaker indistinguishability-based definition. As an application of our OT protocol, we get general multiparty computation (MPC) in the BSM that allows for up to exponentially large gaps between n and k, while also achieving simulation-based security.
Authentication in the Bounded Storage Model
We consider the streaming variant of the Bounded Storage Model (BSM), where the honest parties can stream large amounts of data to each other, while only maintaining a small memory of size n. The adversary also operates as a streaming algorithm, but has a much larger memory size k ≫ n. The goal is to construct unconditionally secure cryptographic schemes in the BSM, and prior works did so for symmetric-key encryption, key agreement, oblivious transfer and multiparty computation. In this work, we construct message authentication and signatures in the BSM.
First, we consider the symmetric-key setting, where Alice and Bob share a small secret key. Alice can authenticate arbitrarily many messages to Bob by streaming long authentication tags of size O(k), while ensuring that the tags can be generated and verified using only O(n) bits of memory. We show a solution using local extractors (Vadhan; JoC '04), which allows for up to exponentially large adversarial memory k = 2^(O(n)) and has tags of size O(k).
Second, we consider the same setting as above, but now additionally require each individual tag to be small, of size O(n). We show a solution is still possible when the adversary's memory is k = O(n²), which is optimal. Our solution relies on a space lower bound for learning parities (Raz; FOCS '16).
Third, we consider the public-key signature setting. A signer Alice initially streams a long verification key over an authentic channel, while only keeping a short signing key in her memory. A verifier Bob receives the streamed verification key and generates a short verification digest that he keeps in his memory. Later, Alice can sign arbitrarily many messages using her signing key by streaming the signatures to Bob, who can verify them using his verification digest. We show a solution for k = O(n²), which we show to be optimal. Our solution relies on a novel entropy lemma, of independent interest. We show that, if a sequence of blocks has sufficiently high min-entropy, then a large fraction of the individual blocks must have high min-entropy. Naive versions of this lemma are false, but we show how to patch it to make it hold.
Rate-1 Incompressible Encryption from Standard Assumptions
Incompressible encryption, recently proposed by Guan, Wichs and Zhandry (EUROCRYPT '22), is a novel encryption paradigm geared towards providing strong long-term security guarantees against adversaries with bounded long-term memory. Given that the adversary forgets just a small fraction of a ciphertext, this notion provides strong security for the message encrypted therein, even if, at some point in the future, the entire secret key is exposed. This comes at the price of having potentially very large ciphertexts. Thus, an important efficiency measure for incompressible encryption is the message-to-ciphertext ratio (also called the rate). Guan et al. provided a low-rate instantiation of this notion from standard assumptions and a rate-1 instantiation from indistinguishability obfuscation (iO).
In this work, we propose a simple framework to build rate-1 incompressible encryption from standard assumptions. Our construction can be realized from, e.g., the DDH and additionally the DCR or the LWE assumptions.
Flexible Long-Term Secure Archiving
Privacy and data protection have always been basic human needs in any society that makes use of written language. From simple personal correspondence through military communication to trade secrets or medical information, confidentiality has been of utmost importance. The implications of a leak of such sensitive information may prove devastating, as the previous examples illustrate. Furthermore, reliability, that is, integrity and authenticity of information, is critical, with risks ranging from annoying to lethal, as can again be seen in the previous examples.
This need for data protection has carried over seamlessly from the analogue to the digital age, with the amount of data being generated, transmitted and stored increasing steadily and containing more and more personal details. In view of recent developments in computational technology, such as the ongoing progress in quantum computing as well as cryptanalytic advances, the capabilities of attackers against the security of private information have never been greater. Thus the need for privacy and data protection has rarely been more dire.