On the Communication Complexity of Secure Computation
Information theoretically secure multi-party computation (MPC) is a central
primitive of modern cryptography. However, relatively little is known about the
communication complexity of this primitive.
In this work, we develop powerful information theoretic tools to prove lower
bounds on the communication complexity of MPC. We restrict ourselves to a
3-party setting in order to bring out the power of these tools without
introducing too many complications. Our techniques include the use of a data
processing inequality for residual information (i.e., the gap between mutual
information and Gács-Körner common information), a new information
inequality for 3-party protocols, and the idea of distribution switching by
which lower bounds computed under certain worst-case scenarios can be shown to
apply for the general case.
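As a concrete editorial illustration (not from the paper), the residual information I(X;Y) - K(X;Y) can be computed for small joint distributions: the Gács-Körner common information K(X;Y) equals the entropy of the connected components of the bipartite graph on the support of (X, Y). The distributions below are illustrative examples.

```python
import math
from collections import defaultdict

def mutual_information(p):
    """I(X;Y) in bits; p maps (x, y) -> probability."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), q in p.items():
        px[x] += q
        py[y] += q
    return sum(q * math.log2(q / (px[x] * py[y]))
               for (x, y), q in p.items() if q > 0)

def gk_common_information(p):
    """Gacs-Korner common information K(X;Y): the entropy of the
    connected components of the bipartite graph whose edges are the
    support of the joint distribution (the maximal common variable)."""
    parent = {}
    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a
    for (x, y), q in p.items():
        if q > 0:
            parent[find(('x', x))] = find(('y', y))
    mass = defaultdict(float)  # probability mass per component
    for (x, y), q in p.items():
        mass[find(('x', x))] += q
    return -sum(m * math.log2(m) for m in mass.values() if m > 0)

def residual_information(p):
    """RI(X;Y) = I(X;Y) - K(X;Y)."""
    return mutual_information(p) - gk_common_information(p)

# Perfectly correlated bits: I = K = 1 bit, so residual information is 0.
perfect = {(0, 0): 0.5, (1, 1): 0.5}
# Noisy correlation: the support graph is connected, so K = 0 and the
# residual information equals I(X;Y) > 0.
noisy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
```

Note that residual information is zero exactly when all the correlation is "extractable" as a common random variable; the noisy example shows correlation that no common variable can capture.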
Using these techniques we obtain tight bounds on the communication complexity
of MPC protocols for various interesting functions. In particular, we show
concrete functions that have "communication-ideal" protocols, which achieve the
minimum communication simultaneously on all links in the network. Also, we
obtain the first explicit example of a function that incurs a higher
communication cost than the input length in the secure computation model of
Feige, Kilian and Naor (1994), who had shown that such functions exist. We also
show that our communication bounds imply tight lower bounds on the amount of
randomness required by MPC protocols for many interesting functions.
Comment: 37 pages
Exponential Lower Bound for 2-Query Locally Decodable Codes via a Quantum Argument
A locally decodable code encodes n-bit strings x in m-bit codewords C(x), in
such a way that one can recover any bit x_i from a corrupted codeword by
querying only a few bits of that word. We use a quantum argument to prove that
LDCs with 2 classical queries need exponential length: m=2^{Omega(n)}.
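A standard example of a 2-query LDC (not part of this abstract) is the Hadamard code, whose length 2^n matches the exponential lower bound: the codeword lists the parity <x, y> for every y, and x_i is recovered by XORing two random complementary positions. A minimal sketch:

```python
import random

def hadamard_encode(x):
    """Hadamard code: one parity bit <x, y> mod 2 per y in {0,1}^n.
    Code length is m = 2^n, matching the exponential lower bound."""
    n = len(x)
    return [sum(x[j] for j in range(n) if (y >> j) & 1) % 2
            for y in range(1 << n)]

def decode_bit(word, n, i, rng=random):
    """2-query local decoding of x_i: query positions y and y XOR e_i
    and XOR the answers; on an uncorrupted word this is exactly x_i,
    and under delta-fraction corruption it succeeds w.p. >= 1 - 2*delta."""
    y = rng.randrange(1 << n)
    return word[y] ^ word[y ^ (1 << i)]

x = [1, 0, 1, 1, 0]
word = hadamard_encode(x)

# Corrupt a small fraction (2 of 32 positions) of the codeword.
rng = random.Random(0)
corrupted = word[:]
for pos in rng.sample(range(len(word)), 2):
    corrupted[pos] ^= 1

# Majority vote over independent decodings amplifies the success probability.
votes = [decode_bit(corrupted, len(x), 2, rng) for _ in range(101)]
recovered = 1 if sum(votes) > 50 else 0
```

Each decoding touches only 2 of the 32 codeword bits, which is the locality the lower bound shows must be paid for with exponential length.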
Previously this was known only for linear codes (Goldreich et al. 02). Our
proof shows that a 2-query LDC can be decoded with only 1 quantum query, and
then proves an exponential lower bound for such 1-query locally
quantum-decodable codes. We also show that q quantum queries allow more
succinct LDCs than the best known LDCs with q classical queries. Finally, we
give new classical lower bounds and quantum upper bounds for the setting of
private information retrieval. In particular, we exhibit a quantum 2-server PIR
scheme with O(n^{3/10}) qubits of communication, improving upon the O(n^{1/3})
bits of communication of the best known classical 2-server PIR.
Comment: 16 pages LaTeX. 2nd version: title changed, large parts rewritten,
some results added or improved
On the Combinatorial Version of the Slepian-Wolf Problem
We study the following combinatorial version of the Slepian-Wolf coding
scheme. Two isolated Senders are given binary strings x and y respectively;
the length of each string is equal to n, and the Hamming distance between the
strings is bounded. The Senders compress their strings and
communicate the results to the Receiver. Then the Receiver must reconstruct
both strings x and y. The aim is to minimise the lengths of the transmitted
messages.
For an asymmetric variant of this problem (where one of the Senders transmits
the input string to the Receiver without compression) with deterministic
encoding a nontrivial lower bound was found by A. Orlitsky and K. Viswanathan.
In our paper we prove a new lower bound for the schemes with syndrome coding,
where at least one of the Senders uses linear encoding of the input string.
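The flavor of syndrome coding can be sketched with the [7,4] Hamming code (an illustrative toy, not the paper's construction): Sender A transmits x in full, Sender B transmits only the 3-bit syndrome of y, and the Receiver decodes the difference e = x XOR y from its syndrome, assuming Hamming distance at most 1 (the code's correction radius).

```python
# Parity-check matrix of the [7,4] Hamming code: column j is the binary
# expansion of j+1, so the syndrome of a weight-1 error names the
# flipped position directly.
H = [[(j + 1) >> r & 1 for j in range(7)] for r in range(3)]

def syndrome(v):
    """3-bit syndrome H*v over GF(2) of a 7-bit vector."""
    return tuple(sum(H[r][j] & v[j] for j in range(7)) % 2 for r in range(3))

def receiver(x, s_y):
    """Recover y from the full string x and the 3-bit syndrome of y,
    assuming Hamming distance(x, y) <= 1."""
    # By linearity, syndrome(x) XOR syndrome(y) = syndrome(x XOR y).
    s_e = tuple(a ^ b for a, b in zip(syndrome(x), s_y))
    if any(s_e):
        pos = sum(bit << r for r, bit in enumerate(s_e)) - 1
        x = x[:]
        x[pos] ^= 1  # flip the single differing bit to obtain y
    return x

x = [1, 0, 1, 1, 0, 0, 1]
y = x[:]
y[4] ^= 1  # Hamming distance 1

# Sender A transmits all 7 bits of x; Sender B transmits only 3 bits.
recovered_y = receiver(x, syndrome(y))
```

The point is that Sender B's message length depends only on the distance bound, not on the string length, which is what makes linear (syndrome) encodings the natural candidates studied in the lower bound.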
For the combinatorial Slepian-Wolf problem with randomized encoding the
theoretical optimum of communication complexity was recently found by the first
author, though effective protocols with optimal lengths of messages remained
unknown. We close this gap and present a polynomial time randomized protocol
that achieves the optimal communication complexity.
Comment: 20 pages, 14 figures. Accepted to IEEE Transactions on Information
Theory (June 2018)
Improved Lower Bounds for Locally Decodable Codes and Private Information Retrieval
We prove new lower bounds for locally decodable codes and private information
retrieval. We show that a 2-query LDC encoding n-bit strings over an l-bit
alphabet, where the decoder only uses b bits of each queried position of the
codeword, needs code length m = exp(Omega(n/(2^b Sum_{i=0}^b {l choose i}))).
Similarly, a 2-server PIR scheme with an n-bit database and t-bit queries,
where the user only needs b bits from each of the two l-bit answers, unknown to
the servers, satisfies t = Omega(n/(2^b Sum_{i=0}^b {l choose i})). This
implies that several known PIR schemes are close to optimal. Our results
generalize those of Goldreich et al. who proved roughly the same bounds for
linear LDCs and PIRs. Like earlier work by Kerenidis and de Wolf, our classical
lower bounds are proved using quantum computational techniques. In particular,
we give a tight analysis of how well a 2-input function can be computed from a
quantum superposition of both inputs.
Comment: 12 pages LaTeX. To appear in ICALP '0
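To get a feel for the bound above, the following snippet evaluates its exponent n / (2^b Sum_{i=0}^b C(l, i)) for sample parameters; the constants hidden in the Omega are omitted and the parameter values are illustrative, not from the paper.

```python
from math import comb

def ldc_exponent(n, l, b):
    """Exponent n / (2^b * sum_{i=0}^{b} C(l, i)) from the lower bound
    m = exp(Omega(.)) for 2-query LDCs over an l-bit alphabet where the
    decoder reads b bits of each answer. Omega-constants omitted."""
    return n / (2 ** b * sum(comb(l, i) for i in range(b + 1)))

# Reading a single bit (b = 1) of l-bit answers:
# sum_{i=0}^{1} C(l, i) = l + 1, so the exponent is n / (2 * (l + 1)).
one_bit = ldc_exponent(100, 4, 1)

# Reading both bits of 2-bit answers (b = l = 2):
# 2^2 * (1 + 2 + 1) = 16, so the exponent is n / 16.
full_read = ldc_exponent(1600, 2, 2)
```

For constant l and b the exponent stays linear in n, which is why the resulting code length remains exponential in these regimes.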
An Improved Interactive Streaming Algorithm for the Distinct Elements Problem
The exact computation of the number of distinct elements (the frequency
moment F_0) is a fundamental problem in the study of data streaming
algorithms. We denote the length of the stream by m, where each symbol is
drawn from a universe of size n. While it is well known that the frequency
moments can be approximated by efficient streaming algorithms, it is easy to
see that exact computation of F_0 requires space Omega(n). In previous work,
Cormode
et al. therefore considered a model where the data stream is also processed by
a powerful helper, who provides an interactive proof of the result. They gave
such protocols with a polylogarithmic number of rounds of communication between
helper and verifier for all functions in NC. This number of rounds
can quickly make such
protocols impractical.
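To see why exact computation is space-hungry without a helper, here is the naive exact F_0 algorithm (an editorial illustration, not a protocol from the paper): the set may end up holding one entry per distinct symbol, i.e., up to n entries.

```python
def exact_f0(stream):
    """Exact number of distinct elements in a stream. The set can grow
    to hold every distinct symbol, so worst-case space is linear in the
    universe size n -- the barrier that the interactive proof protocols
    bypass by offloading work to an untrusted helper."""
    seen = set()
    for symbol in stream:
        seen.add(symbol)
    return len(seen)
```

An interactive protocol keeps the verifier's state polylogarithmic instead, at the price of rounds of communication with the helper, whose claims the verifier checks.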
Cormode et al. also gave a protocol with a logarithmic number of rounds for
the exact computation of F_0, where the space complexity is polylogarithmic
but the total communication is much larger. They managed to give
logarithmic-round protocols with polylogarithmic complexity for many other
interesting problems, including Inner Product and Range-sum, but computing
F_0 exactly with polylogarithmic space and communication in a logarithmic
number of rounds remained open.
In this work, we give a streaming interactive protocol with a logarithmic
number of rounds for the exact computation of F_0, using polylogarithmic
space and communication. The update time of the verifier per symbol received
is also polylogarithmic.
Comment: Submitted to ICALP 201
Optimal Error Rates for Interactive Coding I: Adaptivity and Other Settings
We consider the task of interactive communication in the presence of
adversarial errors and present tight bounds on the tolerable error-rates in a
number of different settings.
Most significantly, we explore adaptive interactive communication where the
communicating parties decide who should speak next based on the history of the
interaction. Braverman and Rao [STOC'11] show that non-adaptively one can code
for any constant error rate below 1/4 but not more. They asked whether this
bound could be improved using adaptivity. We answer this open question in the
affirmative (with a slightly different collection of resources): Our adaptive
coding scheme tolerates any error rate below 2/7 and we show that tolerating a
higher error rate is impossible. We also show that in the setting of Franklin
et al. [CRYPTO'13], where parties share randomness not known to the adversary,
adaptivity increases the tolerable error rate from 1/2 to 2/3. For
list-decodable interactive communications, where each party outputs a constant
size list of possible outcomes, the tight tolerable error rate is 1/2.
Our negative results hold even if the communication and computation are
unbounded, whereas for our positive results communication and computation are
polynomially bounded. Most prior work considered coding schemes with a linear
amount of communication, while allowing unbounded computations. We argue that
studying tolerable error rates in this relaxed context helps to identify a
setting's intrinsic optimal error rate. We set forward a strong working
hypothesis which stipulates that for any setting the maximum tolerable error
rate is independent of many computational and communication complexity
measures. We believe this hypothesis to be a powerful guideline for the design
of simple, natural, and efficient coding schemes and for understanding the
(im)possibilities of coding for interactive communication.