Tight Bounds for Set Disjointness in the Message Passing Model
In a multiparty message-passing model of communication, there are multiple
players. Each player holds a private input, and the players communicate by
sending messages to one another over private channels. While this model has been used
extensively in distributed computing and in multiparty computation, lower
bounds on communication complexity in this model and related models have been
somewhat scarce. In recent work \cite{phillips12,woodruff12,woodruff13}, strong
lower bounds were obtained for several functions in the message-passing model;
however, a lower bound for the classical Set Disjointness problem remained
elusive.
In this paper, we prove tight lower bounds for the Set Disjointness problem
in the message-passing model. Our bounds are
obtained by developing information complexity tools in the message-passing
model, and then proving an information complexity lower bound for Set
Disjointness. As a corollary, we show a tight lower bound for the task
allocation problem \cite{DruckerKuhnOshman} via a reduction from Set
Disjointness.
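For concreteness, the multiparty Set Disjointness problem asks whether the players' input sets share a common element. The following sketch is purely illustrative (the function name and the convention that the output is true when the sets intersect are ours, not the paper's; the complement convention is also common in the literature):

```python
def set_disjointness(sets):
    """Return True iff all players' sets share at least one common element.

    Each element of `sets` is one player's private input, a subset of a
    common universe. In the communication setting the challenge is that
    no single player sees more than their own set.
    """
    common = set(sets[0])
    for s in sets[1:]:
        common &= set(s)  # intersect with the next player's set
    return len(common) > 0

# Three players over the universe {0, ..., 4}; element 3 is common to all.
players = [{0, 1, 3}, {1, 2, 3}, {3, 4}]
```

The communication question is how many bits the players must exchange to compute this value, which the centralized one-liner above entirely sidesteps.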
Tribes Is Hard in the Message Passing Model
We consider the point-to-point message-passing model of communication, in
which there are multiple processors, each holding a private input string.
Each processor is located at a node of an underlying undirected graph
and has access to private random coins. An edge of the graph is a private
channel of communication between its endpoints. The processors have to compute
a given function of all their inputs by communicating along these channels.
While this model has been widely used in distributed computing, strong lower
bounds on the amount of communication needed to compute simple functions have
just begun to appear. In this work, we prove a tight lower bound
on the communication needed for computing the Tribes function,
when the underlying graph is a star whose leaves hold the
inputs and whose center has no input. A lower bound for this topology easily implies
comparable bounds for other topologies. Our lower bounds are obtained by building upon
the recent information-theoretic techniques of Braverman et al. (FOCS'13) and
combining them with the earlier work of Jayram, Kumar and Sivakumar (STOC'03).
This approach yields information complexity bounds that are of independent
interest.
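The Tribes function is the standard AND-of-ORs composition: the inputs are partitioned into disjoint blocks ("tribes"), and the output is 1 exactly when every tribe contains at least one 1. A minimal sketch (the representation of inputs as a list of blocks is our illustrative choice; how coordinates are split among the star's leaves is a modeling detail of the paper):

```python
def tribes(blocks):
    """Tribes: AND over tribes of the OR of each tribe's bits.

    `blocks` is a list of lists of bits (0/1); returns True iff every
    block contains at least one 1.
    """
    return all(any(block) for block in blocks)

# Two tribes of two members each: both tribes have a 1, so Tribes = True.
example = [[0, 1], [1, 0]]
```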
On the Communication Complexity of Secure Computation
Information theoretically secure multi-party computation (MPC) is a central
primitive of modern cryptography. However, relatively little is known about the
communication complexity of this primitive.
In this work, we develop powerful information theoretic tools to prove lower
bounds on the communication complexity of MPC. We restrict ourselves to a
3-party setting in order to bring out the power of these tools without
introducing too many complications. Our techniques include the use of a data
processing inequality for residual information - i.e., the gap between mutual
information and G\'acs-K\"orner common information, a new information
inequality for 3-party protocols, and the idea of distribution switching by
which lower bounds computed under certain worst-case scenarios can be shown to
apply for the general case.
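For a finite joint distribution, the quantities behind residual information are directly computable: Gács-Körner common information equals the entropy of the "common part", the connected-component label of the bipartite support graph of (X, Y). The sketch below is our illustration of that standard characterization, not code from the paper; the function name is ours:

```python
from collections import defaultdict
from math import log2

def residual_information(p):
    """p: dict mapping (x, y) -> probability of a finite joint distribution.

    Returns (I, C, R): mutual information I(X;Y), Gacs-Korner common
    information C, and residual information R = I - C.
    """
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), q in p.items():
        px[x] += q
        py[y] += q
    # Mutual information: I(X;Y) = sum p(x,y) log[ p(x,y) / (p(x) p(y)) ].
    mi = sum(q * log2(q / (px[x] * py[y]))
             for (x, y), q in p.items() if q > 0)
    # Bipartite support graph: x-value and y-value are adjacent iff
    # p(x, y) > 0. The connected component containing (X, Y) is the
    # maximal common random variable; its entropy is the GK common info.
    adj = defaultdict(set)
    for (x, y), q in p.items():
        if q > 0:
            adj[('x', x)].add(('y', y))
            adj[('y', y)].add(('x', x))
    seen, comp_probs = set(), []
    for node in list(adj):
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:                      # DFS over one component
            u = stack.pop()
            if u in seen:
                continue
            seen.add(u)
            comp.add(u)
            stack.extend(adj[u])
        comp_probs.append(sum(px[v] for kind, v in comp if kind == 'x'))
    gk = -sum(q * log2(q) for q in comp_probs if q > 0)
    return mi, gk, mi - gk
```

On a distribution whose support splits into two equally likely blocks, I(X;Y) and the GK common information both equal 1 bit, so the residual information vanishes; on a correlated distribution with connected support, the GK common information is 0 and the residual information equals the full mutual information.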
Using these techniques, we obtain tight bounds on the communication complexity
of MPC protocols for various interesting functions. In particular, we show
concrete functions that have "communication-ideal" protocols, which achieve the
minimum communication simultaneously on all links in the network. Also, we
obtain the first explicit example of a function that incurs a higher
communication cost than the input length in the secure computation model of
Feige, Kilian and Naor (1994), who had shown that such functions exist. We also
show that our communication bounds imply tight lower bounds on the amount of
randomness required by MPC protocols for many interesting functions.