
Lower bounds for the noisy broadcast problem

By Navin Goyal, Guy Kindler and Michael Saks


We prove the first non-trivial (superlinear) lower bound in the noisy broadcast model, defined by El Gamal in [6]. In this model there are n + 1 processors P_0, P_1, ..., P_n, each of which is initially given a private input bit x_i. The goal is for P_0 to learn the value of f(x_1, ..., x_n), for some specified function f, using a series of noisy broadcasts. At each step a designated processor broadcasts one bit to all of the other processors, and the bit received by each processor is flipped with fixed probability (independently for each recipient). In 1988, Gallager [16] gave a noise-resistant protocol that allows P_0 to learn the entire input with constant probability in O(n log log n) broadcasts. We prove that Gallager's protocol is optimal, up to a constant factor. Our lower bound follows by reduction from a lower bound for generalized noisy decision trees, a new model which may be of independent interest. For this new model we show a lower bound of Ω(n log n) on the depth of a tree that learns the entire input. […]
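The communication model described above is easy to simulate. The sketch below (not from the paper; the function names and parameters are illustrative) implements a single noisy broadcast step, plus the trivial repetition protocol in which each P_i broadcasts its bit O(log n) times and P_0 decodes by majority vote. This naive protocol uses O(n log n) broadcasts, which is exactly the baseline that Gallager's O(n log log n) protocol improves on.

```python
import random

def noisy_broadcast(bit, num_recipients, eps):
    """One step of the model: each recipient independently receives
    the broadcast bit flipped with probability eps."""
    return [bit ^ (random.random() < eps) for _ in range(num_recipients)]

def naive_protocol(x, eps, reps):
    """Trivial repetition protocol (NOT Gallager's): each P_i broadcasts
    its input bit `reps` times, and P_0 takes a majority vote per bit.
    Total broadcasts: n * reps; reps = O(log n) suffices for P_0 to
    learn the whole input with constant probability."""
    decoded = []
    for xi in x:
        # P_0 keeps the single copy it receives of each broadcast.
        received = [noisy_broadcast(xi, 1, eps)[0] for _ in range(reps)]
        decoded.append(1 if 2 * sum(received) > reps else 0)
    return decoded
```

With noise rate eps = 0.1 and a few dozen repetitions per bit, the majority vote recovers each input bit except with exponentially small probability, so `naive_protocol(x, 0.1, 101)` returns `x` almost surely; the paper's lower bound says no protocol can beat this by more than a log n / log log n factor.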

Year: 2006
OAI identifier: oai:CiteSeerX.psu:
Provided by: CiteSeerX
