We study a problem related to coin flipping, coding theory, and noise
sensitivity. Consider a source of truly random bits x \in \bits^n, and k
parties, who have noisy versions of the source bits y^i \in \bits^n, where
for all i and j, it holds that \Pr[y^i_j = x_j] = 1 - \eps, independently
for all i and j. That is, each party sees each bit correctly with
probability 1 - \eps, and incorrectly (flipped) with probability
\eps, independently for all bits and all parties. The parties, who cannot
communicate, wish to agree beforehand on {\em balanced} functions f_i :
\bits^n \to \bits such that \Pr[f_1(y^1) = \cdots = f_k(y^k)] is maximized. In
other words, each party wants to toss a fair coin so that the probability that
all parties have the same coin is maximized. The functions f_i may be thought
of as an error correcting procedure for the source x.
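This setup is easy to explore empirically. The following is a minimal Monte Carlo sketch (our own illustration; the function names, parameters, and the example protocols below are not taken from the paper):

```python
import random

def estimate_agreement(n, k, eps, f, trials=20000, seed=0):
    """Monte Carlo estimate of Pr[f(y^1) = ... = f(y^k)] when each of the
    k parties sees each source bit flipped independently with probability eps."""
    rng = random.Random(seed)
    agree = 0
    for _ in range(trials):
        x = [rng.randint(0, 1) for _ in range(n)]  # truly random source bits
        # each party applies f to its own independently noised copy of x
        outputs = {f([b ^ (rng.random() < eps) for b in x]) for _ in range(k)}
        agree += (len(outputs) == 1)  # all k parties produced the same coin
    return agree / trials

# Two balanced functions, chosen here purely for illustration:
first_bit = lambda y: y[0]                     # f_i(y^i) = y^i_1
majority = lambda y: int(2 * sum(y) > len(y))  # balanced when n is odd
```

For example, `estimate_agreement(5, 3, 0.1, first_bit)` estimates the agreement probability of the protocol in which every party outputs its first bit.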
When k = 2, 3, no error correction is possible, as the optimal protocol is
given by f_i(y^i) = y^i_1. On the other hand, for large values of k, better
protocols exist. We study general properties of the optimal protocols and the
asymptotic behavior of the problem with respect to k, n and \eps. Our
analysis uses tools from probability, discrete Fourier analysis, convexity and
discrete symmetrization.
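For intuition on the baseline being beaten, note that under the first-bit protocol f_i(y^i) = y^i_1, the parties agree exactly when every party sees bit 1 correctly or every party sees it flipped, so
\[
  \Pr[y^1_1 = \cdots = y^k_1] = (1-\eps)^k + \eps^k,
\]
which decays exponentially in k for any fixed 0 < \eps < 1/2.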