A basic problem in information theory is the following: let P = (X, Y) be an arbitrary joint distribution whose coordinates X and Y are (potentially) correlated. Let Alice and Bob be two players, where Alice gets samples {x_i}_{i≥1}, Bob gets samples {y_i}_{i≥1}, and (x_i, y_i) ∼ P for all i. Which joint distributions Q can Alice and Bob simulate without any interaction?
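To make the setup concrete, here is a minimal sketch (not from the paper) of non-interactive simulation for a doubly symmetric binary source: each player sees only their own half of the correlated sample stream and applies a local function to it, with no communication. The source parameters and the parity strategies below are illustrative assumptions, not the paper's construction.

```python
import random


def sample_dsbs(n, rho=0.9, seed=0):
    """Sample n pairs from a doubly symmetric binary source:
    each marginal is an unbiased bit, and the two bits agree
    with probability (1 + rho) / 2 (correlation rho)."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        x = rng.randint(0, 1)
        y = x if rng.random() < (1 + rho) / 2 else 1 - x
        pairs.append((x, y))
    return pairs


def simulate(pairs, f, g):
    """Non-interactive simulation: Alice applies f to her sample
    stream and Bob applies g to his; neither sees the other's data."""
    xs = [x for x, _ in pairs]
    ys = [y for _, y in pairs]
    return f(xs), g(ys)


# Hypothetical local strategies: each player outputs the parity of
# their first two samples; (alice_out, bob_out) is one draw from the
# simulated joint distribution Q.
alice_out, bob_out = simulate(
    sample_dsbs(1000),
    lambda xs: xs[0] ^ xs[1],
    lambda ys: ys[0] ^ ys[1],
)
```

The decidability question asks whether, given P and a target Q, one can algorithmically determine if such local maps (over long enough sample streams) can drive the output distribution arbitrarily close to Q.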
Classical works in information theory by Gács-Körner and Wyner answer
this question when at least one of P or Q is the
distribution on {0,1}×{0,1} where each marginal is unbiased and
the two marginals are identical. Beyond this special case, however, the answer
is understood in very few settings. Recently, Ghazi, Kamath and Sudan showed that
this problem is decidable for Q supported on {0,1}×{0,1}. We extend their result to Q supported on any finite
alphabet.
We rely on recent results in Gaussian geometry (by the authors) as well as a
new \emph{smoothing argument} inspired by the method of \emph{boosting} from
learning theory and potential function arguments from complexity theory and
additive combinatorics.

Comment: The reduction from non-interactive simulation for a general source distribution to the Gaussian case was incorrect in the previous version. It has been rectified now.