
Correlation in Hard Distributions in Communication Complexity

Abstract

We study the effect that the amount of correlation in a bipartite distribution has on the communication complexity of a problem under that distribution. We introduce a new family of complexity measures that interpolates between the two previously studied extreme cases: the (standard) randomised communication complexity and the distributional complexity under product distributions. We give a tight characterisation of the randomised complexity of Disjointness under distributions with mutual information $k$, showing that it is $\Theta(\sqrt{n(k+1)})$ for all $0\leq k\leq n$. This smoothly interpolates between the lower bound of Babai, Frankl and Simon for the product distribution case ($k=0$) and the bound of Razborov for the randomised case. The upper bounds improve and generalise what was known for product distributions, and imply that any tight bound for Disjointness needs $\Omega(n)$ bits of mutual information in the corresponding distribution. We study the same question in the distributional quantum setting, and show a lower bound of $\Omega((n(k+1))^{1/4})$ and an upper bound matching up to a logarithmic factor. We show that there are total Boolean functions $f_d$ on $2n$ inputs that have distributional communication complexity $O(\log n)$ under all distributions of information up to $o(n)$, while the (interactive) distributional complexity maximised over all distributions is $\Theta(\log d)$ for $6n\leq d\leq 2^{n/100}$. We show that in the setting of one-way communication under product distributions, the dependence of the communication cost on the allowed error $\epsilon$ is multiplicative in $\log(1/\epsilon)$; the previous upper bounds had a dependence of more than $1/\epsilon$.
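To make the main characterisation concrete, the following is a sketch of how it can be stated formally. The abstract itself fixes no notation, so the symbols here are assumptions: $\mathrm{DISJ}_n$ for Disjointness on $n$-bit inputs, $D^{\mu}_{1/3}(f)$ for the distributional complexity of $f$ with error $1/3$ under the input distribution $\mu$, and $I_{\mu}(X;Y)$ for the mutual information between the two players' inputs under $\mu$, reading "distributions with mutual information $k$" as a maximum over distributions with information at most $k$.

% Sketch only: the notation D^mu_{1/3} and I_mu(X;Y) is assumed,
% not fixed by the abstract above.
\[
  \mathrm{DISJ}_n(x,y) \;=\; \bigwedge_{i=1}^{n} \neg(x_i \wedge y_i),
  \qquad x,y \in \{0,1\}^n,
\]
\[
  \max_{\mu \,:\, I_{\mu}(X;Y)\,\le\, k} D^{\mu}_{1/3}(\mathrm{DISJ}_n)
  \;=\; \Theta\!\left(\sqrt{n(k+1)}\right)
  \qquad \text{for all } 0 \le k \le n.
\]

Under this reading, the two endpoints match the interpolation claim: at $k=0$ the bound is $\Theta(\sqrt{n})$, the product-distribution case of Babai, Frankl and Simon, and at $k=\Theta(n)$ it becomes $\Theta(n)$, Razborov's bound for the fully randomised case.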
