
    A concentration inequality for the overlap of a vector on a large set, with application to the communication complexity of the Gap-Hamming-Distance problem

    Given two sets A, B ⊆ R^n, a measure of their correlation is given by the expected squared inner product between a random x ∈ A and a random y ∈ B. We prove an inequality showing that no two sets of large enough Gaussian measure (at least e^(-δn) for some constant δ > 0) can have correlation substantially lower than would two random sets of the same size. Our proof is based on a concentration inequality for the overlap of a random Gaussian vector on a large set. As an application, we show how our result can be combined with the partition bound of Jain and Klauck to give a simpler proof of a recent linear lower bound on the randomized communication complexity of the Gap-Hamming-Distance problem due to Chakrabarti and Regev.
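
    The correlation measure here can be made concrete with a small Monte Carlo sketch. The snippet below is illustrative only (the halfspace sets, function names, and trial count are my own choices, not from the paper): it estimates E[⟨x, y⟩²] for independent standard Gaussian vectors conditioned to lie in sets A and B; for A = B = R^n the expectation is exactly n.

    ```python
    import random

    def gaussian_vector(n):
        """A standard Gaussian vector in R^n."""
        return [random.gauss(0.0, 1.0) for _ in range(n)]

    def sample_conditioned(n, in_set):
        """Rejection-sample a standard Gaussian vector conditioned on a set."""
        while True:
            x = gaussian_vector(n)
            if in_set(x):
                return x

    def expected_sq_inner_product(n, in_A, in_B, trials=5000):
        """Monte Carlo estimate of E[<x, y>^2] for independent x ~ N(0, I_n)|A
        and y ~ N(0, I_n)|B, the correlation measure described above."""
        total = 0.0
        for _ in range(trials):
            x = sample_conditioned(n, in_A)
            y = sample_conditioned(n, in_B)
            total += sum(a * b for a, b in zip(x, y)) ** 2
        return total / trials

    # Halfspaces of constant Gaussian measure; for A = B = R^n the answer is n.
    half = lambda x: x[0] >= 0.5
    print(expected_sq_inner_product(10, half, half))
    ```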

    An Optimal Lower Bound on the Communication Complexity of Gap-Hamming-Distance

    We prove an optimal Ω(n) lower bound on the randomized communication complexity of the much-studied Gap-Hamming-Distance problem. As a consequence, we obtain essentially optimal multi-pass space lower bounds in the data stream model for a number of fundamental problems, including the estimation of frequency moments. The Gap-Hamming-Distance problem is a communication problem wherein Alice and Bob receive n-bit strings x and y, respectively. They are promised that the Hamming distance between x and y is either at least n/2 + √n or at most n/2 - √n, and their goal is to decide which of these is the case. Since the formal presentation of the problem by Indyk and Woodruff (FOCS, 2003), it had been conjectured that the naive protocol, which uses n bits of communication, is asymptotically optimal. The conjecture was shown to be true in several special cases, e.g., when the communication is deterministic, or when the number of rounds of communication is limited. The proof of our aforementioned result, which settles this conjecture fully, is based on a new geometric statement regarding correlations in Gaussian space, related to a result of C. Borell (1985). To prove this geometric statement, we show that random projections of not-too-small sets in Gaussian space are close to a mixture of translated normal variables.
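
    To make the promise and the naive protocol concrete, here is a minimal Python sketch (the instance sampler and function names are illustrative, not from the paper). Alice sends all n bits of x and Bob decides locally, so the cost is n bits, which the result above shows is asymptotically optimal.

    ```python
    import random

    def hamming(x, y):
        """Hamming distance between two equal-length bit strings."""
        return sum(a != b for a, b in zip(x, y))

    def ghd_instance(n):
        """Sample an instance satisfying the promise: the distance is
        either >= n/2 + sqrt(n) or <= n/2 - sqrt(n)."""
        while True:
            x = [random.randint(0, 1) for _ in range(n)]
            y = [random.randint(0, 1) for _ in range(n)]
            if abs(hamming(x, y) - n / 2) >= n ** 0.5:
                return x, y

    def naive_protocol(x, y):
        """Alice sends x in full (n bits); Bob computes the distance and
        outputs 1 for 'far', 0 for 'close'."""
        n = len(x)
        return 1 if hamming(x, y) >= n / 2 + n ** 0.5 else 0

    x, y = ghd_instance(64)
    print(naive_protocol(x, y))
    ```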

    Efficient quantum protocols for XOR functions

    We show that for any Boolean function f on {0,1}^n, the bounded-error quantum communication complexity of the XOR function f ∘ ⊕ satisfies Q_ε(f ∘ ⊕) = O(2^d (log ‖f̂‖_{1,ε} + log(n/ε)) log(1/ε)), where d is the F2-degree of f and ‖f̂‖_{1,ε} = min_{g: ‖f−g‖_∞ ≤ ε} ‖ĝ‖_1. This implies that the previous lower bound Q_ε(f ∘ ⊕) = Ω(log ‖f̂‖_{1,ε}) by Lee and Shraibman [LS09] is tight for f with low F2-degree. The result also confirms the quantum version of the Log-rank Conjecture for low-degree XOR functions. In addition, we show that the exact quantum communication complexity satisfies Q_E(f ∘ ⊕) = O(2^d log ‖f̂‖_0), where ‖f̂‖_0 is the number of nonzero Fourier coefficients of f. This matches the previous lower bound Q_E(f(x,y)) = Ω(log rank(M_f)) by Buhrman and de Wolf [BdW01] for low-degree XOR functions.
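
    The quantities in these bounds (the F2-degree d, the Fourier ℓ1-norm ‖f̂‖_1, of which ‖f̂‖_{1,ε} is an ε-approximate version, and the Fourier sparsity ‖f̂‖_0) can be computed by brute force for small n. The sketch below is only such a brute-force illustration; the helper names and the choice of 3-bit majority as the example are mine.

    ```python
    from itertools import product

    def fourier_coefficients(f, n):
        """Fourier coefficients of (-1)^f on {0,1}^n:
        hat_f(S) = 2^-n * sum_x (-1)^(f(x) + <S, x>)."""
        coeffs = {}
        for S in product((0, 1), repeat=n):
            total = 0
            for x in product((0, 1), repeat=n):
                parity = (f(x) + sum(s * xi for s, xi in zip(S, x))) % 2
                total += -1 if parity else 1
            coeffs[S] = total / 2 ** n
        return coeffs

    def f2_degree(f, n):
        """F2-degree: the largest |S| with a nonzero coefficient in the
        algebraic normal form of f (computed via the Moebius transform)."""
        anf = {x: f(x) for x in product((0, 1), repeat=n)}
        for i in range(n):
            for x in anf:
                if x[i] == 1:
                    anf[x] ^= anf[x[:i] + (0,) + x[i + 1:]]
        return max((sum(x) for x, c in anf.items() if c), default=0)

    # Example: 3-bit majority.
    def maj(x):
        return int(sum(x) >= 2)

    coeffs = fourier_coefficients(maj, 3)
    l1 = sum(abs(c) for c in coeffs.values())               # ||hat f||_1
    l0 = sum(1 for c in coeffs.values() if abs(c) > 1e-9)   # ||hat f||_0
    print(l1, l0, f2_degree(maj, 3))                        # 2.0 4 2
    ```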

    Quantum states cannot be transmitted efficiently classically

    We show that any classical two-way communication protocol with shared randomness that can approximately simulate the result of applying an arbitrary measurement (held by one party) to a quantum state of n qubits (held by another), up to constant accuracy, must transmit at least Ω(2^n) bits. This lower bound is optimal and matches the complexity of a simple protocol based on discretisation using an ε-net. The proof is based on a lower bound on the classical communication complexity of a distributed variant of the Fourier sampling problem. We obtain two optimal quantum-classical separations as easy corollaries. First, a sampling problem which can be solved with one quantum query to the input, but which requires Ω(N) classical queries for an input of size N. Second, a nonlocal task which can be solved using n Bell pairs, but for which any approximate classical solution must communicate Ω(2^n) bits.

    Information Complexity versus Corruption and Applications to Orthogonality and Gap-Hamming

    Three decades of research in communication complexity have led to the invention of a number of techniques to lower bound randomized communication complexity. The majority of these techniques involve properties of large submatrices (rectangles) of the truth-table matrix defining a communication problem. The only technique that does not quite fit is information complexity, which has been investigated over the last decade. Here, we connect information complexity to one of the most powerful "rectangular" techniques: the recently-introduced smooth corruption (or "smooth rectangle") bound. We show that the former subsumes the latter under rectangular input distributions. We conjecture that this subsumption holds more generally, under arbitrary distributions, which would resolve the long-standing direct sum question for randomized communication. As an application, we obtain an optimal Ω(n) lower bound on the information complexity, under the uniform distribution, of the so-called orthogonality problem (ORT), which is in turn closely related to the much-studied Gap-Hamming-Distance problem (GHD). The proof of this bound is along the lines of recent communication lower bounds for GHD, but we encounter a surprising amount of additional technical detail.

    Communication Complexity of Statistical Distance

    We prove nearly matching upper and lower bounds on the randomized communication complexity of the following problem: Alice and Bob are each given a probability distribution over n elements, and they wish to estimate within ±ε the statistical (total variation) distance between their distributions. For some range of parameters, there is up to a log(n) factor gap between the upper and lower bounds, and we identify a barrier to using information complexity techniques to improve the lower bound in this case. We also prove a side result that we discovered along the way: the randomized communication complexity of n-bit Majority composed with n-bit Greater-Than is Θ(n log n).
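
    For reference, the quantity being estimated is the total variation distance, i.e. half the ℓ1-distance between the two distributions. The snippet below computes it centrally given both distributions; it is not the communication protocol itself, and the names are illustrative.

    ```python
    def total_variation(p, q):
        """Statistical (total variation) distance between two probability
        distributions given as dicts mapping element -> probability."""
        support = set(p) | set(q)
        return 0.5 * sum(abs(p.get(e, 0.0) - q.get(e, 0.0)) for e in support)

    # Example on a 4-element universe.
    p = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}
    q = {0: 0.1, 1: 0.3, 2: 0.3, 3: 0.3}
    print(total_variation(p, q))  # 0.3
    ```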

    Query-to-Communication Lifting for BPP

    For any n-bit Boolean function f, we show that the randomized communication complexity of the composed function f ∘ g^n, where g is an index gadget, is characterized by the randomized decision tree complexity of f. In particular, this means that many query complexity separations involving randomized models (e.g., classical vs. quantum) automatically imply analogous separations in communication complexity.
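
    Concretely, the composed function f ∘ g^n pairs each of the n inputs of f with its own copy of the index gadget: Alice holds n pointers, Bob holds n blocks, and the gadget returns the bit of Bob's block selected by Alice's pointer. The sketch below uses a small fixed block length for illustration; the gadget in the paper uses a much larger block length (polynomial in n), and the names here are mine.

    ```python
    def index_gadget(pointer, block):
        """g(pointer, block) = block[pointer]: Alice holds the pointer,
        Bob holds the block of bits."""
        return block[pointer]

    def composed(f, pointers, blocks):
        """The lifted function f o g^n: apply the gadget coordinate-wise,
        then evaluate f on the n resulting bits."""
        return f(tuple(index_gadget(p, b) for p, b in zip(pointers, blocks)))

    # Example: lift 3-bit XOR with blocks of length 4 per coordinate.
    def xor3(z):
        return z[0] ^ z[1] ^ z[2]

    pointers = (2, 0, 3)                                   # Alice's input
    blocks = ((0, 1, 1, 0), (1, 0, 0, 1), (0, 0, 1, 1))    # Bob's input
    print(composed(xor3, pointers, blocks))                # 1 ^ 1 ^ 1 = 1
    ```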

    One-Sided Error Communication Complexity of Gap Hamming Distance

    Assume that Alice has a binary string x and Bob a binary string y, both of length n. Their goal is to output 0 if x and y are close in Hamming distance (at distance at most L), and to output 1 if x and y are far in Hamming distance (at distance at least U), where L < U are integer parameters known to both parties. If the Hamming distance between x and y lies in the interval (L, U), they are allowed to output anything. This problem is called the Gap Hamming Distance problem. In this paper we study its public-coin one-sided error communication complexity: an error with probability at most 1/2 is allowed only for pairs at Hamming distance at least U. We determine this complexity up to factors logarithmic in L. The protocol we construct for the upper bound is simultaneous.
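
    A small sketch of the promise structure and of what a one-sided error protocol must do in each region (the classification function is illustrative; it is not the protocol constructed in the paper):

    ```python
    def ghd_region(x, y, L, U):
        """Classify a Gap Hamming Distance instance with thresholds L < U.
        A one-sided error protocol must output 0 with certainty on 'close'
        pairs, may err with probability at most 1/2 on 'far' pairs, and is
        unconstrained on 'gap' pairs."""
        d = sum(a != b for a, b in zip(x, y))
        if d <= L:
            return "close"   # required answer: 0, no error allowed
        if d >= U:
            return "far"     # required answer: 1, error prob. at most 1/2
        return "gap"         # outside the promise: any answer is accepted

    print(ghd_region([0, 1, 1, 0], [0, 1, 0, 1], L=1, U=3))  # distance 2 -> 'gap'
    ```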