A concentration inequality for the overlap of a vector on a large set, with application to the communication complexity of the Gap-Hamming-Distance problem
Given two sets A, B ⊆ R^n, a measure of their correlation is given by the expected squared inner product between random x ∈ A and y ∈ B. We prove an inequality showing that no two sets of large enough Gaussian measure (at least e^(-δn) for some constant δ > 0) can have correlation substantially lower than would two random sets of the same size. Our proof is based on a concentration inequality for the overlap of a random Gaussian vector on a large set.
As an application, we show how our result can be combined with the partition bound of Jain and Klauck to give a simpler proof of a recent linear lower bound on the randomized communication complexity of the Gap-Hamming-Distance problem due to Chakrabarti and Regev.
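The correlation measure in the abstract above (the expected squared inner product between independent random points of two sets) can be estimated by rejection sampling in Gaussian space. A minimal Monte Carlo sketch; all names and the example halfspaces are illustrative, not from the paper:

```python
import random

def gaussian_vector(n, rng):
    """Sample a standard Gaussian vector in R^n."""
    return [rng.gauss(0.0, 1.0) for _ in range(n)]

def sample_in(predicate, n, rng):
    """Rejection-sample a Gaussian vector conditioned on lying in the set."""
    while True:
        v = gaussian_vector(n, rng)
        if predicate(v):
            return v

def correlation(pred_a, pred_b, n, trials, seed=0):
    """Estimate E[<x, y>^2] for independent Gaussian x in A and y in B."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x = sample_in(pred_a, n, rng)
        y = sample_in(pred_b, n, rng)
        ip = sum(a * b for a, b in zip(x, y))
        total += ip * ip
    return total / trials

# Two halfspaces, each of constant Gaussian measure 1/2.
n = 20
A = lambda v: v[0] >= 0.0
B = lambda v: v[0] + v[1] >= 0.0
est = correlation(A, B, n, trials=2000)
# For two unconditioned Gaussians, E[<x, y>^2] = n; the inequality says that
# conditioning on large sets cannot push this value far below that baseline.
print(est)
```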
An Optimal Lower Bound on the Communication Complexity of Gap-Hamming-Distance
We prove an optimal lower bound on the randomized communication
complexity of the much-studied Gap-Hamming-Distance problem. As a consequence,
we obtain essentially optimal multi-pass space lower bounds in the data stream
model for a number of fundamental problems, including the estimation of
frequency moments.
The Gap-Hamming-Distance problem is a communication problem, wherein Alice
and Bob receive n-bit strings x and y, respectively. They are promised
that the Hamming distance between x and y is either at least n/2 + √n
or at most n/2 - √n, and their goal is to decide which of these is the
case. Since the formal presentation of the problem by Indyk and Woodruff (FOCS,
2003), it had been conjectured that the naive protocol, which uses n bits of
communication, is asymptotically optimal. The conjecture was shown to be true
in several special cases, e.g., when the communication is deterministic, or
when the number of rounds of communication is limited.
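The promise in this problem is easy to state in code. A minimal sketch of an instance checker and the naive protocol, using the standard gap parameters n/2 ± √n; the names are illustrative:

```python
def hamming(x, y):
    """Hamming distance between two equal-length bit strings."""
    return sum(a != b for a, b in zip(x, y))

def ghd(x, y):
    """Decide a promise instance of Gap-Hamming-Distance:
    1 if dist(x, y) >= n/2 + sqrt(n), 0 if dist(x, y) <= n/2 - sqrt(n)."""
    n = len(x)
    d = hamming(x, y)
    if d >= n / 2 + n ** 0.5:
        return 1
    if d <= n / 2 - n ** 0.5:
        return 0
    raise ValueError("input violates the gap promise")

# The naive protocol: Alice sends all n of her bits and Bob answers locally,
# for n bits of communication in total.
n = 64                       # sqrt(n) = 8, so the promise gap is 24 vs. 40
x = "0" * n
far = "1" * 40 + "0" * 24    # Hamming distance 40 = n/2 + sqrt(n)
close = "1" * 24 + "0" * 40  # Hamming distance 24 = n/2 - sqrt(n)
print(ghd(x, far), ghd(x, close))  # prints "1 0"
```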
The proof of our aforementioned result, which settles this conjecture fully,
is based on a new geometric statement regarding correlations in Gaussian space,
related to a result of C. Borell (1985). To prove this geometric statement, we
show that random projections of not-too-small sets in Gaussian space are close
to a mixture of translated normal variables.
Efficient quantum protocols for XOR functions
We show that for any Boolean function f on {0,1}^n, the bounded-error quantum
communication complexity of the corresponding XOR function satisfies
Q(f∘⊕) = O(2^d log(‖f̂‖_{1,ε}/ε)), where d is the F2-degree of f, and
‖f̂‖_{1,ε} = min_{g: ‖f-g‖_∞ ≤ ε} ‖ĝ‖_1 is the approximate Fourier ℓ1-norm of f.
This implies that the previous lower bound by Lee and Shraibman \cite{LS09} is tight
for f with low F2-degree. The result also confirms the quantum version of the
Log-rank Conjecture for low-degree XOR functions. In addition, we show that the
exact quantum communication complexity satisfies Q_E(f∘⊕) = O(2^d log ‖f̂‖_0),
where ‖f̂‖_0 is the number of nonzero Fourier coefficients of f. This matches the previous lower bound
by Buhrman and de Wolf \cite{BdW01} for low-degree XOR functions.
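Both quantities in these bounds can be computed by brute force for small functions: the F2-degree from the algebraic normal form (obtained by XORing f over subcubes), and the Fourier support size directly from the definition. A sketch, exponential time and for illustration only:

```python
def anf_degree(f, n):
    """F2-degree of f: the maximum weight of a monomial appearing in the
    algebraic normal form. The coefficient of the monomial indexed by the
    bitmask s is the XOR of f over all subsets x of s."""
    deg = 0
    for s in range(1 << n):
        coeff, x = 0, s
        while True:  # enumerate all submasks x of s
            coeff ^= f(x)
            if x == 0:
                break
            x = (x - 1) & s
        if coeff:
            deg = max(deg, bin(s).count("1"))
    return deg

def fourier_support_size(f, n):
    """Number of nonzero Fourier coefficients of (-1)^f on {0,1}^n."""
    count = 0
    for s in range(1 << n):
        total = sum((-1) ** (f(x) ^ bin(s & x).count("1") % 2)
                    for x in range(1 << n))
        if total != 0:
            count += 1
    return count

# AND of 2 bits: F2-degree 2, full Fourier support (all 4 coefficients).
f_and = lambda x: (x & 1) & ((x >> 1) & 1)
print(anf_degree(f_and, 2), fourier_support_size(f_and, 2))  # prints "2 4"
```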
Quantum states cannot be transmitted efficiently classically
We show that any classical two-way communication protocol with shared
randomness that can approximately simulate the result of applying an arbitrary
measurement (held by one party) to a quantum state of n qubits (held by
another), up to constant accuracy, must transmit Ω(2^n) bits.
This lower bound is optimal and matches the complexity of a simple protocol
based on discretisation using an ε-net. The proof is based on a lower
bound on the classical communication complexity of a distributed variant of the
Fourier sampling problem. We obtain two optimal quantum-classical separations
as easy corollaries. First, a sampling problem which can be solved with one
quantum query to the input, but which requires Ω(N/log N) classical queries
for an input of size N. Second, a nonlocal task which can be solved using
n Bell pairs, but for which any approximate classical solution must communicate
Ω(2^n) bits.
Information Complexity versus Corruption and Applications to Orthogonality and Gap-Hamming
Three decades of research in communication complexity have led to the
invention of a number of techniques to lower bound randomized communication
complexity. The majority of these techniques involve properties of large
submatrices (rectangles) of the truth-table matrix defining a communication
problem. The only technique that does not quite fit is information complexity,
which has been investigated over the last decade. Here, we connect information
complexity to one of the most powerful "rectangular" techniques: the
recently-introduced smooth corruption (or "smooth rectangle") bound. We show
that the former subsumes the latter under rectangular input distributions. We
conjecture that this subsumption holds more generally, under arbitrary
distributions, which would resolve the long-standing direct sum question for
randomized communication. As an application, we obtain an optimal
lower bound on the information complexity---under the {\em uniform
distribution}---of the so-called orthogonality problem (ORT), which is in turn
closely related to the much-studied Gap-Hamming-Distance (GHD). The proof of
this bound is along the lines of recent communication lower bounds for GHD, but
we encounter a surprising amount of additional technical detail.
Communication Complexity of Statistical Distance
We prove nearly matching upper and lower bounds on the randomized communication complexity of the following problem: Alice and Bob are each given a probability distribution over n elements, and they wish to estimate the statistical (total variation) distance between their distributions to within +-epsilon. For some range of parameters there is a gap of up to a log(n) factor between the upper and lower bounds, and we identify a barrier to using information complexity techniques to improve the lower bound in this case. We also prove a side result that we discovered along the way: the randomized communication complexity of n-bit Majority composed with n-bit Greater-Than is Theta(n log n).
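The quantity the two players estimate is the total variation distance. A minimal sketch of the definition (the communication protocol itself is not reproduced here; names are illustrative):

```python
def total_variation(p, q):
    """Statistical (total variation) distance between two distributions,
    each given as a dict mapping elements to probabilities:
    TV(p, q) = (1/2) * sum_e |p(e) - q(e)|."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(e, 0.0) - q.get(e, 0.0)) for e in support)

p = {"a": 0.5, "b": 0.5}
q = {"a": 0.8, "c": 0.2}
print(total_variation(p, q))  # (1/2) * (0.3 + 0.5 + 0.2), i.e. about 0.5
```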
Query-to-Communication Lifting for BPP
For any n-bit boolean function f, we show that the randomized
communication complexity of the composed function f∘g^n, where g is an
index gadget, is characterized by the randomized decision tree complexity of
f. In particular, this means that many query complexity separations involving
randomized models (e.g., classical vs. quantum) automatically imply analogous
separations in communication complexity.
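The composed function applies the gadget coordinatewise, then f to the resulting bits. A sketch with a toy gadget size; the theorem requires the index gadget's block length to grow polynomially with n, and all names below are illustrative:

```python
def index_gadget(pointer, block):
    """IND_m: Alice's pointer selects one bit of Bob's m-bit block."""
    return block[pointer]

def compose(f, pointers, blocks):
    """Evaluate (f o g^n)(x, y): apply the index gadget in each of the n
    coordinates, then apply f to the resulting n-bit string."""
    bits = tuple(index_gadget(p, b) for p, b in zip(pointers, blocks))
    return f(bits)

# f = XOR of 2 bits; gadget blocks of size m = 4 (toy size).
f = lambda bits: bits[0] ^ bits[1]
pointers = (2, 0)                       # Alice's input: one pointer per block
blocks = ((0, 1, 1, 0), (1, 0, 0, 1))   # Bob's input: one block per coordinate
print(compose(f, pointers, blocks))     # gadgets output (1, 1), so f gives 0
```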
One-Sided Error Communication Complexity of Gap Hamming Distance
Assume that Alice has a binary string x and Bob a binary string y, both strings of length n. Their goal is to output 0 if x and y are at Hamming distance at most L, and to output 1 if x and y are at Hamming distance at least U, where L < U are integer parameters known to both parties. If the Hamming distance between x and y lies in the interval (L, U), they are allowed to output anything. This problem is called the Gap Hamming Distance problem. In this paper we study the public-coin one-sided error communication complexity of this problem; an error, with probability at most 1/2, is allowed only for pairs at Hamming distance at least U. We determine this complexity up to factors logarithmic in L. The protocol we construct for the upper bound is simultaneous.
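For intuition only, estimating the distance by comparing bits at shared random coordinates already gives a weak (two-sided error) protocol for the gap problem. This sketch is not the simultaneous one-sided error protocol constructed in the paper, and all names are illustrative:

```python
import random

def sampled_ghd(x, y, L, U, samples, seed):
    """Shared-randomness sketch: both parties derive the same random
    coordinates from the public coins, Alice sends her sampled bits,
    and Bob compares them against his own, scaling up the mismatch count.
    Communication is `samples` bits; the error here is two-sided."""
    n = len(x)
    rng = random.Random(seed)  # stands in for the public coins
    idx = [rng.randrange(n) for _ in range(samples)]
    mismatches = sum(1 for i in idx if x[i] != y[i])
    estimate = mismatches * n / samples
    # Output 1 ("far") when the estimated distance is past the midpoint.
    return 1 if estimate > (L + U) / 2 else 0

n, L, U = 1000, 100, 400
x = [0] * n
far = [1] * 500 + [0] * 500    # Hamming distance 500 >= U
close = [1] * 50 + [0] * 950   # Hamming distance 50 <= L
print(sampled_ghd(x, far, L, U, samples=200, seed=7),
      sampled_ghd(x, close, L, U, samples=200, seed=7))
```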