
    Some Lower Bounds in Dynamic Networks with Oblivious Adversaries

    This paper considers several closely related problems in synchronous dynamic networks with oblivious adversaries, and proves novel Omega(d + poly(m)) lower bounds on their time complexity (in rounds). Here d is the dynamic diameter of the dynamic network and m is the total number of nodes. Before this work, the only known lower bounds on these problems under oblivious adversaries were the trivial Omega(d) lower bounds. Our novel lower bounds are hence the first non-trivial lower bounds and also the first lower bounds with a poly(m) term. Our proof relies on a novel reduction from a certain two-party communication complexity problem. Our central proof technique is unique in the sense that we consider communication complexity with a special leaker. The leaker helps Alice and Bob in the two-party problem by disclosing to Alice and Bob certain "non-critical" information about the problem instance they are solving.
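    To make the two-party setting concrete, the sketch below models a toy communication problem (Set-Disjointness) together with a hypothetical "leaker" oracle that reveals some coordinates of both inputs for free. The oracle, the protocol, and all names here are illustrative assumptions, not the construction or the leaker definition used in the paper.

```python
# Toy two-party model (illustrative only): Alice holds x, Bob holds y,
# and a "leaker" reveals some coordinates of Alice's input to both parties
# for free before communication starts.
import random

def trivial_disjointness_protocol(x, y, leaked_coords):
    """Return (are_disjoint, bits_communicated) for the trivial protocol in
    which Alice sends every non-leaked bit and Bob replies with the answer."""
    bits = 0
    known_to_bob = {}
    for i, xi in enumerate(x):
        if i in leaked_coords:
            known_to_bob[i] = xi   # leaked for free, not counted
        else:
            known_to_bob[i] = xi   # Alice transmits this bit
            bits += 1
    intersect = any(known_to_bob[i] and y[i] for i in range(len(y)))
    bits += 1                      # Bob sends the one-bit answer
    return (not intersect), bits

# Leaking half of the coordinates roughly halves the cost of this trivial
# protocol -- the kind of saving a lower-bound argument must account for.
x = [random.randint(0, 1) for _ in range(8)]
y = [random.randint(0, 1) for _ in range(8)]
print(trivial_disjointness_protocol(x, y, leaked_coords={0, 1, 2, 3}))
```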

    Communication Complexity of Permutation-Invariant Functions

    Motivated by the quest for a broader understanding of communication complexity of simple functions, we introduce the class of "permutation-invariant" functions. A partial function $f:\{0,1\}^n \times \{0,1\}^n \to \{0,1,?\}$ is permutation-invariant if for every bijection $\pi:\{1,\ldots,n\} \to \{1,\ldots,n\}$ and every $\mathbf{x}, \mathbf{y} \in \{0,1\}^n$, it is the case that $f(\mathbf{x}, \mathbf{y}) = f(\mathbf{x}^{\pi}, \mathbf{y}^{\pi})$. Most of the commonly studied functions in communication complexity are permutation-invariant. For such functions, we present a simple complexity measure (computable in time polynomial in $n$ given an implicit description of $f$) that describes their communication complexity up to polynomial factors and up to an additive error that is logarithmic in the input size. This gives a coarse taxonomy of the communication complexity of simple functions. Our work highlights the role of the well-known lower bounds of functions such as 'Set-Disjointness' and 'Indexing', while complementing them with the relatively lesser-known upper bounds for 'Gap-Inner-Product' (from the sketching literature) and 'Sparse-Gap-Inner-Product' (from the recent work of Canonne et al. [ITCS 2015]). We also present consequences to the study of communication complexity with imperfectly shared randomness, where we show that for total permutation-invariant functions, imperfectly shared randomness results in only a polynomial blow-up in communication complexity after an additive $O(\log \log n)$ overhead.
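    As a concrete illustration of the definition above, the brute-force check below verifies permutation-invariance for small $n$ by enumerating all bijections $\pi$ and all input pairs; it is a sketch, and the function `disjointness` and the helper names are our own, not taken from the paper.

```python
# Brute-force check of permutation-invariance for small n, following the
# definition f(x, y) == f(x^pi, y^pi) for every permutation pi.
from itertools import permutations, product

def disjointness(x, y):
    """DISJ(x, y) = 1 iff x and y, viewed as subsets of [n], do not intersect."""
    return int(not any(a and b for a, b in zip(x, y)))

def is_permutation_invariant(f, n):
    """Check f(x, y) == f(x^pi, y^pi) for every pi and every x, y in {0,1}^n."""
    for pi in permutations(range(n)):
        for x in product((0, 1), repeat=n):
            for y in product((0, 1), repeat=n):
                xp = tuple(x[pi[i]] for i in range(n))  # x permuted by pi
                yp = tuple(y[pi[i]] for i in range(n))  # y permuted by pi
                if f(x, y) != f(xp, yp):
                    return False
    return True

print(is_permutation_invariant(disjointness, n=3))  # True: DISJ is invariant
```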

    New Separation Results for External Information

    We obtain new separation results for the two-party external information complexity of Boolean functions. The external information complexity of a function $f(x,y)$ is the minimum amount of information a two-party protocol computing $f$ must reveal to an outside observer about the input. We obtain the following results: 1. We prove an exponential separation between external and internal information complexity, which is the best possible; previously no separation was known. 2. We prove a near-quadratic separation between amortized zero-error communication complexity and external information complexity for total functions, disproving a conjecture from Braverman's survey. 3. We prove a matching upper bound showing that our separation result is tight.
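    For reference, the standard information-cost quantities behind these separations (textbook definitions; the notation $\mu$ and $\Pi$ below is ours, not the paper's) are, for a protocol with transcript $\Pi$ run on inputs $(X, Y) \sim \mu$:

    $$\mathrm{IC}^{\mathrm{ext}}_{\mu} = I(\Pi ; X, Y), \qquad \mathrm{IC}^{\mathrm{int}}_{\mu} = I(\Pi ; X \mid Y) + I(\Pi ; Y \mid X),$$

    so the external quantity measures what an outside observer learns about both inputs, while the internal quantity measures what each party learns about the other's input; the external cost is never smaller than the internal one, which is why an exponential gap between the two is the interesting direction.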