Connectivity Lower Bounds in Broadcast Congested Clique

Abstract

We prove three new lower bounds for graph connectivity in the 1-bit broadcast congested clique model, BCC(1). First, in the KT-0 version of BCC(1), in which nodes are aware of neighbors only through port numbers, we show an Ω(log n) round lower bound for Connectivity even for constant-error randomized Monte Carlo algorithms. The deterministic version of this result can be obtained via the well-known "edge-crossing" argument, but the randomized version requires establishing new combinatorial results regarding the indistinguishability graph induced by inputs. In our second result, we show that the Ω(log n) lower bound extends to the KT-1 version of the BCC(1) model, in which nodes are aware of the IDs of all their neighbors, though our proof works only for deterministic algorithms. This result substantially improves upon the existing Ω(log* n) deterministic lower bound (Jurdziński et al., SIROCCO 2018) for this problem. Since nodes know the IDs of their neighbors in the KT-1 model, it is no longer possible to play "edge-crossing" tricks; instead we present a reduction from the 2-party communication complexity problem Partition, in which Alice and Bob are given two set partitions of [n] and are required to determine whether the join of these two set partitions equals the trivial one-part set partition. While our KT-1 Connectivity lower bound holds only for deterministic algorithms, in our third result we extend this Ω(log n) KT-1 lower bound to constant-error Monte Carlo algorithms for the closely related ConnectedComponents problem. We use information-theoretic techniques to obtain this result. All our results hold for the seemingly easy special case of Connectivity in which an algorithm has to distinguish an instance with one cycle from an instance with multiple cycles. Our results showcase three rather different lower bound techniques and lay the groundwork for further improvements in lower bounds for Connectivity in the BCC(1) model.
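To make the Partition problem concrete, below is a minimal sketch (in Python, with a hypothetical function name join_is_trivial) of how one could check whether the join of Alice's and Bob's set partitions of [n] equals the one-part partition, assuming a standard union-find over the ground set; it is an illustration of the problem statement only, not of the reduction itself.

```python
def join_is_trivial(partition_a, partition_b, n):
    """Return True if the join of two set partitions of {0, ..., n-1}
    equals the trivial one-part partition, i.e., merging the blocks of
    both partitions connects all n elements."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(x, y):
        parent[find(x)] = find(y)

    # Merge all elements within each block of both partitions.
    for block in list(partition_a) + list(partition_b):
        block = list(block)
        for v in block[1:]:
            union(block[0], v)

    # The join is the one-part partition iff every element has the
    # same union-find representative.
    return len({find(v) for v in range(n)}) == 1


if __name__ == "__main__":
    # A = {{0,1},{2,3}}, B = {{1,2},{0,3}}: join is {{0,1,2,3}} -> True
    print(join_is_trivial([{0, 1}, {2, 3}], [{1, 2}, {0, 3}], 4))
    # A = B = {{0,1},{2,3}}: join still has two blocks -> False
    print(join_is_trivial([{0, 1}, {2, 3}], [{0, 1}, {2, 3}], 4))
```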
