
    An Improved Distributed Algorithm for Maximal Independent Set

    The Maximal Independent Set (MIS) problem is one of the basic problems in the study of locality in distributed graph algorithms. This paper presents an extremely simple randomized algorithm providing a near-optimal local complexity for this problem, which, incidentally, when combined with some recent techniques, also leads to a near-optimal global complexity. The classical algorithms of Luby [STOC'85] and Alon, Babai, and Itai [JALG'86] provide the global complexity guarantee that, with high probability, all nodes terminate after $O(\log n)$ rounds. In contrast, our initial focus is on the local complexity, and our main contribution is a very simple algorithm guaranteeing that each particular node $v$ terminates after $O(\log \mathsf{deg}(v) + \log 1/\epsilon)$ rounds, with probability at least $1-\epsilon$. The guarantee holds even if the randomness outside the 2-hop neighborhood of $v$ is determined adversarially. This degree dependency is optimal, due to a lower bound of Kuhn, Moscibroda, and Wattenhofer [PODC'04]. Interestingly, this local complexity smoothly transitions into a global complexity: by adding techniques of Barenboim, Elkin, Pettie, and Schneider [FOCS'12, arXiv:1202.1983v3], we get a randomized MIS algorithm with a high-probability global complexity of $O(\log \Delta) + 2^{O(\sqrt{\log \log n})}$, where $\Delta$ denotes the maximum degree. This improves over the $O(\log^2 \Delta) + 2^{O(\sqrt{\log \log n})}$ result of Barenboim et al., and gets close to the $\Omega(\min\{\log \Delta, \sqrt{\log n}\})$ lower bound of Kuhn et al. Corollaries include improved algorithms for MIS in graphs of upper-bounded arboricity or lower-bounded girth, for Ruling Sets, for MIS in the Local Computation Algorithms (LCA) model, and a faster distributed algorithm for the Lovász Local Lemma.
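    The abstract leaves the algorithm's details to the paper itself; for orientation, the following is a minimal Python sketch of the classical Luby-style baseline it compares against, in the variant where each surviving node joins the MIS whenever it draws a locally maximal random value. The function name and graph encoding are illustrative assumptions, not code from the paper.

```python
import random

def luby_mis(adj):
    """One Luby-style [STOC'85] randomized MIS construction: in each
    synchronous round every live node draws a random value, any node
    whose value beats all of its live neighbours joins the MIS, and
    winners together with their neighbours drop out. Terminates in
    O(log n) rounds with high probability.

    adj: dict mapping node -> set of neighbouring nodes."""
    live, mis = set(adj), set()
    while live:
        r = {v: random.random() for v in live}
        winners = {v for v in live
                   if all(r[v] > r[u] for u in adj[v] if u in live)}
        mis |= winners
        removed = set(winners)
        for v in winners:               # winners and their neighbours stop
            removed |= adj[v] & live
        live -= removed
    return mis

if __name__ == "__main__":
    cycle = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
    s = luby_mis(cycle)
    assert all(not (cycle[v] & s) for v in s)              # independence
    assert all(cycle[v] & s for v in cycle if v not in s)  # maximality
```

    The paper's contribution, by contrast, is a rule whose per-node termination time scales with $\log \mathsf{deg}(v)$ rather than with the global $\log n$.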

    On Bioelectric Algorithms

    Cellular bioelectricity describes the biological phenomenon in which cells in living tissue generate and maintain patterns of voltage gradients across their membranes, induced by differing concentrations of charged ions. A growing body of research suggests that bioelectric patterns represent an ancient system that plays a key role in guiding many important developmental processes, including tissue regeneration, tumor suppression, and embryogenesis. This paper applies techniques from distributed algorithm theory to help better understand how cells work together to form these patterns. To do so, we present the cellular bioelectric model (CBM), a new computational model that captures the primary capabilities and constraints of bioelectric interactions between cells and their environment. We use this model to investigate several important topics from the relevant biology research literature. We begin with symmetry breaking, analyzing a simple cell definition that, when combined in single-hop or multi-hop topologies, efficiently solves leader election and the maximal independent set problem, respectively, indicating that these classical symmetry-breaking tasks are well matched to bioelectric mechanisms. We then turn our attention to the information-processing ability of bioelectric cells, exploring upper and lower bounds for approximate solutions to threshold and majority detection, and then proving that these systems are in fact Turing complete, resolving an open question about the computational power of bioelectric interactions.
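    Since the paper's CBM cell definitions are not reproduced in the abstract, the sketch below only illustrates the generic flavour of randomized symmetry breaking in a single-hop (fully connected broadcast) topology: each live candidate repeatedly flips a fair coin, and whenever at least one candidate flips heads, all tails drop out. Everything here, from the function name to the coin-flip rule, is a textbook illustration and an assumption, not the paper's bioelectric mechanism.

```python
import random

def single_hop_leader_election(n, p=0.5):
    """Generic randomized symmetry breaking in a single-hop topology:
    every device hears every other device, so survivors can tell when
    exactly one candidate remains. Each round, every live candidate
    flips a coin with heads-probability p; if anyone flips heads, the
    tails withdraw. With p = 1/2 the expected round count is O(log n).
    Illustrative only; not the paper's CBM cell definition."""
    live = list(range(n))
    rounds = 0
    while len(live) > 1:
        rounds += 1
        heads = [v for v in live if random.random() < p]
        if heads:           # if nobody flipped heads, the round is wasted
            live = heads
    return live[0], rounds

if __name__ == "__main__":
    leader, rounds = single_hop_leader_election(1024)
    print(f"leader {leader} elected after {rounds} rounds")
```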

    Towards a complexity theory for the congested clique

    The congested clique model of distributed computing has been receiving attention as a model for densely connected distributed systems. While there has been significant progress on the side of upper bounds, we have very little in terms of lower bounds for the congested clique; indeed, it is now known that proving explicit congested clique lower bounds is as difficult as proving circuit lower bounds. In this work, we use various more traditional complexity-theoretic tools to build a clearer picture of the complexity landscape of the congested clique:

    -- Nondeterminism and beyond: We introduce the nondeterministic congested clique model (analogous to NP) and show that there is a natural canonical problem family that captures all problems solvable in constant time with nondeterministic algorithms. We further generalise these notions by introducing the constant-round decision hierarchy (analogous to the polynomial hierarchy).

    -- Non-constructive lower bounds: We lift the prior non-uniform counting arguments to a general technique for proving non-constructive uniform lower bounds for the congested clique. In particular, we prove a time hierarchy theorem for the congested clique, showing that there are decision problems of essentially all complexities, both in the deterministic and nondeterministic settings.

    -- Fine-grained complexity: We map out relationships between various natural problems in the congested clique model, arguing that a reduction-based complexity theory currently gives us a fairly good picture of the complexity landscape of the congested clique.
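    To make the model concrete, here is a minimal sketch of the congested clique's defining communication constraint: $n$ nodes on a complete graph, where in each synchronous round every ordered pair of nodes may exchange one message of $O(\log n)$ bits. The class name and interface below are assumptions for illustration, not an established simulator API.

```python
import math

class CongestedClique:
    """Round-based sketch of the congested clique model: n fully
    connected nodes, each round allowing one B-bit message per ordered
    pair of nodes, with B = ceil(log2 n) as a stand-in for O(log n).
    Names and interface are illustrative assumptions."""

    def __init__(self, n):
        self.n = n
        self.bandwidth = max(1, math.ceil(math.log2(n)))  # bits per message

    def deliver(self, outboxes):
        """outboxes[u][v] is the bit string node u sends to node v this
        round; enforces the bandwidth limit and returns inboxes[v][u]."""
        inboxes = {v: {} for v in range(self.n)}
        for u, msgs in outboxes.items():
            for v, bits in msgs.items():
                if len(bits) > self.bandwidth:
                    raise ValueError(f"{u}->{v} exceeds {self.bandwidth} bits")
                inboxes[v][u] = bits
        return inboxes
```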