
    Dynamic Algorithms for the Massively Parallel Computation Model

    The Massively Parallel Computation (MPC) model has gained popularity over the last decade and is now seen as the standard model for processing large-scale data. One significant shortcoming of the model is that it assumes the input is static, whereas, in practice, real-world datasets evolve continuously. To overcome this issue, in this paper we initiate the study of dynamic algorithms in the MPC model. We first discuss the main requirements for a dynamic parallel model and show how to adapt the classic MPC model to capture them. Then we analyze the connection between classic dynamic algorithms and dynamic algorithms in the MPC model. Finally, we provide new efficient dynamic MPC algorithms for a variety of fundamental graph problems, including connectivity, minimum spanning tree, and matching. (Comment: accepted to the 31st ACM Symposium on Parallelism in Algorithms and Architectures, SPAA 2019.)
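    As a rough illustration of the setting only (not the paper's formal dynamic MPC model), the sketch below simulates an MPC-style system in which edges are hash-partitioned across machines with bounded local memory, a single edge update is routed to the machine that owns it, and computation proceeds in synchronous rounds of local work plus message delivery. All names here (`ToyMPC`, `apply_update`, `round`) are illustrative.

```python
# Minimal single-process sketch of an MPC-style dynamic setting, assuming a
# simple hash partitioning of edges; the paper's formal model is defined in
# the text and is not reproduced here.
from collections import defaultdict

class ToyMPC:
    def __init__(self, num_machines):
        self.num_machines = num_machines
        # each machine's local state: here, the set of edges it stores
        self.local = [set() for _ in range(num_machines)]

    def owner(self, edge):
        # deterministic edge-to-machine assignment
        return hash(edge) % self.num_machines

    def apply_update(self, edge, insert=True):
        # a single edge insertion/deletion is routed to one machine,
        # so the communication cost of the update itself is O(1) words
        m = self.owner(edge)
        (self.local[m].add if insert else self.local[m].discard)(edge)

    def round(self, local_compute):
        # one synchronous round: every machine computes on its local data and
        # emits (destination_machine, message) pairs, which are then delivered
        inboxes = defaultdict(list)
        for i in range(self.num_machines):
            for dest, payload in local_compute(i, self.local[i]):
                inboxes[dest].append(payload)
        return inboxes

# example: after one update, run a round in which every machine reports its load
mpc = ToyMPC(num_machines=4)
mpc.apply_update((1, 2))
loads = mpc.round(lambda i, edges: [(0, (i, len(edges)))])  # all report to machine 0
```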

    Parameterized Streaming Algorithms for Vertex Cover

    As graphs continue to grow in size, we seek ways to process such data effectively at scale. The model of streaming graph processing, in which a compact summary is maintained as each edge insertion/deletion is observed, is an attractive one. However, few results are known for optimization problems over such dynamic graph streams. In this paper, we introduce a new approach to handling graph streams, by instead seeking solutions for the parameterized versions of these problems, where we are given a parameter k and the objective is to decide whether there is a solution bounded by k. By combining kernelization techniques with randomized sketch structures, we obtain the first streaming algorithms for the parameterized versions of the Vertex Cover problem. We consider the following three models for a graph stream on n nodes: 1. The insertion-only model, where edges can only be added. 2. The dynamic model, where edges can be both inserted and deleted. 3. The "promised" dynamic model, where we are guaranteed that at each timestamp there is a solution of size at most k. In each of these three models we are able to design parameterized streaming algorithms for the Vertex Cover problem, and we show matching lower bounds for the space complexity of our algorithms. (Due to the arXiv limit of 1,920 characters for the abstract field, please see the paper for a detailed description of our results. Comment: fixed some typos.)
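    To make the insertion-only setting concrete, here is a minimal sketch of the classic Buss-style kernelization adapted to an edge stream of a simple graph: any vertex whose uncovered degree exceeds k must be in every cover of size at most k, and once all remaining degrees are at most k the kernel has at most k^2 edges. This is only a warm-up under those assumptions, not the sketch-based algorithms the paper develops for the dynamic models.

```python
from itertools import combinations

def stream_vertex_cover_at_most_k(edge_stream, k):
    """Insertion-only sketch: maintain a Buss kernel of O(k^2) edges and decide
    whether the streamed (simple) graph has a vertex cover of size <= k."""
    forced = set()   # vertices forced into every cover of size <= k
    kept = []        # streamed edges not covered by `forced`
    deg = {}         # degrees within `kept`

    for u, v in edge_stream:
        if u in forced or v in forced:
            continue                         # edge already covered
        kept.append((u, v))
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
        for x in (u, v):
            if deg.get(x, 0) > k:            # x is in every cover of size <= k
                forced.add(x)
                kept = [e for e in kept if x not in e]
                deg = {}
                for a, b in kept:
                    deg[a] = deg.get(a, 0) + 1
                    deg[b] = deg.get(b, 0) + 1
        if len(forced) > k or len(kept) > k * k:
            return False                     # kernel too large: no size-k cover exists

    # brute-force the small kernel with the remaining budget
    budget = k - len(forced)
    verts = sorted({x for e in kept for x in e})
    for r in range(budget + 1):
        for cand in combinations(verts, r):
            chosen = set(cand)
            if all(a in chosen or b in chosen for a, b in kept):
                return True
    return False
```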

    Fully Dynamic Sequential and Distributed Algorithms for MAX-CUT

    This paper initiates the study of the MAX-CUT problem in fully dynamic graphs. Given a graph G = (V,E), we present deterministic fully dynamic distributed and sequential algorithms to maintain a cut of G which always contains at least |E|/2 edges, in sublinear update time under edge insertions and deletions to G. Our results include the following deterministic algorithms: i) an O(Δ) worst-case update time sequential algorithm, where Δ denotes the maximum degree of G; ii) the first fully dynamic distributed algorithm, taking O(1) rounds and O(Δ) total bits of communication per update in the Massively Parallel Computation (MPC) model with n machines and O(n) words of memory per machine. The aforementioned algorithms require at most one adjustment per update, that is, a move of one vertex from one side of the cut to the other. We also give the following fully dynamic sequential algorithms: i) a deterministic O(m^{1/2}) amortized update time algorithm, where m denotes the maximum number of edges in G during any sequence of updates, and ii) a randomized algorithm which takes Õ(n^{2/3}) worst-case update time when edge updates come from an oblivious adversary.
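    The |E|/2 guarantee itself comes from a simple local-search invariant: if every vertex is incident to at least as many cut edges as uncut edges, the cut contains at least half of all edges. The sketch below is only the classic static local search that establishes this guarantee (each flip increases the cut size, so it terminates after at most |E| flips); it is not the paper's one-adjustment dynamic update procedure.

```python
def local_search_half_cut(n, edges):
    """Static local search for MAX-CUT on vertices 0..n-1: flip any vertex with
    more same-side than cross-side neighbours until none exists. At a local
    optimum every vertex has cut-degree >= deg/2, so the cut has >= |E|/2 edges."""
    side = [0] * n
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    def gain(v):
        # flipping v changes the cut size by (#same-side - #cross-side) neighbours
        same = sum(1 for w in adj[v] if side[w] == side[v])
        return same - (len(adj[v]) - same)

    improved = True
    while improved:                       # each flip adds at least one cut edge
        improved = False
        for v in range(n):
            if gain(v) > 0:
                side[v] ^= 1
                improved = True

    cut_size = sum(1 for u, v in edges if side[u] != side[v])
    return side, cut_size

# example: a triangle; any locally optimal cut contains 2 of its 3 edges
print(local_search_half_cut(3, [(0, 1), (1, 2), (0, 2)])[1])  # prints 2
```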

    Dynamic Algorithms for Maximum Matching Size

    We study fully dynamic algorithms for maximum matching. This is a well-studied problem, known to admit several update-time/approximation trade-offs. For instance, it is known how to maintain a 1/2-approximate matching in polylog(n) time or a 2/3-approximate matching in O(√n) time, where n is the number of vertices. Improving either of these bounds has been a long-standing open problem. In this paper, we show that when the goal is to maintain just the size of the matching instead of its edge set, then these bounds can indeed be improved. We give algorithms that maintain: * a .501-approximation in polylog(n) update time for general graphs, * a .534-approximation in polylog(n) update time for bipartite graphs, and * a (2/3 + Ω(1))-approximation in O(√n) update time for bipartite graphs.
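    For context, the 1/2-approximation baseline mentioned above follows from any maximal matching. Below is a folklore fully dynamic maximal matching with O(deg) time per update; it is only the trivial baseline, not the polylog-time algorithms or the size-only estimators that the paper improves on.

```python
class DynamicMaximalMatching:
    """Trivial fully dynamic maximal matching (hence a 1/2-approximation of the
    maximum matching size) with O(deg) time per update."""

    def __init__(self):
        self.adj = {}      # vertex -> set of neighbours
        self.mate = {}     # vertex -> matched partner (absent if free)

    def _free(self, v):
        return v not in self.mate

    def _try_match(self, v):
        # match v to any currently free neighbour, if one exists
        for w in self.adj.get(v, ()):
            if self._free(w):
                self.mate[v] = w
                self.mate[w] = v
                return

    def insert(self, u, v):
        self.adj.setdefault(u, set()).add(v)
        self.adj.setdefault(v, set()).add(u)
        if self._free(u) and self._free(v):
            self.mate[u] = v
            self.mate[v] = u

    def delete(self, u, v):
        # assumes the edge (u, v) was previously inserted
        self.adj[u].discard(v)
        self.adj[v].discard(u)
        if self.mate.get(u) == v:          # a matched edge was removed
            del self.mate[u], self.mate[v]
            self._try_match(u)             # try to rematch both endpoints
            self._try_match(v)

    def matching_size(self):
        return len(self.mate) // 2

# usage: the maintained matching stays maximal across insertions and deletions
m = DynamicMaximalMatching()
m.insert(1, 2); m.insert(2, 3); m.insert(3, 4)
m.delete(1, 2)
print(m.matching_size())  # prints 1 (edge 3-4 remains matched)
```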

    When Algorithms for Maximal Independent Set and Maximal Matching Run in Sublinear Time

    Maximal independent set (MIS), maximal matching (MM), and (Delta+1)-(vertex) coloring in graphs of maximum degree Delta are among the most prominent problems in algorithmic graph theory. They are all solvable by a simple linear-time greedy algorithm, and until very recently this constituted the state of the art. In SODA 2019, Assadi, Chen, and Khanna gave a randomized algorithm for (Delta+1)-coloring that runs in O~(n sqrt{n}) time, which even for moderately dense graphs is sublinear in the input size. The work of Assadi et al., however, contained a spoiler for MIS and MM: provably, neither problem admits a sublinear-time algorithm in general graphs.

    In this work, we dig deeper into the possibility of achieving sublinear-time algorithms for MIS and MM. The neighborhood independence number of a graph G, denoted by beta(G), is the size of the largest independent set in the neighborhood of any vertex. We identify beta(G) as the "right" parameter to measure the runtime of MIS and MM algorithms: although graphs of bounded neighborhood independence may be very dense (a clique is one example), we prove that carefully chosen variants of greedy algorithms for MIS and MM run in O(n beta(G)) and O(n log n * beta(G)) time, respectively, on any n-vertex graph G. We complement this positive result by observing that a simple extension of the lower bound of Assadi et al. implies that Omega(n beta(G)) time is also necessary for any algorithm for either problem, for all values of beta(G) from 1 to Theta(n). We note that our algorithm for MIS is deterministic, while for MM we use randomization, which we prove is unavoidable: any deterministic algorithm for MM requires Omega(n^2) time even for beta(G) = 2.

    Graphs with bounded neighborhood independence, already for constant beta = beta(G), constitute a rich family of possibly dense graphs, including line graphs, proper interval graphs, unit-disk graphs, claw-free graphs, and graphs of bounded growth. Our results suggest that even though MIS and MM do not admit sublinear-time algorithms in general graphs, one can still solve both problems in sublinear time for a wide range of beta(G) << n. Finally, by observing that the lower bound of Omega(n sqrt{n}) time for (Delta+1)-coloring due to Assadi et al. applies to graphs of (small) constant neighborhood independence, we unveil an intriguing separation between the time complexity of MIS and MM and that of (Delta+1)-coloring: while the time complexity of MIS and MM is strictly higher than that of (Delta+1)-coloring in general graphs, the exact opposite relation holds for graphs with small neighborhood independence.
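    The linear-time greedy baseline that the abstract refers to is shown below for MIS; it reads every edge, so it runs in O(n + m) time. The paper's point is that suitable variants avoid reading all edges and run in time proportional to n * beta(G); this sketch is only the classic baseline, not those variants.

```python
def greedy_mis(adj):
    """Classic greedy maximal independent set: scan vertices in a fixed order
    and add a vertex whenever none of its neighbours has already been added.
    Runs in O(n + m) time. `adj` maps each vertex to an iterable of neighbours."""
    in_mis = set()
    blocked = set()                  # vertices adjacent to something already in the MIS
    for v in adj:
        if v not in blocked:
            in_mis.add(v)
            blocked.update(adj[v])   # neighbours of v can no longer join
    return in_mis

# example: a path 0-1-2-3 yields the maximal independent set {0, 2}
print(greedy_mis({0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}))
```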

    Worst-Case Efficient Dynamic Geometric Independent Set

    We consider the problem of maintaining an approximate maximum independent set of geometric objects under insertions and deletions. We present a data structure that maintains a constant-factor approximate maximum independent set for broad classes of fat objects in d dimensions, where d is assumed to be a constant, in sublinear worst-case update time. This gives the first results for dynamic independent set in a wide variety of geometric settings, such as disks, fat polygons, and their high-dimensional equivalents. For axis-aligned squares and hypercubes, our result improves upon all (recently announced) previous works. In particular, we obtain a dynamic (4+ε)-approximation for squares with polylogarithmic worst-case update time.

    Our result is obtained via a two-level approach. First, we develop a dynamic data structure which stores all objects and provides an approximate independent set when queried, with output-sensitive running time. We show that, via standard methods, such a structure can be used to obtain a dynamic algorithm with amortized update time bounds. Then, to obtain worst-case update time algorithms, we develop a generic deamortization scheme that with each insertion/deletion keeps (i) the update time bounded and (ii) the number of changes in the independent set constant. We show that such a scheme is applicable to fat objects by proving an appropriate generalization of a separator theorem. Interestingly, we show that our deamortization scheme is also necessary in order to obtain worst-case update bounds: if for a class of objects our scheme is not applicable, then no constant-factor approximation with sublinear worst-case update time is possible. We show that such a lower bound applies even for seemingly simple classes of geometric objects, including axis-aligned rectangles in the plane.
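    As a static point of comparison (not the paper's dynamic data structure), the constant-factor approximability of independent set for fat objects can be seen from a simple greedy rule, sketched below for axis-aligned squares: repeatedly take the smallest remaining square and discard everything intersecting it. Any square that is at least as large and intersects the chosen square must contain one of its four corners, and squares sharing a corner pairwise intersect, so the greedy solution is a 4-approximation.

```python
def greedy_squares_independent_set(squares):
    """Static greedy 4-approximation of maximum independent set for axis-aligned
    (closed) squares. Each square is given as (x, y, side) with lower-left
    corner (x, y)."""
    def intersects(a, b):
        (ax, ay, asz), (bx, by, bsz) = a, b
        return ax <= bx + bsz and bx <= ax + asz and ay <= by + bsz and by <= ay + asz

    chosen = []
    remaining = sorted(squares, key=lambda s: s[2])           # smallest side first
    while remaining:
        s = remaining.pop(0)                                  # smallest remaining square
        chosen.append(s)
        remaining = [t for t in remaining if not intersects(s, t)]
    return chosen

# example: two small disjoint squares and one big square overlapping the first
print(greedy_squares_independent_set([(0, 0, 1), (3, 0, 1), (0, 0, 5)]))
# prints [(0, 0, 1), (3, 0, 1)]
```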