
    Fully Dynamic Matching in Bipartite Graphs

    Maximum cardinality matching in bipartite graphs is an important and well-studied problem. The fully dynamic version, in which edges are inserted and deleted over time, has also been the subject of much attention. Existing algorithms for dynamic matching (in general graphs) seem to fall into two groups: there are fast (mostly randomized) algorithms that do not achieve a better-than-2 approximation, and there are slow algorithms with $O(\sqrt{m})$ update time that achieve a better-than-2 approximation. Thus the obvious question is whether we can design an algorithm -- deterministic or randomized -- that achieves a tradeoff between these two: an $o(\sqrt{m})$ update time and a better-than-2 approximation simultaneously. We answer this question in the affirmative for bipartite graphs. Our main result is a fully dynamic algorithm that maintains a $(3/2 + \epsilon)$-approximation in worst-case update time $O(m^{1/4}\epsilon^{-2.5})$. We also give stronger results for graphs whose arboricity is at most $\alpha$, achieving a $(1+\epsilon)$-approximation in worst-case time $O(\alpha(\alpha + \log n))$ for constant $\epsilon$. When the arboricity is constant, this bound is $O(\log n)$, and when the arboricity is polylogarithmic the update time is also polylogarithmic. The most important technical development is the use of an intermediate graph we call an edge degree constrained subgraph (EDCS). This graph places constraints on the sum of the degrees of the endpoints of each edge: upper bounds for matched edges and lower bounds for unmatched edges. The main technical content of our paper involves showing both how to maintain an EDCS dynamically and that an EDCS always contains a sufficiently large matching. We also make use of graph orientations to help bound the amount of work done during each update. Comment: Longer version of paper that appears in ICALP 2015
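
    As a concrete reading of the EDCS constraints described above, the sketch below checks whether a candidate subgraph H of G satisfies an upper bound on the degree sum of edges kept in H and a lower bound on the degree sum of edges of G left out of H. The parameter names beta and beta_minus, and the checker itself, are illustrative assumptions based on this description, not code from the paper.

        from collections import Counter

        def is_edcs(graph_edges, subgraph_edges, beta, beta_minus):
            # Degrees are measured in the subgraph H, as in the description above.
            H = {frozenset(e) for e in subgraph_edges}
            deg = Counter()
            for e in H:
                for x in e:
                    deg[x] += 1
            # Assumed upper bound on the degree sum for every edge kept in H.
            if any(sum(deg[x] for x in e) > beta for e in H):
                return False
            # Assumed lower bound on the degree sum for every edge of G left out of H.
            G = {frozenset(e) for e in graph_edges}
            if any(sum(deg[x] for x in e) < beta_minus for e in G - H):
                return False
            return True

        # Tiny usage example on a 4-cycle, keeping two opposite edges.
        G = [(1, 2), (2, 3), (3, 4), (4, 1)]
        H = [(1, 2), (3, 4)]
        print(is_edcs(G, H, beta=3, beta_minus=2))  # True for these assumed parameters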

    Arboricity, h-Index, and Dynamic Algorithms

    In this paper we present a modification of a technique by Chiba and Nishizeki [Chiba and Nishizeki: Arboricity and Subgraph Listing Algorithms, SIAM J. Comput. 14(1), pp. 210--223 (1985)]. Based on it, we design a data structure suitable for dynamic graph algorithms. We employ the data structure to formulate new algorithms for several problems, including counting subgraphs of four vertices, recognition of diamond-free graphs, cop-win graphs and strongly chordal graphs, among others. We improve the time complexity for graphs with low arboricity or h-index. Comment: 19 pages, no figures
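
    For reference, the h-index of a graph mentioned above is the largest h such that at least h vertices have degree at least h; the short offline computation below is only meant to pin down that definition and is unrelated to the paper's dynamic data structure.

        from collections import Counter

        def graph_h_index(edges):
            """Largest h such that at least h vertices have degree >= h (simple graph assumed)."""
            deg = Counter()
            for u, v in edges:
                deg[u] += 1
                deg[v] += 1
            h = 0
            for i, d in enumerate(sorted(deg.values(), reverse=True), start=1):
                if d >= i:
                    h = i
                else:
                    break
            return h

        print(graph_h_index([(1, 2), (2, 3), (3, 1)]))  # 2: a triangle has three vertices of degree 2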

    Incremental Maintenance of Maximal Cliques in a Dynamic Graph

    We consider the maintenance of the set of all maximal cliques in a dynamic graph that is changing through the addition or deletion of edges. We present nearly tight bounds on the magnitude of change in the set of maximal cliques, as well as the first change-sensitive algorithms for clique maintenance, whose runtime is proportional to the magnitude of the change in the set of maximal cliques. We present experimental results showing these algorithms are efficient in practice and are faster than prior work by two to three orders of magnitude. Comment: 18 pages, 8 figures
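
    To make the "magnitude of change" concrete: when an edge (u, v) is inserted, the newly created maximal cliques are exactly {u, v} together with the maximal cliques of the subgraph induced by the common neighbourhood of u and v. The sketch below enumerates them with a plain Bron-Kerbosch recursion; it only illustrates the quantity being maintained and is not the paper's change-sensitive algorithm.

        def bron_kerbosch(adj, r, p, x, out):
            """Enumerate maximal cliques of the graph given by adjacency sets `adj`."""
            if not p and not x:
                out.append(set(r))
                return
            for v in list(p):
                bron_kerbosch(adj, r | {v}, p & adj[v], x & adj[v], out)
                p = p - {v}
                x = x | {v}

        def new_maximal_cliques(adj, u, v):
            # adj maps each vertex to its neighbour set and already reflects the inserted edge (u, v).
            common = adj[u] & adj[v]
            sub = {w: adj[w] & common for w in common}
            found = []
            bron_kerbosch(sub, set(), set(common), set(), found)
            return [c | {u, v} for c in found]

        # Example: a path 1-2, 2-3; inserting (1, 3) creates one new maximal clique, the triangle.
        adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}}
        print(new_maximal_cliques(adj, 1, 3))  # [{1, 2, 3}]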

    Fully Dynamic MIS in Uniformly Sparse Graphs

    We consider the problem of maintaining a maximal independent set (MIS) in a dynamic graph subject to edge insertions and deletions. Recently, Assadi, Onak, Schieber and Solomon (STOC 2018) showed that an MIS can be maintained in sublinear (in the dynamically changing number of edges) amortized update time. In this paper we significantly improve the update time for uniformly sparse graphs. Specifically, for graphs with arboricity $\alpha$, the amortized update time of our algorithm is $O(\alpha^2 \log^2 n)$, where $n$ is the number of vertices. For low arboricity graphs, which include, for example, minor-free graphs as well as some classes of "real world" graphs, our update time is polylogarithmic. Our update time improves the result of Assadi et al. for all graphs with arboricity bounded by $m^{3/8 - \epsilon}$, for any constant $\epsilon > 0$. This covers much of the range of possible values for arboricity, as the arboricity of a general graph cannot exceed $m^{1/2}$.
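
    For contrast with the bound above, the following is the trivial update rule, whose cost per operation is proportional to the degrees of the touched vertices; it keeps the set independent and maximal but makes no attempt at the paper's arboricity-dependent guarantee.

        def insert_edge(adj, mis, u, v):
            """Naive MIS repair after inserting edge (u, v)."""
            adj.setdefault(u, set()).add(v)
            adj.setdefault(v, set()).add(u)
            if u in mis and v in mis:
                mis.discard(v)                      # restore independence
                for w in adj[v]:                    # restore maximality around the evicted vertex
                    if w not in mis and not (adj[w] & mis):
                        mis.add(w)

        def delete_edge(adj, mis, u, v):
            """Naive MIS repair after deleting edge (u, v)."""
            adj[u].discard(v)
            adj[v].discard(u)
            for x in (u, v):                        # only u or v can lose their last MIS neighbour
                if x not in mis and not (adj[x] & mis):
                    mis.add(x)

        # Build a path 1-2-3 with MIS {1, 3}, then insert (1, 3) to force a repair.
        adj = {1: {2}, 2: {1, 3}, 3: {2}}
        mis = {1, 3}
        insert_edge(adj, mis, 1, 3)
        print(mis)  # {1}: vertex 3 was evicted and vertex 2 is dominated by 1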

    Adaptive Out-Orientations with Applications

    We give simple algorithms for maintaining edge-orientations of a fully-dynamic graph, such that the out-degree of each vertex is bounded. On one hand, we show how to orient the edges such that the out-degree of each vertex is proportional to the arboricity $\alpha$ of the graph, in a worst-case update time of $O(\log^2 n \log \alpha)$. On the other hand, motivated by applications in dynamic maximal matching, we obtain a different trade-off, namely the improved worst-case update time of $O(\log n \log \alpha)$ for the problem of maintaining an edge-orientation with at most $O(\alpha + \log n)$ out-edges per vertex. Since our algorithms have update times with worst-case guarantees, the number of changes to the solution (i.e. the recourse) is naturally limited. Our algorithms make choices based entirely on local information, which makes them automatically adaptive to the current arboricity of the graph. In other words, they are arboricity-oblivious, while they are arboricity-sensitive. This both simplifies and improves upon previous work, by having fewer assumptions or better asymptotic guarantees. As a consequence, one obtains an algorithm with improved efficiency for maintaining a $(1+\varepsilon)$-approximation of the maximum subgraph density, and an algorithm for dynamic maximal matching whose worst-case update time is guaranteed to be upper bounded by $O(\alpha + \log n \log \alpha)$, where $\alpha$ is the arboricity at the time of the update.
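
    A minimal baseline for the problem described above is to orient each newly inserted edge away from the endpoint that currently has fewer out-edges. The sketch below implements only that heuristic; it performs no re-orientations (flips), has no worst-case guarantee, and is not the paper's data structure.

        from collections import defaultdict

        class GreedyOrientation:
            """Baseline fully-dynamic edge orientation: no flips, no worst-case bound."""

            def __init__(self):
                self.out = defaultdict(set)   # out[u] = set of heads of edges oriented u -> head

            def insert(self, u, v):
                # Orient away from the endpoint with the currently smaller out-degree.
                if len(self.out[u]) <= len(self.out[v]):
                    self.out[u].add(v)
                else:
                    self.out[v].add(u)

            def delete(self, u, v):
                self.out[u].discard(v)
                self.out[v].discard(u)

            def max_out_degree(self):
                return max((len(s) for s in self.out.values()), default=0)

        # Usage: orient the edges of a 4-cycle; every vertex ends with out-degree at most 1.
        o = GreedyOrientation()
        for e in [(1, 2), (2, 3), (3, 4), (4, 1)]:
            o.insert(*e)
        print(o.max_out_degree())  # 1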

    On Fully Dynamic Graph Sparsifiers

    We initiate the study of dynamic algorithms for graph sparsification problems and obtain fully dynamic algorithms, allowing both edge insertions and edge deletions, that take polylogarithmic time after each update in the graph. Our three main results are as follows. First, we give a fully dynamic algorithm for maintaining a $(1 \pm \epsilon)$-spectral sparsifier with amortized update time $\mathrm{poly}(\log n, \epsilon^{-1})$. Second, we give a fully dynamic algorithm for maintaining a $(1 \pm \epsilon)$-cut sparsifier with worst-case update time $\mathrm{poly}(\log n, \epsilon^{-1})$. Both sparsifiers have size $n \cdot \mathrm{poly}(\log n, \epsilon^{-1})$. Third, we apply our dynamic sparsifier algorithm to obtain a fully dynamic algorithm for maintaining a $(1 + \epsilon)$-approximation to the value of the maximum flow in an unweighted, undirected, bipartite graph with amortized update time $\mathrm{poly}(\log n, \epsilon^{-1})$.
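
    As a static (non-dynamic) illustration of the object being maintained, the sketch below builds a spectral sparsifier by sampling each edge with probability proportional to its effective resistance, in the style of Spielman and Srivastava; the constant c, the sampling rule, and the example parameters are arbitrary choices for this illustration, not values from the paper.

        import numpy as np

        def spectral_sparsify(n, edges, eps, c=4.0, seed=0):
            """Static effective-resistance sampling sketch; edges are (u, v, weight) with 0-based ids."""
            rng = np.random.default_rng(seed)
            L = np.zeros((n, n))
            for u, v, w in edges:
                L[u, u] += w; L[v, v] += w
                L[u, v] -= w; L[v, u] -= w
            Lp = np.linalg.pinv(L)                                    # pseudoinverse of the Laplacian
            kept = []
            for u, v, w in edges:
                r_eff = Lp[u, u] + Lp[v, v] - 2 * Lp[u, v]            # effective resistance of (u, v)
                p = min(1.0, c * w * r_eff * np.log(n) / eps ** 2)    # oversampled keep-probability
                if rng.random() < p:
                    kept.append((u, v, w / p))                        # reweight so the Laplacian stays unbiased in expectation
            return kept

        # Example: the complete graph on 400 vertices; with these arbitrary constants roughly half
        # of the 79800 edges survive, each reweighted by 1/p.
        n = 400
        edges = [(u, v, 1.0) for u in range(n) for v in range(u + 1, n)]
        print(len(edges), "->", len(spectral_sparsify(n, edges, eps=0.5)))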

    Approximating Properties of Data Streams

    In this dissertation, we present algorithms that approximate properties in the data stream model, where elements of an underlying data set arrive sequentially, but algorithms must use space sublinear in the size of the underlying data set. We first study the problem of finding all k-periods of a length-n string S, presented as a data stream. S is said to have k-period p if its prefix of length n − p differs from its suffix of length n − p in at most k locations. We give algorithms to compute the k-periods of a string S using poly(k, log n) bits of space and we complement these results with comparable lower bounds. We then study the problem of identifying a longest substring of strings S and T of length n that forms a d-near-alignment under the edit distance, in the simultaneous streaming model. In this model, symbols of strings S and T are streamed at the same time and form a d-near-alignment if the distance between them in some given metric is at most d. We give several algorithms, including an exact one-pass algorithm that uses O(d^2 + d log n) bits of space. We then consider the distinct elements and ℓ_p-heavy hitters problems in the sliding window model, where only the most recent n elements in the data stream form the underlying set. We first introduce the composable histogram, a simple twist on the exponential (Datar et al., SODA 2002) and smooth histograms (Braverman and Ostrovsky, FOCS 2007) that may be of independent interest. We then show that the composable histogram, along with a careful combination of existing techniques to track either the identity or frequency of a few specific items, suffices to obtain algorithms for both distinct elements and ℓ_p-heavy hitters that are nearly optimal in both n and ε. Finally, we consider the problem of estimating the maximum weighted matching of a graph whose edges are revealed in a streaming fashion. We develop a reduction from the maximum weighted matching problem to the maximum cardinality matching problem that only doubles the approximation factor of a streaming algorithm developed for the maximum cardinality matching problem. As an application, we obtain an estimator for the weight of a maximum weighted matching in bounded-arboricity graphs and in particular, a (48 + ε)-approximation estimator for the weight of a maximum weighted matching in planar graphs
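
    The k-period definition above admits a very short offline check (quadratic time and linear space, so nothing like the poly(k, log n)-bit streaming algorithms described): p is a k-period of S exactly when S[:n-p] and S[p:] disagree in at most k positions.

        def k_periods(s, k):
            """All p in 1..len(s) whose length-(n-p) prefix and suffix of s differ in at most k places."""
            n = len(s)
            return [p for p in range(1, n + 1)
                    if sum(1 for a, b in zip(s[:n - p], s[p:]) if a != b) <= k]

        # [3, 6, 8, 9]: shifting by 3 changes only the final character; p = n counts trivially here.
        print(k_periods("abcabcabd", 1))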

    Dynamic Matching: Reducing Integral Algorithms to Approximately-Maximal Fractional Algorithms

    We present a simple randomized reduction from fully-dynamic integral matching algorithms to fully-dynamic "approximately-maximal" fractional matching algorithms. Applying this reduction to the recent fractional matching algorithm of Bhattacharya, Henzinger, and Nanongkai (SODA 2017), we obtain a novel result for the integral problem. Specifically, our main result is a randomized fully-dynamic $(2+\epsilon)$-approximate integral matching algorithm with small polylogarithmic worst-case update time. For the $(2+\epsilon)$-approximation regime, only a fractional fully-dynamic $(2+\epsilon)$-matching algorithm with worst-case polylogarithmic update time was previously known, due to Bhattacharya et al. (SODA 2017). Our algorithm is the first to maintain approximate matchings with worst-case update time better than polynomial, for any constant approximation ratio. As a consequence, we also obtain the first constant-approximation maximum weight matching algorithm with worst-case polylogarithmic update time
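
    The abstract does not spell out the reduction, so the sketch below is only a generic illustration of turning a fractional matching into an integral one: sample each edge independently with probability proportional to its fractional value, then keep a greedy matching among the sampled edges. It is not the paper's approximately-maximal reduction, and the boost parameter is an assumption made for this example.

        import random

        def round_fractional_matching(fractional, boost=1.0, seed=0):
            """Generic randomized rounding sketch: `fractional` maps edges (u, v) to values in [0, 1]."""
            rng = random.Random(seed)
            sampled = [e for e, x in fractional.items() if rng.random() < min(1.0, boost * x)]
            matched, matching = set(), []
            for u, v in sampled:                    # greedy: keep an edge only if both endpoints are still free
                if u not in matched and v not in matched:
                    matching.append((u, v))
                    matched.update((u, v))
            return matching

        # Example: a fractional matching on a 4-cycle with value 1/2 on every edge.
        frac = {(1, 2): 0.5, (2, 3): 0.5, (3, 4): 0.5, (4, 1): 0.5}
        print(round_fractional_matching(frac, boost=2.0))  # [(1, 2), (3, 4)], a maximum matching here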