
    An Efficient Algorithm for Computing Network Reliability in Small Treewidth

    We consider the classic problem of Network Reliability. A network is given together with a source vertex, one or more target vertices, and probabilities assigned to each of the edges. Each edge appears in the network with its associated probability and the problem is to determine the probability of having at least one source-to-target path. This problem is known to be NP-hard. We present a linear-time fixed-parameter algorithm based on a parameter called treewidth, which is a measure of tree-likeness of graphs. Network Reliability was already known to be solvable in polynomial time for bounded treewidth, but there were no concrete algorithms and the known methods used complicated structures and were not easy to implement. We provide a significantly simpler and more intuitive algorithm that is much easier to implement. We also report on an implementation of our algorithm and establish the applicability of our approach by providing experimental results on the graphs of subway and transit systems of several major cities, such as London and Tokyo. To the best of our knowledge, this is the first exact algorithm for Network Reliability that can scale to handle real-world instances of the problem. Comment: 14 pages.
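
    As a concrete illustration of the problem statement above (not of the paper's treewidth-based algorithm), the following Python sketch computes the exact source-to-target reliability by brute force, summing the probability of every edge configuration in which a path survives. The function name and the triangle example are illustrative choices; the enumeration takes 2^m steps for m edges, which is exactly the exponential blow-up the paper's algorithm avoids on small-treewidth graphs.

    from itertools import product

    def st_reliability(n, edges, source, target):
        """Exact reliability by enumerating all 2^m edge subsets.
        edges: list of (u, v, prob) on vertices numbered 0..n-1."""
        total = 0.0
        for present in product([False, True], repeat=len(edges)):
            prob = 1.0
            adj = [[] for _ in range(n)]
            for (u, v, q), keep in zip(edges, present):
                prob *= q if keep else 1.0 - q
                if keep:
                    adj[u].append(v)
                    adj[v].append(u)
            # DFS from the source; count this configuration's probability
            # if the target is reachable.
            stack, seen = [source], {source}
            while stack:
                x = stack.pop()
                for y in adj[x]:
                    if y not in seen:
                        seen.add(y)
                        stack.append(y)
            if target in seen:
                total += prob
        return total

    # Triangle in which every edge survives with probability 0.9.
    print(st_reliability(3, [(0, 1, 0.9), (1, 2, 0.9), (0, 2, 0.9)], 0, 2))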

    Exponential Time Complexity of the Permanent and the Tutte Polynomial

    We show conditional lower bounds for well-studied #P-hard problems: (a) The number of satisfying assignments of a 2-CNF formula with n variables cannot be counted in time exp(o(n)), and the same is true for computing the number of all independent sets in an n-vertex graph. (b) The permanent of an n x n matrix with entries 0 and 1 cannot be computed in time exp(o(n)). (c) The Tutte polynomial of an n-vertex multigraph cannot be computed in time exp(o(n)) at most evaluation points (x,y), and for simple graphs it cannot be computed in time exp(o(n/polylog n)). Our lower bounds are relative to (variants of) the Exponential Time Hypothesis (ETH), which says that the satisfiability of n-variable 3-CNF formulas cannot be decided in time exp(o(n)). We relax this hypothesis by introducing its counting version #ETH, namely that the satisfying assignments cannot be counted in time exp(o(n)). In order to use #ETH for our lower bounds, we transfer the sparsification lemma for d-CNF formulas to the counting setting.
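
    For context on result (b): the permanent of an n x n 0-1 matrix can be computed in roughly 2^n time with Ryser's inclusion-exclusion formula, and the lower bound says that, under #ETH, no exp(o(n))-time algorithm can do asymptotically better. Below is a minimal Python sketch of Ryser's formula; the function name and the 3 x 3 example are illustrative, not from the paper.

    from itertools import combinations

    def permanent(a):
        """Ryser's formula: perm(A) = (-1)^n * sum over non-empty column
        subsets S of (-1)^|S| * prod_i sum_{j in S} a[i][j]."""
        n = len(a)
        total = 0
        for k in range(1, n + 1):
            for cols in combinations(range(n), k):
                prod = 1
                for row in a:
                    prod *= sum(row[j] for j in cols)
                total += (-1) ** k * prod
        return (-1) ** n * total

    # The all-ones 3 x 3 matrix has permanent 3! = 6.
    print(permanent([[1, 1, 1], [1, 1, 1], [1, 1, 1]]))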

    Improved bounds and algorithms for graph cuts and network reliability

    Karger (SIAM Journal on Computing, 1999) developed the first fully-polynomial approximation scheme to estimate the probability that a graph $G$ becomes disconnected, given that its edges are removed independently with probability $p$. This algorithm runs in $n^{5+o(1)} \epsilon^{-3}$ time to obtain an estimate within relative error $\epsilon$. We improve this runtime through algorithmic and graph-theoretic advances. First, there is a certain key sub-problem encountered by Karger for which a generic estimation procedure is employed; we show that this sub-problem has a special structure for which a much more efficient algorithm can be used. Second, we show better bounds on the number of edge cuts which are likely to fail. Here, Karger's analysis uses a variety of bounds for various graph parameters; we show that these bounds cannot be simultaneously tight. We describe a new graph parameter which simultaneously influences all the bounds used by Karger, and obtain much tighter estimates of the cut structure of $G$. These techniques allow us to improve the runtime to $n^{3+o(1)} \epsilon^{-2}$; our results also rigorously prove certain experimental observations of Karger & Tai (Proc. ACM-SIAM Symposium on Discrete Algorithms, 1997). Our rigorous proofs are motivated by certain non-rigorous differential-equation approximations which, however, provably track the worst-case trajectories of the relevant parameters. A key driver of Karger's approach (and other cut-related results) is a bound on the number of small cuts: we improve these estimates when the min-cut size is "small" and odd, augmenting, in part, a result of Bixby (Bulletin of the AMS, 1974).
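
    The quantity being estimated above is the probability that $G$ becomes disconnected when each edge fails independently with probability $p$. The Python sketch below estimates it by naive Monte Carlo sampling; this is not Karger's algorithm or the improved scheme of the paper (plain sampling needs far too many trials when the failure probability is tiny), and the graph, trial count, and function name are illustrative assumptions.

    import random

    def disconnection_probability(n, edges, p, trials=100_000, seed=0):
        """Fraction of sampled edge-failure patterns (each edge removed
        independently with probability p) that disconnect the graph.
        Assumes the input graph on vertices 0..n-1 is connected."""
        rng = random.Random(seed)
        failures = 0
        for _ in range(trials):
            adj = [[] for _ in range(n)]
            for u, v in edges:
                if rng.random() >= p:  # edge survives with probability 1 - p
                    adj[u].append(v)
                    adj[v].append(u)
            # DFS from vertex 0; a missed vertex means this sample is disconnected.
            stack, seen = [0], {0}
            while stack:
                x = stack.pop()
                for y in adj[x]:
                    if y not in seen:
                        seen.add(y)
                        stack.append(y)
            if len(seen) < n:
                failures += 1
        return failures / trials

    # A 4-cycle with edge failure probability 0.2.
    print(disconnection_probability(4, [(0, 1), (1, 2), (2, 3), (3, 0)], 0.2))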