
    IST Austria Technical Report

    We consider the core algorithmic problems related to verification of systems with respect to three classical quantitative properties, namely the mean-payoff property, the ratio property, and the minimum initial credit for energy property. The algorithmic problem, given a graph and a quantitative property, asks to compute the optimal value (the infimum value over all traces) from every node of the graph. We consider graphs with constant treewidth, and it is well known that the control-flow graphs of most programs have constant treewidth. Let $n$ denote the number of nodes of a graph, $m$ the number of edges (for constant-treewidth graphs $m = O(n)$), and $W$ the largest absolute value of the weights. Our main theoretical results are as follows. First, for constant-treewidth graphs we present an algorithm that approximates the mean-payoff value within a multiplicative factor of $\epsilon$ in time $O(n \cdot \log(n/\epsilon))$ and linear space, as compared to the classical algorithms that require quadratic time. Second, for the ratio property we present an algorithm that for constant-treewidth graphs works in time $O(n \cdot \log(|a \cdot b|)) = O(n \cdot \log(n \cdot W))$, when the output is $\frac{a}{b}$, as compared to the previously best known algorithm with running time $O(n^2 \cdot \log(n \cdot W))$. Third, for the minimum initial credit problem we show that (i) for general graphs the problem can be solved in $O(n^2 \cdot m)$ time and the associated decision problem can be solved in $O(n \cdot m)$ time, improving the previously known $O(n^3 \cdot m \cdot \log(n \cdot W))$ and $O(n^2 \cdot m)$ bounds, respectively; and (ii) for constant-treewidth graphs we present an algorithm that requires $O(n \cdot \log n)$ time, improving the previously known $O(n^4 \cdot \log(n \cdot W))$ bound. We have implemented some of our algorithms and show that they present a significant speedup on standard benchmarks.
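
    For context on the quadratic-time classical baseline mentioned above: in a strongly connected graph, the (infimum) mean-payoff value from every node equals the minimum cycle mean, which Karp's classical algorithm computes in $O(n \cdot m)$ time, i.e., quadratic time when $m = O(n)$. The following is a minimal sketch of that baseline, not of the paper's treewidth-based algorithm; the encoding of edges as (u, v, w) triples over vertices 0..n-1 is an assumption of the sketch.

        INF = float("inf")

        def min_cycle_mean(n, edges):
            # D[k][v] = minimum weight of a walk of length exactly k from
            # vertex 0 to v (the graph is assumed strongly connected).
            D = [[INF] * n for _ in range(n + 1)]
            D[0][0] = 0.0
            for k in range(1, n + 1):
                for u, v, w in edges:
                    if D[k - 1][u] + w < D[k][v]:
                        D[k][v] = D[k - 1][u] + w
            # Karp's formula: mu* = min_v max_k (D[n][v] - D[k][v]) / (n - k).
            best = INF
            for v in range(n):
                if D[n][v] == INF:
                    continue
                worst = max((D[n][v] - D[k][v]) / (n - k)
                            for k in range(n) if D[k][v] < INF)
                best = min(best, worst)
            return best

        # The 2-cycle 0 -> 1 -> 0 has weight 1 - 3 = -2, hence mean -1.
        edges = [(0, 1, 1), (1, 2, 1), (2, 0, 1), (1, 0, -3)]
        print(min_cycle_mean(3, edges))  # -1.0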

    Negative-Weight Single-Source Shortest Paths in Near-Linear Time: Now Faster!

    In this work we revisit the fundamental Single-Source Shortest Paths (SSSP) problem with possibly negative edge weights. A recent breakthrough result by Bernstein, Nanongkai and Wulff-Nilsen established a near-linear $O(m \log^8(n) \log(W))$-time algorithm for negative-weight SSSP, where $W$ is an upper bound on the magnitude of the smallest negative edge weight. In this work we improve the running time to $O(m \log^2(n) \log(nW) \log\log n)$, which is an improvement by nearly six log-factors. Some of these log-factors are easy to shave (e.g., replacing the priority queue used in Dijkstra's algorithm), while others are significantly more involved (e.g., to find negative cycles we design an algorithm reminiscent of noisy binary search and analyze it with drift analysis). As side results, we obtain an algorithm to compute the minimum cycle mean in the same running time, as well as a new construction for computing Low-Diameter Decompositions in directed graphs.
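
    For contrast with the near-linear bounds above, here is a hedged sketch of the textbook $O(n \cdot m)$ Bellman-Ford baseline for negative-weight SSSP, with the standard extra relaxation pass that witnesses a reachable negative cycle. The (u, v, w) edge-list encoding is an assumption of the sketch, and none of the scaling or decomposition machinery of the algorithms above appears here.

        INF = float("inf")

        def bellman_ford(n, edges, source):
            dist = [INF] * n
            dist[source] = 0
            # n - 1 relaxation passes suffice when no negative cycle is
            # reachable from the source.
            for _ in range(n - 1):
                changed = False
                for u, v, w in edges:
                    if dist[u] + w < dist[v]:
                        dist[v] = dist[u] + w
                        changed = True
                if not changed:
                    break
            # One extra pass: any further improvement exposes a negative cycle.
            for u, v, w in edges:
                if dist[u] + w < dist[v]:
                    raise ValueError("negative cycle reachable from source")
            return dist

        print(bellman_ford(4, [(0, 1, 2), (1, 2, -5), (2, 3, 1), (0, 3, 4)], 0))
        # [0, 2, -3, -2]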

    IST Austria Thesis

    This dissertation focuses on algorithmic aspects of program verification, and presents modeling and complexity advances on several problems related to the static analysis of programs, the stateless model checking of concurrent programs, and the competitive analysis of real-time scheduling algorithms. Our contributions can be broadly grouped into five categories.

    Our first contribution is a set of new algorithms and data structures for the quantitative and data-flow analysis of programs, based on the graph-theoretic notion of treewidth. It has been observed that the control-flow graphs of typical programs have special structure, and are characterized as graphs of small treewidth. We utilize this structural property to provide faster algorithms for the quantitative and data-flow analysis of recursive and concurrent programs. In most cases we give an algebraic treatment of the considered problem, from which several interesting analyses, such as reachability, shortest path, and certain kinds of data-flow analysis, follow as special cases. We exploit the constant-treewidth property to obtain algorithmic improvements for on-demand versions of the problems, and provide data structures with various tradeoffs between the resources spent in the preprocessing and querying phases. We also improve on the algorithmic complexity of quantitative problems outside the algebraic path framework, namely of the minimum mean-payoff, minimum ratio, and minimum initial credit for energy problems.

    Our second contribution is a set of algorithms for Dyck reachability, with applications to data-dependence analysis and alias analysis. In particular, we develop an optimal algorithm for Dyck reachability on bidirected graphs, which are ubiquitous in context-insensitive, field-sensitive points-to analysis. Additionally, we develop an efficient algorithm for context-sensitive data-dependence analysis via Dyck reachability, where the task is to obtain analysis summaries of library code in the presence of callbacks. Our algorithm preprocesses libraries in almost linear time, after which the contribution of the library to the complexity of the client analysis is (i) linear in the number of call sites and (ii) only logarithmic in the size of the whole library, as opposed to linear in the size of the whole library. Finally, we prove that Dyck reachability is Boolean Matrix Multiplication-hard in general, and the hardness also holds for graphs of constant treewidth. This hardness result strongly indicates that there exist no combinatorial algorithms for Dyck reachability with truly subcubic complexity.

    Our third contribution is the formalization and algorithmic treatment of the Quantitative Interprocedural Analysis framework. In this framework, the transitions of a recursive program are annotated as good, bad or neutral, and receive a weight which measures the magnitude of their respective effect. The Quantitative Interprocedural Analysis problem asks to determine whether there exists an infinite run of the program where the long-run ratio of the bad weights over the good weights is above a given threshold. We illustrate how several quantitative problems related to the static analysis of recursive programs can be instantiated in this framework, and present some case studies in this direction.

    Our fourth contribution is a new dynamic partial-order reduction for the stateless model checking of concurrent programs. Traditional approaches rely on the standard Mazurkiewicz equivalence between traces: they partition the trace space into equivalence classes and attempt to explore a few representatives from each class. We present a new dynamic partial-order reduction method called Data-centric Partial Order Reduction (DC-DPOR). Our algorithm is based on a new equivalence between traces, called observation equivalence. DC-DPOR explores a coarser partitioning of the trace space than any exploration method based on the standard Mazurkiewicz equivalence; depending on the program, the new partitioning can be even exponentially coarser. Additionally, DC-DPOR spends only polynomial time in each explored class.

    Our fifth contribution is the use of automata- and game-theoretic verification techniques in the competitive analysis and synthesis of real-time scheduling algorithms for firm-deadline tasks. On the analysis side, we leverage automata on infinite words to compute the competitive ratio of real-time schedulers subject to various environmental constraints. On the synthesis side, we introduce a new instance of two-player mean-payoff partial-information games, and show how the synthesis of an optimal real-time scheduler can be reduced to computing winning strategies in this new type of game.
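
    As an illustration of the second contribution above: Dyck reachability on bidirected graphs induces an equivalence on nodes, and its core closure rule says that if $v$ and $v'$ are already equivalent and there are edges $u \to v$ and $u' \to v'$ carrying the same opening parenthesis $(_k$, then $u$ and $u'$ are equivalent too, since going from $u$ over the $(_k$ edge, then from $v$ to $v'$ by a balanced string, then back over the reversed edge to $u'$ spells $(_k \cdots )_k$, which is again balanced. The naive quadratic fixpoint below only illustrates this rule under an assumed (u, label, v) edge encoding; it is not the thesis's optimal algorithm.

        def bidirected_dyck_classes(nodes, open_edges):
            # open_edges: (u, label, v) triples, one per opening-parenthesis
            # edge from u to v; the matching closing edge from v back to u
            # is implicit (bidirectedness).
            parent = {v: v for v in nodes}

            def find(x):
                while parent[x] != x:
                    parent[x] = parent[parent[x]]  # path halving
                    x = parent[x]
                return x

            def union(a, b):
                ra, rb = find(a), find(b)
                if ra == rb:
                    return False
                parent[ra] = rb
                return True

            # Fixpoint of the closure rule: equivalent targets reached over
            # equal opening labels force their sources to be merged as well.
            changed = True
            while changed:
                changed = False
                seen = {}  # (class of target, label) -> class of one source
                for u, label, v in open_edges:
                    key = (find(v), label)
                    if key in seen:
                        changed |= union(seen[key], u)
                    else:
                        seen[key] = find(u)
            return {v: find(v) for v in nodes}

        # Both x and y reach p over the same opening label "f", so they
        # merge; the merged class then forces s and t to merge via "g".
        classes = bidirected_dyck_classes(
            ["x", "y", "p", "s", "t"],
            [("x", "f", "p"), ("y", "f", "p"), ("s", "g", "x"), ("t", "g", "y")],
        )
        print(classes)  # x and y share a representative, as do s and t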

    Approximating the minimum cycle mean

    We consider directed graphs where each edge is labeled with an integer weight and study the fundamental algorithmic question of computing the value of a cycle with minimum mean weight. Our contributions are twofold. (1) First, we show that the algorithmic question is reducible to the problem of a logarithmic number of min-plus matrix multiplications of $n \times n$ matrices, where $n$ is the number of vertices of the graph. (2) Second, when the weights are nonnegative, we present the first $(1+\epsilon)$-approximation algorithm for the problem, and the running time of our algorithm is $\tilde{O}(n^\omega \log^3(nW/\epsilon)/\epsilon)$, where $O(n^\omega)$ is the time required for the classic $n \times n$ matrix multiplication and $W$ is the maximum value of the weights. With an additional $O(\log(nW/\epsilon))$ factor in space, a cycle with approximately optimal weight can be computed within the same time bound.
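
    The reduction in contribution (1) rests on a standard identity: the minimum cycle mean equals $\min_{1 \le k \le n} \min_v (A^k)[v,v] / k$, where $A^k$ is the $k$-th min-plus power of the weight matrix, because every closed walk's mean is at least the mean of some simple cycle inside it, and the optimal simple cycle has length at most $n$. The naive sketch below computes every power, at $O(n^4)$ total cost, purely to illustrate the identity; the paper's contribution is to get away with only logarithmically many min-plus products. The matrix encoding with INF for missing edges is an assumption of the sketch.

        INF = float("inf")

        def min_plus_product(A, B):
            n = len(A)
            C = [[INF] * n for _ in range(n)]
            for i in range(n):
                for k in range(n):
                    if A[i][k] == INF:
                        continue
                    for j in range(n):
                        if A[i][k] + B[k][j] < C[i][j]:
                            C[i][j] = A[i][k] + B[k][j]
            return C

        def min_cycle_mean_naive(A):
            n = len(A)
            best, P = INF, A
            for k in range(1, n + 1):
                if k > 1:
                    P = min_plus_product(P, A)
                # P[v][v] is now the cheapest closed walk of length exactly k.
                for v in range(n):
                    if P[v][v] < INF:
                        best = min(best, P[v][v] / k)
            return best

        # The 2-cycle 0 <-> 1 has weight 1 - 3 = -2, hence mean -1.
        A = [[INF, 1, INF], [-3, INF, 1], [1, INF, INF]]
        print(min_cycle_mean_naive(A))  # -1.0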