
    Non-polynomial Worst-Case Analysis of Recursive Programs

    We study the problem of developing efficient approaches for proving worst-case bounds of non-deterministic recursive programs. Ranking functions are sound and complete for proving termination and worst-case bounds of non-recursive programs. First, we apply ranking functions to recursion, resulting in measure functions. We show that measure functions provide a sound and complete approach to prove worst-case bounds of non-deterministic recursive programs. Our second contribution is the synthesis of measure functions in non-polynomial forms. We show that non-polynomial measure functions with logarithm and exponentiation can be synthesized through abstraction of logarithmic or exponentiation terms, Farkas' Lemma, and Handelman's Theorem using linear programming. While previous methods obtain worst-case polynomial bounds, our approach can synthesize bounds of the form $\mathcal{O}(n\log n)$ as well as $\mathcal{O}(n^r)$ where $r$ is not an integer. We present experimental results to demonstrate that our approach can efficiently obtain worst-case bounds of classical recursive algorithms such as (i) Merge-Sort and the divide-and-conquer algorithm for the Closest-Pair problem, where we obtain an $\mathcal{O}(n \log n)$ worst-case bound, and (ii) Karatsuba's algorithm for polynomial multiplication and Strassen's algorithm for matrix multiplication, where we obtain an $\mathcal{O}(n^r)$ bound such that $r$ is not an integer and is close to the best-known bounds for the respective algorithms. Comment: 54 pages, full version to CAV 201
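
    The paper's bounds come from synthesizing measure functions symbolically via Farkas' Lemma and Handelman's Theorem; as a much simpler, purely numeric illustration of what such a non-polynomial bound asserts, the sketch below checks whether a candidate function f(n) = a * n * log2(n) dominates a merge-sort-style recurrence over a finite range of n. The coefficient values and the finite check range are arbitrary assumptions for illustration, not anything from the paper.

```python
import math

# Naive numeric illustration (not the paper's Farkas/Handelman synthesis):
# check whether f(n) = a * n * log2(n) dominates the merge-sort recurrence
#   T(n) <= T(ceil(n/2)) + T(floor(n/2)) + n,   T(1) = 0
# for a candidate coefficient `a` over a finite range of n.

def candidate(n, a=1.0):
    return a * n * math.log2(n) if n > 1 else 0.0

def satisfies_recurrence(a, max_n=100_000):
    """Check f(n) >= f(ceil(n/2)) + f(floor(n/2)) + n for 2 <= n <= max_n."""
    for n in range(2, max_n + 1):
        lhs = candidate(n, a)
        rhs = candidate((n + 1) // 2, a) + candidate(n // 2, a) + n
        if lhs + 1e-6 < rhs:
            return False, n   # candidate too small: inequality violated at n
    return True, None

if __name__ == "__main__":
    for a in (0.5, 1.0, 2.0):
        ok, bad_n = satisfies_recurrence(a)
        print(f"a = {a}: {'holds' if ok else f'fails at n = {bad_n}'}")
```

    A finite check like this is of course no proof; the paper's contribution is to establish such inequalities symbolically, for all inputs, via linear programming.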

    Beyond Worst-Case Analysis for Joins with Minesweeper

    We describe a new algorithm, Minesweeper, that is able to satisfy stronger runtime guarantees than previous join algorithms (colloquially, "beyond worst-case guarantees") for data in indexed search trees. Our first contribution is developing a framework to measure this stronger notion of complexity, which we call certificate complexity, that extends notions of Barbay et al. and Demaine et al.; a certificate is a set of propositional formulae that certifies that the output is correct. This notion captures a natural class of join algorithms. In addition, the certificate allows us to define a strictly stronger notion of runtime complexity than traditional worst-case guarantees. Our second contribution is to develop a dichotomy theorem for the certificate-based notion of complexity. Roughly, we show that Minesweeper evaluates $\beta$-acyclic queries in time linear in the certificate plus the output size, while for any $\beta$-cyclic query there is some instance that takes superlinear time in the certificate (and for which the output is no larger than the certificate size). We also extend our certificate-complexity analysis to queries with bounded treewidth and the triangle query. Comment: This is the full version of our PODS'2014 paper.
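
    The Minesweeper algorithm itself works over indexed search trees; as a loose illustration of the underlying "certificate" intuition only, the toy sketch below intersects two sorted lists by leapfrogging with binary search, so that the number of top-level probes tracks how the inputs interleave rather than their total size. The function name and the probe counter are illustrative inventions, not part of the paper.

```python
from bisect import bisect_left

def sorted_intersection(a, b):
    """Intersect two sorted lists by leapfrogging with binary search.

    Toy illustration of the 'beyond worst-case' idea: the number of
    top-level probes is driven by how the two lists interleave (a
    certificate-like quantity), not by their total length. This is not
    the Minesweeper algorithm itself.
    """
    out, probes = [], 0
    i = j = 0
    while i < len(a) and j < len(b):
        probes += 1
        if a[i] == b[j]:
            out.append(a[i])
            i += 1
            j += 1
        elif a[i] < b[j]:
            # skip past everything in a that is smaller than b[j]
            i = bisect_left(a, b[j], i + 1)
        else:
            j = bisect_left(b, a[i], j + 1)
    return out, probes

if __name__ == "__main__":
    a = list(range(0, 1_000_000, 2))      # even numbers
    b = [10, 500_000, 999_998]            # tiny second list
    result, probes = sorted_intersection(a, b)
    print(result, probes)                 # few probes despite len(a) = 500,000
```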

    Stochastic optimization and worst-case analysis in monetary policy design

    In this paper, we examine the cost of insurance against model uncertainty for the Euro area, considering four alternative reference models, all of which are used for policy analysis at the ECB. We find that maximal insurance across this model range in terms of a Minimax policy comes at moderate costs in terms of lower expected performance. We extract priors that would rationalize the Minimax policy from a Bayesian perspective. These priors indicate that full insurance is strongly oriented towards the model with the highest baseline losses. Furthermore, this policy is not as tolerant towards small perturbations of policy parameters as the Bayesian policy rule. We propose to strike a compromise and use preferences for policy design that allow for intermediate degrees of ambiguity aversion. These preferences allow the specification of priors but also give extra weight to the worst uncertain outcomes in a given context. JEL Classification: E52, E58, E6
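
    As a rough numerical sketch of the Minimax-versus-Bayesian comparison described above, the toy below pits a handful of hypothetical quadratic loss functions (standing in for the four reference models, which are not reproduced here) against a scalar policy parameter, and compares the policy minimizing worst-case loss with the one minimizing prior-weighted expected loss. All model and prior choices are invented for illustration.

```python
import numpy as np

# Toy sketch (hypothetical losses, not the ECB models used in the paper):
# each "model" assigns a quadratic loss to a scalar policy parameter theta.
# Compare the Minimax policy with a Bayesian policy under a uniform prior.

models = {
    "model_A": lambda th: (th - 1.0) ** 2,
    "model_B": lambda th: 2.0 * (th - 0.2) ** 2,
    "model_C": lambda th: 0.5 * (th + 0.5) ** 2 + 1.0,   # high baseline loss
    "model_D": lambda th: (th - 0.8) ** 2,
}

grid = np.linspace(-2.0, 2.0, 2001)
losses = np.array([[f(th) for th in grid] for f in models.values()])

idx_minimax = np.argmin(losses.max(axis=0))    # min over theta of max over models
prior = np.full(len(models), 1.0 / len(models))
idx_bayes = np.argmin(prior @ losses)          # min over theta of expected loss

print(f"Minimax policy:  theta = {grid[idx_minimax]:.3f}")
print(f"Bayesian policy: theta = {grid[idx_bayes]:.3f}")
print(f"Worst-case loss: minimax {losses[:, idx_minimax].max():.3f} "
      f"vs Bayesian {losses[:, idx_bayes].max():.3f}")
```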

    Worst-Case Analysis of Weber's Algorithm

    11 pages. Recently, Ken Weber introduced an algorithm for finding the $(a,b)$-pairs satisfying $au+bv\equiv 0 \pmod{k}$, with $0<|a|,|b|<\sqrt{k}$, where $u$ and $v$ are each coprime to $k$. It is based on Sorenson's and Jebelean's "$k$-ary reduction" algorithms. We provide a formula for $N(k)$, the maximal number of iterations in the loop of Weber's GCD algorithm.
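
    For small moduli, the pairs in question can be found by exhaustive search; the sketch below does exactly that, purely to illustrate the objects Weber's algorithm computes. It does not implement the k-ary reduction approach, and the example inputs are arbitrary.

```python
import math

def find_pair(u, v, k):
    """Brute-force search for a pair (a, b) with 0 < |a|, |b| < sqrt(k)
    and a*u + b*v congruent to 0 modulo k.

    Only an illustration of the objects Weber's algorithm computes; the
    algorithm itself relies on Sorenson/Jebelean-style k-ary reduction,
    not exhaustive search.
    """
    assert math.gcd(u, k) == 1 and math.gcd(v, k) == 1
    bound = math.isqrt(k - 1)          # largest integer strictly less than sqrt(k)
    for a in range(-bound, bound + 1):
        if a == 0:
            continue
        for b in range(-bound, bound + 1):
            if b != 0 and (a * u + b * v) % k == 0:
                return a, b
    return None

if __name__ == "__main__":
    # e.g. a*7 + b*5 congruent to 0 (mod 12) with |a|, |b| < sqrt(12)
    print(find_pair(7, 5, 12))
```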

    Worst-case analysis of identification - BIBO robustness for closed loop data

    This paper deals with the worst-case analysis of identification of linear shift-invariant (possibly infinite-dimensional) systems. A necessary and sufficient input richness condition for the existence of robustly convergent identification algorithms in $\ell_1$ is given. A closed-loop identification setting is studied to cover both stable and unstable (but BIBO-stabilizable) systems. Identification (or modeling) error is then measured by distance functions which lead to the weakest convergence notions for systems such that closed-loop stability, in the sense of BIBO stability, is a robust property. Worst-case modeling error bounds in several distance functions are included.
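
    As a minimal numeric aside on the BIBO notion used above (not the paper's identification algorithms): a discrete-time LTI system is BIBO stable exactly when its impulse response is absolutely summable, and the sketch below tabulates truncated l1 norms of first-order impulse responses to show the stable/unstable split. The pole values and truncation length are arbitrary.

```python
import numpy as np

def truncated_l1_norm(pole, n_terms=1000):
    """Truncated l1 norm of the impulse response h[n] = pole**n (n >= 0)
    of the first-order system y[n] = pole*y[n-1] + u[n].

    BIBO stability of an LTI system is equivalent to an absolutely
    summable impulse response: |pole| < 1 gives a finite limit
    1/(1 - |pole|), while |pole| >= 1 diverges as n_terms grows.
    """
    n = np.arange(n_terms)
    return np.abs(pole ** n).sum()

if __name__ == "__main__":
    for p in (0.5, 0.9, 0.99, 1.0, 1.1):
        print(f"pole = {p:4}:  truncated l1 norm = {truncated_l1_norm(p):.6g}")
```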

    Worst-case analysis of heap allocations

    In object-oriented languages, dynamic memory allocation is a fundamental concept. When using such a language in hard real-time systems, it becomes important to bound both the worst-case execution time and the worst-case memory consumption. In this paper, we present an analysis to determine the worst-case heap allocations of tasks. The analysis builds upon techniques that are well established for worst-case execution time analysis. The difference is that the cost function is not the execution time of instructions in clock cycles, but the allocation in bytes. In contrast to worst-case execution time analysis, worst-case heap allocation analysis is not processor dependent. However, the cost function depends on the object layout of the runtime system. The analysis is evaluated with several real-time benchmarks to establish the usefulness of the analysis and to compare the memory consumption of different object layouts.
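
    As a hedged illustration of costing allocations instead of cycles, the sketch below computes a worst-case allocation bound over a toy structured control-flow model: sequential blocks add their costs, branches take the costlier arm, and loops multiply the body's cost by a user-supplied iteration bound. The node types, byte costs, and loop bounds are invented for the example and do not reproduce the paper's analysis.

```python
# Toy worst-case heap-allocation bound over a structured control-flow model.
# The per-block cost is bytes allocated, not clock cycles.

from dataclasses import dataclass
from typing import List, Union

@dataclass
class Block:          # straight-line code with a known allocation in bytes
    bytes_allocated: int

@dataclass
class Seq:            # sequential composition
    parts: List["Node"]

@dataclass
class Branch:         # if/else: worst case is the costlier arm
    then_part: "Node"
    else_part: "Node"

@dataclass
class Loop:           # loop with a statically known iteration bound
    bound: int
    body: "Node"

Node = Union[Block, Seq, Branch, Loop]

def worst_case_alloc(node: Node) -> int:
    if isinstance(node, Block):
        return node.bytes_allocated
    if isinstance(node, Seq):
        return sum(worst_case_alloc(p) for p in node.parts)
    if isinstance(node, Branch):
        return max(worst_case_alloc(node.then_part), worst_case_alloc(node.else_part))
    if isinstance(node, Loop):
        return node.bound * worst_case_alloc(node.body)
    raise TypeError(node)

if __name__ == "__main__":
    # a 16-byte allocation in a loop of at most 100 iterations, followed by
    # a branch allocating either a 24-byte or a 48-byte object
    task = Seq([Loop(100, Block(16)), Branch(Block(24), Block(48))])
    print(worst_case_alloc(task), "bytes")   # 100*16 + max(24, 48) = 1648
```

    A real analysis additionally needs loop bounds from annotations or static analysis and a cost model tied to the runtime's object layout, as the abstract notes.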