
    Generating all polynomial invariants in simple loops

    This paper presents a method for automatically generating all polynomial invariants in simple loops. It is first shown that the set of polynomials serving as loop invariants has the algebraic structure of an ideal. Based on this connection, a fixpoint procedure using operations on ideals and Gröbner basis constructions is proposed for finding all polynomial invariants. Most importantly, it is proved that the procedure terminates in at most m+1 iterations, where m is the number of program variables. The proof relies on showing that the irreducible components of the varieties associated with the ideals generated by the procedure either remain the same or increase their dimension at every iteration of the fixpoint procedure. This yields a correct and complete algorithm for inferring conjunctions of polynomial equalities as invariants. The method has been implemented in Maple using the Groebner package. The implementation has been used to automatically discover non-trivial invariants for several examples to illustrate the power of the technique.
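    Not the paper's fixpoint procedure itself, but a minimal sketch of the core ideal operation it relies on: using a Gröbner basis (here via sympy) to check that a candidate polynomial equality is an inductive invariant of a simple loop. The loop and the candidate invariant below are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (not the paper's full fixpoint procedure): use a Groebner
# basis to check that a candidate polynomial equality is an inductive loop
# invariant.  Hypothetical loop: i := i + 1; s := s + i, starting from
# i = 0, s = 0, with candidate invariant 2*s - i**2 - i = 0.
from sympy import symbols, groebner, expand

i, s = symbols("i s")

candidate = 2*s - i**2 - i                        # candidate invariant polynomial
G = groebner([candidate], i, s, order="lex")      # basis of the candidate ideal

# Effect of one loop iteration (simultaneous update i -> i + 1, s -> s + i + 1).
after_one_step = expand(candidate.subs({i: i + 1, s: s + i + 1}, simultaneous=True))

print(G.contains(after_one_step))    # True: the equality is preserved by the loop body
print(candidate.subs({i: 0, s: 0}))  # 0: it also holds in the initial state
```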

    Polynomial Invariants for Affine Programs

    We exhibit an algorithm to compute the strongest polynomial (or algebraic) invariants that hold at each location of a given affine program (i.e., a program having only non-deterministic, as opposed to conditional, branching, and all of whose assignments are given by affine expressions). Our main tool is an algebraic result of independent interest: given a finite set of rational square matrices of the same dimension, we show how to compute the Zariski closure of the semigroup that they generate.
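    As a loose illustration only (the paper computes the exact Zariski closure symbolically), the hypothetical sketch below samples elements of a matrix semigroup and searches, with exact rational linear algebra in sympy, for low-degree polynomial relations among their entries. Relations found this way are only guaranteed to vanish on the samples, not on the whole semigroup.

```python
# Naive sketch (hypothetical, not the paper's algorithm): sample products of
# the generators and look for low-degree polynomial relations among the matrix
# entries using exact rational linear algebra.
from itertools import product
from sympy import Matrix, Rational, symbols

gens = [Matrix([[1, 1], [0, 1]]), Matrix([[2, 0], [0, 1]])]   # example generators
a, b, c, d = symbols("a b c d")          # entries of a 2x2 semigroup element
max_word_len, max_deg = 4, 2

# Sample semigroup elements: all products of the generators up to a fixed length.
samples, frontier = [], [Matrix.eye(2)]
for _ in range(max_word_len):
    frontier = [M * g for M in frontier for g in gens]
    samples += frontier

# Exponent vectors of all monomials in (a, b, c, d) of total degree <= max_deg.
monos = [e for e in product(range(max_deg + 1), repeat=4) if sum(e) <= max_deg]

# Evaluation matrix: one row per sampled matrix, one column per monomial.
rows = []
for M in samples:
    pt = [Rational(x) for x in M]        # entries a, b, c, d of this sample
    rows.append([pt[0]**e0 * pt[1]**e1 * pt[2]**e2 * pt[3]**e3
                 for (e0, e1, e2, e3) in monos])
E = Matrix(rows)

# Each null-space vector gives a polynomial vanishing on every sampled element;
# for these generators the span contains c and d - 1, since all products are
# upper triangular with lower-right entry 1.
for v in E.nullspace():
    poly = sum(coef * a**e0 * b**e1 * c**e2 * d**e3
               for coef, (e0, e1, e2, e3) in zip(v, monos))
    print(poly)
```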

    A probabilistic interpretation of the Macdonald polynomials

    The two-parameter Macdonald polynomials are a central object of algebraic combinatorics and representation theory. We give a Markov chain on partitions of k whose eigenfunctions are the coefficients of the Macdonald polynomials when expanded in the power sum polynomials. The stationary distribution of the Markov chain is a new two-parameter family of measures on partitions: the (rescaled) inverse of the Macdonald weight. The uniform distribution on permutations and the Ewens sampling formula are special cases. The Markov chain is a version of the auxiliary variables algorithm of statistical physics. Properties of the Macdonald polynomials allow a sharp analysis of the running time. In natural cases, a bounded number of steps suffices for arbitrarily large k.
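    Not the auxiliary-variables chain analyzed in the paper, but a minimal Metropolis sketch targeting the Ewens measure on partitions, the special case named in the abstract; the uniform proposal and the parameters are illustrative assumptions.

```python
# Minimal sketch (hypothetical, not the paper's auxiliary-variables chain):
# a Metropolis chain on partitions of k whose stationary distribution is the
# Ewens measure, a special case mentioned in the abstract.
import random
from fractions import Fraction
from math import factorial

def partitions(n, max_part=None):
    """Yield partitions of n as non-increasing tuples."""
    max_part = n if max_part is None else max_part
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

def ewens_weight(lam, theta):
    """Unnormalized Ewens weight: theta**len(lam) / prod_j (j**a_j * a_j!)."""
    w = Fraction(theta) ** len(lam)
    for j in set(lam):
        a_j = lam.count(j)
        w /= Fraction(j) ** a_j * factorial(a_j)
    return w

def metropolis_ewens(k, theta, steps, seed=0):
    rng = random.Random(seed)
    space = list(partitions(k))
    state = space[0]
    for _ in range(steps):
        proposal = rng.choice(space)           # uniform independence proposal
        ratio = ewens_weight(proposal, theta) / ewens_weight(state, theta)
        if rng.random() < min(1, ratio):       # Metropolis acceptance step
            state = proposal
    return state

print(metropolis_ewens(k=8, theta=2, steps=1000))
```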

    Extension of information geometry for modelling non-statistical systems

    In this dissertation, an abstract formalism extending information geometry is introduced. This framework encompasses a broad range of modelling problems, including possible applications in machine learning and in the information-theoretic foundations of quantum theory. Its purely geometrical foundations make no use of probability theory, and very few assumptions about the data or the models are made. Starting only from a divergence function, a Riemannian geometrical structure consisting of a metric tensor and an affine connection is constructed and its properties are investigated. The relation to information geometry, and in particular to the geometry of exponential families of probability distributions, is also elucidated. It turns out that this geometrical framework offers a straightforward way to determine whether or not a parametrised family of distributions can be written in exponential form. Apart from the main theoretical chapter, the dissertation also contains a chapter of examples illustrating the application of the formalism and its geometric properties, a brief introduction to differential geometry, and a historical overview of the development of information geometry.
    Comment: PhD thesis, University of Antwerp. Advisors: Prof. dr. Jan Naudts and Prof. dr. Jacques Tempere. December 2014, 108 pages.
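    The standard construction alluded to here, a metric tensor obtained from a divergence function, can be made concrete with a small sympy computation: differentiating the Kullback-Leibler divergence of a Bernoulli family twice in its second argument and evaluating on the diagonal recovers the Fisher metric. This is a textbook special case, not code from the dissertation.

```python
# Minimal sketch (standard information-geometry construction, not specific to
# this dissertation): recover a metric tensor from a divergence by taking
# second derivatives in the second argument, evaluated on the diagonal.
from sympy import symbols, log, diff, simplify

p, q = symbols("p q", positive=True)

# KL divergence between Bernoulli(p) and Bernoulli(q).
D = p * log(p / q) + (1 - p) * log((1 - p) / (1 - q))

# Metric: g(p) = d^2/dq^2 D(p || q) evaluated at q = p.
g = diff(D, q, 2).subs(q, p)
print(simplify(g))                      # the Fisher metric of the Bernoulli family
print(simplify(g - 1 / (p * (1 - p))))  # 0: it equals 1/(p*(1-p))
```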

    A task-based approach to parallel parametric linear programming solving, and application to polyhedral computations

    Parametric linear programming is a central operation for polyhedral computations, as well as in certain control applications. Here we propose a task-based scheme for parallelizing it, with quasi-linear speedup over large problems. This type of parallel application is challenging because several tasks might be computing the same region. In this paper, we present the algorithm itself together with a parallel redundancy elimination algorithm, and conduct a thorough performance analysis.
    Comment: arXiv admin note: text overlap with arXiv:1904.0607
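    A hypothetical sketch of the task-level parallelism only: instead of the paper's region-based parametric solver with redundancy elimination, it solves independent LP instances for sampled parameter values in a process pool with scipy's linprog; the model and the parameter grid are made up for illustration.

```python
# Minimal sketch (hypothetical): the paper parallelizes a region-based
# parametric LP solver; here each sampled parameter value becomes one task
# that solves an ordinary LP, illustrating the task-based decomposition only.
from concurrent.futures import ProcessPoolExecutor
import numpy as np
from scipy.optimize import linprog

def solve_for_parameter(t):
    """Solve max (1+t)*x1 + (2-t)*x2 s.t. A x <= b, x >= 0, for one value of t."""
    c = np.array([-(1.0 + t), -(2.0 - t)])     # linprog minimizes, so negate
    A = np.array([[1.0, 1.0], [-1.0, 2.0]])
    b = np.array([4.0, 2.0])
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)])
    return t, -res.fun, res.x

if __name__ == "__main__":
    ts = np.linspace(0.0, 1.0, 16)             # sampled parameter values
    with ProcessPoolExecutor() as pool:        # each LP instance is one task
        for t, val, x in pool.map(solve_for_parameter, ts):
            print(f"t={t:.2f}  optimum={val:.3f}  x={x}")
```

    The optimal vertex changes with t in this toy model, which is the phenomenon a parametric LP solver tracks exactly via regions rather than by sampling.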