70 research outputs found

    Efficiently Computing Minimal Sets of Critical Pairs

    In the computation of a Gröbner basis using Buchberger's algorithm, a key issue for improving efficiency is to avoid as many unnecessary critical pairs as possible. A good solution would be to avoid _all_ non-minimal critical pairs, and hence to process only a _minimal_ set of generators of the module generated by the critical syzygies. In this paper we show how to obtain that desired solution in the homogeneous case while retaining the same efficiency as with the classical implementation. As a consequence, we get a new Optimized Buchberger Algorithm. Comment: LaTeX using elsart.cls, 27 pages
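    Since the paper is about pair selection inside Buchberger's algorithm, a minimal Python/SymPy sketch of the classical loop may help fix ideas. The coprime-leading-monomial test below is only Buchberger's first criterion for discarding unnecessary pairs; the paper's optimized algorithm, which processes a minimal set of critical pairs in the homogeneous case, is not reproduced here, and the example polynomials are made up.

```python
# Minimal Buchberger sketch (SymPy, default lex order) -- illustration only,
# not the paper's optimized algorithm.  The coprime-leading-monomial check
# is Buchberger's classical first criterion for skipping unnecessary pairs.
from sympy import symbols, expand, lcm, LT, LM, reduced, groebner

x, y = symbols('x y')

def s_polynomial(f, g, gens):
    # S(f, g) = (L/LT(f))*f - (L/LT(g))*g with L = lcm of the leading monomials
    L = lcm(LM(f, *gens), LM(g, *gens))
    return expand(L / LT(f, *gens) * f - L / LT(g, *gens) * g)

def buchberger(F, gens):
    G = list(F)
    pairs = [(i, j) for i in range(len(G)) for j in range(i + 1, len(G))]
    while pairs:
        i, j = pairs.pop()
        lmi, lmj = LM(G[i], *gens), LM(G[j], *gens)
        if lcm(lmi, lmj) == lmi * lmj:
            continue                      # coprime leading monomials: pair is unnecessary
        _, r = reduced(s_polynomial(G[i], G[j], gens), G, *gens)
        if r != 0:
            pairs.extend((k, len(G)) for k in range(len(G)))
            G.append(r)
    return G

if __name__ == '__main__':
    F = [x**2 + y, x*y - 1]
    print(buchberger(F, (x, y)))
    print(groebner(F, x, y))              # reduced basis from the library, for comparison
```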

    The fan of an experimental design


    A good leaf order on simplicial trees

    Using the existence of a good leaf in every simplicial tree, we order the facets of a simplicial tree so as to extract combinatorial information about the Betti numbers of its facet ideal. Applications include an Eliahou-Kervaire splitting of the ideal, as well as a refinement of a recursive formula of Hà and Van Tuyl for computing the graded Betti numbers of simplicial trees. Comment: 17 pages, to appear; Connections Between Algebra and Geometry, Birkhäuser volume (2013)
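    For readers who want to experiment with the combinatorics, the sketch below encodes facets as vertex sets, tests the good-leaf property, and lists the generators of the facet ideal. It assumes the usual characterization that a facet is a good leaf exactly when its intersections with the remaining facets are totally ordered by inclusion, and the example complex is made up; it is not code from the paper.

```python
# Hedged sketch: testing facets of a simplicial complex for the good-leaf
# property (assumed characterization: intersections with the other facets
# form a chain under inclusion) and listing the squarefree monomial
# generators of the facet ideal.
from itertools import combinations

def is_good_leaf(facet, facets):
    """facet: frozenset of vertices; facets: list of frozensets including facet."""
    intersections = [facet & g for g in facets if g != facet]
    return all(a <= b or b <= a for a, b in combinations(intersections, 2))

def facet_ideal_generators(facets):
    """Each facet contributes the product of its vertices; return them as sorted tuples."""
    return [tuple(sorted(f)) for f in facets]

if __name__ == "__main__":
    # A small simplicial tree: facets {1,2,3}, {3,4,5}, {5,6}
    facets = [frozenset(s) for s in ({1, 2, 3}, {3, 4, 5}, {5, 6})]
    for f in facets:
        print(sorted(f), "good leaf:", is_good_leaf(f, facets))
    print("facet ideal generators:", facet_ideal_generators(facets))
```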

    Reverse Engineering Time Discrete Finite Dynamical Systems: A Feasible Undertaking?

    With the advent of high-throughput profiling methods, interest in reverse engineering the structure and dynamics of biochemical networks is high. Recently an algorithm for reverse engineering of biochemical networks was developed by Laubenbacher and Stigler. It is a top-down approach using time-discrete dynamical systems. One of its key steps is the choice of a term order, a technicality imposed by the use of Gröbner-basis calculations. The aim of this paper is to identify minimal requirements on data sets to be used with this algorithm and to characterize optimal data sets. We found minimal requirements on a data set based on how many terms the functions to be reverse engineered display. Furthermore, we identified optimal data sets, which we characterized using a geometric property called “general position”. Moreover, we developed a constructive method to generate optimal data sets, provided a codimensional condition is fulfilled. In addition, we present a generalization of their algorithm that does not depend on the choice of a term order. For this method we derived a formula for the probability of finding the correct model, provided the data set used is optimal. We analyzed the asymptotic behavior of the probability formula for a growing number of variables n (i.e. interacting chemicals). Unfortunately, this probability converges to zero rapidly as n grows. Therefore, even if an optimal data set is used and the restrictions in using term orders are overcome, the reverse engineering problem remains infeasible unless prodigious amounts of data are available. Such large data sets are experimentally impossible to generate with today's technologies.
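    As a concrete, heavily simplified illustration of the data-requirements issue, the sketch below fits one coordinate update function of a finite dynamical system over F_p by solving a linear system in the monomial basis: with a complete data set the fit is unique, while fewer observations than monomials leave the model underdetermined, which is where term-order or probabilistic choices enter. This is a toy under those stated assumptions, not the Laubenbacher-Stigler algorithm, and the example update rule is made up.

```python
# Toy illustration (not the Laubenbacher-Stigler algorithm): fit one coordinate
# update function f: F_p^n -> F_p from observed (state, next-value) pairs by
# solving a linear system in the monomial basis over F_p.
import itertools

def monomials(n, p):
    """Exponent vectors (e_1, ..., e_n) with 0 <= e_i < p (since x^p = x on F_p)."""
    return list(itertools.product(range(p), repeat=n))

def eval_monomial(exps, point, p):
    v = 1
    for e, xi in zip(exps, point):
        v = (v * pow(xi, e, p)) % p
    return v

def solve_mod_p(A, b, p):
    """Gauss-Jordan elimination over F_p (p prime); returns one solution
    (free variables set to 0) or None if the data are inconsistent."""
    M = [row[:] + [bi % p] for row, bi in zip(A, b)]
    rows, cols = len(M), len(M[0]) - 1
    pivots, r = [], 0
    for c in range(cols):
        piv = next((i for i in range(r, rows) if M[i][c] % p), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        inv = pow(M[r][c], p - 2, p)          # modular inverse, valid since p is prime
        M[r] = [(v * inv) % p for v in M[r]]
        for i in range(rows):
            if i != r and M[i][c] % p:
                f = M[i][c]
                M[i] = [(vi - f * vr) % p for vi, vr in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
    if any(M[i][-1] % p for i in range(r, rows)):
        return None
    x = [0] * cols
    for i, c in enumerate(pivots):
        x[c] = M[i][-1]
    return x

if __name__ == "__main__":
    p, n = 2, 2
    states = list(itertools.product(range(p), repeat=n))      # all 4 states of F_2^2

    def truth(x1, x2):                                         # hidden update rule
        return (x1 * x2 + x1) % p

    data = [(s, truth(*s)) for s in states]                    # a complete data set
    mons = monomials(n, p)
    A = [[eval_monomial(m, s, p) for m in mons] for s, _ in data]
    b = [out for _, out in data]
    coeffs = solve_mod_p(A, b, p)
    print({m: c for m, c in zip(mons, coeffs) if c})           # expect x1 and x1*x2
```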

    Semidefinite Characterization and Computation of Real Radical Ideals

    For an ideal $I\subseteq\mathbb{R}[x]$ given by a set of generators, a new semidefinite characterization of its real radical $I(V_\mathbb{R}(I))$ is presented, provided it is zero-dimensional (even if $I$ is not). Moreover we propose an algorithm, using numerical linear algebra and semidefinite optimization techniques, to compute all (finitely many) points of the real variety $V_\mathbb{R}(I)$ as well as a set of generators of the real radical ideal. The latter is obtained in the form of a border or Gröbner basis. The algorithm is based on moment relaxations and, in contrast to other existing methods, it exploits the real algebraic nature of the problem right from the beginning and avoids the computation of complex components. Comment: 41 pages
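    The central object here, the moment matrix, can be made concrete with a small NumPy sketch. The example below builds the moment matrix of an atomic measure supported on two known real points and reads a vanishing polynomial from its kernel; the paper's contribution is to obtain such moment matrices by semidefinite optimization without knowing the points in advance, so this is only an illustration of the kernel-gives-real-radical idea, and the point set and degree bound are made up.

```python
# Illustration only: the kernel of the moment matrix of a measure supported on
# the real variety consists of coefficient vectors of polynomials vanishing on
# that variety, i.e. elements of the real radical up to the chosen degree.
# Here the measure comes from known sample points; the paper instead obtains
# suitable moments via semidefinite optimization.
import itertools
import numpy as np

def monomial_basis(n_vars, max_deg):
    """Exponent vectors of total degree <= max_deg, in graded order."""
    exps = [e for e in itertools.product(range(max_deg + 1), repeat=n_vars)
            if sum(e) <= max_deg]
    return sorted(exps, key=lambda e: (sum(e), e))

def moment_matrix(points, weights, basis):
    """M[i, j] = sum_k w_k * b_i(p_k) * b_j(p_k): moments of an atomic measure."""
    evals = np.array([[np.prod([pk ** ek for pk, ek in zip(p, b)]) for b in basis]
                      for p in points], dtype=float)
    return evals.T @ np.diag(weights) @ evals

if __name__ == "__main__":
    # Two real points, e.g. the real zeros of {x^2 - 1, y} in R^2.
    points = [(1.0, 0.0), (-1.0, 0.0)]
    weights = [0.5, 0.5]
    basis = monomial_basis(2, 1)                     # exponent vectors for 1, y, x
    M = moment_matrix(points, weights, basis)
    _, s, Vt = np.linalg.svd(M)
    kernel = Vt[s < 1e-9]                            # coefficient vectors of vanishing polynomials
    print("basis:", basis)
    print("kernel:", np.round(kernel, 3))            # expect the vector picking out y
```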

    A Dynamic Algorithm for Groebner basis computation
