
    Quadratization of Symmetric Pseudo-Boolean Functions

    A pseudo-Boolean function is a real-valued function $f(x)=f(x_1,x_2,\ldots,x_n)$ of $n$ binary variables; that is, a mapping from $\{0,1\}^n$ to $\mathbb{R}$. For a pseudo-Boolean function $f(x)$ on $\{0,1\}^n$, we say that $g(x,y)$ is a quadratization of $f$ if $g(x,y)$ is a quadratic polynomial depending on $x$ and on $m$ auxiliary binary variables $y_1,y_2,\ldots,y_m$ such that $f(x)=\min\{g(x,y) : y \in \{0,1\}^m\}$ for all $x \in \{0,1\}^n$. By means of quadratizations, minimization of $f$ is reduced to minimization (over its extended set of variables) of the quadratic function $g(x,y)$. This is of some practical interest because minimization of quadratic functions has been thoroughly studied for the last few decades, and much progress has been made in solving such problems exactly or heuristically. A related paper \cite{ABCG} initiated a systematic study of the minimum number of auxiliary $y$-variables required in a quadratization of an arbitrary function $f$ (a natural question, since the complexity of minimizing the quadratic function $g(x,y)$ depends, among other factors, on the number of binary variables). In this paper, we determine more precisely the number of auxiliary variables required by quadratizations of symmetric pseudo-Boolean functions $f(x)$, those functions whose value depends only on the Hamming weight of the input $x$ (the number of variables equal to 1).
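
    The definition is easy to check on a small instance. The sketch below (Python, names chosen for this example) brute-forces the standard single-auxiliary-variable quadratization of a negative monomial, $-x_1x_2\cdots x_n = \min_{y\in\{0,1\}} y\big((n-1) - \sum_i x_i\big)$, and verifies that minimizing over $y$ recovers $f(x)$ for every $x \in \{0,1\}^n$.

    ```python
    # Minimal sketch (helper names are assumptions, not from the paper): verify that
    # g(x, y) = y * ((n - 1) - sum(x)) is a quadratization of the negative monomial
    # f(x) = -x_1 * x_2 * ... * x_n, using a single auxiliary binary variable y.
    from itertools import product

    def f(x):
        # negative monomial: -prod(x_i)
        return -1 if all(x) else 0

    def g(x, y):
        # quadratic in (x, y)
        return y * ((len(x) - 1) - sum(x))

    n = 4
    for x in product((0, 1), repeat=n):
        assert f(x) == min(g(x, y) for y in (0, 1))
    print("g is a quadratization of f for n =", n)
    ```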

    Higher Order Energies for Image Segmentation

    A novel energy minimization method for general higher-order binary energy functions is proposed in this paper. We first relax a discrete higher-order function to a continuous one, and use the Taylor expansion to obtain an approximate lower-order function, which is optimized by quadratic pseudo-Boolean optimization (QPBO) or other discrete optimizers. The minimum solution of this lower-order function is then used as a new local point, around which we expand the original higher-order energy function again. Our algorithm is not restricted to any specific form of the higher-order binary function and does not introduce extra auxiliary variables. For concreteness, we show an application to segmentation with an appearance-entropy term, which is solved efficiently by our method. Experimental results demonstrate that our method outperforms state-of-the-art methods.
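
    As a rough illustration of the expand-and-minimize loop described above (not the paper's formulation: the toy energy is invented, and a closed-form first-order surrogate stands in for the second-order/QPBO step), the following sketch repeatedly expands a multilinear higher-order energy around the current labelling and minimizes the resulting surrogate.

    ```python
    # Illustrative sketch only: iterate first-order Taylor surrogates of a toy
    # higher-order pseudo-Boolean energy; each surrogate is separable and is
    # minimized exactly, standing in for the discrete-optimizer step.
    import numpy as np

    def energy(x):
        # toy higher-order binary energy: unary terms plus one cubic interaction
        return x[0] - 2 * x[1] + x[2] + 3 * x[0] * x[1] * x[2]

    def multilinear_gradient(x):
        # exact partial derivatives of the continuous (multilinear) relaxation at x
        x = x.astype(float)
        grad = np.zeros_like(x)
        for i in range(len(x)):
            hi, lo = x.copy(), x.copy()
            hi[i], lo[i] = 1.0, 0.0
            grad[i] = energy(hi) - energy(lo)
        return grad

    x = np.array([1, 1, 1])
    for _ in range(10):
        # minimizing the first-order surrogate: x_i = 1 iff its coefficient is negative
        x_new = (multilinear_gradient(x) < 0).astype(int)
        if np.array_equal(x_new, x):
            break
        x = x_new
    print("local minimizer:", x, "energy:", energy(x))
    ```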

    IST Austria Technical Report

    We introduce TopoCut: a new way to integrate knowledge about topological properties (TPs) into a random field image segmentation model. Instead of including TPs as additional constraints during minimization of the energy function, we devise an efficient algorithm for modifying the unary potentials such that the resulting segmentation is guaranteed to have the desired properties. Our method is more flexible in the sense that it handles more topology constraints than previous methods, which were only able to enforce pairwise or global connectivity. In particular, our method is very fast, making it possible for the first time to enforce global topological properties in practical image segmentation tasks.
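
    For readers unfamiliar with the mechanism the abstract refers to, the sketch below shows, in a generic way, how editing unary potentials can steer a binary random-field segmentation; the arrays, constants, and seed pixels are assumptions for illustration, not the TopoCut algorithm itself.

    ```python
    # Generic illustration only: biasing unary potentials so that chosen pixels are
    # effectively forced into the foreground before running any standard minimizer.
    import numpy as np

    H, W = 8, 8
    LARGE = 1e6                       # effectively a hard constraint

    # unary[r, c, label]: cost of assigning background (0) or foreground (1)
    unary = np.random.rand(H, W, 2)

    # pixels that a (hypothetical) topological prior requires to be foreground
    forced_fg = [(2, 3), (2, 4), (3, 4)]
    for (r, c) in forced_fg:
        unary[r, c, 0] = LARGE        # background assignment becomes prohibitively costly
        unary[r, c, 1] = 0.0

    # any energy minimizer run on the modified unaries keeps the seeds in the foreground;
    # here a unary-only minimizer is used purely for illustration
    labels = unary.argmin(axis=2)
    assert all(labels[r, c] == 1 for (r, c) in forced_fg)
    ```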

    Disjunctive normal shape Boltzmann machine

    The Shape Boltzmann machine (a type of Deep Boltzmann Machine) is a powerful tool for shape modelling; however, it has some drawbacks in representing local shape parts. The Disjunctive Normal Shape Model (DNSM) is a strong shape model that can effectively represent local parts of objects. In this paper, we propose a new shape model, based on the Shape Boltzmann Machine and the Disjunctive Normal Shape Model, which we call the Disjunctive Normal Shape Boltzmann Machine (DNSBM). DNSBM learns binary distributions of shapes by taking both local and global shape constraints into account using a type of Deep Boltzmann Machine. The samples generated using DNSBM look realistic. Moreover, DNSBM is capable of generating novel samples that differ from the training examples by exploiting the local shape representation capability of DNSM. We demonstrate the performance of DNSBM for shape completion on two different data sets in which the exploitation of local shape parts is important for capturing the statistical variability of the underlying shape distributions. Experimental results show that DNSBM is a strong model for representing shapes that are composed of local parts.
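
    As background only, the sketch below shows the basic binary energy model that Boltzmann-machine shape models are built from, with one block Gibbs step over a binary shape mask; the dimensions and variable names are assumptions, and the DNSBM architecture itself is more elaborate than this single-layer example.

    ```python
    # Background sketch: a restricted Boltzmann machine with energy
    # E(v, h) = -v^T W h - b^T v - c^T h over visible shape pixels v and hidden units h.
    import numpy as np

    rng = np.random.default_rng(0)
    n_visible, n_hidden = 32 * 32, 100          # e.g. a 32x32 binary shape mask
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b = np.zeros(n_visible)
    c = np.zeros(n_hidden)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def gibbs_step(v):
        # sample hidden units given visibles, then visibles given hiddens
        h = (sigmoid(v @ W + c) > rng.random(n_hidden)).astype(float)
        v = (sigmoid(h @ W.T + b) > rng.random(n_visible)).astype(float)
        return v, h

    v = rng.integers(0, 2, n_visible).astype(float)
    for _ in range(5):
        v, h = gibbs_step(v)
    ```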

    Generalized roof duality

    The roof dual bound for quadratic unconstrained binary optimization is the basis for several methods for efficiently computing the solution to many hard combinatorial problems. It works by constructing the tightest possible lower-bounding submodular function, and instead of minimizing the original objective function, this relaxation is minimized. However, for higher-order problems the technique has been less successful. A standard approach is to first reduce the problem to a quadratic one by introducing auxiliary variables and then apply the quadratic roof dual bound, but this may lead to loose bounds. We generalize the roof duality technique to higher-order optimization problems. Similarly to the quadratic case, optimal relaxations are defined to be the ones that give the maximum lower bound. We show how submodular relaxations can be constructed efficiently in order to compute the generalized roof dual bound for general cubic and quartic pseudo-Boolean functions. Further, we prove that important properties such as persistency still hold, which allows us to determine optimal values for some of the variables. From a practical point of view, we experimentally demonstrate that the technique outperforms the state of the art for a wide range of applications, both in terms of lower bounds and in the number of assigned variables.
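
    The key idea, that a submodular function lying below the objective yields a lower bound on its minimum, can be checked by brute force on a toy instance. In the sketch below the cubic function and its relaxation are invented for illustration and are not the paper's construction; the generalized roof dual is the best such bound over all submodular relaxations.

    ```python
    # Toy brute-force illustration: g is submodular, g(x) <= f(x) everywhere, so
    # min g is a lower bound on min f.
    from itertools import product

    def f(x):
        # non-submodular cubic pseudo-Boolean function (invented example)
        x1, x2, x3 = x
        return -x1 * x2 - x2 * x3 + 2 * x1 * x2 * x3

    def g(x):
        # candidate submodular relaxation: drop the positive cubic term
        x1, x2, x3 = x
        return -x1 * x2 - x2 * x3

    pts = list(product((0, 1), repeat=3))
    # g is a pointwise lower bound on f
    assert all(g(x) <= f(x) for x in pts)
    # g is submodular: g(x|y) + g(x&y) <= g(x) + g(y) for all x, y
    meet = lambda x, y: tuple(a & b for a, b in zip(x, y))
    join = lambda x, y: tuple(a | b for a, b in zip(x, y))
    assert all(g(join(x, y)) + g(meet(x, y)) <= g(x) + g(y) for x in pts for y in pts)

    print("lower bound:", min(map(g, pts)), "<= true minimum:", min(map(f, pts)))
    ```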