
    Generalized roof duality and bisubmodular functions

    Consider a convex relaxation $\hat f$ of a pseudo-boolean function $f$. We say that the relaxation is {\em totally half-integral} if $\hat f(x)$ is a polyhedral function with half-integral extreme points $x$, and this property is preserved after adding an arbitrary combination of constraints of the form $x_i = x_j$, $x_i = 1 - x_j$, and $x_i = \gamma$ where $\gamma \in \{0, 1, 1/2\}$ is a constant. A well-known example is the {\em roof duality} relaxation for quadratic pseudo-boolean functions $f$. We argue that total half-integrality is a natural requirement for generalizations of roof duality to arbitrary pseudo-boolean functions. Our contributions are as follows. First, we provide a complete characterization of totally half-integral relaxations $\hat f$ by establishing a one-to-one correspondence with {\em bisubmodular functions}. Second, we give a new characterization of bisubmodular functions. Finally, we show some relationships between general totally half-integral relaxations and relaxations based on roof duality. Comment: 14 pages. Shorter version to appear in NIPS 201
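    For readers unfamiliar with the term, the standard definition of bisubmodularity, which the abstract itself does not spell out, is the following; here $V$ is the ground set and the arguments range over pairs of disjoint subsets of $V$:

        A function $f : 3^V \to \mathbb{R}$, defined on pairs $(X, Y)$ of disjoint subsets of $V$, is bisubmodular if for all such pairs
        \[
          f(X_1, Y_1) + f(X_2, Y_2) \;\ge\; f\big((X_1, Y_1) \sqcap (X_2, Y_2)\big) + f\big((X_1, Y_1) \sqcup (X_2, Y_2)\big),
        \]
        where
        \[
          (X_1, Y_1) \sqcap (X_2, Y_2) = (X_1 \cap X_2,\; Y_1 \cap Y_2),
        \]
        \[
          (X_1, Y_1) \sqcup (X_2, Y_2) = \big((X_1 \cup X_2) \setminus (Y_1 \cup Y_2),\; (Y_1 \cup Y_2) \setminus (X_1 \cup X_2)\big).
        \]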

    Generalized roof duality

    The roof dual bound for quadratic unconstrained binary optimization is the basis for several methods for efficiently computing the solution to many hard combinatorial problems. It works by constructing the tightest possible lower-bounding submodular function, and instead of minimizing the original objective function, this relaxation is minimized. However, for higher-order problems the technique has been less successful. A standard technique is to first reduce the problem to a quadratic one by introducing auxiliary variables and then apply the quadratic roof dual bound, but this may lead to loose bounds. We generalize the roof duality technique to higher-order optimization problems. Similarly to the quadratic case, optimal relaxations are defined to be the ones that give the maximum lower bound. We show how submodular relaxations can be constructed efficiently in order to compute the generalized roof dual bound for general cubic and quartic pseudo-boolean functions. Further, we prove that important properties such as persistency still hold, which allows us to determine optimal values for some of the variables. From a practical point of view, we experimentally demonstrate that the technique outperforms the state of the art for a wide range of applications, both in terms of lower bounds and in the number of assigned variables.
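    As a concrete illustration of the quadratic roof dual bound, the sketch below computes it through a formulation commonly stated to be equivalent, namely the LP relaxation of the standard linearization of a QUBO, and compares it with the true minimum. The coefficients and variable names are invented for illustration and are not taken from the paper.

        # Hedged sketch: roof dual bound of a small QUBO via the LP relaxation of
        # the standard linearization (commonly stated to coincide with the roof
        # dual bound). All coefficients below are made up for illustration.
        import itertools
        import numpy as np
        from scipy.optimize import linprog

        # Minimize f(x) = c0 + sum_i c[i] x_i + sum_{i<j} q[(i,j)] x_i x_j over {0,1}^n.
        n = 3
        c0 = 0.0
        c = {0: 2.0, 1: -1.0, 2: 1.0}
        q = {(0, 1): -3.0, (0, 2): 2.0, (1, 2): -2.0}

        pairs = sorted(q)                      # linearization variables y_ij = x_i x_j
        num_vars = n + len(pairs)              # order: x_0..x_{n-1}, then one y per pair

        # Objective vector.
        obj = np.zeros(num_vars)
        for i, ci in c.items():
            obj[i] = ci
        for k, ij in enumerate(pairs):
            obj[n + k] = q[ij]

        # Inequalities A_ub @ v <= b_ub encoding y <= x_i, y <= x_j, y >= x_i + x_j - 1.
        A_ub, b_ub = [], []
        for k, (i, j) in enumerate(pairs):
            row = np.zeros(num_vars); row[n + k] = 1; row[i] = -1
            A_ub.append(row); b_ub.append(0.0)           # y_ij - x_i <= 0
            row = np.zeros(num_vars); row[n + k] = 1; row[j] = -1
            A_ub.append(row); b_ub.append(0.0)           # y_ij - x_j <= 0
            row = np.zeros(num_vars); row[n + k] = -1; row[i] = 1; row[j] = 1
            A_ub.append(row); b_ub.append(1.0)           # x_i + x_j - y_ij <= 1

        res = linprog(obj, A_ub=np.array(A_ub), b_ub=b_ub,
                      bounds=[(0, 1)] * num_vars, method="highs")
        lp_bound = c0 + res.fun

        # Brute-force minimum over {0,1}^n for comparison.
        def f(x):
            return (c0 + sum(c[i] * x[i] for i in c)
                    + sum(q[ij] * x[ij[0]] * x[ij[1]] for ij in q))

        opt = min(f(x) for x in itertools.product((0, 1), repeat=n))
        print(f"LP (roof dual) bound: {lp_bound:.3f}, true minimum: {opt:.3f}")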

    Discrete Optimization in Early Vision - Model Tractability Versus Fidelity

    Early vision is the process occurring before any semantic interpretation of an image takes place. Motion estimation, object segmentation and detection are all parts of early vision, but recognition is not. Some models in early vision are easy to perform inference with---they are tractable. Others describe reality well---they have high fidelity. This thesis improves the tractability-fidelity trade-off of the current state of the art by introducing new discrete methods for image segmentation and other problems of early vision. The first part studies pseudo-boolean optimization, both from a theoretical perspective and a practical one, by introducing new algorithms. The main result is the generalization of the roof duality concept to polynomials of degree higher than two. Another focus is parallelization; discrete optimization methods for multi-core processors, computer clusters, and graphics processing units are presented. Remaining in an image segmentation context, the second part studies parametric problems where a set of model parameters and a segmentation are estimated simultaneously. For a small number of parameters these problems can still be solved optimally. One application is an optimal method for solving the two-phase Mumford-Shah functional. The third part shifts the focus to curvature regularization---where the commonly used length and area penalization is replaced by curvature in two and three dimensions. These problems can be discretized over a mesh, and special attention is given to the mesh geometry. Specifically, hexagonal meshes in the plane are compared to square ones, and a method for generating adaptive meshes is introduced and evaluated. The framework is then extended to curvature regularization of surfaces. Finally, the thesis is concluded by three applications to early vision problems: cardiac MRI segmentation, image registration, and cell classification.
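    For context, the two-phase (piecewise-constant) Mumford-Shah functional mentioned above is usually written as follows, with the region $S$ and the two intensity constants $c_1, c_2$ estimated jointly; this is the standard form from the literature, not reproduced from the thesis itself:

        \[
          E(S, c_1, c_2) \;=\; \int_{S} \big(I(p) - c_1\big)^2 \, dp \;+\; \int_{\Omega \setminus S} \big(I(p) - c_2\big)^2 \, dp \;+\; \lambda \, \mathrm{Per}(S),
        \]
        where $I : \Omega \to \mathbb{R}$ is the image, $S \subseteq \Omega$ is the foreground region, and $\mathrm{Per}(S)$ is the boundary length term that the third part of the thesis replaces with a curvature penalty.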

    Decomposition of Trees and Paths via Correlation

    We study the problem of decomposing (clustering) a tree with respect to costs attributed to pairs of nodes, so as to minimize the sum of costs for those pairs of nodes that are in the same component (cluster). For the general case and for the special case of the tree being a star, we show that the problem is NP-hard. For the special case of the tree being a path, this problem is known to be polynomial-time solvable. We characterize several classes of facets of the combinatorial polytope associated with a formulation of this clustering problem in terms of lifted multicuts. In particular, our results yield a complete totally dual integral (TDI) description of the lifted multicut polytope for paths, which establishes a connection to the combinatorial properties of alternative formulations such as set partitioning. Comment: v2 is a complete revision
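    To make the objective concrete, the sketch below brute-forces the decomposition of a small path by enumerating which path edges to cut; the pair costs are invented for illustration, and the enumeration only demonstrates the objective, not the polynomial-time algorithm the abstract refers to.

        # Hedged sketch: decompose a path 0-1-2-3 by choosing which path edges to
        # cut, minimizing the sum of pair costs over pairs in the same component.
        # Costs are made up; negative cost = the pair "wants" to stay together.
        import itertools

        n = 4                                  # path nodes 0-1-2-3
        cost = {(0, 1): -2.0, (1, 2): 1.5, (2, 3): -1.0,
                (0, 2): 3.0, (1, 3): -0.5, (0, 3): 2.0}

        def components(cut_edges):
            """Contiguous segments of the path after cutting the chosen edges."""
            comps, current = [], [0]
            for i in range(n - 1):             # edge i joins node i and node i + 1
                if i in cut_edges:
                    comps.append(current); current = []
                current.append(i + 1)
            comps.append(current)
            return comps

        best = None
        for r in range(n):
            for cut_edges in itertools.combinations(range(n - 1), r):
                comps = components(set(cut_edges))
                value = sum(cost[(i, j)]
                            for comp in comps
                            for i, j in itertools.combinations(sorted(comp), 2))
                if best is None or value < best[0]:
                    best = (value, comps)

        print("optimal value:", best[0], "components:", best[1])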

    Maximum Persistency in Energy Minimization

    We consider the discrete pairwise energy minimization problem (weighted constraint satisfaction, max-sum labeling) and methods that identify a globally optimal partial assignment of variables. When finding a complete optimal assignment is intractable, determining optimal values for a part of the variables is an interesting possibility. Existing methods are based on different sufficient conditions. We propose a new sufficient condition for partial optimality which is: (1) verifiable in polynomial time, (2) invariant to reparametrization of the problem and permutation of labels, and (3) includes many existing sufficient conditions as special cases. We pose the problem of finding the maximum optimal partial assignment identifiable by the new sufficient condition. A polynomial method is proposed which is guaranteed to assign the same or a larger part of the variables than several existing approaches. The core of the method is a specially constructed linear program that identifies persistent assignments in an arbitrary multi-label setting. Comment: Extended technical report for the CVPR 2014 paper. Update: correction to the proof of the characterization theorem
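    The sketch below illustrates what a persistent (partially optimal) assignment means on a tiny pairwise model, checking by brute force whether fixing one variable to one label agrees with some globally optimal labeling; the energy terms and sizes are made up, and the exhaustive check obviously does not scale, unlike the polynomial criterion proposed in the paper.

        # Hedged sketch: brute-force check of partial optimality (persistency) on a
        # tiny pairwise energy E(x) = sum_i u[i][x_i] + sum_(i,j) p_ij(x_i, x_j).
        # All numbers are invented for illustration.
        import itertools

        labels = range(3)
        unary = {0: [0.0, 1.0, 2.0],
                 1: [1.5, 0.5, 1.0],
                 2: [2.0, 0.0, 0.3]}
        pairwise = {(0, 1): lambda a, b: 0.0 if a == b else 1.0,   # Potts-style terms
                    (1, 2): lambda a, b: 0.0 if a == b else 1.0}

        def energy(x):
            return (sum(unary[i][x[i]] for i in unary) +
                    sum(f(x[i], x[j]) for (i, j), f in pairwise.items()))

        all_labelings = list(itertools.product(labels, repeat=len(unary)))
        opt_value = min(energy(x) for x in all_labelings)
        optima = [x for x in all_labelings if abs(energy(x) - opt_value) < 1e-9]

        # The partial assignment (variable i -> label k) is persistent if it agrees
        # with at least one optimal labeling.
        for i in unary:
            for k in labels:
                if any(x[i] == k for x in optima):
                    print(f"x_{i} = {k} is part of some optimal labeling")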

    A max-flow approach to improved lower bounds for quadratic unconstrained binary optimization (QUBO)

    The “roof dual” of a QUBO (Quadratic Unconstrained Binary Optimization) problem was introduced in [P.L. Hammer, P. Hansen, B. Simeone, Roof duality, complementation and persistency in quadratic 0–1 optimization, Mathematical Programming 28 (1984) 121–155]; it provides a bound on the optimum value, along with a polynomial test of the sharpness of this bound, and (due to a “persistency” result) it also determines the values of some of the variables at the optimum. In this paper we present a graph-theoretic approach to deriving bounds, which includes the roof dual bound as a special case, and show that these bounds can be computed in O(n^3) time by using network flow techniques. We also obtain a decomposition theorem for quadratic pseudo-Boolean functions, improving the persistency result of [P.L. Hammer, P. Hansen, B. Simeone, Roof duality, complementation and persistency in quadratic 0–1 optimization, Mathematical Programming 28 (1984) 121–155]. Finally, we show that the proposed bounds (including roof duality) can be applied in an iterated way to obtain significantly better bounds. Computational experiments on problems with up to thousands of variables are presented.

    Complexity of Discrete Energy Minimization Problems

    Discrete energy minimization is widely used in computer vision and machine learning for problems such as MAP inference in graphical models. The problem, in general, is notoriously intractable, and finding the globally optimal solution is known to be NP-hard. However, is it possible to approximate this problem with a reasonable ratio bound on the solution quality in polynomial time? We show in this paper that the answer is no. Specifically, we show that general energy minimization, even in the 2-label pairwise case, and planar energy minimization with three or more labels are exp-APX-complete. This finding rules out the existence of any approximation algorithm with a sub-exponential approximation ratio in the input size for these two problems, including constant factor approximations. Moreover, we collect and review the computational complexity of several subclass problems and arrange them on a complexity scale consisting of three major complexity classes -- PO, APX, and exp-APX -- corresponding to problems that are solvable, approximable, and inapproximable in polynomial time. Problems in the first two complexity classes can serve as alternative tractable formulations to the inapproximable ones. This paper can help vision researchers select an appropriate model for an application or guide them in designing new algorithms. Comment: ECCV'16 accepted

    Combinatorial persistency criteria for multicut and max-cut

    In combinatorial optimization, partial variable assignments are called persistent if they agree with some optimal solution. We propose persistency criteria for the multicut and max-cut problems as well as fast combinatorial routines to verify them. The criteria that we derive are based on mappings that improve feasible multicuts and cuts, respectively. Our elementary criteria can be checked enumeratively. The more advanced ones rely on fast algorithms for upper and lower bounds for the respective cut problems and on max-flow techniques for auxiliary min-cut problems. Our methods can be used as a preprocessing technique for reducing problem sizes or for computing partial optimality guarantees for solutions output by heuristic solvers. We show the efficacy of our methods on instances of both problems from computer vision, biomedical image analysis, and statistical physics.
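    The sketch below illustrates the improving-map idea behind such criteria on a tiny max-cut instance with invented weights: if the map that copies node u's side to node v never decreases the cut value, then applying it to any optimal cut yields an optimal cut with u and v on the same side, so that joint assignment is persistent. The full enumeration over all cuts is only for illustration; the paper's point is to verify criteria of this kind with fast combinatorial routines instead.

        # Hedged sketch: persistency for max-cut via an improving map, checked by
        # brute force on a tiny graph (edge weights invented for illustration).
        import itertools

        nodes = range(4)
        weights = {(0, 1): 2.0, (0, 2): -1.0, (1, 2): 3.0, (1, 3): 1.0, (2, 3): -2.0}

        def cut_value(x):
            """x maps each node to side 0/1; edges crossing the cut contribute."""
            return sum(w for (i, j), w in weights.items() if x[i] != x[j])

        def improving(u, v):
            """Does forcing x_v := x_u never decrease the cut value?"""
            for bits in itertools.product((0, 1), repeat=len(nodes)):
                x = dict(zip(nodes, bits))
                mapped = dict(x); mapped[v] = x[u]
                if cut_value(mapped) < cut_value(x) - 1e-9:
                    return False
            return True

        for u, v in itertools.combinations(nodes, 2):
            if improving(u, v):
                print(f"persistent: nodes {u} and {v} lie on the same side of some optimal cut")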