
    The combined basic LP and affine IP relaxation for promise VCSPs on infinite domains

    Convex relaxations have been instrumental in the solvability of constraint satisfaction problems (CSPs), as well as in the three different generalisations of CSPs: valued CSPs, infinite-domain CSPs, and most recently promise CSPs. In this work, we extend an existing tractability result to the three generalisations of CSPs combined: we give a sufficient condition under which the combined basic linear programming and affine integer programming relaxation solves promise valued CSPs over infinite domains exactly. This extends a result of Brakensiek and Guruswami [SODA'20] for promise (non-valued) CSPs on finite domains. Comment: Full version of an MFCS'20 paper.
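
    For orientation, the basic LP relaxation on which the combined relaxation builds can be written explicitly in the finite-domain valued setting; the formulation below is the standard one and is included here only as background, with notation chosen for this sketch (the paper's setting is infinite domains).

        % Basic LP (BLP) relaxation of a finite-domain VCSP with domain D,
        % variable set V, and cost functions f_C on scopes C = (v_1, ..., v_k):
        \begin{align*}
          \min\;        & \sum_{C} \sum_{t \in D^{k}} \mu_C(t)\, f_C(t) \\
          \text{s.t.}\; & \sum_{a \in D} \lambda_v(a) = 1 && \text{for every } v \in V, \\
                        & \sum_{t \,:\, t_i = a} \mu_C(t) = \lambda_{v_i}(a) && \text{for every } C,\ 1 \le i \le k,\ a \in D, \\
                        & \lambda_v(a) \ge 0, \qquad \mu_C(t) \ge 0 .
        \end{align*}

    In the affine IP relaxation the same consistency equations are kept, but the weights are integers (possibly negative) that sum to one; the combined BLP + AIP relaxation of Brakensiek and Guruswami first solves the BLP and then runs the AIP restricted to the support of a suitable BLP solution.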

    Valued Constraint Satisfaction Problems over Infinite Domains

    The object of the thesis is the computational complexity of certain combinatorial optimisation problems called valued constraint satisfaction problems, or VCSPs for short. The requirements and optimisation criteria of these problems are expressed by sums of (valued) constraints (also called cost functions). More precisely, the input of a VCSP consists of a finite set of variables, a finite set of cost functions that depend on these variables, and a cost u; the task is to find values for the variables such that the sum of the cost functions is at most u. By restricting the set of possible cost functions in the input, a great variety of computational optimisation problems can be modelled as VCSPs. Recently, the computational complexity of all VCSPs for finite sets of cost functions over a finite domain has been classified. Many natural optimisation problems, however, cannot be formulated as VCSPs over a finite domain. We initiate the systematic investigation of infinite-domain VCSPs by studying the complexity of VCSPs for piecewise linear (PL) and piecewise linear homogeneous (PLH) cost functions. The VCSP for a finite set of PLH cost functions can be solved in polynomial time if the cost functions are improved by fully symmetric fractional operations of all arities. We show this by (polynomial-time many-one) reducing the problem to a finite-domain VCSP, which can be solved using a linear programming relaxation. We apply this result to show the polynomial-time tractability of VCSPs for submodular PLH cost functions, for convex PLH cost functions, and for componentwise increasing PLH cost functions; in fact, we show that submodular PLH functions and componentwise increasing PLH functions form maximally tractable classes of PLH cost functions. We define the notion of expressive power for sets of cost functions over arbitrary domains, and discuss the relation between the expressive power and the set of fractional operations improving the same set of cost functions over an arbitrary countable domain. Finally, we provide a polynomial-time algorithm solving the restriction of the VCSP for all PL cost functions to a fixed number of variables.
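
    To make the problem statement concrete, the following minimal sketch spells out the finite-domain version of the decision task described above. It is for illustration only: the thesis concerns infinite domains (such as the rationals) and far more efficient algorithms than brute force, and the instance, variable names, and threshold below are invented.

        from itertools import product

        # A toy VCSP instance over a small finite domain.  A cost function is a
        # pair (scope, f), where scope is a tuple of variable names and f maps a
        # tuple of values to a non-negative cost.
        domain = [0, 1, 2]
        cost_functions = [
            (("x", "y"), lambda v: abs(v[0] - v[1])),    # |x - y|
            (("y", "z"), lambda v: (v[0] + v[1]) ** 2),  # (y + z)^2
            (("x",),     lambda v: 2 * v[0]),            # 2x
        ]
        variables = sorted({v for scope, _ in cost_functions for v in scope})
        threshold_u = 3  # the cost u from the problem statement

        def vcsp_decision(variables, domain, cost_functions, u):
            """Brute force: is there an assignment of total cost at most u?"""
            for values in product(domain, repeat=len(variables)):
                assignment = dict(zip(variables, values))
                total = sum(f(tuple(assignment[v] for v in scope))
                            for scope, f in cost_functions)
                if total <= u:
                    return True, assignment
            return False, None

        print(vcsp_decision(variables, domain, cost_functions, threshold_u))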

    New Dependencies of Hierarchies in Polynomial Optimization

    We compare four key hierarchies for solving Constrained Polynomial Optimization Problems (CPOP): Sum of Squares (SOS), Sum of Diagonally Dominant Polynomials (SDSOS), Sum of Nonnegative Circuits (SONC), and Sherali-Adams (SA). We prove a collection of dependencies among these hierarchies, both for general CPOPs and for optimization problems on the Boolean hypercube. Key results for the general case include that the SONC and SOS hierarchies are polynomially incomparable, while SDSOS is contained in SONC. A direct consequence is the non-existence of a Putinar-like Positivstellensatz for SDSOS. On the Boolean hypercube, we show as a main result that Schmüdgen-like versions of the hierarchies SDSOS*, SONC*, and SA* are polynomially equivalent. Moreover, we show that SA* is contained in any Schmüdgen-like hierarchy that provides an O(n) degree bound. Comment: 26 pages, 4 figures.
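
    As a concrete illustration of the gap between two of the cones being compared (a classical example, not taken from the paper): the Motzkin polynomial is nonnegative and is a nonnegative circuit polynomial, hence lies in the SONC cone, yet it admits no SOS certificate.

        % Motzkin polynomial: nonnegative and SONC, but not a sum of squares.
        M(x, y) \;=\; x^4 y^2 + x^2 y^4 - 3\, x^2 y^2 + 1 \;\ge\; 0,
        % nonnegativity follows from the AM-GM inequality applied to the outer monomials:
        \frac{x^4 y^2 + x^2 y^4 + 1}{3} \;\ge\; \sqrt[3]{x^4 y^2 \cdot x^2 y^4 \cdot 1} \;=\; x^2 y^2 .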

    Discrete Convex Functions on Graphs and Their Algorithmic Applications

    The present article is an exposition of a theory of discrete convex functions on certain graph structures, developed by the author in recent years. This theory is a spin-off of discrete convex analysis by Murota, and is motivated by combinatorial dualities in multiflow problems and by the complexity classification of facility location problems on graphs. We outline the theory and its algorithmic applications to combinatorial optimization problems.

    Solving constraint-satisfaction problems with distributed neocortical-like neuronal networks

    Finding actions that satisfy the constraints imposed by both external inputs and internal representations is central to decision making. We demonstrate that some important classes of constraint satisfaction problems (CSPs) can be solved by networks composed of homogeneous cooperative-competitive modules whose connectivity resembles motifs observed in the superficial layers of neocortex. The winner-take-all modules are sparsely coupled by programming neurons that embed the constraints onto the otherwise homogeneous modular computational substrate. We give rules that embed any instance of the CSPs planar four-color graph coloring, maximum independent set, and Sudoku on this substrate, and provide mathematical proofs guaranteeing that these graph coloring problems will converge to a solution. The network is composed of non-saturating linear threshold neurons. Because these neurons have no upper saturation, the overall network can explore the problem space through the unstable dynamics generated by recurrent excitation; the direction of exploration is steered by the constraint neurons. While many problems can be solved using only linear inhibitory constraints, network performance on hard problems benefits significantly when these negative constraints are implemented by non-linear multiplicative inhibition. Overall, our results demonstrate the importance of instability rather than stability in network computation, and offer insight into the computational role of dual inhibitory mechanisms in neural circuits. Comment: Accepted manuscript, in press, Neural Computation (2018).
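
    As a toy illustration of the kind of module described above, the following sketch simulates a single winner-take-all module of non-saturating rectified-linear threshold units with recurrent self-excitation and a shared subtractive inhibitory unit. It is a generic rate model with parameter values chosen for this sketch; it is not the circuit, the coupling scheme, or the parameters used in the paper.

        import numpy as np

        relu = lambda v: np.maximum(v, 0.0)   # non-saturating linear threshold unit

        n = 5                          # excitatory units in the module
        w_self, w_inh = 1.2, 2.0       # recurrent excitation > 1, strong shared inhibition
        dt, tau = 0.005, 0.1           # Euler step and time constant

        x = np.zeros(n)                # excitatory firing rates
        y = 0.0                        # shared inhibitory rate
        drive = np.array([0.9, 1.0, 1.3, 0.8, 1.1])   # external input to each unit

        for _ in range(20000):
            x_dot = -x + relu(drive + w_self * x - w_inh * y)
            y_dot = -y + relu(x.sum())
            x += (dt / tau) * x_dot
            y += (dt / tau) * y_dot

        # With these parameters the unit receiving the largest drive should end up
        # as the only active one, suppressing the others via the shared inhibition.
        print("winner:", int(np.argmax(x)), "rates:", np.round(x, 3))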

    Revisiting the Linear Programming Relaxation Approach to Gibbs Energy Minimization and Weighted Constraint Satisfaction

    We present a number of contributions to the LP relaxation approach to weighted constraint satisfaction (= Gibbs energy minimization). We link this approach to many works from constraint programming, a relation that has so far been largely ignored in machine vision and learning. While the approach has mostly been considered only for binary constraints, we generalize it to n-ary constraints in a simple and natural way. This includes a simple algorithm to minimize the LP-based upper bound, n-ary max-sum diffusion; however, we consider other bound-optimizing algorithms as well. The diffusion iteration is tractable for a certain class of high-arity constraints represented as a black box, analogous to propagators for global constraints in constraint programming. Diffusion exactly solves permuted n-ary supermodular problems. A hierarchy of gradually tighter LP relaxations is obtained simply by adding various zero constraints and coupling them in various ways to the existing constraints. Zero constraints can be added incrementally, which leads to a cutting-plane algorithm; the separation problem is formulated as finding an unsatisfiable subproblem of a CSP.
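
    For context, the well-known binary (pairwise) special case that the paper generalises can be sketched in a few lines: max-sum diffusion repeatedly reparametrises the problem so that each unary quality equals the maximum of the corresponding pairwise qualities, and the sum of per-factor maxima then gives the LP-based upper bound. The tiny random chain instance and all names below are assumptions made for illustration.

        import numpy as np

        K = 3                                   # labels per variable
        rng = np.random.default_rng(0)

        # Max-sum convention: larger quality is better.  Chain 0 - 1 - 2.
        unary = {t: rng.standard_normal(K) for t in range(3)}
        edges = [(0, 1), (1, 2)]
        pair = {e: rng.standard_normal((K, K)) for e in edges}

        # Reparametrisation potentials phi[(t, e)][x_t], initially zero.
        phi = {(t, e): np.zeros(K) for e in edges for t in e}

        def rep_unary(t):
            """Unary qualities after subtracting all potentials leaving variable t."""
            return unary[t] - sum(phi[(t, e)] for e in edges if t in e)

        def rep_pair(e):
            """Pairwise qualities after adding the potentials of both endpoints."""
            t, u = e
            return pair[e] + phi[(t, e)][:, None] + phi[(u, e)][None, :]

        for _ in range(200):                    # diffusion sweeps
            for e in edges:
                for t in e:
                    axis = 1 if t == e[0] else 0          # maximise out the other endpoint
                    g = rep_pair(e).max(axis=axis)
                    phi[(t, e)] += (rep_unary(t) - g) / 2.0   # equalise the two maxima

        # LP-based upper bound on the best total quality of any labelling.
        bound = (sum(rep_unary(t).max() for t in range(3))
                 + sum(rep_pair(e).max() for e in edges))
        print("upper bound:", bound)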