Embedding Formulations and Complexity for Unions of Polyhedra
It is well known that selecting a good Mixed Integer Programming (MIP) formulation is crucial for effective solution with state-of-the-art solvers. While best practices and guidelines for constructing good formulations abound, there is rarely a systematic construction leading to the best possible formulation. We introduce embedding formulations and complexity as a new MIP formulation paradigm for systematically constructing formulations for disjunctive constraints that are optimal with respect to size. More specifically,
they yield the smallest possible ideal formulation (i.e. one whose LP relaxation has integral extreme points) among all formulations that only use 0-1 auxiliary variables. We use the paradigm to characterize optimal
formulations for SOS2 constraints and certain piecewise linear functions of two variables. We also show that the resulting formulations can provide a significant computational advantage over all known formulations
for piecewise linear functions.
United States. National Science Foundation (Grant CMMI-1351619)
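As a concrete illustration of the disjunctive structure behind SOS2 (a sketch of the textbook binary formulation, not code from the paper; the function name is ours), the following enumerates the feasible binary assignments and recovers exactly the SOS2-feasible supports, i.e. consecutive pairs of indices:

```python
from itertools import product

def formulation_supports(n):
    """Supports of lambda admitted by the textbook binary SOS2 formulation
    on lambda_1..lambda_n (at most two *consecutive* lambdas nonzero):
        z_1 + ... + z_{n-1} = 1,  z_j in {0, 1},
        lambda_1 <= z_1,  lambda_n <= z_{n-1},
        lambda_i <= z_{i-1} + z_i   for 1 < i < n.
    For each feasible z, collect the indices i whose implied upper bound
    is positive, i.e. the lambdas allowed to be nonzero."""
    supports = set()
    for z in product((0, 1), repeat=n - 1):
        if sum(z) != 1:          # exactly one active segment
            continue
        allowed = set()
        for i in range(1, n + 1):
            if i == 1:
                ub = z[0]
            elif i == n:
                ub = z[n - 2]
            else:
                ub = z[i - 2] + z[i - 1]
            if ub > 0:
                allowed.add(i)
        supports.add(frozenset(allowed))
    return supports

# For n = 4: each active segment j allows exactly {lambda_j, lambda_{j+1}}.
print(sorted(sorted(s) for s in formulation_supports(4)))
# → [[1, 2], [2, 3], [3, 4]]
```

Each binary assignment corresponds to one piece of the disjunction, which is the structure the embedding-formulation paradigm optimizes over.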
Small and strong formulations for unions of convex sets from the Cayley embedding
There is often a significant trade-off between formulation strength and size in mixed integer programming (MIP). When modelling convex disjunctive constraints (e.g. unions of convex sets) this trade-off can be resolved by adding auxiliary continuous variables. However, adding these variables can result in a deterioration of the computational effectiveness of the formulation. For this reason, there has been considerable interest in constructing strong formulations that do not use continuous auxiliary variables. We introduce a technique to construct formulations without these detrimental continuous auxiliary variables. To develop this technique we introduce a natural nonpolyhedral generalization of the Cayley embedding of a family of polytopes and show it inherits many geometric properties of the original embedding. We then show how the associated formulation technique can be used to construct small and strong formulations for a wide range of disjunctive constraints. In particular, we show it can recover and generalize all known strong formulations without continuous auxiliary variables.
National Science Foundation (U.S.) (Grant CMMI-1351619)
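As a toy illustration of the Cayley embedding idea (ours, much simpler than the paper's nonpolyhedral generalization): for two intervals P1, P2 in R, the embedding Q = conv((P1 × {0}) ∪ (P2 × {1})) yields a formulation of x ∈ P1 ∪ P2 whose only auxiliary variable is the binary z — no continuous auxiliaries. The interval endpoints below are illustrative choices:

```python
def in_cayley_relaxation(x, z, p1=(0.0, 1.0), p2=(3.0, 4.0)):
    """Membership test for the Cayley embedding
    Q = conv( (P1 x {0}) u (P2 x {1}) ) of two intervals P1, P2.
    For intervals, Q is the quadrilateral bounded by the interpolated
    endpoints: (1-z)*p1_lo + z*p2_lo <= x <= (1-z)*p1_hi + z*p2_hi."""
    if not 0.0 <= z <= 1.0:
        return False
    lo = (1 - z) * p1[0] + z * p2[0]
    hi = (1 - z) * p1[1] + z * p2[1]
    return lo <= x <= hi

def in_union(x, p1=(0.0, 1.0), p2=(3.0, 4.0)):
    """Direct check of the disjunctive constraint x in P1 u P2."""
    return p1[0] <= x <= p1[1] or p2[0] <= x <= p2[1]

# With z restricted to {0, 1} the embedding models the union exactly;
# fractional z exposes the extra points of the LP relaxation.
print(in_cayley_relaxation(0.5, 0), in_union(0.5))    # → True True
print(in_cayley_relaxation(2.0, 0), in_union(2.0))    # → False False
print(in_cayley_relaxation(2.0, 0.5))                 # → True (relaxation only)
```

The gap between the relaxation and the union for fractional z is exactly the formulation-strength question the paper's construction addresses.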
Neural Network Verification as Piecewise Linear Optimization: Formulations for the Composition of Staircase Functions
We present a technique for neural network verification using mixed-integer programming (MIP) formulations. We derive a \emph{strong formulation} for each neuron in a network with piecewise linear activation functions. Since, in general, these formulations may require an exponential number of inequalities, we also derive a separation procedure that runs in super-linear time in the input dimension. We first introduce and develop our technique on the class of \emph{staircase} functions, which generalizes the ReLU, binarized, and quantized activation functions. We then use results for staircase activation functions to obtain a separation method for general piecewise linear activation functions. Empirically, using our strong formulation and separation technique, we can reduce the computational time in exact verification settings based on MIP and improve the false negative rate for inexact verifiers relying on the relaxation of the MIP formulation.
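To ground the verification setting, here is a sketch (ours, not the paper's strong formulation) of the standard big-M MIP encoding of a single ReLU neuron, the baseline such formulations strengthen; the bounds L and U are illustrative:

```python
def relu_mip_feasible(x, y, z, L=-2.0, U=3.0):
    """Check (x, y, z) against the standard big-M MIP formulation of
    y = max(0, x) for one neuron with input bounds L <= x <= U, L < 0 < U:
        y >= 0,  y >= x,
        y <= x - L * (1 - z),
        y <= U * z,
    where z is the on/off indicator. Restricting z to {0, 1} recovers
    ReLU exactly; z in [0, 1] gives the LP relaxation used by inexact
    verifiers."""
    eps = 1e-9  # numerical tolerance
    return (L - eps <= x <= U + eps
            and -eps <= z <= 1 + eps
            and y >= -eps
            and y >= x - eps
            and y <= x - L * (1 - z) + eps
            and y <= U * z + eps)

# Integral z models ReLU exactly; fractional z admits spurious points.
print(relu_mip_feasible(2.0, 2.0, 1))     # → True  (y = relu(2) = 2)
print(relu_mip_feasible(-1.0, 0.0, 0))    # → True  (y = relu(-1) = 0)
print(relu_mip_feasible(2.0, 0.0, 0))     # → False (violates y >= x)
print(relu_mip_feasible(0.0, 1.0, 0.5))   # → True, yet relu(0) = 0
```

The last point is feasible only in the relaxation; tighter formulations and separation cut off such points, which is what drives the paper's reported gains in exact verification time and inexact-verifier accuracy.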