We present a technique for neural network verification using mixed-integer
programming (MIP) formulations. We derive a \emph{strong formulation} for each
neuron in a network that uses piecewise linear activation functions. Because
these formulations may, in general, require an exponential number of
inequalities, we also derive a separation procedure that runs in super-linear
time in the input dimension. We first introduce and develop our technique on
the class of \emph{staircase} functions, which generalizes the ReLU, binarized,
and quantized activation functions. We then build on these results to obtain a
separation method for general piecewise linear
activation functions. Empirically, using our strong formulation and separation
technique, we can reduce the computational time in exact verification settings
based on MIP, and improve the false negative rate for inexact verifiers that
rely on the relaxation of the MIP formulation.