Upper and Lower Bounds on the Smoothed Complexity of the Simplex Method

Abstract

The simplex method for linear programming is known to be highly efficient in practice, and understanding its performance from a theoretical perspective is an active research topic. The framework of smoothed analysis, first introduced by Spielman and Teng (JACM '04) for this purpose, defines the smoothed complexity of solving a linear program with $d$ variables and $n$ constraints as the expected running time when Gaussian noise of variance $\sigma^2$ is added to the LP data. We prove that the smoothed complexity of the simplex method is $O(\sigma^{-3/2} d^{13/4} \log^{7/4} n)$, improving the dependence on $1/\sigma$ compared to the previous bound of $O(\sigma^{-2} d^{2} \sqrt{\log n})$. We accomplish this through a new analysis of the \emph{shadow bound}, which was also key to earlier analyses. Illustrating the power of this new method, we use it to prove a nearly tight upper bound on the smoothed complexity of two-dimensional polygons. We also establish the first non-trivial lower bound on the smoothed complexity of the simplex method, proving that the \emph{shadow vertex simplex method} requires at least $\Omega\big(\min\big(\sigma^{-1/2} d^{-1/2} \log^{-1/4} d,\, 2^{d}\big)\big)$ pivot steps with high probability. A key part of our analysis is a new variation on the extended formulation for the regular $2^k$-gon. We end with a numerical experiment that suggests this analysis could be further improved.

Comment: 41 pages, 5 figures

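As a concrete illustration of the smoothed-analysis model defined in the abstract, the sketch below perturbs the data of a random linear program with i.i.d. Gaussian noise of standard deviation $\sigma$ and records the number of simplex iterations reported by an off-the-shelf solver. This is only a minimal sketch of the setup, not the method analyzed in the paper: it uses SciPy's HiGHS dual simplex rather than the shadow vertex pivot rule, and the dimensions d and n, the noise level sigma, and the number of trials are arbitrary illustrative choices.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative sketch of the smoothed-analysis setup: take a base LP with d
# variables and n constraints, add i.i.d. Gaussian noise of standard deviation
# sigma to the constraint data, and record the simplex iteration count reported
# by the solver. Note: SciPy's HiGHS dual simplex is used here, not the shadow
# vertex pivot rule analyzed in the paper.

rng = np.random.default_rng(0)
d, n, sigma = 10, 50, 0.1  # arbitrary illustrative choices

# Base LP data: maximize c^T x subject to A x <= b (linprog minimizes, so negate c).
A = rng.standard_normal((n, d))
b = np.ones(n)
c = rng.standard_normal(d)

trials = 20
iteration_counts = []
for _ in range(trials):
    # Perturb the constraint matrix and right-hand side with Gaussian noise.
    A_noisy = A + sigma * rng.standard_normal((n, d))
    b_noisy = b + sigma * rng.standard_normal(n)
    res = linprog(-c, A_ub=A_noisy, b_ub=b_noisy,
                  bounds=[(None, None)] * d, method="highs-ds")
    if res.status == 0:  # keep only instances solved to optimality
        iteration_counts.append(res.nit)

if iteration_counts:
    print("mean simplex iterations over perturbed instances:",
          np.mean(iteration_counts))
else:
    print("no perturbed instance was solved to optimality")
```

Averaging the iteration count over many perturbed instances, for a range of values of sigma, gives an empirical analogue of the smoothed complexity whose dependence on $\sigma$ can be compared against the bounds stated above.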