The Grothendieck Constant is Strictly Smaller than Krivine's Bound
The (real) Grothendieck constant $K_G$ is the infimum over those $K \in (0,\infty)$ such that for every $m, n \in \mathbb{N}$ and every $m \times n$ real matrix $(a_{ij})$ we have
$$\max_{\{x_i\}_{i=1}^m,\,\{y_j\}_{j=1}^n \subseteq S^{m+n-1}} \sum_{i=1}^{m}\sum_{j=1}^{n} a_{ij} \langle x_i, y_j \rangle \;\leqslant\; K \max_{\{\varepsilon_i\}_{i=1}^m,\,\{\delta_j\}_{j=1}^n \subseteq \{-1,1\}} \sum_{i=1}^{m}\sum_{j=1}^{n} a_{ij}\, \varepsilon_i \delta_j.$$
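As a minimal numerical sketch of the two quantities being compared (not from the paper; it assumes NumPy, and `vector_value` only samples random unit vectors, so it merely lower-bounds the true vector maximum):

```python
import itertools
import numpy as np

# Krivine's upper bound on the Grothendieck constant: pi / (2 log(1 + sqrt(2))).
KRIVINE = np.pi / (2 * np.log(1 + np.sqrt(2)))

def sign_value(a):
    """Brute-force max of sum_ij a_ij * eps_i * delta_j over signs in {-1, 1}."""
    m, n = a.shape
    best = -np.inf
    for eps in itertools.product([-1, 1], repeat=m):
        for delta in itertools.product([-1, 1], repeat=n):
            best = max(best, np.asarray(eps) @ a @ np.asarray(delta))
    return best

def vector_value(a, dim, trials=2000, seed=0):
    """Heuristic lower bound on max of sum_ij a_ij <x_i, y_j> over unit vectors,
    obtained by sampling random unit vectors (the true max needs an SDP)."""
    rng = np.random.default_rng(seed)
    m, n = a.shape
    best = -np.inf
    for _ in range(trials):
        x = rng.standard_normal((m, dim))
        y = rng.standard_normal((n, dim))
        x /= np.linalg.norm(x, axis=1, keepdims=True)
        y /= np.linalg.norm(y, axis=1, keepdims=True)
        best = max(best, np.sum(a * (x @ y.T)))
    return best

a = np.array([[1.0, 1.0], [1.0, -1.0]])
print(sign_value(a))           # 2.0
print(vector_value(a, dim=4))  # at most 2*sqrt(2) for this matrix
```

For this particular matrix the true vector maximum is $2\sqrt{2}$, strictly larger than the sign maximum $2$, which is why some constant $K > 1$ is genuinely needed.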
The classical Grothendieck inequality asserts the nonobvious fact that the above inequality does hold true for some $K \in (0,\infty)$ that is independent of $m$, $n$, and $(a_{ij})$. Since Grothendieck's 1953 discovery of this powerful theorem, it has found numerous applications in a variety of areas, but, despite attracting a lot of attention, the exact value of the Grothendieck constant $K_G$ remains a mystery. The last progress on this problem was in 1977, when Krivine proved that $K_G \leqslant \pi/(2\log(1+\sqrt{2}))$ and conjectured that his bound is optimal. Krivine's conjecture has been restated repeatedly since 1977, resulting in focusing the subsequent research on the search for examples of matrices $(a_{ij})$ which exhibit (asymptotically, as $m, n \to \infty$) a lower bound on $K_G$ that matches Krivine's bound. Here, we obtain an improved Grothendieck inequality that holds for all matrices $(a_{ij})$ and yields a bound $K_G < \pi/(2\log(1+\sqrt{2})) - \varepsilon_0$ for some effective constant $\varepsilon_0 > 0$. Other than disproving Krivine's conjecture, and along the way also disproving an intermediate conjecture of König that was made in 2000 as a step towards Krivine's conjecture, our main contribution is conceptual: despite dealing with a binary rounding problem, random two-dimensional projections, when combined with a careful partition of $\mathbb{R}^2$ in order to round the projected vectors to values in $\{-1,1\}$, perform better than the ubiquitous random hyperplane technique. By establishing the usefulness of higher-dimensional rounding schemes, this fact has consequences in approximation algorithms. Specifically, it yields the best known polynomial-time approximation algorithm for the Frieze–Kannan Cut Norm problem, a generic and well-studied optimization problem with many applications.
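For contrast, here is a minimal sketch of the ubiquitous random hyperplane technique the abstract refers to (Goemans–Williamson style), not the paper's two-dimensional scheme: each vector is rounded to the sign of its inner product with a single random Gaussian direction, i.e. a one-dimensional projection. NumPy is assumed.

```python
import numpy as np

def hyperplane_round(vectors, seed=0):
    """Round each row of `vectors` to a sign in {-1, 1} via one random
    Gaussian direction g: the sign of <g, v_i> (a 1-D projection)."""
    rng = np.random.default_rng(seed)
    g = rng.standard_normal(vectors.shape[1])  # random hyperplane normal
    signs = np.sign(vectors @ g)
    signs[signs == 0] = 1  # break the measure-zero ties deterministically
    return signs.astype(int)
```

The paper's improvement replaces the one-dimensional projection and its half-space partition with a projection to $\mathbb{R}^2$ followed by a carefully chosen partition of the plane; the sketch above only shows the classical baseline being beaten.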