We present a polynomial-time algorithm to approximately scale tensors of any
format to arbitrary prescribed marginals (whenever possible). This unifies and
generalizes a sequence of past works on matrix, operator and tensor scaling.
Our algorithm provides an efficient weak membership oracle for the associated
moment polytopes, an important family of implicitly-defined convex polytopes
with exponentially many facets and a wide range of applications. These include
the entanglement polytopes from quantum information theory (in particular, we
obtain an efficient solution to the notorious one-body quantum marginal
problem) and the Kronecker polytopes from representation theory (which capture
the asymptotic support of Kronecker coefficients). Our algorithm can be applied
to succinct descriptions of the input tensor whenever the marginals can be
efficiently computed, as in the important case of matrix product states or
tensor-train decompositions, widely used in computational physics and numerical
mathematics.
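
To make the scaling task concrete, the following minimal NumPy sketch implements the alternating approach in the simplest setting of uniform target marginals: at each step, the mode whose quantum marginal deviates most from the maximally mixed state is corrected by acting with the inverse square root of that marginal. The function names (marginal, scale_mode, scale_to_uniform) and the restriction to uniform marginals are illustrative choices of ours; the algorithm of this paper handles arbitrary prescribed marginals and comes with a rigorous polynomial-time analysis.

    import numpy as np

    def marginal(T, k):
        # Quantum marginal of mode k: rho_k = M_k M_k^dagger for the mode-k
        # unfolding M_k, normalized to unit trace.
        M = np.moveaxis(T, k, 0).reshape(T.shape[k], -1)
        rho = M @ M.conj().T
        return rho / np.trace(rho)

    def scale_mode(T, k, g):
        # Apply an invertible matrix g to mode k of the tensor T.
        T = np.moveaxis(T, k, 0)
        T = np.tensordot(g, T, axes=(1, 0))
        return np.moveaxis(T, 0, k)

    def scale_to_uniform(T, iters=200, tol=1e-8):
        # Alternating scaling toward uniform marginals: repeatedly fix the mode
        # whose marginal is farthest from I/n_k (a Sinkhorn-like iteration).
        T = np.asarray(T, dtype=complex)
        T = T / np.linalg.norm(T)
        for _ in range(iters):
            errs = [np.linalg.norm(marginal(T, k) - np.eye(n) / n)
                    for k, n in enumerate(T.shape)]
            k = int(np.argmax(errs))
            if errs[k] < tol:
                break
            # Push the mode-k marginal to I/n_k by acting with rho_k^{-1/2}/sqrt(n_k).
            w, U = np.linalg.eigh(marginal(T, k))
            g = (U * (1.0 / np.sqrt(np.maximum(w, 1e-15)))) @ U.conj().T / np.sqrt(T.shape[k])
            T = scale_mode(T, k, g)
            T = T / np.linalg.norm(T)
        return T

For instance, scale_to_uniform(np.random.randn(2, 3, 4)) drives all three one-body marginals toward the maximally mixed state whenever the input tensor admits such a scaling.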
We strengthen and generalize the alternating minimization approach of
previous papers by introducing the theory of highest weight vectors from
representation theory into the numerical optimization framework. We show that
highest weight vectors are natural potential functions for scaling algorithms
and prove new bounds on their evaluations to obtain polynomial-time
convergence. Our techniques are general, and we believe they will be
instrumental in obtaining efficient algorithms for moment polytopes beyond the
ones considered here and, more broadly, for other optimization problems
possessing natural symmetries.
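
To illustrate the potential-function idea in its simplest special case, recall classical matrix (Sinkhorn) scaling, where earlier work on matrix scaling used the permanent as exactly such a progress measure: it never decreases under row or column normalization and is bounded above by 1 on row- or column-stochastic matrices, which yields convergence bounds. The sketch below (the helper names permanent and sinkhorn_with_potential are ours) shows only this classical special case, not the highest weight vector potentials developed in this paper.

    import itertools
    import numpy as np

    def permanent(A):
        # Exact permanent by summing over all permutations (fine for small n).
        n = A.shape[0]
        return sum(np.prod([A[i, s[i]] for i in range(n)])
                   for s in itertools.permutations(range(n)))

    def sinkhorn_with_potential(A, iters=20):
        # Sinkhorn iteration on a nonnegative matrix with per(A) > 0, tracking
        # the permanent as a potential function.
        A = np.asarray(A, dtype=float)
        for _ in range(iters):
            # Row normalization: row sums become 1, so per(A) <= prod(row sums) = 1.
            A = A / A.sum(axis=1, keepdims=True)
            print("per =", permanent(A))
            # Column sums now add up to n, so prod(col sums) <= 1 by AM-GM and
            # column normalization can only increase the permanent.
            A = A / A.sum(axis=0, keepdims=True)
        return A

    # The printed values are nondecreasing and bounded by 1, which is the kind of
    # potential argument that drives the convergence analysis.
    sinkhorn_with_potential(np.random.rand(4, 4) + 0.1)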