
    Optimizing over the Growing Spectrahedron


    New Analysis and Results for the Conditional Gradient Method

    We present new results for the conditional gradient method (also known as the Frank-Wolfe method). We derive computational guarantees for arbitrary step-size sequences, which are then applied to various step-size rules, including simple averaging and constant step-sizes. We also develop step-size rules and computational guarantees that depend naturally on the warm-start quality of the initial (and subsequent) iterates. Our results include computational guarantees for both duality/bound gaps and the so-called Wolfe gaps. Lastly, we present complexity bounds in the presence of approximate computation of gradients and/or linear optimization subproblem solutions.
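The abstract above concerns the conditional gradient (Frank-Wolfe) method, which repeatedly solves a linear optimization subproblem over the feasible set and takes a convex combination step. A minimal sketch of the classic variant with the standard 2/(k+2) step-size is below; the quadratic objective, the simplex feasible set, and all names here are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, iters=100):
    """Conditional gradient (Frank-Wolfe) method: at each iteration,
    minimize the linearized objective over the feasible set via the
    linear minimization oracle (LMO), then step toward that vertex."""
    x = x0
    for k in range(iters):
        s = lmo(grad(x))          # linear optimization subproblem
        gamma = 2.0 / (k + 2.0)   # standard open-loop step-size rule
        x = (1 - gamma) * x + gamma * s
    return x

# Illustrative example: minimize ||A x - b||^2 over the unit simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ (np.ones(5) / 5)          # optimum placed inside the simplex

grad = lambda x: 2 * A.T @ (A @ x - b)
# Over the simplex, the LMO returns the vertex e_i whose gradient
# coordinate is smallest.
lmo = lambda g: np.eye(len(g))[np.argmin(g)]

x = frank_wolfe(grad, lmo, np.eye(5)[0], iters=200)
```

Because each iterate is a convex combination of simplex vertices, feasibility is maintained for free, which is the method's appeal when projection onto the feasible set is expensive.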
