
    Alternating Randomized Block Coordinate Descent

    Block-coordinate descent algorithms and alternating minimization methods are fundamental optimization algorithms and an important primitive in large-scale optimization and machine learning. While various block-coordinate-descent-type methods have been studied extensively, only alternating minimization -- which applies to the setting of only two blocks -- is known to have a convergence time that scales independently of the least smooth block. A natural question is then: is the setting of two blocks special? We show that the answer is "no" as long as the least smooth block can be optimized exactly -- an assumption that is also needed in the setting of alternating minimization. We do so by introducing a novel algorithm, AR-BCD, whose convergence time scales independently of the least smooth (possibly non-smooth) block. The basic algorithm generalizes both alternating minimization and randomized block coordinate (gradient) descent, and we also provide its accelerated version, AAR-BCD. As a special case of AAR-BCD, we obtain the first nontrivial accelerated alternating minimization algorithm.
    Comment: Version 1 appeared in Proc. ICML'18. v1 -> v2: added remarks about how accelerated alternating minimization follows directly from the results that appeared in ICML'18; no new technical results were needed for this.
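    A minimal sketch of the block-coordinate idea the abstract describes, assuming a toy two-block quadratic objective: gradient steps on the smooth block, exact minimization over the other block. The objective and all names here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy objective f(x, y) = 0.5 * ||A x + B y - c||^2 (an assumption for
# illustration; the paper treats more general multi-block objectives).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
B = rng.standard_normal((20, 3))
c = rng.standard_normal(20)

x, y = np.zeros(5), np.zeros(3)
L_x = np.linalg.norm(A.T @ A, 2)  # smoothness constant of the x-block

for _ in range(500):
    if rng.random() < 0.5:
        # gradient step on the smooth block x
        x -= (A.T @ (A @ x + B @ y - c)) / L_x
    else:
        # exact minimization over the other block y (a least-squares solve),
        # mirroring the "optimized exactly" assumption in the abstract
        y = np.linalg.lstsq(B, c - A @ x, rcond=None)[0]

print("final objective:", 0.5 * np.linalg.norm(A @ x + B @ y - c) ** 2)
```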

    On a generalization of iterated and randomized rounding

    We give a general method for rounding linear programs that combines the commonly used iterated rounding and randomized rounding techniques. In particular, we show that whenever iterated rounding can be applied to a problem with some slack, there is a randomized procedure that returns an integral solution that satisfies the guarantees of iterated rounding and also has concentration properties. We use this to give new results for several classic problems where iterated rounding has been useful.
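    A minimal sketch of plain randomized rounding, one ingredient of the combined method above: round each fractional coordinate to 1 independently with its own probability, so the rounded vector matches the LP solution in expectation and Chernoff-type concentration applies to linear constraints. This is a generic illustration under assumed names, not the paper's combined procedure.

```python
import numpy as np

# Round a fractional LP solution x in [0, 1]^n to an integral vector whose
# expectation equals x (generic randomized rounding, for illustration only).
rng = np.random.default_rng(1)
x_frac = rng.uniform(size=10)  # stand-in fractional LP solution
x_int = (rng.uniform(size=10) < x_frac).astype(int)

print("fractional:", np.round(x_frac, 2))
print("rounded:   ", x_int)
```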