
    Alternating Randomized Block Coordinate Descent

    Block-coordinate descent algorithms and alternating minimization methods are fundamental optimization algorithms and an important primitive in large-scale optimization and machine learning. While various block-coordinate-descent-type methods have been studied extensively, only alternating minimization -- which applies to the setting of only two blocks -- is known to have convergence time that scales independently of the least smooth block. A natural question is then: is the setting of two blocks special? We show that the answer is "no" as long as the least smooth block can be optimized exactly -- an assumption that is also needed in the setting of alternating minimization. We do so by introducing a novel algorithm, AR-BCD, whose convergence time scales independently of the least smooth (possibly non-smooth) block. The basic algorithm generalizes both alternating minimization and randomized block coordinate (gradient) descent, and we also provide its accelerated version, AAR-BCD. As a special case of AAR-BCD, we obtain the first nontrivial accelerated alternating minimization algorithm.

    Comment: Version 1 appeared in Proc. ICML'18. v1 -> v2: added remarks about how accelerated alternating minimization follows directly from the results that appeared in ICML'18; no new technical results were needed for this.
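
    To make the algorithmic template concrete, here is a minimal Python sketch of the AR-BCD loop as described in the abstract: randomized block gradient steps over the smooth blocks, interleaved with exact minimization over the least smooth block. The function names (grad_blocks, exact_min_last), the smoothness-proportional sampling, and the 1/L[i] step sizes are illustrative assumptions, not the authors' reference implementation.

        import numpy as np

        def ar_bcd(grad_blocks, exact_min_last, x_blocks, L, n_iters=1000, seed=0):
            # Hypothetical sketch of the AR-BCD template, not the authors' code.
            # grad_blocks[i](x_blocks): partial gradient of f w.r.t. smooth block i.
            # exact_min_last(x_blocks): exact minimizer of f over the last (least
            #   smooth, possibly non-smooth) block, other blocks held fixed.
            # L[i]: smoothness constant of smooth block i.
            rng = np.random.default_rng(seed)
            m = len(grad_blocks)                      # number of smooth blocks
            probs = np.asarray(L, float) / np.sum(L)  # smoothness-proportional sampling
            for _ in range(n_iters):
                i = rng.choice(m, p=probs)
                # Randomized block gradient step on a smooth block (standard RBCD step).
                x_blocks[i] = x_blocks[i] - (1.0 / L[i]) * grad_blocks[i](x_blocks)
                # Exact minimization over the least smooth block, as in alternating
                # minimization; this is the oracle assumption stated in the abstract.
                x_blocks[-1] = exact_min_last(x_blocks)
            return x_blocks

    Note that, per the abstract, the convergence guarantee depends only on the smoothness constants of the gradient-updated blocks; the least smooth block enters only through the exact-minimization oracle.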