
    On the fast convergence of minibatch heavy ball momentum

    Simple stochastic momentum methods are widely used in machine learning optimization, but their strong practical performance is at odds with the absence of theoretical acceleration guarantees in the literature. In this work, we aim to close this gap between theory and practice by showing that stochastic heavy ball momentum retains the fast linear rate of (deterministic) heavy ball momentum on quadratic optimization problems, at least when minibatching with a sufficiently large batch size. The algorithm we study can be interpreted as an accelerated randomized Kaczmarz algorithm with minibatching and heavy ball momentum. The analysis relies on carefully decomposing the momentum transition matrix and on new spectral norm concentration bounds for products of independent random matrices. We provide numerical illustrations demonstrating that our bounds are reasonably sharp.
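The iteration the abstract describes can be sketched as follows: at each step, average the Kaczmarz projections over a sampled minibatch of rows, then add a heavy ball momentum term. This is only an illustrative sketch of that scheme for a consistent system Ax = b; the batch size, step size `eta`, and momentum `beta` below are arbitrary assumptions, not the paper's analyzed choices.

```python
import numpy as np

def minibatch_rk_momentum(A, b, batch_size=20, eta=1.0, beta=0.3,
                          iters=2000, seed=0):
    """Minibatch randomized Kaczmarz with heavy ball momentum (sketch)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.einsum("ij,ij->i", A, A)  # squared row norms ||a_i||^2
    x = np.zeros(n)
    x_prev = x.copy()
    for _ in range(iters):
        S = rng.choice(m, size=batch_size, replace=False)
        residual = A[S] @ x - b[S]
        # average of the per-row Kaczmarz projection steps over the batch
        step = A[S].T @ (residual / row_norms[S]) / batch_size
        # heavy ball update: gradient-like step plus momentum term
        x, x_prev = x - eta * step + beta * (x - x_prev), x
    return x

# Example on a random consistent overdetermined system.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 20))
x_star = rng.standard_normal(20)
b = A @ x_star
x_hat = minibatch_rk_momentum(A, b)
```

On a consistent system the per-row residuals vanish at the solution, so the stochastic noise is multiplicative and the iterates converge linearly to the exact solution, matching the flavor of the result described above.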

    An Asynchronous Parallel Randomized Kaczmarz Algorithm

    We describe an asynchronous parallel variant of the randomized Kaczmarz (RK) algorithm for solving the linear system Ax = b. The analysis shows linear convergence and indicates that nearly linear speedup can be expected if the number of processors is bounded by a multiple of the number of rows in A.
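A toy, Hogwild-style sketch of the asynchronous scheme: several threads share one iterate and apply single-row Kaczmarz projections without locks, so each thread may read a slightly stale `x`. The thread count and iteration budget are illustrative assumptions, and this is not the paper's implementation (which targets shared-memory parallelism beyond Python threads).

```python
import threading
import numpy as np

def async_rk(A, b, n_threads=4, iters_per_thread=2000, seed=0):
    """Lock-free asynchronous randomized Kaczmarz (illustrative sketch)."""
    m, n = A.shape
    row_norms = np.einsum("ij,ij->i", A, A)  # squared row norms ||a_i||^2
    x = np.zeros(n)  # shared iterate, updated without locks

    def worker(tid):
        nonlocal x
        rng = np.random.default_rng(seed + tid)
        for _ in range(iters_per_thread):
            i = rng.integers(m)             # sample a row uniformly
            r = A[i] @ x - b[i]             # residual, possibly from a stale x
            x -= (r / row_norms[i]) * A[i]  # project onto row i's hyperplane

    threads = [threading.Thread(target=worker, args=(t,))
               for t in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return x

# Example on a random consistent overdetermined system.
rng = np.random.default_rng(2)
A = rng.standard_normal((200, 20))
x_star = rng.standard_normal(20)
b = A @ x_star
x_hat = async_rk(A, b)
```

Because the system is consistent, every projection step shrinks toward the solution set even when computed from a stale iterate, which is the intuition behind the linear convergence and near-linear speedup claimed above.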