
    Accelerated Randomized Coordinate Descent Algorithms for Stochastic Optimization and Online Learning

    We propose accelerated randomized coordinate descent algorithms for stochastic optimization and online learning. Our algorithms have significantly lower per-iteration complexity than the known accelerated gradient algorithms. The proposed algorithms for online learning achieve better regret than the known randomized online coordinate descent algorithms, while the proposed algorithms for stochastic optimization match the convergence rates of the best known randomized coordinate descent algorithms. We also present simulation results demonstrating the performance of the proposed algorithms.

    Comment: 20 pages, 4 figures, 2 tables
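
    For context, below is a minimal sketch of plain, non-accelerated randomized coordinate descent on an illustrative quadratic objective (Python with NumPy; the objective, step sizes, and uniform sampling are generic assumptions, not taken from the paper). It shows the per-iteration cost advantage the abstract alludes to: each step evaluates one partial derivative in O(n) time instead of the full gradient in O(n^2).

    # Sketch of basic randomized coordinate descent, assuming a strongly
    # convex quadratic f(x) = 0.5 x^T A x - b^T x. Not the paper's
    # accelerated variant; illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)

    n = 100
    M = rng.standard_normal((n, n))
    A = M.T @ M + np.eye(n)        # symmetric positive definite
    b = rng.standard_normal(n)

    L = np.diag(A)                 # coordinate-wise Lipschitz constants

    x = np.zeros(n)
    for t in range(20000):
        i = rng.integers(n)        # sample one coordinate uniformly
        g_i = A[i] @ x - b[i]      # partial derivative at coordinate i: O(n) work
        x[i] -= g_i / L[i]         # coordinate step with 1/L_i step size

    f = lambda z: 0.5 * z @ A @ z - b @ z
    x_star = np.linalg.solve(A, b)
    print(f"RCD objective:     {f(x):.6f}")
    print(f"optimal objective: {f(x_star):.6f}")

    The accelerated variants proposed in the paper add extrapolation on top of this basic per-coordinate update; the schedule and sampling details are specified in the paper itself.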