Recently, many variance-reduced stochastic alternating direction method of
multipliers (ADMM) methods (e.g.\ SAG-ADMM, SDCA-ADMM and SVRG-ADMM) have made
exciting progress, such as achieving linear convergence rates for strongly convex
problems. However, the best known convergence rate for general convex problems
is $O(1/T)$, as opposed to the $O(1/T^2)$ rate of accelerated batch algorithms, where $T$ is
the number of iterations. Thus, there still remains a gap in convergence rates
between existing stochastic ADMM and batch algorithms. To bridge this gap, we
introduce the momentum acceleration trick for batch optimization into the
stochastic variance-reduced gradient based ADMM (SVRG-ADMM), which leads to an
accelerated SVRG-ADMM (ASVRG-ADMM) method. We then design two different momentum-term
update rules for the strongly convex and general convex cases. We prove that
ASVRG-ADMM converges linearly for strongly convex problems. Besides having the
same low per-iteration complexity as existing stochastic ADMM methods, ASVRG-ADMM
improves the convergence rate for general convex problems from $O(1/T)$ to
$O(1/T^2)$. Our experimental results show the effectiveness of ASVRG-ADMM.

Comment: 16 pages, 5 figures. Appears in Proceedings of the 31st AAAI
Conference on Artificial Intelligence (AAAI), San Francisco, California, USA,
pp. 2287--2293, 2017.
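
As a rough illustration of how momentum acceleration can be combined with variance-reduced stochastic ADMM, the sketch below applies a simplified scheme to a graph-guided sparse logistic regression problem. The problem setup, step size `eta`, penalty `rho`, and the fixed momentum weight `theta` are illustrative assumptions; in particular, the simple extrapolation step used here is not the paper's exact momentum rule, which differs between the strongly convex and general convex cases.

```python
import numpy as np

def asvrg_admm_sketch(X, b, A, lam, rho=1.0, eta=0.1, theta=0.9,
                      n_epochs=30, m=None, seed=0):
    """Illustrative variance-reduced stochastic ADMM with a momentum step.

    Approximately solves
        min_x (1/n) * sum_i log(1 + exp(-b_i * X_i^T x)) + lam * ||z||_1
        s.t.  A x - z = 0,
    e.g. graph-guided fused logistic regression, where A encodes graph edges.
    The fixed momentum weight `theta` is a simplifying assumption, not the
    paper's momentum update rule.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    p = A.shape[0]
    m = m or n                            # inner iterations per epoch
    x_tilde = np.zeros(d)                 # epoch snapshot (for variance reduction)
    x = x_tilde.copy()
    z = np.zeros(p)                       # auxiliary variable
    u = np.zeros(p)                       # scaled dual variable
    M = np.eye(d) / eta + rho * A.T @ A   # matrix of the linearized x-subproblem

    def grad_i(w, i):
        # gradient of the i-th logistic loss term at w
        return -b[i] * X[i] / (1.0 + np.exp(b[i] * (X[i] @ w)))

    for _ in range(n_epochs):
        # full gradient at the snapshot, as in SVRG
        mu = np.mean([grad_i(x_tilde, i) for i in range(n)], axis=0)
        y = x.copy()                      # extrapolated (momentum) point
        for _ in range(m):
            i = rng.integers(n)
            # variance-reduced stochastic gradient at the extrapolated point
            g = grad_i(y, i) - grad_i(x_tilde, i) + mu
            # x-update: linearized loss + proximal term + augmented penalty
            x = np.linalg.solve(M, y / eta - g + rho * A.T @ (z - u))
            # z-update: soft-thresholding (prox of the l1 term)
            v = A @ x + u
            z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
            # dual ascent step
            u = u + A @ x - z
            # simple extrapolation step (illustrative momentum, not the
            # paper's exact rule)
            y = x + theta * (x - x_tilde)
        x_tilde = x.copy()                # refresh the snapshot each epoch
    return x, z
```

Since the system matrix of the x-subproblem does not change across iterations, it is formed once outside the loops; for large problems one would instead cache a factorization or use an inexact inner solver, which is a design choice of this sketch rather than something prescribed by the paper.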