Convergence Analysis of Accelerated Stochastic Gradient Descent under the Growth Condition
We study the convergence of accelerated stochastic gradient descent for
strongly convex objectives under the growth condition, which states that the
variance of the stochastic gradient is bounded by the sum of a multiplicative part
that grows with the full gradient and a constant additive part. Through the lens of the
growth condition, we investigate four widely used accelerated methods:
Nesterov's accelerated method (NAM), robust momentum method (RMM), accelerated
dual averaging method (ADAM), and implicit ADAM (iADAM). While these methods
are known to improve the convergence rate of SGD under the condition that the
stochastic gradient has bounded variance, it is not well understood how their
convergence rates are affected by the multiplicative noise. In this paper, we
show that these methods all converge to a neighborhood of the optimum with
accelerated convergence rates (compared to SGD) even under the growth
condition. In particular, NAM, RMM, and iADAM enjoy acceleration only under mild
multiplicative noise, while ADAM enjoys acceleration even under large
multiplicative noise. Furthermore, we propose a generic tail-averaged scheme
that allows the accelerated rates of ADAM and iADAM to nearly attain the
theoretical lower bound (up to a logarithmic factor in the variance term).
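For reference, the growth condition described above is commonly formalized as follows (a standard form; the paper's exact constants and normalization may differ):

    \mathbb{E}\big[\,\|g(x) - \nabla f(x)\|^2\,\big] \;\le\; \rho\,\|\nabla f(x)\|^2 + \sigma^2,

where g(x) is the stochastic gradient at x, \rho \ge 0 scales the multiplicative part, and \sigma^2 is the constant additive part. Likewise, a tail-averaged scheme typically outputs \bar{x} = \frac{1}{T-s}\sum_{t=s+1}^{T} x_t, averaging only the last T - s iterates.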
Differentially Private Accelerated Optimization Algorithms
We present two classes of differentially private optimization algorithms
derived from the well-known accelerated first-order methods. The first
algorithm is inspired by Polyak's heavy ball method and employs a smoothing
approach to decrease the accumulated noise on the gradient steps required for
differential privacy. The second class of algorithms are based on Nesterov's
accelerated gradient method and its recent multi-stage variant. We propose a
noise dividing mechanism for the iterations of Nesterov's method in order to
improve the error behavior of the algorithm. Convergence rate analyses are
provided for both the heavy ball method and Nesterov's accelerated gradient method
with the help of dynamical systems analysis techniques. Finally, we conclude
with our numerical experiments showing that the presented algorithms have
advantages over the well-known differentially private algorithms.
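As an illustration of the general recipe (not the paper's exact algorithm), the sketch below shows one way to privatize Nesterov's accelerated gradient method: clip each gradient to bound its sensitivity, add Gaussian noise (the Gaussian mechanism), and take the usual momentum step. The function name dp_nesterov, the uniform per-iteration noise level sigma, and all parameter choices are assumptions for illustration; the paper's noise dividing mechanism allocates the noise across iterations differently.

    import numpy as np

    def dp_nesterov(grad, x0, L, mu, T, sigma, clip=1.0, seed=0):
        # Hypothetical sketch: Nesterov's method for an L-smooth, mu-strongly
        # convex objective, with each gradient clipped (to bound sensitivity)
        # and perturbed by Gaussian noise for differential privacy.
        rng = np.random.default_rng(seed)
        kappa = L / mu
        beta = (np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)  # momentum weight
        x = x0.copy()
        y = x0.copy()
        for _ in range(T):
            g = grad(y)
            g = g / max(1.0, np.linalg.norm(g) / clip)    # clip to norm <= clip
            g = g + rng.normal(0.0, sigma, size=g.shape)  # privatizing noise
            x_next = y - g / L                            # gradient step
            y = x_next + beta * (x_next - x)              # momentum step
            x = x_next
        return x

    # Example: minimize the quadratic 0.5 * ||x||^2 (L = mu = 1).
    x_hat = dp_nesterov(grad=lambda x: x, x0=np.ones(5),
                        L=1.0, mu=1.0, T=200, sigma=0.01)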