Sketched Newton-Raphson
We propose a new globally convergent stochastic second-order method. Our
starting point is the development of a new Sketched Newton-Raphson (SNR) method
for solving large scale nonlinear equations of the form with
. We then show how to design several
stochastic second-order optimization methods by rewriting the optimization
problem of interest as a system of nonlinear equations and applying SNR. For
instance, by applying SNR to find a stationary point of a generalized linear
model (GLM), we derive completely new and scalable stochastic second-order
methods. We show that the resulting method is highly competitive with
state-of-the-art variance-reduced methods. Furthermore, using a variable
splitting trick, we also show that the Stochastic Newton method (SNM) is a
special case of SNR, and use this connection to establish the first global
convergence theory of SNM.
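To make the SNR recipe concrete, below is a minimal NumPy sketch of one SNR-style iteration. It assumes a Gaussian sketching matrix $S_k$ and a sketch-and-project update of the form $x_{k+1} = x_k - \gamma\, DF(x_k)^\top S_k \bigl(S_k^\top DF(x_k)\, DF(x_k)^\top S_k\bigr)^\dagger S_k^\top F(x_k)$; this update form, the step size $\gamma$, and the sketch size $\tau$ are illustrative assumptions, not details given in this abstract.

```python
import numpy as np

def snr_step(F, DF, x, gamma=1.0, tau=10, rng=None):
    """One Sketched Newton-Raphson (SNR) style step for solving F(x) = 0.

    Assumed update (not confirmed by the abstract):
        x <- x - gamma * J^T S (S^T J J^T S)^+ S^T F(x),
    where J = DF(x) and S is a random sketching matrix.
    """
    rng = np.random.default_rng() if rng is None else rng
    Fx = F(x)                                # residual F(x), shape (m,)
    J = DF(x)                                # Jacobian DF(x), shape (m, p)
    S = rng.standard_normal((Fx.size, tau))  # Gaussian sketch, shape (m, tau)
    SJ = S.T @ J                             # sketched Jacobian S^T J, shape (tau, p)
    # Only a small tau-by-tau system is (pseudo)inverted per step.
    lam = np.linalg.pinv(SJ @ SJ.T) @ (S.T @ Fx)
    return x - gamma * (SJ.T @ lam)

# Toy usage: solve the 2x2 nonlinear system F(x) = (x0^2 - 1, x0 + x1) = 0.
F = lambda x: np.array([x[0] ** 2 - 1.0, x[0] + x[1]])
DF = lambda x: np.array([[2.0 * x[0], 0.0], [1.0, 1.0]])
x = np.array([2.0, 2.0])
for _ in range(200):
    x = snr_step(F, DF, x, gamma=0.5, tau=2)
```

When $\tau$ matches the full residual dimension and the sketch is invertible, the step reduces to a damped Newton step; smaller $\tau$ trades per-step cost for stochasticity, which is the large-scale regime the abstract targets.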
We establish the global convergence of SNR by showing that it is a variant of
the stochastic gradient descent (SGD) method, and then leveraging proof
techniques of SGD. As a special case, our theory also provides a new global
convergence theory for the original Newton-Raphson method under strictly weaker
assumptions than those required by the classic monotone convergence theory.

Comment: Accepted for publication in SIAM Journal on Optimization. 47 pages, 4 figures.
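As a side check on the Newton-Raphson special case, under the same assumed update form used in the sketch above: taking the sketch $S_k = I$ with an invertible Jacobian collapses the SNR step to the classic damped Newton-Raphson step.

```latex
% Assumed SNR update, specialized to S_k = I with DF(x_k) invertible:
\[
x_{k+1}
  = x_k - \gamma \, DF(x_k)^\top \bigl( DF(x_k)\, DF(x_k)^\top \bigr)^{-1} F(x_k)
  = x_k - \gamma \, DF(x_k)^{-1} F(x_k),
\]
% using DF^\top (DF\, DF^\top)^{-1} = DF^\top DF^{-\top} DF^{-1} = DF^{-1}.
```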