A Stein variational Newton method
Stein variational gradient descent (SVGD) was recently proposed as a general
purpose nonparametric variational inference algorithm [Liu & Wang, NIPS 2016]:
it minimizes the Kullback-Leibler divergence between the target distribution
and its approximation by implementing a form of functional gradient descent on
a reproducing kernel Hilbert space. In this paper, we accelerate and generalize
the SVGD algorithm by including second-order information, thereby approximating
a Newton-like iteration in function space. We also show how second-order
information can lead to more effective choices of kernel. We observe
significant computational gains over the original SVGD algorithm in multiple
test cases.
Comment: 18 pages, 7 figures
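As a concrete reference point, here is a minimal NumPy sketch of the
first-order SVGD update the abstract describes (the RBF kernel with a
median-heuristic bandwidth is a common default; the step size is an
illustrative choice, and the paper's Newton-like second-order variant is not
shown):

```python
import numpy as np

def svgd_step(particles, grad_log_p, step_size=0.1):
    # particles: (n, d) array; grad_log_p maps an (n, d) array to (n, d) scores
    n = particles.shape[0]
    diffs = particles[:, None, :] - particles[None, :, :]  # x_i - x_j, (n, n, d)
    sq = np.sum(diffs ** 2, axis=-1)                       # squared distances
    h = np.median(sq) / np.log(n + 1)                      # median-heuristic bandwidth
    k = np.exp(-sq / h)                                    # RBF kernel matrix, (n, n)
    # Attractive term: kernel-smoothed scores pull particles toward high density.
    drift = k @ grad_log_p(particles)
    # Repulsive term: sum_j grad_{x_j} k(x_j, x_i) keeps the particles spread out.
    repulsion = (2.0 / h) * np.sum(diffs * k[..., None], axis=1)
    return particles + step_size * (drift + repulsion) / n

# Example: transport particles toward a standard Gaussian, whose score is -x.
x = np.random.default_rng(0).normal(5.0, 3.0, size=(200, 2))
for _ in range(1000):
    x = svgd_step(x, lambda p: -p)
```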
A Framework for Generalising the Newton Method and Other Iterative Methods from Euclidean Space to Manifolds
The Newton iteration is a popular method for minimising a cost function on
Euclidean space. Various generalisations to cost functions defined on manifolds
appear in the literature. In each case, the convergence rate of the generalised
Newton iteration had to be established from first principles. This paper
presents a framework for generalising iterative methods from Euclidean space to
manifolds that ensures local convergence rates are preserved. It applies to any
(memoryless) iterative method computing a coordinate independent property of a
function (such as a zero or a local minimum). All possible Newton methods on
manifolds are believed to come under this framework. Changes of coordinates,
and not any Riemannian structure, are shown to play a natural role in lifting
the Newton method to a manifold. The framework also gives new insight into the
design of Newton methods in general.
Comment: 36 pages
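For orientation, a minimal sketch of the Euclidean Newton iteration that the
framework lifts to manifolds (the Rosenbrock cost is a stock test problem here,
and a practical implementation would add a line search or trust region):

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=100):
    # Plain Euclidean Newton iteration: x <- x - H(x)^{-1} g(x).
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        step = np.linalg.solve(hess(x), grad(x))
        x = x - step
        if np.linalg.norm(step) < tol:  # locally quadratic convergence
            break
    return x

# Example: minimize the Rosenbrock function (1-x)^2 + 100(y-x^2)^2.
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400*(x[1] - 3*x[0]**2), -400*x[0]],
                           [-400*x[0], 200.0]])
print(newton_minimize(grad, hess, [-1.2, 1.0]))  # approx [1., 1.]
```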
Fractional Newton-Raphson Method Accelerated with Aitken's Method
The Newton-Raphson (N-R) method is characterized by the fact that generating
a divergent sequence can lead to the creation of a fractal. On the other hand,
the order of a fractional derivative appears to be closely related to the
fractal dimension. Based on the above, a method was developed that combines
the N-R method with the fractional derivative of Riemann-Liouville (R-L); it
has been named the Fractional Newton-Raphson (F N-R) method.
In this work we present a way to establish the convergence of the F N-R
method, which appears to be at least linearly convergent when the order of
the derivative is different from one. We give a simplified construction of
the fractional derivative and fractional integral operators of R-L, introduce
Aitken's method and explain why it is able to accelerate the convergence of
iterative methods, and finally present the results obtained when applying
Aitken's method to the F N-R method.
Comment: Newton-Raphson Method, Fractional Calculus, Fractional Derivative of
Riemann-Liouville, Method of Aitken. arXiv admin note: substantial text
overlap with arXiv:1710.0763
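A minimal sketch of the Aitken delta-squared transform the abstract refers
to, applied here to a generic linearly convergent fixed-point sequence (the
cosine iteration stands in for the F N-R sequence; it is not from the paper):

```python
import math

def aitken(seq):
    # x'_n = x_n - (dx_n)^2 / (d2x_n); cancels the dominant linear error term.
    return [x0 - (x1 - x0) ** 2 / (x2 - 2 * x1 + x0)
            for x0, x1, x2 in zip(seq, seq[1:], seq[2:])]

# Example: the linearly convergent fixed-point iteration x <- cos(x).
xs = [1.0]
for _ in range(10):
    xs.append(math.cos(xs[-1]))
print(xs[-1])          # still visibly off the fixed point 0.7390851...
print(aitken(xs)[-1])  # the accelerated sequence is much closer
```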
A geometric Newton method for Oja's vector field
Newton's method for solving the matrix equation
$F(X) \equiv AX - XX^{\top}AX = 0$ runs up against the fact that its zeros are
not isolated. This is due to a symmetry of $F$ under the action of the
orthogonal group. We show how differential-geometric techniques can be
exploited to remove this symmetry and obtain a "geometric" Newton algorithm
that finds the zeros of $F$. The geometric Newton method does not suffer from
the degeneracy issue that stands in the way of the original Newton method.
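Assuming Oja's vector field takes its standard form F(X) = AX - X X^T A X, a
quick NumPy check of the orthogonal-group symmetry F(XQ) = F(X) Q that makes
the zeros non-isolated:

```python
import numpy as np

def oja_field(A, X):
    # Assumed form of Oja's vector field: F(X) = A X - X X^T A X.
    return A @ X - X @ (X.T @ A @ X)

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
A = A + A.T                                       # symmetric matrix
X = rng.standard_normal((6, 2))
Q, _ = np.linalg.qr(rng.standard_normal((2, 2)))  # random orthogonal Q
print(np.allclose(oja_field(A, X @ Q), oja_field(A, X) @ Q))  # True
```

Since Q Q^T = I, any zero X of F yields a whole orbit {XQ} of zeros, which is
why the classical Newton iteration degenerates there.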
A quasi-Newton proximal splitting method
A new result in convex analysis on the calculation of proximity operators in
certain scaled norms is derived. We describe efficient implementations of the
proximity calculation for a useful class of functions; the implementations
exploit the piecewise linear nature of the dual problem. The second part of
the paper applies this result to the acceleration of convex minimization
problems, leading to an elegant quasi-Newton method. The optimization method
compares favorably against state-of-the-art alternatives. The algorithm has
extensive applications, including signal processing, sparse recovery, machine
learning, and classification.
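For context, a minimal sketch of the unscaled proximal-gradient baseline that
a quasi-Newton proximal splitting method accelerates, shown on the lasso
problem 0.5*||Ax - b||^2 + lam*||x||_1 (the paper's contribution, a prox
computed in a quasi-Newton scaled norm, is not reproduced here):

```python
import numpy as np

def soft_threshold(x, t):
    # Proximity operator of t*||.||_1 in the standard (identity-scaled) norm.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(A, b, lam, step, n_iter=500):
    # Plain proximal gradient (ISTA): gradient step on the smooth term, then
    # the l1 prox; a quasi-Newton variant would scale both by a metric.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Example on random data; step = 1/||A||_2^2 guarantees convergence.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))
b = A @ (rng.standard_normal(100) * (rng.random(100) < 0.1))
x = proximal_gradient(A, b, lam=0.1, step=1.0 / np.linalg.norm(A, 2) ** 2)
```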
