A new, globally convergent Riemannian conjugate gradient method
This article deals with the conjugate gradient method on a Riemannian
manifold, with a focus on global convergence analysis. Existing conjugate
gradient algorithms on a manifold endowed with a vector transport need the
assumption that the vector transport does not increase the norm of tangent
vectors in order to guarantee that the generated sequences have a global
convergence property. In this article, the notion of a scaled vector
transport is introduced to improve the algorithm so that the generated
sequences retain the global convergence property under a relaxed assumption.
In the proposed algorithm, the transported vector is rescaled whenever its
norm has increased during the transport. Global convergence is proved
theoretically and observed numerically. In fact, the numerical experiments
exhibit minimization problems for which the existing algorithm generates
divergent sequences while the proposed algorithm generates convergent ones.

Comment: 22 pages, 8 figures
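The rescaling rule is simple enough to sketch. Below is a minimal, illustrative Python version, assuming the manifold supplies a transport map between tangent spaces and a norm function for its Riemannian metric; both callables and all names are placeholders of mine, not the paper's code.

```python
def scaled_transport(transport, norm, x, y, eta, xi):
    """Transport the tangent vector xi from T_x M to T_y M along the
    step eta, then rescale it if (and only if) the transport increased
    its norm -- the 'scaled vector transport' idea from the abstract.
    `transport` and `norm` stand in for a concrete manifold's vector
    transport and Riemannian norm."""
    v = transport(x, y, eta, xi)    # ordinary vector transport
    before = norm(x, xi)            # norm in the source tangent space
    after = norm(y, v)              # norm after transport
    if after > before > 0.0:
        v = v * (before / after)    # shrink back to the original norm
    return v
```

Inside a Riemannian CG iteration, this would replace the plain transport of the previous search direction, so the norm condition required by the global convergence analysis holds by construction.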
Conjugate gradient algorithms and the Galerkin boundary element method
Original article can be found at: http://www.sciencedirect.com/science/journal/08981221. Copyright Elsevier Ltd. DOI: 10.1016/j.camwa.2004.02.002. Peer reviewed.
On Quasi-Newton Forward--Backward Splitting: Proximal Calculus and Convergence
We introduce a framework for quasi-Newton forward--backward splitting
algorithms (proximal quasi-Newton methods) with a metric induced by
diagonal ± rank-r symmetric positive definite matrices. This special type
of metric allows for a highly efficient evaluation of the proximal mapping.
The key to this efficiency is a general proximal calculus in the new
metric. Using duality, formulas are derived that relate the proximal
mapping in a rank-r modified metric to the proximal mapping in the original
metric. We also describe efficient implementations of the proximity
calculation for a large class of functions; the implementations exploit the
piecewise linear nature of the dual problem. We then apply these results to
accelerate composite convex minimization, which leads to elegant
quasi-Newton methods with proven convergence. The algorithm is tested on
several numerical examples and compared against a comprehensive list of
alternatives from the literature. Our quasi-Newton splitting algorithm with
the prescribed metric compares favorably with the state of the art. The
algorithm has extensive applications, including signal processing, sparse
recovery, machine learning, and classification, to name a few.

Comment: arXiv admin note: text overlap with arXiv:1206.115
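To make the duality-based calculus concrete, here is a minimal sketch of the rank-one case, assuming the metric V = diag(d) + u u^T and h = lam*||.||_1; these choices, and all function names, are illustrative assumptions of mine, not the paper's code. Completing the square reduces the n-dimensional prox in V to a prox in the diagonal metric plus a one-dimensional, monotone dual root-finding problem; a Sherman-Morrison solve then gives a full variable-metric forward-backward step.

```python
import numpy as np
from scipy.optimize import brentq

def prox_l1_diag(x, lam, d):
    """Prox of lam*||.||_1 in the diagonal metric diag(d):
    componentwise soft-thresholding with thresholds lam/d."""
    return np.sign(x) * np.maximum(np.abs(x) - lam / d, 0.0)

def prox_l1_rank1(x, lam, d, u):
    """Prox of lam*||.||_1 in the metric V = diag(d) + u u^T.
    Writing (u^T z)^2 / 2 = max_a [a u^T z - a^2 / 2] and swapping
    min and max reduces the problem to the root of a strictly
    increasing scalar function -- a sketch of the rank-1 instance
    of the proximal calculus described in the abstract."""
    def residual(a):
        y = prox_l1_diag(x - a * u / d, lam, d)
        return a - u @ (y - x)
    lo, hi = -1.0, 1.0              # grow a bracket with a sign change
    while residual(lo) > 0.0:
        lo *= 2.0
    while residual(hi) < 0.0:
        hi *= 2.0
    a = brentq(residual, lo, hi)    # Brent's method on the scalar dual
    return prox_l1_diag(x - a * u / d, lam, d)

def qn_fb_step(x, grad_f, lam, d, u):
    """One variable-metric forward-backward step
    x+ = prox_h^V(x - V^{-1} grad f(x)) with V = diag(d) + u u^T,
    using Sherman-Morrison for the V^{-1} solve. `grad_f` is an
    assumed callable returning the gradient of the smooth part."""
    g = grad_f(x)
    Dinv_g, Dinv_u = g / d, u / d
    step = Dinv_g - Dinv_u * (u @ Dinv_g) / (1.0 + u @ Dinv_u)
    return prox_l1_rank1(x - step, lam, d, u)
```

In the full framework, d and u would come from a diagonal-plus-rank-r quasi-Newton approximation of the Hessian of the smooth part; here they are free parameters of the sketch.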