On limited-memory quasi-Newton methods for minimizing a quadratic function
The main focus in this paper is exact linesearch methods for minimizing a
quadratic function whose Hessian is positive definite. We give two classes of
limited-memory quasi-Newton Hessian approximations that generate search
directions parallel to those of the method of preconditioned conjugate
gradients, and hence give finite termination on quadratic optimization
problems. The Hessian approximations are described by a novel compact
representation which provides a dynamical framework. We also discuss possible
extensions of these classes and show their behavior on randomly generated
quadratic optimization problems. Numerically, the methods behave similarly to
L-BFGS. Including information from the first iteration in the limited-memory
Hessian approximation, and in L-BFGS, significantly reduces the effects of
round-off errors on the problems considered. In addition, we give our compact
representation of the Hessian approximations in the full Broyden class for the
general unconstrained optimization problem. This representation consists of
explicit matrices and gradients only as vector components.
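The connection the abstract describes — limited-memory quasi-Newton directions that coincide with conjugate-gradient directions under exact linesearch, giving finite termination on strictly convex quadratics — can be sketched minimally with a memoryless BFGS iteration. This is a generic illustration of that equivalence, not the paper's specific Hessian-approximation classes; the function names are placeholders.

```python
import numpy as np

def exact_ls_quadratic(H, g, d):
    # Exact step length for the quadratic 0.5 x^T H x - b^T x:
    # alpha = -g^T d / (d^T H d)
    return -(g @ d) / (d @ H @ d)

def memoryless_bfgs_quadratic(H, b, x0, tol=1e-10, max_iter=100):
    """Minimize 0.5 x^T H x - b^T x (H symmetric positive definite)
    with memoryless BFGS and exact linesearch.

    Under exact linesearch on an SPD quadratic, the memoryless BFGS
    direction reduces to the Hestenes-Stiefel conjugate gradient
    direction, so the iteration terminates in at most n steps in
    exact arithmetic.
    """
    x = x0.copy()
    g = H @ x - b
    d = -g
    k = 0
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = exact_ls_quadratic(H, g, d)
        x_new = x + alpha * d
        g_new = H @ x_new - b
        y = g_new - g
        # Hestenes-Stiefel beta; equals the memoryless BFGS direction
        # because exact linesearch makes g_new orthogonal to the step.
        beta = (g_new @ y) / (d @ y)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x, k
```

On a well-conditioned quadratic the iterate count matches the CG count, which is the finite-termination behavior the abstract refers to.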
Probabilistic Interpretation of Linear Solvers
This manuscript proposes a probabilistic framework for algorithms that
iteratively solve unconstrained linear problems with a symmetric positive
definite system matrix. The goal is to replace the point estimates returned by
existing methods with a Gaussian posterior belief over the elements of the
inverse of the system matrix, which can be used to estimate errors. Recent
probabilistic interpretations
of the secant family of quasi-Newton optimization algorithms are extended.
Combined with properties of the conjugate gradient algorithm, this leads to
uncertainty-calibrated methods with very limited cost overhead over conjugate
gradients, a self-contained novel interpretation of the quasi-Newton and
conjugate gradient algorithms, and a foundation for new nonlinear optimization
methods.

Comment: final version, in press at SIAM J Optimization
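The posterior over the inverse matrix that such probabilistic solvers maintain has a simple point-estimate counterpart: the A-conjugate search directions of conjugate gradients implicitly build a low-rank reconstruction of the inverse, which becomes exact after n steps. The sketch below shows only this deterministic estimate, as a plain illustration, not the paper's calibrated Gaussian posterior.

```python
import numpy as np

def cg_with_inverse_estimate(A, b, tol=1e-12):
    """Conjugate gradients on A x = b (A symmetric positive definite),
    accumulating the low-rank estimate of the inverse

        inv(A) ~ sum_i d_i d_i^T / (d_i^T A d_i)

    from the A-conjugate search directions d_i. After n steps the
    sum equals inv(A) exactly; a probabilistic solver would treat
    the partial sum as the mean of a belief over the inverse.
    """
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x
    d = r.copy()
    Ainv_est = np.zeros((n, n))
    for _ in range(n):
        Ad = A @ d
        dAd = d @ Ad
        Ainv_est += np.outer(d, d) / dAd
        alpha = (r @ r) / dAd
        x = x + alpha * d
        r_new = r - alpha * Ad
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)
        d = r_new + beta * d
        r = r_new
    return x, Ainv_est
```

Stopping early yields a rank-k partial sum; the uncertainty the paper attaches to that partial estimate is what makes the error estimates possible.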
Approximation of sequences of symmetric matrices with the symmetric rank-one algorithm and applications
The symmetric rank-one update method is well-known in optimization for its
applications in the quasi-Newton algorithm. In particular, Conn, Gould, and
Toint proved in 1991 that the matrix sequence resulting from this method
approximates the Hessian of the minimized function. Expanding their idea, we
prove that the symmetric rank-one update algorithm can be used to approximate
any sequence of symmetric invertible matrices, thereby adding a variety of
applications to more general problems, such as the computation of constrained
geodesics in shape analysis imaging problems. We also provide numerical
simulations for the method and some of these applications.

Comment: 11 pages
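The hereditary property behind the Conn-Gould-Toint result — that SR1 updates preserve all previous secant conditions when the target matrix is fixed, so n updates along linearly independent directions reconstruct it exactly — is easy to demonstrate. Below is a minimal sketch with the standard skipping safeguard; it illustrates the classical SR1 update, not the paper's extension to sequences of matrices.

```python
import numpy as np

def sr1_update(B, s, y, skip_tol=1e-8):
    """One symmetric rank-one update of B enforcing B_new s = y.

    The update is skipped when the denominator (y - B s)^T s is
    small relative to ||s|| * ||y - B s||, the standard safeguard
    against an undefined or unstable update.
    """
    r = y - B @ s
    denom = r @ s
    if abs(denom) <= skip_tol * np.linalg.norm(s) * np.linalg.norm(r):
        return B  # skip: update undefined or numerically unstable
    return B + np.outer(r, r) / denom
```

Feeding pairs (s, M s) for a fixed symmetric M drives B to M in n unskipped steps with linearly independent directions; M need not be positive definite, which is what makes SR1 useful for approximating general symmetric matrices.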
A symmetric rank-one Quasi-Newton line-search method using negative curvature directions
We propose a quasi-Newton line-search method that uses negative curvature directions for solving unconstrained optimization problems. In this method, the symmetric rank-one (SR1) rule is used to update the Hessian approximation. The SR1 update rule is known for its good numerical performance; however, it does not guarantee positive definiteness of the updated matrix. We first discuss the details of the proposed algorithm and then concentrate on its numerical efficiency. Our extensive computational study shows the potential of the proposed method from different angles, such as its second-order convergence behavior, its superior performance compared with two other existing packages, and its computation profile illustrating possible bottlenecks in the execution time. We then conclude the paper with a convergence analysis of the proposed method.
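Because the SR1 matrix can be indefinite, a line-search method built on it must decide what to do with negative curvature rather than discard it. One generic way to exploit it — not the paper's specific rule, and with hypothetical function names — is to split the direction over the eigendecomposition of the approximation: a Newton-like component on the positive-curvature subspace plus a suitably signed eigenvector of the most negative eigenvalue.

```python
import numpy as np

def sr1_search_direction(B, g, eps=1e-8):
    """Hypothetical direction choice for an indefinite SR1 matrix B
    and gradient g: combine a Newton-like direction restricted to the
    positive-curvature eigenspace with a direction of negative
    curvature (signed to be a descent direction) when one exists.
    """
    w, V = np.linalg.eigh(B)  # ascending eigenvalues
    # Newton-like component: invert only the positive-curvature modes
    w_safe = np.where(w > eps, w, np.inf)  # non-positive modes dropped
    d_newton = -V @ ((V.T @ g) / w_safe)
    if w[0] < -eps:
        # eigenvector of the most negative eigenvalue, signed so that
        # it does not increase the objective to first order
        v = V[:, 0]
        d_neg = v if v @ g <= 0 else -v
        return d_newton + d_neg
    return d_newton
```

Both components have non-positive inner product with the gradient, so the combined direction is a descent direction whenever the gradient is nonzero on the retained subspaces.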