Limited-memory BFGS Systems with Diagonal Updates
In this paper, we investigate a formula to solve systems of the form (B + \sigma I)x = y, where B is a limited-memory BFGS quasi-Newton matrix and \sigma is a positive constant. Systems of this type arise naturally in large-scale optimization, for example in trust-region methods and doubly-augmented Lagrangian methods. We show that, provided a simple condition holds on B_0 and \sigma, the system (B + \sigma I)x = y can be solved via a recursion formula that requires only vector inner products. This formula has complexity O(M^2 n), where M is the number of L-BFGS updates and n >> M is the dimension of x.
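The shifted solve described above can be sketched with a generic compact low-rank model B = \gamma I + U C U^T and the Sherman-Morrison-Woodbury identity (a stand-in for the paper's inner-product recursion; the matrices U and C below are illustrative placeholders for the L-BFGS pair matrices, not the paper's exact representation):

```python
import numpy as np

def solve_shifted_lowrank(delta, U, C, y):
    """Solve (delta*I + U C U^T) x = y via Sherman-Morrison-Woodbury.

    Here delta = gamma + sigma collects the diagonal initialization
    B_0 = gamma*I plus the shift sigma; U (n x k) and C (k x k) form a
    generic compact low-rank representation.  The cost is dominated by
    the products U^T y and U^T U, i.e. O(k^2 n) inner products, which
    matches the O(M^2 n) complexity cited in the abstract.
    """
    Uty = U.T @ y                                  # k inner products with y
    small = np.linalg.inv(C) + (U.T @ U) / delta   # only a k x k system
    correction = U @ np.linalg.solve(small, Uty) / delta**2
    return y / delta - correction
```

The "simple condition on B_0 and \sigma" corresponds here to delta = gamma + sigma > 0, which keeps the shifted diagonal term invertible.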
On Solving L-SR1 Trust-Region Subproblems
In this article, we consider solvers for large-scale trust-region subproblems
when the quadratic model is defined by a limited-memory symmetric rank-one
(L-SR1) quasi-Newton matrix. We propose a solver that exploits the compact
representation of L-SR1 matrices. Our approach makes use of both an orthonormal
basis for the eigenspace of the L-SR1 matrix and the Sherman-Morrison-Woodbury
formula to compute global solutions to trust-region subproblems. To compute the
optimal Lagrange multiplier for the trust-region constraint, we use Newton's
method with a judicious initial guess that does not require safeguarding. A
crucial property of this solver is that it is able to compute high-accuracy
solutions even in the so-called hard case. Additionally, the optimal solution
is determined directly by formula, not iteratively. Numerical experiments
demonstrate the effectiveness of this solver.
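The Newton iteration for the optimal Lagrange multiplier can be sketched in the eigenbasis of the quasi-Newton matrix. The reciprocal ("secular") form of the constraint used below, and the simple feasible starting point, are illustrative assumptions rather than the paper's exact initialization:

```python
import numpy as np

def trust_region_sigma(lam, g, Delta, tol=1e-10, max_iter=50):
    """Newton's method on phi(sigma) = 1/||x(sigma)|| - 1/Delta, where
    x_i(sigma) = -g_i / (lam_i + sigma) is the model minimizer written
    in the eigenbasis of B (lam = eigenvalues, g = gradient coordinates).

    This reciprocal form of the secular equation is nearly linear in
    sigma, which is why Newton's method converges quickly from a
    feasible start without safeguarding (illustrative starting point,
    not the paper's initial guess).
    """
    sigma = max(0.0, -lam.min()) + 1e-4   # start in the feasible interval
    for _ in range(max_iter):
        d = lam + sigma
        x_norm = np.linalg.norm(g / d)
        phi = 1.0 / x_norm - 1.0 / Delta
        if abs(phi) < tol:
            break
        # phi'(sigma) = sum(g_i^2 / d_i^3) / ||x||^3
        dphi = np.sum(g**2 / d**3) / x_norm**3
        sigma -= phi / dphi
    return sigma
```

Once sigma is found, the abstract's point that the solution is "determined directly by formula" corresponds to evaluating x_i = -g_i / (lam_i + sigma) componentwise rather than iterating on x itself.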
Shape-Changing Trust-Region Methods Using Multipoint Symmetric Secant Matrices
In this work, we consider methods for large-scale and nonconvex unconstrained
optimization. We propose a new trust-region method whose subproblem is defined
using a so-called "shape-changing" norm together with densely-initialized
multipoint symmetric secant (MSS) matrices to approximate the Hessian.
Shape-changing norms and dense initializations have been successfully used in
the context of traditional quasi-Newton methods, but have yet to be explored in
the case of MSS methods. Numerical results suggest that trust-region methods that use densely-initialized MSS matrices together with shape-changing norms outperform other MSS-based trust-region methods.
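A shape-changing norm of the kind referred to above can be sketched as follows, assuming an orthonormal basis P for the relevant eigenspace of the quasi-Newton matrix (this is the (P, 2) variant from the shape-changing trust-region literature, shown for illustration; the paper's exact norm may differ):

```python
import numpy as np

def shape_changing_norm(v, P):
    """Shape-changing (P, 2) norm: split v into its component in the
    column space of the orthonormal basis P and its component in the
    orthogonal complement, then take the larger of the two 2-norms.

    The trust-region constraint ||p|| <= Delta measured in this norm
    decouples into separate constraints on the two subspaces, which is
    what makes the subproblem cheap to solve.  P is assumed to have
    orthonormal columns.
    """
    v_par = P.T @ v            # coordinates inside the eigenspace
    v_perp = v - P @ v_par     # orthogonal-complement component
    return max(np.linalg.norm(v_par), np.linalg.norm(v_perp))
```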
Iterative methods for large-scale unconstrained optimization
An unconstrained minimizer of a general nonlinear function may be found by solving a sequence of constrained subproblems in which a quadratic model function is minimized subject to a "trust-region" constraint on the norm of the change in variables. For the large-scale case, Steihaug has proposed an iterative method for the constrained subproblem based on the preconditioned conjugate-gradient (PCG) method. This method is terminated inside the trust region at an approximate minimizer or at the point where the iterates cross the trust-region boundary. When the iterates are terminated at the trust-region boundary, the final iterate is generally an inaccurate solution of the constrained subproblem. This may have an adverse effect on the efficiency and robustness of the overall trust-region method. A PCG-based method is proposed that may be used to solve the trust-region subproblem to any prescribed accuracy. The method starts by using a modified Steihaug method. If the solution lies on the trust-region boundary, a PCG-based sequential subspace minimization (SSM) method is used to solve the constrained problem over a sequence of evolving low-dimensional subspaces. A new regularized sequential Newton method is used to define basis vectors for the subspace minimization. Several preconditioners are proposed for the PCG iterations. Numerical results suggest that, in general, a trust-region method based on the proposed solver is more robust and requires fewer function evaluations than Steihaug's method.
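Steihaug's truncated-CG scheme described above can be sketched as follows. This is a minimal, unpreconditioned version that stops at the boundary or at negative curvature; the PCG, SSM, and regularized-Newton refinements proposed in the abstract are omitted:

```python
import numpy as np

def steihaug_cg(B, g, Delta, tol=1e-8, max_iter=100):
    """Truncated CG for min g^T p + 0.5 p^T B p subject to ||p|| <= Delta.

    Iterates start at p = 0.  If a CG direction has nonpositive curvature,
    or the next iterate would leave the trust region, the method steps to
    the trust-region boundary and stops; this boundary point is the
    inexact solution the abstract refers to.
    """
    p = np.zeros(g.size)
    r = -g                       # residual of B p = -g
    d = r.copy()
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            return p             # interior minimizer found
        Bd = B @ d
        dBd = d @ Bd
        if dBd <= 0:             # nonpositive curvature: go to the boundary
            return p + _boundary_tau(p, d, Delta) * d
        alpha = (r @ r) / dBd
        p_next = p + alpha * d
        if np.linalg.norm(p_next) >= Delta:   # would exit: stop on boundary
            return p + _boundary_tau(p, d, Delta) * d
        r_next = r - alpha * Bd
        beta = (r_next @ r_next) / (r @ r)
        d = r_next + beta * d
        p, r = p_next, r_next
    return p

def _boundary_tau(p, d, Delta):
    """Positive root tau of ||p + tau*d|| = Delta."""
    a, b, c = d @ d, 2 * (p @ d), p @ p - Delta**2
    return (-b + np.sqrt(b**2 - 4 * a * c)) / (2 * a)
```

When the returned point lies on the boundary, it is exactly the kind of inaccurate subproblem solution that motivates the higher-accuracy SSM solver proposed in the abstract.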