
    Limited-memory BFGS Systems with Diagonal Updates

    In this paper, we investigate a formula to solve systems of the form (B + σI)x = y, where B is a limited-memory BFGS quasi-Newton matrix and σ is a positive constant. Such systems arise naturally in large-scale optimization, for example in trust-region methods and in doubly-augmented Lagrangian methods. We show that, provided a simple condition holds on B_0 and σ, the system (B + σI)x = y can be solved via a recursion formula that requires only vector inner products. This formula has complexity O(M²n), where M is the number of L-BFGS updates and n ≫ M is the dimension of x.
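The abstract does not reproduce the paper's recursion. For context, a standard alternative route to the same shifted solve (a sketch under assumed notation, not the paper's formula) combines the compact L-BFGS representation of Byrd, Nocedal, and Schnabel with the Sherman-Morrison-Woodbury identity, so only small 2M×2M dense algebra is needed besides matrix-vector products:

```python
import numpy as np

rng = np.random.default_rng(0)
n, M, gamma, sigma = 8, 3, 2.0, 0.5

# Curvature pairs (s_i, y_i) with s_i^T y_i > 0, drawn from an SPD model Hessian A.
A = rng.standard_normal((n, n))
A = A @ A.T + n * np.eye(n)
S = rng.standard_normal((n, M))
Yp = A @ S

# Reference matrix: M explicit BFGS updates starting from B0 = gamma*I.
B = gamma * np.eye(n)
for i in range(M):
    s, yv = S[:, i], Yp[:, i]
    Bs = B @ s
    B = B + np.outer(yv, yv) / (yv @ s) - np.outer(Bs, Bs) / (s @ Bs)

# Compact representation (Byrd-Nocedal-Schnabel): B = gamma*I - Psi @ inv(Q) @ Psi.T
SY = S.T @ Yp
D = np.diag(np.diag(SY))      # diagonal of S^T Y
L = np.tril(SY, -1)           # strictly lower triangle of S^T Y
Psi = np.hstack([gamma * S, Yp])
Q = np.block([[gamma * (S.T @ S), L], [L.T, -D]])

# Shifted solve (B + sigma*I) x = y via the Sherman-Morrison-Woodbury identity.
delta = gamma + sigma
y = rng.standard_normal(n)
x = y / delta - Psi @ np.linalg.solve(Psi.T @ Psi - delta * Q, Psi.T @ y) / delta
```

The Woodbury step inverts only a 2M×2M matrix, matching the spirit (though not the mechanics) of the paper's inner-product-only recursion.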

    Improved Diagonal Hessian Approximations for Large-Scale Unconstrained Optimization

    We consider some diagonal quasi-Newton methods for solving large-scale unconstrained optimization problems. A simple and effective approach for diagonal quasi-Newton algorithms is presented by proposing new updates of the diagonal entries of the Hessian. Moreover, we suggest employing an extra BFGS update of the diagonal updating matrix and using its diagonal again. Numerical experiments on a collection of standard test problems show, in particular, that the proposed diagonal quasi-Newton methods perform substantially better than certain available diagonal methods.
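The authors' specific updates are not given in the abstract. As a generic illustration of the diagonal quasi-Newton idea (a classical weak-secant update in the style of Zhu, Nazareth, and Wolkowicz, not the authors' formula), one can modify the diagonal minimally so that the weak secant condition sᵀD₊s = sᵀy holds:

```python
import numpy as np

def diag_weak_secant_update(d, s, yv):
    """Minimally perturb the diagonal Hessian approximation d (stored as a
    vector of diagonal entries) so that the weak secant condition
    s^T D+ s = s^T y holds after the update."""
    s2 = s * s
    return d + (yv @ s - d @ s2) / (s2 @ s2) * s2

# Check on a diagonal quadratic f(x) = 0.5 x^T A x, where y = grad difference.
A = np.diag([1.0, 4.0, 9.0])
x0, x1 = np.array([1.0, 1.0, 1.0]), np.array([0.5, 0.2, 0.1])
s = x1 - x0
yv = A @ x1 - A @ x0          # gradient difference over the step
d = np.ones(3)                # initial diagonal approximation D0 = I
d = diag_weak_secant_update(d, s, yv)
```

After the update, sᵀD₊s equals sᵀy exactly, while only the diagonal is stored, keeping per-iteration cost and memory O(n).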

    Numerical optimization design of advanced transonic wing configurations

    A computationally efficient and versatile technique for use in the design of advanced transonic wing configurations has been developed. A reliable and fast transonic wing flow-field analysis program, TWING, has been coupled with a modified quasi-Newton unconstrained optimization algorithm, QNMDIF, to create a new design tool. Fully three-dimensional wing designs utilizing both specified wing pressure distributions and drag-to-lift ratio minimization as design objectives are demonstrated. Because of the high computational efficiency of each of the components of the design code, in particular the vectorization of TWING and the high speed of the Cray X-MP vector computer, the computer time required for a typical wing design is reduced by approximately an order of magnitude over previous methods. In the results presented here, the computed wave drag has been used as the quantity to be optimized (minimized) with great success, yielding wing designs with nearly shock-free (zero wave drag) pressure distributions and very reasonable wing section shapes.

    Enhancing structure relaxations for first-principles codes: an approximate Hessian approach

    We present a method for improving the speed of geometry relaxation by using a harmonic approximation for the interaction potential between nearest-neighbor atoms to construct an initial Hessian estimate. The model is quite robust and yields approximately a 30% or better reduction in the number of calculations compared to an optimized diagonal initialization. Convergence with this initializer approaches the speed of a converged BFGS Hessian, so it is close to the best that can be achieved. Hessian preconditioning is discussed, and it is found that a compromise between an average condition number and a narrow distribution in eigenvalues produces the best optimization.
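As a toy illustration of the idea (assumptions of this sketch: a 1-D chain of atoms with unit spring constant, not the paper's actual interatomic model), a harmonic nearest-neighbour potential yields a tridiagonal model Hessian that can seed a BFGS-style relaxation in place of a diagonal guess:

```python
import numpy as np

def chain_hessian(n_atoms, k=1.0):
    """Model Hessian of E = 0.5*k*sum_i (x_{i+1} - x_i)^2 for a 1-D chain:
    each harmonic nearest-neighbour bond contributes a 2x2 block
    [[k, -k], [-k, k]], giving a symmetric positive-semidefinite matrix."""
    H = np.zeros((n_atoms, n_atoms))
    for i in range(n_atoms - 1):
        H[i, i] += k
        H[i + 1, i + 1] += k
        H[i, i + 1] -= k
        H[i + 1, i] -= k
    return H

H = chain_hessian(4)
```

The one zero eigenvalue corresponds to rigid translation of the chain; in practice such modes are projected out or regularized before the matrix is used as a Hessian initializer.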

    Three-step fixed-point quasi-Newton methods for unconstrained optimisation

    Multistep quasi-Newton methods were introduced by Ford and Moghrabi [1]. They address the problem of the unconstrained minimisation of a function f: ℝ^n → ℝ whose gradient and Hessian are denoted by g and G, respectively. These methods generalised the standard construction of quasi-Newton methods and were based on employing interpolatory polynomials to utilise information from more than one previous step. In a series of papers, Ford and Moghrabi [2–5] have developed various techniques for determining the parametrisation of the interpolating curves. In [2], they introduced two-step metric-based methods which determine the set of parameter values required through measuring distances between various pairs of the iterates employed in the current interpolation. One of the most successful methods in [2] was found to be in the "fixed-point" class, in which the parametrisation of the interpolating curve is determined, at each iteration, by reference to distances measured from a fixed iterate. As suggested in [1], multistep quasi-Newton methods can be constructed for any number of steps. In this paper, we therefore extend the previous work by describing the development of some three-step methods which use the "fixed-point" approach and data derived from the latest four iterates. The experimental results provide evidence that the new methods offer a significant improvement in performance when compared with the standard BFGS method and the unit-spaced three-step method, particularly as the dimension of the test problems grows.

    Unifying Optimization Algorithms to Aid Software System Users: optimx for R

    R users can often solve optimization tasks easily using the tools in the optim function in the stats package provided by default on R installations. However, there are many other optimization and nonlinear modelling tools in R or in easily installed add-on packages. These present users with a bewildering array of choices. optimx is a wrapper to consolidate many of these choices for the optimization of functions that are mostly smooth with parameters at most bounds-constrained. We attempt to provide some diagnostic information about the function, its scaling and parameter bounds, and the solution characteristics. optimx runs a battery of methods on a given problem, thus facilitating comparative studies of optimization algorithms for the problem at hand. optimx can also be a useful pedagogical tool for demonstrating the strengths and pitfalls of different classes of optimization approaches, including Newton, gradient, and derivative-free methods.
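The battery-of-methods idea can be sketched in Python (optimx itself is an R package; the SciPy calls below are an analogy assumed for this sketch, not part of optimx): run several optimisers on one objective and tabulate final objective values and function-evaluation counts for comparison.

```python
import numpy as np
from scipy.optimize import minimize

def rosen(x):
    """2-D Rosenbrock function, a standard smooth test problem."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

x0 = np.array([-1.2, 1.0])
results = {}
# A small "battery" spanning derivative-free, quasi-Newton, and
# limited-memory bounds-capable methods.
for method in ["Nelder-Mead", "BFGS", "L-BFGS-B"]:
    r = minimize(rosen, x0, method=method)
    results[method] = (r.fun, r.nfev)  # final objective, evaluation count

for method, (fun, nfev) in results.items():
    print(f"{method:12s} f = {fun:.2e}  nfev = {nfev}")
```

Comparing `nfev` across methods on the same problem is exactly the kind of side-by-side study the wrapper is meant to automate.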