600 research outputs found

    The Davidon-Fletcher-Powell penalty function method: A generalized iterative technique for solving parameter optimization problems

    The Fletcher-Powell version of the Davidon variable metric unconstrained minimization technique is described. Equations that have been used successfully with the Davidon-Fletcher-Powell penalty function technique for solving constrained minimization problems are discussed, along with the advantages and disadvantages of using them. Experience gained with the behavior of the method while iterating is also related.
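    The DFP variable metric iteration described above can be sketched as follows. This is a minimal illustration of the unconstrained DFP update (with a simple backtracking line search standing in for the exact line searches used historically), not the penalty-function code from the paper:

```python
import numpy as np

def dfp_minimize(f, grad, x0, iters=50, tol=1e-8):
    """Minimize f with the Davidon-Fletcher-Powell variable metric update.

    A minimal sketch: H approximates the inverse Hessian and is revised
    by the DFP rank-two formula after each accepted step.
    """
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))                    # initial inverse-Hessian estimate
    g = grad(x)
    for _ in range(iters):
        d = -H @ g                        # quasi-Newton search direction
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):   # Armijo backtracking
            t *= 0.5
        s = t * d                         # step actually taken
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                     # change in gradient
        if np.linalg.norm(g_new) < tol:
            return x_new
        # DFP rank-two update of the inverse-Hessian estimate
        Hy = H @ y
        H = H + np.outer(s, s) / (s @ y) - np.outer(Hy, Hy) / (y @ Hy)
        x, g = x_new, g_new
    return x
```

    On a convex quadratic with a reasonable line search the update keeps H positive definite, which is why the search direction remains a descent direction.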

    An acceleration technique for a conjugate direction algorithm for nonlinear regression

    A linear acceleration technique, LAT, is developed and applied to three conjugate direction algorithms: (1) the Fletcher-Reeves algorithm, (2) the Davidon-Fletcher-Powell algorithm, and (3) Grey's Orthonormal Optimization Procedure (GOOP). Eight problems are solved by the three algorithms mentioned above and by the Levenberg-Marquardt algorithm. The addition of the LAT algorithm improves the rate of convergence for the GOOP algorithm in all problems attempted, and for some problems using the Fletcher-Reeves and Davidon-Fletcher-Powell algorithms. The algorithms are compared using the number of operations needed to perform function and derivative evaluations. Although the GOOP algorithm is relatively unknown outside of the optics literature, it was found to be competitive with the other successful algorithms. A proof of convergence of the accelerated GOOP algorithm for nonquadratic problems is also developed. --Abstract, page ii
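    Of the conjugate direction methods named above, the Fletcher-Reeves iteration can be sketched as follows. This is a generic illustration (with a backtracking line search, a periodic restart, and a steepest-descent safeguard), not the paper's LAT-accelerated implementation:

```python
import numpy as np

def fletcher_reeves(f, grad, x0, iters=100, tol=1e-8):
    """Fletcher-Reeves conjugate gradient sketch.

    beta = |g_new|^2 / |g|^2 combines the new gradient with the previous
    search direction; the direction is restarted every n steps.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    n = len(x)
    for k in range(iters):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                    # safeguard: ensure a descent direction
            d = -g
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):   # Armijo backtracking
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if (k + 1) % n == 0:              # periodic restart to steepest descent
            d = -g_new
        g = g_new
    return x
```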

    On the convergence of a class of variable metric algorithms


    Fitting aerodynamic forces in the Laplace domain: An application of a nonlinear nongradient technique to multilevel constrained optimization

    A technique which employs both linear and nonlinear methods in a multilevel optimization structure to best approximate generalized unsteady aerodynamic forces for arbitrary motion is described. An optimum selection of free parameters is made in a rational function approximation of the aerodynamic forces in the Laplace domain such that a best fit is obtained, in a least squares sense, to tabular data for purely oscillatory motion. The multilevel structure and the corresponding formulation of the objective models are presented, which separate the reduction of the fit error into linear and nonlinear problems, thus enabling the use of linear methods where practical. Certain equality and inequality constraints that may be imposed are identified; a brief description of the nongradient, nonlinear optimizer used is given; and results illustrating application of the method are presented.
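    The separation into linear and nonlinear levels can be illustrated by the inner linear step: once the lag poles (the nonlinear parameters) are fixed, the remaining coefficients of a Roger-style rational-function approximation follow from linear least squares. The function form and names below are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def fit_rfa_linear(k_values, Q_data, poles):
    """Inner linear level of a multilevel rational-function fit (sketch).

    Assumed form: Q(ik) ~ A0 + A1*(ik) + sum_j Aj * ik / (ik + p_j),
    evaluated at reduced frequencies k (purely oscillatory motion, s = ik),
    with the lag poles p_j held fixed by the outer nonlinear level.
    """
    s = 1j * np.asarray(k_values)
    cols = [np.ones_like(s), s] + [s / (s + p) for p in poles]
    M = np.column_stack(cols)
    # stack real and imaginary parts so the least-squares solve is real
    A = np.vstack([M.real, M.imag])
    b = np.concatenate([np.asarray(Q_data).real, np.asarray(Q_data).imag])
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs
```

    The outer level would then adjust the poles (a small nonlinear problem) while this routine handles the many linear coefficients, which is the division of labor the abstract describes.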

    Synthesis of stiffened shells of revolution

    Computer programs for the synthesis of shells of various configurations were developed. The conditions considered are: (1) uniform shells (mainly cones) using a membrane buckling analysis, (2) completely uniform shells (cones, spheres, toroidal segments) using a linear bending prebuckling analysis, and (3) a revision of the second design process to reduce the number of design variables to about 30 by considering piecewise uniform designs. A perturbation formula was derived which allows exact derivatives of the general buckling load to be computed with little additional computer time.
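    A perturbation formula of the kind mentioned above can be illustrated for a standard symmetric eigenproblem: for a simple eigenvalue lam(p) of A(p) with unit eigenvector v, d(lam)/dp = v^T (dA/dp) v, so the derivative costs one matrix-vector product rather than a re-analysis. This is a generic analogue, not the shell buckling equations themselves:

```python
import numpy as np

def eigenvalue_sensitivity(A, dA, which=0):
    """First-order perturbation formula for a symmetric eigenproblem.

    For a simple eigenvalue lam with unit eigenvector v of symmetric A,
    d(lam)/dp = v^T (dA/dp) v -- an exact derivative obtained cheaply,
    analogous to the buckling-load derivative described in the abstract.
    """
    w, V = np.linalg.eigh(A)              # eigenvalues in ascending order
    v = V[:, which]                       # unit eigenvector of interest
    return v @ dA @ v
```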

    Learning Algorithms for Connectionist Networks: Applied Gradient Methods of Nonlinear Optimization

    The problem of learning using connectionist networks, in which network connection strengths are modified systematically so that the response of the network increasingly approximates the desired response, can be structured as an optimization problem. The widely used back propagation method of connectionist learning [19, 21, 18] is set in the context of nonlinear optimization. In this framework, the issues of stability, convergence and parallelism are considered. As a form of gradient descent with fixed step size, back propagation is known to be unstable, which is illustrated using Rosenbrock's function. This is contrasted with stable methods which involve a line search in the gradient direction. The convergence criterion for connectionist problems involving binary functions is discussed relative to the behavior of gradient descent in the vicinity of local minima. A minimax criterion is compared with the least squares criterion. The contribution of the momentum term [19, 18] to more rapid convergence is interpreted relative to the geometry of the weight space. It is shown that in plateau regions of relatively constant gradient, the momentum term acts to increase the step size by a factor of 1/(1-μ), where μ is the momentum constant. In valley regions with steep sides, the momentum constant acts to focus the search direction toward the local minimum by averaging oscillations in the gradient.
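    The plateau-region claim can be checked directly: with momentum constant μ and a constant gradient, the velocity is a geometric series whose magnitude converges to lr·|g|/(1-μ), i.e. the effective step is amplified by the factor 1/(1-μ). A minimal sketch (variable names are illustrative):

```python
def momentum_steps(g, lr, mu, n_steps):
    """Gradient descent with momentum on a plateau of constant gradient g.

    Update: v <- mu * v - lr * g.  The step magnitudes form a geometric
    series lr*|g| * (1 + mu + mu^2 + ...) -> lr*|g| / (1 - mu), so the
    momentum term scales the effective step size by 1/(1 - mu).
    """
    v = 0.0
    steps = []
    for _ in range(n_steps):
        v = mu * v - lr * g
        steps.append(abs(v))
    return steps

# With lr=0.1, g=1.0, mu=0.9 the step length approaches 0.1/(1-0.9) = 1.0,
# ten times the fixed-step size, matching the 1/(1-mu) factor above.
```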