31 research outputs found

    Nonlinear optimization in Hilbert space using Sobolev gradients with applications

    The problem of finding roots or solutions of a nonlinear partial differential equation may be formulated as the problem of minimizing a sum of squared residuals. One then defines an evolution equation whose asymptotic limit yields a minimizer, and often a solution of the PDE. The corresponding discretized nonlinear least squares problem arises frequently in numerical optimization, so a wide variety of methods exist for solving it. We review Newton's method from nonlinear optimization in both discrete and continuous settings, and present results of a similar nature for the Levenberg-Marquardt method. We apply these results to the Ginzburg-Landau model of superconductivity.
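The damped Gauss-Newton iteration at the heart of the Levenberg-Marquardt method can be sketched as follows. This is a minimal illustration of the general technique on a small algebraic system, not the paper's Hilbert-space formulation; the function names, the test residual, and the damping schedule (halve on success, multiply by 10 on failure) are illustrative choices, not taken from the source.

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, x0, lam=1e-3, tol=1e-10, max_iter=100):
    """Minimize 0.5 * ||r(x)||^2 with damped Gauss-Newton (Levenberg-Marquardt) steps."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        J = jacobian(x)
        g = J.T @ r                              # gradient of the sum of squares
        if np.linalg.norm(g) < tol:
            break
        A = J.T @ J + lam * np.eye(x.size)       # damped normal equations
        x_new = x - np.linalg.solve(A, g)
        if np.sum(residual(x_new) ** 2) < np.sum(r ** 2):
            x, lam = x_new, lam * 0.5            # accept step, relax damping
        else:
            lam *= 10.0                          # reject step, increase damping
    return x

# Illustrative root-finding problem: intersection of the unit circle with the
# line x0 = x1, posed as a nonlinear least squares problem in the residuals.
sol = levenberg_marquardt(
    lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 1.0, x[0] - x[1]]),
    lambda x: np.array([[2 * x[0], 2 * x[1]], [1.0, -1.0]]),
    np.array([1.0, 0.0]),
)
# sol is close to (1/sqrt(2), 1/sqrt(2))
```

Large damping pushes the step toward steepest descent, small damping toward the Gauss-Newton step; adapting the damping per iteration is what gives the method its robustness far from a minimizer.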

    Algorithm 893

    Algorithm 661

    Algorithm 624: Triangulation and Interpolation at Arbitrarily Distributed Points in the Plane

    Algorithm 834

    Remark on Algorithm 751

    Algorithm 660: QSHEP2D: Quadratic Shepard Method for Bivariate Interpolation of Scattered Data
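The entry above concerns Shepard-type interpolation of scattered bivariate data. As context, the classical (global) Shepard method weights each data value by an inverse power of its distance to the query point; the sketch below shows that basic form, which QSHEP2D refines with local quadratic nodal functions. Function names and the power parameter `p` here are illustrative, not from the published algorithm.

```python
import numpy as np

def shepard_interpolate(xy, values, query, p=2.0):
    """Classical Shepard inverse-distance-weighted interpolation at one query point."""
    diffs = xy - query                           # (n, 2) displacements to each data point
    d2 = np.einsum('ij,ij->i', diffs, diffs)     # squared distances
    exact = d2 < 1e-24                           # query coincides with a data point
    if np.any(exact):
        return float(values[np.argmax(exact)])
    w = d2 ** (-p / 2.0)                         # weights 1 / d^p
    return float(w @ values) / float(w.sum())

# Interpolate f(x, y) = x + y from four scattered samples
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
vals = pts.sum(axis=1)
mid = shepard_interpolate(pts, vals, np.array([0.5, 0.5]))  # equidistant from all points
```

The basic form reproduces data values exactly at the nodes but has flat spots there; replacing the constant nodal values with local quadratic fits, as in the quadratic Shepard method, restores smoothness and accuracy.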

    Algorithm 790: CSHEP2D

    Remark on Algorithm 752
