4 research outputs found

    Automatic Differentiation Tools in Optimization Software

    We discuss the role of automatic differentiation tools in optimization software. We emphasize issues that are important to large-scale optimization and that have proved useful in the installation of nonlinear solvers in the NEOS Server. Our discussion centers on the computation of the gradient and Hessian matrix for partially separable functions and shows that the gradient and Hessian matrix can be computed with guaranteed bounds on time and memory requirements. (Comment: 11 pages.)
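The abstract above concerns gradients of partially separable functions, i.e. sums of element functions that each touch only a few variables. A minimal sketch of the underlying idea, using forward-mode AD with dual numbers, is shown below; the `Dual` class, `f`, and `gradient` are illustrative names, not part of the paper or the NEOS Server.

```python
# Forward-mode AD via dual numbers: a + b*eps with eps^2 = 0.
# Illustrates gradient computation for a partially separable function
# f(x) = sum_i (x_i - x_{i+1})^2, where each term touches two variables.

class Dual:
    """Value plus first-derivative coefficient."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    def __sub__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val - other.val, self.dot - other.dot)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)

def f(x):
    # Partially separable objective: a sum of element functions.
    total = Dual(0.0)
    for i in range(len(x) - 1):
        d = x[i] - x[i + 1]
        total = total + d * d
    return total

def gradient(x):
    # One forward pass per variable: seed dx_j/dx_j = 1, all others 0.
    n = len(x)
    return [f([Dual(x[i], 1.0 if i == j else 0.0) for i in range(n)]).dot
            for j in range(n)]

print(gradient([1.0, 2.0, 4.0]))  # -> [-2.0, -2.0, 4.0]
```

In practice, partial separability lets an AD tool exploit the sparsity of each element function rather than seeding one full pass per variable; the loop over seeds above is the naive dense variant.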

    Diferenciação computacional e aplicações (Computational Differentiation and Applications)

    Advisor: Jose Mario Martinez. Thesis (doctorate), Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica. Abstract: not provided. Degree: Doctor of Applied Mathematics.

    Algorithms and Design for a Second-Order Automatic Differentiation Module

    This article describes approaches to computing second-order derivatives with automatic differentiation (AD) based on the forward mode and the propagation of univariate Taylor series. Performance results are given that show the speedup possible with these techniques relative to existing approaches. We also describe a new source transformation AD module for computing second-order derivatives of C and Fortran codes and the underlying infrastructure used to create a language-independent translation tool.

    1 Introduction. Automatic differentiation (AD) provides an efficient and accurate method to obtain derivatives for use in sensitivity analysis, parameter identification, and optimization. Current tools are targeted primarily at computing first-order derivatives, namely gradients and Jacobians. Prior to AD, derivative values were obtained through divided-difference methods, symbolic manipulation, or hand-coding, all of which have drawbacks when compared with AD (see [4] for a discussion).
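The propagation of univariate Taylor series mentioned above can be sketched as follows: carry truncated coefficients (value, first, and second order) of f(x + tv) through each arithmetic operation, then read the second directional derivative off the highest coefficient. The `Taylor2` class and function names below are illustrative assumptions, not the module the article describes.

```python
# Propagating a univariate Taylor series truncated at order 2:
# each quantity is c0 + c1*t + c2*t^2 along the line x + t*v.

class Taylor2:
    """Truncated Taylor series c0 + c1*t + c2*t^2."""
    def __init__(self, c0, c1=0.0, c2=0.0):
        self.c = (c0, c1, c2)

    def __add__(self, other):
        other = other if isinstance(other, Taylor2) else Taylor2(other)
        a, b = self.c, other.c
        return Taylor2(a[0] + b[0], a[1] + b[1], a[2] + b[2])

    def __mul__(self, other):
        other = other if isinstance(other, Taylor2) else Taylor2(other)
        a, b = self.c, other.c
        # Cauchy product, truncated after the t^2 term.
        return Taylor2(a[0] * b[0],
                       a[0] * b[1] + a[1] * b[0],
                       a[0] * b[2] + a[1] * b[1] + a[2] * b[0])

def second_directional_derivative(f, x, v):
    # Seed each variable with value x_i and slope v_i, then
    # d^2/dt^2 f(x + t*v) at t = 0 equals 2 * c2, i.e. v^T H v.
    args = [Taylor2(xi, vi) for xi, vi in zip(x, v)]
    return 2.0 * f(args).c[2]

# Example: f(x0, x1) = x0 * x1^2 along direction v = (1, 1).
g = lambda x: x[0] * x[1] * x[1]
print(second_directional_derivative(g, [3.0, 2.0], [1.0, 1.0]))  # -> 14.0
```

Evaluating along n (or n(n+1)/2) well-chosen directions recovers the full Hessian; propagating one univariate series per direction is what makes this cheaper than a general multivariate Taylor expansion.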