Automatic Differentiation Tools in Optimization Software
We discuss the role of automatic differentiation tools in optimization
software. We emphasize issues that are important to large-scale optimization
and that have proved useful in the installation of nonlinear solvers in the
NEOS Server. Our discussion centers on the computation of the gradient and
Hessian matrix for partially separable functions and shows that the gradient
and Hessian matrix can be computed with guaranteed bounds in time and memory
requirements.
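The structure the abstract exploits can be sketched as follows. This is an illustrative example, not the paper's code: a partially separable function f(x) = sum_i f_i(x_{S_i}) is a sum of elements, each depending on only a few variables, so the full gradient is assembled by scattering small element gradients. The element choice (x_j - x_{j+1})^2 and all function names are assumptions for illustration.

```python
# Sketch (assumed example, not from the paper): gradient of a partially
# separable function f(x) = sum_j (x_j - x_{j+1})^2, assembled from the
# gradients of its two-variable elements.

def element_value_and_grad(xj, xk):
    """Value and gradient of one element f_i = (xj - xk)**2."""
    d = xj - xk
    return d * d, (2.0 * d, -2.0 * d)

def f_and_grad(x):
    n = len(x)
    val = 0.0
    grad = [0.0] * n
    for j in range(n - 1):          # loop over the elements
        v, (gj, gk) = element_value_and_grad(x[j], x[j + 1])
        val += v
        grad[j] += gj               # scatter each small element gradient
        grad[j + 1] += gk           # into the full gradient vector
    return val, grad

x = [1.0, 3.0, 0.0]
val, grad = f_and_grad(x)           # val = 13.0, grad = [-4.0, 10.0, -6.0]
```

Because each element gradient is small and cheap, the cost of the full gradient stays proportional to the cost of evaluating f, which is the kind of guaranteed bound the abstract refers to.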
Monte Carlo evaluation of sensitivities in computational finance
In computational finance, Monte Carlo simulation is used to compute the correct prices for financial options. More important, however, is the ability to compute the so-called "Greeks", the first and second order derivatives of the prices with respect to input parameters such as the current asset price, interest rate and level of volatility.

This paper discusses the three main approaches to computing Greeks: finite difference, likelihood ratio method (LRM) and pathwise sensitivity calculation. The last of these has an adjoint implementation with a computational cost which is independent of the number of first derivatives to be calculated. We explain how the practical development of adjoint codes is greatly assisted by using Algorithmic Differentiation, and in particular discuss the performance achieved by the FADBAD++ software package which is based on templates and operator overloading within C++.

The pathwise approach is not applicable when the financial payoff function is not differentiable, and even when the payoff is differentiable, the use of scripting in real-world implementations means it can be very difficult in practice to evaluate the derivative of very complex financial products. A new idea is presented to address these limitations by combining the adjoint pathwise approach for the stochastic path evolution with LRM for the payoff evaluation.
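A minimal sketch of the pathwise approach, under assumptions not taken from the paper: a European call under geometric Brownian motion, where differentiating the discounted payoff along each path gives the delta estimator exp(-rT) * 1{S_T > K} * S_T / S_0. All parameter values and function names here are illustrative.

```python
import math
import random

def pathwise_delta(S0, K, r, sigma, T, n_paths, seed=0):
    """Pathwise Monte Carlo estimate of a European call delta under GBM."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n_paths):
        Z = rng.gauss(0.0, 1.0)
        ST = S0 * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * Z)
        if ST > K:                          # derivative of the payoff
            acc += math.exp(-r * T) * ST / S0   # along this path
    return acc / n_paths

def bs_delta(S0, K, r, sigma, T):
    """Analytic Black-Scholes delta N(d1), for comparison."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    return 0.5 * (1.0 + math.erf(d1 / math.sqrt(2.0)))

est = pathwise_delta(100.0, 100.0, 0.05, 0.2, 1.0, 100_000)
```

Note the indicator 1{S_T > K}: for a digital option the payoff itself is discontinuous, the pathwise derivative is zero almost everywhere, and the estimator fails; this is the non-differentiability limitation the abstract addresses by switching to LRM for the payoff.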
Automatic Differentiation of Rigid Body Dynamics for Optimal Control and Estimation
Many algorithms for control, optimization and estimation in robotics depend
on derivatives of the underlying system dynamics, e.g. to compute
linearizations, sensitivities or gradient directions. However, we show that
when dealing with Rigid Body Dynamics, these derivatives are difficult to
derive analytically and to implement efficiently. To overcome this issue, we
extend the modelling tool `RobCoGen' to be compatible with Automatic
Differentiation. Additionally, we propose how to automatically obtain the
derivatives and generate highly efficient source code. We highlight the
flexibility and performance of the approach in two application examples. First,
we show a Trajectory Optimization example for the quadrupedal robot HyQ, which
employs auto-differentiation on the dynamics including a contact model. Second,
we present a hardware experiment in which a 6 DoF robotic arm avoids a randomly
moving obstacle in a go-to task through fast, dynamic replanning.
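The core idea, differentiating system dynamics automatically to obtain linearizations, can be sketched with a toy system. This is not RobCoGen or its generated code: it is a minimal forward-mode AD (dual numbers, via operator overloading) applied to simple pendulum dynamics, where seeding each input direction yields one column of the Jacobian A = df/dx. All names and the g, l parameters are illustrative.

```python
import math

class Dual:
    """Forward-mode AD value: carries a value and a directional derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def dsin(d):
    return Dual(math.sin(d.val), math.cos(d.val) * d.dot)

def pendulum(theta, omega, g=9.81, l=1.0):
    # xdot = [omega, -(g/l) * sin(theta)]
    return [omega, -(g / l) * dsin(theta)]

def linearize(theta0, omega0):
    """Jacobian A[i][j] = d f_i / d x_j via one forward pass per input."""
    cols = []
    for i in range(2):                      # seed each input direction
        seeds = [Dual(theta0, 1.0 if i == 0 else 0.0),
                 Dual(omega0, 1.0 if i == 1 else 0.0)]
        cols.append([c.dot for c in pendulum(*seeds)])
    return [[cols[j][i] for j in range(2)] for i in range(2)]

A = linearize(0.0, 0.0)    # at the hanging equilibrium: [[0, 1], [-g/l, 0]]
```

Code-generation tools go further by emitting flattened source for these derivative passes, which is where the efficiency highlighted in the abstract comes from.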
Users Guide for SnadiOpt: A Package Adding Automatic Differentiation to Snopt
SnadiOpt is a package that supports the use of the automatic differentiation
package ADIFOR with the optimization package Snopt. Snopt is a general-purpose
system for solving optimization problems with many variables and constraints.
It minimizes a linear or nonlinear function subject to bounds on the variables
and sparse linear or nonlinear constraints. It is suitable for large-scale
linear and quadratic programming and for linearly constrained optimization, as
well as for general nonlinear programs. The method used by Snopt requires the
first derivatives of the objective and constraint functions to be available.
The SnadiOpt package allows users to avoid the time-consuming and error-prone
process of evaluating and coding these derivatives. Given Fortran code for
evaluating only the values of the objective and constraints, SnadiOpt
automatically generates the code for evaluating the derivatives and builds the
relevant Snopt input files and sparse data structures.
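The workflow can be illustrated in miniature. SnadiOpt itself works by ADIFOR source transformation of Fortran; the sketch below is only an analogy in Python, using complex-step differentiation to stand in for the idea of deriving a gradient routine automatically from value-only user code, so that a derivative-based solver never needs hand-coded derivatives. All names here are hypothetical.

```python
# Sketch (illustrative analogy only; not SnadiOpt's mechanism): given code
# that evaluates only objective values, mechanically produce a gradient
# routine via complex-step differentiation, grad_i ~ Im(f(x + i*h*e_i)) / h.

def make_grad(f, h=1e-20):
    """Wrap a value-only function f into a gradient function."""
    def grad(x):
        g = []
        for i in range(len(x)):
            xc = [complex(v) for v in x]
            xc[i] += 1j * h                 # perturb the i-th coordinate
            g.append(f(xc).imag / h)        # imaginary part carries df/dx_i
        return g
    return grad

def objective(x):        # the user supplies only function values
    return x[0] * x[0] + 3.0 * x[0] * x[1]

grad = make_grad(objective)
g = grad([2.0, 1.0])     # analytic gradient at (2, 1): [7.0, 6.0]
```

Unlike finite differences, the complex step incurs no subtractive cancellation, so h can be tiny; true AD as used by ADIFOR goes further by transforming the source itself and handling sparse constraint Jacobians.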