28 research outputs found

    Automatic adjoint differentiation for gradient descent and model calibration

    In this work, we discuss Automatic Adjoint Differentiation (AAD) for functions of the form $G=\frac{1}{2}\sum_{i=1}^{m}(\mathbb{E}\,y_i-C_i)^2$, which often appear in the calibration of stochastic models. We demonstrate that it admits perfect SIMD (Single Instruction, Multiple Data) parallelization and we estimate its relative computational cost. In addition, we demonstrate that this theoretical result agrees with numerical experiments.
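
    To make the SIMD claim concrete, here is a minimal C++ sketch (hypothetical; not the authors' code, and the toy `payoff` model is a stand-in for a simulated stochastic payoff): each Monte Carlo path executes the same primal and adjoint instructions on different data, so the path loop is a natural vectorization target.

    // Sketch, under the assumptions above: gradient of a calibration
    // objective G = 1/2 * sum_i (E[y_i] - C_i)^2, with each E[y_i] a
    // Monte Carlo average over independent paths.
    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Toy payoff y_i(theta, z); purely illustrative.
    double payoff(double theta, double z, int i) {
        return std::exp(theta * z) / (1.0 + i);
    }
    // Hand-written adjoint of `payoff` w.r.t. theta (what AAD would produce).
    double payoff_dtheta(double theta, double z, int i) {
        return z * std::exp(theta * z) / (1.0 + i);
    }

    int main() {
        const int m = 3;                                      // calibration targets C_i
        const std::vector<double> C = {1.2, 0.7, 0.4};
        const std::vector<double> z = {-0.5, 0.1, 0.3, 0.9};  // frozen "random" draws
        const double theta = 0.25;

        // Primal + tangent pass: Monte Carlo estimates of E[y_i] and dE[y_i]/dtheta.
        std::vector<double> Ey(m, 0.0), dEy(m, 0.0);
        for (int i = 0; i < m; ++i) {
            for (double zi : z) {             // independent paths: SIMD-friendly
                Ey[i]  += payoff(theta, zi, i);
                dEy[i] += payoff_dtheta(theta, zi, i);
            }
            Ey[i]  /= z.size();
            dEy[i] /= z.size();
        }

        // Chain rule: dG/dtheta = sum_i (E[y_i] - C_i) * dE[y_i]/dtheta.
        double G = 0.0, dG = 0.0;
        for (int i = 0; i < m; ++i) {
            const double r = Ey[i] - C[i];
            G  += 0.5 * r * r;
            dG += r * dEy[i];
        }
        std::printf("G = %.6f, dG/dtheta = %.6f\n", G, dG);
    }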

    AAD: breaking the primal barrier

    In this article we present a new approach to automatic adjoint differentiation (AAD), with a special focus on computations where the derivatives $\partial F(X)/\partial X$ are required for multiple instances of the vector $X$. In practice, the presented approach calculates all the differentials faster than the primal (original) C++ program for $F$.
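
    A hedged illustration of the "multiple instances of X" setting (a generic sketch, not the article's method): the adjoint kernel for a fixed toy $F$ is written once and swept over a batch of input vectors with a contiguous inner loop, which is the regime where batched AAD tools can amortize their overhead below the primal's cost.

    // Sketch, under the assumptions above: full gradient dF/dX for many
    // instances of X in one batched pass. Data is stored component-major,
    // so the innermost instance loop is contiguous and branch-free.
    #include <cmath>
    #include <cstdio>
    #include <vector>

    int main() {
        const int n = 4;          // dimension of X
        const int instances = 8;  // number of input vectors
        // X[j][k] = component j of instance k (component-major layout).
        std::vector<std::vector<double>> X(n, std::vector<double>(instances));
        for (int j = 0; j < n; ++j)
            for (int k = 0; k < instances; ++k)
                X[j][k] = 0.1 * (j + 1) + 0.01 * k;

        // Toy primal: F(X) = sum_j sin^2(x_j). Adjoint: dF/dx_j = sin(2 x_j).
        std::vector<double> F(instances, 0.0);
        std::vector<std::vector<double>> dF(n, std::vector<double>(instances));
        for (int j = 0; j < n; ++j) {
            for (int k = 0; k < instances; ++k) {  // contiguous, vectorizable
                const double s = std::sin(X[j][k]);
                F[k]     += s * s;
                dF[j][k]  = std::sin(2.0 * X[j][k]);
            }
        }
        std::printf("instance 0: F = %.6f, dF/dx0 = %.6f\n", F[0], dF[0][0]);
    }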

    Global solution of the initial value problem for the focusing Davey-Stewartson II system

    We consider the two-dimensional focusing Davey-Stewartson II system and construct the global solution of the Cauchy problem for a set of initial data dense in $L^2(\mathbb{C})$. We do not assume that the initial data is small, so the solutions may have singularities. We show that blow-up may occur only on a real analytic variety, and that this variety is bounded in each strip $t \leq T$.
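
    For orientation, one common normalization of the focusing Davey-Stewartson II system reads as follows (signs and scalings vary across the literature; this particular form is an assumption, not necessarily the one used in the paper):

    \begin{aligned}
    & i\,\partial_t u + \partial_x^2 u - \partial_y^2 u + 2\left(\Phi + |u|^2\right)u = 0, \\
    & \partial_x^2 \Phi + \partial_y^2 \Phi + 2\,\partial_x^2\!\left(|u|^2\right) = 0,
    \end{aligned}

    where $u(x,y,t)$ is the complex amplitude, $\Phi(x,y,t)$ is the mean-flow potential, and the plane $\mathbb{R}^2$ is identified with $\mathbb{C}$ in the $L^2(\mathbb{C})$ statement above.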