
    Explicit Stabilised Gradient Descent for Faster Strongly Convex Optimisation

    This paper introduces the Runge-Kutta Chebyshev descent method (RKCD) for strongly convex optimisation problems. This new algorithm is based on explicit stabilised integrators for stiff differential equations, a powerful class of numerical schemes that avoid the severe step size restriction faced by standard explicit integrators. For optimising quadratic and strongly convex functions, this paper proves that RKCD nearly achieves the optimal convergence rate of the conjugate gradient algorithm, and that the suboptimality of RKCD diminishes as the condition number of the quadratic function worsens. It is established that this optimal rate is also obtained for a partitioned variant of RKCD applied to perturbations of quadratic functions. In addition, numerical experiments on general strongly convex problems show that RKCD outperforms Nesterov's accelerated gradient descent.
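    The abstract leaves the recursion implicit, so the sketch below shows the underlying idea: a first-order Runge-Kutta-Chebyshev step applied to the gradient flow x' = -grad f(x), using the standard damped Chebyshev stage coefficients. It is an illustration of explicit stabilised integrators in general, not the exact RKCD scheme from the paper, and the parameters in the usage snippet (the test quadratic, s, h, eta) are hypothetical.

        import numpy as np

        def chebyshev_T(s, x):
            """Values T_0(x), ..., T_s(x) (s >= 1) via the three-term recurrence
            T_j(x) = 2*x*T_{j-1}(x) - T_{j-2}(x)."""
            T = np.empty(s + 1)
            T[0], T[1] = 1.0, x
            for j in range(2, s + 1):
                T[j] = 2.0 * x * T[j - 1] - T[j - 2]
            return T

        def rkc_gradient_step(grad, x0, h, s, eta=0.05):
            """One s-stage first-order Chebyshev (RKC) step for the gradient flow
            x' = -grad f(x), with damping parameter eta."""
            w0 = 1.0 + eta / s**2
            T = chebyshev_T(s, w0)
            dT = np.empty(s + 1)            # derivatives T'_j(w0)
            dT[0], dT[1] = 0.0, 1.0
            for j in range(2, s + 1):
                dT[j] = 2.0 * T[j - 1] + 2.0 * w0 * dT[j - 1] - dT[j - 2]
            w1 = T[s] / dT[s]
            K_prev2 = x0
            K_prev1 = x0 - h * (w1 / w0) * grad(x0)
            for j in range(2, s + 1):
                mu = 2.0 * w1 * T[j - 1] / T[j]
                nu = 2.0 * w0 * T[j - 1] / T[j]
                kappa = -T[j - 2] / T[j]
                K = nu * K_prev1 + kappa * K_prev2 - mu * h * grad(K_prev1)
                K_prev2, K_prev1 = K_prev1, K
            return K_prev1

        # Hypothetical usage on a strongly convex quadratic f(x) = 0.5 * x^T A x
        # whose eigenvalues lie between mu = 1 and L = 100 (condition number 100).
        A = np.diag(np.linspace(1.0, 100.0, 50))
        grad = lambda x: A @ x
        L, s = 100.0, 10
        h = 1.8 * s**2 / L                  # keeps h*L inside the stability interval (~2*s^2)
        x = np.ones(50)
        for _ in range(50):
            x = rkc_gradient_step(grad, x, h, s)
        print(np.linalg.norm(grad(x)))      # gradient norm shrinks towards zero

    The point of the s internal stages is that the stability interval of a Chebyshev step grows like 2*s^2 while the cost grows only linearly in s, which is what allows a stabilised integrator to take much longer effective steps along the gradient flow than plain gradient descent.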

    Explicit methods for stiff stochastic differential equations

    Multiscale differential equations arise in the modeling of many important problems in science and engineering. Numerical solvers for such problems have been extensively studied in the deterministic case. Here, we discuss numerical methods for (mean-square stable) stiff stochastic differential equations. Standard explicit methods, such as the Euler-Maruyama method, face severe stepsize restrictions when applied to stiff problems. Fully implicit methods are usually not appropriate for stochastic problems, and semi-implicit methods (implicit in the deterministic part) involve the solution of possibly large linear systems at each time-step. In this paper, we present a recent generalization of explicit stabilized methods, known as Chebyshev methods, to stochastic problems. These methods have much better (mean-square) stability properties than standard explicit methods. We discuss the construction of this new class of methods and illustrate their performance on various problems involving stochastic ordinary and partial differential equations.
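    As a concrete illustration of the stepsize restriction mentioned above, the sketch below applies the Euler-Maruyama method to the scalar linear test equation dX = lambda*X dt + sigma*X dW. The exact solution is mean-square stable whenever 2*lambda + sigma^2 < 0, but Euler-Maruyama is mean-square stable only if (1 + h*lambda)^2 + h*sigma^2 < 1, so a stiff drift (large |lambda|) forces a very small h. The particular numbers below are hypothetical; the stabilized Chebyshev (S-ROCK-type) methods discussed in the paper are designed to relax precisely this restriction, and their coefficients are not reproduced here.

        import numpy as np

        # Linear test SDE: dX = lam*X dt + sigma*X dW (geometric Brownian motion).
        rng = np.random.default_rng(0)
        lam, sigma = -50.0, np.sqrt(10.0)   # 2*lam + sigma**2 = -90 < 0: exact solution is mean-square stable

        def em_second_moment(h, t_end=2.0, n_paths=20_000):
            """Monte Carlo estimate of E[X(t_end)^2] under Euler-Maruyama with step h."""
            n_steps = int(round(t_end / h))
            X = np.ones(n_paths)
            for _ in range(n_steps):
                dW = rng.normal(0.0, np.sqrt(h), size=n_paths)
                X = X + h * lam * X + sigma * X * dW
            return np.mean(X**2)

        # Euler-Maruyama is mean-square stable only for h < -(2*lam + sigma**2)/lam**2 = 0.036 here.
        for h in (0.05, 0.02):
            factor = (1.0 + h * lam)**2 + h * sigma**2   # per-step growth of E[X_n^2]
            print(f"h = {h}: mean-square factor = {factor:.2f}, "
                  f"estimated E[X(2)^2] = {em_second_moment(h):.3e}")

    With h = 0.05 the second moment blows up even though the underlying equation is stable, while h = 0.02 (below the threshold) gives a decaying second moment; implicit or stabilized explicit methods avoid having to shrink h in this way.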