Bounding the error of discretized Langevin algorithms for non-strongly log-concave targets
In this paper, we provide non-asymptotic upper bounds on the error of
sampling from a target density using three schemes of discretized Langevin
diffusions. The first scheme is the Langevin Monte Carlo (LMC) algorithm, the
Euler discretization of the Langevin diffusion. The second and the third
schemes are, respectively, the kinetic Langevin Monte Carlo (KLMC) for
differentiable potentials and the kinetic Langevin Monte Carlo for
twice-differentiable potentials (KLMC2). The main focus is on the target
densities that are smooth and log-concave on $\mathbb{R}^p$, but not necessarily
strongly log-concave. Bounds on the computational complexity are obtained under
two types of smoothness assumptions: the potential has a Lipschitz-continuous
gradient, or the potential has a Lipschitz-continuous Hessian matrix. The error
of sampling is measured by Wasserstein-$q$ distances. We advocate for the use
of a new dimension-adapted scaling in the definition of the computational
complexity, when Wasserstein-$q$ distances are considered. The obtained results
show that the number of iterations needed to achieve a scaled error smaller than a
prescribed value depends only polynomially on the dimension $p$.
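To make the first two schemes concrete, here is a minimal NumPy sketch. The LMC update is the standard Euler discretization $\theta_{k+1} = \theta_k - h\nabla f(\theta_k) + \sqrt{2h}\,\xi_k$ for a target density proportional to $\exp(-f)$. For the kinetic scheme, a plain Euler step of the underdamped dynamics is shown for illustration only; the KLMC discretization analyzed in the paper integrates the velocity dynamics exactly, and all function names and parameters below are illustrative, not the paper's.

```python
import numpy as np

def lmc(grad_f, theta0, step, n_iter, rng):
    """Langevin Monte Carlo: Euler discretization of the Langevin diffusion
    targeting a density proportional to exp(-f), where grad_f computes the
    gradient of the potential f."""
    theta = np.array(theta0, dtype=float)
    for _ in range(n_iter):
        noise = rng.standard_normal(theta.shape)
        theta = theta - step * grad_f(theta) + np.sqrt(2.0 * step) * noise
    return theta

def kinetic_langevin_euler(grad_f, theta0, step, n_iter, gamma, rng):
    """Euler-type discretization of the kinetic (underdamped) Langevin
    diffusion with friction parameter gamma. This is a simplified sketch;
    the paper's KLMC scheme uses an exact integration of the velocity part."""
    theta = np.array(theta0, dtype=float)
    v = np.zeros_like(theta)
    for _ in range(n_iter):
        noise = rng.standard_normal(theta.shape)
        v = v - step * (gamma * v + grad_f(theta)) + np.sqrt(2.0 * gamma * step) * noise
        theta = theta + step * v
    return theta

# Example: both samplers on a standard Gaussian target, f(x) = ||x||^2 / 2,
# so grad_f(x) = x.
rng = np.random.default_rng(0)
x_lmc = lmc(lambda x: x, np.zeros(3), step=0.01, n_iter=2000, rng=rng)
x_klmc = kinetic_langevin_euler(lambda x: x, np.zeros(3), step=0.01,
                                n_iter=2000, gamma=2.0, rng=rng)
```

After enough iterations with a small step, both `x_lmc` and `x_klmc` are approximate draws from the standard Gaussian; the discretization error that the paper bounds comes from the finite step size `step`.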