
    A stochastic preconditioned Douglas-Rachford splitting method for saddle-point problems

    In this article, we propose and study a stochastic preconditioned Douglas-Rachford splitting method to solve saddle-point problems that have separable dual variables. We prove the almost sure convergence of the iteration sequences in Hilbert spaces for a class of convex-concave and nonsmooth saddle-point problems. We also provide a sublinear convergence rate for the ergodic sequence with respect to the expectation of the restricted primal-dual gap functions. Numerical experiments show the high efficiency of the proposed stochastic preconditioned Douglas-Rachford splitting methods.
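
    For orientation, here is a minimal sketch of the classical (deterministic, unpreconditioned) Douglas-Rachford iteration for $\min_x f(x) + g(x)$, on top of which the paper's method adds stochastic sampling of the separable dual blocks and preconditioning. The quadratic data term, the $\ell_1$ term, and all parameter values are illustrative choices, not the paper's test problems.

    ```python
    # Classical Douglas-Rachford splitting for min_x f(x) + g(x),
    # given the proximal maps of f and g (illustrative sketch only).
    import numpy as np

    def prox_f(z, tau, a):
        # prox of f(x) = 0.5 * ||x - a||^2 (a simple quadratic data term)
        return (z + tau * a) / (1.0 + tau)

    def prox_g(z, tau, lam):
        # prox of g(x) = lam * ||x||_1 (componentwise soft thresholding)
        return np.sign(z) * np.maximum(np.abs(z) - tau * lam, 0.0)

    def douglas_rachford(a, lam=0.1, tau=1.0, iters=500):
        z = np.zeros_like(a)
        for _ in range(iters):
            x = prox_f(z, tau, a)            # backward step on f
            y = prox_g(2 * x - z, tau, lam)  # reflected backward step on g
            z = z + y - x                    # Douglas-Rachford update
        return prox_f(z, tau, a)

    print(douglas_rachford(np.array([1.5, -0.05, 0.3])))  # approx. soft thresholding of a
    ```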

    An inertial forward-backward algorithm for monotone inclusions

    In this paper, we propose an inertial forward-backward splitting algorithm to compute a zero of the sum of two monotone operators, one of which is co-coercive. The algorithm is inspired by the accelerated gradient method of Nesterov, but can be applied to a much larger class of problems, including convex-concave saddle-point problems and general monotone inclusions. We prove convergence of the algorithm in a Hilbert space setting and show that several recently proposed first-order methods can be obtained as special cases of the general algorithm. Numerical results show that the proposed algorithm converges faster than existing methods, while keeping the computational cost of each iteration basically unchanged.
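
    As a concrete instance, the sketch below applies the inertial forward-backward idea to an $\ell_1$-regularized least-squares problem: the gradient of the smooth term is the forward step on the co-coercive operator, soft thresholding is the backward (proximal) step, and the extrapolation $y_k = x_k + \alpha (x_k - x_{k-1})$ is the inertial term. The problem data and the constant inertia parameter are illustrative, not taken from the paper.

    ```python
    # Inertial forward-backward sketch for min_x 0.5*||Mx - b||^2 + lam*||x||_1.
    import numpy as np

    def soft_threshold(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def inertial_forward_backward(M, b, lam=0.1, alpha=0.3, iters=300):
        step = 1.0 / np.linalg.norm(M, 2) ** 2   # 1/L, L = Lipschitz constant of the gradient
        x_prev = x = np.zeros(M.shape[1])
        for _ in range(iters):
            y = x + alpha * (x - x_prev)          # inertial extrapolation
            grad = M.T @ (M @ y - b)              # forward step (gradient of the smooth term)
            x_prev, x = x, soft_threshold(y - step * grad, step * lam)  # backward step
        return x

    rng = np.random.default_rng(0)
    M = rng.standard_normal((20, 10))
    b = M @ np.array([1.0, -2.0] + [0.0] * 8) + 0.01 * rng.standard_normal(20)
    print(inertial_forward_backward(M, b))
    ```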

    First order algorithms in variational image processing

    Variational methods in imaging are nowadays developing towards a quite universal and flexible tool, allowing for highly successful approaches on tasks like denoising, deblurring, inpainting, segmentation, super-resolution, disparity, and optical flow estimation. The overall structure of such approaches is of the form $\mathcal{D}(Ku) + \alpha \mathcal{R}(u) \rightarrow \min_u$, where the functional $\mathcal{D}$ is a data fidelity term, depending on some input data $f$ and measuring the deviation of $Ku$ from it, and $\mathcal{R}$ is a regularization functional. Moreover, $K$ is an (often linear) forward operator modeling the dependence of data on an underlying image, and $\alpha$ is a positive regularization parameter. While $\mathcal{D}$ is often smooth and (strictly) convex, current practice almost exclusively uses nonsmooth regularization functionals. Most successful techniques use nonsmooth and convex functionals like the total variation and generalizations thereof, or $\ell_1$-norms of coefficients arising from scalar products with some frame system. The efficient solution of such variational problems in imaging demands appropriate algorithms. Taking into account the specific structure as a sum of two very different terms to be minimized, splitting algorithms are a quite canonical choice. Consequently, this field has revived interest in techniques like operator splittings and augmented Lagrangians. Here we provide an overview of currently developed methods and recent results, as well as some computational studies comparing different methods and illustrating their success in applications.
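
    Since the survey emphasizes splitting and augmented Lagrangian techniques for this two-term structure, here is a minimal ADMM sketch for the model with $\mathcal{D}(Ku) = \frac{1}{2}\|Ku - f\|^2$ and $\mathcal{R}(u) = \|u\|_1$, split as $\min_{u,z} \frac{1}{2}\|Ku - f\|^2 + \alpha \|z\|_1$ subject to $u = z$. The operator, data, and parameters are illustrative; real imaging problems replace the dense direct solve with structured or iterative solvers.

    ```python
    # ADMM (augmented Lagrangian splitting) sketch for
    # min_u 0.5*||Ku - f||^2 + alpha*||u||_1 via the split u = z.
    import numpy as np

    def admm_l1(K, f, alpha=0.2, rho=1.0, iters=200):
        n = K.shape[1]
        KtK, Ktf = K.T @ K, K.T @ f
        u = z = w = np.zeros(n)                   # w is the scaled dual variable
        for _ in range(iters):
            u = np.linalg.solve(KtK + rho * np.eye(n), Ktf + rho * (z - w))
            z = np.sign(u + w) * np.maximum(np.abs(u + w) - alpha / rho, 0.0)
            w = w + u - z                         # scaled dual (multiplier) update
        return z

    rng = np.random.default_rng(1)
    K = rng.standard_normal((30, 15))             # stand-in for a forward operator
    f = K @ np.eye(15)[0] + 0.05 * rng.standard_normal(30)
    print(admm_l1(K, f))
    ```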

    First-order primal-dual methods for nonsmooth nonconvex optimisation

    We provide an overview of primal-dual algorithms for nonsmooth and non-convex-concave saddle-point problems. The overview is built around a new analysis of such methods, using Bregman divergences to formulate simplified conditions for convergence.
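
    As a point of reference, the sketch below runs a standard primal-dual (Chambolle-Pock type) iteration on a convex model problem, 1D total-variation denoising, $\min_u \frac{1}{2}\|u - f\|^2 + \alpha \|Du\|_1$, written as the saddle-point problem $\min_u \max_{\|p\|_\infty \le \alpha} \langle Du, p \rangle + \frac{1}{2}\|u - f\|^2$. The paper's contribution concerns the harder nonsmooth nonconvex setting, where the step-size rules and convergence conditions differ; the signal, regularization weight, and step sizes here are illustrative.

    ```python
    # Primal-dual hybrid gradient (Chambolle-Pock) sketch for 1D TV denoising.
    import numpy as np

    def tv_denoise_1d(f, alpha=0.5, iters=500):
        n = f.size
        D = np.diff(np.eye(n), axis=0)            # forward-difference operator, ||D||^2 <= 4
        tau = sigma = 0.45                        # chosen so tau * sigma * ||D||^2 < 1
        u = f.copy()
        u_bar = f.copy()
        p = np.zeros(n - 1)
        for _ in range(iters):
            p = np.clip(p + sigma * (D @ u_bar), -alpha, alpha)  # dual ascent + projection
            u_new = (u - tau * (D.T @ p) + tau * f) / (1 + tau)  # primal proximal step
            u_bar = 2 * u_new - u                                # extrapolation
            u = u_new
        return u

    rng = np.random.default_rng(2)
    f = np.concatenate([np.zeros(30), np.ones(30)]) + 0.1 * rng.standard_normal(60)
    print(tv_denoise_1d(f))
    ```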