
    Douglas-Rachford Splitting: Complexity Estimates and Accelerated Variants

    We propose a new approach for analyzing convergence of the Douglas-Rachford splitting (DRS) method for solving convex composite optimization problems. The approach is based on a continuously differentiable function, the Douglas-Rachford Envelope (DRE), whose stationary points correspond to the solutions of the original (possibly nonsmooth) problem. By proving the equivalence between the DRS method and a scaled gradient method applied to the DRE, results from smooth unconstrained optimization are employed to analyze convergence properties of DRS, to tune the method, and to derive an accelerated variant of it.
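
    For reference, the basic Douglas-Rachford splitting iteration that this abstract builds on can be sketched in a few lines of Python. The sketch below applies it to an illustrative lasso-type problem, min 0.5*||A x - b||^2 + lam*||x||_1; the instance, the fixed step size gamma, and the helper names are assumptions for illustration, and the DRE-based tuning and acceleration described above are not reproduced here.

        import numpy as np

        def drs(prox_f, prox_g, s0, gamma, iters=200):
            """Plain Douglas-Rachford splitting for min f(x) + g(x)."""
            s = s0.copy()
            for _ in range(iters):
                x = prox_f(s, gamma)          # x^k = prox_{gamma f}(s^k)
                z = prox_g(2 * x - s, gamma)  # z^k = prox_{gamma g}(2 x^k - s^k)
                s = s + (z - x)               # s^{k+1} = s^k + (z^k - x^k)
            return x

        # Illustrative instance: f(x) = 0.5*||A x - b||^2, g(x) = lam*||x||_1
        rng = np.random.default_rng(0)
        A, b, lam = rng.standard_normal((30, 10)), rng.standard_normal(30), 0.1

        def prox_f(v, gamma):
            # argmin_x 0.5*||A x - b||^2 + (1/(2*gamma))*||x - v||^2
            n = A.shape[1]
            return np.linalg.solve(gamma * A.T @ A + np.eye(n), gamma * A.T @ b + v)

        def prox_g(v, gamma):
            # soft-thresholding, the prox of lam*||.||_1
            return np.sign(v) * np.maximum(np.abs(v) - gamma * lam, 0.0)

        x_star = drs(prox_f, prox_g, s0=np.zeros(10), gamma=1.0)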

    On Convergence of Heuristics Based on Douglas-Rachford Splitting and ADMM to Minimize Convex Functions over Nonconvex Sets

    Recently, heuristics based on the Douglas-Rachford splitting algorithm and the alternating direction method of multipliers (ADMM) have found empirical success in minimizing convex functions over nonconvex sets, but little has been done to improve their theoretical understanding. In this paper, we investigate convergence of these heuristics. First, we characterize optimal solutions of minimization problems involving convex cost functions over nonconvex constraint sets. We show that these optimal solutions are related to the fixed-point set of the underlying nonconvex Douglas-Rachford operator. Next, we establish sufficient conditions under which the Douglas-Rachford splitting heuristic either converges to a point or its cluster points form a nonempty compact connected set. In the case where the heuristic converges to a point, we establish sufficient conditions for that point to be an optimal solution. Then, we discuss how the ADMM heuristic can be constructed from the Douglas-Rachford splitting algorithm. We show that, unlike in the convex case, the algorithms in our nonconvex setup are not equivalent to each other and have a rather involved relationship. Finally, we comment on convergence of the ADMM heuristic and compare it with the Douglas-Rachford splitting heuristic. (Comment: 11 pages)
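
    A minimal sketch of the kind of heuristic studied here: the same Douglas-Rachford iteration, but with a (generally set-valued) projection onto the nonconvex constraint set in place of one of the proximal steps. The binary set {0, 1}^n, the quadratic cost, and the fixed step gamma below are assumptions chosen purely for illustration; as the abstract notes, convergence is not guaranteed in general.

        import numpy as np

        def dr_heuristic(prox_f, proj_C, s0, gamma, iters=100):
            """Douglas-Rachford heuristic: convex cost f, nonconvex constraint set C."""
            s = s0.copy()
            for _ in range(iters):
                x = prox_f(s, gamma)      # proximal step on the convex cost
                z = proj_C(2 * x - s)     # projection onto the nonconvex set
                s = s + (z - x)
            return proj_C(x)              # return a feasible candidate point

        # Illustration: minimize 0.5*||x - c||^2 over the nonconvex set C = {0, 1}^n
        c = np.array([0.9, 0.2, 0.6, 0.4])

        def prox_f(v, gamma):
            return (v + gamma * c) / (1.0 + gamma)      # prox of 0.5*||x - c||^2

        def proj_C(v):
            return (v >= 0.5).astype(float)             # nearest point in {0, 1}^n

        x_bin = dr_heuristic(prox_f, proj_C, s0=np.zeros(4), gamma=1.0)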

    Linear Convergence and Metric Selection for Douglas-Rachford Splitting and ADMM

    Recently, several convergence rate results for Douglas-Rachford splitting and the alternating direction method of multipliers (ADMM) have been presented in the literature. In this paper, we show global linear convergence rate bounds for Douglas-Rachford splitting and ADMM under strong convexity and smoothness assumptions. We further show that the rate bounds are tight for the class of problems under consideration and for all feasible algorithm parameters. For problems that satisfy the assumptions, we show how to select the step-size and metric that optimize the derived convergence rate bounds. For problems with a similar structure that do not satisfy the assumptions, we present heuristic step-size and metric selection methods.
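
    The flavor of the step-size question can be seen numerically: on a quadratic instance both proximal maps are affine, so the Douglas-Rachford fixed-point map is affine and the spectral norm of its linear part bounds the linear convergence factor. The sketch below scans a few values of gamma for an assumed instance (f a strongly convex, smooth quadratic and g a quadratic regularizer); it only illustrates the tuning idea and does not reproduce the paper's tight rate bounds or its metric selection.

        import numpy as np

        # Assumed instance: f(x) = 0.5*x^T P x with sigma*I <= P <= beta*I (strongly
        # convex and smooth) and g(x) = 0.5*mu*||x||^2, so both prox maps are linear.
        rng = np.random.default_rng(1)
        Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))
        eigs = np.linspace(1.0, 100.0, 8)               # sigma = 1, beta = 100
        P = Q @ np.diag(eigs) @ Q.T
        mu, I = 0.5, np.eye(8)

        def dr_linear_part(gamma):
            """Linear part of the DR map s -> s + prox_g(2*prox_f(s) - s) - prox_f(s)."""
            Af = np.linalg.inv(I + gamma * P)           # prox_{gamma f}: s -> Af s
            Ag = I / (1.0 + gamma * mu)                 # prox_{gamma g}: v -> Ag v
            return I + Ag @ (2 * Af - I) - Af

        for gamma in [0.01, 0.1, 1.0 / np.sqrt(eigs[0] * eigs[-1]), 1.0, 10.0]:
            factor = np.linalg.norm(dr_linear_part(gamma), 2)   # spectral norm
            print(f"gamma = {gamma:7.3f}   contraction factor <= {factor:.4f}")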

    Inertial Douglas-Rachford splitting for monotone inclusion problems

    We propose an inertial Douglas-Rachford splitting algorithm for finding the set of zeros of the sum of two maximally monotone operators in Hilbert spaces and investigate its convergence properties. To this end, we first formulate an inertial version of the Krasnosel'skiĭ-Mann algorithm for approximating the set of fixed points of a nonexpansive operator, for which we also provide an exhaustive convergence analysis. Using a product space approach, we apply these results to monotone inclusion problems involving linearly composed and parallel-sum type operators, obtaining iterative schemes in which each of the maximally monotone mappings is accessed separately via its resolvent. We also consider the special instance of solving a primal-dual pair of nonsmooth convex optimization problems and illustrate the theoretical results via numerical experiments in clustering and location theory. (Comment: arXiv admin note: text overlap with arXiv:1402.529)
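
    A sketch of the inertial scheme's structure: an inertial Krasnosel'skiĭ-Mann step applied to the Douglas-Rachford operator, written here with constant inertia alpha and relaxation lam for simplicity. The parameter conditions that actually guarantee convergence are the subject of the paper and are not encoded below; the proximal maps and the constants are assumptions for illustration.

        import numpy as np

        def inertial_drs(prox_f, prox_g, s0, gamma, alpha=0.2, lam=1.0, iters=200):
            """Inertial Krasnosel'skii-Mann iteration applied to the DR operator T:

            w^k     = s^k + alpha*(s^k - s^{k-1})   (inertial extrapolation)
            s^{k+1} = w^k + lam*(T(w^k) - w^k)
            """
            s_prev, s = s0.copy(), s0.copy()
            for _ in range(iters):
                w = s + alpha * (s - s_prev)
                x = prox_f(w, gamma)
                z = prox_g(2 * x - w, gamma)
                s_prev, s = s, w + lam * (z - x)    # T(w) - w = z - x
            return x

        # Illustrative use on min 0.5*||x - c||^2 + ||x||_1 (both prox maps closed form)
        c = np.array([2.0, -0.3, 0.7])
        prox_f = lambda v, g: (v + g * c) / (1.0 + g)
        prox_g = lambda v, g: np.sign(v) * np.maximum(np.abs(v) - g, 0.0)
        x_opt = inertial_drs(prox_f, prox_g, s0=np.zeros(3), gamma=1.0)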