
    Iteration Complexity Analysis of Multi-Block ADMM for a Family of Convex Minimization without Strong Convexity

    The alternating direction method of multipliers (ADMM) is widely used in solving structured convex optimization problems due to its superior practical performance. On the theoretical side, however, a counterexample was shown in [7] indicating that the multi-block ADMM for minimizing the sum of $N$ ($N \geq 3$) convex functions with $N$ block variables linked by linear constraints may diverge. It is therefore of great interest to investigate further sufficient conditions on the problem data which can guarantee convergence for the multi-block ADMM. The existing results typically require strong convexity on parts of the objective. In this paper, we present convergence and convergence rate results for the multi-block ADMM applied to solve certain $N$-block ($N \geq 3$) convex minimization problems without requiring strong convexity. Specifically, we prove the following two results: (1) the multi-block ADMM returns an $\epsilon$-optimal solution within $O(1/\epsilon^2)$ iterations by solving an associated perturbation of the original problem; (2) the multi-block ADMM returns an $\epsilon$-optimal solution within $O(1/\epsilon)$ iterations when it is applied to solve a certain sharing problem, under the condition that the augmented Lagrangian function satisfies the Kurdyka-Łojasiewicz property, which essentially covers most convex optimization models except for some pathological cases.
    Comment: arXiv admin note: text overlap with arXiv:1408.426
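    For readers unfamiliar with the scheme, the multi-block ADMM recursion discussed in both abstracts can be sketched as follows (generic notation $f_i$, $A_i$, $b$, penalty $\gamma$, multiplier $\lambda$; the papers' own symbols may differ):

```latex
% Multi-block ADMM for:   minimize    \sum_{i=1}^N f_i(x_i)
%                         subject to  \sum_{i=1}^N A_i x_i = b
% with augmented Lagrangian (penalty gamma > 0):
%   L_gamma(x_1,...,x_N; lambda) = \sum_i f_i(x_i)
%     - lambda^T (\sum_i A_i x_i - b) + (gamma/2) ||\sum_i A_i x_i - b||^2
\begin{align*}
x_i^{k+1} &\in \operatorname*{arg\,min}_{x_i}\;
  \mathcal{L}_\gamma\bigl(x_1^{k+1},\dots,x_{i-1}^{k+1},\, x_i,\,
  x_{i+1}^{k},\dots,x_N^{k};\, \lambda^k\bigr),
  \qquad i = 1,\dots,N, \\
\lambda^{k+1} &= \lambda^k
  - \gamma \Bigl( \textstyle\sum_{i=1}^{N} A_i x_i^{k+1} - b \Bigr).
\end{align*}
```

    Each sweep updates the blocks one at a time in Gauss-Seidel fashion, using the most recent values of the earlier blocks; the counterexample cited as [7] above shows that for $N \geq 3$ this sweep can diverge for general convex $f_i$ without additional assumptions.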

    On the Global Linear Convergence of the ADMM with Multi-Block Variables

    The alternating direction method of multipliers (ADMM) has been widely used for solving structured convex optimization problems. In particular, the ADMM can solve convex programs that minimize the sum of $N$ convex functions with $N$-block variables linked by some linear constraints. While the convergence of the ADMM for $N = 2$ was well established in the literature, it remained an open problem for a long time whether or not the ADMM for $N \geq 3$ is still convergent. Recently, it was shown in [3] that without further conditions the ADMM for $N \geq 3$ may actually fail to converge. In this paper, we show that under some easily verifiable and reasonable conditions the global linear convergence of the ADMM when $N \geq 3$ can still be assured, which is important since the ADMM is a popular method for solving large-scale multi-block optimization models and is known to perform very well in practice even when $N \geq 3$. Our study aims to offer an explanation for this phenomenon.
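    As a concrete illustration of the recursion sketched above, here is a minimal Python example of 3-block ADMM on a toy strongly convex quadratic. The problem data, penalty value, and iteration count are assumptions chosen for illustration, not taken from either paper; for general convex objectives, convergence requires extra conditions of the kind these papers study.

```python
import numpy as np

# Toy 3-block problem:
#   minimize    sum_i 0.5 * ||x_i - c_i||^2
#   subject to  A_1 x_1 + A_2 x_2 + A_3 x_3 = b
rng = np.random.default_rng(0)
n, m, N = 4, 3, 3
A = [rng.standard_normal((m, n)) for _ in range(N)]
c = [rng.standard_normal(n) for _ in range(N)]
b = rng.standard_normal(m)
gamma = 1.0                      # augmented-Lagrangian penalty (assumed value)

x = [np.zeros(n) for _ in range(N)]
lam = np.zeros(m)

for k in range(500):
    for i in range(N):
        # Gauss-Seidel pass: minimize the augmented Lagrangian over x_i with
        # the other blocks fixed at their most recent values. For these
        # quadratic f_i the subproblem reduces to the linear system
        #   (I + gamma * A_i^T A_i) x_i = c_i + A_i^T lam - gamma * A_i^T (r - b)
        # where r is the constraint term excluding block i.
        r = sum(A[j] @ x[j] for j in range(N) if j != i)
        rhs = c[i] + A[i].T @ lam - gamma * A[i].T @ (r - b)
        x[i] = np.linalg.solve(np.eye(n) + gamma * A[i].T @ A[i], rhs)
    # Dual ascent step on the multiplier.
    lam -= gamma * (sum(A[j] @ x[j] for j in range(N)) - b)

print("constraint residual:",
      np.linalg.norm(sum(A[j] @ x[j] for j in range(N)) - b))
```

    Here each block subproblem is solved exactly in closed form because the objectives are quadratic; in general the per-block minimizations may themselves require an iterative solver.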