
    The exact worst-case convergence rate of the alternating direction method of multipliers

    Recently, semidefinite programming performance estimation has been employed as a powerful tool for the worst-case performance analysis of first-order methods. In this paper, we derive new non-ergodic convergence rates for the alternating direction method of multipliers (ADMM) by using performance estimation. We give some examples that show the exactness of the given bounds. We also study the linear and R-linear convergence of ADMM. We establish that, in the presence of strong convexity, ADMM enjoys a global linear convergence rate if and only if the dual objective satisfies the Polyak-Łojasiewicz (PL) inequality. In addition, we give an explicit formula for the linear convergence rate factor. Moreover, we study the R-linear convergence of ADMM under two new scenarios.
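    The rates above concern the standard two-block ADMM iteration. Below is a minimal sketch of that iteration in scaled form for min f(x) + g(z) subject to x − z = 0, written generically in terms of the two proximal operators; the penalty parameter rho, the iteration count and the small demo problem are illustrative assumptions, not the setting analysed in the paper.

```python
# A minimal sketch of scaled-form ADMM for min_{x,z} f(x) + g(z) s.t. x - z = 0.
# rho, the iteration count and the demo problem are illustrative assumptions.
import numpy as np

def admm(prox_f, prox_g, n, rho=1.0, iters=200):
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)   # u is the scaled dual variable
    for _ in range(iters):
        x = prox_f(z - u, rho)   # x-update: argmin_x f(x) + (rho/2)||x - z + u||^2
        z = prox_g(x + u, rho)   # z-update: argmin_z g(z) + (rho/2)||x - z + u||^2
        u = u + x - z            # dual update on the scaled multiplier
    return x, z, u

# Demo: f(x) = 0.5*||x - b||^2 and g(z) = lam*||z||_1 (soft-thresholding prox).
b, lam = np.array([3.0, -0.2, 0.1, -2.5]), 0.5
prox_f = lambda v, rho: (b + rho * v) / (1.0 + rho)
prox_g = lambda v, rho: np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
x, z, u = admm(prox_f, prox_g, n=4)
print(z)   # the small entries of b are shrunk to zero by the l1 prox
```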

    An ADMM Algorithm for MPC-based Energy Management in Hybrid Electric Vehicles with Nonlinear Losses

    In this paper we present a convex formulation of the Model Predictive Control (MPC) optimisation for energy management in hybrid electric vehicles, together with an Alternating Direction Method of Multipliers (ADMM) algorithm for its solution. We develop a new proof of convexity for the problem that allows the nonlinear dynamics to be modelled as a linear system, and then demonstrate the performance of ADMM in comparison with Dynamic Programming (DP) through simulation. The results show up to two orders of magnitude improvement in solution time over DP at comparable accuracy.
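    For context, the sketch below formulates a toy convex MPC energy-split problem over a fixed horizon and hands it to CVXPY rather than a tailored ADMM solver; the horizon, demand profile, battery dynamics and limits, and the quadratic fuel surrogate are all illustrative assumptions, not the paper's hybrid-vehicle model.

```python
# A toy convex MPC energy-split problem over a fixed horizon, formulated with CVXPY
# in place of a tailored ADMM solver. Every model detail here is an illustrative
# assumption, not the paper's hybrid-vehicle model.
import cvxpy as cp
import numpy as np

N = 20                                                    # prediction horizon (assumed)
demand = 30.0 * np.abs(np.sin(np.linspace(0.0, 3.0, N)))  # assumed power demand [kW]
dt, cap, soc0 = 1.0 / 3600.0, 1.0, 0.6                    # 1 s steps, 1 kWh battery (assumed)

p_batt = cp.Variable(N)    # battery power [kW]
p_eng = cp.Variable(N)     # engine power [kW]
soc = cp.Variable(N + 1)   # battery state of charge

constraints = [
    soc[0] == soc0,
    soc[1:] == soc[0:N] - (dt / cap) * p_batt,  # linear battery dynamics
    soc >= 0.2, soc <= 0.9,                     # state-of-charge limits
    p_eng >= 0, p_eng <= 50,                    # engine power limits
    p_batt + p_eng == demand,                   # power balance with the drive cycle
]
fuel = cp.sum(0.02 * cp.square(p_eng) + p_eng)  # convex quadratic fuel surrogate (assumed)
cp.Problem(cp.Minimize(fuel), constraints).solve()
print(round(fuel.value, 2), np.round(p_eng.value[:5], 2))
```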

    On the Infimal Sub-differential Size of Primal-Dual Hybrid Gradient Method and Beyond

    The primal-dual hybrid gradient method (PDHG, a.k.a. the Chambolle-Pock method) is a well-studied algorithm for minimax optimization problems with a bilinear interaction term. Recently, PDHG has been used as the base algorithm for PDLP, a new LP solver that aims to solve large LP instances by taking advantage of modern computing resources such as GPUs and distributed systems. Most previous convergence results for PDHG are stated in terms of either the duality gap or the distance to the optimal solution set, both of which are usually hard to compute during the solving process. In this paper, we propose a new progress metric for analyzing PDHG, which we dub the infimal sub-differential size (IDS), by utilizing the geometry of PDHG iterates. IDS is a natural extension of the gradient norm of smooth problems to non-smooth problems, and it is tied to the KKT error in the case of LP. Compared to traditional progress metrics for PDHG, IDS always has a finite value and can be computed using only information from the current solution. We show that IDS decays monotonically, with an O(1/k) sublinear rate for convex-concave primal-dual problems and a linear rate when the problem further satisfies a regularity condition met by applications such as linear programming, quadratic programming, the TV-denoising model, etc. The simplicity of our analysis and the monotonic decay of IDS suggest that IDS is a natural progress metric for analyzing PDHG. As a by-product of our analysis, we show that the primal-dual gap of the last iterate of PDHG converges at an O(1/√k) rate for convex-concave problems. The analysis and results on PDHG directly generalize to other primal-dual algorithms, for example the proximal point method (PPM), the alternating direction method of multipliers (ADMM) and the linearized alternating direction method of multipliers (l-ADMM).
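    For a standard-form LP min{cᵀx : Ax = b, x ≥ 0}, the setting PDLP builds on, the PDHG iteration is a projected primal gradient step followed by a dual step at an extrapolated primal point. Below is a minimal sketch on a tiny LP; the problem data, step sizes and iteration count are illustrative assumptions, and the IDS progress metric itself is not computed here.

```python
# A minimal sketch of the PDHG iteration applied to a standard-form LP
# min{c'x : Ax = b, x >= 0}. The tiny problem data, step sizes and iteration
# count are illustrative assumptions.
import numpy as np

A = np.array([[1.0, 1.0]])   # single equality constraint x1 + x2 = 1 (assumed)
b = np.array([1.0])
c = np.array([1.0, 2.0])
tau = sigma = 0.9 / np.linalg.norm(A, 2)   # tau * sigma * ||A||^2 < 1 ensures convergence

x, y = np.zeros(2), np.zeros(1)
for _ in range(2000):
    x_new = np.maximum(0.0, x - tau * (c - A.T @ y))  # primal: projected gradient step
    y = y + sigma * (b - A @ (2 * x_new - x))         # dual: uses the extrapolated primal point
    x = x_new

print(np.round(x, 3))   # approaches the LP solution (1, 0)
```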

    An Extragradient-Based Alternating Direction Method for Convex Minimization

    In this paper, we consider the problem of minimizing the sum of two convex functions subject to linear linking constraints. Classical alternating direction type methods usually assume that the two convex functions have relatively easy proximal mappings. However, many problems arising from statistics, image processing and other fields have the structure that, while one of the two functions has an easy proximal mapping, the other function is convex and smooth but does not have an easy proximal mapping; therefore, the classical alternating direction methods cannot be applied. To deal with this difficulty, we propose an alternating direction method based on extragradients. Under the assumption that the smooth function has a Lipschitz continuous gradient, we prove that the proposed method returns an ϵ-optimal solution within O(1/ϵ) iterations. We apply the proposed method to solve a new statistical model called fused logistic regression. Our numerical experiments show that the proposed method performs very well on the test problems. We also test the performance of the proposed method on the lasso problem arising from statistics and compare the results with several existing efficient solvers for this problem; the results are very encouraging.
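    As a point of reference for the lasso comparison mentioned above, the sketch below solves the lasso problem min 0.5·‖Ax − b‖² + λ‖x‖₁ with the classical ADMM splitting (not the paper's extragradient-based method); the random data, λ and ρ are illustrative assumptions.

```python
# A minimal sketch of the lasso problem min 0.5*||Ax - b||^2 + lam*||x||_1 solved with
# the classical ADMM splitting (not the extragradient-based method proposed above).
# The random data, lam and rho are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
m, n = 30, 10
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:3] = [2.0, -1.5, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(m)
lam, rho = 0.5, 1.0

L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))   # factorization reused in every x-update
Atb = A.T @ b

x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
for _ in range(300):
    x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))  # ridge-type x-update
    z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)    # soft-thresholding z-update
    u = u + x - z                                                      # scaled dual update

print(np.round(z, 3))   # sparse estimate close to x_true
```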