A Primal-Dual Algorithmic Framework for Constrained Convex Minimization
We present a primal-dual algorithmic framework to obtain approximate
solutions to a prototypical constrained convex optimization problem, and
rigorously characterize how common structural assumptions affect numerical
efficiency. Our main analysis technique provides a fresh perspective on
Nesterov's excessive gap technique in a structured fashion and unifies it with
smoothing and primal-dual methods. For instance, through the choices of a dual
smoothing strategy and a center point, our framework subsumes decomposition
algorithms, the augmented Lagrangian method, and the alternating direction
method of multipliers as special cases, and provides optimal
convergence rates on the primal objective residual as well as the primal
feasibility gap of the iterates for all.
Comment: This paper consists of 54 pages with 7 tables and 12 figures.
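The abstract notes that the alternating direction method of multipliers (ADMM) arises as a special case of the framework. As an illustration of the general pattern it refers to, here is a minimal sketch of standard ADMM applied to a toy lasso problem (min 0.5||Ax - b||^2 + lam||z||_1 subject to x = z); the function name `admm_lasso` and the specific problem are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    # Illustrative ADMM sketch (not the paper's method):
    # minimize 0.5*||Ax - b||^2 + lam*||z||_1  subject to  x - z = 0.
    n = A.shape[1]
    AtA = A.T @ A
    Atb = A.T @ b
    # Factor once; the x-update solves (A^T A + rho I) x = A^T b + rho (z - u).
    L = np.linalg.cholesky(AtA + rho * np.eye(n))
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    for _ in range(iters):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update is the prox of the l1 norm: elementwise soft-thresholding.
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)
        u = u + x - z  # dual (running residual) update
    return z
```

The split mirrors the abstract's theme: the smooth data-fitting term and the nonsmooth regularizer are handled by separate, cheap subproblems coupled through a dual variable.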
Towards Fast-Convergence, Low-Delay and Low-Complexity Network Optimization
Distributed network optimization has been studied for well over a decade.
However, we still do not have a good idea of how to design schemes that can
simultaneously provide good performance across the dimensions of utility
optimality, convergence speed, and delay. To address these challenges, in this
paper, we propose a new algorithmic framework with all these metrics
approaching optimality. The salient features of our new algorithm are
three-fold: (i) fast convergence: it converges with only
iterations, the fastest rate among all existing algorithms; (ii)
low delay: it guarantees optimal utility with finite queue length; (iii) simple
implementation: the control variables of this algorithm are based on virtual
queues that do not require maintaining per-flow information. The new technique
builds on a kind of inexact Uzawa method in the Alternating Direction Method
of Multipliers, and provides a new theoretical path to proving the global and
linear convergence rate of such a method without requiring the constraint
matrix to have full rank.
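The virtual-queue idea the abstract describes can be sketched on the simplest possible instance: dual, queue-length-based rate control for network utility maximization with log utilities and a single link. The function `num_dual`, the single-link topology, and the step size are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def num_dual(capacity=1.0, n_flows=3, step=0.05, iters=2000):
    # Illustrative sketch: maximize sum_i log(x_i) s.t. sum_i x_i <= capacity,
    # using a virtual queue q as the (scaled) link price. No per-flow state
    # is kept: every flow reacts to the same queue length.
    q = 0.0
    x = np.zeros(n_flows)
    for _ in range(iters):
        price = max(q, 1e-9)
        # Each flow maximizes log(x_i) - price * x_i  =>  x_i = 1/price (capped).
        x = np.full(n_flows, min(1.0 / price, capacity))
        # Virtual-queue update: arrivals minus service, clipped at zero.
        q = max(q + step * (x.sum() - capacity), 0.0)
    return x, q
```

At the fixed point the aggregate rate matches the link capacity and the queue length stabilizes at the optimal dual price (here n_flows/capacity), which is the sense in which "finite queue length" coexists with optimal utility.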
Alternating Direction Methods for Latent Variable Gaussian Graphical Model Selection
Chandrasekaran, Parrilo and Willsky (2010) proposed a convex optimization
problem to characterize graphical model selection in the presence of unobserved
variables. This convex optimization problem aims to estimate an inverse
covariance matrix that can be decomposed into a sparse matrix minus a low-rank
matrix from sample data. Solving this convex optimization problem is very
challenging, especially for large problems. In this paper, we propose two
alternating direction methods for solving this problem. The first method is to
apply the classical alternating direction method of multipliers to solve the
problem as a consensus problem. The second method is a proximal gradient based
alternating direction method of multipliers. Our methods exploit and take
advantage of the special structure of the problem and thus can solve large
problems very efficiently. A global convergence result is established for the
proposed methods. Numerical results on both synthetic data and gene expression
data show that our methods usually solve problems with one million variables in
one to two minutes, and are usually five to thirty-five times faster than a
state-of-the-art Newton-CG proximal point algorithm.
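The sparse-minus-low-rank structure this abstract exploits rests on two proximal operators that appear in the subproblems of such alternating direction methods: elementwise soft-thresholding for the sparse part and singular-value soft-thresholding for the low-rank part. The following is a generic sketch of those two operators (function names are illustrative), not the authors' solver.

```python
import numpy as np

def prox_l1(M, t):
    # Prox of t*||.||_1: elementwise soft-thresholding, promotes sparsity.
    return np.sign(M) * np.maximum(np.abs(M) - t, 0.0)

def prox_trace(M, t):
    # Prox of t*||.||_* (nuclear norm): soft-threshold the singular values,
    # promotes low rank.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt
```

Both operators have closed forms, which is why each alternating-direction subproblem stays cheap even at the million-variable scale the abstract reports.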