A Coordinate Descent Primal-Dual Algorithm and Application to Distributed Asynchronous Optimization
Based on the idea of randomized coordinate descent of α-averaged
operators, a randomized primal-dual optimization algorithm is introduced, where
a random subset of coordinates is updated at each iteration. The algorithm
builds upon a variant of a recent (deterministic) algorithm proposed by Vũ
and Condat that includes the well-known ADMM as a particular case. The obtained
algorithm is used to solve asynchronously a distributed optimization problem. A
network of agents, each having a separate cost function containing a
differentiable term, seeks to reach a consensus on the minimizer of the
aggregate objective. The method yields an algorithm in which, at each
iteration, a random subset of agents wakes up; these agents update their
local estimates, exchange data with their neighbors, and go idle. Numerical
results demonstrate the attractive
performance of the method. The general approach can be naturally adapted to
other situations where coordinate descent convex optimization algorithms are
used with a random choice of the coordinates.
Comment: 10 pages
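As a minimal sketch of the wake-up pattern described above (not the paper's primal-dual iteration itself), the following Python snippet runs a gossip-style consensus loop in which a random subset of agents takes a local gradient step and averages with its neighbors. The ring graph, step sizes, and activation probability are illustrative assumptions, and with a constant step size such a scheme only reaches a neighborhood of the exact consensus minimizer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: agent i privately minimizes f_i(x) = 0.5 * (x - a_i)^2,
# so the consensus minimizer of the aggregate objective is mean(a).
n_agents = 10
a = rng.normal(size=n_agents)           # private data of each agent
x = np.zeros(n_agents)                  # local estimates, one per agent
neighbors = {i: [(i - 1) % n_agents, (i + 1) % n_agents]
             for i in range(n_agents)}  # ring communication graph (assumed)

step = 0.1   # gradient step size (illustrative)
mix = 0.5    # weight on neighbors' estimates in the averaging step

for _ in range(5000):
    # A random subset of agents wakes up at this iteration...
    awake = np.flatnonzero(rng.random(n_agents) < 0.3)
    for i in awake:
        grad = x[i] - a[i]    # local gradient of the differentiable term f_i
        x[i] -= step * grad   # local update
        # ...exchanges data with its neighbors, then goes idle.
        x[i] = (1 - mix) * x[i] + mix * np.mean([x[j] for j in neighbors[i]])

print("consensus estimate:", x.mean(), "target:", a.mean())
```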
Convex and Network Flow Optimization for Structured Sparsity
We consider a class of learning problems regularized by a structured
sparsity-inducing norm defined as the sum of l_2- or l_infinity-norms over
groups of variables. Whereas much effort has been put into developing fast
optimization techniques when the groups are disjoint or embedded in a
hierarchy, we address here the case of general overlapping groups. To this end,
we present two different strategies: On the one hand, we show that the proximal
operator associated with a sum of l_infinity-norms can be computed exactly in
polynomial time by solving a quadratic min-cost flow problem, allowing the use
of accelerated proximal gradient methods. On the other hand, we use proximal
splitting techniques, and address an equivalent formulation with
non-overlapping groups, but in higher dimension and with additional
constraints. We propose efficient and scalable algorithms exploiting these two
strategies, which are significantly faster than alternative approaches. We
illustrate these methods with several problems such as CUR matrix
factorization, multi-task learning of tree-structured dictionaries, background
subtraction in video sequences, image denoising with wavelets, and topographic
dictionary learning of natural image patches.
Comment: to appear in the Journal of Machine Learning Research (JMLR)
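The flow-based computation for overlapping l_infinity groups is too involved to reproduce here, but the disjoint-group l_2 case mentioned above admits a closed-form proximal operator (block soft-thresholding). Below is a minimal sketch of that special case only; the function name and example data are illustrative, and the overlapping case would instead require the paper's quadratic min-cost flow solver.

```python
import numpy as np

def prox_group_l2(v, groups, lam):
    """Proximal operator of lam * sum of l2 norms over *disjoint* groups,
    i.e. block soft-thresholding: each group is shrunk toward zero, and
    groups whose norm falls below lam are zeroed out entirely."""
    out = v.copy()
    for g in groups:
        norm = np.linalg.norm(v[g])
        scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
        out[g] = scale * v[g]
    return out

# Example: three disjoint groups of a 6-dimensional vector.
v = np.array([3.0, -4.0, 0.5, 0.1, 2.0, -2.0])
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]
print(prox_group_l2(v, groups, lam=1.0))  # middle group is zeroed out
```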
A forward-backward view of some primal-dual optimization methods in image recovery
A wide array of image recovery problems can be abstracted into the problem of
minimizing a sum of composite convex functions in a Hilbert space. To solve
such problems, primal-dual proximal approaches have been developed which
provide efficient solutions to large-scale optimization problems. The objective
of this paper is to show that a number of existing algorithms can be derived
from a general form of the forward-backward algorithm applied in a suitable
product space. Our approach also allows us to develop useful extensions of
existing algorithms by introducing a variable metric. An illustration to image
restoration is provided.
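To make the building block concrete: one forward-backward iteration alternates an explicit gradient step on the smooth term with a proximal (implicit) step on the nonsmooth term. The sketch below applies the plain forward-backward algorithm to l1-regularized least squares, a standard surrogate for sparse image recovery; it illustrates only the underlying iteration, not the product-space primal-dual derivations of the paper, and all names and parameters are illustrative.

```python
import numpy as np

def forward_backward(A, b, lam, n_iter=500):
    """Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    Forward step: explicit gradient step on the smooth quadratic term.
    Backward step: proximal operator of lam*||.||_1 (soft-thresholding)."""
    x = np.zeros(A.shape[1])
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # step <= 1/L, L = ||A||_2^2
    for _ in range(n_iter):
        z = x - gamma * (A.T @ (A @ x - b))                        # forward
        x = np.sign(z) * np.maximum(np.abs(z) - gamma * lam, 0.0)  # backward
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0                       # 5-sparse ground truth
x_hat = forward_backward(A, A @ x_true, lam=0.1)
print("largest recovered coefficients:", np.argsort(-np.abs(x_hat))[:5])
```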
Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization
We propose a new randomized coordinate descent method for a convex
optimization template with broad applications. Our analysis relies on a novel
combination of four ideas applied to the primal-dual gap function: smoothing,
acceleration, homotopy, and coordinate descent with non-uniform sampling. As a
result, our method features the first convergence rate guarantees among
coordinate descent methods that are the best known under a variety of common
structure assumptions on the template. We provide numerical evidence to support
the theoretical results with a comparison to state-of-the-art algorithms.
Comment: NIPS 2017
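Of the four ingredients, coordinate descent with non-uniform sampling is the easiest to sketch in isolation. The snippet below applies it to a smooth toy quadratic, sampling each coordinate with probability proportional to its coordinate-wise Lipschitz constant; this illustrates only that single ingredient under assumed toy data, not the paper's smoothed, accelerated, homotopy-based method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Smooth toy objective f(x) = 0.5 * x^T Q x - c^T x with Q positive definite.
n = 50
M = rng.normal(size=(n, n))
Q = M.T @ M + np.eye(n)
c = rng.normal(size=n)

L = np.diag(Q).copy()   # coordinate-wise Lipschitz constants L_i = Q_ii
p = L / L.sum()         # non-uniform sampling: pick coordinate i w.p. ~ L_i

x = np.zeros(n)
for _ in range(20000):
    i = rng.choice(n, p=p)
    grad_i = Q[i] @ x - c[i]   # partial derivative along coordinate i
    x[i] -= grad_i / L[i]      # coordinate step with step size 1 / L_i
print("residual norm:", np.linalg.norm(Q @ x - c))
```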