Optimal Convergence Rates for Generalized Alternating Projections
Generalized alternating projections is an algorithm that alternates relaxed
projections onto a finite number of sets to find a point in their intersection.
We consider the special case of two linear subspaces, for which the algorithm
reduces to a matrix iteration. For convergent matrix iterations, the asymptotic
rate is linear and decided by the magnitude of the subdominant eigenvalue. In
this paper, we show how to select the three algorithm parameters to optimize
this magnitude, and hence the asymptotic convergence rate. The obtained rate
depends on the Friedrichs angle between the subspaces and is considerably
better than known rates for other methods such as alternating projections and
Douglas-Rachford splitting. We also present an adaptive scheme that, online,
estimates the Friedrichs angle and updates the algorithm parameters based on
this estimate. A numerical example is provided that supports our theoretical
claims and shows very good performance for the adaptive method.
Comment: 20 pages, extended version of article submitted to CD
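The iteration described above is concrete enough to sketch. Below is a minimal numpy illustration on two lines in R^2, with one averaging parameter and two relaxation parameters; the exact parameterization and the example data are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def proj_line(u):
    """Orthogonal projector onto the line spanned by u."""
    u = u / np.linalg.norm(u)
    return np.outer(u, u)

# Two lines through the origin in R^2; for lines, the Friedrichs angle
# is simply the angle between them (example data is assumed).
theta = 0.3
P1 = proj_line(np.array([1.0, 0.0]))
P2 = proj_line(np.array([np.cos(theta), np.sin(theta)]))

def gap_step(x, alpha, beta1, beta2):
    """One generalized alternating projections step: two relaxed
    projections (1 - b)I + b*P composed, then averaged with weight alpha."""
    y = (1 - beta1) * x + beta1 * (P1 @ x)
    z = (1 - beta2) * y + beta2 * (P2 @ y)
    return (1 - alpha) * x + alpha * z

# With alpha = beta1 = beta2 = 1 this reduces to plain alternating
# projections, whose linear rate is cos^2(theta); the paper's point is
# that other parameter choices achieve a better rate.
x = np.array([1.0, 2.0])
for _ in range(200):
    x = gap_step(x, alpha=1.0, beta1=1.0, beta2=1.0)

print(np.linalg.norm(x))  # the intersection is {0}, so this is essentially zero
```

The rate's dependence on the angle is visible here: shrinking `theta` toward zero makes the iteration markedly slower.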
A new projection method for finding the closest point in the intersection of convex sets
In this paper we present a new iterative projection method for finding the
closest point in the intersection of convex sets to any arbitrary point in a
Hilbert space. This method, termed AAMR for averaged alternating modified
reflections, can be viewed as a suitable modification of the Douglas-Rachford
method that yields a solution to the best approximation problem. Under a
constraint qualification at the point of interest, we show strong convergence
of the method. In fact, the so-called strong CHIP fully characterizes the
convergence of the AAMR method for every point in the space. We report some
promising numerical experiments where we compare the performance of AAMR
against other projection methods for finding the closest point in the
intersection of pairs of finite-dimensional subspaces.
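A compact sketch of the AAMR scheme for two subspaces, consistent with the description above: modified (under-relaxed) reflections composed and averaged, applied to the sets shifted by the query point. The parameter values and the example planes are assumptions for illustration:

```python
import numpy as np

def aamr(PA, PB, q, beta=0.7, alpha=0.5, iters=2000):
    """Averaged alternating modified reflections (sketch).
    PA, PB are orthogonal projectors onto subspaces A and B; returns an
    approximation of the closest point to q in the intersection of A and B."""
    # Work on the shifted sets A - q and B - q, whose projections are
    # p(x) = P(x + q) - q when P is a linear projector.
    pA = lambda x: PA @ (x + q) - q
    pB = lambda x: PB @ (x + q) - q
    x = np.zeros_like(q)
    for _ in range(iters):
        rA = 2 * beta * pA(x) - x           # modified reflection in A - q
        rB = 2 * beta * pB(rA) - rA         # modified reflection in B - q
        x = (1 - alpha) * x + alpha * rB    # averaging step
    return PA @ (x + q)                      # shadow point, back in A

# Assumed example: two planes in R^3 meeting along the x-axis, so the
# closest point to q in the intersection is (q[0], 0, 0).
PA = np.diag([1.0, 1.0, 0.0])                   # projector onto the xy-plane
phi = np.pi / 3
n = np.array([0.0, -np.sin(phi), np.cos(phi)])  # unit normal of the 2nd plane
PB = np.eye(3) - np.outer(n, n)
q = np.array([1.0, 2.0, 3.0])

p = aamr(PA, PB, q)
print(p)  # approximately [1, 0, 0]
```

Note the contrast with plain Douglas-Rachford, whose shadow sequence finds *some* point of the intersection but not, in general, the closest one to q; the beta < 1 under-relaxation is what turns it into a best-approximation method.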
Activity Identification and Local Linear Convergence of Douglas-Rachford/ADMM under Partial Smoothness
Convex optimization has become ubiquitous in most quantitative disciplines of
science, including variational image processing. Proximal splitting algorithms
are becoming popular to solve such structured convex optimization problems.
Within this class of algorithms, Douglas-Rachford (DR) and the alternating
direction method of multipliers (ADMM) are designed to minimize the sum of two
proper lower semi-continuous convex functions whose proximity operators are
easy to compute. The goal of this work is to understand the local convergence
behaviour of DR (resp. ADMM) when the involved functions (resp. their
Legendre-Fenchel conjugates) are moreover partly smooth. More precisely, when
both of the two functions (resp. their conjugates) are partly smooth relative
to their respective manifolds, we show that DR (resp. ADMM) identifies these
manifolds in finite time. Moreover, when these manifolds are affine or linear,
we prove that DR/ADMM is locally linearly convergent. When the two functions (resp. their conjugates) are
locally polyhedral, we show that the optimal convergence radius is given in
terms of the cosine of the Friedrichs angle between the tangent spaces of the
identified manifolds. This is illustrated by several concrete examples and
supported by numerical experiments.
Comment: 17 pages, 1 figure, published in the proceedings of the Fifth International Conference on Scale Space and Variational Methods in Computer Vision
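The "sum of two functions with easy proximity operators" setting above can be sketched on a small lasso-type problem, where the l1 term is polyhedral and its active manifold is the support of the solution. The instance and parameter values are assumptions for illustration:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def douglas_rachford_lasso(A, b, lam, gamma=1.0, iters=500):
    """Douglas-Rachford for min lam*||x||_1 + 0.5*||Ax - b||^2 (sketch).
    Both proximity operators are cheap, and the l1 term is polyhedral,
    which is the setting where the support is identified in finite time."""
    n = A.shape[1]
    # prox of gamma*0.5*||Ax - b||^2 solves (I + gamma A^T A) x = v + gamma A^T b
    M = np.eye(n) + gamma * (A.T @ A)
    Atb = A.T @ b
    z = np.zeros(n)
    for _ in range(iters):
        x = soft_threshold(z, gamma * lam)               # prox of the l1 term
        y = np.linalg.solve(M, 2 * x - z + gamma * Atb)  # prox of the quadratic
        z = z + y - x                                    # DR governing sequence
    return soft_threshold(z, gamma * lam)

# Sanity instance (assumed): with A = I the minimizer is soft_threshold(b, lam).
A = np.eye(3)
b = np.array([2.0, -0.5, 1.5])
x = douglas_rachford_lasso(A, b, lam=1.0)
print(x)  # approximately [1, 0, 0.5]
```

Once the zero pattern of the iterates freezes, the remaining iteration is affine, which is exactly why local linear convergence (with a rate governed by the Friedrichs angle) kicks in.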
Nonconvex notions of regularity and convergence of fundamental algorithms for feasibility problems
We consider projection algorithms for solving (nonconvex) feasibility
problems in Euclidean spaces. Of special interest are the Method of Alternating
Projections (MAP) and the Douglas-Rachford or Averaged Alternating Reflection
Algorithm (AAR). In the case of convex feasibility, firm nonexpansiveness of
projection mappings is a global property that yields global convergence of MAP
and, for consistent problems, of AAR. Based on the (ε, δ)-regularity of sets
developed by Bauschke, Luke, Phan and Wang in 2012, a relaxed local version of
firm nonexpansiveness with respect to the intersection is introduced for
consistent feasibility problems. Together with a coercivity condition that
relates to the regularity of the intersection, this yields local linear
convergence of MAP for a wide class of nonconvex problems.
Comment: 22 pages, no figures, 30 references
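A tiny instance of the nonconvex setting above: MAP between a circle (nonconvex) and a line in the plane. Near a point of the intersection the local linear convergence the abstract describes is plainly visible; the specific sets and starting point are assumptions for illustration:

```python
import numpy as np

def proj_circle(x, r=1.0):
    """Projection onto the circle of radius r -- a nonconvex set."""
    nrm = np.linalg.norm(x)
    return x * (r / nrm) if nrm > 0 else np.array([r, 0.0])

def proj_line(x, c=0.6):
    """Projection onto the horizontal line y = c -- a convex set."""
    return np.array([x[0], c])

# MAP started near the intersection point (0.8, 0.6) of the unit circle
# and the line y = 0.6; local linear convergence is what the regularity
# theory predicts for such nonconvex problems.
x = np.array([0.9, 0.5])
for _ in range(100):
    x = proj_circle(proj_line(x))

print(x)  # approximately [0.8, 0.6]
```

Started far from the intersection (say on the opposite side of the circle), the same iteration can stall or find the other intersection point, which is why the guarantees here are local rather than global.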
Alternating Projections and Douglas-Rachford for Sparse Affine Feasibility
The problem of finding a vector with the fewest nonzero elements that
satisfies an underdetermined system of linear equations is an NP-complete
problem that is typically solved numerically via convex heuristics or
nicely-behaved nonconvex relaxations. In this work we consider elementary
methods based on projections for solving a sparse feasibility problem without
employing convex heuristics. In a recent paper Bauschke, Luke, Phan and Wang
(2014) showed that, locally, the fundamental method of alternating projections
must converge linearly to a solution to the sparse feasibility problem with an
affine constraint. In this paper we apply different analytical tools that allow
us to show global linear convergence of alternating projections under familiar
constraint qualifications. These analytical tools can also be applied to other
algorithms. This is demonstrated with the prominent Douglas-Rachford algorithm
where we establish local linear convergence of this method applied to the
sparse affine feasibility problem.
Comment: 29 pages, 2 figures, 37 references. Much expanded version from last submission. Title changed to reflect new development
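The sparse affine feasibility problem above is easy to state in code: alternate between the (nonconvex) projection onto the s-sparse vectors, which keeps the s largest-magnitude entries, and the exact projection onto the affine set {x : Ax = b}. The instance below is an assumption for illustration, and convergence to the planted solution is not guaranteed without the constraint qualifications the paper discusses:

```python
import numpy as np

def proj_sparse(x, s):
    """Projection onto {x : ||x||_0 <= s}: keep the s largest-magnitude
    entries (nonconvex; not unique in case of ties)."""
    y = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    y[idx] = x[idx]
    return y

def proj_affine(x, A, pinvA, b):
    """Exact projection onto the affine set {x : Ax = b}."""
    return x - pinvA @ (A @ x - b)

# Assumed instance: an underdetermined system with a planted 2-sparse solution.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 8))
x_true = np.zeros(8)
x_true[[1, 5]] = [2.0, -3.0]
b = A @ x_true
pinvA = np.linalg.pinv(A)

x = pinvA @ b                       # least-norm starting point
for _ in range(500):
    x = proj_sparse(proj_affine(x, A, pinvA, b), s=2)

# x is 2-sparse by construction; the residual ||Ax - b|| is small exactly
# when alternating projections has found a sparse solution.
print(np.count_nonzero(x), np.linalg.norm(A @ x - b))
```

The Douglas-Rachford variant analyzed in the paper replaces the projection composition with reflections and averaging, but uses the same two projectors.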