Optimal Convergence Rates for Generalized Alternating Projections
Generalized alternating projections is an algorithm that alternates relaxed
projections onto a finite number of sets to find a point in their intersection.
We consider the special case of two linear subspaces, for which the algorithm
reduces to a matrix iteration. For convergent matrix iterations, the asymptotic
rate is linear and decided by the magnitude of the subdominant eigenvalue. In
this paper, we show how to select the three algorithm parameters to optimize
this magnitude, and hence the asymptotic convergence rate. The obtained rate
depends on the Friedrichs angle between the subspaces and is considerably
better than known rates for other methods such as alternating projections and
Douglas-Rachford splitting. We also present an adaptive scheme that, online,
estimates the Friedrichs angle and updates the algorithm parameters based on
this estimate. A numerical example is provided that supports our theoretical
claims and shows very good performance for the adaptive method.
Comment: 20 pages, extended version of article submitted to CD
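To make the setting concrete, the sketch below runs plain (unrelaxed) alternating projections onto two random subspaces and compares the observed per-sweep rate with cos^2(theta_F), where theta_F is the Friedrichs angle computed from the principal angles. The dimensions are arbitrary and this is not the optimized GAP scheme of the paper.

```python
# Minimal numerical sketch (assumptions: plain, unrelaxed alternating projections
# rather than the paper's optimized GAP scheme; arbitrary problem dimensions).
import numpy as np

rng = np.random.default_rng(0)
n, d1, d2 = 50, 10, 15
Q1, _ = np.linalg.qr(rng.standard_normal((n, d1)))   # orthonormal basis of U
Q2, _ = np.linalg.qr(rng.standard_normal((n, d2)))   # orthonormal basis of V
P1, P2 = Q1 @ Q1.T, Q2 @ Q2.T                        # orthogonal projectors

# Cosines of the principal angles are the singular values of Q1^T Q2; the
# Friedrichs angle is the smallest principal angle whose cosine is < 1.
cosines = np.linalg.svd(Q1.T @ Q2, compute_uv=False)
cos_F = max(c for c in cosines if c < 1 - 1e-10)

x = rng.standard_normal(n)
norms = []
for _ in range(100):
    x = P1 @ (P2 @ x)                    # one alternating-projection sweep
    norms.append(np.linalg.norm(x))      # distance to U ∩ V (generically {0} here)

print("cos^2(Friedrichs angle):", cos_F**2)
print("observed per-sweep rate:", norms[-1] / norms[-2])
```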
Alternating Projections and Douglas-Rachford for Sparse Affine Feasibility
The problem of finding a vector with the fewest nonzero elements that
satisfies an underdetermined system of linear equations is an NP-complete
problem that is typically solved numerically via convex heuristics or
nicely-behaved nonconvex relaxations. In this work we consider elementary
methods based on projections for solving a sparse feasibility problem without
employing convex heuristics. In a recent paper Bauschke, Luke, Phan and Wang
(2014) showed that, locally, the fundamental method of alternating projections
must converge linearly to a solution to the sparse feasibility problem with an
affine constraint. In this paper we apply different analytical tools that allow
us to show global linear convergence of alternating projections under familiar
constraint qualifications. These analytical tools can also be applied to other
algorithms. This is demonstrated with the prominent Douglas-Rachford algorithm
where we establish local linear convergence of this method applied to the
sparse affine feasibility problem.
Comment: 29 pages, 2 figures, 37 references. Much expanded version from last submission. Title changed to reflect new development.
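As a rough illustration of the elementary projection approach (with made-up problem data and a hand-picked sparsity level, not anything from the paper), one can alternate the projection onto the affine set {x : Ax = b}, computed via the pseudoinverse, with the projection onto the nonconvex set of s-sparse vectors, which keeps the s largest-magnitude entries:

```python
# Sketch of alternating projections for sparse affine feasibility (illustrative
# problem sizes and data are assumptions; recovery of the planted solution from
# this starting point is not guaranteed in general).
import numpy as np

rng = np.random.default_rng(1)
m, n, s = 40, 100, 5
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.standard_normal(s)
b = A @ x_true                               # consistent system with an s-sparse solution

A_pinv = np.linalg.pinv(A)

def proj_affine(x):
    """Orthogonal projection onto {x : Ax = b}."""
    return x - A_pinv @ (A @ x - b)

def proj_sparse(x, s):
    """Projection onto the (nonconvex) set of s-sparse vectors: hard thresholding."""
    y = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    y[idx] = x[idx]
    return y

x = np.zeros(n)
for _ in range(500):
    x = proj_sparse(proj_affine(x), s)       # one alternating-projection step

print("residual ||Ax - b|| =", np.linalg.norm(A @ x - b))
print("distance to planted solution =", np.linalg.norm(x - x_true))
```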
Convergence Analysis and Improvements for Projection Algorithms and Splitting Methods
Non-smooth convex optimization problems occur in all fields of engineering. A common approach to solving this class of problems is proximal algorithms, or splitting methods. These first-order optimization algorithms are often simple, well suited to large-scale problems, and have a low computational cost per iteration. Essentially, they encode the solution to an optimization problem as a fixed point of some operator, and iterating this operator eventually results in convergence to an optimal point. However, as for other first-order methods, the convergence rate depends heavily on the conditioning of the problem. Even though the per-iteration cost is usually low, the number of iterations can become prohibitively large for ill-conditioned problems, especially if a high-accuracy solution is sought.

In this thesis, a few methods for alleviating this slow convergence are studied, which can be divided into two main approaches. The first consists of heuristic methods that can be applied to a range of fixed-point algorithms. They are based on understanding typical behavior of these algorithms. While these methods are shown to converge, they come with no guarantees on improved convergence rates.

The other approach studies the theoretical rates of a class of projection methods that are used to solve convex feasibility problems. These are problems where the goal is to find a point in the intersection of two, or possibly more, convex sets. A study of how the parameters in the algorithm affect the theoretical convergence rate is presented, as well as how they can be chosen to optimize this rate.
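A minimal example of this fixed-point viewpoint, chosen here for illustration and not drawn from the thesis, is proximal gradient (forward-backward) splitting applied to the lasso, where the solution is a fixed point of a gradient step followed by soft-thresholding:

```python
# Fixed-point sketch: proximal gradient for the lasso min 0.5*||Ax-b||^2 + lam*||x||_1.
# The solution is a fixed point of x -> prox_{t*lam*||.||_1}(x - t*A^T(Ax - b)), and
# the prox of the l1 norm is soft-thresholding. Problem data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
m, n, lam = 30, 60, 0.1
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
t = 1.0 / np.linalg.norm(A, 2) ** 2          # step size 1/L with L = ||A||_2^2

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

x = np.zeros(n)
for _ in range(2000):
    x = soft_threshold(x - t * A.T @ (A @ x - b), t * lam)   # iterate the operator

# At a fixed point, applying the operator leaves x unchanged (up to tolerance).
fp_residual = np.linalg.norm(x - soft_threshold(x - t * A.T @ (A @ x - b), t * lam))
print("fixed-point residual:", fp_residual)
```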
A new projection method for finding the closest point in the intersection of convex sets
In this paper we present a new iterative projection method for finding the
closest point in the intersection of convex sets to any arbitrary point in a
Hilbert space. This method, termed AAMR for averaged alternating modified
reflections, can be viewed as an adequate modification of the Douglas-Rachford
method that yields a solution to the best approximation problem. Under a
constraint qualification at the point of interest, we show strong convergence
of the method. In fact, the so-called strong CHIP fully characterizes the
convergence of the AAMR method for every point in the space. We report some
promising numerical experiments where we compare the performance of AAMR
against other projection methods for finding the closest point in the
intersection of pairs of finite-dimensional subspaces.
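The sketch below illustrates an AAMR-style iteration for two subspaces. The operator form (1 - alpha)*I + alpha*(2*beta*P_B - I)(2*beta*P_A - I), applied to the sets translated by the query point, and the parameter values used are assumptions made for illustration; the paper should be consulted for the exact scheme and its admissible parameter ranges.

```python
# Sketch of an AAMR-style iteration for two subspaces (assumed operator form and
# parameters; the printed distance checks the result against the projection onto
# the intersection computed directly).
import numpy as np

rng = np.random.default_rng(3)
n = 50
W = rng.standard_normal((n, 3))              # shared directions -> nontrivial intersection
Q_U, _ = np.linalg.qr(np.hstack([W, rng.standard_normal((n, 5))]))
Q_V, _ = np.linalg.qr(np.hstack([W, rng.standard_normal((n, 5))]))
P_U, P_V = Q_U @ Q_U.T, Q_V @ Q_V.T

Q_W, _ = np.linalg.qr(W)
q = rng.standard_normal(n)
x_star = Q_W @ (Q_W.T @ q)                   # closest point to q in U ∩ V (= span(W) generically)

alpha, beta = 0.9, 0.7                       # assumed parameter choices
proj_A = lambda z: P_U @ (z + q) - q         # projection onto U - q
proj_B = lambda z: P_V @ (z + q) - q         # projection onto V - q

z = np.zeros(n)
for _ in range(5000):
    rA = 2 * beta * proj_A(z) - z            # modified reflection with respect to U - q
    rB = 2 * beta * proj_B(rA) - rA          # modified reflection with respect to V - q
    z = (1 - alpha) * z + alpha * rB         # averaging step

x_hat = q + proj_A(z)                        # shadow point, expected to approach x_star
print("distance to directly computed closest point:", np.linalg.norm(x_hat - x_star))
```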
Nonconvex notions of regularity and convergence of fundamental algorithms for feasibility problems
We consider projection algorithms for solving (nonconvex) feasibility
problems in Euclidean spaces. Of special interest are the Method of Alternating
Projections (MAP) and the Douglas-Rachford or Averaged Alternating Reflection
Algorithm (AAR). In the case of convex feasibility, firm nonexpansiveness of projection mappings is a global property that yields global convergence of MAP and, for consistent problems, of AAR. Based on the (ε, δ)-regularity of sets
developed by Bauschke, Luke, Phan and Wang in 2012, a relaxed local version of
firm nonexpansiveness with respect to the intersection is introduced for
consistent feasibility problems. Together with a coercivity condition that
relates to the regularity of the intersection, this yields local linear
convergence of MAP for a wide class of nonconvex problems.
Comment: 22 pages, no figures, 30 references
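As a toy example of such a nonconvex feasibility problem (not taken from the paper), the sketch below uses MAP to find a point in the intersection of the unit circle, a nonconvex set, and a line in the plane:

```python
# Toy nonconvex feasibility example (an illustrative assumption, not from the paper):
# find a point in the intersection of the unit circle and a line in R^2 with MAP.
import numpy as np

def proj_circle(x):
    """Projection onto the unit circle {x : ||x|| = 1} (well defined for x != 0)."""
    return x / np.linalg.norm(x)

p = np.array([0.0, 0.5])                     # a point on the line
d = np.array([1.0, 0.0])                     # unit direction of the line

def proj_line(x):
    """Orthogonal projection onto the line {p + t*d}."""
    return p + d * np.dot(d, x - p)

x = np.array([2.0, 2.0])                     # starting point
for _ in range(100):
    x = proj_circle(proj_line(x))            # one MAP sweep: line, then circle

print("x =", x)
print("on circle:", abs(np.linalg.norm(x) - 1.0) < 1e-8,
      " on line:", abs(x[1] - 0.5) < 1e-8)
```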