The Douglas-Rachford algorithm for two (not necessarily intersecting) affine subspaces
The Douglas--Rachford algorithm is a classical and very successful splitting
method for finding the zeros of the sums of monotone operators. When the
underlying operators are normal cone operators, the algorithm solves a convex
feasibility problem. In this paper, we provide a detailed study of the
Douglas--Rachford iterates and the corresponding shadow sequence when the
sets are affine subspaces that do not necessarily intersect. We prove strong
convergence of the shadows to the nearest generalized solution. Our results
extend recent work from the consistent to the inconsistent case. Various
examples are provided to illustrate the results.
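The divergence/convergence dichotomy described above can be sketched in a few lines of NumPy. The toy setup below (two skew lines in R^3; my own construction for illustration, not an example from the paper) shows the governing Douglas--Rachford sequence drifting off along the gap between the sets while the shadow sequence converges to the point of A nearest to B:

```python
import numpy as np

def proj_line(p, d):
    """Projector onto the affine line {p + t*d : t real}."""
    d = d / np.linalg.norm(d)
    return lambda x: p + np.dot(x - p, d) * d

# Two skew (non-intersecting) affine subspaces of R^3:
# A is the x-axis; B is parallel to the z-axis, shifted by 1 in y.
P_A = proj_line(np.zeros(3), np.array([1.0, 0.0, 0.0]))
P_B = proj_line(np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0]))

def T(x):
    """Douglas-Rachford operator T = Id - P_A + P_B(2 P_A - Id)."""
    return x - P_A(x) + P_B(2.0 * P_A(x) - x)

x = np.array([2.0, 3.0, 5.0])
for _ in range(200):
    x = T(x)

shadow = P_A(x)
print(np.linalg.norm(x))  # large: the governing sequence diverges
print(shadow)             # the shadow sits at the point of A nearest to B
```

Here the iterates escape to infinity in the direction of the gap vector, yet the shadows P_A(x_n) converge to the generalized solution (the origin, in this configuration).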
On the order of the operators in the Douglas-Rachford algorithm
The Douglas-Rachford algorithm is a popular method for finding zeros of sums
of monotone operators. By its definition, the Douglas-Rachford operator is not
symmetric with respect to the order of the two operators. In this paper we
provide a systematic study of the two possible Douglas-Rachford operators. We
show that the reflectors of the underlying operators act as bijections between
the fixed points sets of the two Douglas-Rachford operators. Some elegant
formulae arise under additional assumptions. Various examples illustrate our
results.
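The fixed-point bijection can be tested numerically. In the sketch below (two lines in the plane meeting at a point; a toy example of my own, writing T_{A,B} = Id - P_A + P_B R_A), a fixed point z of T_{A,B} is found by iteration, and the reflected point R_A(z) is checked to be a fixed point of T_{B,A}:

```python
import numpy as np

def proj_line(p, d):
    """Projector onto the affine line {p + t*d : t real}."""
    d = d / np.linalg.norm(d)
    return lambda x: p + np.dot(x - p, d) * d

# Two lines in R^2 intersecting at (1, 1).
P_A = proj_line(np.array([1.0, 1.0]), np.array([1.0, 0.0]))
P_B = proj_line(np.array([1.0, 1.0]), np.array([1.0, 2.0]))
R_A = lambda x: 2.0 * P_A(x) - x          # reflector across A
R_B = lambda x: 2.0 * P_B(x) - x          # reflector across B

T_AB = lambda x: x - P_A(x) + P_B(R_A(x))  # one order of the operators
T_BA = lambda x: x - P_B(x) + P_A(R_B(x))  # the other order

# T_AB is firmly nonexpansive with a fixed point, so iterates converge.
z = np.array([5.0, -3.0])
for _ in range(2000):
    z = T_AB(z)

w = R_A(z)  # the reflector carries Fix T_AB into Fix T_BA
print(np.linalg.norm(T_AB(z) - z))  # ~ 0
print(np.linalg.norm(T_BA(w) - w))  # ~ 0
```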
The magnitude of the minimal displacement vector for compositions and convex combinations of firmly nonexpansive mappings
Maximally monotone operators and firmly nonexpansive mappings play key roles
in modern optimization and nonlinear analysis. Five years ago, it was shown
that if finitely many firmly nonexpansive operators are all asymptotically
regular (i.e., they have or "almost have" fixed points), then the same is true
for compositions and convex combinations. In this paper, we derive bounds on
the magnitude of the minimal displacement vectors of compositions and of convex
combinations in terms of the displacement vectors of the underlying operators.
Our results completely generalize earlier works. Moreover, we present various
examples illustrating that our bounds are sharp.
On the Douglas-Rachford algorithm
The Douglas-Rachford algorithm is a very popular splitting technique for
finding a zero of the sum of two maximally monotone operators. However, the
behaviour of the algorithm remains mysterious in the general inconsistent case,
i.e., when the sum problem has no zeros. More than a decade ago, it was
shown that in the (possibly inconsistent) convex feasibility setting, the
shadow sequence remains bounded and its weak cluster points solve a best
approximation problem.
In this paper, we advance the understanding of the inconsistent case
significantly by providing a complete proof of the full weak convergence in the
convex feasibility setting. In fact, a more general sufficient condition for
the weak convergence in the general case is presented. Several examples
illustrate the results.
Nearly convex sets: fine properties and domains or ranges of subdifferentials of convex functions
Nearly convex sets play important roles in convex analysis, optimization and
theory of monotone operators. We give a systematic study of nearly convex sets,
and construct examples of subdifferentials of lower semicontinuous convex
functions whose domains or ranges are nonconvex.
Generalized monotone operators and their averaged resolvents
The correspondence between the monotonicity of a (possibly) set-valued
operator and the firm nonexpansiveness of its resolvent is a key ingredient in
the convergence analysis of many optimization algorithms. Firmly nonexpansive
operators form a proper subclass of the more general - but still pleasant from
an algorithmic perspective - class of averaged operators. In this paper, we
introduce the new notion of conically nonexpansive operators, which generalize
nonexpansive mappings. We characterize averaged operators as being resolvents
of comonotone operators under appropriate scaling. As a consequence, we
characterize the proximal point mappings associated with hypoconvex functions
as cocoercive operators or, equivalently, as displacement mappings of conically
nonexpansive operators. Several examples illustrate our analysis and
demonstrate the tightness of our results.
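The classical correspondence underlying this abstract is that a monotone operator A has a firmly nonexpansive resolvent J_A = (Id + A)^{-1}. A quick numerical sanity check of the firm-nonexpansiveness inequality, taking A to be the subdifferential of the absolute value so that J_A is soft thresholding (my choice of example, not the paper's), looks like:

```python
import numpy as np

def soft_threshold(x, t=1.0):
    """Resolvent of t * (subdifferential of |.|), i.e. the prox of t*|x|."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.normal(size=2) * 10.0
    Jx, Jy = soft_threshold(x), soft_threshold(y)
    # firm nonexpansiveness: |Jx - Jy|^2 <= (Jx - Jy)(x - y)
    assert (Jx - Jy) ** 2 <= (Jx - Jy) * (x - y) + 1e-12
print("firm nonexpansiveness verified on 1000 random pairs")
```

The paper's contribution is the generalized picture (comonotone operators, conically nonexpansive and averaged resolvents); the snippet above only illustrates the monotone base case.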
Affine nonexpansive operators, Attouch-Théra duality and the Douglas-Rachford algorithm
The Douglas-Rachford splitting algorithm was originally proposed in 1956 to
solve a system of linear equations arising from the discretization of a partial
differential equation. In 1979, Lions and Mercier brought forward a very
powerful extension of this method suitable to solve optimization problems.
In this paper, we revisit the original affine setting. We provide a powerful
convergence result for finding a zero of the sum of two maximally monotone
affine relations. As a by-product of our analysis, we obtain results concerning
the convergence of iterates of affine nonexpansive mappings as well as
Attouch-Théra duality. Numerous examples are presented.
Maximally monotone operators with ranges whose closures are not convex and an answer to a recent question by Stephen Simons
In his recent Proceedings of the AMS paper "Gossez's skew linear map and its
pathological maximally monotone multifunctions", Stephen Simons proved that the
closure of the range of the sum of the Gossez operator and a multiple of the
duality map is nonconvex whenever the scalar is between 0 and 4. The problem of
the convexity of that range when the scalar is equal to 4 was explicitly
stated. In this paper, we answer this question in the negative for any scalar
greater than or equal to 4. We derive this result from an abstract framework
that allows us to also obtain a corresponding result for the Fitzpatrick-Phelps
integral operator.
Douglas-Rachford splitting for a Lipschitz continuous and a strongly monotone operator
The Douglas-Rachford method is a popular splitting technique for finding a
zero of the sum of two subdifferential operators of proper closed convex
functions; more generally two maximally monotone operators. Recent results
concerned with linear rates of convergence of the method require additional
properties of the underlying monotone operators, such as strong monotonicity
and cocoercivity. In this paper, we study the case when one operator is
Lipschitz continuous but not necessarily a subdifferential operator and the
other operator is strongly monotone. This situation arises in optimization
methods which involve primal-dual approaches. We provide new linear convergence
results in this setting.
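The setting can be illustrated with a small linear example (my own construction, not taken from the paper): let A = mu*Id, which is strongly monotone, and let B be a skew matrix, which is monotone and Lipschitz but, being non-symmetric, is not a subdifferential. The Douglas-Rachford shadow errors then shrink by an essentially constant factor per step, consistent with linear convergence:

```python
import numpy as np

mu = 0.5                      # strong monotonicity constant
A = mu * np.eye(2)            # strongly monotone part
B = np.array([[0.0, -1.0],    # skew part: monotone and Lipschitz,
              [1.0,  0.0]])   # but not a subdifferential
# The unique zero of A + B is x* = 0.

I = np.eye(2)
J_A = np.linalg.inv(I + A)    # resolvent (Id + A)^{-1}
J_B = np.linalg.inv(I + B)

def T(z):
    """Douglas-Rachford operator for the ordered pair (A, B)."""
    y = J_A @ z
    return z - y + J_B @ (2.0 * y - z)

z = np.array([4.0, -7.0])
errs = []
for _ in range(60):
    z = T(z)
    errs.append(np.linalg.norm(J_A @ z))   # shadow error against x* = 0

rates = [errs[k + 1] / errs[k] for k in range(40, 59)]
print(min(rates), max(rates))  # nearly constant ratio: linear convergence
```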
A note on the equivalence of operator splitting methods
This paper provides a comprehensive discussion of the equivalences between
splitting methods. These equivalences have been studied over the past few
decades and, in fact, have proven to be very useful. In this paper, we survey
known results and also present new ones. In particular, we provide simplified
proofs of the equivalence of the ADMM and the Douglas-Rachford method and the
equivalence of the ADMM with intermediate update of multipliers and the
Peaceman-Rachford method. Other splitting methods are also considered.
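As a sanity check rather than a proof of the equivalence, the snippet below runs plain Douglas-Rachford (on proximal mappings) and scaled-form ADMM on the same toy problem, min |x| + (1/2)(x - 3)^2, whose minimizer is x* = 2; both reach it. The exact iterate-level correspondence between the two methods is what the equivalence results make precise.

```python
def prox_f(v, t=1.0):
    """Prox of f(x) = |x|: soft thresholding."""
    return max(abs(v) - t, 0.0) * (1.0 if v > 0 else -1.0)

def prox_g(v):
    """Prox of g(x) = 0.5*(x - 3)^2."""
    return (v + 3.0) / 2.0

# Douglas-Rachford on min |x| + 0.5*(x - 3)^2  (minimizer x* = 2).
s = 0.0
for _ in range(200):
    x_dr = prox_f(s)
    s = s - x_dr + prox_g(2.0 * x_dr - s)

# ADMM (scaled form) on the same split: min f(x) + g(z) s.t. x = z.
x = z = u = 0.0
for _ in range(200):
    x = prox_f(z - u)
    z = prox_g(x + u)
    u += x - z

print(x_dr, z)  # both approach the minimizer 2.0
```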