
    The Douglas-Rachford algorithm for two (not necessarily intersecting) affine subspaces

    The Douglas-Rachford algorithm is a classical and very successful splitting method for finding zeros of sums of monotone operators. When the underlying operators are normal cone operators, the algorithm solves a convex feasibility problem. In this paper, we provide a detailed study of the Douglas-Rachford iterates and the corresponding shadow sequence when the sets are affine subspaces that do not necessarily intersect. We prove strong convergence of the shadows to the nearest generalized solution. Our results extend recent work from the consistent to the inconsistent case. Various examples are provided to illustrate the results.
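
    As a rough illustration of the setting (our sketch, not the paper's code), the snippet below runs the Douglas-Rachford iteration on two skew lines in R^3, a pair of affine subspaces that do not intersect; the shadow sequence P_A(z_n) converges to the point of A nearest to B, consistent with the nearest-generalized-solution statement above.

```python
import numpy as np

def douglas_rachford(z0, proj_A, proj_B, iters=100):
    """Douglas-Rachford iteration z+ = z - P_A(z) + P_B(2 P_A(z) - z).
    Returns the final governing iterate and the shadow sequence P_A(z_n)."""
    z = np.asarray(z0, dtype=float)
    shadows = []
    for _ in range(iters):
        x = proj_A(z)                 # shadow iterate P_A(z_n)
        shadows.append(x)
        z = z - x + proj_B(2 * x - z)
    return z, shadows

# Two skew (hence nonintersecting) lines in R^3:
# A = {(t, 0, 0) : t real} and B = {(0, s, 1) : s real}.
proj_A = lambda z: np.array([z[0], 0.0, 0.0])
proj_B = lambda z: np.array([0.0, z[1], 1.0])

z, shadows = douglas_rachford([3.0, 2.0, 5.0], proj_A, proj_B)
print(shadows[-1])  # -> [0. 0. 0.], the point of A nearest to B
```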

    On the order of the operators in the Douglas-Rachford algorithm

    The Douglas-Rachford algorithm is a popular method for finding zeros of sums of monotone operators. By its definition, the Douglas-Rachford operator is not symmetric with respect to the order of the two operators. In this paper we provide a systematic study of the two possible Douglas-Rachford operators. We show that the reflectors of the underlying operators act as bijections between the fixed point sets of the two Douglas-Rachford operators. Some elegant formulae arise under additional assumptions. Various examples illustrate our results.
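
    In standard notation, with resolvents J_A, J_B and reflectors R_A = 2J_A - Id and R_B = 2J_B - Id, the two possible Douglas-Rachford operators read as follows (our paraphrase of the usual definitions):

```latex
T_{(A,B)} = \tfrac{1}{2}\bigl(\operatorname{Id} + R_B R_A\bigr),
\qquad
T_{(B,A)} = \tfrac{1}{2}\bigl(\operatorname{Id} + R_A R_B\bigr),
\qquad
R_A := 2 J_A - \operatorname{Id}.
```

    The bijection claim can be checked directly: if R_B R_A z = z, then R_A R_B (R_A z) = R_A z, so R_A maps Fix T_{(A,B)} into Fix T_{(B,A)}, and R_B maps back.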

    Solution Refinement at Regular Points of Conic Problems

    Most numerical methods for conic problems use the homogeneous primal-dual embedding, which yields a primal-dual solution or a certificate establishing primal or dual infeasibility. Following Patrinos (and others, 2018), we express the embedding as the problem of finding a zero of a mapping containing a skew-symmetric linear function and projections onto cones and their duals. We focus on the special case when this mapping is regular, i.e., differentiable with nonsingular derivative matrix, at a solution point. While this is not always the case, it is a very common occurrence in practice. We propose a simple method that uses LSQR, a variant of conjugate gradients for least squares problems, and the derivative of the residual mapping to refine an approximate solution, i.e., to increase its accuracy. LSQR is a matrix-free method, i.e., it requires only the evaluation of the derivative mapping and its adjoint, and so avoids forming or storing large matrices; this makes it efficient even for cone problems in which the data matrices are given and dense, and also allows the method to extend to cone programs in which the data are given as abstract linear operators. Numerical examples show that the method almost always improves an approximate solution of a conic program, often dramatically, at a computational cost that is typically small compared to the cost of obtaining the original approximate solution. For completeness, we describe methods for computing the derivative of the projection onto the cones commonly used in practice: nonnegative, second-order, semidefinite, and exponential cones. The paper is accompanied by an open source implementation.
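
    A matrix-free refinement step of the kind described can be sketched as follows. This is a minimal sketch, and `residual`, `deriv_matvec`, and `deriv_rmatvec` are hypothetical callables standing in for the residual mapping, its derivative, and the adjoint of the derivative; it is not the paper's implementation.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

def refine(z, residual, deriv_matvec, deriv_rmatvec, steps=2):
    """Newton-type refinement of an approximate zero z of a residual map R:
    each step solves the least-squares problem min_d ||DR(z) d + R(z)||
    with matrix-free LSQR and updates z <- z + d."""
    z = np.asarray(z, dtype=float)
    for _ in range(steps):
        r = residual(z)
        DR = LinearOperator(
            (r.size, z.size),
            matvec=lambda v: deriv_matvec(z, v),    # v -> DR(z) v
            rmatvec=lambda w: deriv_rmatvec(z, w),  # w -> DR(z)^T w
        )
        d = lsqr(DR, -r)[0]  # least-squares Newton correction
        z = z + d
    return z
```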

    The magnitude of the minimal displacement vector for compositions and convex combinations of firmly nonexpansive mappings

    Maximally monotone operators and firmly nonexpansive mappings play key roles in modern optimization and nonlinear analysis. Five years ago, it was shown that if finitely many firmly nonexpansive operators are all asymptotically regular (i.e., they have or "almost have" fixed points), then the same is true for their compositions and convex combinations. In this paper, we derive bounds on the magnitude of the minimal displacement vectors of compositions and of convex combinations in terms of the displacement vectors of the underlying operators. Our results completely generalize earlier works. Moreover, we present various examples illustrating that our bounds are sharp.
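
    For a nonexpansive operator T, the minimal displacement vector is the element of minimal norm in the closure of the range of Id - T, and asymptotic regularity is precisely the statement that this vector vanishes. In symbols, with the bounds paraphrased loosely from the abstract rather than quoted (see the paper for the sharp statements):

```latex
v_T := P_{\overline{\operatorname{ran}}(\operatorname{Id}-T)}(0),
\qquad
\|v_{T_m \cdots T_1}\| \le \sum_{i=1}^{m} \|v_{T_i}\|,
\qquad
\Bigl\| v_{\sum_i \lambda_i T_i} \Bigr\| \le \sum_i \lambda_i \|v_{T_i}\|.
```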

    On the Douglas-Rachford algorithm

    The Douglas-Rachford algorithm is a very popular splitting technique for finding a zero of the sum of two maximally monotone operators. However, the behaviour of the algorithm remains mysterious in the general inconsistent case, i.e., when the sum problem has no zeros. More than a decade ago, it was shown that in the (possibly inconsistent) convex feasibility setting, the shadow sequence remains bounded and its weak cluster points solve a best approximation problem. In this paper, we advance the understanding of the inconsistent case significantly by providing a complete proof of the full weak convergence in the convex feasibility setting. In fact, a more general sufficient condition for the weak convergence in the general case is presented. Several examples illustrate the results.
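
    Concretely, in the convex feasibility setting with closed convex sets A and B and projectors P_A, P_B, the governing and shadow sequences are (our paraphrase of the standard setup):

```latex
z_{n+1} = T z_n,
\qquad
T := \operatorname{Id} - P_A + P_B\bigl(2 P_A - \operatorname{Id}\bigr),
\qquad
x_n := P_A z_n \quad \text{(shadow sequence)}.
```

    The weak limit of the shadows then solves the best approximation problem: assuming the gap vector g := P_{\overline{B-A}}(0) is attained, the limit x satisfies x in A and x + g in B.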

    Affine nonexpansive operators, Attouch-Théra duality and the Douglas-Rachford algorithm

    The Douglas-Rachford splitting algorithm was originally proposed in 1956 to solve a system of linear equations arising from the discretization of a partial differential equation. In 1979, Lions and Mercier brought forward a very powerful extension of this method suitable for solving optimization problems. In this paper, we revisit the original affine setting. We provide a powerful convergence result for finding a zero of the sum of two maximally monotone affine relations. As a by-product of our analysis, we obtain results concerning the convergence of iterates of affine nonexpansive mappings as well as Attouch-Théra duality. Numerous examples are presented.
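
    For context, Attouch-Théra duality pairs the primal inclusion with a dual inclusion in the inverse operators; with the usual sign convention (our paraphrase, stated here as an assumption on notation):

```latex
\text{(primal)}\quad 0 \in A x + B x,
\qquad\qquad
\text{(dual)}\quad 0 \in A^{-1} u - B^{-1}(-u).
```

    In the affine setting of the paper, the graphs of A and B are affine subspaces, so both inclusions reduce to linear-algebraic systems.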

    Generalized monotone operators and their averaged resolvents

    The correspondence between the monotonicity of a (possibly) set-valued operator and the firm nonexpansiveness of its resolvent is a key ingredient in the convergence analysis of many optimization algorithms. Firmly nonexpansive operators form a proper subclass of the more general, but still pleasant from an algorithmic perspective, class of averaged operators. In this paper, we introduce the new notion of conically nonexpansive operators, which generalize nonexpansive mappings. We characterize averaged operators as being resolvents of comonotone operators under appropriate scaling. As a consequence, we characterize the proximal point mappings associated with hypoconvex functions as cocoercive operators or, equivalently, as displacement mappings of conically nonexpansive operators. Several examples illustrate our analysis and demonstrate the tightness of our results.
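
    The objects in play are, in standard notation (definitions only, not the paper's new results):

```latex
J_A := (\operatorname{Id} + A)^{-1}
\quad\text{(resolvent)},
\\
T = (1-\alpha)\operatorname{Id} + \alpha N,\ N\ \text{nonexpansive}
\quad(\alpha\text{-averaged}),
\\
\langle x - y,\, u - v\rangle \ge \rho\,\|u - v\|^{2}
\ \ \text{for all}\ (x,u),(y,v) \in \operatorname{gra} A
\quad(\rho\text{-comonotone}),
```

    where negative values of rho are allowed; this is the "appropriate scaling" regime connecting averaged resolvents to generalized monotone operators.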

    Nearly convex sets: fine properties and domains or ranges of subdifferentials of convex functions

    Nearly convex sets play important roles in convex analysis, optimization, and the theory of monotone operators. We give a systematic study of nearly convex sets and construct examples of subdifferentials of lower semicontinuous convex functions whose domains or ranges are nonconvex.
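
    For reference, the standard definition used in this line of work: a set is nearly convex when it is sandwiched between a convex set and its closure,

```latex
S \ \text{is nearly convex}
\quad :\Longleftrightarrow \quad
\exists\ C \ \text{convex such that}\ C \subseteq S \subseteq \overline{C}.
```

    In finite dimensions, for a proper lower semicontinuous convex function f one has ri(dom f) contained in dom(subdifferential of f) contained in dom f, so the domain of the subdifferential is nearly convex; the examples above show that convexity itself can nonetheless fail.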

    Maximally monotone operators with ranges whose closures are not convex and an answer to a recent question by Stephen Simons

    In his recent Proceedings of the AMS paper "Gossez's skew linear map and its pathological maximally monotone multifunctions", Stephen Simons proved that the closure of the range of the sum of the Gossez operator and a multiple of the duality map is nonconvex whenever the scalar is between 0 and 4. The problem of the convexity of that range when the scalar is equal to 4 was explicitly stated. In this paper, we answer this question in the negative for any scalar greater than or equal to 4. We derive this result from an abstract framework that allows us to also obtain a corresponding result for the Fitzpatrick-Phelps integral operator.

    On a result of Pazy concerning the asymptotic behaviour of nonexpansive mappings

    In 1971, Pazy presented a beautiful trichotomy result concerning the asymptotic behaviour of the iterates of a nonexpansive mapping. In this note, we analyze the fixed-point free case in more detail. Our results and examples give credence to the conjecture that the iterates always converge cosmically.
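
    Pazy's result, in our paraphrase and in standard notation: for a nonexpansive mapping T on a Hilbert space, the scaled iterates converge strongly to the negative of the minimal displacement vector,

```latex
\frac{T^{n} x}{n} \ \longrightarrow\ -v,
\qquad
v := P_{\overline{\operatorname{ran}}(\operatorname{Id} - T)}(0),
```

    for every starting point x. Cosmic convergence of the iterates refers to convergence of the normalized directions T^n x / ||T^n x||, which is the behaviour conjectured above for the fixed-point free case.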