
    Characterizations of Super-regularity and its Variants

    Convergence of projection-based methods for nonconvex set feasibility problems has been established for sets with ever weaker regularity assumptions. What has not kept pace with these developments are analogous convergence results for optimization problems under correspondingly weak assumptions on the value functions. Indeed, one of the earliest classes of nonconvex sets for which convergence results were obtainable, the class of so-called super-regular sets introduced by Lewis, Luke and Malick (2009), has no functional counterpart. In this work, we close this gap in the theory by establishing the equivalence between a property slightly stronger than super-regularity, which we call Clarke super-regularity, and subsmoothness of sets as introduced by Aussel, Daniilidis and Thibault (2004). The bridge to functions shows that the approximately convex functions studied by Ngai, Luc and Théra (2000) are exactly those with Clarke super-regular epigraphs. Further classes of regularity of functions, defined through the corresponding regularity of their epigraphs, are also discussed. Comment: 15 pages, 2 figures.
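
    For orientation, a sketch of the defining inequality of super-regularity as introduced by Lewis, Luke and Malick (2009), reconstructed from memory and therefore only indicative; Clarke super-regularity is understood here as the same inequality with the Clarke normal cone in place of the limiting one.

        \text{A closed set } C \subseteq \mathbb{R}^n \text{ is super-regular at } \bar{x} \in C
        \text{ if for every } \varepsilon > 0 \text{ there exists } \delta > 0 \text{ such that}
        \langle v,\, z - x \rangle \le \varepsilon \, \|v\| \, \|z - x\|
        \quad \text{for all } x, z \in C \cap B_\delta(\bar{x}) \text{ and all } v \in N_C(x).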

    A Multi-step Inertial Forward--Backward Splitting Method for Non-convex Optimization

    In this paper, we propose a multi-step inertial Forward-Backward splitting algorithm for minimizing the sum of two not necessarily convex functions, one of which is proper lower semi-continuous while the other is differentiable with a Lipschitz continuous gradient. We first prove global convergence of the scheme with the help of the Kurdyka-Łojasiewicz property. Then, when the non-smooth part is also partly smooth relative to a smooth submanifold, we establish finite identification of the latter and provide a sharp local linear convergence analysis. The proposed method is illustrated on a few problems arising from statistics and machine learning. Comment: This paper is a companion to our recent work on Forward-Backward-type splitting methods, http://arxiv.org/abs/1503.0370
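
    To make the forward-backward structure concrete, the following is a minimal sketch of a single-inertial-step iteration for the model min_x 0.5*||Ax - b||^2 + lam*||x||_1, where the l1 term plays the role of the non-smooth function and the least-squares term the smooth one. The function names, the fixed inertial coefficient a_k, and the step-size rule are illustrative assumptions; the paper's multi-step scheme allows several past iterates and different parameter choices.

        import numpy as np

        def soft_threshold(v, tau):
            # proximal map of tau * ||.||_1
            return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

        def inertial_forward_backward(A, b, lam, n_iter=500, a_k=0.3):
            # Illustrative one-step inertial forward-backward iteration for
            # min 0.5*||Ax - b||^2 + lam*||x||_1 (not the paper's exact multi-step scheme).
            L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
            gamma = 1.0 / L                      # forward (gradient) step size
            x_prev = x = np.zeros(A.shape[1])
            for _ in range(n_iter):
                y = x + a_k * (x - x_prev)       # inertial extrapolation
                grad = A.T @ (A @ y - b)         # gradient of the smooth part at y
                x_prev, x = x, soft_threshold(y - gamma * grad, gamma * lam)
            return x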

    Alternating Projections and Douglas-Rachford for Sparse Affine Feasibility

    The problem of finding a vector with the fewest nonzero elements that satisfies an underdetermined system of linear equations is an NP-complete problem that is typically solved numerically via convex heuristics or nicely behaved nonconvex relaxations. In this work we consider elementary methods based on projections for solving a sparse feasibility problem without employing convex heuristics. In a recent paper, Bauschke, Luke, Phan and Wang (2014) showed that, locally, the fundamental method of alternating projections must converge linearly to a solution of the sparse feasibility problem with an affine constraint. In this paper we apply different analytical tools that allow us to show global linear convergence of alternating projections under familiar constraint qualifications. These analytical tools can also be applied to other algorithms. This is demonstrated with the prominent Douglas-Rachford algorithm, for which we establish local linear convergence when applied to the sparse affine feasibility problem. Comment: 29 pages, 2 figures, 37 references. Much expanded version of the previous submission; title changed to reflect new developments.
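
    As an illustration of the elementary projection methods discussed here, the following is a minimal sketch of alternating projections for the sparse affine feasibility problem, assuming a hard-thresholding projector onto the s-sparse vectors and a pseudoinverse-based projector onto the affine set {x : Mx = p}; it sketches only the generic iteration, not the paper's analysis or constraint qualifications.

        import numpy as np

        def project_sparse(x, s):
            # a projection onto {x : ||x||_0 <= s}: keep the s largest-magnitude entries
            y = np.zeros_like(x)
            idx = np.argsort(np.abs(x))[-s:]
            y[idx] = x[idx]
            return y

        def project_affine(x, M, p, M_pinv):
            # orthogonal projection onto {x : Mx = p}
            return x - M_pinv @ (M @ x - p)

        def alternating_projections(M, p, s, n_iter=200):
            M_pinv = np.linalg.pinv(M)
            x = M_pinv @ p                       # start on the affine set
            for _ in range(n_iter):
                x = project_affine(project_sparse(x, s), M, p, M_pinv)
            return x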

    Prox-regularity of rank constraint sets and implications for algorithms

    We present an analysis of sets of matrices with rank less than or equal to a specified number s. We provide a simple formula for the normal cone to such sets, and use this to show that these sets are prox-regular at all points with rank exactly equal to s. The normal cone formula appears to be new. This allows for easy application of prior results guaranteeing local linear convergence of the fundamental alternating projection algorithm between sets, one of which is a rank constraint set. We apply this to show local linear convergence of another fundamental algorithm, approximate steepest descent. Our results apply not only to linear systems with rank constraints, as has been treated extensively in the literature, but also to nonconvex systems with rank constraints. Comment: 12 pages, 24 references. Revised manuscript to appear in the Journal of Mathematical Imaging and Vision.
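
    For illustration, a minimal sketch of the metric projection onto the set of matrices of rank at most s, computed via a truncated singular value decomposition (the Eckart-Young construction); alternating projections for a rank constrained affine system would alternate this map with an affine projector such as the one sketched above.

        import numpy as np

        def project_rank(X, s):
            # a nearest matrix of rank at most s in the Frobenius norm (truncated SVD)
            U, sigma, Vt = np.linalg.svd(X, full_matrices=False)
            sigma[s:] = 0.0                      # drop all but the s largest singular values
            return (U * sigma) @ Vt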

    A Multi-step Inertial Forward-Backward Splitting Method for Non-convex Optimization

    We propose a multi-step inertial Forward-Backward splitting algorithm for minimizing the sum of two not necessarily convex functions, one of which is proper lower semi-continuous while the other is differentiable with a Lipschitz continuous gradient. We first prove global convergence of the algorithm with the help of the Kurdyka-Łojasiewicz property. Then, when the non-smooth part is also partly smooth relative to a smooth submanifold, we establish finite identification of the latter and provide a sharp local linear convergence analysis. The proposed method is illustrated on several problems arising from statistics and machine learning.

    Projection Methods in Sparse and Low Rank Feasibility

    In this thesis, we give an analysis of fixed point algorithms involving projections onto closed, not necessarily convex, subsets of finite-dimensional vector spaces. These methods are used in applications such as imaging science, signal processing, and inverse problems. The tools used in the analysis place this work at the intersection of optimization and variational analysis. Based on the underlying optimization problems, the work is divided into two main parts.

    The first part concerns the compressed sensing problem. Because the problem is NP-hard, we relax it to a feasibility problem with two sets, namely the set of vectors with at most s nonzero entries and, for a given linear mapping M and a given vector p, the affine subspace B of vectors x satisfying Mx = p. This is referred to as the sparse affine feasibility problem. For the Douglas-Rachford algorithm, we prove linear convergence to a fixed point in the case of a feasibility problem of two affine subspaces, which allows us to conclude local linear convergence of the Douglas-Rachford algorithm for the sparse affine feasibility problem. We then give sufficient conditions for the alternating projections algorithm to converge to the intersection of an affine subspace with lower level sets of point-symmetric, lower semicontinuous, subadditive functions. This implies convergence of alternating projections to a solution of the sparse affine feasibility problem. Together with a local linear convergence result for alternating projections, this allows us to deduce linear convergence after finitely many steps of the sequence generated by alternating projections from any initial point.

    The second part of the dissertation deals with minimizing the rank of matrices satisfying a set of linear equations, the rank constrained affine feasibility problem. The motivation for analyzing the rank minimization problem comes from the physical application of phase retrieval and its reformulation as a rank minimization problem. We show that, locally, the method of alternating projections must converge at a linear rate to a solution of the rank constrained affine feasibility problem.
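
    As a companion to the two feasibility problems described above, the following is a minimal sketch of the Douglas-Rachford iteration for a generic two-set feasibility problem, written in terms of arbitrary projectors P_A and P_B supplied by the caller (for instance the sparsity, affine, or rank projectors sketched earlier); it shows only the standard fixed-point form, not the convergence analysis of the thesis.

        import numpy as np

        def douglas_rachford(P_A, P_B, x0, n_iter=500):
            # Douglas-Rachford iteration x_{k+1} = x_k + P_B(2*P_A(x_k) - x_k) - P_A(x_k);
            # the solution estimate is the shadow P_A(x_k), not the iterate itself.
            x = np.asarray(x0, dtype=float)
            for _ in range(n_iter):
                pa = P_A(x)
                x = x + P_B(2.0 * pa - x) - pa
            return P_A(x)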