
    Union Averaged Operators with Applications to Proximal Algorithms for Min-Convex Functions

    In this paper we introduce and study a class of structured set-valued operators which we call union averaged nonexpansive. At each point in their domain, the value of such an operator can be expressed as a finite union of single-valued averaged nonexpansive operators. We investigate various structural properties of the class and show, in particular, that it is closed under taking unions, convex combinations, and compositions, and that fixed point iterations of such operators are locally convergent around strong fixed points. We then systematically apply our results to analyze proximal algorithms in situations where union averaged nonexpansive operators naturally arise. In particular, we consider the problem of minimizing the sum of two functions, where the first is convex and the second can be expressed as the minimum of finitely many convex functions.
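    The abstract stays at a high level; the Python sketch below illustrates one way such a union of proximal maps can arise, assuming the concrete problem of minimizing f(x) + g(x) with g the pointwise minimum of finitely many convex functions. The function names, the selection rule, and the toy instance are illustrative assumptions, not the paper's algorithm or analysis.

        import numpy as np

        # Illustrative sketch, not the paper's method: a forward-backward iteration
        # for minimizing f(x) + g(x) where g(x) = min_j g_j(x) is the pointwise
        # minimum of finitely many convex functions.  The proximal map of g is
        # set-valued and its values lie in the union of the proximal maps of the
        # g_j; each backward step below selects one element of that union.

        def union_prox(x, gamma, g_list, prox_list):
            """Return one element of prox_{gamma*g}(x) for g = min_j g_j."""
            candidates = [prox(x, gamma) for prox in prox_list]
            values = [gj(p) + np.sum((p - x) ** 2) / (2 * gamma)
                      for gj, p in zip(g_list, candidates)]
            return candidates[int(np.argmin(values))]

        def forward_backward_min_convex(x0, grad_f, g_list, prox_list,
                                        gamma=0.5, iters=200):
            x = np.asarray(x0, dtype=float)
            for _ in range(iters):
                x = union_prox(x - gamma * grad_f(x), gamma, g_list, prox_list)
            return x

        # Toy instance: f(x) = 0.5*||x - b||^2 and g(x) = min(|x - 1|, |x + 1|).
        b = np.array([0.3])
        grad_f = lambda x: x - b
        soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
        g_list = [lambda x: np.abs(x - 1.0).sum(), lambda x: np.abs(x + 1.0).sum()]
        prox_list = [lambda x, gam: 1.0 + soft(x - 1.0, gam),
                     lambda x, gam: -1.0 + soft(x + 1.0, gam)]
        print(forward_backward_min_convex(np.zeros(1), grad_f, g_list, prox_list))

    The selection rule keeps the candidate with the smallest Moreau objective, which yields one (not necessarily unique) element of the set-valued proximal map of g.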

    Generalized Forward-Backward Splitting

    This paper introduces the generalized forward-backward splitting algorithm for minimizing convex functions of the form $F + \sum_{i=1}^n G_i$, where $F$ has a Lipschitz-continuous gradient and the $G_i$'s are simple in the sense that their Moreau proximity operators are easy to compute. While the forward-backward algorithm cannot deal with more than $n = 1$ non-smooth function, our method generalizes it to the case of arbitrary $n$. Our method makes explicit use of the regularity of $F$ in the forward step, and the proximity operators of the $G_i$'s are applied in parallel in the backward step. This allows the generalized forward-backward to efficiently address an important class of convex problems. We prove its convergence in infinite dimension, and its robustness to errors in the computation of the proximity operators and of the gradient of $F$. Examples on inverse problems in imaging demonstrate the advantage of the proposed methods in comparison to other splitting algorithms. Comment: 24 pages, 4 figures.
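    As a rough illustration of the structure described above (a single forward step on $F$, parallel backward steps on the $G_i$'s), here is a minimal Python sketch of a generalized forward-backward iteration with uniform weights and no relaxation. The step size, the toy problem, and all names are assumptions made for the example; consult the paper for the exact update and its convergence conditions.

        import numpy as np

        # Minimal sketch of a generalized forward-backward iteration for
        # min_x F(x) + sum_i G_i(x), assuming F has a Lipschitz gradient and each
        # prox_{t*G_i} is cheap.  Uniform weights 1/n are used, so the prox of G_i
        # is evaluated with parameter n*gamma.

        def gfb(x0, grad_F, prox_list, gamma, iters=500):
            n = len(prox_list)
            x = np.asarray(x0, dtype=float)
            z = [x.copy() for _ in range(n)]          # one auxiliary variable per G_i
            for _ in range(iters):
                g = grad_F(x)                         # forward (explicit) step on F
                for i, prox in enumerate(prox_list):  # backward steps, parallelizable
                    z[i] = z[i] + prox(2 * x - z[i] - gamma * g, n * gamma) - x
                x = sum(z) / n                        # average the auxiliary variables
            return x

        # Toy instance: F(x) = 0.5*||Ax - b||^2, G_1 = ||.||_1, G_2 = indicator of x >= 0.
        rng = np.random.default_rng(0)
        A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
        grad_F = lambda x: A.T @ (A @ x - b)
        soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
        prox_list = [soft,                            # prox of t*||.||_1
                     lambda v, t: np.maximum(v, 0.0)] # projection onto x >= 0
        print(gfb(np.zeros(5), grad_F, prox_list, gamma=0.01))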

    Forward-backward truncated Newton methods for convex composite optimization

    This paper proposes two proximal Newton-CG methods for convex nonsmooth optimization problems in composite form. The algorithms are based on a reformulation of the original nonsmooth problem as the unconstrained minimization of a continuously differentiable function, namely the forward-backward envelope (FBE). The first algorithm is based on a standard line search strategy, whereas the second retains the global efficiency estimates of the corresponding first-order methods while achieving fast asymptotic convergence rates. Furthermore, they are computationally attractive, since each Newton iteration requires the approximate solution of a linear system of usually small dimension.
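    To make the reformulation concrete, the sketch below evaluates the forward-backward envelope for a toy $\ell_1$-regularized least-squares problem, using one standard form of the envelope, $\mathrm{FBE}_\gamma(x) = f(x) - \tfrac{\gamma}{2}\|\nabla f(x)\|^2 + g^\gamma(x - \gamma \nabla f(x))$, where $g^\gamma$ is the Moreau envelope of $g$. The problem data and step size $\gamma$ are illustrative assumptions; the Newton-CG machinery of the paper is not reproduced here.

        import numpy as np

        # Sketch of the forward-backward envelope (FBE) for min_x f(x) + g(x),
        # with the illustrative choice f(x) = 0.5*||Ax - b||^2 and g = lam*||.||_1.
        # For gamma in (0, 1/L), with L the Lipschitz constant of grad f, the FBE
        # is real-valued, continuously differentiable, and never exceeds f + g.

        rng = np.random.default_rng(1)
        A, b, lam = rng.standard_normal((30, 8)), rng.standard_normal(30), 0.5
        f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
        grad_f = lambda x: A.T @ (A @ x - b)
        g = lambda x: lam * np.sum(np.abs(x))
        prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)

        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad_f
        gamma = 0.9 / L

        def fbe(x):
            """FBE_gamma(x) = f(x) - gamma/2*||grad_f(x)||^2 + g^gamma(x - gamma*grad_f(x))."""
            u = x - gamma * grad_f(x)          # forward (gradient) step
            z = prox_g(u, gamma)               # backward (proximal) step
            moreau = g(z) + np.sum((z - u) ** 2) / (2 * gamma)  # Moreau envelope of g at u
            return f(x) - 0.5 * gamma * np.sum(grad_f(x) ** 2) + moreau

        x = rng.standard_normal(8)
        print("f(x) + g(x):", f(x) + g(x))
        print("FBE(x)     :", fbe(x))          # smooth surrogate, <= f(x) + g(x)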