
    A Multi-step Inertial Forward-Backward Splitting Method for Non-convex Optimization

    In this paper, we propose a multi-step inertial Forward-Backward splitting algorithm for minimizing the sum of two not necessarily convex functions, one of which is proper and lower semi-continuous while the other is differentiable with a Lipschitz continuous gradient. We first prove global convergence of the scheme with the help of the Kurdyka-Łojasiewicz property. Then, when the non-smooth part is also partly smooth relative to a smooth submanifold, we establish finite identification of the latter and provide a sharp local linear convergence analysis. The proposed method is illustrated on a few problems arising from statistics and machine learning.
    Comment: This paper is a companion to our recent work on Forward-Backward-type splitting methods, http://arxiv.org/abs/1503.0370
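
    To make the iteration concrete, here is a minimal Python sketch of a multi-step inertial forward-backward step, applied to a LASSO-type instance (smooth part g(x) = ½‖Ax − b‖², nonsmooth part f = λ‖x‖₁). The inertial coefficients, the step size, and the use of a single extrapolation point for both the gradient and the proximal step are illustrative simplifications, not the paper's exact rules; `multi_step_inertial_fb` and `soft_threshold` are hypothetical names.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def multi_step_inertial_fb(grad_g, prox_f, x0, step, inertia=(0.3, 0.1), n_iter=500):
    """Sketch of a multi-step inertial forward-backward iteration.

    Minimizes f(x) + g(x): `grad_g` is the gradient of the smooth part,
    `prox_f` the proximal map of the nonsmooth part.  The extrapolation
    point combines the last few differences x_k - x_{k-1}, weighted by
    the `inertia` coefficients (two past steps here, chosen arbitrarily).
    """
    s = len(inertia)
    history = [x0.copy() for _ in range(s + 1)]  # x_k, x_{k-1}, ..., oldest last
    for _ in range(n_iter):
        # inertial extrapolation using the last s differences
        y = history[0] + sum(a * (history[i] - history[i + 1])
                             for i, a in enumerate(inertia))
        # forward (gradient) step on g, backward (proximal) step on f
        x_new = prox_f(y - step * grad_g(y), step)
        history = [x_new] + history[:-1]
    return history[0]

# Illustrative LASSO instance: g(x) = 0.5 * ||Ax - b||^2, f = lam * ||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100)); b = rng.standard_normal(50); lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of grad g
x = multi_step_inertial_fb(lambda x: A.T @ (A @ x - b),
                           lambda v, t: soft_threshold(v, lam * t),
                           np.zeros(100), step)
```

    With zero inertial coefficients this reduces to plain forward-backward splitting; the paper's analysis concerns the admissible ranges of such coefficients in the non-convex setting.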

    On the convergence of a linesearch based proximal-gradient method for nonconvex optimization

    We consider a variable metric linesearch based proximal gradient method for the minimization of the sum of a smooth, possibly nonconvex function and a convex, possibly nonsmooth term. We prove convergence of this iterative algorithm to a critical point if the objective function satisfies the Kurdyka-Łojasiewicz property at each point of its domain, under the assumption that a limit point exists. The proposed method is applied to a wide collection of image processing problems, and our numerical tests show that the algorithm is flexible, robust, and competitive when compared to recently proposed approaches for the optimization problems arising in these applications.
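
    The sketch below shows one standard way to realize the linesearch idea: backtrack on the step size until the descent-lemma upper bound holds at the proximal point. The paper's method additionally employs a variable metric and an Armijo-type search along the proximal direction, which this simplified sketch omits; all names and parameter values are assumptions.

```python
import numpy as np

def prox_grad_linesearch(g, grad_g, prox_f, x0, gamma0=1.0,
                         shrink=0.5, n_iter=200):
    """Sketch of a backtracking proximal-gradient iteration.

    At each iterate, shrink the step `gamma` until the quadratic upper
    bound from the descent lemma holds at the proximal point, then accept
    that point.  No global Lipschitz constant is needed.
    """
    x = x0.copy()
    for _ in range(n_iter):
        gamma = gamma0
        gx, grad = g(x), grad_g(x)
        while True:
            z = prox_f(x - gamma * grad, gamma)
            d = z - x
            # descent-lemma check: g(z) <= g(x) + <grad, d> + ||d||^2 / (2 gamma)
            if g(z) <= gx + grad @ d + (d @ d) / (2 * gamma):
                break
            gamma *= shrink
        x = z
    return x
```

    The appeal of this design is that the Lipschitz constant of the gradient never has to be known: the backtracking loop discovers a locally valid step, which is what makes such methods practical on the image processing problems the paper targets.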

    A proximal minimization algorithm for structured nonconvex and nonsmooth problems

    We propose a proximal algorithm for minimizing objective functions consisting of three summands: the composition of a nonsmooth function with a linear operator, a second nonsmooth function, and a smooth function that couples the two block variables; each nonsmooth summand depends on its own block variable. The algorithm is a full splitting method: the nonsmooth functions are processed via their proximal operators, the smooth function via gradient steps, and the linear operator via matrix-vector multiplications. We provide sufficient conditions for the boundedness of the generated sequence and prove that any cluster point of the latter is a KKT point of the minimization problem. In the setting of the Kurdyka-Łojasiewicz property we show global convergence and derive convergence rates for the iterates in terms of the Łojasiewicz exponent.
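
    As a rough illustration of the full-splitting structure for a model problem min_{x,y} f(Ax) + g(y) + H(x,y), the sketch below touches f and g only through their proximal maps, H only through its partial gradients, and A only through matrix-vector products, handling the composition f∘A through a dual variable and the Moreau decomposition. The update order, the step sizes, and the primal-dual treatment of the linear operator are plausible choices for this problem structure, not the paper's stated iteration; all function names are hypothetical.

```python
import numpy as np

def prox_conjugate(prox, v, sigma):
    """Prox of sigma * f^* from the prox of f, via Moreau decomposition."""
    return v - sigma * prox(v / sigma, 1.0 / sigma)

def full_splitting(prox_f, prox_g, A, grad_H_x, grad_H_y,
                   x0, y0, tau=0.1, sigma=0.1, mu=0.1, n_iter=500):
    """Sketch of a full-splitting scheme for min f(Ax) + g(y) + H(x, y).

    f and g enter only through their proximal maps, H through its partial
    gradients, and A only through A @ v and A.T @ u products, so no
    subproblem involving A has to be solved.
    """
    x, y = x0.copy(), y0.copy()
    u = np.zeros(A.shape[0])  # dual variable associated with the f(Ax) term
    for _ in range(n_iter):
        u = prox_conjugate(prox_f, u + sigma * (A @ x), sigma)   # dual ascent on f o A
        x = x - tau * (A.T @ u + grad_H_x(x, y))                 # gradient step in x
        y = prox_g(y - mu * grad_H_y(x, y), mu)                  # proximal-gradient step in y
    return x, y
```

    The Moreau decomposition is what keeps the scheme a full splitting: the prox of the conjugate f* is computed from the prox of f itself, so the linear operator never has to be inverted.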