
    Splitting methods with variable metric for KL functions

    We study the convergence of general abstract descent methods applied to a lower semicontinuous nonconvex function f that satisfies the Kurdyka-Łojasiewicz inequality in a Hilbert space. We prove that any precompact sequence converges to a critical point of f, and we obtain new convergence rates both for the values and for the iterates. The analysis covers alternating versions of the forward-backward method with variable metric and relative errors. As an example, a nonsmooth and nonconvex version of the Levenberg-Marquardt algorithm is detailed.
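
    A minimal sketch of the plain forward-backward iteration that the analysis covers, in the Euclidean metric (the names grad_f and prox_g, the fixed step size, and the lasso example below are illustrative assumptions; the paper treats variable-metric, inexact, and alternating variants):

    import numpy as np

    def forward_backward(x0, grad_f, prox_g, step, tol=1e-8, max_iter=1000):
        """Forward-backward splitting: x_{k+1} = prox_{t*g}(x_k - t*grad_f(x_k)).
        Under the Kurdyka-Lojasiewicz inequality, precompact sequences of this
        type converge to a critical point of the sum f + g."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            x_next = prox_g(x - step * grad_f(x), step)
            if np.linalg.norm(x_next - x) < tol:
                return x_next
            x = x_next
        return x

    # Illustrative use: minimize 0.5*||Ax - b||^2 + lam*||x||_1.
    rng = np.random.default_rng(0)
    A, b, lam = rng.standard_normal((20, 50)), rng.standard_normal(20), 0.1
    grad_f = lambda x: A.T @ (A @ x - b)
    prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)  # soft-thresholding
    x_hat = forward_backward(np.zeros(50), grad_f, prox_g, step=1.0 / np.linalg.norm(A, 2) ** 2)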

    Optimization with Sparsity-Inducing Penalties

    Sparse estimation methods aim at using or obtaining parsimonious representations of data or models. They were first dedicated to linear variable selection, but numerous extensions have since emerged, such as structured sparsity or kernel selection. It turns out that many of the related estimation problems can be cast as convex optimization problems by regularizing the empirical risk with appropriate non-smooth norms. The goal of this paper is to present, from a general perspective, optimization tools and techniques dedicated to such sparsity-inducing penalties. We cover proximal methods, block-coordinate descent, reweighted ℓ2-penalized techniques, working-set and homotopy methods, as well as non-convex formulations and extensions, and provide an extensive set of experiments comparing various algorithms from a computational point of view.
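
    As one concrete instance of the reweighted-ℓ2 techniques surveyed here, a minimal sketch for the lasso via the variational bound |x| = min_{eta > 0} (x^2/eta + eta)/2 (the smoothing constant eps and the fixed outer-iteration count are illustrative choices, not prescriptions from the paper):

    import numpy as np

    def reweighted_l2_lasso(A, b, lam, n_outer=30, eps=1e-6):
        """Solve min 0.5*||Ax - b||^2 + lam*||x||_1 by alternating between
        the variational weights eta_i = |x_i| and a weighted ridge problem."""
        AtA, Atb = A.T @ A, A.T @ b
        x = np.linalg.lstsq(A, b, rcond=None)[0]         # least-squares warm start
        for _ in range(n_outer):
            eta = np.abs(x) + eps                        # smoothed variational weights
            x = np.linalg.solve(AtA + lam * np.diag(1.0 / eta), Atb)
        return x

    Each outer step is a plain linear solve, which is what makes reweighted-ℓ2 schemes attractive when fast solvers for the quadratic part are available.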

    A Unified Bregman Alternating Minimization Algorithm for Generalized DC Programming with Application to Imaging Data

    In this paper, we consider a class of nonconvex (not necessarily differentiable) optimization problems called generalized DC (Difference-of-Convex functions) programming, which minimizes the sum of two separable DC parts and one two-block-variable coupling function. To circumvent the nonconvexity and nonseparability of the problem under consideration, we introduce a Unified Bregman Alternating Minimization Algorithm (UBAMA) that maximally exploits the favorable DC structure of the objective. Specifically, we first follow the spirit of alternating minimization and update each block variable in sequential order, which efficiently tackles the nonseparability caused by the coupling function. Then, we employ the Fenchel-Young inequality to approximate the second DC components (i.e., the concave parts) so that each subproblem reduces to a convex optimization problem, thereby alleviating the computational burden of the nonconvex DC parts. Moreover, each subproblem absorbs a Bregman proximal regularization term, which in many cases induces closed-form subproblem solutions for appropriate choices of the Bregman kernel functions. Remarkably, our algorithm not only provides an algorithmic framework for understanding the iterative schemes of several existing algorithms, but also admits implementable schemes with easier subproblems than some state-of-the-art first-order algorithms developed for generic nonconvex and nonsmooth optimization problems. Theoretically, we prove that the sequence generated by our algorithm globally converges to a critical point under the Kurdyka-Łojasiewicz (KŁ) condition, and we estimate local convergence rates when the KŁ exponent is known in advance.
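
    A toy instance of the alternating scheme described above, with Euclidean Bregman kernels chosen so that both subproblems have closed forms (the specific objective and all parameter values are illustrative assumptions, not taken from the paper):

    import numpy as np

    def soft(v, t):
        """Soft-thresholding, the proximal map of t*||.||_1."""
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def dc_alternating_toy(b, lam=0.1, mu=0.05, rho=1.0, t=1.0, s=1.0, n_iter=200):
        """Minimize lam*||x||_1 - (mu/2)*||x||^2 + 0.5*||y - b||^2
        + (rho/2)*||x - y||^2 over (x, y). The concave part -(mu/2)*||x||^2
        is linearized at x_k via its gradient (the Fenchel-Young step), so
        the x-update is a soft-thresholding and the y-update is linear."""
        x = np.zeros_like(b, dtype=float)
        y = b.astype(float)
        for _ in range(n_iter):
            xi = mu * x                                # gradient of g1(x) = (mu/2)*||x||^2 at x_k
            alpha = rho + 1.0 / t                      # curvature of the x-subproblem
            c = (rho * y + x / t + xi) / alpha         # center of its quadratic part
            x = soft(c, lam / alpha)                   # closed-form x-update
            y = (b + rho * x + y / s) / (1.0 + rho + 1.0 / s)  # closed-form y-update
        return x, y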

    Two-step inertial Bregman proximal alternating linearized minimization algorithm for nonconvex and nonsmooth problems

    In this paper, we study an algorithm for solving a class of nonconvex, nonsmooth, nonseparable optimization problems. Building on proximal alternating linearized minimization (PALM), we propose a new iterative algorithm that combines two-step inertial extrapolation with the Bregman distance. By constructing an appropriate benefit function and using the Kurdyka-Łojasiewicz property, we establish convergence of the whole sequence generated by the proposed algorithm. We apply the algorithm to signal recovery and a quadratic fractional programming problem, and demonstrate its effectiveness.
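
    A minimal Euclidean-kernel sketch of the iteration shape: each block is extrapolated using the two previous iterates, then updated by a prox-linearized step on the coupling term H (the names, step sizes tx, ty, and inertial weights a1, a2 are illustrative; the paper works with general Bregman distances and explicit parameter conditions for convergence):

    import numpy as np

    def two_step_inertial_palm(x0, y0, grad_Hx, grad_Hy, prox_f, prox_g,
                               tx=0.1, ty=0.1, a1=0.3, a2=0.1, n_iter=500):
        """PALM for min f(x) + g(y) + H(x, y) with two-step inertial
        extrapolation; a1 = a2 = 0 recovers plain PALM, and a2 = 0 the
        usual one-step inertial variant."""
        x2 = x1 = x = np.asarray(x0, dtype=float)
        y2 = y1 = y = np.asarray(y0, dtype=float)
        for _ in range(n_iter):
            zx = x + a1 * (x - x1) + a2 * (x1 - x2)    # two-step extrapolation, x-block
            x2, x1 = x1, x
            x = prox_f(zx - tx * grad_Hx(zx, y), tx)   # prox-linearized x-update
            zy = y + a1 * (y - y1) + a2 * (y1 - y2)    # two-step extrapolation, y-block
            y2, y1 = y1, y
            y = prox_g(zy - ty * grad_Hy(x, zy), ty)   # uses the fresh x
        return x, y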