73 research outputs found

    A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima

    We introduce Bella, a locally superlinearly convergent Bregman forward-backward splitting method for minimizing the sum of two nonconvex functions, one of which satisfies a relative smoothness condition while the other may be nonsmooth. A key tool in our methodology is the Bregman forward-backward envelope (BFBE), an exact and continuous penalty function with favorable first- and second-order properties that enjoys a nonlinear error bound when the objective function satisfies a Łojasiewicz-type property. The proposed algorithm performs a linesearch over the BFBE along candidate update directions. It converges subsequentially to stationary points, converges globally under a Kurdyka-Łojasiewicz (KL) condition, and, owing to the nonlinear error bound, can attain superlinear convergence rates even when the limit point is a nonisolated minimum, provided the directions are suitably selected.
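    To make the forward-backward mechanics concrete, here is a minimal Python sketch of a single forward-backward step under the Euclidean Bregman kernel h(x) = ||x||²/2, where the Bregman proximal map of the l1 norm reduces to soft-thresholding. This illustrates only the base splitting step that Bella builds on; the BFBE linesearch and direction selection of the actual method are not shown, and all problem data below are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal map of t * ||.||_1 under the Euclidean kernel.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward_step(x, grad_f, gamma, lam):
    # Forward (gradient) step on the smooth part f, then backward
    # (proximal) step on the nonsmooth part g = lam * ||.||_1.
    # With a non-Euclidean kernel h, the proximal map would be replaced
    # by a Bregman proximal map defined through h.
    return soft_threshold(x - gamma * grad_f(x), gamma * lam)

# Hypothetical test problem: f(x) = 0.5 ||Ax - b||^2, g(x) = lam ||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
gamma = 1.0 / np.linalg.norm(A, 2) ** 2  # step size below 1/L for f
x = np.zeros(5)
for _ in range(200):
    x = forward_backward_step(x, grad_f, gamma, lam=0.1)
```

    In Bella, each such candidate update would be vetted through a linesearch on the BFBE rather than accepted directly.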

    Newton's Method for Solving Inclusions Using Set-Valued Approximations

    Results on the stability of both local and global metric regularity under set-valued perturbations are presented. As an application, we study the (super)linear convergence of a Newton-type iterative process for solving generalized equations. We investigate several iterative schemes, such as the inexact Newton method, the nonsmooth Newton method for semismooth functions, and the inexact proximal point algorithm. Moreover, we also cover a forward-backward splitting algorithm for finding a zero of the sum of two multivalued (not necessarily monotone) operators. Finally, a globalization of Newton's method is discussed.
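    As a point of reference for the Newton-type schemes surveyed above, the sketch below shows a plain Newton iteration for a smooth system F(x) = 0; the inexact variant mentioned in the abstract would solve each linear system only approximately, up to a relative residual set by a forcing term. The test system and tolerances here are hypothetical.

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    """Newton iteration for F(x) = 0: solve J(x) d = -F(x), set x <- x + d.

    The inexact variant stops an iterative linear solver once
    ||J(x) d + F(x)|| <= eta * ||F(x)|| for a forcing term eta in (0, 1),
    instead of solving the system exactly as done here.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        x = x + np.linalg.solve(J(x), -Fx)
    return x

# Hypothetical test system: a circle intersected with a line.
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2 * x[0], 2 * x[1]], [1.0, -1.0]])
root = newton(F, J, x0=[2.0, 0.5])  # converges to (1/sqrt(2), 1/sqrt(2))
```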

    Stability of Singular Equilibria in Quasilinear Implicit Differential Equations

    This paper addresses stability properties of singular equilibria arising in quasilinear implicit ODEs. Under certain assumptions, the local dynamics near a singular point may be described through a continuous or directionally continuous vector field. This fact motivates a classification of geometric singularities into weak and strong ones. Stability in the weak case is analyzed through certain linear matrix equations, a singular version of the Lyapunov equation being especially relevant to the study. Weakly stable singularities include singular zeros having a spherical domain of attraction that contains other singular points. Regarding strong equilibria, stability is proved via a Lyapunov–Schmidt approach under additional hypotheses. The results are shown to be relevant to singular root-finding problems.
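    The weak-case analysis runs through a singular version of the Lyapunov equation; the sketch below illustrates only the classical, regular Lyapunov test that it generalizes, using SciPy's standard solver on a hypothetical stable matrix.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def lyapunov_stable(A, Q=None):
    """Classical Lyapunov test: A is Hurwitz (x' = Ax asymptotically
    stable) iff the solution P of A^T P + P A = -Q is positive definite
    for some positive definite Q. The singular version of the equation
    studied in the paper is not implemented here.
    """
    Q = np.eye(A.shape[0]) if Q is None else Q
    P = solve_continuous_lyapunov(A.T, -Q)  # solves A^T P + P A = -Q
    return bool(np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0))

A = np.array([[-1.0, 2.0],
              [0.0, -3.0]])  # hypothetical matrix with eigenvalues -1, -3
print(lyapunov_stable(A))    # True
```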

    Strong convexity-guided hyper-parameter optimization for flatter losses

    We propose a novel white-box approach to hyper-parameter optimization. Motivated by recent work establishing a relationship between flat minima and generalization, we first establish a relationship between the strong convexity of the loss and its flatness. Based on this, we seek hyper-parameter configurations that improve flatness by minimizing the strong convexity of the loss. Using the structure of the underlying neural network, we derive closed-form equations to approximate the strong convexity parameter, and search for hyper-parameters that minimize it in a randomized fashion. Through experiments on 14 classification datasets, we show that our method achieves strong performance at a fraction of the runtime.
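    The closed-form approximations of the strong convexity parameter are specific to the paper; as a generic numerical stand-in, the strong convexity parameter of a twice-differentiable loss at a point is the smallest eigenvalue of its Hessian there. The sketch below estimates it by finite differences on a hypothetical ridge-regression loss (smaller values correspond to a flatter loss).

```python
import numpy as np

def hessian_fd(loss, w, eps=1e-5):
    # Central finite-difference Hessian of a scalar loss at w.
    n = w.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = eps
            ej = np.zeros(n); ej[j] = eps
            H[i, j] = (loss(w + ei + ej) - loss(w + ei)
                       - loss(w + ej) + loss(w)) / eps**2
    return (H + H.T) / 2

# Hypothetical ridge-regression loss; mu is the smallest Hessian
# eigenvalue, i.e. the local strong convexity parameter at w = 0.
rng = np.random.default_rng(1)
X, y = rng.standard_normal((50, 3)), rng.standard_normal(50)
loss = lambda w: 0.5 * np.mean((X @ w - y) ** 2) + 0.05 * w @ w
mu = np.linalg.eigvalsh(hessian_fd(loss, np.zeros(3))).min()
```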