    Generalized Bregman envelopes and proximity operators

    Every maximally monotone operator can be associated with a family of convex functions, called the Fitzpatrick family or family of representative functions. Surprisingly, in 2017, Burachik and Martínez-Legaz showed that the well-known Bregman distance is a particular case of a general family of distances, each one induced by a specific maximally monotone operator and a specific choice of one of its representative functions. For the family of generalized Bregman distances, sufficient conditions for convexity, coercivity, and supercoercivity have recently been furnished. Motivated by these advances, we introduce in the present paper the generalized left and right envelopes and proximity operators, and we provide asymptotic results for parameters. Certain results extend readily from the more specific Bregman context, while others only extend for certain generalized cases. To illustrate, we construct examples from the Bregman generalizing case, together with the natural “extreme” cases that highlight the importance of which generalized Bregman distance is chosen. © 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature
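    The classical (Euclidean) special case of the left Bregman envelope and proximity operator described above can be sketched numerically. This is a hedged illustration, not the paper's generalized construction: the kernel phi, the step gamma, and the grid minimization are illustrative choices.

```python
import numpy as np

# Sketch: classical left Bregman envelope and proximity operator of a
# 1-D function f, approximated by grid minimization (for clarity, not
# efficiency).  phi and gamma are illustrative choices.

def bregman_distance(phi, dphi, y, x):
    """D_phi(y, x) = phi(y) - phi(x) - dphi(x) * (y - x)."""
    return phi(y) - phi(x) - dphi(x) * (y - x)

def left_envelope_and_prox(f, phi, dphi, x, gamma, grid):
    """Envelope: min_y f(y) + D_phi(y, x)/gamma; prox: the argmin."""
    vals = f(grid) + bregman_distance(phi, dphi, grid, x) / gamma
    i = np.argmin(vals)
    return vals[i], grid[i]

# With phi(x) = x^2/2 the Bregman distance is the squared Euclidean
# distance, so we recover the usual Moreau envelope and proximal map.
phi = lambda x: 0.5 * x**2
dphi = lambda x: x
f = lambda y: np.abs(y)          # prox of |.| is soft-thresholding
grid = np.linspace(-5.0, 5.0, 100001)
env, prox = left_envelope_and_prox(f, phi, dphi, 2.0, 1.0, grid)
# soft-thresholding 2.0 at level 1.0 gives 1.0, with envelope value 1.5
print(env, prox)
```

Swapping in a nonquadratic Legendre kernel phi (e.g. negative entropy) changes the geometry of the envelope, which is exactly why the choice of generalized Bregman distance matters.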

    The generalized Bregman distance

    Recently, a new kind of distance has been introduced for the graphs of two point-to-set operators, one of which is maximally monotone. When both operators are the subdifferential of a proper lower semicontinuous convex function, this distance specializes, under modest assumptions, to the classical Bregman distance. We name this new distance the generalized Bregman distance, and we shed light on it with examples that utilize the other two most natural representative functions: the Fitzpatrick function and its conjugate. We provide sufficient conditions for convexity, coercivity, and supercoercivity: properties which are essential for implementation in proximal-point-type algorithms. We establish these results for both the left and right variants of this new distance. We construct examples closely related to the Kullback-Leibler divergence, which was previously considered in the context of Bregman distances and whose importance in information theory is well known. In so doing, we demonstrate how to compute a difficult Fitzpatrick conjugate function, and we discover natural occurrences of the Lambert W function, whose importance in optimization is of growing interest. © 2021 Society for Industrial and Applied Mathematics
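    The Kullback-Leibler connection mentioned above comes from the classical Bregman distance with a negative-entropy kernel. A minimal sketch, assuming the standard definitions rather than anything specific to this paper:

```python
import numpy as np

# Sketch: the classical Bregman distance induced by a convex phi, and
# the negative-entropy kernel that yields the (generalized)
# Kullback-Leibler divergence.

def bregman(phi, grad_phi, y, x):
    """D_phi(y, x) = phi(y) - phi(x) - <grad_phi(x), y - x>."""
    return phi(y) - phi(x) - np.dot(grad_phi(x), y - x)

# Negative entropy phi(u) = sum_i u_i log u_i gives
# D_phi(y, x) = sum_i y_i log(y_i/x_i) - y_i + x_i  (generalized KL).
neg_entropy = lambda u: np.sum(u * np.log(u))
grad_neg_entropy = lambda u: np.log(u) + 1.0

y = np.array([0.2, 0.3, 0.5])
x = np.array([0.25, 0.25, 0.5])
d = bregman(neg_entropy, grad_neg_entropy, y, x)
kl = np.sum(y * np.log(y / x) - y + x)
print(d, kl)   # the two expressions agree
```

Note that `bregman(..., x, y)` generally differs from `bregman(..., y, x)`: like the KL divergence, a Bregman distance is typically asymmetric, which is why the left and right variants are treated separately.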

    A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima

    We introduce Bella, a locally superlinearly convergent Bregman forward-backward splitting method for minimizing the sum of two nonconvex functions, one of which satisfies a relative smoothness condition and the other of which may be nonsmooth. A key tool of our methodology is the Bregman forward-backward envelope (BFBE), an exact and continuous penalty function with favorable first- and second-order properties that enjoys a nonlinear error bound when the objective function satisfies a Łojasiewicz-type property. The proposed algorithm performs a linesearch over the BFBE along candidate update directions. It converges subsequentially to stationary points, converges globally under a KL condition, and, owing to the nonlinear error bound, can attain superlinear convergence rates even when the limit point is a nonisolated minimum, provided the directions are suitably selected.
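    A basic Bregman forward-backward step (without the paper's linesearch or direction selection) can be sketched for the simplex geometry, where the backward step under the KL Bregman distance reduces to the exponentiated-gradient update. The objective, cost vector, and step size below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Sketch: one Bregman forward-backward step on the probability simplex,
# with negative-entropy kernel h, i.e. mirror descent / exponentiated
# gradient.  This is NOT the paper's Bella linesearch algorithm.

def bregman_fb_simplex(grad_f, x, gamma):
    """argmin_{p in simplex} <grad_f(x), p> + KL(p, x)/gamma."""
    w = x * np.exp(-gamma * grad_f(x))
    return w / w.sum()

# Minimize the linear function f(p) = <c, p> over the simplex; the
# minimizer puts all mass on the smallest coordinate of c.
c = np.array([0.9, 0.1, 0.5])
grad_f = lambda p: c
p = np.full(3, 1.0 / 3.0)
for _ in range(200):
    p = bregman_fb_simplex(grad_f, p, gamma=1.0)
print(np.round(p, 4))   # mass concentrates on index 1
```

The same template with the Euclidean kernel h = ||.||^2/2 recovers ordinary projected/proximal gradient, which is the sense in which the Bregman kernel is a design choice.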

    Study the Convergence of the Moreau–Bregman Envelope in Reflexive Banach Spaces

    It is often useful to replace a function with a sequence of smooth functions that approximate it when solving minimization problems. The most famous such approximation is the Moreau envelope. Recently, this envelope has been constructed using the Bregman distance. It is worth noting that the Bregman distance is not a distance in the usual sense of the term: in general, it is not symmetric and it does not satisfy the triangle inequality. The purpose of this research is to study the convergence of the Moreau envelope and of the related proximal mapping based on the Bregman distance for a function on a Banach space. We prove the equivalence between Mosco epi-convergence of a sequence of functions and pointwise convergence of their Moreau–Bregman envelopes. We also study the strong and weak convergence of the resolvent operators with respect to the Bregman distance.
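    The smoothing-and-convergence behaviour described above can be seen numerically in the Euclidean special case. A hedged sketch, using grid minimization for clarity: the Moreau envelope of f = |.| is the Huber function, and its values approach f pointwise as the parameter shrinks.

```python
import numpy as np

# Sketch: numerical Moreau envelope (Euclidean kernel) of f = |.|,
# illustrating pointwise convergence e_lam f -> f as lam -> 0.

def moreau_envelope(f, x, lam, grid):
    """e_lam f(x) = min_y f(y) + (y - x)^2 / (2*lam)."""
    return np.min(f(grid) + (grid - x) ** 2 / (2.0 * lam))

f = lambda y: np.abs(y)
grid = np.linspace(-4.0, 4.0, 80001)
x = 1.5
for lam in (1.0, 0.1, 0.01):
    print(lam, moreau_envelope(f, x, lam, grid))
# The envelope of |.| is the Huber function: for |x| >= lam it equals
# |x| - lam/2, so the values increase toward f(x) = 1.5 as lam shrinks.
```

Replacing the quadratic term with a Bregman distance gives the Moreau–Bregman envelope studied in the paper; the Euclidean case above is only the simplest instance.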

    Learning Gaussian Mixtures with Generalised Linear Models: Precise Asymptotics in High-dimensions

    Generalised linear models for multi-class classification problems are one of the fundamental building blocks of modern machine learning tasks. In this manuscript, we characterise the learning of a mixture of K Gaussians with generic means and covariances via empirical risk minimisation (ERM) with any convex loss and regularisation. In particular, we prove exact asymptotics characterising the ERM estimator in high dimensions, extending several previous results about Gaussian mixture classification in the literature. We exemplify our result in two tasks of interest in statistical learning: a) classification for a mixture with sparse means, where we study the efficiency of the ℓ1 penalty with respect to ℓ2; b) max-margin multi-class classification, where we characterise the phase transition on the existence of the multi-class logistic maximum likelihood estimator for K > 2. Finally, we discuss how our theory can be applied beyond the scope of synthetic data, showing that in several cases Gaussian mixtures closely capture the learning curve of classification tasks on real data sets.
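    The sparse-means task a) can be sketched in a toy finite-dimensional experiment, nothing like the paper's exact high-dimensional asymptotics: a two-Gaussian mixture with a sparse mean, logistic loss, and a comparison of an ℓ2 penalty (gradient step) against an ℓ1 penalty (proximal soft-thresholding step). All dimensions, penalties, and step sizes are illustrative assumptions.

```python
import numpy as np

# Sketch: ERM on a two-component Gaussian mixture with sparse means,
# comparing l1 and l2 regularisation of the logistic loss.

rng = np.random.default_rng(0)
d, n = 50, 2000
mu = np.zeros(d); mu[:5] = 2.0            # sparse mean: 5 active coordinates
y = rng.choice([-1.0, 1.0], size=n)
X = y[:, None] * mu + rng.standard_normal((n, d))

def erm(penalty, lam=0.1, lr=0.1, steps=500):
    w = np.zeros(d)
    for _ in range(steps):
        z = np.clip(y * (X @ w), -50.0, 50.0)
        g = -(y[:, None] * X * (1.0 / (1.0 + np.exp(z)))[:, None]).mean(0)
        w -= lr * g
        if penalty == "l2":
            w -= lr * lam * w              # ridge shrinkage step
        else:                              # l1: proximal soft-threshold step
            w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
    return w

w1, w2 = erm("l1"), erm("l2")
# Typically the l1 estimator zeroes out the inactive coordinates, while
# l2 shrinks them but leaves them nonzero.
print((np.abs(w1[5:]) < 1e-8).sum(), (np.abs(w2[5:]) > 1e-8).sum())
```

This is the qualitative effect the paper quantifies precisely: how much the ℓ1 penalty gains over ℓ2 when the discriminative direction is sparse.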