
    High-Dimensional Non-Convex Landscapes and Gradient Descent Dynamics

    In these lecture notes we present different methods and concepts developed in statistical physics to analyze gradient descent dynamics in high-dimensional non-convex landscapes. Our aim is to show how approaches developed in physics, mainly the statistical physics of disordered systems, can be used to tackle open questions on high-dimensional dynamics in Machine Learning. (Lectures given by G. Biroli at the 2022 Les Houches Summer School "Statistical Physics and Machine Learning".)
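
    Not from the lecture notes themselves, but as a concrete instance of the kind of dynamics they analyze, here is a minimal sketch of gradient descent on a canonical high-dimensional non-convex landscape from the statistical physics of disordered systems, the spherical 3-spin model. The system size, coupling normalization, step size, and number of iterations are illustrative assumptions, not values from the notes.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Spherical 3-spin landscape: E(w) = -sum_{ijk} J_ijk w_i w_j w_k,
    # with w constrained to the sphere |w|^2 = N. The coupling scale
    # sqrt(3)/N keeps the energy extensive (an assumption-level choice).
    N = 100
    J = rng.standard_normal((N, N, N)) * np.sqrt(3.0) / N
    # Symmetrize the couplings so the gradient takes the simple form below.
    J = sum(J.transpose(p) for p in
            [(0, 1, 2), (0, 2, 1), (1, 0, 2),
             (1, 2, 0), (2, 0, 1), (2, 1, 0)]) / 6

    def energy_density_and_grad(w):
        Jw = np.einsum('ijk,k->ij', J, w)          # contract one index
        e = -np.einsum('ij,i,j->', Jw, w, w) / N   # intensive energy E(w)/N
        g = -3.0 * (Jw @ w)                        # dE/dw for symmetric J
        return e, g

    # Gradient descent projected onto the sphere: drop the radial component
    # of the gradient, step, then renormalize back to |w| = sqrt(N).
    w = rng.standard_normal(N)
    w *= np.sqrt(N) / np.linalg.norm(w)
    lr = 0.01
    for _ in range(500):
        e, g = energy_density_and_grad(w)
        g -= (g @ w / N) * w                       # tangential projection
        w -= lr * g
        w *= np.sqrt(N) / np.linalg.norm(w)

    print(f"energy per spin after descent: {energy_density_and_grad(w)[0]:.3f}")
    ```

    Tracking the energy along such trajectories, and how it depends on the initialization and the step size, is the type of question the statistical-physics tools in the notes are built to answer analytically.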

    Dynamical mean-field theory for stochastic gradient descent in Gaussian mixture classification

    We analyze in closed form the learning dynamics of stochastic gradient descent (SGD) for a single-layer neural network classifying a high-dimensional Gaussian mixture in which each cluster is assigned one of two labels. This problem provides a prototype of a non-convex loss landscape with interpolating regimes and a large generalization gap. We define a particular stochastic process for which SGD can be extended to a continuous-time limit that we call stochastic gradient flow; in the full-batch limit we recover the standard gradient flow. We apply dynamical mean-field theory from statistical physics to track the dynamics of the algorithm in the high-dimensional limit via a self-consistent stochastic process. We explore the performance of the algorithm as a function of its control parameters, shedding light on how it navigates the loss landscape.
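
    As a minimal, self-contained sketch of the setup described above (the dimensions, cluster noise, logistic loss, learning rate, and batch size are illustrative assumptions, not the paper's values): mini-batch SGD for a single-layer classifier on a two-cluster Gaussian mixture. Setting the batch size equal to the dataset size recovers full-batch gradient descent, whose small-step limit is the standard gradient flow.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two-cluster Gaussian mixture in d dimensions: cluster means +/- mu/sqrt(d),
    # each cluster carrying one of the two labels y = +/- 1.
    d, n, delta = 400, 1600, 0.5
    mu = rng.standard_normal(d)
    y = rng.choice([-1.0, 1.0], size=n)   # cluster identity doubles as label
    X = (y[:, None] * mu[None, :] / np.sqrt(d)
         + np.sqrt(delta) * rng.standard_normal((n, d)))

    def grad(w, Xb, yb):
        """Gradient of the mean logistic loss for a linear (single-layer) model."""
        z = Xb @ w
        s = 0.5 * (1.0 - np.tanh(0.5 * yb * z))   # sigmoid(-y z), stable form
        return Xb.T @ (-yb * s) / len(yb)

    # Mini-batch SGD; b = n gives full-batch gradient descent, the
    # discretization of gradient flow.
    w = rng.standard_normal(d) / np.sqrt(d)
    lr, b = 0.5, 64
    for _ in range(3000):
        idx = rng.integers(0, n, size=b)
        w -= lr * grad(w, X[idx], y[idx])

    print(f"training accuracy: {np.mean(np.sign(X @ w) == y):.3f}")
    ```

    The dynamical mean-field analysis in the paper characterizes exactly this kind of trajectory in the limit where d and n grow large at fixed ratio, replacing the finite simulation with a self-consistent stochastic process.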