15 research outputs found
Diffusion maps embedding and transition matrix analysis of the large-scale flow structure in turbulent Rayleigh--B\'enard convection
By utilizing diffusion maps embedding and transition matrix analysis, we
investigate sparse temperature measurement time-series data from
Rayleigh--B\'enard convection experiments in a cylindrical container of aspect
ratio given by the ratio of its diameter to its height. We consider the two
cases of a cylinder at rest and one rotating around its axis. We
find that the relative amplitude of the large-scale circulation (LSC) and its
orientation inside the container at different points in time are associated with
prominent geometric features in the embedding space spanned by the two dominant
diffusion-maps eigenvectors. From this two-dimensional embedding we can measure
azimuthal drift and diffusion rates, as well as coherence times of the LSC. In
addition, we can clearly distinguish from the data the single roll state (SRS),
in which a single roll extends through the whole cell, from the double roll
state (DRS), in which two counter-rotating rolls sit on top of each other.
Based on this
embedding we also build a transition matrix (a discrete transfer operator),
whose eigenvectors and eigenvalues reveal typical time scales for the stability
of the SRS and DRS as well as for the azimuthal drift velocity of the flow
structures inside the cylinder. Thus, the combination of nonlinear dimension
reduction and dynamical systems tools enables us to gain insight into turbulent
flows without relying on model assumptions.
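The pipeline described in this abstract (a diffusion-maps embedding of the time series, followed by a transition matrix estimated on a discretization of the embedding) can be sketched as follows. This is a minimal illustrative version, not the authors' implementation: the Gaussian-kernel bandwidth `eps`, the eight angular bins, the toy circular-drift data, and all function names are assumptions made for the sketch.

```python
import numpy as np

def diffusion_map(X, eps, n_coords=2):
    """Leading diffusion-map coordinates of the rows of X.

    Minimal sketch: Gaussian kernel, row normalization to a Markov
    matrix, no density correction.
    """
    sq = (X**2).sum(axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T   # pairwise squared distances
    K = np.exp(-d2 / eps)
    P = K / K.sum(axis=1, keepdims=True)             # row-stochastic kernel
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    # skip the trivial constant eigenvector (eigenvalue 1)
    return vecs.real[:, order[1:n_coords + 1]]

def transition_matrix(states, n_states):
    """Row-stochastic transfer-operator estimate from a symbol sequence."""
    T = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        T[a, b] += 1.0
    rows = T.sum(axis=1, keepdims=True)
    return np.divide(T, rows, out=np.zeros_like(T), where=rows > 0)

# Toy usage: noisy motion on a circle, mimicking azimuthal drift of the LSC.
rng = np.random.default_rng(1)
t = np.linspace(0, 8 * np.pi, 400)
X = np.c_[np.cos(t), np.sin(t)] + 0.05 * rng.normal(size=(400, 2))
emb = diffusion_map(X, eps=0.5)                      # (400, 2) embedding
angle = np.arctan2(emb[:, 1], emb[:, 0])             # azimuthal coordinate
states = ((angle + np.pi) / (2 * np.pi) * 8).astype(int) % 8
T = transition_matrix(states, 8)                     # 8 angular bins
```

The eigenvalues of `T` closest to (but below) one would then set the slowest relaxation time scales, in the spirit of the transfer-operator analysis sketched in the abstract.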
Deep learning, stochastic gradient descent and diffusion maps
Stochastic gradient descent (SGD) is widely used in deep learning due to its
computational efficiency, but a complete understanding of why SGD performs so
well remains a major challenge. It has been observed empirically that most
eigenvalues of the Hessian of the loss function of over-parametrized deep
networks are close to zero, while only a small number of
eigenvalues are large. Zero eigenvalues imply zero diffusion along the
corresponding directions, which suggests that the process of minima selection
happens mainly in the relatively low-dimensional subspace spanned by the
eigenvectors of the top Hessian eigenvalues. Although the parameter space is
very high-dimensional, these findings seem to indicate that the SGD dynamics may
mainly live on a low-dimensional manifold. In this paper we pursue a truly
data-driven approach to gaining a potentially deeper understanding of the
high-dimensional parameter surface, and in particular of the landscape traced
out by SGD, by analyzing the data generated through SGD (or any other
optimizer, for that matter) in order to possibly discover (local)
low-dimensional representations of the optimization landscape. As our vehicle
for this exploration we use diffusion maps, introduced by R. Coifman and
coauthors.
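The approach outlined here can be illustrated with a rough sketch: record parameter snapshots along an SGD trajectory on a toy over-parametrized problem and embed them with a plain diffusion map. The problem sizes, learning rate, and median-distance bandwidth heuristic below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy over-parametrized least-squares: 50 parameters, only 20 data points.
A = rng.normal(size=(20, 50))
y = rng.normal(size=20)

w = rng.normal(size=50)
lr = 0.01
snapshots = []
for step in range(500):
    i = rng.integers(20)                  # single-sample stochastic gradient
    g = (A[i] @ w - y[i]) * A[i]
    w -= lr * g
    snapshots.append(w.copy())
X = np.array(snapshots)                   # SGD trajectory in parameter space

# Plain diffusion map of the trajectory (Gaussian kernel, row-normalized).
sq = (X**2).sum(axis=1)
d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
eps = np.median(d2)                       # crude bandwidth heuristic
P = np.exp(-d2 / eps)
P /= P.sum(axis=1, keepdims=True)
vals, vecs = np.linalg.eig(P)
order = np.argsort(-vals.real)
embedding = vecs.real[:, order[1:3]]      # two leading nontrivial coordinates
```

Plotting `embedding` against the step index would show whether the trajectory, despite living in a 50-dimensional parameter space, traces out a low-dimensional structure, which is the kind of local representation the abstract aims to discover.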