13,244 research outputs found

    SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows

    Normalizing flows and variational autoencoders (VAEs) are powerful generative models that can represent complicated density functions. However, both impose constraints: normalizing flows use bijective transformations to model densities, whereas VAEs learn stochastic transformations that are non-invertible and thus typically do not provide tractable estimates of the marginal likelihood. In this paper, we introduce SurVAE Flows: a modular framework of composable transformations that encompasses both VAEs and normalizing flows. SurVAE Flows bridge the gap between normalizing flows and VAEs with surjective transformations, which are deterministic in one direction (thereby allowing exact likelihood computation) and stochastic in the reverse direction (hence providing a lower bound on the corresponding likelihood). We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows. Finally, we introduce common operations such as the max value, the absolute value, sorting and stochastic permutation as composable layers in SurVAE Flows.
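The contrast the abstract draws can be made concrete with a minimal NumPy sketch (not the paper's implementation): a bijective affine layer, whose exact likelihood follows from the change-of-variables formula, next to an absolute-value surjection, which is deterministic forward (z = |x|) and stochastic in reverse (a sign is sampled). With a symmetric base density and a uniform sign posterior, the surjection's likelihood is exact; the numbers `a`, `b` are arbitrary illustrative parameters.

```python
import numpy as np

def base_logprob(z):
    # log-density of a standard normal base distribution
    return -0.5 * (z**2 + np.log(2 * np.pi))

# Bijective layer: x = a*z + b, exact likelihood via change of variables
a, b = 2.0, 0.5
def affine_logprob(x):
    z = (x - b) / a
    return base_logprob(z) - np.log(abs(a))  # |dz/dx| = 1/|a|

# Surjective layer: forward z = |x| is deterministic; the inverse samples a
# sign. With a half-normal base on z >= 0 and a uniform sign posterior,
# the likelihood bound is tight and recovers the standard normal on x.
def abs_surjection_logprob(x):
    z = np.abs(x)
    half_normal = np.log(2.0) + base_logprob(z)  # base density on z >= 0
    return half_normal + np.log(0.5)             # uniform sign posterior
```

A quick consistency check: because the base is symmetric, `abs_surjection_logprob(x)` coincides with the standard-normal log-density of `x` itself.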

    Stochastic normalizing flows for lattice field theory

    Stochastic normalizing flows are a class of deep generative models that combine normalizing flows with Monte Carlo updates and can be used in lattice field theory to sample from Boltzmann distributions. In these proceedings, we outline the construction of these hybrid algorithms, pointing out that the theoretical background can be related to Jarzynski's equality, a non-equilibrium statistical-mechanics theorem that has been successfully used to compute free energies in lattice field theory. We conclude with examples of applications to the two-dimensional $\phi^4$ field theory.
    Comment: 9 pages, 4 figures; contribution to the 39th International Symposium on Lattice Field Theory, 8th-13th August 2022, Bonn, Germany

    Stochastic normalizing flows as non-equilibrium transformations

    Normalizing flows are a class of deep generative models that provide a promising route to sampling lattice field theories more efficiently than conventional Monte Carlo simulations. In this work we show that the theoretical framework of stochastic normalizing flows, in which neural-network layers are combined with Monte Carlo updates, is the same one that underlies out-of-equilibrium simulations based on Jarzynski's equality, which have recently been deployed to compute free-energy differences in lattice gauge theories. We lay out a strategy to optimize the efficiency of this extended class of generative models and present examples of applications.
    Comment: 1+28 pages, 8 figures; v2: 1+29 pages, 8 figures, added references, improved discussion in section 4; v3: 1+31 pages, 9 figures, added references, expanded discussion in section 4, matches published version
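The Jarzynski-equality machinery underlying these two abstracts can be illustrated with a toy annealed-importance-sampling run between two one-dimensional Gaussians (a hypothetical stand-in for a lattice theory, not the papers' gauge-theory setup): each chain accumulates "work" as the interpolating action changes, a Metropolis update replaces the flow layer, and Jarzynski's equality ⟨exp(-W)⟩ = exp(-ΔF) recovers the free-energy difference, here log σ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Unnormalized log-densities: prior ~ N(0, 1), target ~ N(0, sigma^2)
sigma = 2.0
def log_p0(x): return -0.5 * x**2
def log_p1(x): return -0.5 * x**2 / sigma**2

n_steps, n_chains = 100, 20000
betas = np.linspace(0.0, 1.0, n_steps + 1)

x = rng.normal(size=n_chains)   # exact samples from the prior
log_w = np.zeros(n_chains)      # accumulated (negative) work per chain
for b0, b1 in zip(betas[:-1], betas[1:]):
    # work increment from switching the interpolating action b0 -> b1
    log_w += (b1 - b0) * (log_p1(x) - log_p0(x))
    # Metropolis update targeting the intermediate distribution at b1
    def log_pi(y): return (1 - b1) * log_p0(y) + b1 * log_p1(y)
    prop = x + 0.5 * rng.normal(size=n_chains)
    accept = np.log(rng.uniform(size=n_chains)) < log_pi(prop) - log_pi(x)
    x = np.where(accept, prop, x)

# Jarzynski's equality: the exponential average of the work gives the
# free-energy difference, log(Z1/Z0) = log(sigma) here.
log_Z_ratio = np.log(np.mean(np.exp(log_w)))
```

In a stochastic normalizing flow, the Metropolis step above would be interleaved with trained neural-network layers, and the Jacobian of each layer would enter the accumulated work.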

    Towards probabilistic Weather Forecasting with Conditioned Spatio-Temporal Normalizing Flows

    Generative normalizing flows are able to model multimodal spatial distributions, and they have been shown to model temporal correlations successfully as well. These models provide several benefits over other types of generative models due to their training stability, invertibility and efficiency in sampling and inference. This makes them a suitable candidate for stochastic spatio-temporal prediction problems, which are omnipresent in many fields of science, such as the earth sciences, astrophysics or the molecular sciences. In this paper, we present conditional normalizing flows for stochastic spatio-temporal modelling. The method is evaluated on the task of daily temperature and hourly geopotential map prediction from ERA5 datasets. Experiments show that our method is able to capture spatio-temporal correlations and extrapolates well beyond the time horizon used during training.
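The conditioning idea can be sketched in a few lines (a hypothetical single affine layer, not the paper's spatio-temporal architecture): the layer's log-scale and shift are functions of a context variable `c` (e.g. the previous time step), here produced by toy hand-set linear "conditioner" weights `W_s`, `W_t` where a real model would use a neural network.

```python
import numpy as np

# Hypothetical conditioner weights; a real model learns a network c -> (log_s, t)
W_s, W_t = 0.3, -0.1

def cond_affine_logprob(y, c):
    log_s = W_s * c                  # conditioner output: log-scale
    t = W_t * c                      # conditioner output: shift
    z = (y - t) * np.exp(-log_s)     # invert y = exp(log_s) * z + t
    log_base = -0.5 * (z**2 + np.log(2 * np.pi))  # standard-normal base
    return log_base - log_s          # change-of-variables correction

def cond_sample(c, rng):
    # Sampling runs the flow forward: draw z, then apply the conditioned affine map
    z = rng.standard_normal()
    return np.exp(W_s * c) * z + W_t * c
```

With `c = 0` the layer reduces to the identity and the model is just the base distribution; stacking such layers, each conditioned on past frames, yields a stochastic forecast whose likelihood remains exactly computable.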