881 research outputs found

    The effective temperature

    This review presents the notion of effective temperature, as defined from deviations from the equilibrium fluctuation-dissipation theorem in out-of-equilibrium systems with slow dynamics. The thermodynamic meaning of this quantity is discussed in detail. Analytic, numeric, and experimental measurements are surveyed, and open issues are mentioned.
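
    For reference, a minimal statement of the definition the review builds on (the standard convention in the slow-dynamics literature, with k_B = 1): the equilibrium fluctuation-dissipation theorem links the linear response R to the correlation C, and its out-of-equilibrium violation defines the effective temperature.

```latex
% Equilibrium FDT relating response R(t,t') and correlation C(t,t'):
\[
  R(t,t') = \frac{1}{T}\,\frac{\partial C(t,t')}{\partial t'}, \qquad t > t'.
\]
% Out of equilibrium, the deviation is measured by the
% fluctuation-dissipation ratio X(t,t'), which defines T_eff:
\[
  R(t,t') = \frac{X(t,t')}{T}\,\frac{\partial C(t,t')}{\partial t'},
  \qquad
  T_{\mathrm{eff}}(t,t') = \frac{T}{X(t,t')}.
\]
```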

    Bayesian Inference for Partially Observed SDEs Driven by Fractional Brownian Motion

    We consider continuous-time diffusion models driven by fractional Brownian motion. Observations are assumed to possess a non-trivial likelihood given the latent path. Owing to the non-Markovianity and high dimensionality of the latent paths, estimating posterior expectations is computationally challenging. We present a reparameterization framework based on the Davies and Harte method for sampling stationary Gaussian processes, and use this framework to construct a Markov chain Monte Carlo algorithm that allows computationally efficient Bayesian inference. The algorithm is based on a version of hybrid Monte Carlo that delivers increased efficiency when applied to the high-dimensional latent variables arising in this context. We specify the methodology on a stochastic volatility model, allowing for memory in the volatility increments through a fractional specification. The methodology is illustrated on simulated data and on the S&P500/VIX time series and is shown to be effective. Contrary to the long-range-dependence attribute (Hurst parameter larger than 1/2) often assumed for such models in the literature, the posterior distribution favours values smaller than 1/2, pointing towards medium-range dependence.
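
    Since the reparameterization hinges on the Davies and Harte method, here is a minimal sketch of that sampler for fractional Gaussian noise (the standard circulant-embedding construction; the function name and interface are ours, not the paper's):

```python
import numpy as np

def sample_fgn(n, hurst, rng=None):
    """Sample n increments of fractional Gaussian noise (unit time step)
    via the Davies-Harte circulant-embedding method."""
    rng = np.random.default_rng(rng)
    k = np.arange(n + 1)
    # Autocovariance of fGn: gamma(k) = 0.5 (|k+1|^2H - 2|k|^2H + |k-1|^2H)
    g = 0.5 * ((k + 1.0) ** (2 * hurst) - 2.0 * k ** (2 * hurst)
               + np.abs(k - 1.0) ** (2 * hurst))
    c = np.concatenate([g, g[n - 1:0:-1]])   # first row of the 2n circulant
    lam = np.fft.fft(c).real                 # its eigenvalues
    if np.any(lam < 0):
        raise ValueError("embedding not nonnegative definite; increase n")
    m = 2 * n
    # Hermitian-symmetric Gaussian weights so that fft(w) is real
    # with the required Toeplitz covariance in its first n entries.
    w = np.zeros(m, dtype=complex)
    w[0] = np.sqrt(lam[0] / m) * rng.standard_normal()
    w[n] = np.sqrt(lam[n] / m) * rng.standard_normal()
    z = rng.standard_normal(n - 1) + 1j * rng.standard_normal(n - 1)
    w[1:n] = np.sqrt(lam[1:n] / (2 * m)) * z
    w[n + 1:] = np.conj(w[1:n][::-1])
    return np.fft.fft(w).real[:n]

# A fractional Brownian motion path is the cumulative sum of the increments:
# path = np.cumsum(sample_fgn(1024, hurst=0.4, rng=0))
```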

    Energy Discrepancies: A Score-Independent Loss for Energy-Based Models

    Energy-based models are a simple yet powerful class of probabilistic models, but their widespread adoption has been limited by the computational burden of training them. We propose a novel loss function called Energy Discrepancy (ED) which relies neither on the computation of scores nor on expensive Markov chain Monte Carlo. We show that ED approaches explicit score matching and the negative log-likelihood loss in different limits, effectively interpolating between the two. Consequently, minimum ED estimation overcomes the problem of nearsightedness encountered in score-based estimation methods, while also enjoying theoretical guarantees. Through numerical experiments, we demonstrate that ED learns low-dimensional data distributions faster and more accurately than explicit score matching or contrastive divergence. For high-dimensional image data, we describe how the manifold hypothesis limits our approach, and demonstrate the effectiveness of energy discrepancy by training the energy-based model as a prior of a variational decoder model.
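
    The abstract does not state the loss itself. As a rough illustration of a score-free, MCMC-free contrastive objective of this kind, the sketch below assumes a Gaussian perturbation kernel, m negative samples per data point, and a stabilisation constant w; the function name, interface, and exact form of the stabilisation are our assumptions, not necessarily the paper's estimator.

```python
import math
import torch

def energy_discrepancy_loss(energy, x, t=1.0, m=16, w=1.0):
    """Monte Carlo sketch of an energy-discrepancy-style loss.

    `energy` maps a batch (n, ...) to energies (n,). Assumed here:
    perturbation q(y|x) = N(x, t*I), m negatives, stabilisation w.
    """
    b = x.shape[0]
    y = x + math.sqrt(t) * torch.randn_like(x)          # perturbed data
    noise = math.sqrt(t) * torch.randn(b, m, *x.shape[1:], device=x.device)
    x_neg = y.unsqueeze(1) + noise                      # m samples around each y
    e_pos = energy(x)                                   # U(x), shape (b,)
    e_neg = energy(x_neg.reshape(b * m, *x.shape[1:])).reshape(b, m)
    # log( w/m + (1/m) sum_j exp(U(x) - U(x'_j)) ): no scores, no MCMC
    contrast = e_pos.unsqueeze(1) - e_neg               # shape (b, m)
    stab = torch.full((b, 1), math.log(w), device=x.device)
    return (torch.logsumexp(torch.cat([stab, contrast], dim=1), dim=1)
            - math.log(m)).mean()
```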

    A Survey on Generative Diffusion Model

    Deep learning shows excellent potential in generation tasks thanks to deep latent representations. Generative models are classes of models that can generate observations randomly with respect to certain implied parameters. Recently, the diffusion model has become a rising class of generative models owing to its powerful generation ability. Great achievements have already been reached, and further applications beyond computer vision, speech generation, bioinformatics, and natural language processing remain to be explored in this field. However, the diffusion model has genuine drawbacks: a slow generation process, restriction to single data types, low likelihood, and an inability to perform dimension reduction. These drawbacks have led to many enhanced works. This survey summarizes the field of the diffusion model. We first state the main problem with two landmark works -- DDPM and DSM -- and a unifying landmark work, Score SDE. Then, we present improved techniques for existing problems in the diffusion-based model field, including model speed-up, data structure diversification, likelihood optimization, and dimension reduction. Regarding existing models, we also provide a benchmark of FID score, IS, and NLL at specific NFE. Moreover, applications of diffusion models are introduced, including computer vision, sequence modeling, audio, and AI for science. Finally, we summarize the field together with its limitations and further directions. A summary of existing well-classified methods is in our GitHub: https://github.com/chq1155/A-Survey-on-Generative-Diffusion-Model
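
    For the first landmark the survey starts from, the standard DDPM training objective is the noise-prediction loss E ||eps - eps_theta(x_t, t)||^2; a minimal sketch follows (the model interface and schedule handling are illustrative):

```python
import torch

def ddpm_loss(eps_model, x0, alpha_bar):
    """Standard DDPM noise-prediction loss.

    eps_model(x_t, t) predicts the noise added at step t; alpha_bar is the
    cumulative product of (1 - beta_t) over the noise schedule, shape (T,).
    """
    b = x0.shape[0]
    t = torch.randint(0, alpha_bar.shape[0], (b,), device=x0.device)
    a = alpha_bar[t].view(b, *([1] * (x0.dim() - 1)))
    eps = torch.randn_like(x0)
    x_t = a.sqrt() * x0 + (1 - a).sqrt() * eps         # forward diffusion q(x_t | x_0)
    return ((eps - eps_model(x_t, t)) ** 2).mean()     # E ||eps - eps_theta(x_t, t)||^2
```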