
    Data-Adaptive Probabilistic Likelihood Approximation for Ordinary Differential Equations

    Parameter inference for ordinary differential equations (ODEs) is of fundamental importance in many scientific applications. While ODE solutions are typically approximated by deterministic algorithms, recent research on probabilistic solvers indicates that they produce more reliable parameter estimates by better accounting for numerical errors. However, many ODE systems are highly sensitive to their parameter values. This produces deep local minima in the likelihood function -- a problem which existing probabilistic solvers have yet to resolve. Here, we show that a Bayesian filtering paradigm for probabilistic ODE solution can dramatically reduce sensitivity to parameters by learning from the noisy ODE observations in a data-adaptive manner. Our method is applicable to ODEs with partially unobserved components and with arbitrary non-Gaussian noise. Several examples demonstrate that it is more accurate than existing probabilistic ODE solvers, and in some cases even more accurate than the exact ODE likelihood.
    Comment: 9 pages, 5 figures
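
    As a concrete illustration of the filtering paradigm this abstract describes, the sketch below implements a generic probabilistic ODE solver: an extended Kalman filter with a once-integrated Wiener process prior, in which the ODE constraint is assimilated as a noiseless pseudo-observation at each step. This is a minimal sketch of the general ODE-filter idea, not the paper's data-adaptive method; the function names, the prior, and the logistic test problem are all illustrative assumptions.

```python
# Hedged sketch of a filtering-based probabilistic ODE solver (scalar case).
# Illustrative only; NOT the paper's data-adaptive likelihood approximation.
import numpy as np

def prob_ode_filter(f, df, x0, t_grid, sigma=1.0):
    """Solve dx/dt = f(x) probabilistically on t_grid.

    State z = [x, x'], prior: once-integrated Wiener process.
    At each step the ODE residual x' - f(x) = 0 is treated as a
    pseudo-observation and assimilated with an EKF update.
    Returns posterior means and variances of x at each grid point.
    """
    z = np.array([x0, f(x0)])           # initialise derivative consistently
    P = np.zeros((2, 2))
    means, vars_ = [x0], [0.0]
    for h in np.diff(t_grid):
        # Predict under the integrated Wiener process prior.
        A = np.array([[1.0, h], [0.0, 1.0]])
        Q = sigma**2 * np.array([[h**3 / 3, h**2 / 2],
                                 [h**2 / 2, h]])
        z = A @ z
        P = A @ P @ A.T + Q
        # Update: pseudo-observe the residual r = x' - f(x) = 0.
        r = z[1] - f(z[0])
        H = np.array([-df(z[0]), 1.0])  # Jacobian of the residual
        S = H @ P @ H + 1e-12           # tiny jitter for stability
        K = (P @ H) / S
        z = z - K * r
        P = P - np.outer(K, K) * S
        means.append(z[0]); vars_.append(P[0, 0])
    return np.array(means), np.array(vars_)

# Logistic growth test problem: f(x) = r * x * (1 - x).
r = 2.0
f = lambda x: r * x * (1 - x)
df = lambda x: r * (1 - 2 * x)
t = np.linspace(0, 5, 101)
m, v = prob_ode_filter(f, df, x0=0.1, t_grid=t)
print(m[-1], v[-1])   # posterior mean and variance of x at t = 5
```

    The returned variances quantify numerical error; in a parameter-inference loop these would feed into the likelihood rather than being discarded, which is the core appeal of probabilistic solvers over deterministic ones.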

    Bayesian Optimal Design for Ordinary Differential Equation Models

    Bayesian optimal design is considered for experiments where it is hypothesised that the responses are described by the intractable solution to a system of non-linear ordinary differential equations (ODEs). Bayesian optimal design is based on the minimisation of an expected loss function, where the expectation is with respect to all unknown quantities (responses and parameters). This expectation is typically intractable even for simple models, quite apart from the intractability of the ODE solution. New methodology is developed for this problem that involves minimising a smoothed stochastic approximation to the expected loss and using a state-of-the-art stochastic solution to the ODEs, by treating the ODE solution as an unknown quantity. The methodology is demonstrated on three illustrative examples and a real application involving estimating the properties of human placentas.
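
    To make the expected-loss criterion concrete, the sketch below estimates it by plain Monte Carlo on a toy ODE whose solution is available in closed form, then minimises over a grid of candidate designs. The paper's smoothed stochastic approximation and stochastic ODE solver are deliberately replaced by these simpler stand-ins; the prior, noise model, and estimator are assumptions chosen so the example stays self-contained.

```python
# Hedged sketch: brute-force Monte Carlo estimate of an expected loss
# for a one-point design on a toy decay ODE, dx/dt = -theta * x, whose
# solution x(t) = exp(-theta * t) is analytic. Illustrative assumptions
# throughout; not the paper's methodology.
import numpy as np

rng = np.random.default_rng(0)

def expected_loss(t_obs, n_mc=20_000, noise_sd=0.05):
    """MC estimate of E[(theta_hat - theta)^2] for one observation time.

    theta ~ LogNormal(0, 0.5); y = x(t_obs; theta) + Gaussian noise.
    With a single observation the plug-in estimator
    theta_hat = -log(y) / t_obs is available in closed form.
    """
    theta = rng.lognormal(mean=0.0, sigma=0.5, size=n_mc)
    y = np.exp(-theta * t_obs) + noise_sd * rng.standard_normal(n_mc)
    y = np.clip(y, 1e-6, None)          # keep the log well defined
    theta_hat = -np.log(y) / t_obs
    return np.mean((theta_hat - theta) ** 2)

# Minimise the (noisy) expected loss over a grid of candidate designs.
designs = np.linspace(0.1, 3.0, 30)
losses = [expected_loss(t) for t in designs]
best = designs[int(np.argmin(losses))]
print(f"approximately optimal observation time: t = {best:.2f}")
```

    Even in this toy setting the estimated loss surface is noisy, which is exactly why the paper smooths the stochastic approximation before optimising rather than working with raw Monte Carlo estimates.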

    How Gibbs distributions may naturally arise from synaptic adaptation mechanisms. A model-based argumentation

    This paper addresses two questions in the context of neuronal network dynamics, using methods from dynamical systems theory and statistical physics: (i) how to characterize the statistical properties of sequences of action potentials ("spike trains") produced by neuronal networks, and (ii) what are the effects of synaptic plasticity on these statistics? We introduce a framework in which spike trains are associated with a coding of membrane potential trajectories, and in fact constitute a symbolic coding in important explicit examples (the so-called gIF models). On this basis, we use the thermodynamic formalism from ergodic theory to show that Gibbs distributions are natural probability measures for describing the statistics of spike trains, given the empirical averages of prescribed quantities. As a second result, we show that Gibbs distributions naturally arise when considering "slow" synaptic plasticity rules, where the characteristic time for synapse adaptation is much longer than the characteristic time for neuron dynamics.
    Comment: 39 pages, 3 figures
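
    The connection between Gibbs distributions and prescribed empirical averages can be illustrated with a static maximum-entropy toy model: the least-biased distribution over binary spike patterns matching observed firing rates and pairwise correlations is a Gibbs (Ising-type) distribution. The sketch below fits one by moment matching on a small network where exact enumeration is feasible. This is the static maximum-entropy picture only, not the paper's thermodynamic-formalism construction for spike-train dynamics; the synthetic data and learning rate are assumptions.

```python
# Hedged sketch: maximum-entropy (Gibbs/Ising) fit to binary spike
# patterns by moment matching. Toy illustration of Gibbs distributions
# as least-biased measures given empirical averages.
import itertools
import numpy as np

rng = np.random.default_rng(1)
N = 5                                   # neurons (small enough for exact sums)
patterns = np.array(list(itertools.product([0, 1], repeat=N)), dtype=float)

def features(X):
    """Single-unit activities and pairwise products as sufficient statistics."""
    pairs = np.array([X[:, i] * X[:, j]
                      for i in range(N) for j in range(i + 1, N)]).T
    return np.hstack([X, pairs])

# Synthetic "recorded" spike patterns and their empirical feature averages.
data = (rng.random((5000, N)) < 0.3).astype(float)
emp = features(data).mean(axis=0)

# Fit Gibbs weights lam by gradient ascent on the exponential-family
# log-likelihood; the fixed point matches model and empirical moments.
F = features(patterns)
lam = np.zeros(F.shape[1])
for _ in range(2000):
    logp = F @ lam
    p = np.exp(logp - logp.max())
    p /= p.sum()                        # Gibbs distribution over all 2^N states
    model = F.T @ p                     # model feature averages
    lam += 0.5 * (emp - model)          # move toward the empirical averages

print("max moment mismatch:", np.abs(model - emp).max())
```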