
Large Deviations for Nonlocal Stochastic Neural Fields

Abstract

We study the effect of additive noise on integro-differential neural field equations. In particular, we analyze an Amari-type model driven by a Q-Wiener process and focus on noise-induced transitions and escape. We argue that proving a sharp Kramers' law for neural fields poses substantial difficulties, but that techniques from stochastic partial differential equations can be transferred to establish a large deviation principle (LDP). We then demonstrate that an efficient finite-dimensional approximation of the stochastic neural field equation can be achieved using a Galerkin method, and that the resulting finite-dimensional rate function for the LDP can have a multi-scale structure in certain cases. These results form the starting point for an efficient practical computation of the LDP. Our approach also provides the technical basis for further rigorous study of noise-induced transitions in neural fields based on Galerkin approximations.

Comment: 29 pages
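To make the setting concrete, the Galerkin-type finite-dimensional approximation described above can be sketched as follows. This is an illustrative simulation, not the authors' construction: it assumes a periodic domain, a Gaussian connectivity kernel, a sigmoidal firing rate, and Q-Wiener noise whose mode variances decay like k^{-2}; the spectral discretization and Euler-Maruyama time stepping are standard choices, not taken from the paper.

```python
import numpy as np

def simulate_amari_galerkin(N=64, L=2 * np.pi, T=1.0, dt=1e-3, eps=0.1, seed=0):
    """Spectral (Fourier) Galerkin sketch of a stochastic Amari-type equation
        du = (-u + w * F(u)) dt + eps dW   on the periodic domain [0, L).
    All modeling choices (kernel w, firing rate F, noise spectrum q) are
    illustrative assumptions, not the specific model of the paper."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0, L, N, endpoint=False)
    # Periodic Gaussian connectivity kernel (assumed form).
    w = np.exp(-np.minimum(x, L - x) ** 2)
    w_hat = np.fft.fft(w) * (L / N)            # kernel symbol, with quadrature weight
    F = lambda u: 1.0 / (1.0 + np.exp(-u))     # sigmoid firing-rate function
    k = np.fft.fftfreq(N, d=L / N) * 2 * np.pi
    q = 1.0 / (1.0 + k ** 2)                   # assumed Q-Wiener mode variances
    u = np.zeros(N)
    for _ in range(int(T / dt)):
        # Nonlocal term: convolution w * F(u), evaluated spectrally.
        conv = np.real(np.fft.ifft(w_hat * np.fft.fft(F(u))))
        # Sample a spatially correlated noise increment from the truncated
        # Karhunen-Loeve expansion of the Q-Wiener process.
        noise_hat = np.sqrt(q) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
        dW = np.real(np.fft.ifft(noise_hat)) * np.sqrt(N)
        # Euler-Maruyama step for the finite-dimensional Galerkin system.
        u = u + dt * (-u + conv) + eps * np.sqrt(dt) * dW
    return x, u
```

Truncating to the first N Fourier modes is exactly the finite-dimensional reduction to which a Freidlin-Wentzell-type LDP with a finite-dimensional rate function can then be applied.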
