Motivated by the paradigm of reservoir computing, we consider randomly initialized controlled ResNets, defined as Euler discretizations of neural controlled differential equations (Neural CDEs).
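In schematic notation (the paper's exact parametrization, bias terms, and scaling constants are omitted here), one Euler step of such an architecture driven by a path $x = (x^1, \dots, x^d)$ reads
$$h_{k+1} \;=\; h_k \;+\; \sum_{i=1}^{d} \varphi\big(A_i\, h_k\big)\,\big(x^i_{t_{k+1}} - x^i_{t_k}\big),$$
where the $A_i$ are randomly initialized matrices and $\varphi$ is the activation applied entrywise; as the mesh $t_{k+1} - t_k$ shrinks, this discretizes the Neural CDE $\mathrm{d}Y_t = \sum_{i=1}^{d} \varphi(A_i\, Y_t)\,\mathrm{d}x^i_t$.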
We show that, in the infinite-width-then-depth limit and under proper scaling, these architectures converge weakly to Gaussian processes indexed on spaces of continuous paths, with kernels satisfying partial differential equations (PDEs) that vary with the choice of activation function. In the special case
where the activation is the identity, we show that the equation reduces to a
linear PDE and the limiting kernel agrees with the signature kernel of Salvi et
al. (2021). In this setting, we also show that the width-depth limits commute.
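As a point of reference, for continuously differentiable paths $x$ and $y$, the signature kernel $k$ of Salvi et al. (2021) is characterized as the solution of the linear Goursat PDE
$$\frac{\partial^2 k}{\partial s\,\partial t}(s,t) \;=\; \langle \dot{x}_s, \dot{y}_t \rangle\, k(s,t), \qquad k(s,0) = k(0,t) = 1,$$
which is the linear equation the kernel PDE reduces to in the identity-activation case.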
We name this new family of limiting kernels neural signature kernels. Finally,
we show that in the infinite-depth regime, finite-width controlled ResNets
converge in distribution to Neural CDEs with random vector fields which,
depending on whether the weights are shared across layers, are either
time-independent and Gaussian or behave like a matrix-valued Brownian motion.
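One schematic way to read this dichotomy (illustrative notation, not the paper's exact formulation): with weights shared across layers, the depth limit is a Neural CDE of the form
$$\mathrm{d}Y_t \;=\; \sum_{i=1}^{d} \varphi\big(A_i\, Y_t\big)\,\mathrm{d}x^i_t,$$
driven by random Gaussian matrices $A_i$ fixed at initialization, whereas with unshared weights a fresh random matrix is drawn at every layer, and the suitably rescaled cumulative effect of these draws behaves like a matrix-valued Brownian motion.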