2 research outputs found

    Neural signature kernels as infinite-width-depth-limits of controlled ResNets

    Motivated by the paradigm of reservoir computing, we consider randomly initialized controlled ResNets defined as Euler discretizations of neural controlled differential equations (Neural CDEs). We show that in the infinite-width-then-depth limit and under proper scaling, these architectures converge weakly to Gaussian processes indexed by certain spaces of continuous paths, with kernels satisfying partial differential equations (PDEs) that vary according to the choice of activation function. In the special case where the activation is the identity, we show that the equation reduces to a linear PDE and the limiting kernel agrees with the signature kernel of Salvi et al. (2021). In this setting, we also show that the width-depth limits commute. We name this new family of limiting kernels neural signature kernels. Finally, we show that in the infinite-depth regime, finite-width controlled ResNets converge in distribution to Neural CDEs with random vector fields which, depending on whether the weights are shared across layers, are either time-independent and Gaussian or behave like a matrix-valued Brownian motion.
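In the identity-activation case, the limiting kernel agrees with the signature kernel of Salvi et al. (2021), which solves a Goursat-type linear PDE and can be approximated on a grid. A minimal sketch of such a solver (the function name and the first-order finite-difference scheme below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def signature_kernel(x, y):
    """Approximate the signature kernel of two piecewise-linear paths.

    x: array of shape (m, d), y: array of shape (n, d) -- path samples.
    Solves the Goursat PDE  d^2 k / (ds dt) = <x'(s), y'(t)> k(s, t)
    with boundary conditions k(0, .) = k(., 0) = 1, via a first-order
    finite-difference scheme on the discretization grid.
    """
    dx = np.diff(x, axis=0)          # increments of x, shape (m-1, d)
    dy = np.diff(y, axis=0)          # increments of y, shape (n-1, d)
    inner = dx @ dy.T                # <dx_i, dy_j>, shape (m-1, n-1)

    k = np.ones((len(x), len(y)))    # k = 1 on both boundary axes
    for i in range(len(x) - 1):
        for j in range(len(y) - 1):
            # Discrete analogue of the mixed second derivative equalling
            # <x'(s), y'(t)> k(s, t).
            k[i + 1, j + 1] = (k[i + 1, j] + k[i, j + 1]
                               - k[i, j] + inner[i, j] * k[i, j])
    return k[-1, -1]
```

As a sanity check, for the linear paths x(t) = y(t) = t on [0, 1] the exact kernel value is the series sum of 1/(n!)^2, roughly 2.2796, which the scheme approaches as the grid is refined.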

    Signature, Randomized Signature and Kernel Methods on Path Spaces.

    This thesis explores the use of Signatures in Machine Learning through the lens of Kernel Methods. Signatures are central objects in the theory of Rough Paths which have found wide application in the Machine Learning domain, promising to be canonical feature extractors on path spaces. Related Kernel Methods have recently received particular attention, being easily computable with off-the-shelf PDE solvers. Randomized Signatures behave like Signatures but are much easier to compute, being solutions of simple, random, finite-dimensional Controlled Differential Equations. They show promising results, even though some aspects have yet to be rigorously studied. This work is divided into three main parts: (1) we introduce the mathematics behind the paradigm of Kernel Learning; (2) we frame Signatures in the Machine Learning context and analyze the related Kernel Methods, namely Signature Kernels (SK); (3) we try to rigorously define Randomized Signatures and prove central results. We then proceed to study the properties of the novel Randomized Signature Kernel (rSK) and end with the proof of a conjecture relating rSK and SK.
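A Randomized Signature of the kind described, a solution of a simple, random, finite-dimensional Controlled Differential Equation, can be sketched as an Euler scheme driven by the path's increments. The reservoir dimension, Gaussian scalings, and tanh activation below are illustrative assumptions, not choices made in the thesis:

```python
import numpy as np

def randomized_signature(x, reservoir_dim=50, seed=0):
    """Euler scheme for the random controlled differential equation
    dZ_t = sum_k sigma(A_k Z_t + b_k) dx_t^k, whose solution plays
    the role of a Randomized Signature of the path x.

    x: array of shape (m, d) -- a sampled path.
    """
    rng = np.random.default_rng(seed)
    m, d = x.shape
    # Random, fixed (untrained) vector-field parameters, in the spirit
    # of reservoir computing; the 1/sqrt(N) scaling is an assumption.
    A = rng.normal(scale=1.0 / np.sqrt(reservoir_dim),
                   size=(d, reservoir_dim, reservoir_dim))
    b = rng.normal(size=(d, reservoir_dim))
    z = np.zeros(reservoir_dim)
    z[0] = 1.0                      # arbitrary nonzero initial state
    for dx in np.diff(x, axis=0):   # path increments dx_t
        z = z + sum(np.tanh(A[k] @ z + b[k]) * dx[k] for k in range(d))
    return z
```

The corresponding Randomized Signature Kernel would then be built from inner products of such features for two paths, e.g. `randomized_signature(x) @ randomized_signature(y)`.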