Stable behaviour of infinitely wide deep neural networks
We consider fully connected feed-forward deep neural networks (NNs) where
weights and biases are independent and identically distributed according to
symmetric centered stable distributions. We then show that the infinite-width
limit of the NN, under a suitable scaling of the weights, is a stochastic
process whose
finite-dimensional distributions are multivariate stable distributions. The
limiting process is referred to as the stable process, and it generalizes the
class of Gaussian processes recently obtained as infinite-width limits of NNs
(Matthews et al., 2018b). Parameters of the stable process can be computed via
an explicit recursion over the layers of the network. Our result contributes to
the theory of fully connected feed-forward deep NNs, and it paves the way to
expand recent lines of research that rely on Gaussian infinite-width limits.

Comment: 25 pages, 3 figures
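The abstract does not spell out the layer-wise recursion, but the heavy-tailed behaviour it describes can be illustrated by direct simulation. The sketch below is an assumption-laden illustration, not the paper's construction: the helper names (`symmetric_stable`, `forward`), the tanh nonlinearity, and the n^(-1/alpha) weight scaling are all chosen for the example. It draws i.i.d. symmetric alpha-stable weights and biases via the Chambers-Mallows-Stuck method and propagates an input through a wide fully connected network.

```python
import numpy as np

def symmetric_stable(alpha, size, rng):
    """Sample symmetric alpha-stable variates (alpha != 1) with the
    Chambers-Mallows-Stuck method."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform angle
    w = rng.exponential(1.0, size)                 # unit-rate exponential
    return (np.sin(alpha * u) / np.cos(u) ** (1 / alpha)
            * (np.cos((1 - alpha) * u) / w) ** ((1 - alpha) / alpha))

def forward(x, widths, alpha, rng):
    """One forward pass of a fully connected feed-forward NN whose weights
    and biases are i.i.d. symmetric alpha-stable; weights are scaled by
    n^(-1/alpha), the stable analogue of the usual n^(-1/2) Gaussian scaling.
    (Architecture choices here are illustrative assumptions.)"""
    h = x
    layers = list(zip(widths[:-1], widths[1:]))
    for i, (n_in, n_out) in enumerate(layers):
        w = symmetric_stable(alpha, (n_in, n_out), rng) / n_in ** (1 / alpha)
        b = symmetric_stable(alpha, n_out, rng)
        h = h @ w + b
        if i < len(layers) - 1:
            h = np.tanh(h)  # hidden-layer nonlinearity; output layer left linear
    return h

rng = np.random.default_rng(0)
out = forward(np.ones((1, 4)), widths=[4, 2000, 2000, 3], alpha=1.7, rng=rng)
```

Repeating the pass over many seeds and inspecting the empirical distribution of `out` gives a rough picture of the heavy-tailed (non-Gaussian) finite-dimensional distributions that the stable-process limit formalizes.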