Large-width functional asymptotics for deep Gaussian neural networks
In this paper, we consider fully connected feed-forward deep neural networks
where weights and biases are independent and identically distributed according
to Gaussian distributions. Extending previous results (Matthews et al.,
2018a,b; Yang, 2019), we adopt a function-space perspective, i.e. we look at
neural networks as infinite-dimensional random elements on the input space.
Under suitable assumptions on the activation function we show that: i) a
network defines a continuous Gaussian process on the input space; ii) a
network with re-scaled weights converges weakly to a
continuous Gaussian process in the large-width limit; iii) the limiting
Gaussian process has almost surely locally $\gamma$-H\"older continuous paths,
for some exponent $\gamma < 1$. Our results contribute to recent theoretical studies on
the interplay between infinitely wide deep neural networks and Gaussian
processes by establishing weak convergence in function-space with respect to a
stronger metric.
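The large-width Gaussian limit in (ii) can be illustrated with a minimal Monte Carlo sketch. This is an illustrative toy, not the paper's construction: a one-hidden-layer tanh network with iid standard Gaussian weights and biases and a readout re-scaled by $1/\sqrt{n}$. As the width $n$ grows, the joint law of the outputs at two fixed inputs approaches a centered Gaussian; the covariance estimate stabilizes while the excess kurtosis of a single output tends toward 0 (the Gaussian value).

```python
import numpy as np

def sample_outputs(x1, x2, width, n_samples, rng):
    """Draw n_samples iid shallow networks f(x) = (1/sqrt(width)) * sum_j v_j tanh(w_j.x + b_j)
    with standard Gaussian weights, and return the outputs at the two inputs x1, x2."""
    d = x1.shape[0]
    W = rng.standard_normal((n_samples, width, d))   # hidden weights
    b = rng.standard_normal((n_samples, width))      # hidden biases
    v = rng.standard_normal((n_samples, width))      # readout weights
    h1 = np.tanh(W @ x1 + b)                         # hidden activations at x1
    h2 = np.tanh(W @ x2 + b)                         # hidden activations at x2
    f1 = (v * h1).sum(axis=1) / np.sqrt(width)       # re-scaled readout at x1
    f2 = (v * h2).sum(axis=1) / np.sqrt(width)       # re-scaled readout at x2
    return f1, f2

rng = np.random.default_rng(0)
x1 = np.array([1.0, -0.5])
x2 = np.array([0.3, 0.8])
for width in (10, 100, 1000):
    f1, f2 = sample_outputs(x1, x2, width, 4000, rng)
    # Excess kurtosis measures departure from Gaussianity (0 for a Gaussian).
    k = np.mean((f1 - f1.mean()) ** 4) / f1.var() ** 2 - 3
    print(f"width={width:5d}  cov(f1,f2)={np.cov(f1, f2)[0, 1]:+.3f}  excess kurtosis={k:+.3f}")
```

The covariance column estimates the same limiting kernel at every width (its expectation does not depend on $n$); the convergence asserted in the abstract is in distribution, which the shrinking kurtosis reflects. The paper's result is much stronger, giving weak convergence of the whole random function, not just of finite-dimensional marginals as here.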