Universal Approximation with Deep Narrow Networks
The classical Universal Approximation Theorem holds for neural networks of
arbitrary width and bounded depth. Here we consider the natural `dual' scenario
for networks of bounded width and arbitrary depth. Precisely, let n be the
number of input neurons, m be the number of output neurons, and let ρ
be any nonaffine continuous function, with a continuous nonzero derivative at
some point. Then we show that the class of neural networks of arbitrary depth,
width n + m + 2, and activation function ρ, is dense in C(K; ℝ^m) for compact K ⊆ ℝ^n. This covers
every activation function possible to use in practice, and also includes
polynomial activation functions, which is unlike the classical version of the
theorem, and provides a qualitative difference between deep narrow networks and
shallow wide networks. We then consider several extensions of this result. In
particular we consider nowhere differentiable activation functions, density in
noncompact domains with respect to the L^p norm, and how the width may be
reduced to just n + m + 1 for `most' activation functions.
Comment: Accepted at COLT 2020
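The architecture the abstract describes can be sketched concretely. Below is a minimal, illustrative NumPy forward pass through a deep narrow network whose hidden layers all share the fixed width n + m + 2 from the theorem statement; the weights, dimensions, and helper name are hypothetical choices for illustration, not the paper's construction.

```python
import numpy as np

def deep_narrow_forward(x, weights, biases, activation=np.tanh):
    """Forward pass through a deep, narrow MLP: every hidden layer has the
    same fixed width, and expressive power comes from depth alone.
    The final layer is left linear; the activation here stands in for the
    nonaffine function rho in the theorem statement."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = activation(h @ W + b)
    return h @ weights[-1] + biases[-1]

# Dimensions matching the theorem: n inputs, m outputs,
# hidden width n + m + 2 (weights are random and untrained).
n, m, depth = 3, 1, 10
width = n + m + 2

rng = np.random.default_rng(0)
dims = [n] + [width] * depth + [m]
weights = [rng.standard_normal((a, b)) * 0.5 for a, b in zip(dims[:-1], dims[1:])]
biases = [np.zeros(b) for b in dims[1:]]

x = rng.standard_normal((4, n))  # batch of 4 points in R^n
y = deep_narrow_forward(x, weights, biases)
print(y.shape)  # (4, 1): outputs in R^m
```

The theorem says that, as depth grows, networks of this shape are dense in C(K; ℝ^m) on compact K, so depth plays the role that width plays in the classical shallow-network result.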
Artificial Neural Networks
Artificial neural networks (ANNs) constitute a class of flexible nonlinear models designed to mimic biological neural systems. In this entry, we introduce ANNs using familiar econometric terminology and provide an overview of the ANN modeling approach and its implementation methods.