    Analytic function approximation by path norm regularized deep networks

    We provide an entropy bound for the spaces of path norm regularized neural networks with piecewise linear activation functions, such as the ReLU and the absolute value functions. This bound generalizes the known entropy bound for the spaces of linear functions on $\mathbb{R}^d$. Keeping the path norm, together with the depth, width, and weights of the networks, logarithmically dependent on $1/\varepsilon$, we $\varepsilon$-approximate functions that are analytic on certain regions of $\mathbb{C}^d$…
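    The path norm is not defined in this excerpt; in the standard formulation for feedforward networks it is the sum, over all input-to-output paths, of the products of the absolute values of the weights along the path, and it can be computed layer by layer. Below is a minimal sketch assuming that bias-free definition (the paper may use a variant):

```python
import numpy as np

def path_norm(weights):
    """Path norm of a feedforward network given as a list of weight
    matrices [W1, ..., WL] (biases ignored): the sum over all
    input-to-output paths of the product of |weights| along the path.
    Computed layer-wise as 1^T |W_L| ... |W_1| 1."""
    v = np.ones(weights[0].shape[1])   # one entry per input coordinate
    for W in weights:
        v = np.abs(W) @ v              # accumulate |W_l| along every path
    return float(v.sum())

# Tiny example (hypothetical 2 -> 3 -> 1 ReLU network):
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))
W2 = rng.normal(size=(1, 3))
print(path_norm([W1, W2]))
```

    The layer-wise recursion works because the sum over paths factorizes as $\mathbf{1}^\top |W_L| \cdots |W_1| \mathbf{1}$, where $|W_l|$ denotes the entrywise absolute value.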