
    An Elementary Approach to Convergence Guarantees of Optimization Algorithms for Deep Networks

    We present an approach to obtaining convergence guarantees of optimization algorithms for deep networks based on elementary arguments and computations. The convergence analysis revolves around the analytical and computational structures of the optimization oracles central to the implementation of deep networks in machine learning software. We provide a systematic way to compute estimates of the smoothness constants that govern the convergence behavior of first-order optimization algorithms used to train deep networks. A diverse set of example components and architectures arising in modern deep networks intersperses the exposition to illustrate the approach.

    Comment: The changes from v1 to v2 include i) slightly more general results; ii) slightly more concise proofs; iii) highway and residual networks; iv) implicitly defined network layers; v) additional algorithm boxes and an illustration figure.
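
    As a concrete illustration of the kind of quantity such an analysis tracks, the sketch below (not the paper's method) numerically estimates a local smoothness constant L of a toy training loss via power iteration on Hessian-vector products, then runs gradient descent with the classical step size 1/L. The two-layer network, the data, and all names here are illustrative assumptions.

    # A minimal sketch, assuming a small toy regression network: estimate a
    # local smoothness constant L of the loss (spectral norm of its Hessian
    # at the current parameters) by power iteration on Hessian-vector
    # products, then run gradient descent with step size 1/L.
    import torch

    torch.manual_seed(0)

    # Hypothetical toy data and network, stand-ins for a real training setup.
    X = torch.randn(64, 10)
    y = torch.randn(64, 1)
    model = torch.nn.Sequential(
        torch.nn.Linear(10, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1)
    )
    params = list(model.parameters())

    def loss_fn():
        return torch.nn.functional.mse_loss(model(X), y)

    def hvp(vecs):
        """Hessian-vector product of the loss at the current parameters."""
        grads = torch.autograd.grad(loss_fn(), params, create_graph=True)
        dot = sum((g * v).sum() for g, v in zip(grads, vecs))
        return torch.autograd.grad(dot, params)

    def estimate_smoothness(iters=50):
        """Power iteration: the largest-magnitude Hessian eigenvalue gives
        a local estimate of the smoothness constant L."""
        vecs = [torch.randn_like(p) for p in params]
        for _ in range(iters):
            hv = hvp(vecs)
            norm = torch.sqrt(sum((h ** 2).sum() for h in hv))
            vecs = [h / norm for h in hv]
        hv = hvp(vecs)
        # Rayleigh quotient v^T H v with the normalized iterate v.
        return abs(sum((h * v).sum() for h, v in zip(hv, vecs)).item())

    L = estimate_smoothness()
    step = 1.0 / L  # classical step size for an L-smooth objective
    for _ in range(100):
        grads = torch.autograd.grad(loss_fn(), params)
        with torch.no_grad():
            for p, g in zip(params, grads):
                p -= step * g
    print(f"estimated L = {L:.3f}, final loss = {loss_fn().item():.4f}")

    The point of the sketch is only that the step size, and hence the convergence behavior of plain gradient descent, is driven by the smoothness constant; the paper's contribution is computing such estimates systematically from the structure of the network's optimization oracles rather than numerically as above.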