
    Synchronization and Noise: A Mechanism for Regularization in Neural Systems

    To learn and reason in the presence of uncertainty, the brain must be capable of imposing some form of regularization. Here we suggest, through theoretical and computational arguments, that the combination of noise with synchronization provides a plausible mechanism for regularization in the nervous system. The functional role of regularization is considered in a general context in which coupled computational systems receive inputs corrupted by correlated noise. Noise on the inputs is shown to impose regularization, and when synchronization upstream induces time-varying correlations across noise variables, the degree of regularization can be calibrated over time. The proposed mechanism is explored first in the context of a simple associative learning problem, and then in the context of a hierarchical sensory coding task. The resulting qualitative behavior coincides with experimental data from visual cortex. Comment: 32 pages, 7 figures; under review.
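The abstract's claim that noise on the inputs imposes regularization can be illustrated with a standard result (not necessarily the paper's exact setting): for linear least squares, corrupting the inputs with i.i.d. Gaussian noise is, in expectation, equivalent to ridge (L2) regularization with a penalty proportional to the noise variance. A minimal sketch, with all data sizes and parameters invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear regression problem: y = X w_true + small output noise.
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

sigma = 0.5  # std of the corrupting input noise

# Average the least-squares normal equations over many noisy input copies.
# E[(X+E)^T (X+E)] = X^T X + n * sigma^2 * I, so input noise acts like ridge.
XtX_noisy = np.zeros((d, d))
Xty_noisy = np.zeros(d)
K = 2000
for _ in range(K):
    Xn = X + sigma * rng.normal(size=X.shape)
    XtX_noisy += Xn.T @ Xn / K
    Xty_noisy += Xn.T @ y / K
w_noise = np.linalg.solve(XtX_noisy, Xty_noisy)

# Ridge solution with the matching penalty n * sigma^2.
w_ridge = np.linalg.solve(X.T @ X + n * sigma**2 * np.eye(d), X.T @ y)

print(np.max(np.abs(w_noise - w_ridge)))  # small: the two nearly coincide
```

Here the noise level sigma plays the role of a regularization knob, which loosely mirrors the abstract's point that modulating the noise (e.g., via upstream synchronization) calibrates the degree of regularization over time.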

    Stationary probability distributions of stochastic gradient descent and the success and failure of the diffusion approximation

    In this thesis, Stochastic Gradient Descent (SGD), an optimization method originally popular for its computational efficiency, is analyzed using Markov chain methods. We compute, numerically and in some cases analytically, the stationary probability distributions (invariant measures) of the SGD Markov operator over all step sizes (learning rates). These stationary distributions reveal how the long-time behavior of SGD samples around the objective function minimum. A key focus of this thesis is a systematic one-dimensional comparison of the exact SGD stationary distributions with the Fokker-Planck diffusion approximations commonly used in the literature to characterize the SGD probability distribution in the limit of small step sizes. While various error estimates for the diffusion approximation have recently been established, they typically hold only in a weak sense, not in the strong maximum norm. Our study shows that the diffusion approximation converges to the true stationary distribution at a slow rate in the maximum norm. Beyond these quantitative errors, the exact SGD probability distribution exhibits fundamentally different behavior from the diffusion approximation: it can have compact or singular support, and non-convex objective functions can admit multiple invariant measures where the diffusion approximation has only one. Finally, we use the Markov operator to establish additional results: (1) for quadratic objective functions, the SGD expected value equals the objective function minimum for any step size, with the practical implication that time-averaged SGD solutions converge to the minimum even when the SGD iterates never reach it; (2) we provide a simple approach to formally derive Fokker-Planck diffusion approximations using only basic calculus (e.g., integration by parts and Taylor expansions), which may be of interest to the engineering community; and (3) we observe that the stationary distributions of the Markov operator lead to additional Fokker-Planck equations with simpler diffusion coefficients than those currently in the literature.
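Result (1), that for quadratic objectives the time average of SGD lands at the minimum even though individual iterates keep fluctuating, is easy to check numerically. A minimal one-dimensional sketch, with an invented objective and step size (not taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(1)

# Quadratic objective f(w) = (1/2n) * sum_i (w - a_i)^2, minimized at mean(a).
a = rng.normal(loc=3.0, size=1000)
w_star = a.mean()

eta = 0.5  # deliberately large constant step size: iterates keep oscillating
w = 0.0
iterates = []
for _ in range(20000):
    i = rng.integers(len(a))
    w -= eta * (w - a[i])  # stochastic gradient of the sampled term
    iterates.append(w)

burn = np.array(iterates)[1000:]  # discard the transient
print(burn.std())                 # O(1) fluctuations: single iterates never settle
print(abs(burn.mean() - w_star))  # yet the time average sits at the minimum
```

With this step size the per-iterate standard deviation stays order one, so no single iterate reaches the minimum, yet the running average is close to it, matching the abstract's claim about time-averaged SGD solutions.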