    Low-discrepancy sequences for piecewise smooth functions on the two-dimensional torus

    We produce explicit low-discrepancy infinite sequences which can be used to approximate the integral of a smooth periodic function restricted to a convex domain with positive curvature in R^2. The proof depends on simultaneous diophantine approximation and a general version of the Erdős-Turán inequality. Comment: 14 pages, 2 figures
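    As a rough illustration of how a low-discrepancy (equidistributed) sequence can approximate such an integral, the sketch below performs quasi-Monte Carlo integration on the unit torus with a generic Kronecker sequence x_n = ({n*a1}, {n*a2}). The slopes, the disc domain, and the integrand are placeholder choices for demonstration only, not the explicit sequences constructed in the paper.

    import numpy as np

    # Generic Kronecker sequence on the 2-torus; the slopes below are a common
    # textbook choice and are NOT the paper's explicit construction.
    alphas = np.array([np.sqrt(2.0), np.sqrt(3.0)])

    def kronecker(n_points):
        n = np.arange(1, n_points + 1)[:, None]
        return (n * alphas) % 1.0          # points in [0, 1)^2

    def f(x, y):
        # a smooth periodic integrand (placeholder)
        return 1.0 + 0.5 * np.cos(2 * np.pi * x) * np.sin(2 * np.pi * y)

    def in_disc(x, y, c=(0.5, 0.5), r=0.3):
        # convex domain with positive curvature: a disc inside the unit square
        return (x - c[0]) ** 2 + (y - c[1]) ** 2 <= r ** 2

    N = 200_000
    pts = kronecker(N)
    mask = in_disc(pts[:, 0], pts[:, 1])
    # average of f * indicator over the sequence approximates the integral of f over the disc
    qmc_estimate = f(pts[:, 0], pts[:, 1])[mask].sum() / N
    print("QMC estimate of the integral over the disc:", qmc_estimate)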

    Structured chaos shapes spike-response noise entropy in balanced neural networks

    Large networks of sparsely coupled excitatory and inhibitory cells occur throughout the brain. A striking feature of these networks is that they are chaotic. How does this chaos manifest in the neural code? Specifically, how variable are the spike patterns that such a network produces in response to an input signal? To answer this, we derive a bound for the entropy of multi-cell spike pattern distributions in large recurrent networks of spiking neurons responding to fluctuating inputs. The analysis is based on results from random dynamical systems theory and is complemented by detailed numerical simulations. We find that the spike pattern entropy is an order of magnitude lower than what would be extrapolated from single cells. This holds despite the fact that network coupling becomes vanishingly sparse as network size grows -- a phenomenon that depends on "extensive chaos," as previously discovered for balanced networks without stimulus drive. Moreover, we show how spike pattern entropy is controlled by temporal features of the inputs. Our findings provide insight into how neural networks may encode stimuli in the presence of inherently chaotic dynamics. Comment: 9 pages, 5 figures
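    As a self-contained, hedged illustration of the comparison described above (not the paper's bound or its network simulations), the sketch below uses synthetic correlated spike rasters and a naive plug-in estimator to show how multi-cell spike-word entropy can fall well below the entropy extrapolated from single cells. The correlation model, cell count, and trial count are arbitrary placeholder choices.

    import numpy as np
    from collections import Counter

    rng = np.random.default_rng(0)
    n_cells, n_trials = 8, 50_000

    # Synthetic stand-in for repeated network responses to a frozen input:
    # a shared fluctuation induces correlations across cells (placeholder model).
    shared = rng.normal(size=n_trials)
    rasters = (rng.normal(size=(n_trials, n_cells)) + 1.5 * shared[:, None] > 1.0).astype(int)

    def entropy_bits(samples):
        # plug-in (maximum-likelihood) entropy estimate in bits
        counts = np.array(list(Counter(samples).values()), dtype=float)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()

    joint = entropy_bits(map(tuple, rasters))                      # multi-cell spike-word entropy
    independent = sum(entropy_bits(rasters[:, i]) for i in range(n_cells))

    print(f"joint entropy:       {joint:.2f} bits")
    print(f"sum of single cells: {independent:.2f} bits")  # larger: correlations reduce joint entropy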