
    On Dynamics of Integrate-and-Fire Neural Networks with Conductance Based Synapses

    We present a mathematical analysis of a network of Integrate-and-Fire neurons with adaptive conductances. Taking into account the realistic fact that the spike time is only known within some finite precision, we propose a model where spikes are effective at times that are multiples of a characteristic time scale δ, where δ can be arbitrarily small (in particular, well beyond the numerical precision). We give a complete mathematical characterization of the model dynamics and obtain the following results. The asymptotic dynamics is composed of finitely many stable periodic orbits, whose number and period can be arbitrarily large and can diverge in a region of the synaptic weight space, traditionally called the "edge of chaos", a notion mathematically well defined in the present paper. Furthermore, except at the edge of chaos, there is a one-to-one correspondence between the membrane potential trajectories and the raster plot. This shows that the neural code is entirely "in the spikes" in this case. As a key tool, we introduce an order parameter, easy to compute numerically and closely related to a natural notion of entropy, providing a relevant characterization of the computational capabilities of the network. This allows us to compare the computational capabilities of leaky Integrate-and-Fire models and conductance-based models. The present study considers networks with constant input and without time-dependent plasticity, but the framework has been designed for both extensions. Comment: 36 pages, 9 figures
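    The core modeling idea above — spikes that become effective only at times that are multiples of a grid step δ — can be illustrated with a minimal sketch. This is not the paper's conductance-based model: it assumes a simple Euler-discretized leaky integrate-and-fire update, and all parameter names and values here are illustrative.

```python
def simulate_lif(weights, inputs, delta=1.0, dt=0.1, t_max=100.0,
                 leak=0.95, threshold=1.0):
    """Leaky integrate-and-fire network in which emitted spikes are
    quantized to the grid {0, delta, 2*delta, ...}: a spike fired
    between grid points only becomes effective at the next grid point.
    Hypothetical sketch of the discretized-spike-time idea, not the
    conductance-based model of the paper."""
    n = len(inputs)
    sub = int(round(delta / dt))      # integration steps per grid interval
    v = [0.0] * n                     # membrane potentials
    pending = {}                      # grid index -> spiking neuron indices
    spikes = []                       # (quantized spike time, neuron index)
    for k in range(int(round(t_max / dt))):
        if k % sub == 0:              # a grid point: deliver pending spikes
            for j in pending.pop(k // sub, []):
                for i in range(n):
                    v[i] += weights[i][j]
        for i in range(n):
            v[i] = leak * v[i] + dt * inputs[i]   # leaky integration
            if v[i] >= threshold:
                v[i] = 0.0            # reset after firing
                g = k // sub + 1      # effective at the next grid point
                pending.setdefault(g, []).append(i)
                spikes.append((g * delta, i))
    return spikes
```

    By construction every recorded spike time is a multiple of δ, however small δ is relative to dt, which is the sense in which the raster plot lives on the δ-grid.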

    Sensitivity analysis of oscillator models in the space of phase-response curves: Oscillators as open systems

    Oscillator models are central to the study of system properties such as entrainment or synchronization. Due to their nonlinear nature, few system-theoretic tools exist to analyze those models. The paper develops a sensitivity analysis for phase-response curves, a fundamental one-dimensional phase reduction of oscillator models. The proposed theoretical and numerical analysis tools are illustrated on several system-theoretic questions and models arising in the biology of cellular rhythms.
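    A phase-response curve can be computed numerically for any spiking oscillator by perturbing it at different phases and measuring the resulting advance of the next spike. The sketch below does this for the simplest case I could make self-contained — a leaky integrate-and-fire oscillator with constant drive, whose exact PRC is known — and is only an illustration of the PRC concept, not of the paper's sensitivity analysis.

```python
import math

def spike_time(I=2.0, dt=1e-5, eps=0.0, t_kick=0.0):
    """First threshold-crossing time of dv/dt = -v + I (threshold 1,
    start at 0), with a small voltage impulse eps applied at t_kick."""
    v, t = 0.0, 0.0
    kicked = (eps == 0.0)
    while v < 1.0:
        if not kicked and t >= t_kick:
            v += eps                 # the perturbation
            kicked = True
        v += dt * (I - v)            # Euler integration
        t += dt
    return t

def prc(n_phases=20, eps=1e-3, I=2.0):
    """Numerical phase-response curve: spike-time advance per unit
    impulse, as a function of the phase at which the impulse arrives."""
    T = spike_time(I)                # unperturbed period
    curve = []
    for k in range(n_phases):
        phase = k / n_phases
        Tk = spike_time(I, eps=eps, t_kick=phase * T)
        curve.append((phase, (T - Tk) / eps))
    return T, curve
```

    For this model the exact PRC is e^(φT)/2, so the curve should start near 0.5 and rise monotonically toward 1; the finite dt quantizes the estimate slightly.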

    An information theoretic approach to the functional classification of neurons

    A population of neurons typically exhibits a broad diversity of responses to sensory inputs. The intuitive notion of functional classification is that cells can be clustered so that most of the diversity is captured by the identity of the clusters rather than by individuals within clusters. We show how this intuition can be made precise using information theory, without any need to introduce a metric on the space of stimuli or responses. Applied to the retinal ganglion cells of the salamander, this approach recovers classical results but also provides clear evidence for subclasses beyond those identified previously. Further, we find that each of the ganglion cells is functionally unique, and that even within the same subclass only a few spikes are needed to reliably distinguish between cells. Comment: 13 pages, 4 figures. To appear in Advances in Neural Information Processing Systems (NIPS) 1
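    The quantity behind this intuition can be sketched directly: how much of the mutual information between cell identity and response survives when cells are replaced by their cluster labels. This is a plug-in, toy-scale illustration of the general idea, assuming equiprobable cells and known conditional response distributions; it is not the paper's estimator or clustering algorithm.

```python
import math
from collections import defaultdict

def mutual_info(joint):
    """Plug-in mutual information (bits) from a joint distribution
    given as {(x, y): p}."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def captured_fraction(cell_dists, clusters):
    """Fraction of the cell-identity information retained by a
    clustering: I(cluster; response) / I(cell; response).
    cell_dists: {cell: {response: p(r|cell)}}, cells equiprobable;
    clusters: {cell: cluster label}."""
    n = len(cell_dists)
    joint_cell = {(c, r): p / n
                  for c, d in cell_dists.items() for r, p in d.items()}
    joint_clu = defaultdict(float)
    for c, d in cell_dists.items():
        for r, p in d.items():
            joint_clu[(clusters[c], r)] += p / n
    return mutual_info(dict(joint_clu)) / mutual_info(joint_cell)
```

    A clustering that groups functionally identical cells captures the full information (fraction 1), while one that mixes distinct response types loses it — the sense in which "most of the diversity is captured by the identity of the clusters".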

    EEG-fMRI Based Information Theoretic Characterization of the Human Perceptual Decision System

    The modern metaphor of the brain is that of a dynamic information processing device. In the current study we investigate how a core cognitive network of the human brain, the perceptual decision system, can be characterized regarding its spatiotemporal representation of task-relevant information. We capitalize on a recently developed information theoretic framework for the analysis of simultaneously acquired electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) data (Ostwald et al. (2010), NeuroImage 49: 498–516). We show how this framework naturally extends from previous validations in the sensory to the cognitive domain and how it enables an economical description of neural spatiotemporal information encoding. Specifically, based on simultaneous EEG-fMRI data features from n = 13 observers performing a visual perceptual decision task, we demonstrate how the information theoretic framework is able to reproduce earlier findings on the neurobiological underpinnings of perceptual decisions from the response signal features' marginal distributions. Furthermore, using the joint EEG-fMRI feature distribution, we provide novel evidence for a highly distributed and dynamic encoding of task-relevant information in the human brain.
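    The distinction between marginal and joint feature distributions matters because a pair of features can jointly carry task information that neither marginal reveals. The toy sketch below makes that point with an XOR-style example; the "EEG" and "fMRI" feature names are purely illustrative labels, not the study's actual features or estimator.

```python
import math
from collections import Counter

def info_bits(pairs):
    """Plug-in mutual information (bits) between the two entries of
    each (feature, label) pair, with samples taken as equiprobable."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# XOR-style toy trials: a binary "EEG" feature e, a binary "fMRI"
# feature f, and a task label t = e XOR f.  Neither marginal feature
# alone carries information about the label; their joint distribution
# carries it all.
trials = [(e, f, e ^ f) for e in (0, 1) for f in (0, 1)]
eeg_only  = info_bits([(e, t) for e, f, t in trials])       # 0 bits
fmri_only = info_bits([(f, t) for e, f, t in trials])       # 0 bits
joint     = info_bits([((e, f), t) for e, f, t in trials])  # 1 bit
```

    This is the generic reason an analysis of the joint EEG-fMRI feature distribution can expose encoding that marginal analyses miss.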

    Detecting and Estimating Signals in Noisy Cable Structures, II: Information Theoretical Analysis

    This is the second in a series of articles that seek to recast classical single-neuron biophysics in information-theoretical terms. Classical cable theory focuses on analyzing the voltage or current attenuation of a synaptic signal as it propagates from its dendritic input location to the spike initiation zone. On the other hand, we are interested in analyzing the amount of information lost about the signal in this process due to the presence of various noise sources distributed throughout the neuronal membrane. We use a stochastic version of the linear one-dimensional cable equation to derive closed-form expressions for the second-order moments of the fluctuations of the membrane potential associated with different membrane current noise sources: thermal noise, noise due to the random opening and closing of sodium and potassium channels, and noise due to the presence of “spontaneous” synaptic input. We consider two different scenarios. In the signal estimation paradigm, the time course of the membrane potential at a location on the cable is used to reconstruct the detailed time course of a random, band-limited current injected some distance away. Estimation performance is characterized in terms of the coding fraction and the mutual information. In the signal detection paradigm, the membrane potential is used to determine whether a distant synaptic event occurred within a given observation interval. In light of our analytical results, we speculate that the length of weakly active apical dendrites might be limited by the information loss due to the accumulated noise between distal synaptic input sites and the soma and that the presence of dendritic nonlinearities probably serves to increase dendritic information transfer.
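    The two performance measures of the signal estimation paradigm — coding fraction and mutual information — can be illustrated in the simplest possible setting: a scalar Gaussian channel in place of the full cable, with the coding fraction taken as one minus the ratio of root-mean-square reconstruction error to signal standard deviation. This is a hedged stand-in for the paper's cable-equation analysis, not a reproduction of it.

```python
import math
import random

def estimation_metrics(signal_var=1.0, noise_var=0.5, n=200_000, seed=0):
    """Toy signal-estimation paradigm: the measurement is y = x + noise,
    both Gaussian.  The optimal linear (Wiener) estimate of x yields a
    coding fraction 1 - rms_error/signal_sd, and the Gaussian-channel
    mutual information is 0.5 * log2(1 + SNR)."""
    rng = random.Random(seed)
    gain = signal_var / (signal_var + noise_var)   # Wiener gain
    mse = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, math.sqrt(signal_var))         # injected signal
        y = x + rng.gauss(0.0, math.sqrt(noise_var))      # noisy potential
        mse += (gain * y - x) ** 2                        # estimation error
    mse /= n
    coding_fraction = 1.0 - math.sqrt(mse / signal_var)
    info_bits = 0.5 * math.log2(1.0 + signal_var / noise_var)
    return coding_fraction, info_bits
```

    More accumulated noise (a larger noise_var, as for more distal inputs) drives the coding fraction toward zero and the mutual information toward zero bits, which is the quantitative sense of "information loss" in the abstract.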