
    Efficiency characterization of a large neuronal network: a causal information approach

    When inhibitory neurons constitute about 40% of a population, they could play an important antinociceptive role, as they would easily regulate the activity level of the other neurons. We consider a simple network of cortical spiking neurons with axonal conduction delays and spike-timing-dependent plasticity, representative of a cortical column or hypercolumn with a large proportion of inhibitory neurons. Each neuron fires following Hodgkin-Huxley-like dynamics and is randomly interconnected with the other neurons. The network dynamics is investigated by estimating the Bandt and Pompe probability distribution function associated with the interspike intervals, for different degrees of inter-connectivity across neurons. More specifically, we take into account the fine temporal "structure" of the complex neuronal signals not merely through the probability distributions associated with the interspike intervals, but through more subtle measures that account for their causal information: the Shannon permutation entropy, the Fisher permutation information, and the permutation statistical complexity. This allows us to investigate how the information in the system might saturate to a finite value as the degree of inter-connectivity across neurons grows, and to infer the emergent dynamical properties of the system.
    Comment: 26 pages, 3 figures; Physica A, in press
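The Bandt-Pompe construction underlying these permutation measures replaces each short window of a series with the ordinal pattern of its values and takes the Shannon entropy of the resulting pattern distribution. A minimal sketch (function and parameter names are ours, not from the paper):

```python
import math

def permutation_entropy(series, order=3):
    """Normalized permutation entropy a la Bandt and Pompe: slide a window
    of length `order` over the series, map each window to the ordinal
    pattern of its values, and take the Shannon entropy of the pattern
    distribution, normalized by log(order!)."""
    counts = {}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # Ordinal pattern: the index order that sorts the window's values
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))
```

A strictly monotone series contains a single ordinal pattern and scores 0, while an i.i.d. random series approaches 1; applied to the sequence of interspike intervals, this yields the permutation entropy the abstract refers to.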

    Multiscale relevance and informative encoding in neuronal spike trains

    Neuronal responses to complex stimuli and tasks can span a wide range of time scales. Understanding these responses requires measures that characterize how the information in these response patterns is represented across multiple temporal resolutions. In this paper we propose a metric -- which we call multiscale relevance (MSR) -- to capture the dynamical variability of the activity of single neurons across different time scales. The MSR is a non-parametric, fully featureless indicator in that it uses only the time stamps of the firing activity, without resorting to any a priori covariate or invoking any specific structure in the tuning curve for neural activity. When applied to neural data from the medial entorhinal cortex (mEC) and from the anterodorsal thalamic nucleus (ADn) and postsubiculum (PoS) of freely behaving rodents, we found that neurons with low MSR tend to have low mutual information and low firing sparsity across the correlates believed to be encoded by the recorded brain region. In addition, neurons with high MSR carry significant information on spatial navigation: they allow decoding of spatial position or head direction as efficiently as neurons whose firing activity has high mutual information with the covariate to be decoded, and significantly better than the set of neurons with high local variation in their interspike intervals. Given these results, we propose that the MSR can be used to rank and select neurons for their information content without appealing to any a priori covariate.
    Comment: 38 pages, 16 figures
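As a hedged sketch of the resolution/relevance construction behind the MSR, assuming relevance and resolution are taken as entropies of the spike-count partition at each bin width (the paper's exact definitions and normalizations may differ, and all names here are ours):

```python
import math
from collections import Counter

def resolution_relevance(spike_times, bin_width, t_total):
    """At one time resolution, return (H[s], H[k]): the entropy of the
    spikes' distribution over bins (resolution) and the entropy of the
    distribution over spike-count values (relevance)."""
    n_bins = max(1, math.ceil(t_total / bin_width))
    counts = Counter(min(int(t / bin_width), n_bins - 1) for t in spike_times)
    M = len(spike_times)
    # Resolution: how finely the spikes are partitioned among the bins.
    h_s = -sum((k / M) * math.log(k / M) for k in counts.values())
    # Relevance: entropy over count values k, each weighted by k * m_k / M,
    # where m_k is the number of bins containing exactly k spikes.
    mk = Counter(counts.values())
    h_k = -sum((k * m / M) * math.log(k * m / M) for k, m in mk.items())
    return h_s, h_k

def multiscale_relevance(spike_times, t_total, n_scales=50):
    """Area under the relevance-vs-resolution curve over a sweep of bin
    widths, using only the spike time stamps (no covariates)."""
    widths = [t_total / 2.0 ** (i / 4.0) for i in range(n_scales)]
    pts = sorted(resolution_relevance(spike_times, w, t_total) for w in widths)
    # Trapezoidal integration of relevance against resolution
    return sum(0.5 * (y0 + y1) * (x1 - x0)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:]))
```

Note the "featureless" property the abstract emphasizes: nothing above references position, head direction, or any other covariate -- only the spike time stamps enter.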

    Models wagging the dog: are circuits constructed with disparate parameters?

    In a recent article, Prinz, Bucher, and Marder (2004) used a database modeling approach to address the fundamental question of whether neural systems are built from a fixed blueprint of tightly controlled parameters, or in a way that allows properties to vary widely from one individual to another. Here, we examine their main conclusion -- that neural circuits are indeed built with widely varying parameters -- in the light of our own experimental and modeling observations. We critically discuss the experimental and theoretical evidence, including the general adequacy of database approaches for questions of this kind, and conclude that the last word on this fundamental question has not yet been spoken.

    The what and where of adding channel noise to the Hodgkin-Huxley equations

    One of the most celebrated successes in computational biology is the Hodgkin-Huxley framework for modeling electrically active cells. This framework, expressed through a set of differential equations, synthesizes the impact of ionic currents on a cell's voltage -- and the highly nonlinear impact of that voltage back on the currents themselves -- into the rapid push and pull of the action potential. Later studies confirmed that these cellular dynamics are orchestrated by individual ion channels, whose conformational changes regulate the conductance of each ionic current. Thus, kinetic equations familiar from physical chemistry are the natural setting for describing conductances; for small-to-moderate numbers of channels, these predict fluctuations in conductances and stochasticity in the resulting action potentials. At first glance, the kinetic equations provide a far more complex (and higher-dimensional) description than the original Hodgkin-Huxley equations. This has prompted more than a decade of efforts to capture channel fluctuations with noise terms added to the Hodgkin-Huxley equations. Many of these approaches, while intuitively appealing, produce quantitative errors when compared to kinetic equations; others, as only very recently demonstrated, are both accurate and relatively simple. We review what works, what doesn't, and why, seeking to build a bridge to well-established results for the deterministic Hodgkin-Huxley equations. As such, we hope that this review will speed emerging studies of how channel noise modulates electrophysiological dynamics and function. We supply user-friendly Matlab simulation code of these stochastic versions of the Hodgkin-Huxley equations on the ModelDB website (accession number 138950) and at http://www.amath.washington.edu/~etsb/tutorials.html
    Comment: 14 pages, 3 figures; review article
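One intuitively appealing scheme of the kind the review discusses adds Gaussian "subunit noise" to the gating-variable equations, with variance scaled by the channel count (Fox-Lu style). A minimal Euler-Maruyama sketch under standard squid-axon parameters (the channel numbers N_Na, N_K here are illustrative, not from the paper):

```python
import math
import random

# Hodgkin-Huxley rate functions (standard squid-axon parameterization);
# the removable singularities at V = -40 and V = -55 mV are guarded.
def a_m(V):
    x = (V + 40.0) / 10.0
    return 1.0 if abs(x) < 1e-9 else x / (1.0 - math.exp(-x))

def b_m(V): return 4.0 * math.exp(-(V + 65.0) / 18.0)
def a_h(V): return 0.07 * math.exp(-(V + 65.0) / 20.0)
def b_h(V): return 1.0 / (1.0 + math.exp(-(V + 35.0) / 10.0))

def a_n(V):
    x = (V + 55.0) / 10.0
    return 0.1 if abs(x) < 1e-9 else 0.1 * x / (1.0 - math.exp(-x))

def b_n(V): return 0.125 * math.exp(-(V + 65.0) / 80.0)

def noisy_gate(x, a, b, N, dt, rng):
    """One Euler-Maruyama step of a gating variable with subunit noise:
    the Gaussian term's variance is (opening + closing flux) / N."""
    drift = a * (1.0 - x) - b * x
    sigma = math.sqrt(dt * (a * (1.0 - x) + b * x) / N)
    x += dt * drift + sigma * rng.gauss(0.0, 1.0)
    return min(1.0, max(0.0, x))  # noise can push x outside [0, 1]

def simulate(T=100.0, dt=0.01, I_app=10.0, N_Na=6000, N_K=1800, seed=1):
    """Membrane-potential trace (mV) of a stochastic HH point neuron."""
    rng = random.Random(seed)
    C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3
    ENa, EK, EL = 50.0, -77.0, -54.4
    V, m, h, n = -65.0, 0.05, 0.6, 0.32
    trace = []
    for _ in range(int(T / dt)):
        m = noisy_gate(m, a_m(V), b_m(V), N_Na, dt, rng)
        h = noisy_gate(h, a_h(V), b_h(V), N_Na, dt, rng)
        n = noisy_gate(n, a_n(V), b_n(V), N_K, dt, rng)
        I_ion = (gNa * m**3 * h * (V - ENa)
                 + gK * n**4 * (V - EK)
                 + gL * (V - EL))
        V += dt * (I_app - I_ion) / C
        trace.append(V)
    return trace
```

The review's point is precisely that approximations of this shape can deviate quantitatively from exact channel-level Markov-chain simulations; the sketch shows the structure of the approach, not the scheme the authors ultimately recommend (their Matlab code is on ModelDB, accession number 138950).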