54 research outputs found

    Estimation of synaptic conductances in presence of nonlinear effects caused by subthreshold ionic currents

    Subthreshold fluctuations in neuronal membrane potential traces contain nonlinear components, and employing nonlinear models can improve statistical inference. We propose a new strategy to estimate synaptic conductances, which has been tested on in silico data and applied to in vivo recordings. The model is constructed to capture the nonlinearities caused by subthreshold-activated currents, and the estimation procedure can discern between excitatory and inhibitory conductances using only one membrane potential trace. More precisely, we perform second-order approximations of biophysical models to capture the subthreshold nonlinearities, resulting in quadratic integrate-and-fire models, and apply approximate maximum likelihood estimation under the sole assumption that the conductances are stationary within a 50–100 ms time window. The results show an improvement over existing procedures for the models tested here.
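    As a point of reference for the class of model described above, the sketch below simulates a quadratic integrate-and-fire neuron driven by fixed excitatory and inhibitory conductances over a single estimation window. It is an illustrative stand-in, not the paper's model or estimator: all parameter values, the Euler scheme, and the window length are assumptions.

```python
import numpy as np

# Illustrative quadratic integrate-and-fire (QIF) model with excitatory and
# inhibitory synaptic conductances, in the spirit of the second-order
# approximation described in the abstract. All parameter values and the
# estimation window length are placeholders, not the paper's.

C, k = 1.0, 0.04            # capacitance and quadratic coefficient (arbitrary units)
V_r, V_t = -65.0, -50.0     # resting and threshold-like voltages (mV)
V_peak, V_reset = 30.0, -65.0
E_e, E_i = 0.0, -80.0       # synaptic reversal potentials (mV)
dt = 0.1                    # time step (ms)

def simulate_qif(g_e, g_i, T=100.0, I=0.0, V0=-65.0):
    """Forward-Euler simulation with conductances held fixed over the window,
    mirroring the assumption that g_e, g_i are stationary for 50-100 ms."""
    n = int(T / dt)
    V = np.empty(n)
    v = V0
    for t in range(n):
        dv = (k * (v - V_r) * (v - V_t)
              + g_e * (E_e - v) + g_i * (E_i - v) + I) / C
        v += dt * dv
        if v >= V_peak:        # spike-and-reset mechanism
            v = V_reset
        V[t] = v
    return V

# Example: a subthreshold trace for one 100 ms window; an approximate maximum
# likelihood estimator would then fit (g_e, g_i) to such a trace plus noise.
trace = simulate_qif(g_e=0.01, g_i=0.03)
```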

    Nonlinear estimation of synaptic conductances via piecewise linear systems

    We use the piecewise linear McKean model to present a proof of concept for estimating synaptic conductances while a neuron is spiking. Using standard techniques from non-smooth dynamical systems, we obtain an approximation of the period in terms of the parameters of the system, which allows us to estimate the steady synaptic conductance of the spiking neuron. The method also gives fairly good estimates when the synaptic conductances vary slowly in time.
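    Schematically, the non-smooth-systems argument behind such a period approximation is to solve the flow exactly in each linear zone of the piecewise linear vector field and add up the times of flight; the notation below (A_k, b_k, t_k, g_syn) is ours, not the paper's.

```latex
% In the k-th linear zone the piecewise linear dynamics reduce to an affine system
% \dot{\mathbf{x}} = A_k \mathbf{x} + \mathbf{b}_k(g_{\mathrm{syn}}), whose flow is explicit:
\[
  \mathbf{x}(t) = e^{A_k t}\,\mathbf{x}_0 + \int_0^{t} e^{A_k (t-s)}\,\mathbf{b}_k\,\mathrm{d}s .
\]
% The time of flight t_k in each zone follows from the crossing condition with the next
% switching boundary, and the period of the spiking orbit is approximated by
\[
  T(g_{\mathrm{syn}}) \approx \sum_k t_k(g_{\mathrm{syn}}),
\]
% a relation that can then be inverted to estimate the steady synaptic conductance.
```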

    Sequential estimation of intrinsic activity and synaptic input in single neurons by particle filtering with optimal importance density

    This paper deals with the problem of inferring the signals and parameters that cause neural activity to occur. While the ultimate challenge is to unveil the brain's connectivity, here we focus on a microscopic view of the problem, where single neurons (potentially connected to a network of peers) are at the core of our study. The sole observations available are noisy, sampled voltage traces obtained from intracellular recordings. We design algorithms and inference methods using the tools provided by stochastic filtering, which allow a probabilistic interpretation and treatment of the problem. Using particle filtering, we are able to reconstruct voltage traces and estimate the time course of auxiliary variables. By extending the algorithm through particle Markov chain Monte Carlo (PMCMC) methodology, we are also able to estimate hidden physiological parameters, such as intrinsic conductances or reversal potentials. Last, but not least, the method is applied to estimate synaptic conductances arriving at a target cell, thus reconstructing the excitatory/inhibitory synaptic input traces. Notably, the performance of these estimations achieves the theoretical lower bounds even in spiking regimes.
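    A minimal sketch of the filtering idea is given below, assuming a toy leaky-integrator state model, Gaussian observation noise, and the simple bootstrap (prior) proposal rather than the optimal importance density or the PMCMC extension described in the abstract; every name and parameter here is illustrative.

```python
import numpy as np

# Bootstrap particle filter sketch for tracking a hidden voltage-like state
# from noisy samples. All model choices and parameters are assumptions made
# for illustration only.

rng = np.random.default_rng(0)
dt, tau = 0.1, 10.0          # time step (ms), membrane time constant (ms)
sig_x, sig_y = 0.5, 1.0      # process and observation noise std
N = 500                      # number of particles

def step(x, u):
    """One Euler step of a toy leaky integrator driven by input u."""
    return x + dt * (-x / tau + u) + np.sqrt(dt) * sig_x * rng.standard_normal(x.shape)

def particle_filter(y, u):
    """Return the filtered posterior mean of the hidden state at each sample."""
    x = np.zeros(N)                              # particles
    est = np.empty(len(y))
    for t, obs in enumerate(y):
        x = step(x, u[t])                        # propagate with the prior proposal
        logw = -0.5 * ((obs - x) / sig_y) ** 2   # Gaussian observation weights
        w = np.exp(logw - logw.max()); w /= w.sum()
        est[t] = np.dot(w, x)
        idx = rng.choice(N, size=N, p=w)         # multinomial resampling
        x = x[idx]
    return est

# Synthetic example: a known input current and noisy voltage observations.
T = 200
u = np.ones(T) * 1.5
x_true = np.zeros(T); xv = 0.0
for t in range(T):
    xv = step(np.array([xv]), u[t])[0]
    x_true[t] = xv
y = x_true + sig_y * rng.standard_normal(T)
x_hat = particle_filter(y, u)
```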

    Estimation of synaptic conductance in the spiking regime for the McKean neuron model

    In this work, we aim to give a first proof of concept for estimating synaptic conductances when a neuron is spiking, a complex nonlinear inverse problem that remains an open challenge in neuroscience. Our approach is based on a simplified model of neuronal activity, namely a piecewise linear version of the FitzHugh-Nagumo model. This simplified model allows precise knowledge of the nonlinear f-I curve through standard techniques of nonsmooth dynamical systems. In the regular firing regime of the neuron model, we obtain an approximation of the period that improves previous approximations given in the literature to date. By knowing both this expression of the period and the current applied to the neuron, and then solving an inverse problem with a unique solution, we are able to estimate the steady synaptic conductance of the cell's oscillatory activity. Moreover, the method also gives good estimates when the synaptic conductance varies slowly in time.
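    The following sketch illustrates the inverse-problem step only, under assumptions of ours: a leaky integrate-and-fire neuron stands in for the McKean model because its period-conductance relation is available in closed form, and the observed period is inverted for the steady conductance by root finding.

```python
import numpy as np
from scipy.optimize import brentq

# Toy illustration of recovering a steady synaptic conductance from the observed
# firing period. A leaky integrate-and-fire neuron replaces the McKean model for
# simplicity; all parameters are illustrative.

C, g_L, E_L = 1.0, 0.1, -65.0       # capacitance, leak conductance, leak reversal
E_s = 0.0                           # excitatory synaptic reversal (mV)
V_th, V_reset = -50.0, -65.0        # threshold and reset voltages (mV)
I = 2.0                             # applied current

def period(g_s):
    """Closed-form interspike interval of the LIF for a fixed conductance g_s."""
    B = (g_L + g_s) / C
    A = (g_L * E_L + g_s * E_s + I) / C
    V_inf = A / B
    if V_inf <= V_th:               # no repetitive firing for this conductance
        return np.inf
    return (1.0 / B) * np.log((V_inf - V_reset) / (V_inf - V_th))

def estimate_conductance(T_obs, g_lo=1e-6, g_hi=1.0):
    """Invert the period-conductance map by root finding; the solution is unique
    here because the period is monotone in g_s over the bracketing interval."""
    return brentq(lambda g: period(g) - T_obs, g_lo, g_hi)

g_true = 0.05
T_obs = period(g_true)
g_hat = estimate_conductance(T_obs)   # should recover roughly 0.05
```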

    Effects of short-term plasticity in UP-DOWN cortical dynamics

    Neuronal dynamics are strongly influenced by short-term plasticity (STP), that is, changes in synaptic efficacy that occur on a short (milliseconds to seconds) time scale. Depending on the brain area considered, STP can be dominated by short-term depression (STD), by short-term facilitation (STF), or both mechanisms can coexist. These two plasticity mechanisms modulate particular patterns of electrophysiological activity characterized by alternating UP and DOWN states. In this work, we develop a network model made up of excitatory and inhibitory multi-compartment neurons endowed with both mechanisms (STD and STF), spatially arranged to emulate the connectivity circuitry observed experimentally in the visual cortex. Our results reveal that both depression and facilitation can be involved in the switching between different activity patterns, from an alternation of UP and DOWN states (for relatively low levels of depression and high levels of facilitation) to an asynchronous firing regime (for relatively high levels of depression and low levels of facilitation). For STD and STF, we identify the critical levels of depression and facilitation that push the network into the different regimes. Furthermore, we find that these critical levels separate regimes with different growth rates of the network's mean synaptic conductances as a function of the depression level. This information is paramount to understanding how excitation and inhibition are organized to generate different brain activity regimes. Finally, after observing the changes in the trajectories of excitatory and inhibitory instantaneous firing rates near these critical boundaries, we identify dynamic patterns that shed light on the type of bifurcations that should arise in a rate model for this complex network. AG has been funded by the Catalan Research Agency (AGAUR) grant 2017-SGR-1049, by the Spanish Ministerio de Ciencia e Innovación grant PID2021-122954-I00, and by the Spanish State Research Agency through the Severo Ochoa and María de Maeztu Program for Centers and Units of Excellence in R&D (CEX2020-001084-M).
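    For concreteness, the sketch below implements a standard phenomenological depression/facilitation synapse in the style of the Tsodyks-Markram model; the abstract does not specify this exact formulation, and the time constants and baseline release probability are placeholders.

```python
import numpy as np

# Minimal short-term plasticity synapse combining depression (a resource
# variable x) and facilitation (a utilization variable u). Illustrative
# stand-in only; parameters are not taken from the paper.

U = 0.2            # baseline release probability
tau_d = 200.0      # depression recovery time constant (ms)
tau_f = 600.0      # facilitation decay time constant (ms)

def synaptic_efficacies(spike_times):
    """Return the relative efficacy of each presynaptic spike in a train."""
    x, u = 1.0, U
    last_t = None
    eff = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_d)   # resource recovers toward 1
            u = U + (u - U) * np.exp(-dt / tau_f)       # utilization decays toward U
        u = u + U * (1.0 - u)    # facilitation: utilization jumps at the spike
        eff.append(u * x)        # released fraction sets the synaptic efficacy
        x = x * (1.0 - u)        # depression: resource is consumed by the release
        last_t = t
    return eff

# Example: a regular 20 Hz train; the interplay of facilitation and depression
# shapes the efficacy profile across the train.
print(synaptic_efficacies(np.arange(0.0, 500.0, 50.0)))
```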

    Electrical Compartmentalization in Neurons

    The dendritic tree of neurons plays an important role in information processing in the brain. While it is thought that dendrites require independent subunits to perform most of their computations, it is still not understood how they compartmentalize into functional subunits. Here, we show how these subunits can be deduced from the properties of dendrites. We devised a formalism that links the dendritic arborization to an impedance-based tree graph and show how the topology of this graph reveals independent subunits. This analysis reveals that cooperativity between synapses decreases slowly with increasing electrical separation and thus that few independent subunits coexist. We nevertheless find that balanced inputs or shunting inhibition can modify this topology and increase the number and size of the subunits in a context-dependent manner. We also find that this dynamic recompartmentalization can enable branch-specific learning of stimulus features. Analysis of dendritic patch-clamp recording experiments confirmed our theoretical predictions.
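    A much-simplified, steady-state version of the impedance-based view can be sketched as follows; the paper's formalism is frequency dependent and operates on realistic morphologies, whereas the toy tree, conductance values, and coupling threshold below are invented for illustration.

```python
import numpy as np

# Toy steady-state illustration of an impedance-based view of a compartmental
# tree: build the conductance matrix of a small passive tree, invert it to get
# transfer resistances, and flag compartment pairs whose electrical coupling
# exceeds a threshold as belonging to the same putative subunit.

g_m = 0.05                            # membrane (leak) conductance per compartment
parents = [-1, 0, 1, 0, 3]            # parent of each compartment (-1 = soma/root)
g_axial = [0.0, 0.5, 0.5, 0.5, 0.5]   # axial conductance to the parent

n = len(parents)
G = np.diag(np.full(n, g_m))
for i, p in enumerate(parents):
    if p >= 0:
        G[i, i] += g_axial[i]; G[p, p] += g_axial[i]
        G[i, p] -= g_axial[i]; G[p, i] -= g_axial[i]

Z = np.linalg.inv(G)                  # Z[i, j]: voltage at i per unit current at j

# Coupling between compartments relative to their input resistances; values
# near 1 mean the compartments behave as a single electrical unit.
coupling = Z / np.sqrt(np.outer(np.diag(Z), np.diag(Z)))
subunit_mask = coupling > 0.5         # illustrative threshold for "same subunit"
print(np.round(coupling, 2))
print(subunit_mask)
```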

    Effects of intrinsic neuronal properties in neural dynamics

    Unpublished doctoral thesis. Universidad Autónoma de Madrid, Escuela Politécnica Superior, September 201

    Analyses at microscopic, mesoscopic, and mean-field scales

    Hippocampal activity during sleep or rest is characterized by sharp wave-ripples (SPW-Rs): transient (50–100 ms) periods of elevated neuronal activity modulated by a fast oscillation, the ripple (140–220 Hz). SPW-Rs have been linked to memory consolidation, but their generation mechanism remains unclear. Multiple potential mechanisms have been proposed, relying on excitation and/or inhibition as the main pacemaker. This thesis analyzes ripple oscillations in inhibitory network models at micro-, meso-, and macroscopic scales and elucidates how the ripple dynamics depends on the excitatory drive, the inhibitory coupling strength, and the noise model. First, an interneuron network under strong drive and strong delayed coupling is analyzed. A theory is developed that captures the drift-mediated spiking dynamics in the mean-field limit. The ripple frequency and the underlying dynamics of the membrane potential distribution are approximated analytically as functions of the external drive and the network parameters. The theory explains why the ripple frequency decreases over the course of an event (intra-ripple frequency accommodation, IFA). Furthermore, numerical analysis shows that an alternative inhibitory ripple model, based on a transient ringing effect in a weakly coupled interneuron population, cannot account for IFA under biologically realistic assumptions. IFA can thus guide model selection and provides new support for strong, delayed inhibitory coupling as a mechanism for ripple generation. Finally, a recently proposed mesoscopic integration scheme is tested as a potential tool for the efficient numerical simulation of ripple dynamics in networks of finite size. This approach requires a switch of the noise model, from noisy input to stochastic output spiking mediated by a hazard function. It is demonstrated how the choice of hazard function affects the linear response of single neurons and therefore the ripple dynamics in a recurrent interneuron network.
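    The escape-noise idea referred to above can be illustrated with a single neuron that fires stochastically according to a membrane-potential-dependent hazard; the exponential hazard and all parameters in the sketch are illustrative choices, not those analyzed in the thesis.

```python
import numpy as np

# Sketch of hazard-based ("escape noise") spiking: instead of noisy input, the
# neuron fires stochastically with an instantaneous rate that depends on its
# membrane potential. All parameters are illustrative.

rng = np.random.default_rng(1)
dt, tau = 0.1, 10.0               # time step (ms), membrane time constant (ms)
V_th, V_reset = -50.0, -65.0      # soft threshold and reset (mV)
rho0, delta = 0.1, 2.0            # hazard scale (1/ms) and sharpness (mV)

def hazard(v):
    """Exponential escape rate: firing becomes likely as v approaches V_th."""
    return rho0 * np.exp((v - V_th) / delta)

def simulate(I, T=1000.0):
    """Return spike times of a leaky integrator with hazard-based firing."""
    n = int(T / dt)
    v, spikes = V_reset, []
    for k in range(n):
        v += dt * (-(v - V_reset) / tau + I)
        if rng.random() < 1.0 - np.exp(-hazard(v) * dt):   # fire in this bin?
            spikes.append(k * dt)
            v = V_reset
    return spikes

# Example: spike count for a constant 1 s drive; sweeping I (or modulating it
# sinusoidally) probes the single-neuron response shaped by the hazard choice.
print(len(simulate(I=1.2)), "spikes in 1 s")
```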