107 research outputs found

    Integrated Mechanisms of Anticipation and Rate-of-Change Computations in Cortical Circuits

    Get PDF
    Local neocortical circuits are characterized by stereotypical physiological and structural features that subserve generic computational operations. These basic computations of the cortical microcircuit emerge through the interplay of neuronal connectivity, cellular intrinsic properties, and synaptic plasticity dynamics. How these interacting mechanisms generate specific computational operations in the cortical circuit remains largely unknown. Here, we identify the neurophysiological basis of both rate-of-change and anticipation computations on synaptic inputs in a cortical circuit. Through biophysically realistic computer simulations and neuronal recordings, we show that the rate-of-change computation is implemented robustly in cortical networks through the combination of two ubiquitous brain mechanisms: short-term synaptic depression and spike-frequency adaptation. We then show how this rate-of-change circuit can be embedded in a convergently connected network to temporally anticipate incoming synaptic inputs, in quantitative agreement with experimental findings on anticipatory responses to moving stimuli in the primary visual cortex. Given the robustness of the mechanism and the widespread nature of the physiological machinery involved, we suggest that rate-of-change computation and temporal anticipation are principal, hard-wired functions of neural information processing in the cortical microcircuit.
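
    A minimal rate-based sketch in Python of the mechanism named above (short-term synaptic depression plus spike-frequency adaptation acting as a rate-of-change detector). It is an illustration with made-up parameters, not the authors' biophysically realistic model.

        import numpy as np

        # Hypothetical, illustrative parameters: a presynaptic rate r(t) drives a synapse with a
        # Tsodyks-Markram-style depression variable x; the depressed drive is then passed through a
        # slow subtractive adaptation variable a. The rectified output is large only while r(t)
        # changes, approximating a rate-of-change signal.
        dt = 0.001                                    # s
        t = np.arange(0.0, 4.0, dt)
        r = 10.0 + 40.0 * (t > 1.0) * np.minimum(t - 1.0, 1.0)   # input rate ramps from 10 to 50 Hz

        U, tau_rec, tau_a = 0.5, 0.5, 0.3             # release fraction, recovery and adaptation (s)
        x, a = 1.0, 0.0
        out = np.zeros_like(t)
        for i, ri in enumerate(r):
            drive = U * x * ri                        # depressed synaptic drive
            x += dt * ((1.0 - x) / tau_rec - U * x * ri)
            a += dt * (drive - a) / tau_a             # slow running average (adaptation)
            out[i] = max(drive - a, 0.0)              # transient that tracks increases in r(t)

        # 'out' is near zero while r(t) is constant and rises while r(t) increases.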

    Non-stationary filtered shot noise processes and applications to neuronal membranes

    Full text link
    Filtered shot noise processes have proven to be very effective in modelling the evolution of systems exposed to stochastic shot noise sources, and have been applied to a wide variety of fields ranging from electronics to biology. In particular, they can model the membrane potential Vm of neurons driven by stochastic input, where these filtered processes capture the non-stationary characteristics of Vm fluctuations in response to pre-synaptic input with variable rate. In this paper, we apply the general framework of Poisson point process transformations to analyse these systems in the general case of variable input rate. We obtain exact analytic expressions, and very accurate approximations, for the joint cumulants of filtered shot noise processes with multiplicative noise. These general results are then applied to a model of neuronal membranes subject to conductance shot noise with a continuously variable rate of pre-synaptic spikes. We propose very effective approximations for the time evolution of the Vm distribution and a simple method to estimate the pre-synaptic rate from a small number of Vm traces. This work opens the perspective of obtaining analytic access to important statistical properties of conductance-based neuronal models, such as the first passage time.
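
    The first cumulant of such a filtered shot noise process with variable rate is given by Campbell's theorem, which the following Python sketch checks numerically. It covers only the additive, first-order case with an exponential kernel and made-up parameters, not the multiplicative-noise framework of the paper.

        import numpy as np

        # E[s(t)] for s(t) = sum_i h(t - t_i), with kernel h(u) = q*exp(-u/tau) for u >= 0 and spikes
        # drawn from an inhomogeneous Poisson process with rate lam(t):
        #     E[s(t)] = integral_0^t h(t - u) * lam(u) du.
        rng = np.random.default_rng(0)
        dt, T = 1e-4, 1.0                             # s
        t = np.arange(0.0, T, dt)
        lam = 200.0 + 300.0 * np.sin(2 * np.pi * 2.0 * t) ** 2   # time-varying input rate (Hz)
        q, tau = 1.0, 0.01                            # kernel amplitude and time constant (s)
        decay = np.exp(-dt / tau)

        def one_trial():
            spikes = rng.random(t.size) < lam * dt    # Bernoulli thinning on a fine grid
            s = np.zeros(t.size)
            for i in range(1, t.size):
                s[i] = s[i - 1] * decay + q * spikes[i]
            return s

        empirical_mean = np.mean([one_trial() for _ in range(200)], axis=0)
        kernel = q * np.exp(-t[t < 5 * tau] / tau)
        analytic_mean = np.convolve(lam, kernel)[: t.size] * dt  # Campbell's theorem
        print(np.abs(empirical_mean - analytic_mean).mean(), analytic_mean.mean())

    With a few hundred trials the sample mean follows the analytic prediction closely; the paper's results extend this to higher joint cumulants and to multiplicative (conductance) noise.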

    Non-stationary stochastic dynamics of the neuronal membrane

    Get PDF
    Neurons interact through their membrane potential, which generally has a complex time evolution due to the numerous irregular synaptic inputs they receive. This time evolution is best described in probabilistic terms because of this irregular or "noisy" activity. The time evolution of the membrane potential is therefore both stochastic and deterministic: stochastic, because it is driven by random input arrival times, but also deterministic, because subjecting a biological neuron to the same sequence of input arrival times often results in very similar membrane potential traces. In this thesis, we investigated key statistical properties of a simplified neuron model under nonstationary input from other neurons, which results in a nonstationary evolution of the membrane potential statistics. We considered a passive neuron model without a spiking mechanism, driven by input currents or conductances in the form of shot noise processes. Under such input, membrane potential fluctuations can be modeled as filtered shot noise processes. We analyzed the statistical properties of these filtered processes in the framework of Poisson point process transformations. The key idea is to express filtered shot noise as a transformation of random input arrival times and to apply the properties of these transformations to derive its nonstationary statistics. Using this formalism, we derive exact analytical expressions, and useful approximations, for the mean and joint cumulants of the filtered process in the general case of variable input rate. This work opens many perspectives for analyzing neurons under in vivo conditions, in the presence of intense and noisy synaptic inputs.
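
    A minimal Python sketch of the passive membrane model described above: a non-spiking membrane driven by conductance shot noise with a time-varying presynaptic rate, whose nonstationary mean and standard deviation are estimated from a handful of trials. Parameter values are illustrative, not those used in the thesis.

        import numpy as np

        # Passive membrane: C dV/dt = gL*(EL - V) + ge(t)*(Ee - V), with ge(t) an exponentially
        # filtered shot noise driven by an inhomogeneous Poisson spike train (illustrative values).
        rng = np.random.default_rng(1)
        dt, T = 1e-4, 0.5                              # s
        t = np.arange(0.0, T, dt)
        rate = 500.0 + 750.0 * (1.0 + np.tanh((t - 0.25) / 0.05))   # presynaptic rate ramps up (Hz)

        C, gL, EL = 0.2, 10.0, -70.0                   # nF, nS, mV
        Ee, qe, tau_e = 0.0, 1.0, 0.003                # mV, nS, s
        decay = np.exp(-dt / tau_e)

        def vm_trace():
            V, ge = EL, 0.0
            out = np.empty(t.size)
            for i in range(t.size):
                ge = ge * decay + qe * (rng.random() < rate[i] * dt)
                V += dt * (gL * (EL - V) + ge * (Ee - V)) / C
                out[i] = V
            return out

        trials = np.array([vm_trace() for _ in range(20)])
        mean_V, std_V = trials.mean(axis=0), trials.std(axis=0)
        # mean_V and std_V follow the time-varying input rate; the analytic expressions derived in
        # the thesis give such moments in closed form and allow the rate to be estimated back from
        # a small number of traces.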

    The Dynamic Brain: From Spiking Neurons to Neural Masses and Cortical Fields

    Get PDF
    The cortex is a complex system, characterized by its dynamics and architecture, which underlie many functions such as action, perception, learning, language, and cognition. Its structural architecture has been studied for more than a hundred years; however, its dynamics have been addressed much less thoroughly. In this paper, we review and integrate, in a unifying framework, a variety of computational approaches that have been used to characterize the dynamics of the cortex, as evidenced at different levels of measurement. Computational models at different space–time scales help us understand the fundamental mechanisms that underpin neural processes and relate these processes to neuroscience data. Modeling at the single neuron level is necessary because this is the level at which information is exchanged between the computing elements of the brain: the neurons. Mesoscopic models tell us how neural elements interact to yield emergent behavior at the level of microcolumns and cortical columns. Macroscopic models can inform us about whole brain dynamics and interactions between large-scale neural systems such as cortical regions, the thalamus, and brain stem. Each level of description relates uniquely to neuroscience data, from single-unit recordings, through local field potentials to functional magnetic resonance imaging (fMRI), electroencephalogram (EEG), and magnetoencephalogram (MEG). Models of the cortex can establish which types of large-scale neuronal networks can perform computations and characterize their emergent properties. Mean-field and related formulations of dynamics also play an essential and complementary role as forward models that can be inverted given empirical data. This makes dynamic models critical in integrating theory and experiments. We argue that elaborating principled and informed models is a prerequisite for grounding empirical neuroscience in a cogent theoretical framework, commensurate with the achievements in the physical sciences.
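
    As a concrete example of the mesoscopic level of description discussed above, here is a reduced Wilson-Cowan-type neural mass in Python: two coupled population activities stand in for a cortical column. The parameters are generic textbook-style values, not taken from any particular model in the review.

        import numpy as np

        def S(v, a, theta):
            # Sigmoidal population response function.
            return 1.0 / (1.0 + np.exp(-a * (v - theta)))

        dt, T = 5e-4, 2.0
        steps = int(T / dt)
        tau_e, tau_i = 0.010, 0.020                     # population time constants (s)
        w_ee, w_ei, w_ie, w_ii = 16.0, 12.0, 15.0, 3.0  # coupling strengths
        P = 1.25                                        # external drive to the excitatory population

        E, I = 0.1, 0.05
        trace = np.empty((steps, 2))
        for n in range(steps):
            dE = (-E + (1.0 - E) * S(w_ee * E - w_ei * I + P, 1.3, 4.0)) / tau_e
            dI = (-I + (1.0 - I) * S(w_ie * E - w_ii * I, 2.0, 3.7)) / tau_i
            E, I = E + dt * dE, I + dt * dI
            trace[n] = E, I
        # Depending on the couplings and drive, E and I settle to a fixed point or trace out a limit
        # cycle; time series like 'trace' are what neural-mass forward models pass to EEG/MEG and
        # fMRI observation models when they are inverted against empirical data.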

    Measurement of excitation-inhibition ratio in autism spectrum disorder using critical brain dynamics

    Get PDF
    Balance between excitation (E) and inhibition (I) is a key principle of neuronal network organization and information processing. Consistent with this notion, excitation-inhibition imbalances are considered a pathophysiological mechanism in many brain disorders, including autism spectrum disorder (ASD). However, methods to measure E/I ratios in human brain networks are lacking. Here, we present a method to quantify a functional E/I ratio (fE/I) from neuronal oscillations, and validate it in healthy subjects and children with ASD. We define the structural E/I ratio in an in silico neuronal network, investigate how it relates to the power and long-range temporal correlations (LRTC) of the network's activity, and use these relationships to design the fE/I algorithm. Application of this algorithm to the EEGs of healthy adults showed that fE/I is balanced at the population level and is decreased by GABAergic enhancement. In children with ASD, we observed larger fE/I variability and stronger LRTC compared to typically developing children (TDC). Interestingly, visual grading for EEG abnormalities that are thought to reflect E/I imbalances revealed elevated fE/I and LRTC in ASD children with normal EEG compared to TDC or ASD children with abnormal EEG. We speculate that our approach will also help to understand physiological heterogeneity in other brain disorders.
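
    The LRTC measure referred to above is typically quantified with detrended fluctuation analysis (DFA). The Python sketch below implements standard DFA on a synthetic, uncorrelated envelope (expected exponent near 0.5); it is one ingredient of the fE/I method, not the authors' full algorithm.

        import numpy as np

        def dfa_exponent(signal, window_sizes):
            # Standard DFA: integrate, split into windows, linearly detrend each window, measure the
            # RMS fluctuation, and fit the log-log slope of fluctuation versus window size.
            profile = np.cumsum(signal - np.mean(signal))
            fluctuations = []
            for n in window_sizes:
                n_windows = profile.size // n
                segments = profile[: n_windows * n].reshape(n_windows, n)
                x = np.arange(n)
                rms = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(x, seg, 1), x)) ** 2))
                       for seg in segments]
                fluctuations.append(np.mean(rms))
            slope, _ = np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)
            return slope                                # ~0.5: no LRTC, >0.5: persistent correlations

        rng = np.random.default_rng(2)
        envelope = np.abs(rng.standard_normal(50_000))  # stand-in for an oscillation amplitude envelope
        sizes = np.unique(np.logspace(2, 3.5, 12).astype(int))
        print(dfa_exponent(envelope, sizes))            # close to 0.5 for an uncorrelated signal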

    Investigation and Modelling of Fetal Sheep Maturation

    Get PDF
    In this thesis, I study the maturational changes in the fetal sheep ECoG (electrocorticogram) during the third trimester of gestation (95-140 days of gestation), investigate three continuum models of the electrical behaviour of the cortex, and tune the parameters of one of these models to generate the discontinuous EEG waves of the immature cortex. Visual inspection of the ECoG time-series shows that the third trimester in fetal sheep comprises two stages: an early third trimester characterised by bursting activity separated by silent intervals, and a late third trimester with well-defined SWS (slow wave sleep) and REM (rapid eye movement) sleep states. For the late third trimester, the results of power, correlation time, and SVD (singular value decomposition) entropy analysis demonstrate that the sleep-state change is a cortical phase transition, with the SWS-to-REM transition being first-order and the REM-to-SWS transition second-order. Further analyses of correlation time, SVD entropy, and spectral edge frequency show that the two distinct SWS and REM sleep states differentiate at about 125 dGA (days gestational age). Spectral analysis divides the third trimester into four stages in terms of the frequency and amplitude variations of the major resonances. Spindle-like resonances only occur in the first stage. A power surge is observed immediately prior to the emergence of the two sleep states. The most significant changes in the spectrum occur during the fourth stage, for both the SWS (in amplitude) and REM (in frequency) sleep states. For the modelling of the immature cortex, different theoretical descriptions of cortical behaviour are investigated, including the ccf (cortical column field) model of J. J. Wright and the Waikato cortical model. For the ccf model at centimetric scale, the time-series, fluctuation power, power-law relation, gamma oscillation, phase relation between excitatory and inhibitory elements, power spectral density, and spatial Fourier spectrum are quantified from numerical simulations. From these simulations, I determined that the physiologically sophisticated ccf model is too large and unwieldy for easy tuning to match the electrical response of the immature cortex. The Waikato near-far fast-soma model is constructed by incorporating the back-propagation effect of the action potential into the Waikato fast-soma model; its state equations are listed and stability predictions are made by varying the gap-junction diffusion strength, the subcortical drive, and the rate constants of the near- and far-dendritic trees. In the end, I selected the classic and simpler Waikato slow-soma mean-field model for my immature-cortex simulations. Model parameters are customised to the physiology of the immature cortex, including the excitatory effect of GABA (an inhibitory neurotransmitter in the adult), the number of synaptic connections, and the rate constants of the IPSPs (inhibitory postsynaptic potentials). After hyperpolarising the neuronal resting voltage sufficiently to make the immature inhibitory neuron act as an excitatory agent, I alter the rate constant of the IPSP and study the stability of the immature cortex. The bursting activity and quiet states of the discontinuous EEG are simulated, and the gap-junction diffusion effect in the immature cortex is also examined. For a rate constant of 18.6 s⁻¹, slow oscillations in the quiet states are generated, and for a rate constant of 25 s⁻¹, a possible cortical network oscillation emerges. As far as I know, this is the first time that the GABA excitatory effect has been integrated into a mean-field cortical model and that discontinuous EEG waves have been successfully simulated, albeit qualitatively.
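
    A toy Python illustration of the depolarizing-GABA point above: in a passive membrane, the same GABAergic conductance pulse hyperpolarizes when its reversal potential lies below rest (mature case) and depolarizes when it lies above rest (immature case). The values are illustrative and unrelated to the Waikato model's actual parameters.

        import numpy as np

        dt, T = 1e-4, 0.2                     # s
        t = np.arange(0.0, T, dt)
        C, gL, EL = 0.2, 10.0, -70.0          # nF, nS, mV
        q, tau_g = 2.0, 0.01                  # GABAergic conductance amplitude (nS) and decay (s)
        g_gaba = q * np.exp(-(t - 0.05) / tau_g) * (t >= 0.05)   # single synaptic event at 50 ms

        def psp(E_gaba):
            # Integrate C dV/dt = gL*(EL - V) + g_gaba(t)*(E_gaba - V) with forward Euler.
            V = np.full(t.size, EL)
            for i in range(1, t.size):
                V[i] = V[i-1] + dt * (gL * (EL - V[i-1]) + g_gaba[i-1] * (E_gaba - V[i-1])) / C
            return V

        mature, immature = psp(E_gaba=-80.0), psp(E_gaba=-55.0)
        print(mature.min() - EL, immature.max() - EL)   # hyperpolarizing vs depolarizing response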

    Mesoscale Systems, Finite Size Effects, and Balanced Neural Networks

    Get PDF
    Cortical populations are typically in an asynchronous state, sporadically interrupted by brief epochs of coordinated population activity. Current cortical models are at a loss to explain this combination of states. At one extreme are network models where recurrent inhibition dynamically stabilizes an asynchronous low-activity state. While these networks are widely used, they cannot produce the coherent population-wide activity that is reported in a variety of datasets. At the other extreme are models where short-term synaptic depression between excitatory neurons can generate the epochs of population-wide activity. However, in these networks inhibition plays only a perfunctory role in network stability, which is at odds with many reports across cortex. In this study we analyze spontaneously active in vitro preparations of primary auditory cortex that show dynamics emblematic of this mixture of states. To capture this complex population activity we consider models where large excitation is balanced by recurrent inhibition, yet we include short-term synaptic depression dynamics of the excitatory connections. This model gives very rich nonlinear behavior that mimics the core features of the in vitro data, including the possibility of low-frequency (2-12 Hz) rhythmic dynamics within population events. Our study extends balanced network models to account for nonlinear, population-wide correlated activity, thereby providing a critical step in a mechanistic theory of realistic cortical activity. We further investigate an extension of this model that exhibits clearly non-Arrhenius behavior, whereby lower-noise systems may exhibit faster escape from a stable state. We show that this behavior is due to the system-size-dependent vector field, intrinsically linking noise and dynamics.
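
    A reduced firing-rate caricature in Python of the model class described above: strong recurrent excitation balanced by inhibition, short-term depression on the excitatory-to-excitatory synapses, and a noisy drive. It is a sketch with illustrative parameters, not the spiking network fitted to the in vitro data.

        import numpy as np

        rng = np.random.default_rng(3)
        dt, T = 5e-4, 20.0
        steps = int(T / dt)
        tau_e, tau_i, tau_n = 0.02, 0.01, 0.1           # E, I and noise time constants (s)
        tau_rec, U, r_max = 0.5, 0.5, 100.0             # depression recovery, release fraction, saturation
        w_ee, w_ei, w_ie, w_ii = 4.0, 1.0, 1.0, 1.0     # strong E-E, balanced by recurrent inhibition
        I_e, I_i, sigma = 0.3, 1.0, 0.4                 # background drives and noise amplitude

        f = lambda v: r_max * np.tanh(np.maximum(v, 0.0) / r_max)   # saturating rate function

        rE, rI, x, eta = 0.0, 0.5, 1.0, 0.0
        rates = np.empty(steps)
        for n in range(steps):
            eta += dt * (-eta / tau_n) + sigma * np.sqrt(2.0 * dt / tau_n) * rng.standard_normal()
            drive_e = w_ee * U * x * rE - w_ei * rI + I_e + eta     # depressed recurrent excitation
            drive_i = w_ie * rE - w_ii * rI + I_i
            rE, rI = rE + dt * (-rE + f(drive_e)) / tau_e, rI + dt * (-rI + f(drive_i)) / tau_i
            x += dt * ((1.0 - x) / tau_rec - U * x * rE)            # synaptic resources deplete and recover
            rates[n] = rE
        # Between events rE stays near zero; once the resources x have recovered, a noise excursion
        # can ignite a brief population-wide event that depression of the E-E synapses then terminates.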

    Bridging structure and function with brain network modeling

    Get PDF
    High-throughput neuroimaging technology enables rapid acquisition of vast amounts of structural and functional data on multiple spatial and temporal scales. While novel methods to extract information from these data are continuously developed, there is as yet no principled approach for systematically integrating distinct experimental results into a common theoretical framework. The central result of this dissertation is a biophysically based framework for brain network modeling that links structural and functional data across scales and modalities and integrates them with dynamical systems theory. Specifically, the publications in this thesis i. introduce an automated pipeline that extracts structural and functional information from multimodal imaging data to construct and constrain brain models, ii. link whole-brain models with empirical EEG-fMRI (simultaneous electroencephalography and functional magnetic resonance imaging) data to integrate neural signals with simulated activity, iii. propose a framework for reverse-engineering neurophysiological dynamics and mechanisms underlying commonly observed features of neural activity, iv. document a software module that acquaints users with the theory and practice of brain modeling, v. associate aging with structural and functional connectivity, and vi. examine how parcellation size and short-range connectivity affect model dynamics. Taken together, these results form a novel approach that enables reverse-engineering of neurophysiological processes and mechanisms on the basis of biophysically based brain models.
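
    A generic Python sketch of the structure-to-function step described above: simple noisy nodes are coupled through a structural connectivity (SC) matrix, the simulated functional connectivity (FC) is computed as a correlation matrix, and its fit to an "empirical" FC is evaluated while sweeping the global coupling G. Both SC and the empirical FC below are random stand-ins; a real pipeline would use tractography-based SC and measured fMRI FC.

        import numpy as np

        rng = np.random.default_rng(4)
        N = 40
        SC = np.abs(rng.standard_normal((N, N)))
        SC = (SC + SC.T) / 2.0
        np.fill_diagonal(SC, 0.0)
        SC /= np.max(np.abs(np.linalg.eigvalsh(SC)))      # normalise the spectral radius to 1

        mix = rng.standard_normal((N, 2000))
        FC_emp = np.corrcoef(mix + 0.5 * SC @ mix)        # random stand-in for a measured FC matrix

        def simulate_fc(G, tau=0.1, dt=0.01, steps=20000, sigma=0.1):
            # Linear stochastic nodes coupled through SC; returns the simulated FC (correlation matrix).
            x = np.zeros(N)
            X = np.empty((steps, N))
            for n in range(steps):
                x = x + dt * (-x + G * SC @ x) / tau + sigma * np.sqrt(dt) * rng.standard_normal(N)
                X[n] = x
            return np.corrcoef(X.T)

        iu = np.triu_indices(N, k=1)                      # compare upper-triangular FC entries only
        for G in (0.2, 0.5, 0.8):
            fit = np.corrcoef(simulate_fc(G)[iu], FC_emp[iu])[0, 1]
            print(f"G = {G:.1f}  SC-FC correlation = {fit:.3f}")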