10 research outputs found

    A Conformal Fractional Derivative-based Leaky Integrate-and-Fire Neuron Model

    Neuron models have been extensively studied and many different models have been proposed. The Hodgkin-Huxley model, which earned its authors the Nobel Prize, is physiologically relevant and can demonstrate different neural behaviors, but it is mathematically complex. For this reason, simplified neuron models such as the integrate-and-fire model and its derivatives are more popular in the literature for studying neural populations. Lapicque's integrate-and-fire model was proposed in 1907, and its leaky integrate-and-fire version is very popular due to its simplicity. To improve this simple model and capture different aspects of neurons, many variants of it have been proposed. Fractional-order derivative-based neuron models are one such variant; they can show adaptation without requiring additional differential equations. However, fractional-order derivatives can be computationally costly. Recently, a conformal fractional derivative (CFD) was suggested in the literature; it is easy to understand and implement compared to other methods. In this study, a CFD-based leaky integrate-and-fire neuron model is proposed. The model captures the adaptation in firing rate under sustained current injection. The results suggest that it could be used to easily and efficiently implement network models as well as to model different sensory afferents.
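The mechanism described in the abstract can be sketched as follows. The conformable (conformal) fractional derivative of order alpha is commonly defined so that T_alpha V(t) = t^(1-alpha) V'(t); the forward-Euler scheme and all parameter values below are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch (not the paper's implementation): a leaky
# integrate-and-fire neuron driven through a conformable fractional
# derivative of order alpha, using T_alpha V(t) = t**(1 - alpha) * V'(t),
# so that V'(t) = t**(alpha - 1) * f(V, I).  For alpha < 1 the effective
# integration rate shrinks as t grows, producing firing-rate adaptation
# under a sustained current step.  All parameter values are assumptions.
import numpy as np

def cfd_lif_spikes(alpha=0.8, I=1.5, dt=1e-3, t_end=2.0,
                   tau=0.02, v_rest=0.0, v_th=1.0, v_reset=0.0):
    v, spikes = v_rest, []
    for t in np.arange(dt, t_end, dt):   # start at t = dt to avoid t = 0
        dv = t ** (alpha - 1.0) * (-(v - v_rest) + I) / tau
        v += dt * dv
        if v >= v_th:                    # threshold crossing: spike and reset
            spikes.append(t)
            v = v_reset
    return spikes
```

With alpha = 1 the factor t**(alpha - 1) equals 1 and the ordinary LIF is recovered; for alpha < 1 successive interspike intervals lengthen, which is the adaptation the abstract refers to.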

    Chaotic oscillations in a map-based model of neural activity

    We propose a discrete-time dynamical system (a map) as a phenomenological model of excitable and spiking-bursting neurons. The model is a discontinuous two-dimensional map. We find conditions under which this map has an invariant region on the phase plane containing a chaotic attractor. This attractor creates chaotic spiking-bursting oscillations of the model. We also show various regimes of other neural activities (subthreshold oscillations, phasic spiking etc.) derived from the proposed model.
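The abstract does not reproduce the paper's specific discontinuous map; as a hedged stand-in, the well-known Rulkov map is another two-dimensional fast-slow map producing chaotic spiking-bursting, and it illustrates the structure involved:

```python
# Illustrative stand-in (the paper's own discontinuous map is not given
# in the abstract): the Rulkov map, a two-dimensional fast-slow map
# known to produce chaotic spiking-bursting oscillations.
def rulkov_trace(alpha=4.5, sigma=-1.0, mu=0.001, n_steps=5000):
    x, y = -1.0, -3.0        # fast (membrane-like) and slow variables
    xs = []
    for _ in range(n_steps):
        # fast map has a sharp nonlinearity; y drifts slowly via mu
        x, y = alpha / (1.0 + x * x) + y, y - mu * (x - sigma)
        xs.append(x)
    return xs
```

Here the slow variable y plays the role of a burst-controlling parameter: spiking excursions of x alternate with quiescent epochs as y slowly drifts.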

    Discrete Geometric Singular Perturbation Theory

    We propose a mathematical formalism for discrete multi-scale dynamical systems induced by maps which parallels the established geometric singular perturbation theory for continuous-time fast-slow systems. We identify limiting maps corresponding to both 'fast' and 'slow' iteration under the map. A notion of normal hyperbolicity is defined by a spectral gap requirement for the multipliers of the fast limiting map along a critical fixed-point manifold S. We provide a set of Fenichel-like perturbation theorems by reformulating pre-existing results so that they apply near compact, normally hyperbolic submanifolds of S. The persistence of the critical manifold S, local stable/unstable manifolds W^{s/u}_{loc}(S) and foliations of W^{s/u}_{loc}(S) by stable/unstable fibers is described in detail. The practical utility of the resulting discrete geometric singular perturbation theory (DGSPT) is demonstrated in applications. First, we use DGSPT to identify singular geometry corresponding to excitability, relaxation, chaotic and non-chaotic bursting in a map-based neural model. Second, we derive results which relate the geometry and dynamics of fast-slow ODEs with non-trivial time-scale separation and their Euler-discretized counterpart. Finally, we show that fast-slow ODE systems with fast rotation give rise to fast-slow Poincaré maps, the geometry and dynamics of which can be described in detail using DGSPT.
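The fast-slow structure invoked above can be written in a generic form (an assumed standard setup, not quoted from the paper):

```latex
% Generic fast-slow map with singular perturbation parameter \varepsilon:
\[
  x_{n+1} = f(x_n, y_n, \varepsilon), \qquad
  y_{n+1} = y_n + \varepsilon\, g(x_n, y_n, \varepsilon), \qquad
  0 < \varepsilon \ll 1 .
\]
% Setting \varepsilon = 0 freezes the slow variables y and yields the
% fast limiting (layer) map x_{n+1} = f(x_n, y, 0).  Its fixed points
% f(x, y, 0) = x form the critical manifold S; normal hyperbolicity asks
% for a spectral gap in the multipliers of D_x f(x, y, 0) along S.
```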

    Complex dynamics in simplified neuronal models: reproducing Golgi cell electroresponsiveness

    Brain neurons exhibit complex electroresponsive properties – including intrinsic subthreshold oscillations and pacemaking, resonance and phase-reset – which are thought to play a critical role in controlling neural network dynamics. Although these properties emerge from detailed representations of molecular-level mechanisms in “realistic” models, they cannot usually be generated by simplified neuronal models (although these may show spike-frequency adaptation and bursting). We report here that this whole set of properties can be generated by the extended generalized leaky integrate-and-fire (E-GLIF) neuron model. E-GLIF derives from the GLIF model family and is therefore mono-compartmental, keeps the limited computational load typical of a linear low-dimensional system, admits analytical solutions and can be tuned through gradient-descent algorithms. Importantly, E-GLIF is designed to maintain a correspondence between model parameters and neuronal membrane mechanisms through a minimum set of equations. To test its potential, E-GLIF was used to model a specific neuron showing rich and complex electroresponsiveness, the cerebellar Golgi cell, and was validated against experimental electrophysiological data recorded from Golgi cells in acute cerebellar slices. During simulations, E-GLIF was activated by stimulus patterns, including current steps and synaptic inputs, identical to those used for the experiments. The results demonstrate that E-GLIF can reproduce the whole set of complex neuronal dynamics typical of these neurons – including intensity-frequency curves, spike-frequency adaptation, post-inhibitory rebound bursting, spontaneous subthreshold oscillations, resonance, and phase-reset – providing a new effective tool to investigate brain dynamics in large-scale simulations.
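The abstract does not spell out the E-GLIF equations; the following is a deliberately minimal GLIF-family sketch, assuming a single spike-triggered adaptation current (parameters illustrative), which shows how a linear low-dimensional system already yields one of the listed behaviors, spike-frequency adaptation:

```python
# Minimal GLIF-family sketch (not the full E-GLIF model): a leaky
# integrator plus one spike-triggered adaptation current.  All
# parameter values below are illustrative assumptions.
def glif_adaptation(I=1.6, dt=1e-4, t_end=1.0, tau_m=0.02, tau_w=0.2,
                    b=0.15, v_th=1.0, v_reset=0.0):
    v, w, spikes = 0.0, 0.0, []
    for k in range(int(t_end / dt)):
        v += dt * ((I - w) - v) / tau_m   # membrane: adaptation opposes drive
        w += dt * (-w) / tau_w            # adaptation current decays
        if v >= v_th:
            spikes.append(k * dt)
            v = v_reset
            w += b                        # spike-triggered increment
    return spikes
```

Because the subthreshold system stays linear, such models retain analytical solvability between spikes, which is the property the abstract highlights for gradient-based tuning.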

    Rhythmogenic and Premotor Functions of Dbx1 Interneurons in the Pre-Bötzinger Complex and Reticular Formation: Modeling and Simulation Studies

    Breathing in mammals depends on rhythms that originate from the preBötzinger complex (preBötC) of the ventral medulla and a network of brainstem and spinal premotor neurons. The rhythm-generating core of the preBötC, as well as some premotor circuits, consists of interneurons derived from Dbx1-expressing precursors, but the structure and function of these networks remain incompletely understood. We previously developed a cell-specific detection and laser ablation system to interrogate respiratory network structure and function in a slice model of breathing that retains the preBötC, premotor circuits, and the respiratory-related hypoglossal (XII) motor nucleus. In spontaneously rhythmic slices, cumulative ablation of Dbx1 preBötC neurons decreased XII motor output by half after only a few cell deletions, and then decelerated and terminated rhythmic function altogether as the tally increased. In contrast, cumulatively deleting Dbx1 premotor neurons decreased XII motor output monotonically, but did not affect frequency or stop rhythmic function regardless of the ablation tally. This dissertation presents several network-level and cellular-level modeling studies that further our understanding of how the respiratory rhythm is generated and transmitted to the XII motor nucleus. First, we propose that cumulative deletions of Dbx1 preBötC neurons preclude rhythm by diminishing the amount of excitatory inward current or disturbing the process of recurrent excitation, rather than by structurally breaking down the topological network. Second, we establish a feasible configuration for neural circuits comprising an Erdős-Rényi preBötC network and a small-world reticular premotor network with interconnections following an anti-preferential attachment rule, which is the only configuration that produces outcomes consistent with previous experimental benchmarks.
Furthermore, since the performance of neuronal network simulations is, to some extent, affected by the nature of the cellular model, we aim to develop a more realistic cellular model based on the one we adopted in previous network studies, which would account for some recent experimental findings on rhythmogenic preBötC neurons.
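The network configuration described above can be sketched concretely. Sizes, connection probabilities, and the exact anti-preferential rule below are illustrative assumptions (the dissertation's parameters are not given in the abstract):

```python
# Hedged sketch of the circuit construction: an Erdos-Renyi preBotC
# core, a Watts-Strogatz-style ring premotor layer, and core-to-premotor
# projections drawn with probability inversely proportional to the
# target's current in-degree (an "anti-preferential attachment" rule).
import random

def build_network(n_core=100, n_pre=100, p_er=0.06, k_ring=4,
                  p_rewire=0.1, n_proj=200, seed=1):
    rng = random.Random(seed)
    # Erdos-Renyi core: each pair connected independently with prob p_er
    core_edges = {(i, j) for i in range(n_core)
                  for j in range(i + 1, n_core) if rng.random() < p_er}
    # premotor layer: ring lattice with random rewiring (small-world)
    pre_edges = set()
    for i in range(n_pre):
        for d in range(1, k_ring // 2 + 1):
            j = (i + d) % n_pre
            if rng.random() < p_rewire:
                j = rng.randrange(n_pre)
            if j != i:
                pre_edges.add((min(i, j), max(i, j)))
    # anti-preferential core -> premotor projections
    indeg, proj = [0] * n_pre, []
    for _ in range(n_proj):
        src = rng.randrange(n_core)
        weights = [1.0 / (1 + indeg[t]) for t in range(n_pre)]
        r, acc, tgt = rng.random() * sum(weights), 0.0, n_pre - 1
        for t, w in enumerate(weights):
            acc += w
            if acc >= r:
                tgt = t
                break
        proj.append((src, tgt))
        indeg[tgt] += 1
    return core_edges, pre_edges, proj
```

The 1/(1 + in-degree) weighting steers new projections toward sparsely innervated premotor neurons, the opposite of rich-get-richer preferential attachment.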

    Nonlinear synchrony dynamics of neuronal bursters

    We study the appearance of a novel phenomenon for coupled identical bursters: synchronized bursts in which the spike synchrony changes within each burst. The examples we study are normal-form elliptic bursters in which there is a periodic slow passage through a Bautin (codimension-two degenerate Andronov-Hopf) bifurcation. This burster has a subcritical Andronov-Hopf bifurcation at the onset of repetitive spiking, while the end of the burst occurs via a fold limit cycle bifurcation. We study the synchronization behavior of two Bautin-type elliptic bursters for a linear direct coupling scheme, as well as demonstrating its presence in an approximation of gap-junction and synaptic coupling. We also find similar behaviour in systems consisting of three and four Bautin-type elliptic bursters. We note that higher-order terms in the normal form that do not affect the behavior of a single burster can be responsible for changes in the synchrony pattern; more precisely, we find within-burst synchrony changes associated with a turning point in the spontaneous spiking frequency (frequency transition). We also find multiple synchrony changes in a similar system by incorporating multiple frequency transitions. To explain the phenomenon we considered a burst-synchronized constrained model, and a bifurcation analysis of this reduced model shows the existence of the observed within-burst synchrony states. A within-burst synchrony change is also found in two mutually delay-coupled Bautin-type elliptic bursters with a constant delay. A similar phenomenon is shown to exist in the mutually coupled conductance-based Morris-Lecar neuronal system with an additional slow variable generating elliptic bursting. We also find a within-burst synchrony change in a linearly coupled FitzHugh-Rinzel elliptic bursting system, where the synchrony change occurs via a period-doubling bifurcation.
A bifurcation analysis of a burst-synchronized constrained system identifies the period-doubling bifurcation in this case. We show the emergence of spontaneous burst synchrony clusters in a system of three Hindmarsh-Rose square-wave bursters with nonlinear coupling. The system is found to switch between the available cluster states depending on the stimulus. Lyapunov exponents of the burst synchrony states are computed from the corresponding variational system to probe the stability of the states. Numerical simulation also shows the existence of burst synchrony clusters in larger networks of such systems.
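The elliptic-burster mechanism described above (subcritical Andronov-Hopf onset, fold-of-limit-cycles offset) can be sketched with the radial part of the Bautin normal form for the spike amplitude r, slaved to a slow drive u; the slow equation u' = mu*(a - r^2) and all parameter values are illustrative assumptions, not the thesis's equations:

```python
# Hedged elliptic-burster sketch: r' = u*r + 2*r**3 - r**5 is the radial
# Bautin normal form; u grows during quiescence and decays during
# spiking (assumed slow law).  Spiking onset: subcritical Andronov-Hopf
# at u = 0; offset: fold of limit cycles at u = -1.
def elliptic_burster(mu=0.01, a=0.5, dt=0.01, t_end=1500.0):
    r, u, rs = 0.3, -0.5, []
    for _ in range(int(t_end / dt)):
        r += dt * (u * r + 2.0 * r**3 - r**5)
        u += dt * mu * (a - r * r)
        rs.append(r)
    return rs
```

Counting upward crossings of a mid-level threshold of r gives the number of bursts; the delayed ignition after each quiescent phase reflects the slow passage through the Hopf point that the abstract mentions.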

    Sincronização, transições de fase, criticalidade e subamostragem em redes de neurônios formais

    Master's dissertation - Universidade Federal de Santa Catarina, Centro de Ciências Físicas e Matemáticas, Programa de Pós-Graduação em Física. To study neurons with computational tools, one may call upon at least two different approaches: Hodgkin-Huxley-like neurons (i.e. biological models) or formal models (e.g. the Hindmarsh-Rose (HR) model, the extended Kinouchi-Tragtenberg (KTz) model, etc.). Formal neurons may be represented by differential equations (e.g. HR) or by maps, which are dynamical systems with continuous state variables and discrete-time dynamics (e.g. KTz). Few maps have been proposed to describe neurons. Such maps provide a number of computational advantages, since there is no need to set any arbitrary precision on the integration variables, which leads to better performance in the calculations and to more precise results. We coupled KTz maps within regular lattices and complex networks through a chemical synapse map. In regular lattices, the model exhibits different kinds of synchronization (such as phase and antiphase synchronization and wave fronts propagating over the network's diagonals); we studied the effect of synapses on the synchronization patterns and on the interspike intervals: very slow synapses, under certain initial conditions of the network, can lock neurons into fast-spiking behavior even though they had been set into a bursting regime. The excitability of the KTz neuron was also studied: regular lattices of excitable KTz neurons displayed spiral waves and a change in dynamic range as the coupling parameter was varied. Excitable regular lattices and complex networks with homogeneous coupling presented first-order phase transitions. We propose the addition of uniform noise in the coupling, which turns the transitions into continuous phase transitions and generates critical avalanche distributions in time and space, pointing towards a self-organized critical model, with exponents ~ 1.6 and ~ 1.4, respectively. We also studied the influence of some dynamical behaviors of the neurons on the stability of the avalanches. Finally, we analyzed the effect of data subsampling by two different methods, comparing the critical distributions of a full sample with those of a subsample, or partial sample, of the network. We found that one of the methods keeps the power law with exponent ~ 1.35, whereas the other generates a log-normal distribution.
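The KTz map mentioned above has a standard three-variable form (fast variables x, y and a slow variable z); the update rules below follow that form, but the parameter values are illustrative assumptions, not tuned to any result of the dissertation:

```python
# Hedged sketch of the (extended) KTz formal neuron: a three-variable
# map with tanh nonlinearity; x is the membrane-like variable, y a
# recovery variable, z a slow current.  Parameter values are
# illustrative assumptions.
import math

def ktz_trace(n=2000, K=0.6, T=0.35, delta=0.001, lam=0.001,
              x_r=-0.5, H=0.0):
    x, y, z = -0.5, -0.5, 0.0
    xs = []
    for _ in range(n):
        x, y, z = (math.tanh((x - K * y + z + H) / T),  # fast membrane
                   x,                                    # one-step memory
                   (1.0 - delta) * z - lam * (x - x_r))  # slow current
        xs.append(x)
    return xs
```

Because the update is an exact map (no numerical integration), there is no step-size or precision parameter to tune, which is the computational advantage the abstract emphasizes; depending on the parameters, the map produces fast spiking, bursting, or excitable behavior.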

    Modélisation de la consolidation de la mémoire dépendante de l'état d'activité du cerveau

    Our brains enable us to perform complex actions and respond quickly to the external world, thanks to transitions between different brain states that reflect the activity of interconnected neuronal populations. An intriguing example is the ever-present switch of brain activity that occurs while transitioning between periods of active and quiet waking. It involves transitions from small-amplitude, high-frequency brain oscillations to large-amplitude, low-frequency oscillations, accompanied by neuronal activity switches from tonic firing to bursting. The switch between these firing modes is regulated by neuromodulators and the inherent properties of neurons. Simultaneously, our brains have the ability to learn and form memories through persistent changes in the strength of the connections between neurons. This process is known as synaptic plasticity, whereby neurons strengthen or weaken connections based on their respective firing activity. While it is commonly believed that putting in more effort and time leads to better performance when memorizing new information, this thesis explores the hypothesis that taking occasional breaks and allowing the brain to rest during quiet waking periods may actually be beneficial. Using a computational approach, the thesis investigates how transitions in brain state from active to quiet waking, reflected in neuronal switches from tonic firing to bursting, interact with synaptic plasticity to shape memory consolidation. To investigate this research question, we constructed neurons and circuits with the ability to switch between tonic firing and bursting using a conductance-based approach. In our first contribution, we focused on identifying the key neuronal property that enables robust switches, even in the presence of neuron and circuit heterogeneity.
Through computational experiments and phase-plane analysis, we demonstrated the significance of a distinct timescale separation between sodium and T-type calcium channel activation by comparing various models from the existing literature. Synaptic plasticity is then studied to understand learning and memory consolidation. The second contribution is a taxonomy of synaptic plasticity rules, investigating their compatibility with switches in neuronal activity, small neuronal variabilities, and neuromodulators. The third contribution reveals the evolution of synaptic weights during the transition from tonic firing in active waking to bursting in quiet waking. Combining bursting neurons with traditional synaptic plasticity rules using soft bounds leads to a homeostatic reset, where synaptic weights converge to a fixed point regardless of the weights acquired during tonic firing. Strong weights depress, while weak weights potentiate, until reaching a set point. This homeostatic mechanism is robust to neuron and circuit heterogeneity and to the choice of synaptic plasticity rule. The reset is further exploited by neuromodulator-induced changes in synaptic rules, potentially supporting the Synaptic Tagging and Capture hypothesis, where strong weights are tagged and converge to a high reset value during bursting. While the burst-induced reset may cause forgetting of previous learning, it also restores synaptic weights and facilitates the formation of new memories. To exploit this homeostatic property, an innovative burst-dependent structural plasticity rule is developed to encode previous learning through long-lasting morphological changes. The proposed mechanism explains the late stage of long-term potentiation, complementing traditional synaptic plasticity rules governing its early stage.
Switches to bursting enable neurons to consolidate synapses by creating new proteins and promoting synapse growth, while simultaneously restoring the efficacy of postsynaptic receptors for new learning. The novel plasticity rule is validated by comparing it with traditional synaptic rules in various memory tasks. The results demonstrate that switches from tonic firing to bursting, together with the novel structural plasticity rule, enhance learning and memory consolidation. In conclusion, this thesis uses computational models of biophysical neurons to provide evidence that the switches from tonic firing to bursting, reflecting the shift from active to quiet waking, play a crucial role in enhancing memory consolidation through structural plasticity. In essence, this thesis offers computational support for the significance of taking breaks and allowing our brains to rest in order to solidify our memories. These findings serve as motivation for collaborative experiments between computational and experimental neuroscience, fostering a deeper understanding of the biological mechanisms underlying brain-state-dependent memory consolidation. Furthermore, these insights have the potential to inspire advancements in machine learning algorithms by incorporating principles of neuronal activity switches.
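The homeostatic reset described above follows directly from the soft-bound structure of the plasticity rule: potentiation scales with (w_max - w) and depression with w, so under the stereotyped pre/post coincidence statistics of a bursting rhythm the weight relaxes to a fixed point independent of its starting value. The event rates and learning rate in this sketch are illustrative assumptions, not the thesis's parameters:

```python
# Hedged sketch of the "homeostatic reset" with a soft-bound rule:
# dw = eta * (p_pot * (w_max - w) - p_dep * w), whose fixed point is
# w* = p_pot * w_max / (p_pot + p_dep), independent of the initial weight.
def reset_weight(w0, p_pot=0.4, p_dep=0.6, w_max=1.0, eta=0.05, n=500):
    w = w0
    for _ in range(n):
        # soft bounds: potentiation shrinks as w -> w_max,
        # depression shrinks as w -> 0
        w += eta * (p_pot * (w_max - w) - p_dep * w)
    return w
```

Strong initial weights depress and weak ones potentiate toward the same set point, which is exactly the behavior the abstract attributes to bursting combined with traditional soft-bound rules.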