
    Attractor networks and memory replay of phase coded spike patterns

    We analyse the storage and retrieval capacity in a recurrent neural network of spiking integrate-and-fire neurons. In the model we distinguish between a learning mode, during which the synaptic connections change according to a Spike-Timing Dependent Plasticity (STDP) rule, and a recall mode, in which connection strengths are no longer plastic. Our findings show the ability of the network to store and recall periodic phase-coded patterns after a small number of neurons has been stimulated. The self-sustained dynamics selectively gives an oscillating spiking activity that matches one of the stored patterns, depending on the initialization of the network. (arXiv admin note: text overlap with arXiv:1210.678)
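
The learning/recall distinction described above can be sketched as follows. This is an illustrative toy example, not the paper's model: an asymmetric STDP window (all parameter values are assumptions) is applied once to a three-neuron periodic phase-coded pattern, after which the weights are frozen for recall.

```python
import numpy as np

def stdp_dw(dt, a_plus=1.0, a_minus=0.5, tau_plus=20.0, tau_minus=20.0):
    """Asymmetric STDP window: dt = t_post - t_pre (ms).
    Potentiation when the presynaptic spike precedes the postsynaptic one."""
    return np.where(dt > 0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))

# Learning mode: accumulate weight changes from one cycle of a phase-coded pattern.
period = 100.0                       # ms, one oscillation cycle (illustrative)
phases = np.array([0.0, 0.25, 0.5])  # phase of each neuron's spike in the cycle
spike_times = phases * period
W = np.zeros((3, 3))
for i in range(3):                   # postsynaptic index
    for j in range(3):               # presynaptic index
        if i != j:
            W[i, j] += stdp_dw(spike_times[i] - spike_times[j])

# Recall mode: W is now fixed; earlier-phase neurons excite later-phase ones,
# so the stored phase order can be replayed.
print(W[1, 0] > 0, W[0, 1] < 0)  # True True
```

The asymmetry of the window is what encodes the temporal order of the pattern in the connectivity.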

    Associative memory of phase-coded spatiotemporal patterns in leaky Integrate and Fire networks

    We study the collective dynamics of a Leaky Integrate and Fire network in which precise relative phase relationships of spikes among neurons are stored, as attractors of the dynamics, and selectively replayed at different time scales. Using an STDP-based learning process, we store several phase-coded spike patterns in the connectivity, and we find that, depending on the excitability of the network, different working regimes are possible, with transient or persistent replay activity induced by a brief signal. We introduce an order parameter to evaluate the similarity between stored and recalled phase-coded patterns, and measure the storage capacity. Modulation of spiking thresholds during replay changes the frequency of the collective oscillation or the number of spikes per cycle while preserving the phase relationships. This allows a coding scheme in which phase, rate and frequency are dissociable. Robustness with respect to noise and heterogeneity of neuron parameters is studied, showing that, since the dynamics is a retrieval process, neurons preserve stable, precise phase relationships among units, keeping a unique frequency of oscillation, even in noisy conditions and with heterogeneity of the internal parameters of the units.
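
An order parameter of the kind mentioned above can be sketched as a phase-coherence measure; the exact definition used in the paper may differ, so this is only an illustrative version that is invariant under a global phase shift:

```python
import numpy as np

def phase_overlap(stored, recalled):
    """Order parameter in [0, 1]: magnitude of the mean phase-difference vector.
    Equals 1 when recalled phases match the stored ones up to a global shift."""
    return np.abs(np.mean(np.exp(1j * (recalled - stored))))

stored = np.array([0.0, 1.0, 2.0, 3.0])
shifted = stored + 0.7                            # perfect recall up to a shift
noisy = stored + np.array([0.0, 0.5, -0.5, 0.3])  # jittered recall

print(round(phase_overlap(stored, shifted), 3))  # 1.0
print(phase_overlap(stored, noisy) < 1.0)        # True
```

A value near 1 would indicate successful retrieval of the stored pattern, and lower values a degraded or absent replay.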

    Storage of phase-coded patterns via STDP in fully-connected and sparse network: a study of the network capacity

    We study the storage and retrieval of phase-coded patterns as stable dynamical attractors in recurrent neural networks, for both an analog and an integrate-and-fire spiking model. The synaptic strength is determined by a learning rule based on spike-time-dependent plasticity, with an asymmetric time window depending on the relative timing between pre- and post-synaptic activity. We store multiple patterns and study the network capacity. For the analog model, we find that the network capacity scales linearly with the network size, and that both the capacity and the oscillation frequency of the retrieval state depend on the asymmetry of the learning time window. In addition to fully-connected networks, we study sparse networks, where each neuron is connected only to a small number z << N of other neurons. Connections can be short range, between neighboring neurons placed on a regular lattice, or long range, between randomly chosen pairs of neurons. We find that a small fraction of long-range connections is able to amplify the capacity of the network. This implies that a small-world network topology is optimal, as a compromise between the cost of long-range connections and the capacity increase. The crucial result of storage and retrieval of multiple phase-coded patterns is also observed in the spiking integrate-and-fire model. The capacity of the fully-connected spiking network is investigated, together with the relation between the oscillation frequency of the retrieval state and the window asymmetry.
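
The small-world effect described above can be sketched with a standard ring-lattice-plus-shortcuts construction (an illustrative assumption, not necessarily the paper's exact network): each neuron connects to its z nearest neighbours, a fraction p of edges is rewired to random long-range targets, and a few shortcuts sharply shorten average path lengths.

```python
import random
from collections import deque

def ring_with_shortcuts(n, z, p, seed=0):
    """Ring lattice: each node links to its z nearest neighbours; a fraction p
    of edges is rewired to a random long-range target (small-world construction)."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for k in range(1, z // 2 + 1):
            j = (i + k) % n
            if rng.random() < p:
                j = rng.randrange(n)      # long-range rewiring
            if j != i:
                adj[i].add(j); adj[j].add(i)
    return adj

def mean_path_length(adj):
    """Average shortest-path length over all reachable pairs (BFS per source)."""
    total = cnt = 0
    for s in adj:
        dist = {s: 0}; q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1; q.append(v)
        total += sum(dist.values()); cnt += len(dist) - 1
    return total / cnt

lattice = mean_path_length(ring_with_shortcuts(200, 4, 0.0))
smallworld = mean_path_length(ring_with_shortcuts(200, 4, 0.1))
print(lattice > smallworld)  # a few shortcuts sharply reduce path length
```

Shorter paths between randomly chosen pairs are one intuition for why a small fraction of long-range connections can amplify capacity at low wiring cost.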

    Neural Avalanches at the Critical Point between Replay and Non-Replay of Spatiotemporal Patterns

    We model spontaneous cortical activity with a network of coupled spiking units, in which multiple spatio-temporal patterns are stored as dynamical attractors. We introduce an order parameter, which measures the overlap (similarity) between the activity of the network and the stored patterns. We find that, depending on the excitability of the network, different working regimes are possible. For high excitability, the dynamical attractors are stable, and a collective activity that replays one of the stored patterns emerges spontaneously, while for low excitability, no replay is induced. Between these two regimes, there is a critical region in which the dynamical attractors are unstable, and intermittent short replays are induced by noise. At the critical spiking threshold, the order parameter goes from zero to one, and its fluctuations are maximized, as expected for a phase transition (and as observed in recent experimental results in the brain). Notably, in this critical region, the avalanche size and duration distributions follow power laws. Critical exponents are consistent with a scaling relationship observed recently in neural avalanche measurements. In conclusion, our simple model suggests that avalanche power laws in cortical spontaneous activity may be the effect of a network at the critical point between the replay and non-replay of spatio-temporal patterns.
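
A common way to extract avalanche sizes and durations from a population spike-count series is to segment it into maximal runs of suprathreshold bins; the thresholding convention below is an assumption for illustration, not necessarily the paper's exact definition.

```python
import numpy as np

def avalanches(activity, theta=0):
    """Segment a population spike-count series into avalanches: maximal runs of
    bins with activity above theta. Size = total spikes, duration = run length."""
    sizes, durations, s, d = [], [], 0, 0
    for a in activity:
        if a > theta:
            s += a; d += 1
        elif d:
            sizes.append(s); durations.append(d); s = d = 0
    if d:  # close an avalanche that runs to the end of the recording
        sizes.append(s); durations.append(d)
    return np.array(sizes), np.array(durations)

counts = np.array([0, 3, 1, 0, 0, 2, 2, 5, 0, 1, 0])
sizes, durs = avalanches(counts)
print(sizes.tolist(), durs.tolist())  # [4, 9, 1] [2, 3, 1]
```

At criticality, histograms of these sizes and durations would be the quantities expected to follow power laws.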

    Dynamic Control of Network Level Information Processing through Cholinergic Modulation

    Acetylcholine (ACh) release is a prominent neurochemical marker of arousal state within the brain. Changes in ACh are associated with changes in neural activity and information processing, though its exact role and the mechanisms through which it acts are unknown. Here I show that the dynamic changes in ACh levels that are associated with arousal state control information processing functions of networks through their effects on the degree of Spike-Frequency Adaptation (SFA), an activity-dependent decrease in excitability, synchronizability, and neuronal resonance displayed by single cells. Using numerical modeling I develop mechanistic explanations for how control of these properties shifts network activity from a stable high-frequency spiking pattern to a traveling wave of activity. This transition mimics the change in brain dynamics seen between high ACh states, such as waking and Rapid Eye Movement (REM) sleep, and low ACh states such as Non-REM (NREM) sleep. A corresponding, and related, transition in network-level memory recall also occurs as ACh modulates neuronal SFA. When ACh is at its highest levels (waking), all memories are stably recalled; as ACh is decreased (REM), weakly encoded memories in the model destabilize while strong memories remain stable. At ACh levels that match Slow Wave Sleep (SWS), no encoded memories are stably recalled. This results from a competition between SFA and excitatory input strength and provides a mechanism for neural networks to control the representation of underlying synaptic information. Finally I show that during low ACh conditions, oscillatory dynamics allow external inputs to be properly stored in and recalled from synaptic weights. Taken together, this work demonstrates that dynamic neuromodulation is critical for the regulation of information processing tasks in neural networks. These results suggest that ACh is capable of switching networks between two distinct information processing modes.
    Rate coding of information is facilitated during high ACh conditions and phase coding of information is facilitated during low ACh conditions. Finally I propose that ACh levels control whether a network is in one of three functional states: (high ACh; active waking) optimized for the encoding of new information or the stable representation of relevant memories; (mid ACh; resting state or REM) optimized for encoding connections between currently stored memories or searching the catalog of stored memories; and (low ACh; NREM) optimized for the renormalization of synaptic strength and memory consolidation. This work provides mechanistic insight into the role of dynamic changes in ACh levels for the encoding, consolidation, and maintenance of memories within the brain.
    PhD thesis, Neuroscience, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/147503/1/roachjp_1.pd
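
Spike-frequency adaptation of the kind ACh is described as modulating can be sketched with an adaptive leaky integrate-and-fire unit. All parameter values here, including `g_sfa` (the per-spike adaptation increment), are illustrative assumptions, with high ACh corresponding to weak SFA:

```python
def adaptive_lif(I, g_sfa, n_steps=2000, dt=0.1, tau_m=20.0, tau_a=200.0,
                 v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire with an adaptation current `a` that jumps by
    g_sfa at each spike and decays with time constant tau_a. Returns the
    number of spikes fired over n_steps * dt ms of constant drive I."""
    v = a = 0.0
    spikes = 0
    for _ in range(n_steps):
        v += dt * (-v / tau_m + I - a)   # membrane integration
        a += dt * (-a / tau_a)           # adaptation decay
        if v >= v_th:
            v = v_reset
            a += g_sfa                   # spike-triggered adaptation increment
            spikes += 1
    return spikes

high_ach = adaptive_lif(I=0.2, g_sfa=0.0)   # no adaptation: tonic fast firing
low_ach = adaptive_lif(I=0.2, g_sfa=0.05)   # strong SFA slows the firing rate
print(high_ach > low_ach)  # True
```

The growing adaptation current progressively cancels the input drive, which is the single-cell mechanism behind the rate decrease that the thesis links to low-ACh network regimes.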

    Mesoscopic description of hippocampal replay and metastability in spiking neural networks with short-term plasticity

    Bottom-up models of functionally relevant patterns of neural activity provide an explicit link between neuronal dynamics and computation. A prime example of a functional activity pattern is hippocampal replay, which is critical for memory consolidation. The switching between replay events and a low-activity state in neural recordings suggests metastable neural circuit dynamics. As metastability has been attributed to noise and/or slow fatigue mechanisms, we propose a concise mesoscopic model which accounts for both. Crucially, our model is bottom-up: it is analytically derived from the dynamics of finite-size networks of Linear-Nonlinear Poisson neurons with short-term synaptic depression. As such, noise is explicitly linked to spike noise and network size, and fatigue is explicitly linked to synaptic dynamics. To derive the mesoscopic model, we first consider a homogeneous spiking neural network and follow the temporal coarse-graining approach of Gillespie ("chemical Langevin equation"), which can be naturally interpreted as a stochastic neural mass model. The Langevin equation is computationally inexpensive to simulate and enables a thorough study of metastable dynamics in classical setups (population spikes and Up-Down state dynamics) by means of phase-plane analysis. This stochastic neural mass model is the basic component of our mesoscopic model for replay. We show that our model faithfully captures the stochastic nature of individual replayed trajectories. Moreover, compared to the deterministic Romani-Tsodyks model of place cell dynamics, it exhibits a higher level of variability in terms of content, direction and timing of replay events, which is compatible with biological evidence and could be functionally desirable. This variability is the product of a new dynamical regime where metastability emerges from a complex interplay between finite-size fluctuations and local fatigue. (43 pages, 8 figures)
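
An Euler-Maruyama integration of a chemical-Langevin-style neural mass with short-term depression might look as follows. The transfer function, the noise scaling (finite-size noise proportional to the square root of rate over network size N), and all parameter values are illustrative assumptions, not the paper's derived equations:

```python
import numpy as np

def simulate(J=10.0, I=0.0, N=500, tau=0.01, tau_d=0.2, u=0.2,
             T=2.0, dt=1e-4, seed=1):
    """Euler-Maruyama integration of a stochastic neural mass h with a
    short-term depression variable x; the noise term scales as sqrt(rate / N),
    mimicking finite-size spiking fluctuations (chemical-Langevin style)."""
    rng = np.random.default_rng(seed)
    f = lambda h: 50.0 / (1.0 + np.exp(-h))   # population rate function (Hz)
    h, x = 0.0, 1.0
    hs = []
    for _ in range(int(round(T / dt))):
        r = f(h)
        noise = np.sqrt(r / N) * rng.normal() * np.sqrt(dt)
        h += dt * (-h + J * u * x * r * tau + I) / tau + noise
        x += dt * ((1.0 - x) / tau_d - u * x * r)   # depletion and recovery
        hs.append(h)
    return np.array(hs)

trace = simulate()
print(trace.shape)  # (20000,)
```

Such a scalar stochastic model is what makes phase-plane analysis of metastable regimes computationally cheap compared to simulating the full spiking network.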

    Space, time and memory in the medial temporal lobe

    This thesis focuses on memory and the representation of space in the medial temporal lobe, their interaction and their temporal structure. Chapter 1 briefly introduces the topic, with emphasis on the open questions that the subsequent chapters aim to address. Chapter 2 is dedicated to the issue of spatial memory in the medial entorhinal cortex. It investigates, from a theoretical perspective, the possibility of storing multiple independent maps in a recurrent network of grid cells. This work was conducted in collaboration with Remi Monasson, Alexis Dubreuil and Sophie Rosay and is published in (Spalla et al. 2019). Chapter 3 focuses on the problem of the dynamical update of the representation of space during navigation. It presents the results of the analysis of electrophysiological data, previously collected by Charlotte Boccara (Boccara et al., 2010), investigating the encoding of self-movement signals (speed and angular velocity of the head) in the parahippocampal region of rats. Chapter 4 addresses the problem of the temporal dynamics of memory retrieval, again from a computational point of view. A continuous attractor network model is presented, endowed with a mechanism that makes it able to retrieve continuous temporal sequences. The dynamical behaviour of the system is investigated with analytical calculations and numerical simulations, and the storage capacity for dynamical memories is computed. Finally, chapter 5 discusses the meaning and the scope of the results presented, and highlights possible future directions.

    Dual coding with STDP in a spiking recurrent neural network model of the hippocampus.

    The firing rate of single neurons in the mammalian hippocampus has been demonstrated to encode a range of spatial and non-spatial stimuli. It has also been demonstrated that the phase of firing, with respect to the theta oscillation that dominates the hippocampal EEG during stereotyped learning behaviour, correlates with an animal's spatial location. These findings have led to the hypothesis that the hippocampus operates using a dual (rate and temporal) coding system. To investigate the phenomenon of dual coding in the hippocampus, we examine a spiking recurrent network model with theta-coded neural dynamics and an STDP rule that mediates rate-coded Hebbian learning when pre- and post-synaptic firing is stochastic. We demonstrate that this plasticity rule can generate both symmetric and asymmetric connections between neurons that fire at concurrent or successive theta phases, respectively, and subsequently produce both pattern completion and sequence prediction from partial cues. This unifies previously disparate auto- and hetero-associative network models of hippocampal function and provides them with a firmer basis in modern neurobiology. Furthermore, the encoding and reactivation of activity in mutually exciting Hebbian cell assemblies demonstrated here is believed to represent a fundamental mechanism of cognitive processing in the brain.
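
The symmetric-versus-asymmetric distinction above can be sketched with a toy temporally asymmetric STDP rule: two cells firing at the same theta phase (with jitter) end up with near-symmetric average weights, while cells firing at successive phases end up with strongly asymmetric weights. Window shape and all parameters are illustrative, not the paper's rule.

```python
import numpy as np

def stdp(dt, a_plus=1.0, a_minus=1.0, tau=10.0):
    """Additive STDP: dt = t_post - t_pre (ms); potentiation for positive dt,
    depression for negative dt, with exponential decay on both sides."""
    return np.sign(dt) * np.exp(-abs(dt) / tau) * (a_plus if dt > 0 else a_minus)

rng = np.random.default_rng(0)

def mean_weights(phase_lag, trials=2000, jitter=2.0):
    """Average W_ij and W_ji for two cells firing phase_lag ms apart, with
    Gaussian jitter modelling stochastic spike timing within the theta cycle."""
    w_ij = w_ji = 0.0
    for _ in range(trials):
        t_i = rng.normal(0.0, jitter)
        t_j = rng.normal(phase_lag, jitter)
        w_ij += stdp(t_j - t_i)   # connection i -> j
        w_ji += stdp(t_i - t_j)   # connection j -> i
    return w_ij / trials, w_ji / trials

same_phase = mean_weights(0.0)    # near-symmetric: both averages close to zero
successive = mean_weights(10.0)   # asymmetric: i->j potentiated, j->i depressed
print(abs(same_phase[0] - same_phase[1]) < abs(successive[0] - successive[1]))  # True
```

The symmetric case supports auto-associative pattern completion, while the asymmetric case supports hetero-associative sequence prediction, mirroring the unification the paper describes.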