The spectro-contextual encoding and retrieval theory of episodic memory.
The spectral fingerprint hypothesis, which posits that different frequencies of oscillations underlie different cognitive operations, provides one account for how interactions between brain regions support perceptual and attentive processes (Siegel et al., 2012). Here, we explore and extend this idea to the domain of human episodic memory encoding and retrieval. Incorporating findings from the synaptic to cognitive levels of organization, we argue that spectrally precise cross-frequency coupling and phase-synchronization promote the formation of hippocampal-neocortical cell assemblies that form the basis for episodic memory. We suggest that both the cell assembly firing patterns and the global pattern of brain oscillatory activity within hippocampal-neocortical networks represent the contents of a particular memory. Drawing upon the ideas of context reinstatement and multiple trace theory, we argue that memory retrieval is driven by internal and/or external factors that recreate the frequency-specific oscillatory patterns that occurred during episodic encoding. These ideas are synthesized into a novel model of episodic memory (the spectro-contextual encoding and retrieval theory, or "SCERT") that provides several testable predictions for future research.
Storage of phase-coded patterns via STDP in fully-connected and sparse network: a study of the network capacity
We study the storage and retrieval of phase-coded patterns as stable
dynamical attractors in recurrent neural networks, for both an analog and an
integrate-and-fire spiking model. The synaptic strength is determined by a
learning rule based on spike-time-dependent plasticity, with an asymmetric time
window depending on the relative timing between pre- and post-synaptic
activity. We store multiple patterns and study the network capacity.
For the analog model, we find that the network capacity scales linearly with
the network size, and that both capacity and the oscillation frequency of the
retrieval state depend on the asymmetry of the learning time window. In
addition to fully-connected networks, we study sparse networks, where each
neuron is connected only to a small number z << N of other neurons. Connections
can be short range, between neighboring neurons placed on a regular lattice, or
long range, between randomly chosen pairs of neurons. We find that a small
fraction of long range connections is able to amplify the capacity of the
network. This implies that a small-world network topology is optimal, as a
compromise between the cost of long range connections and the capacity
increase.
The crucial result of storing and retrieving multiple phase-coded patterns is
also observed in the spiking integrate-and-fire model. The capacity of the
fully-connected spiking network is investigated, together with the relation
between the oscillation frequency of the retrieval state and the window asymmetry.
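The STDP-based storage rule described in this abstract can be sketched roughly as follows: each phase-coded pattern contributes a term to the coupling matrix given by the asymmetric STDP window evaluated at the pairwise spike-time differences within one oscillation cycle. The function names, parameter values, and normalization below are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def stdp_window(dt, a_plus=1.0, a_minus=0.85, tau_plus=10.0, tau_minus=12.0):
    """Asymmetric STDP kernel: potentiation when the presynaptic spike
    precedes the postsynaptic one (dt > 0), depression otherwise.
    The asymmetry (a_plus vs a_minus, tau_plus vs tau_minus) is assumed."""
    return np.where(dt > 0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))

def store_patterns(phases, period=25.0):
    """Build the coupling matrix from P phase-coded patterns.

    phases : array of shape (P, N), spike phase of each neuron in [0, 2*pi).
    Each pattern contributes A(t_post - t_pre) for spike times on one cycle.
    """
    P, N = phases.shape
    W = np.zeros((N, N))
    for mu in range(P):
        t = phases[mu] / (2 * np.pi) * period   # spike times within a cycle
        dt = t[:, None] - t[None, :]            # t_post - t_pre for each pair
        W += stdp_window(dt)
    np.fill_diagonal(W, 0.0)                    # no self-connections
    return W / N                                # scale with network size

rng = np.random.default_rng(0)
phases = rng.uniform(0, 2 * np.pi, size=(3, 100))  # 3 patterns, 100 neurons
W = store_patterns(phases)
```

Making the window more asymmetric (e.g. lowering `a_minus`) shifts the balance of the couplings, which is the kind of manipulation the abstract relates to the oscillation frequency of the retrieval state.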
Attractor networks and memory replay of phase coded spike patterns
We analyse the storage and retrieval capacity in a recurrent neural network
of spiking integrate and fire neurons. In the model we distinguish between a
learning mode, during which the synaptic connections change according to a
Spike-Timing Dependent Plasticity (STDP) rule, and a recall mode, in which
connection strengths are no longer plastic. Our findings show the ability of the
network to store and recall periodic phase-coded patterns after a small number of
neurons has been stimulated. The self-sustained dynamics selectively gives an
oscillating spiking activity that matches one of the stored patterns, depending
on the initialization of the network.
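The recall mode described above, with frozen weights and a brief initial stimulus, can be sketched with a minimal leaky integrate-and-fire simulation. All parameter values, the Euler scheme, and the delta-pulse synapses are assumptions made for illustration.

```python
import numpy as np

def simulate_recall(W, v0, dt=0.1, tau=10.0, v_th=1.0, v_reset=0.0, steps=2000):
    """Recall mode: weights W are fixed (non-plastic); the membrane
    potential follows tau dv/dt = -v plus delta-pulse synaptic input,
    and a neuron crossing threshold spikes and is reset."""
    v = v0.copy()
    spikes = []                              # (time step, neuron index) pairs
    for step in range(steps):
        fired = v >= v_th
        spikes.extend((step, i) for i in np.flatnonzero(fired))
        v[fired] = v_reset                   # reset neurons that spiked
        i_syn = W @ fired.astype(float)      # delta-pulse input from spikes
        v = v + dt / tau * (-v) + i_syn      # leaky integration (Euler step)
    return spikes

rng = np.random.default_rng(2)
W = rng.uniform(0.0, 0.05, (50, 50))         # frozen excitatory couplings
v0 = rng.uniform(0.0, 1.2, 50)               # brief stimulus: some cells above threshold
spikes = simulate_recall(W, v0, steps=200)
```

With STDP-shaped couplings in place of the random `W` here, the self-sustained activity would replay the stored phase pattern selected by the initialization, as the abstract describes.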
Contrastive learning and neural oscillations
The concept of Contrastive Learning (CL) is developed as a family of possible learning algorithms for neural networks. CL is an extension of Deterministic Boltzmann Machines to more general dynamical systems. During learning, the network oscillates between two phases: one with a teacher signal and one without. The weights are updated using a learning rule that corresponds to gradient descent on a contrast function measuring the discrepancy between the free network and the network with a teacher signal. The CL approach provides a general unified framework for developing new learning algorithms. It also shows that many different types of clamping and teacher signals are possible. Several examples are given, and an analysis of the landscape of the contrast function is proposed with some relevant predictions for the CL curves. An approach that may be suitable for collective analog implementations is described. Simulation results and possible extensions are briefly discussed together with a new conjecture regarding the function of certain oscillations in the brain. In the appendix, we also examine two extensions of contrastive learning to time-dependent trajectories.
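A minimal sketch of one contrastive learning step in the Deterministic-Boltzmann style: the network settles once with some units clamped to a teacher signal and once freely, and the weights move along the difference of the two co-activation statistics. The fixed-point relaxation, function names, and learning rate below are illustrative assumptions.

```python
import numpy as np

def relax(W, s, clamp_idx=None, clamp_val=None, steps=50):
    """Iterate s <- tanh(W s) toward a fixed point; optionally clamp
    some units (the 'teacher signal' phase)."""
    s = s.copy()
    for _ in range(steps):
        s = np.tanh(W @ s)
        if clamp_idx is not None:
            s[clamp_idx] = clamp_val
    return s

def contrastive_step(W, s0, clamp_idx, clamp_val, lr=0.01):
    """One CL update: settle in the clamped and the free phase, then
    descend the contrast function via the difference of co-activations:
    dW ~ <s_i s_j>_clamped - <s_i s_j>_free."""
    s_clamped = relax(W, s0, clamp_idx, clamp_val)
    s_free = relax(W, s0)
    dW = lr * (np.outer(s_clamped, s_clamped) - np.outer(s_free, s_free))
    dW = (dW + dW.T) / 2.0        # keep the coupling matrix symmetric
    np.fill_diagonal(dW, 0.0)     # no self-couplings
    return W + dW

rng = np.random.default_rng(3)
N = 20
W = rng.normal(0.0, 0.1, (N, N))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)
s0 = rng.uniform(-1.0, 1.0, N)
W_new = contrastive_step(W, s0, clamp_idx=np.arange(5), clamp_val=np.ones(5))
```

The alternation between the clamped and free phases is the "oscillation" the abstract refers to; the update vanishes when the free network already reproduces the teacher-constrained statistics.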
Associative memory of phase-coded spatiotemporal patterns in leaky Integrate and Fire networks
We study the collective dynamics of a Leaky Integrate and Fire network in
which precise relative phase relationships of spikes among neurons are stored,
as attractors of the dynamics, and selectively replayed at different time
scales. Using an STDP-based learning process, we store in the connectivity
several phase-coded spike patterns, and we find that, depending on the
excitability of the network, different working regimes are possible, with
transient or persistent replay activity induced by a brief signal. We introduce
an order parameter to evaluate the similarity between stored and recalled
phase-coded patterns, and measure the storage capacity. Modulation of spiking
thresholds during replay changes the frequency of the collective oscillation or
the number of spikes per cycle, while preserving the phase relationships. This
allows a coding scheme in which phase, rate and frequency are dissociable.
Robustness with respect to noise and heterogeneity of neuron parameters is
studied, showing that, since the dynamics is a retrieval process, the neurons
preserve stable, precise phase relationships among units and keep a unique
frequency of oscillation, even in noisy conditions and with heterogeneity of the
units' internal parameters.
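An order parameter of the kind described above can be sketched as the magnitude of the mean complex phase difference between the recalled and the stored pattern: it equals 1 for perfect retrieval (up to a global phase shift) and stays near 0 for unrelated patterns. The exact normalization used in the paper is an assumption here.

```python
import numpy as np

def phase_overlap(stored, recalled):
    """|<exp(i*(recalled - stored))>| averaged over neurons, in [0, 1].
    Invariant under a global phase shift of the recalled pattern."""
    return np.abs(np.mean(np.exp(1j * (recalled - stored))))

rng = np.random.default_rng(1)
stored = rng.uniform(0, 2 * np.pi, 200)
perfect = phase_overlap(stored, stored + 0.7)               # global shift only
unrelated = phase_overlap(stored, rng.uniform(0, 2 * np.pi, 200))
```

Because the measure depends only on phase differences, it stays high when threshold modulation changes the oscillation frequency or the number of spikes per cycle, matching the dissociation of phase, rate and frequency noted in the abstract.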