Hebbian Wiring Plasticity Generates Efficient Network Structures for Robust Inference with Synaptic Weight Plasticity
In the adult mammalian cortex, a small fraction of spines is created and eliminated every day, and the resultant synaptic connection structure is highly nonrandom, even in local circuits. However, it remains unknown whether a particular synaptic connection structure is functionally advantageous in local circuits, and why the creation and elimination of synaptic connections are necessary in addition to rich synaptic weight plasticity. To answer these questions, we studied an inference task model through theoretical and numerical analyses. We demonstrate that a robustly beneficial network structure naturally emerges by combining Hebbian-type synaptic weight plasticity and wiring plasticity. Especially in a sparsely connected network, wiring plasticity achieves reliable computation by enabling efficient information transmission. Furthermore, the proposed rule reproduces the experimentally observed correlation between spine dynamics and task performance.
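A minimal sketch of how Hebbian weight plasticity and wiring plasticity (spine creation and elimination) can be combined, assuming a rate-based toy network; all names and parameters below are illustrative and not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

n_pre, n_post = 40, 5
conn = rng.random((n_post, n_pre)) < 0.2          # sparse wiring mask
w = rng.uniform(0.1, 1.0, (n_post, n_pre)) * conn
n0 = int(conn.sum())                              # total synapse count

eta, w_elim = 0.05, 0.05                          # learning rate, elimination threshold

for step in range(300):
    x = (rng.random(n_pre) < 0.2).astype(float)   # presynaptic activity
    y = np.tanh(w @ x)                            # postsynaptic rates
    # Hebbian weight plasticity with mild decay, on existing synapses only
    w += conn * (eta * np.outer(y, x) - 0.1 * eta * w)
    # Wiring plasticity: eliminate weak synapses and create new ones at
    # random free sites, keeping the total number of synapses fixed
    weak = conn & (w < w_elim)
    for i, j in zip(*np.nonzero(weak)):
        conn[i, j], w[i, j] = False, 0.0
        k = rng.choice(np.nonzero(~conn[i])[0])   # new spine at a free site
        conn[i, k], w[i, k] = True, 0.1
```

The wiring step implements constant spine turnover at a fixed connection count, so synapses gradually concentrate on inputs that are coactive with the postsynaptic cell.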
Retrieval Properties of Hopfield and Correlated Attractors in an Associative Memory Model
We examine a previously introduced attractor neural network model that explains the persistent activities of neurons in the anterior ventral temporal cortex of the brain. In this model, the coexistence of several attractors, including correlated attractors, was reported in the cases of finite and infinite loading. In this paper, by means of a statistical mechanical method, we study the statics and dynamics of the model in both finite and extensive loading, mainly focusing on the retrieval properties of the Hopfield and correlated attractors. In the extensive loading case, we derive the evolution equations by the dynamical replica theory. We found several characteristic temporal behaviours in both the finite and extensive loading cases. The theoretical results were confirmed by numerical simulations.
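The retrieval property under study can be illustrated with a standard finite-loading Hopfield network (Hebb-rule couplings, synchronous sign updates), where the overlap m with each stored pattern measures retrieval quality. This is a generic textbook sketch, not the paper's correlated-attractor model:

```python
import numpy as np

rng = np.random.default_rng(1)

N, P = 500, 3                                    # finite loading: P << N
xi = rng.choice([-1, 1], size=(P, N))            # stored patterns
J = (xi.T @ xi) / N                              # Hebbian couplings
np.fill_diagonal(J, 0.0)

# Start from pattern 0 with 20% of spins flipped, then iterate
s = xi[0] * np.where(rng.random(N) < 0.2, -1, 1)
for _ in range(10):
    s = np.sign(J @ s)
    s[s == 0] = 1

m = xi @ s / N                                   # overlaps with each pattern
```

The network relaxes to the Hopfield attractor nearest the initial state, so m[0] approaches 1 while the overlaps with the other patterns remain small.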
Noncommutative generalized Gibbs ensemble in isolated integrable quantum systems
The generalized Gibbs ensemble (GGE), which involves multiple conserved quantities other than the Hamiltonian, has served as the statistical-mechanical description of the long-time behavior of several isolated integrable quantum systems. We point out that the GGE may involve a noncommutative set of conserved quantities in view of the maximum entropy principle, and show that the GGE thus generalized (the noncommutative GGE, NCGGE) gives a more qualitatively accurate description of the long-time behaviors than the conventional GGE does. Providing a clear understanding of why the (NC)GGE describes the long-time behaviors well, we construct, for noninteracting models, the exact NCGGE that describes the long-time behaviors without error even at finite system size. It is noteworthy that the NCGGE involves nonlocal conserved quantities, which can be necessary for describing the long-time behaviors of local observables. We also give some extensions of the NCGGE and demonstrate how accurately they describe the long-time behaviors of few-body observables.
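The construction rho ∝ exp(-sum_i lambda_i Q_i) behind the NCGGE can be sketched numerically: even when the charges Q_i do not commute, the maximum-entropy state is still a valid density matrix. The charges and multipliers below are random placeholders, not those of any integrable model:

```python
import numpy as np

rng = np.random.default_rng(2)
d, n_charges = 6, 3

def random_hermitian(d):
    a = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    return (a + a.conj().T) / 2

Q = [random_hermitian(d) for _ in range(n_charges)]   # noncommuting charges
lam = rng.uniform(0.1, 1.0, n_charges)                # Lagrange multipliers

# rho = exp(-sum_i lam_i Q_i) / Z via eigendecomposition of the exponent
K = sum(l * q for l, q in zip(lam, Q))
evals, V = np.linalg.eigh(K)
rho = (V * np.exp(-evals)) @ V.conj().T
rho /= np.trace(rho).real

expectations = [np.trace(rho @ q).real for q in Q]    # ensemble averages
```

By construction rho is Hermitian, positive, and unit-trace, so the noncommutativity of the charges poses no obstacle to the max-entropy construction itself; the physics enters through which charges are included.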
Detailed dendritic excitatory/inhibitory balance through heterosynaptic spike-timing-dependent plasticity
The balance between excitatory and inhibitory inputs is a key feature of cortical dynamics. Such a balance is arguably preserved in dendritic branches, yet its underlying mechanism and functional roles remain unknown. In this study, we developed computational models of heterosynaptic spike-timing-dependent plasticity (STDP) to show that the excitatory/inhibitory balance in dendritic branches is robustly achieved through heterosynaptic interactions between excitatory and inhibitory synapses. The model reproduces key features of experimentally observed heterosynaptic STDP and provides analytical insights. Furthermore, heterosynaptic STDP explains how the maturation of inhibitory neurons modulates the selectivity of excitatory neurons for binocular matching during critical-period plasticity. The model also provides an alternative explanation for the potential mechanism underlying the somatic detailed balance that is commonly associated with inhibitory STDP. Our results point to heterosynaptic STDP as a critical factor in synaptic organization and the resultant dendritic computation.
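A much-simplified, rate-based sketch of how a heterosynaptic-style rule can drive branch-wise E/I balance: inhibitory weights on a branch are potentiated when local excitation exceeds local inhibition. This stands in for (and is far simpler than) the paper's spike-timing-dependent model; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

n_branch, n_syn = 4, 20
w_e = rng.uniform(0.5, 1.5, (n_branch, n_syn))   # excitatory weights (fixed here)
w_i = rng.uniform(0.0, 0.5, (n_branch, n_syn))   # inhibitory weights (plastic)
eta = 0.01

def imbalance(w_e, w_i):
    return np.abs(w_e.sum(axis=1) - w_i.sum(axis=1)) / w_e.sum(axis=1)

imbalance0 = imbalance(w_e, w_i)

for step in range(2000):
    x_e = (rng.random((n_branch, n_syn)) < 0.3).astype(float)
    x_i = (rng.random((n_branch, n_syn)) < 0.3).astype(float)
    e_cur = (w_e * x_e).sum(axis=1)              # branch-wise excitation
    i_cur = (w_i * x_i).sum(axis=1)              # branch-wise inhibition
    # Active inhibitory synapses on a branch grow when the branch is
    # excitation-dominated and shrink otherwise
    w_i += eta * (e_cur - i_cur)[:, None] * x_i
    w_i = np.clip(w_i, 0.0, None)

imbalance_final = imbalance(w_e, w_i)
```

Each branch settles near its own E/I balance rather than a single somatic one, which is the "detailed" aspect of the balance the abstract refers to.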
Redundancy in synaptic connections enables neurons to learn optimally
Recent experimental studies suggest that, in cortical microcircuits of the mammalian brain, the majority of neuron-to-neuron connections are realized by multiple synapses. However, it is not known whether such redundant synaptic connections provide any functional benefit. Here, we show that redundant synaptic connections enable near-optimal learning in cooperation with synaptic rewiring. By constructing a simple dendritic neuron model, we demonstrate that, with multisynaptic connections, synaptic plasticity approximates a sample-based Bayesian filtering algorithm known as particle filtering, and wiring plasticity implements its resampling process. Extending the proposed framework to a detailed single-neuron model of perceptual learning in the primary visual cortex, we show that the model accounts for many experimental observations. In particular, the proposed model reproduces the dendritic position dependence of spike-timing-dependent plasticity and the functional synaptic organization on the dendritic tree based on the stimulus selectivity of presynaptic neurons. Our study provides a conceptual framework for synaptic plasticity and rewiring.
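The particle-filter analogy can be illustrated with a toy scalar inference problem: each synapse of a multisynaptic connection plays the role of one particle, and rewiring corresponds to the resampling step. This is a generic bootstrap particle filter, not the paper's neuron model; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

n_syn = 200                        # redundant synapses = particles
theta_true = 0.7                   # hidden quantity to infer
obs_sd = 0.5                       # observation noise

particles = rng.uniform(-2.0, 2.0, n_syn)
weights = np.full(n_syn, 1.0 / n_syn)

for t in range(50):
    obs = theta_true + rng.normal(0.0, obs_sd)
    # Weight update = synaptic weight plasticity (Bayesian reweighting)
    weights *= np.exp(-0.5 * ((obs - particles) / obs_sd) ** 2)
    weights /= weights.sum()
    # Resampling = synaptic rewiring: replace low-weight synapses by
    # jittered copies of high-weight ones when diversity collapses
    ess = 1.0 / np.sum(weights ** 2)
    if ess < n_syn / 2:
        idx = rng.choice(n_syn, n_syn, p=weights)
        particles = particles[idx] + rng.normal(0.0, 0.02, n_syn)
        weights = np.full(n_syn, 1.0 / n_syn)

estimate = float(np.sum(weights * particles))
```

The weighted mean of the surviving "synapses" converges toward the hidden value, which is why redundancy helps: a single synapse could track only a point estimate, not a posterior.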
Interactive reservoir computing for chunking information streams
Chunking is the process by which frequently repeated segments of temporal inputs are concatenated into single units that are easy to process. Such a process is fundamental to time-series analysis in biological and artificial information processing systems. The brain efficiently acquires chunks from various information streams in an unsupervised manner; however, the underlying mechanisms of this process remain elusive. A widely adopted statistical method for chunking consists of predicting frequently repeated contiguous elements in an input sequence based on unequal transition probabilities over sequence elements. However, recent experimental findings suggest that the brain is unlikely to adopt this method, as human subjects can chunk sequences with uniform transition probabilities. In this study, we propose a novel conceptual framework to overcome this limitation. In this framework, neural networks learn to predict dynamical response patterns to sequence input rather than to directly learn transition patterns. Using a mutually supervising pair of reservoir computing modules, we demonstrate how this mechanism works in chunking sequences of letters or visual images with variable regularity and complexity. In addition, we demonstrate that background noise plays a crucial role in correctly learning chunks in this model. In particular, the model can successfully chunk sequences that conventional statistical approaches fail to chunk due to uniform transition probabilities. Finally, the neural responses of the model exhibit an interesting similarity to those of the basal ganglia observed after motor habit formation.
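A bare-bones echo-state sketch of the underlying idea: the network learns to predict its own dynamical responses to a sequence rather than the transition statistics of the sequence itself. A single reservoir with a linear self-prediction readout stands in for the paper's pair of mutually supervising modules; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

n_res, n_in = 100, 4
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # echo-state property
W_in = rng.normal(0.0, 1.0, (n_res, n_in))

seq = np.tile([0, 1, 2, 3], 50)                   # repeated chunk "ABCD"
X = np.zeros((len(seq), n_res))
x = np.zeros(n_res)
for t, s in enumerate(seq):
    x = np.tanh(W @ x + W_in @ np.eye(n_in)[s])   # one-hot input drive
    X[t] = x

# Ridge-regression readout trained to predict the next reservoir state;
# low prediction error signals a learned (chunked) regularity
A, B = X[:-1], X[1:]
W_out = np.linalg.solve(A.T @ A + 1e-2 * np.eye(n_res), A.T @ B)
err = float(np.mean((A @ W_out - B) ** 2))
```

Because the reservoir responds to context, not just to the current symbol, this prediction target remains informative even when the symbol-level transition probabilities are uniform.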
Symmetric sequence processing in a recurrent neural network model with a synchronous dynamics
The synchronous dynamics and the stationary states of a recurrent attractor neural network model with synapses that compete between symmetric sequence processing and Hebbian pattern reconstruction are studied in this work, allowing for the presence of a self-interaction for each unit. Phase diagrams of stationary states are obtained, exhibiting phases of retrieval, symmetric and period-two cyclic states, as well as correlated and frozen-in states, in the absence of noise. The frozen-in states are destabilised by synaptic noise, and well-separated regions of correlated and cyclic states are obtained. Excitatory or inhibitory self-interactions yield enlarged phases of fixed-point or cyclic behaviour.
Comment: Accepted for publication in Journal of Physics A: Mathematical and Theoretical
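The period-two cyclic states mentioned in the abstract can be illustrated in the simplest setting: symmetric sequence couplings between two patterns, J ∝ xi2 xi1^T + xi1 xi2^T, make the synchronous dynamics alternate between the two patterns. A toy sketch with illustrative parameters (no self-interaction, no noise):

```python
import numpy as np

rng = np.random.default_rng(6)

N = 400
xi1 = rng.choice([-1, 1], N)
xi2 = rng.choice([-1, 1], N)
# Symmetric sequence couplings between the two patterns
J = (np.outer(xi2, xi1) + np.outer(xi1, xi2)) / N

s = xi1.astype(float)
states = []
for _ in range(6):                        # synchronous (parallel) updates
    s = np.sign(J @ s)
    s[s == 0] = 1.0
    states.append(s.copy())

m1 = [abs(v @ xi1) / N for v in states]   # overlaps with pattern 1
m2 = [abs(v @ xi2) / N for v in states]   # overlaps with pattern 2
```

Starting from xi1 the network jumps to xi2 and back, a period-two cycle; adding a Hebbian term xi1 xi1^T + xi2 xi2^T to J would instead bias the competition toward fixed-point retrieval.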