Hebbian Wiring Plasticity Generates Efficient Network Structures for Robust Inference with Synaptic Weight Plasticity
In the adult mammalian cortex, a small fraction of spines are created and eliminated every day, and the resultant synaptic connection structure is highly nonrandom, even in local circuits. However, it remains unknown whether a particular synaptic connection structure is functionally advantageous in local circuits, and why creation and elimination of synaptic connections is necessary in addition to rich synaptic weight plasticity. To answer these questions, we studied an inference task model through theoretical and numerical analyses. We demonstrate that a robustly beneficial network structure naturally emerges by combining Hebbian-type synaptic weight plasticity and wiring plasticity. Especially in a sparsely connected network, wiring plasticity achieves reliable computation by enabling efficient information transmission. Furthermore, the proposed rule reproduces the experimentally observed correlation between spine dynamics and task performance.
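The combination of weight and wiring plasticity described above can be illustrated with a minimal sketch. This is not the paper's actual rule; the learning rate, pruning threshold, and spine-creation probability are illustrative assumptions. Hebbian updates act only on existing synapses, weak synapses are eliminated, and new spines are created at random and survive only if co-activity potentiates them.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PRE, N_POST = 50, 20
ETA = 0.05           # Hebbian learning rate (assumed)
W_MAX = 1.0          # saturation bound on synaptic weights (assumed)
PRUNE_THRESH = 0.01  # weight below which a synapse is eliminated (assumed)
CREATE_PROB = 0.02   # per-step probability of creating a new spine (assumed)

# Binary connectivity mask (which synapses exist) and their weights.
mask = rng.random((N_POST, N_PRE)) < 0.2
w = np.where(mask, rng.random((N_POST, N_PRE)) * 0.1, 0.0)

def step(pre_rates):
    """One combined weight-plasticity and wiring-plasticity step."""
    global mask, w
    post_rates = w @ pre_rates                        # linear rate model
    # Hebbian weight plasticity on existing synapses only, with saturation.
    w += ETA * np.outer(post_rates, pre_rates) * mask
    np.clip(w, 0.0, W_MAX, out=w)
    # Wiring plasticity: eliminate synapses that stayed weak ...
    weak = mask & (w < PRUNE_THRESH)
    mask &= ~weak
    w[weak] = 0.0
    # ... and create new spines at random; nascent spines start below the
    # pruning threshold and survive only if Hebbian co-activity boosts them.
    new = (~mask) & (rng.random(mask.shape) < CREATE_PROB)
    mask |= new
    w[new] = PRUNE_THRESH / 2
    return post_rates

for _ in range(100):
    step(rng.random(N_PRE))
```

Over many steps, connectivity is continually rewired while weights remain bounded, so the surviving structure reflects correlated activity rather than the random initial wiring.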
Retrieval Properties of Hopfield and Correlated Attractors in an Associative Memory Model
We examine a previously introduced attractor neural network model that explains the persistent activities of neurons in the anterior ventral temporal cortex of the brain. In this model, the coexistence of several attractors, including correlated attractors, was reported in the cases of finite and infinite loading. In this paper, by means of a statistical mechanical method, we study the statics and dynamics of the model in both finite and extensive loading, mainly focusing on the retrieval properties of the Hopfield and correlated attractors. In the extensive loading case, we derive the evolution equations by the dynamical replica theory. We found several characteristic temporal behaviours in both the finite and extensive loading cases. The theoretical results were confirmed by numerical simulations.
Comparative Land Policy: The Role of the State and the Market in Determining the Use and Price of Land
A retrotransposon-inserted VvmybA1a allele has been spread among cultivars of Vitis vinifera but not North American or East Asian Vitis species
Research Note
Detailed dendritic excitatory/inhibitory balance through heterosynaptic spike-timing-dependent plasticity
The balance between excitatory and inhibitory inputs is a key feature of cortical dynamics. Such a balance is arguably preserved in dendritic branches, yet its underlying mechanism and functional roles remain unknown. In this study, we developed computational models of heterosynaptic spike-timing-dependent plasticity (STDP) to show that the excitatory/inhibitory balance in dendritic branches is robustly achieved through heterosynaptic interactions between excitatory and inhibitory synapses. The model reproduces key features of experimentally observed heterosynaptic STDP and provides analytical insights. Furthermore, heterosynaptic STDP explains how the maturation of inhibitory neurons modulates the selectivity of excitatory neurons for binocular matching during critical-period plasticity. The model also provides an alternative explanation for the potential mechanism underlying the somatic detailed balance that is commonly associated with inhibitory STDP. Our results suggest heterosynaptic STDP as a critical factor in synaptic organization and the resultant dendritic computation.
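The branch-level balance described above can be caricatured in a few lines. This is not the paper's heterosynaptic STDP model; it is a rate-based sketch, under the assumption that inhibition on each branch is potentiated in proportion to the local excitatory-inhibitory mismatch, which drives every branch (not just the soma) toward detailed balance.

```python
import numpy as np

rng = np.random.default_rng(1)

N_BRANCHES = 8
ETA = 0.1  # inhibitory learning rate (assumed)

# Fixed excitatory drive per dendritic branch; inhibition starts at zero,
# i.e. the branches are initially unbalanced.
exc = rng.uniform(1.0, 5.0, N_BRANCHES)
inh = np.zeros(N_BRANCHES)

# Illustrative heterosynaptic-style rule: inhibitory weight on each branch
# moves toward the local excitatory drive on that same branch.
for _ in range(200):
    inh += ETA * (exc - inh)

# After learning, each branch is individually balanced, not just their sum.
mismatch = np.abs(exc - inh).max()
```

Because each branch converges independently, the somatic (summed) balance follows as a corollary of the dendritic one, which is the alternative explanation the abstract alludes to.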
Interactive reservoir computing for chunking information streams
Chunking is the process by which frequently repeated segments of temporal inputs are concatenated into single units that are easy to process. Such a process is fundamental to time-series analysis in biological and artificial information processing systems. The brain efficiently acquires chunks from various information streams in an unsupervised manner; however, the underlying mechanisms of this process remain elusive. A widely adopted statistical method for chunking consists of predicting frequently repeated contiguous elements in an input sequence based on unequal transition probabilities over sequence elements. However, recent experimental findings suggest that the brain is unlikely to adopt this method, as human subjects can chunk sequences with uniform transition probabilities. In this study, we propose a novel conceptual framework to overcome this limitation. In this framework, neural networks learn to predict dynamical response patterns to sequence input rather than to directly learn transition patterns. Using a mutually supervising pair of reservoir computing modules, we demonstrate how this mechanism works in chunking sequences of letters or visual images with variable regularity and complexity. In addition, we demonstrate that background noise plays a crucial role in correctly learning chunks in this model. In particular, the model can successfully chunk sequences that conventional statistical approaches fail to chunk due to uniform transition probabilities. Finally, the neural responses of the model exhibit an interesting similarity to those of the basal ganglia observed after motor habit formation.
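The core idea of predicting dynamical response patterns rather than raw transitions can be illustrated with a single echo-state reservoir. This is a deliberately simplified sketch, not the paper's mutually supervising pair of modules: a random recurrent network responds to a repeating symbol stream, and a linear readout is trained on the reservoir states to predict the next symbol. The reservoir size, spectral radius, and ridge penalty are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

N_RES = 100            # reservoir size (assumed)
SPECTRAL_RADIUS = 0.9  # scaling for the echo-state property (assumed)

# Random recurrent and input weights.
W = rng.standard_normal((N_RES, N_RES))
W *= SPECTRAL_RADIUS / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.standard_normal((N_RES, 4))

# One-hot sequence 'abcd' repeated 25 times: within the stream, simple
# transition counting sees nothing but a deterministic cycle.
seq = np.tile(np.eye(4), (25, 1))

states = []
x = np.zeros(N_RES)
for u in seq:
    x = np.tanh(W @ x + W_in @ u)  # reservoir response to the stream
    states.append(x.copy())
states = np.asarray(states)

# Linear readout trained by ridge regression on the *reservoir states*
# to predict the next input symbol.
X, Y = states[:-1], seq[1:]
W_out = np.linalg.solve(X.T @ X + 1e-3 * np.eye(N_RES), X.T @ Y)
pred = (X @ W_out).argmax(axis=1)
acc = (pred == Y.argmax(axis=1)).mean()
```

Because the readout operates on the reservoir's dynamical responses, which carry temporal context, prediction succeeds even when the symbol-level transition statistics alone would be uninformative; the paper's pairing of two such modules, mutually supervising each other, builds on the same principle.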
Degree of locking to network activity of neurons with similar movement tuning in the motor cortex of awake, behaving rats differs by layer