
    Hebbian Wiring Plasticity Generates Efficient Network Structures for Robust Inference with Synaptic Weight Plasticity

    In the adult mammalian cortex, a small fraction of spines are created and eliminated every day, and the resultant synaptic connection structure is highly nonrandom, even in local circuits. However, it remains unknown whether a particular synaptic connection structure is functionally advantageous in local circuits, and why creation and elimination of synaptic connections are necessary in addition to rich synaptic weight plasticity. To answer these questions, we studied an inference task model through theoretical and numerical analyses. We demonstrate that a robustly beneficial network structure naturally emerges by combining Hebbian-type synaptic weight plasticity and wiring plasticity. Especially in a sparsely connected network, wiring plasticity achieves reliable computation by enabling efficient information transmission. Furthermore, the proposed rule reproduces the experimentally observed correlation between spine dynamics and task performance.
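As a rough illustration of how Hebbian weight plasticity and wiring plasticity can act together, the sketch below combines a Hebbian update on existing synapses with a prune-and-regrow rule on the connectivity mask. This is a toy, not the paper's model; the connection density, learning rate, elimination threshold, and normalization are all arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post = 50, 10
lr, w_min = 0.05, 0.01            # learning rate and elimination threshold (assumed)

# Sparse binary connectivity mask (the wiring) and synaptic weights
conn = rng.random((n_post, n_pre)) < 0.2
w = rng.random((n_post, n_pre)) * conn

for _ in range(200):
    x = rng.random(n_pre)                      # presynaptic activity
    y = (w * conn) @ x                         # linear postsynaptic response
    # Hebbian weight plasticity on existing synapses, with row normalization
    w += lr * np.outer(y, x) * conn
    w /= np.maximum(np.linalg.norm(w, axis=1, keepdims=True), 1e-12)
    # Wiring plasticity: eliminate weak synapses, create the same number elsewhere
    weak = conn & (w < w_min)
    conn &= ~weak
    w[weak] = 0.0
    n_new = int(weak.sum())
    if n_new:
        free = np.argwhere(~conn)
        picks = free[rng.choice(len(free), size=n_new, replace=False)]
        conn[picks[:, 0], picks[:, 1]] = True
        w[picks[:, 0], picks[:, 1]] = w_min    # new spines start weak
```

The total synapse count stays fixed while the network searches over which connections to keep, which is the qualitative point of combining the two forms of plasticity.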

    Retrieval Properties of Hopfield and Correlated Attractors in an Associative Memory Model

    Full text link
    We examine a previously introduced attractor neural network model that explains the persistent activities of neurons in the anterior ventral temporal cortex of the brain. In this model, the coexistence of several attractors, including correlated attractors, was reported in the cases of finite and infinite loading. In this paper, by means of a statistical mechanical method, we study the statics and dynamics of the model in both finite and extensive loading, mainly focusing on the retrieval properties of the Hopfield and correlated attractors. In the extensive loading case, we derive the evolution equations by the dynamical replica theory. We found several characteristic temporal behaviours in both the finite and extensive loading cases. The theoretical results were confirmed by numerical simulations.
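For readers unfamiliar with the setup, a standard finite-loading Hopfield retrieval experiment (the baseline that the paper extends with correlated attractors) can be sketched as follows; the network size, pattern count, and flip probability are illustrative values only.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 5                        # neurons and stored patterns (finite loading)
xi = rng.choice([-1, 1], size=(P, N))
J = (xi.T @ xi) / N                  # Hebbian couplings
np.fill_diagonal(J, 0.0)             # no self-coupling

# Start near pattern 0 (about 15% of spins flipped) and update synchronously
s = xi[0] * rng.choice([1, -1], size=N, p=[0.85, 0.15])
for _ in range(20):
    s = np.sign(J @ s)
    s[s == 0] = 1                    # break ties deterministically

m = (s @ xi[0]) / N                  # overlap with the stored pattern
```

At this low loading (P/N = 0.025) the dynamics fall into the retrieval attractor and the overlap m approaches 1; the paper's question is what happens when correlated attractors coexist with such Hopfield attractors.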

    The Impact of the State and the Market on Japan's Land Problems


    Detailed dendritic excitatory/inhibitory balance through heterosynaptic spike-timing-dependent plasticity

    The balance between excitatory and inhibitory inputs is a key feature of cortical dynamics. Such a balance is arguably preserved in dendritic branches, yet its underlying mechanism and functional roles remain unknown. In this study, we developed computational models of heterosynaptic spike-timing-dependent plasticity (STDP) to show that the excitatory/inhibitory balance in dendritic branches is robustly achieved through heterosynaptic interactions between excitatory and inhibitory synapses. The model reproduces key features of experimentally observed heterosynaptic STDP well and provides analytical insights. Furthermore, heterosynaptic STDP explains how the maturation of inhibitory neurons modulates the selectivity of excitatory neurons for binocular matching during critical-period plasticity. The model also provides an alternative explanation for the potential mechanism underlying the somatic detailed balance that is commonly associated with inhibitory STDP. Our results suggest that heterosynaptic STDP is a critical factor in synaptic organization and the resultant dendritic computation.
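A deliberately crude caricature of the idea: one excitatory and one inhibitory synapse share a branch, and a heterosynaptic term pulls the inhibitory weight toward the excitatory one whenever the branch fires. The spike model, rates, and the specific rule below are invented for illustration and are not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(2)
steps, eta = 5000, 0.01              # assumed trial count and learning rate
w_e, w_i = 1.0, 0.1                  # excitatory / inhibitory weights on one branch

for _ in range(steps):
    pre_e = rng.random() < 0.05      # Bernoulli approximation of presynaptic spikes
    pre_i = rng.random() < 0.05
    drive = w_e * pre_e - w_i * pre_i
    if drive > 0.5:                  # crude proxy for a postsynaptic/dendritic spike
        # Heterosynaptic term: inhibition is pulled toward the excitatory weight,
        # driving the branch toward a detailed E/I balance
        w_i += eta * (w_e - w_i)

balance = w_i / w_e                  # approaches 1 as the branch balances
```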

    Redundancy in synaptic connections enables neurons to learn optimally

    Recent experimental studies suggest that, in cortical microcircuits of the mammalian brain, the majority of neuron-to-neuron connections are realized by multiple synapses. However, it is not known whether such redundant synaptic connections provide any functional benefit. Here, we show that redundant synaptic connections enable near-optimal learning in cooperation with synaptic rewiring. By constructing a simple dendritic neuron model, we demonstrate that with multisynaptic connections, synaptic plasticity approximates a sample-based Bayesian filtering algorithm known as particle filtering, and wiring plasticity implements its resampling process. Extending the proposed framework to a detailed single-neuron model of perceptual learning in the primary visual cortex, we show that the model accounts for many experimental observations. In particular, the proposed model reproduces the dendritic position dependence of spike-timing-dependent plasticity and the functional synaptic organization on the dendritic tree based on the stimulus selectivity of presynaptic neurons. Our study provides a conceptual framework for synaptic plasticity and rewiring.
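The correspondence between multisynaptic connections and particle filtering can be sketched as follows: each of the K synapses of one connection carries a weight hypothesis (a particle), resampling plays the role of rewiring, and jitter plays the role of ordinary weight plasticity. The Gaussian likelihood, noise scales, and teaching signal are assumptions for illustration, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(3)
K = 10                               # synapses per connection, acting as particles
w_true = 0.7                         # hidden weight the neuron should learn (assumed)
particles = rng.random(K)            # each synapse carries one weight hypothesis

for _ in range(100):
    obs = w_true + 0.1 * rng.standard_normal()        # noisy teaching signal
    # Importance weights: Gaussian likelihood of the observation per particle
    lik = np.exp(-0.5 * ((obs - particles) / 0.1) ** 2)
    lik /= lik.sum()
    # Resampling step = rewiring: unlikely spines are replaced by copies of
    # likely ones, then jittered (the analogue of weight plasticity)
    particles = particles[rng.choice(K, size=K, p=lik)]
    particles += 0.02 * rng.standard_normal(K)

estimate = particles.mean()          # effective weight of the whole connection
```

The point of the caricature is that no single synapse needs to hold the "right" weight; the population of redundant synapses tracks a posterior over it.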

    Interactive reservoir computing for chunking information streams

    Chunking is the process by which frequently repeated segments of temporal inputs are concatenated into single units that are easy to process. Such a process is fundamental to time-series analysis in biological and artificial information processing systems. The brain efficiently acquires chunks from various information streams in an unsupervised manner; however, the underlying mechanisms of this process remain elusive. A widely adopted statistical method for chunking consists of predicting frequently repeated contiguous elements in an input sequence based on unequal transition probabilities over sequence elements. However, recent experimental findings suggest that the brain is unlikely to adopt this method, as human subjects can chunk sequences with uniform transition probabilities. In this study, we propose a novel conceptual framework to overcome this limitation, in which neural networks learn to predict dynamical response patterns to sequence input rather than directly learning transition patterns. Using a mutually supervising pair of reservoir computing modules, we demonstrate how this mechanism works in chunking sequences of letters or visual images with variable regularity and complexity. We also demonstrate that background noise plays a crucial role in correctly learning chunks in this model. In particular, the model can successfully chunk sequences that conventional statistical approaches fail to chunk due to uniform transition probabilities. Finally, the neural responses of the model exhibit an interesting similarity to those of the basal ganglia observed after motor habit formation.
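As background, the basic reservoir computing ingredient the paper builds on is an echo state network: a fixed random recurrent network driven by the input, with only a linear readout trained. The sketch below is a single reservoir predicting the next symbol of a repeating stream, not the paper's mutually supervising pair; all sizes and parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)
n_in, n_res = 4, 100
W_in = 0.5 * rng.standard_normal((n_res, n_in))
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo-state property)

def run(inputs):
    """Drive the reservoir with a sequence and collect its state trajectory."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ u)
        states.append(x.copy())
    return np.array(states)

# A repeating "chunk" ABCD... encoded as a one-hot symbol stream
seq = np.eye(n_in)[np.tile(np.arange(n_in), 125)]
S = run(seq[:-1])
Y = seq[1:]
# Linear readout trained by ridge regression to predict the next symbol
W_out = np.linalg.solve(S.T @ S + 1e-2 * np.eye(n_res), S.T @ Y)
pred = (S @ W_out).argmax(axis=1)
acc = (pred == Y.argmax(axis=1)).mean()
```

The recurrent weights are never trained; all learning happens in the readout, which is what makes pairing and mutually supervising two such modules, as the paper does, computationally cheap.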

    Symmetric sequence processing in a recurrent neural network model with a synchronous dynamics

    The synchronous dynamics and the stationary states of a recurrent attractor neural network model with competing synapses between symmetric sequence processing and Hebbian pattern reconstruction are studied in this work, allowing for the presence of a self-interaction for each unit. Phase diagrams of stationary states are obtained exhibiting phases of retrieval, symmetric and period-two cyclic states, as well as correlated and frozen-in states, in the absence of noise. The frozen-in states are destabilised by synaptic noise, and well separated regions of correlated and cyclic states are obtained. Excitatory or inhibitory self-interactions yield enlarged phases of fixed-point or cyclic behaviour. Accepted for publication in Journal of Physics A: Mathematical and Theoretical.
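A bare-bones numerical version of synchronous dynamics under symmetric sequence couplings with a diagonal self-interaction can be sketched as follows; the pattern count, network size, and self-interaction strength are arbitrary choices, and the sketch omits the competing Hebbian term and the noise analysis that the paper treats analytically.

```python
import numpy as np

rng = np.random.default_rng(5)
N, P = 300, 4
xi = rng.choice([-1, 1], size=(P, N))
# Symmetric sequence couplings (forward plus backward terms), and a
# self-interaction of strength d on the diagonal
J = sum(np.outer(xi[(m + 1) % P], xi[m]) + np.outer(xi[m], xi[(m + 1) % P])
        for m in range(P)) / N
d = 0.5
J += d * np.eye(N)

s = xi[0].copy()
traj = []
for _ in range(10):
    s = np.sign(J @ s)               # synchronous (parallel) update
    s[s == 0] = 1
    traj.append([(s @ xi[m]) / N for m in range(P)])

traj = np.array(traj)                # overlaps with each pattern over time
```

Inspecting the rows of traj shows which regime the dynamics fall into (fixed point, period-two cycle, or a mixed/correlated state), which is the kind of behaviour the paper's phase diagrams classify analytically.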