
    Modeling Maintenance of Long-Term Potentiation in Clustered Synapses, Long-Term Memory Without Bistability

    Memories are stored, at least partly, as patterns of strong synapses. Given molecular turnover, how can synapses remain strong for the years that memories can persist? Some models postulate that biochemical bistability maintains strong synapses. However, bistability should give a bimodal distribution of synaptic strength or weight, whereas current data show unimodal distributions both for weights and for a correlated variable, dendritic spine volume. Bistability of single synapses has also never been empirically demonstrated. It is therefore important for models to reproduce both unimodal distributions and long-term memory persistence. Here a model is developed that connects ongoing, competing processes of synaptic growth and weakening to stochastic processes of receptor insertion and removal in dendritic spines. The model simulates long-term (in excess of 1 yr) persistence of groups of strong synapses and yields a unimodal weight distribution. For stability of this distribution it proved essential to incorporate resource competition between synapses organized into small clusters. With competition, these clusters are stable for years. These simulations concur with recent data to support the clustered plasticity hypothesis, which suggests that clusters, rather than single synaptic contacts, may be a fundamental unit for the storage of long-term memory. The model makes empirical predictions and may provide a framework to investigate the mechanisms that maintain the balance between synaptic plasticity and stability of memory. Comment: 17 pages, 5 figures
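The growth/weakening dynamics described above can be caricatured in a few lines of simulation: receptor counts in a small cluster of spines fluctuate through stochastic insertion and removal, with insertion drawing on a shared, limited receptor pool. All rates, pool sizes and the insertion rule below are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stochastic receptor turnover in one cluster of spines (sketch).
N_SPINES = 5
POOL = 100        # shared receptor resource available to the cluster
K_OUT = 0.05      # removal probability per receptor per step

w = rng.integers(10, 30, size=N_SPINES).astype(float)  # receptor counts
for _ in range(10_000):
    free = max(POOL - w.sum(), 0.0)                # competition for the pool
    ins = rng.poisson(0.01 * free * w / w.sum())   # insertion scales with spine size
    rem = rng.binomial(w.astype(int), K_OUT)       # stochastic removal
    w = np.maximum(w + ins - rem, 1.0)             # spines never fully vanish here

print("steady-state receptor counts:", w.astype(int))
```

Because total insertion is throttled by the shared pool, the cluster settles into a bounded total receptor count while individual spines keep fluctuating, which is the qualitative competition effect the abstract describes.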

    Adenosine A1 receptor activation mediates the developmental shift at layer 5 pyramidal cell synapses and is a determinant of mature synaptic strength

    During the first postnatal month glutamatergic synapses between layer 5 pyramidal cells in the rodent neocortex switch from an immature state exhibiting high probability of neurotransmitter release, large unitary amplitude and synaptic depression to a mature state with decreased probability of release, smaller unitary amplitude and synaptic facilitation. Using paired recordings, we demonstrate that the developmental shift in release probability at synapses between rat somatosensory layer 5 thick-tufted pyramidal cells is due to a higher and more heterogeneous activation of presynaptic adenosine A1 receptors. Immature synapses under control conditions exhibited distributions of CV, failure rate and release probability that were almost coincident with the A1 receptor blocked condition; however, mature synapses under control conditions exhibited much broader distributions that spanned those of both the A1 receptor agonised and antagonised conditions. Immature and mature synapses expressed A1 receptors with no observable difference in functional efficacy, and therefore the heterogeneous A1 receptor activation seen in the mature neocortex is due to increased adenosine concentrations that vary between synapses. Given the central role demonstrated for A1 receptor activation in determining synaptic amplitude and the statistics of transmission between mature layer 5 pyramidal cells, the emplacement of adenosine sources and sinks near the synaptic terminal could constitute a novel form of long-term synaptic plasticity.

    Storage Capacity of Extremely Diluted Hopfield Model

    The storage capacity of the extremely diluted Hopfield model is studied using Monte Carlo techniques. In this work, instead of diluting the synapses according to a given distribution, the dilution is obtained systematically by retaining only the synapses with dominant contributions. It is observed that with this prescribed dilution method the critical storage capacity of the system increases with decreasing number of synapses per neuron, almost reaching the value obtained from mean-field calculations. It is also shown that the increase in the storage capacity of the diluted system depends on the storage capacity of the fully connected Hopfield model and the fraction of diluted synapses. Comment: LaTeX, 14 pages, 4 EPS figures
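The "retain only dominant contributions" prescription can be sketched as follows: build Hebbian couplings, keep only the largest-|J_ij| synapses per neuron, then check recall with zero-temperature asynchronous dynamics. Network sizes, the retention fraction and the corruption level are illustrative choices, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 300, 10       # neurons, stored patterns (illustrative sizes)
keep_frac = 0.3      # fraction of synapses retained per neuron

# Hebbian couplings for P random binary patterns
patterns = rng.choice([-1, 1], size=(P, N))
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

# Systematic dilution: per neuron, keep only the synapses with the
# largest |J_ij| ("dominant contributions"); zero the rest
k = int(keep_frac * N)
mask = np.zeros_like(J, dtype=bool)
for i in range(N):
    mask[i, np.argsort(np.abs(J[i]))[-k:]] = True
J_diluted = np.where(mask, J, 0.0)

# Zero-temperature asynchronous dynamics from a corrupted pattern
state = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
state[flip] *= -1                       # corrupt 10% of the bits
for _ in range(5):                      # a few Monte Carlo sweeps
    for i in rng.permutation(N):
        state[i] = 1 if J_diluted[i] @ state >= 0 else -1

overlap = abs(state @ patterns[0]) / N  # recall quality in [0, 1]
print(f"overlap with stored pattern: {overlap:.2f}")
```

At this light loading the diluted network should still retrieve the stored pattern with overlap near 1; pushing P upward until retrieval fails is one way to probe the critical capacity the abstract studies.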

    Rapid, learning-induced inhibitory synaptogenesis in murine barrel field

    The structure of neurons changes during development and in response to injury or alteration in sensory experience. Changes occur in the number, shape, and dimensions of dendritic spines together with their synapses. However, precise data on these changes in response to learning are sparse. Here, we show using quantitative transmission electron microscopy that a simple form of learning involving mystacial vibrissae results in an approximately 70% increase in the density of inhibitory synapses on spines of neurons located in layer IV barrels that represent the stimulated vibrissae. The spines contain one asymmetrical (excitatory) and one symmetrical (inhibitory) synapse (double-synapse spines), and their density increases threefold as a result of learning, with no apparent change in the density of asymmetrical synapses. This effect seems to be specific for learning because pseudoconditioning (in which the conditioned and unconditioned stimuli are delivered at random) does not lead to the enhancement of symmetrical synapses but instead results in an upregulation of asymmetrical synapses on spines. Symmetrical synapses of cells located in barrels receiving the conditioned stimulus also show a greater concentration of GABA in their presynaptic terminals. These results indicate that the immediate effect of classical conditioning in the "conditioned" barrels is rapid, pronounced, and inhibitory.

    A CMOS Spiking Neuron for Brain-Inspired Neural Networks with Resistive Synapses and In-Situ Learning

    Nanoscale resistive memories are expected to fuel dense integration of electronic synapses for large-scale neuromorphic systems. To realize such a brain-inspired computing chip, a compact CMOS spiking neuron that performs in-situ learning and computing while driving a large number of resistive synapses is desired. This work presents a novel leaky integrate-and-fire neuron design which implements the dual-mode operation of current integration and synaptic drive with a single opamp and enables in-situ learning with crossbar resistive synapses. The proposed design was implemented in a 0.18 μm CMOS technology. Measurements show the neuron's ability to drive a thousand resistive synapses and demonstrate in-situ associative learning. The neuron circuit occupies a small area of 0.01 mm² and has an energy efficiency of 9.3 pJ/spike/synapse.
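The leaky integrate-and-fire behavior the circuit implements can be sketched in software: the membrane potential leaks toward rest, integrates input current, and emits a spike with a reset when it crosses threshold. Time constants, threshold and drive below are illustrative, not taken from the 0.18 μm chip.

```python
import numpy as np

# Discrete-time leaky integrate-and-fire neuron (sketch).
DT = 0.1               # time step, ms
TAU_M = 10.0           # membrane time constant, ms
V_TH, V_RESET = 1.0, 0.0

def lif_run(input_current, v0=0.0):
    """Integrate an input-current trace; return spike times (in steps)."""
    v, spikes = v0, []
    for t, i_in in enumerate(input_current):
        v += DT / TAU_M * (-v + i_in)   # leaky integration toward the input
        if v >= V_TH:                   # threshold crossing -> spike
            spikes.append(t)
            v = V_RESET                 # reset after firing
    return spikes

spikes = lif_run(np.full(1000, 1.5))    # 100 ms of constant supra-threshold drive
print(f"{len(spikes)} spikes in 100 ms")
```

With a steady drive above threshold the model fires regularly; the hardware version described above additionally time-multiplexes this integration with driving the crossbar synapses through one opamp.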

    Spatial representation of temporal information through spike timing dependent plasticity

    We suggest a mechanism based on spike-timing-dependent plasticity (STDP) of synapses to store, retrieve and predict temporal sequences. The mechanism is demonstrated in a model system of simplified integrate-and-fire neurons densely connected by STDP synapses. All synapses are modified according to the so-called normal STDP rule observed in various real biological synapses. After conditioning through repeated input of a limited number of temporal sequences, the system is able to complete a temporal sequence upon receiving only a fraction of it as input. This is an example of effective unsupervised learning in a biologically realistic system. We investigate the dependence of learning success on entrainment time, system size and the presence of noise. Possible applications include the learning of motor sequences, the recognition and prediction of temporal sensory information in the visual and auditory systems, and late processing in the olfactory system of insects. Comment: 13 pages, 14 figures, completely revised and augmented version
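The "normal" pair-based STDP rule referred to above potentiates a synapse when the presynaptic spike precedes the postsynaptic one and depresses it otherwise, with exponentially decaying windows. A minimal sketch (amplitudes and time constants are illustrative, not the paper's values):

```python
import numpy as np

# Pair-based STDP kernel: LTP for pre-before-post, LTD for post-before-pre.
A_PLUS, A_MINUS = 0.01, 0.012     # illustrative amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # decay time constants, ms

def stdp_dw(dt):
    """Weight change for post-minus-pre spike-time difference dt (ms)."""
    if dt > 0:    # pre before post -> long-term potentiation
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    else:         # post before pre -> long-term depression
        return -A_MINUS * np.exp(dt / TAU_MINUS)

# A synapse repeatedly driven with pre->post pairings strengthens toward
# its upper bound, encoding the temporal order of the sequence.
w = 0.5
for _ in range(100):
    w = np.clip(w + stdp_dw(10.0), 0.0, 1.0)  # pre leads post by 10 ms
print(f"weight after repeated pre->post pairing: {w:.3f}")
```

Applied across a densely connected network, this asymmetry is what lets the synapses along a rehearsed sequence strengthen so that a partial input replays the rest.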

    Storage Capacity Diverges with Synaptic Efficiency in an Associative Memory Model with Synaptic Delay and Pruning

    It is known that the storage capacity per synapse increases with synaptic pruning in a correlation-type associative memory model. However, the storage capacity of the entire network then decreases. To overcome this difficulty, we propose decreasing the connecting rate while keeping the total number of synapses constant by introducing delayed synapses. In this paper, a discrete synchronous-type model with both delayed synapses and their pruning is discussed as a concrete example of this proposal. First, we explain the Yanai-Kim theory by employing statistical neurodynamics; this theory provides macrodynamical equations for the dynamics of a network with serial delay elements. Next, exploiting the translational symmetry of these equations, we re-derive the macroscopic steady-state equations of the model by using the discrete Fourier transformation. The storage capacities are analyzed quantitatively. Furthermore, two types of synaptic pruning are treated analytically: random pruning and systematic pruning. It becomes clear that for both types of pruning, the storage capacity increases as the length of delay increases and the connecting rate of the synapses decreases when the total number of synapses is constant. Moreover, an interesting fact emerges: with random pruning, the storage capacity asymptotically approaches 2/π, whereas with systematic pruning it diverges in proportion to the logarithm of the length of delay, with proportionality constant 4/π. These results theoretically support the significance of pruning following an overgrowth of synapses in the brain and strongly suggest that the brain prefers to store dynamic attractors such as sequences and limit cycles rather than equilibrium states. Comment: 27 pages, 14 figures
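The contrast between the two pruning schemes can be illustrated on a plain (non-delayed) Hebbian network: random pruning keeps a uniformly random subset of couplings per neuron, while systematic pruning keeps the strongest |J_ij|; both leave the same number of synapses, and pattern stability can be compared with one synchronous update. Sizes and the retention fraction are illustrative assumptions, and the delayed-synapse dynamics of the paper are not modeled here.

```python
import numpy as np

rng = np.random.default_rng(2)

N, P, keep = 400, 20, 0.25      # neurons, patterns, retained fraction
xi = rng.choice([-1, 1], size=(P, N))
J = xi.T @ xi / N               # Hebbian couplings
np.fill_diagonal(J, 0.0)
k = int(keep * N)               # synapses kept per neuron

# Systematic pruning: keep the k strongest |J_ij| in each row
sys_mask = np.zeros_like(J, dtype=bool)
rows = np.arange(N)[:, None]
sys_mask[rows, np.argsort(np.abs(J), axis=1)[:, -k:]] = True

# Random pruning: keep k couplings per row uniformly at random
rand_mask = np.zeros_like(J, dtype=bool)
for i in range(N):
    rand_mask[i, rng.choice(N, size=k, replace=False)] = True

def overlap(mask):
    """One synchronous update from a stored pattern; signal retained."""
    s = np.sign(np.where(mask, J, 0.0) @ xi[0])
    s[s == 0] = 1
    return (s @ xi[0]) / N

print(f"systematic: {overlap(sys_mask):.2f}, random: {overlap(rand_mask):.2f}")
```

Keeping the dominant couplings preserves more of the signal per retained synapse, which is the intuition behind systematic pruning's logarithmically diverging capacity versus the bounded 2/π result for random pruning.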