
    Network self-organization explains the statistics and dynamics of synaptic connection strengths in cortex

    The information processing abilities of neural circuits arise from their synaptic connection patterns. Understanding the laws governing these connectivity patterns is essential for understanding brain function. The overall distribution of synaptic strengths of local excitatory connections in cortex and hippocampus is long-tailed, exhibiting a small number of synaptic connections of very large efficacy. At the same time, new synaptic connections are constantly being created and individual synaptic connection strengths show substantial fluctuations across time. It remains unclear through what mechanisms these properties of neural circuits arise and how they contribute to learning and memory. In this study we show that fundamental characteristics of excitatory synaptic connections in cortex and hippocampus can be explained as a consequence of self-organization in a recurrent network combining spike-timing-dependent plasticity (STDP), structural plasticity and different forms of homeostatic plasticity. In the network, associative synaptic plasticity in the form of STDP induces a rich-get-richer dynamics among synapses, while homeostatic mechanisms induce competition. Under distinctly different initial conditions, the ensuing self-organization produces long-tailed synaptic strength distributions matching experimental findings. We show that this self-organization can take place with a purely additive STDP mechanism and that multiplicative weight dynamics emerge as a consequence of network interactions. The observed patterns of fluctuation of synaptic strengths, including elimination and generation of synaptic connections and long-term persistence of strong connections, are consistent with the dynamics of dendritic spines found in rat hippocampus. Beyond this, the model predicts an approximately power-law scaling of the lifetimes of newly established synaptic connection strengths during development. 
Our results suggest that the combined action of multiple forms of neuronal plasticity plays an essential role in the formation and maintenance of cortical circuits.
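The interplay the abstract describes can be illustrated with a minimal toy simulation. This is not the paper's network model: as a stand-in for the network interactions that produce rich-get-richer dynamics, potentiation here simply targets synapses with probability proportional to their current weight, while the STDP step sizes themselves stay additive (weight-independent); all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200          # synapses onto one model neuron
W_TOTAL = 10.0   # homeostatic target for the summed input weight (assumed)
A_PLUS = 0.01    # additive potentiation step (assumed)
A_MINUS = 0.012  # additive depression step (assumed)

w = rng.uniform(0.01, 0.1, N)  # initial weights

for step in range(20000):
    # Rich-get-richer stand-in: stronger synapses drive the neuron more
    # reliably, so they are more likely to see a causal pre-before-post
    # pairing. The step size itself is purely additive.
    i = rng.choice(N, p=w / w.sum())
    w[i] += A_PLUS

    # Uncorrelated pairings depress a random synapse (clipped at zero).
    j = rng.integers(N)
    w[j] = max(w[j] - A_MINUS, 0.0)

    # Homeostatic synaptic scaling: multiplicative renormalization of the
    # summed weight, which makes synapses compete for a fixed budget.
    w *= W_TOTAL / w.sum()

# After self-organization the weights are long-tailed: a few synapses
# carry a large share of the total budget while most are weak or silent.
```

Despite the additive steps, the combination of weight-proportional potentiation and multiplicative homeostatic scaling concentrates the fixed weight budget in a small number of strong, persistent synapses, qualitatively matching the long-tailed distributions the abstract describes.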

    The E3 ubiquitin ligase IDOL regulates synaptic ApoER2 levels and is important for plasticity and learning.

    Neuronal ApoE receptors are linked to learning and memory, but the pathways governing their abundance and the mechanisms by which they affect the function of neural circuits are incompletely understood. Here we demonstrate that the E3 ubiquitin ligase IDOL determines synaptic ApoER2 protein levels in response to neuronal activation and regulates dendritic spine morphogenesis and plasticity. IDOL-dependent changes in ApoER2 abundance modulate dendritic filopodia initiation and synapse maturation. Loss of IDOL in neurons results in constitutive overexpression of ApoER2 and is associated with impaired activity-dependent structural remodeling of spines and defective LTP in primary neuron cultures and hippocampal slices. IDOL-deficient mice show profound impairment in experience-dependent reorganization of synaptic circuits in the barrel cortex, as well as diminished spatial and associative learning. These results identify control of lipoprotein receptor abundance by IDOL as a post-transcriptional mechanism underlying the structural and functional plasticity of synapses and neural circuits.

    The Autism Related Protein Contactin-Associated Protein-Like 2 (CNTNAP2) Stabilizes New Spines: An In Vivo Mouse Study.

    The establishment and maintenance of neuronal circuits depend on tight regulation of synaptic contacts. We hypothesized that CNTNAP2, a protein associated with autism, would play a key role in this process. Indeed, we found that new dendritic spines in mice lacking CNTNAP2 were formed at normal rates but failed to stabilize. Notably, rates of spine elimination were unaltered, suggesting a specific role for CNTNAP2 in stabilizing new synaptic circuitry.

    A neuromorphic systems approach to in-memory computing with non-ideal memristive devices: From mitigation to exploitation

    Memristive devices are a promising technology for building neuromorphic electronic systems. In addition to their compactness and non-volatility, they exhibit computationally relevant physical properties, such as state dependence, non-linear conductance changes, and intrinsic variability in both their switching threshold and conductance values, that make them well suited to emulating the biophysics of real synapses. In this paper we present a spiking neural network architecture that supports the use of memristive devices as synaptic elements, and propose mixed-signal analog-digital interfacing circuits that mitigate the effect of variability in their conductance values and exploit the variability in their switching threshold to implement stochastic learning. Device variability is mitigated by using pairs of memristive devices configured in a complementary push-pull arrangement and interfaced to a current-mode normalizer circuit. The stochastic learning mechanism is obtained by mapping the desired change in synaptic weight onto a corresponding switching probability derived from the intrinsically stochastic behavior of the memristive devices. We demonstrate the features of the CMOS circuits and apply the proposed architecture to a standard hand-written digit classification benchmark based on the MNIST dataset. We evaluate the performance of the approach on this benchmark using behavioral-level spiking neural network simulation, showing both the reduction in conductance variability produced by the current-mode normalizer circuit and the increase in performance as a function of the number of memristive devices used per synapse.
    Comment: 13 pages, 12 figures, accepted for Faraday Discussions
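The two key ideas in this abstract, a push-pull pair of devices whose conductance difference forms the effective weight, and a desired analog weight change mapped onto a binary switching probability, can be sketched behaviorally. This is an illustrative model, not the paper's circuit: the binary conductance levels `G_ON`/`G_OFF`, the scaling constant `p_max`, and the function names are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed binary device conductances (arbitrary normalized units).
G_ON, G_OFF = 1.0, 0.1

def effective_weight(g_plus, g_minus):
    """Push-pull (differential) synapse: the effective weight is the
    difference of the two device conductances, so common-mode
    variability shared by the pair cancels out."""
    return g_plus - g_minus

def stochastic_update(g, delta_w, p_max=0.5):
    """Map a desired analog weight change onto a switching probability,
    exploiting the device's intrinsically stochastic threshold: the
    device switches fully ON (or OFF) with probability proportional to
    |delta_w|, so the *expected* weight change tracks delta_w even
    though each device transition is all-or-none."""
    p = min(abs(delta_w) * p_max, 1.0)
    if rng.random() < p:
        return G_ON if delta_w > 0 else G_OFF
    return g  # no switching event this time
```

Averaged over many presentations, the mean conductance change is graded even though each individual device update is binary, which is what lets an all-or-none switching device implement a gradual learning rule.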