9 research outputs found

    Homeostatic Plasticity Studied Using In Vivo Hippocampal Activity-Blockade: Synaptic Scaling, Intrinsic Plasticity and Age-Dependence

    Get PDF
    Homeostatic plasticity is thought to be important in preventing neuronal circuits from becoming hyper- or hypoactive. However, there is little information concerning homeostatic mechanisms following in vivo manipulations of activity levels. We investigated synaptic scaling and intrinsic plasticity in CA1 pyramidal cells following 2 days of activity-blockade in vivo in adult (postnatal day 30; P30) and juvenile (P15) rats. Chronic activity-blockade in vivo was achieved using the sustained release of the sodium channel blocker tetrodotoxin (TTX) from the plastic polymer Elvax 40W implanted directly above the hippocampus, followed by electrophysiological assessment in slices in vitro. Three sets of results were in general agreement with previous studies on homeostatic responses to in vitro manipulations of activity. First, Schaffer collateral stimulation-evoked field responses were enhanced after 2 days of in vivo TTX application. Second, miniature excitatory postsynaptic current (mEPSC) amplitudes were potentiated. However, the increase in mEPSC amplitudes occurred only in juveniles, and not in adults, indicating age-dependent effects. Third, intrinsic neuronal excitability increased. In contrast, three sets of results sharply differed from previous reports on homeostatic responses to in vitro manipulations of activity. First, miniature inhibitory postsynaptic current (mIPSC) amplitudes were invariably enhanced. Second, multiplicative scaling of mEPSC and mIPSC amplitudes was absent. Third, the frequencies of adult and juvenile mEPSCs and adult mIPSCs were increased, indicating presynaptic alterations. These results provide new insights into in vivo homeostatic plasticity mechanisms with relevance to memory storage, activity-dependent development and neurological diseases.

    A simple spontaneously active Hebbian learning model: homeostasis of activity and connectivity, and consequences for learning and epileptogenesis

    Full text link
    A spontaneously active neural system that is capable of continual learning should also be capable of homeostasis of both firing rate and connectivity. Experimental evidence suggests that both types of homeostasis exist, and that connectivity is maintained at a state that is optimal for information transmission and storage. This state is referred to as the critical state. We present a simple stochastic computational Hebbian learning model that incorporates both firing rate and critical homeostasis, and we explore its stability and connectivity properties. We also examine the behavior of our model with a simulated seizure and with simulated acute deafferentation. We argue that a neural system that is more highly connected than the critical state (i.e., one that is "supercritical") is epileptogenic. Based on our simulations, we predict that the post-seizural and post-deafferentation states should be supercritical and epileptogenic. Furthermore, interventions that boost spontaneous activity should be protective against epileptogenesis. (Comment: 37 pages, 1 table, 7 figures)
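    The abstract describes the model only verbally. As a rough, hedged illustration (not the authors' implementation), the sketch below combines Hebbian learning with two homeostatic mechanisms in a small stochastic rate network: threshold adjustment toward a target firing rate, and multiplicative weight scaling toward a branching ratio of one, a crude stand-in for the critical state. All parameter names and values are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                 # number of units (assumed)
TARGET_RATE = 0.1       # firing-rate homeostasis set point (assumed)
TARGET_BRANCHING = 1.0  # "critical" target: one spike triggers ~one spike (assumed)
ETA_HEBB, ETA_RATE, ETA_CONN = 0.01, 0.001, 0.001

W = rng.uniform(0.0, 0.1, size=(N, N))  # W[i, j]: synaptic weight from unit j to unit i
np.fill_diagonal(W, 0.0)
theta = np.full(N, 0.5)                 # firing thresholds
rate_est = np.full(N, TARGET_RATE)      # running estimate of each unit's firing rate
x = (rng.random(N) < TARGET_RATE).astype(float)  # spontaneous initial activity

for step in range(10_000):
    # Stochastic spiking: probability grows with weighted input above threshold.
    drive = W @ x - theta
    p_spike = 1.0 / (1.0 + np.exp(-10.0 * drive))
    x_new = (rng.random(N) < p_spike).astype(float)

    # Hebbian learning: strengthen synapses from active pre- to active post-units.
    W += ETA_HEBB * np.outer(x_new, x)
    np.fill_diagonal(W, 0.0)

    # Firing-rate homeostasis: nudge thresholds so rates drift toward the target.
    rate_est = 0.99 * rate_est + 0.01 * x_new
    theta += ETA_RATE * (rate_est - TARGET_RATE)

    # Connectivity homeostasis: rescale each unit's outgoing weights so the
    # estimated branching ratio (its total outgoing weight) stays near one.
    branching = W.sum(axis=0)
    W *= (1.0 - ETA_CONN * (branching - TARGET_BRANCHING))[None, :]
    W = np.clip(W, 0.0, None)

    x = x_new
```

    In this toy setting the two homeostatic terms counteract the runaway weight growth that pure Hebbian learning would otherwise produce, which is the stability property the abstract argues for.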

    Spatio-Temporally Efficient Coding Assigns Functions to Hierarchical Structures of the Visual System

    Get PDF
    Hierarchical structures constitute a wide array of brain areas, including the visual system. One of the important questions regarding visual hierarchical structures is to identify computational principles for assigning functions that represent the external world to hierarchical structures of the visual system. Given that visual hierarchical structures contain both bottom-up and top-down pathways, the derived principles should encompass these bidirectional pathways. However, existing principles such as predictive coding do not provide an effective principle for bidirectional pathways. Therefore, we propose a novel computational principle for visual hierarchical structures as spatio-temporally efficient coding underscored by the efficient use of given resources in both neural activity space and processing time. This coding principle optimises bidirectional information transmissions over hierarchical structures by simultaneously minimising temporal differences in neural responses and maximising entropy in neural representations. Simulations demonstrated that the proposed spatio-temporally efficient coding was able to assign the function of appropriate neural representations of natural visual scenes to visual hierarchical structures. Furthermore, spatio-temporally efficient coding was able to predict well-known phenomena, including deviations in neural responses to unlearned inputs and bias in preferred orientations. Our proposed spatio-temporally efficient coding may facilitate deeper mechanistic understanding of the computational processes of hierarchical brain structures.
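    The coding principle above is stated verbally as jointly minimising temporal differences in neural responses and maximising entropy in neural representations. The following is a minimal sketch of such an objective under stated assumptions (toy data, assumed trade-off weights alpha and beta); it is not the paper's implementation.

```python
import numpy as np

def spatiotemporal_efficiency_loss(responses, alpha=1.0, beta=1.0, eps=1e-12):
    """Toy objective in the spirit of spatio-temporally efficient coding.

    responses : array of shape (T, N), a layer's non-negative activity over
        T time steps. The two terms mirror the abstract's verbal description:
        penalise temporal differences in responses and reward entropy of the
        representation. alpha and beta are assumed trade-off weights.
    """
    # 1) Temporal-difference term: responses should change smoothly over time.
    temporal_diff = np.mean((responses[1:] - responses[:-1]) ** 2)

    # 2) Entropy term: treat the time-averaged, normalised activity as a
    #    distribution over units and compute its Shannon entropy.
    p = responses.mean(axis=0)
    p = p / (p.sum() + eps)
    entropy = -np.sum(p * np.log(p + eps))

    # Minimising the loss trades smoothness against representational spread.
    return alpha * temporal_diff - beta * entropy


# Smooth activity scores lower (better) than rapidly fluctuating activity
# with a similar average spread across units.
rng = np.random.default_rng(0)
smooth = 1.0 + np.cumsum(rng.normal(0.0, 0.01, size=(200, 50)), axis=0)
noisy = np.abs(rng.normal(0.0, 1.0, size=(200, 50)))
print(spatiotemporal_efficiency_loss(smooth), spatiotemporal_efficiency_loss(noisy))
```

    Design note: Shannon entropy over the time-averaged activity is only one of several possible stand-ins for "entropy in neural representations"; it is chosen here for brevity.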

    Spatio-temporally efficient coding: A computational principle of biological neural networks

    Get PDF
    Department of Biomedical Engineering (Human Factors Engineering)
    One of the major goals of neuroscience is to understand how the external world is represented in the brain. This is a neural coding problem: the coding from the external world to its neural representations. There are two kinds of problems in neural coding. The first is to study the types of neuronal activity that represent the external world; representative examples are rate coding and temporal coding. In this study, we present a spike distance method that reads temporal-coding-related information from neural data. The second is to study what principles make such neural representations possible; this is an approach to the computational principle and the main topic of the present study. The brain's sensory system has hierarchical structures, and it is important to find the principles that assign functions to these structures. The hierarchical structures of the sensory system contain both bottom-up and top-down pathways, and in this bidirectional hierarchical structure two types of neuronal noise arise. The first is noise generated as neural information fluctuates across the hierarchy depending on the initial condition of the neural response, even when the external sensory input is static. The second is noise, or more precisely error, caused by each level of the hierarchy coding different information because of transmission delays when the external sensory input is dynamic. Despite these noise problems, sensory information processing appears to proceed without major difficulty in the sensory system of the real brain, so a neural coding principle that can overcome them is needed. How can the brain overcome these noise problems? Efficient coding is one of the representative neural coding principles, but existing efficient coding does not take these noise problems into account. To address them, we devised spatio-temporally efficient coding, inspired by the efficient use of given space and time resources, which optimises bidirectional information transmission over hierarchical structures; the optimisation amounts to learning neural responses that are smooth over time. In simulations, we showed that spatio-temporally efficient coding was able to solve the two noise problems above. We expect that spatio-temporally efficient coding will help us understand how the brain computes.
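    The abstract mentions a spike distance method for reading temporal-coding-related information from neural data but does not specify it. Purely as a point of reference, and not necessarily the method used in this thesis, below is a minimal sketch of a standard van Rossum-style spike distance: each spike train is convolved with an exponential kernel and the distance is the L2 difference of the filtered traces. The time constant, time step and window length are assumed values.

```python
import numpy as np

def van_rossum_distance(spikes_a, spikes_b, tau=0.01, dt=0.001, t_max=1.0):
    """Van Rossum-style spike distance between two spike trains.

    spikes_a, spikes_b : spike times in seconds.
    tau   : time constant of the exponential kernel (assumed value).
    dt    : discretisation step (assumed value).
    t_max : duration of the recording window (assumed value).
    """
    t = np.arange(0.0, t_max, dt)

    def filtered(spike_times):
        # Convolve the spike train with a causal exponential kernel.
        trace = np.zeros_like(t)
        for s in spike_times:
            mask = t >= s
            trace[mask] += np.exp(-(t[mask] - s) / tau)
        return trace

    diff = filtered(spikes_a) - filtered(spikes_b)
    return np.sqrt(np.sum(diff ** 2) * dt / tau)

# Identical trains give distance 0; a small timing jitter gives a small distance.
print(van_rossum_distance([0.1, 0.3, 0.5], [0.1, 0.3, 0.5]))   # 0.0
print(van_rossum_distance([0.1, 0.3, 0.5], [0.12, 0.3, 0.5]))  # > 0
```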

    Visual experience dependent control of NMDAR subunit composition and neuronal gene expression: a critical role for the GluN2A C-terminal domain

    Get PDF
    The N-methyl-D-aspartate receptor (NMDAR) is a Ca²⁺-permeant glutamate receptor which plays key roles in health and disease. Canonical NMDARs are heterotetramers that contain two GluN2 subunits, of which GluN2A and GluN2B are predominant in the forebrain. Moreover, the relative contribution of GluN2A vs. GluN2B is controlled both developmentally and in an activity-dependent manner. GluN2 subtypes influence the biophysical properties of the receptor through differences in their N-terminal extracellular domains and transmembrane regions, but they also have large cytoplasmic carboxyl (C)-terminal domains (CTDs) which have diverged substantially during evolution. While the CTD identity does not influence NMDAR subunit-specific channel properties, it determines the nature of CTD-associated signalling molecules and has been implicated in mediating the control of subunit composition (2A vs. 2B) at the synapse. However, the role of CTD identity in mediating activity-dependent changes in NMDAR subunit composition remains unclear. First, I investigate the contribution of sensory experience to changes in the NMDAR GluN2A:GluN2B composition by using two different dark rearing protocols: 1) mice were dark reared from birth to assess the contribution of sensory experience to the developmental increase in the ratio of 2A to 2B; 2) mice raised in standard light conditions were taken during the third postnatal week and placed in the dark for 7 days, with or without subsequent re-exposure to light, to interrogate subunit-specific regulation in response to changes in the level of synaptic activity. Here I show that, while dark rearing from birth has little effect on NMDAR subunit composition, 7 days of dark rearing produces a marked decrease in the ratio of 2A to 2B, which is reversed upon re-exposure to light. Crucially, I demonstrate that changes in the ratio are driven by changes in the level of synaptic GluN2A, and not GluN2B. This suggests that GluN2A is dynamically regulated by activity, and points to a GluN2A-dependent mechanism of insertion/removal from the synapse. Historically, much of the research into the differential function of GluN2 CTDs has been conducted in vitro by over-expressing mutant subunits, but more recently the generation of knock-in (KI) mouse models has allowed CTD function to be probed in vivo and in ex vivo systems without heterologous expression of GluN2 mutants. Taking advantage of a KI mouse model with the GluN2A CTD (CTD²ᴬ) swapped for that of GluN2B (CTD²ᴮ) (GluN2A²ᴮ⁽ᶜᵀᴿ⁾/²ᴮ⁽ᶜᵀᴿ⁾), I next investigate the role of the CTD²ᴬ in experience-dependent changes in the level of synaptic GluN2A. I demonstrate that the CTD²ᴬ is required for activity-dependent synaptic expression of GluN2A. Furthermore, I find that the transcriptomic response to changes in sensory experience is blunted in the absence of the CTD²ᴬ. Collectively, this work establishes an important role for CTD²ᴬ in driving activity-dependent changes to both NMDAR subunit composition and gene transcription.

    2006 Special Issue: Homeostatic synaptic scaling in self-organizing maps

    No full text
    Various forms of the self-organizing map (SOM) have been proposed as models of cortical development [Choe Y., Miikkulainen R., (2004). Contour integration and segmentation with self-organized lateral connections. Biological Cybernetics, 90, 75–88; Kohonen T., (2001). Self-organizing maps (3rd ed.). Springer; Sirosh J., Miikkulainen R., (1997). Topographic receptive fields and patterned lateral interaction in a self-organizing model of the primary visual cortex. Neural Computation, 9(3), 577–594]. Typically, these models use weight normalization to contain the weight growth associated with Hebbian learning. A more plausible mechanism for controlling the Hebbian process has recently emerged. Turrigiano and Nelson [Turrigiano G.G., Nelson S.B., (2004). Homeostatic plasticity in the developing nervous system. Nature Reviews Neuroscience, 5, 97–107] have shown that neurons in the cortex actively maintain an average firing rate by scaling their incoming weights. In this work, it is shown that this type of homeostatic synaptic scaling can replace the common, but unsupported, standard weight normalization. Organized maps still form and the output neurons are able to maintain an unsaturated firing rate, even in the face of large-scale cell proliferation or die-off. In addition, it is shown that in some cases synaptic scaling leads to networks that more accurately reflect the probability distribution of the input data. © 2006 Elsevier Ltd. All rights reserved. Keywords: Self-organizing map; Homeostasis; Weight normalization; Synaptic scaling
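    As a rough, hedged illustration of the idea (not the authors' code), the sketch below trains a toy SOM in which the usual explicit weight normalization is replaced by a homeostatic scaling step that multiplicatively adjusts each output unit's incoming weights toward a target average activation. Grid size, learning rates, neighbourhood width and the activation target are all assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

GRID = 10            # 10 x 10 output map (assumed)
DIM = 3              # input dimensionality (assumed)
TARGET_ACT = 0.5     # homeostatic target for average activation (assumed)
ETA_W, ETA_SCALE, SIGMA = 0.1, 0.01, 2.0

coords = np.array([(i, j) for i in range(GRID) for j in range(GRID)], dtype=float)
W = rng.random((GRID * GRID, DIM))           # incoming weights of each output unit
avg_act = np.full(GRID * GRID, TARGET_ACT)   # running estimate of each unit's activation

for step in range(5000):
    x = rng.random(DIM)                      # toy input drawn uniformly from [0, 1)^DIM

    act = W @ x                              # dot-product activation of each unit
    winner = int(np.argmax(act))

    # Standard SOM neighbourhood update: pull the winner and its map
    # neighbours toward the current input.
    d2 = np.sum((coords - coords[winner]) ** 2, axis=1)
    h = np.exp(-d2 / (2.0 * SIGMA ** 2))
    W += ETA_W * h[:, None] * (x - W)

    # Homeostatic synaptic scaling in place of explicit weight normalization:
    # each unit multiplicatively rescales its incoming weights so that its
    # running average activation drifts toward the target.
    avg_act = 0.99 * avg_act + 0.01 * act
    W *= (1.0 + ETA_SCALE * (TARGET_ACT - avg_act))[:, None]
    W = np.clip(W, 0.0, None)
```

    The point of this design, as in the abstract, is that no hard renormalisation of the weight vectors is ever applied; containment of Hebbian weight growth comes from the slow multiplicative scaling alone.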

    Neural Networks 2006 Special Issue "Advances in Self-Organizing Maps-WSOM 05"

    No full text
    Special issue of Neural Networks Journal after the WSOM 05 Conference.
    Neural Networks, Volume 19, Issues 6-7, Pages 721-976 (July-August 2006). Advances in Self-Organising Maps - WSOM'05. Edited by Marie Cottrell and Michel Verleysen.
    1. Advances in Self-Organizing Maps, pages 721-722, Marie Cottrell and Michel Verleysen
    2. Self-organizing neural projections, pages 723-733, Teuvo Kohonen
    3. Homeostatic synaptic scaling in self-organizing maps, pages 734-743, Thomas J. Sullivan and Virginia R. de Sa
    4. Topographic map formation of factorized Edgeworth-expanded kernels, pages 744-750, Marc M. Van Hulle
    5. Large-scale data exploration with the hierarchically growing hyperbolic SOM, pages 751-761, Jörg Ontrup and Helge Ritter
    6. Batch and median neural gas, pages 762-771, Marie Cottrell, Barbara Hammer, Alexander Hasenfuß and Thomas Villmann
    7. Fuzzy classification by fuzzy labeled neural gas, pages 772-779, Th. Villmann, B. Hammer, F. Schleif, T. Geweniger and W. Herrmann
    8. On the equivalence between kernel self-organising maps and self-organising mixture density networks, pages 780-784, Hujun Yin
    9. Adaptive filtering with the self-organizing map: A performance comparison, pages 785-798, Guilherme A. Barreto and Luís Gustavo M. Souza
    10. The Self-Organizing Relationship (SOR) network employing fuzzy inference based heuristic evaluation, pages 799-811, Takanori Koga, Keiichi Horio and Takeshi Yamakawa
    11. SOM's mathematics, pages 812-816, J.C. Fort
    12. Performance analysis of LVQ algorithms: A statistical physics approach, pages 817-829, Anarta Ghosh, Michael Biehl and Barbara Hammer
    13. Self-organizing map algorithm and distortion measure, pages 830-837, Joseph Rynkiewicz
    14. Understanding and reducing variability of SOM neighbourhood structure, pages 838-846, Patrick Rousset, Christiane Guinot and Bertrand Maillet
    15. Assessing self organizing maps via contiguity analysis, pages 847-854, Ludovic Lebart
    16. Fast algorithm and implementation of dissimilarity self-organizing maps, pages 855-863, Brieuc Conan-Guez, Fabrice Rossi and Aïcha El Golli
    17. Graph-based normalization and whitening for non-linear data analysis, pages 864-876, Catherine Aaron
    18. Unfolding preprocessing for meaningful time series clustering, pages 877-888, Geoffroy Simon, John A. Lee and Michel Verleysen
    19. Local multidimensional scaling, pages 889-899, Jarkko Venna and Samuel Kaski
    20. Spherical self-organizing map using efficient indexed geodesic data structure, pages 900-910, Yingxin Wu and Masahiro Takatsuka
    21. Advanced visualization of Self-Organizing Maps with vector fields, pages 911-922, Georg Pölzlbauer, Michael Dittenbach and Andreas Rauber
    22. Online data visualization using the neural gas network, pages 923-934, Pablo A. Estévez and Cristián J. Figueroa
    23. TreeSOM: Cluster analysis in the self-organizing map, pages 935-949, Elena V. Samsonova, Joost N. Kok and Ad P. IJzerman
    24. Self-organizing neural networks to support the discovery of DNA-binding motifs, pages 950-962, Shaun Mahony, Panayiotis V. Benos, Terry J. Smith and Aaron Golden
    25. A descriptive method to evaluate the number of regimes in a switching autoregressive model, pages 963-972, Madalina Oltean