
    Plasticity of GABA(B) receptor-mediated heterosynaptic interactions at mossy fibers after status epilepticus

    Several neurotransmitters, including GABA acting at presynaptic GABA(B) receptors, modulate glutamate release at synapses between hippocampal mossy fibers and CA3 pyramidal neurons. This phenomenon gates excitation of the hippocampus and may therefore prevent limbic seizure propagation. Here we report that status epilepticus, triggered by either perforant path stimulation or pilocarpine administration, was followed 24 hr later by a loss of GABA(B) receptor-mediated heterosynaptic depression among populations of mossy fibers. This was accompanied by a decrease in the sensitivity of mossy fiber transmission to the exogenous GABA(B) receptor agonist baclofen. Autoradiography revealed a reduction in GABA(B) receptor binding in the stratum lucidum after status epilepticus. Failure of GABA(B) receptor-mediated modulation of mossy fiber transmission may contribute to the development of spontaneous seizures after status epilepticus.

    A Heterosynaptic Learning Rule for Neural Networks

    In this article we introduce a novel stochastic Hebb-like learning rule for neural networks that is neurobiologically motivated. This learning rule combines features of unsupervised (Hebbian) and supervised (reinforcement) learning and is stochastic with respect to the selection of the time points when a synapse is modified. Moreover, the learning rule does not only affect the synapse between the pre- and postsynaptic neuron, which is called homosynaptic plasticity, but also affects more remote synapses of the pre- and postsynaptic neurons. This more complex form of synaptic plasticity has recently come under investigation in neurobiology and is called heterosynaptic plasticity. We demonstrate that this learning rule is useful in training neural networks by learning parity functions, including the exclusive-or (XOR) mapping, in a multilayer feed-forward network. We find that our stochastic learning rule works well, even in the presence of noise. Importantly, the mean learning time increases polynomially with the number of patterns to be learned, indicating efficient learning.
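
    The abstract does not spell out the rule itself, but its two ingredients, stochastic selection of update times and heterosynaptic spread to other synapses of the same neurons, can be sketched in a few lines. The Python toy below applies such a rule to the XOR task mentioned above; the network size, constants, and the reward-modulated form of the Hebbian term are all illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-2-1 network for XOR; W1 feeds the hidden layer, W2 the output.
W1 = rng.normal(0, 0.5, size=(2, 2))
W2 = rng.normal(0, 0.5, size=(1, 2))

def forward(x):
    h = np.tanh(W1 @ x)
    y = np.tanh(W2 @ h)
    return h, y

# Hypothetical update illustrating the abstract's two ingredients:
# (1) stochastic timing -- a synapse is only eligible for change
#     with probability p_update on any given step;
# (2) heterosynaptic spread -- unselected synapses sharing a pre- or
#     postsynaptic neuron receive a scaled share of the change.
def hebb_step(W, pre, post, reward, lr=0.1, p_update=0.3, spread=0.2):
    hebb = np.outer(post, pre)               # classic Hebbian term
    mask = rng.random(W.shape) < p_update    # stochastic synapse selection
    dW = lr * reward * hebb * mask
    dW += spread * (dW.mean(axis=1, keepdims=True)
                    + dW.mean(axis=0, keepdims=True)) * ~mask
    return W + dW

X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], float)
targets = np.array([-1, 1, 1, -1], float)    # XOR in the tanh range

for step in range(2000):
    i = rng.integers(4)
    h, y = forward(X[i])
    reward = 1.0 if np.sign(y[0]) == np.sign(targets[i]) else -1.0
    W2 = hebb_step(W2, h, y, reward)
    W1 = hebb_step(W1, X[i], h, reward)
```

    Whether such a toy actually converges on XOR depends on the constants chosen; the point of the sketch is only how the stochastic mask and the heterosynaptic spread term enter the weight update.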

    The N-methyl-d-aspartate receptor antagonist CPP alters synapse and spine structure and impairs long-term potentiation and long-term depression induced morphological plasticity in dentate gyrus of the awake rat

    Long-term morphological synaptic changes associated with homosynaptic long-term potentiation (LTP) and heterosynaptic long-term depression (LTD) in vivo, in awake adult rats, were analyzed using three-dimensional (3-D) reconstructions of electron microscope images of ultrathin serial sections from the molecular layer of the dentate gyrus. For the first time in morphological studies, the specificity of the effects of LTP and LTD on both spine and synapse ultrastructure was determined using the N-methyl-d-aspartate (NMDA) receptor antagonist CPP (3-[(R)-2-carboxypiperazin-4-yl]-propyl-1-phosphonic acid). There were no differences in synaptic density 24 h after LTP or LTD induction, and CPP alone had no effect on synaptic density. LTP significantly increased the proportion of mushroom spines, whereas LTD increased the proportion of thin spines, and both LTP and LTD decreased stubby spine number. Both LTP and LTD significantly increased spine head evaginations (spinules) into synaptic boutons, and CPP blocked these changes. Synaptic boutons were smaller after LTD, indicating a pre-synaptic effect. Interestingly, CPP alone decreased bouton and mushroom spine volumes, as well as post-synaptic density (PSD) volume of mushroom spines. These data show similarities, but also some clear differences, between the effects of LTP and LTD on spine and synaptic morphology. Although CPP blocks both LTP and LTD, and impairs most morphological changes in spines and synapses, CPP alone was shown to exert effects on aspects of spine and synaptic structure.

    Synaptic tagging and capture : differential role of distinct calcium/calmodulin kinases in protein synthesis-dependent long-term potentiation

    Weakly tetanized synapses in area CA1 of the hippocampus that ordinarily display long-term potentiation lasting ~3 h (called early-LTP) will maintain a longer-lasting change in efficacy (late-LTP) if the weak tetanization occurs shortly before or after strong tetanization of an independent, but convergent, set of synapses in CA1. The synaptic tagging and capture hypothesis explains this heterosynaptic influence on persistence in terms of a distinction between local mechanisms of synaptic tagging and cell-wide mechanisms responsible for the synthesis, distribution, and capture of plasticity-related proteins (PRPs). We now present evidence that distinct CaM kinase (CaMK) pathways serve a dissociable role in these mechanisms. Using a hippocampal brain-slice preparation that permits stable long-term recordings in vitro for >10 h, and using hippocampal cultures to validate the differential drug effects on distinct CaMK pathways, we show that tag setting is blocked by the CaMK inhibitor KN-93 (2-[N-(2-hydroxyethyl)]-N-(4-methoxybenzenesulfonyl)amino-N-(4-chlorocinnamyl)-N-methylbenzylamine), which, at low concentrations, is more selective for CaMKII. In contrast, the CaMK kinase inhibitor STO-609 [7H-benzimidazo(2,1-a)benz(de)isoquinoline-7-one-3-carboxylic acid] specifically limits the synthesis and/or availability of PRPs. Analytically powerful three-pathway protocols, using sequential strong and weak tetanization in varying orders and test stimulation over long periods after LTP induction, enable a pharmacological dissociation of these distinct roles of the CaMK pathways in late-LTP and so provide a novel framework for the molecular mechanisms by which synaptic potentiation, and possibly memories, become stabilized.

    A novel plasticity rule can explain the development of sensorimotor intelligence

    Grounding autonomous behavior in the nervous system is a fundamental challenge for neuroscience. In particular, self-organized behavioral development raises more questions than answers. Are there special functional units for curiosity, motivation, and creativity? This paper argues that these features can be grounded in synaptic plasticity itself, without requiring any higher-level constructs. We propose differential extrinsic plasticity (DEP) as a new synaptic rule for self-learning systems and apply it to a number of complex robotic systems as a test case. Without specifying any purpose or goal, seemingly purposeful and adaptive behavior develops, displaying a certain level of sensorimotor intelligence. These surprising results require no system-specific modifications of the DEP rule but arise from the underlying mechanism of spontaneous symmetry breaking due to the tight brain-body-environment coupling. The new synaptic rule is biologically plausible and would be an interesting target for neurobiological investigation. We also argue that this neuronal mechanism may have been a catalyst in natural evolution.
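
    The abstract gives no equations, but a DEP-style update can be sketched under strong simplifying assumptions. In the minimal sketch below, the controller is a single tanh layer, the "body" is a toy leaky plant, the inverse model is assumed to be the identity, and plasticity is driven by time-lagged correlations of sensor *derivatives* rather than activities; every constant and the plant itself are hypothetical stand-ins, not the authors' robotic setup.

```python
import numpy as np

# DEP-style sketch for a one-layer controller y(t) = tanh(C x(t) + h),
# where sensors x feed back to the controller. Assuming (hypothetically)
# an identity inverse model, the update reduces to
#   dC ~ x_dot(t) . x_dot(t - theta)^T  -  C / tau,
# i.e. correlations between current and lagged sensor changes, with decay.
rng = np.random.default_rng(1)
n = 4                                  # sensors = motors (toy choice)
C = rng.normal(0, 0.1, size=(n, n))    # controller weights
h = np.zeros(n)                        # bias
theta = 1                              # time lag in steps
eta, tau = 0.01, 100.0                 # learning rate, decay time constant

x = np.zeros(n)
history = [x.copy() for _ in range(theta + 1)]

for t in range(5000):
    y = np.tanh(C @ x + h)             # motor command
    # stand-in "body": sensors relax toward the motor command plus noise
    x_new = 0.9 * x + 0.1 * y + 0.01 * rng.normal(size=n)
    history.append(x_new.copy())
    x_dot_now = history[-1] - history[-2]
    x_dot_lag = history[-1 - theta] - history[-2 - theta]
    # DEP-style update: lagged derivative correlations plus passive decay
    C += eta * np.outer(x_dot_now, x_dot_lag) - (1.0 / tau) * C
    x = x_new
```

    The interesting behavior described in the abstract comes from closing this loop through a real body with nontrivial dynamics; with the toy plant above the sketch only demonstrates the shape of the update.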

    A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks

    We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule including passive forgetting and different time scales for neuronal activity and learning dynamics. Previous numerical works have reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, involving a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which introduce both a structural and a dynamical point of view on the neural network evolution. Furthermore, we show that the sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
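
    The Jacobian-based analysis described here translates directly into a numerical experiment. A minimal sketch, with illustrative parameters rather than the paper's, iterates a random tanh network, applies a generic Hebbian step with passive forgetting on a slower time scale, and estimates the largest Lyapunov exponent by propagating a tangent vector through the Jacobians J_t = diag(1 - x_{t+1}^2) W.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100
g = 3.0                                   # gain chosen large enough for chaos
W = rng.normal(0, g / np.sqrt(N), size=(N, N))
x = rng.uniform(-1, 1, N)
eps, lam = 1e-3, 1e-2                     # slow learning rate, passive forgetting

v = rng.normal(size=N)                    # tangent vector for the Lyapunov estimate
v /= np.linalg.norm(v)
log_growth = 0.0
steps = 5000

for t in range(steps):
    x_new = np.tanh(W @ x)
    # tangent dynamics: Jacobian of x -> tanh(W x) evaluated on the trajectory
    J = (1.0 - x_new**2)[:, None] * W
    v = J @ v
    log_growth += np.log(np.linalg.norm(v))
    v /= np.linalg.norm(v)
    # generic Hebbian step with passive forgetting, on a slower time scale
    W += eps * (np.outer(x_new, x) / N - lam * W)
    x = x_new

print("largest Lyapunov exponent ~", log_growth / steps)
```

    Tracking this estimate as learning proceeds is one way to observe the chaos-to-steady-state transition the abstract describes, with the exponent drifting from positive values toward zero.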