
    Memory and information processing in neuromorphic systems

    A striking difference between brain-inspired neuromorphic processors and current von Neumann processor architectures is the way in which memory and processing are organized. As Information and Communication Technologies continue to address the need for increased computational power by increasing the number of cores within a digital processor, neuromorphic engineers and scientists can complement this approach by building processor architectures in which memory is distributed together with the processing. In this paper we present a survey of brain-inspired processor architectures that support models of cortical networks and deep neural networks. These architectures range from serial clocked implementations of multi-neuron systems to massively parallel asynchronous ones, and from purely digital systems to mixed analog/digital systems that implement more biologically realistic models of neurons and synapses, together with a suite of adaptation and learning mechanisms analogous to those found in biological nervous systems. We describe the advantages of the different approaches being pursued and present the challenges that need to be addressed for building artificial neural processing systems that can display the richness of behaviors seen in biological systems.
    Comment: Submitted to Proceedings of the IEEE; a review of recently proposed neuromorphic computing platforms and systems

    Dynamical principles in neuroscience

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?
    This work was supported by NSF Grant No. NSF/EIA-0130708, and Grant No. PHY 0414174; NIH Grant No. 1 R01 NS50945 and Grant No. NS40110; MEC BFI2003-07276, and Fundación BBVA

    Network self-organization explains the statistics and dynamics of synaptic connection strengths in cortex

    The information processing abilities of neural circuits arise from their synaptic connection patterns. Understanding the laws governing these connectivity patterns is essential for understanding brain function. The overall distribution of synaptic strengths of local excitatory connections in cortex and hippocampus is long-tailed, exhibiting a small number of synaptic connections of very large efficacy. At the same time, new synaptic connections are constantly being created and individual synaptic connection strengths show substantial fluctuations across time. It remains unclear through what mechanisms these properties of neural circuits arise and how they contribute to learning and memory. In this study we show that fundamental characteristics of excitatory synaptic connections in cortex and hippocampus can be explained as a consequence of self-organization in a recurrent network combining spike-timing-dependent plasticity (STDP), structural plasticity and different forms of homeostatic plasticity. In the network, associative synaptic plasticity in the form of STDP induces a rich-get-richer dynamics among synapses, while homeostatic mechanisms induce competition. Under distinctly different initial conditions, the ensuing self-organization produces long-tailed synaptic strength distributions matching experimental findings. We show that this self-organization can take place with a purely additive STDP mechanism and that multiplicative weight dynamics emerge as a consequence of network interactions. The observed patterns of fluctuation of synaptic strengths, including elimination and generation of synaptic connections and long-term persistence of strong connections, are consistent with the dynamics of dendritic spines found in rat hippocampus. Beyond this, the model predicts an approximately power-law scaling of the lifetimes of newly established synaptic connection strengths during development. 
Our results suggest that the combined action of multiple forms of neuronal plasticity plays an essential role in the formation and maintenance of cortical circuits.
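The self-organization mechanism described above, additive STDP producing rich-get-richer growth held in check by homeostatic normalization, can be caricatured in a few lines. The sketch below is an illustrative toy, not the paper's network model: the synapse count, step sizes, and the weight-proportional pairing probability are all assumptions. Stronger synapses are more likely to be potentiated, and rescaling the total weight supplies the competition, which is enough to turn a narrow initial weight distribution into a long-tailed one.

```python
import random

random.seed(1)

N_SYN = 500      # number of model synapses (illustrative)
STEPS = 5000     # number of potentiation events (illustrative)
W_INIT = 0.1     # narrow initial weight for every synapse (assumed)
DW = 1.0         # additive STDP potentiation step (assumed)

w = [W_INIT] * N_SYN
W_TOTAL = sum(w)          # homeostatically conserved total weight

for _ in range(STEPS):
    # rich-get-richer: a stronger synapse drives its target more reliably,
    # so it is more likely to receive a causal (potentiating) STDP pairing;
    # here that is modeled as weight-proportional selection
    r = random.uniform(0.0, sum(w))
    acc = 0.0
    for i, wi in enumerate(w):
        acc += wi
        if acc >= r:
            break
    w[i] += DW            # purely additive potentiation

    # homeostatic competition: rescale so the summed weight is conserved;
    # this normalization makes the *effective* dynamics multiplicative
    scale = W_TOTAL / sum(w)
    w = [wi * scale for wi in w]

w.sort(reverse=True)
top_share = sum(w[: N_SYN // 20]) / W_TOTAL
print(f"top 5% of synapses carry {top_share:.0%} of the total weight")
```

Note that each individual update is additive, yet the conserved-sum rescaling makes the effective weight dynamics multiplicative, mirroring the emergence of multiplicative dynamics from network interactions described in the abstract.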

    Regulation of circuit organization and function through inhibitory synaptic plasticity

    Diverse inhibitory neurons in the mammalian brain shape circuit connectivity and dynamics through mechanisms of synaptic plasticity. Inhibitory plasticity can establish excitation/inhibition (E/I) balance, control neuronal firing, and affect local calcium concentration, hence regulating neuronal activity at the network, single-neuron, and dendritic levels. Computational models can synthesize multiple experimental results and provide insight into how inhibitory plasticity controls circuit dynamics and sculpts connectivity by identifying phenomenological learning rules amenable to mathematical analysis. We highlight recent studies on the role of inhibitory plasticity in modulating excitatory plasticity, forming structured networks underlying memory formation and recall, and implementing adaptive phenomena and novelty detection. We conclude with experimental and modeling progress on the role of interneuron-specific plasticity in circuit computation and context-dependent learning.
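As a minimal illustration of the kind of phenomenological rule such models analyze, the sketch below implements a rate-based homeostatic inhibitory plasticity rule; the rule's exact form and all parameter values are assumptions for illustration, not taken from the studies reviewed. Inhibition is strengthened when the postsynaptic rate exceeds a target and weakened when it falls below, which drives the neuron to an E/I-balanced set point.

```python
TARGET = 5.0   # target postsynaptic rate in Hz (assumed)
ETA = 0.001    # inhibitory learning rate (assumed)
R_EXC = 50.0   # fixed excitatory drive, Hz (assumed)
R_INH = 10.0   # presynaptic inhibitory firing rate, Hz (assumed)

w_inh = 0.0    # inhibitory weight, initially silent
for _ in range(10000):
    # rectified rate of the postsynaptic neuron under current E/I balance
    post = max(0.0, R_EXC - w_inh * R_INH)
    # homeostatic rule: strengthen inhibition when the neuron fires above
    # target, weaken it below (presynaptic rate times rate error)
    w_inh = max(0.0, w_inh + ETA * R_INH * (post - TARGET))

print(round(post, 2), round(w_inh, 3))  # rate settles at the target
```

The fixed point is reached when excitation minus inhibition equals the target rate; here the inhibitory weight converges to (R_EXC - TARGET) / R_INH = 4.5, so the rule finds the E/I balance automatically rather than having it built in.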

    Presynaptic adenosine receptor-mediated regulation of diverse thalamocortical short-term plasticity in the mouse whisker pathway

    Short-term synaptic plasticity (STP) sets the sensitivity of a synapse to incoming activity and determines the temporal patterns that it best transmits. In “driver” thalamocortical (TC) synaptic populations, STP is dominated by depression during stimulation from rest. However, during ongoing stimulation, lemniscal TC connections onto layer 4 neurons in mouse barrel cortex express variable STP. Each synapse responds to input trains with a distinct pattern of depression or facilitation around its mean steady-state response. As a result, in common with other synaptic populations, lemniscal TC synapses express diverse rather than uniform dynamics, allowing for a rich representation of temporally varying stimuli. Here, we show that this STP diversity is regulated presynaptically. Presynaptic adenosine receptors of the A1R type, but not kainate receptors (KARs), modulate STP behavior. Blocking the receptors does not eliminate diversity, indicating that diversity is related to heterogeneous expression of multiple mechanisms in the pathway from presynaptic calcium influx to neurotransmitter release.
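The depression and facilitation dynamics described here are commonly captured by the phenomenological Tsodyks-Markram model of short-term plasticity. The sketch below uses that standard model, not the paper's own fits, with illustrative parameter values to show how a single pair of equations yields either a depressing, "driver"-like response train or a facilitating one.

```python
import math

def tm_train(n, isi, U, tau_rec, tau_fac):
    """Per-spike release fraction (~ PSP amplitude) for a regular spike
    train under the Tsodyks-Markram short-term plasticity model.

    n       -- number of spikes in the train
    isi     -- inter-spike interval, ms
    U       -- baseline release probability
    tau_rec -- recovery time constant of resources (depression), ms
    tau_fac -- decay time constant of facilitation, ms
    """
    u, R = 0.0, 1.0   # facilitation variable and available resources
    eff = []
    for k in range(n):
        if k > 0:
            R = 1.0 - (1.0 - R) * math.exp(-isi / tau_rec)  # resources recover
            u = u * math.exp(-isi / tau_fac)                # facilitation decays
        u = u + U * (1.0 - u)   # facilitation jump at each spike
        eff.append(u * R)       # fraction of resources released
        R = R * (1.0 - u)       # resources consumed by the release
    return eff

# depression-dominated ("driver"-like) vs facilitating parameters;
# these values are illustrative, not fitted to the thalamocortical data
dep = tm_train(n=8, isi=25.0, U=0.7, tau_rec=500.0, tau_fac=20.0)
fac = tm_train(n=8, isi=25.0, U=0.1, tau_rec=50.0, tau_fac=500.0)
print([round(x, 3) for x in dep])  # amplitudes fall along the train
print([round(x, 3) for x in fac])  # amplitudes grow toward steady state
```

Because the same equations cover both regimes, heterogeneity in a few presynaptic parameters (here U, tau_rec, tau_fac) is enough to produce the diverse depression/facilitation patterns the abstract describes.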

    Adaptive and Phase Selective Spike Timing Dependent Plasticity in Synaptically Coupled Neuronal Oscillators

    We consider and analyze the influence of spike-timing-dependent plasticity (STDP) on homeostatic states in synaptically coupled neuronal oscillators. In contrast to conventional models of STDP, in which spike timing affects the weights of synaptic connections, we consider a model of STDP in which the time lags between pre- and/or post-synaptic spikes change the internal state of the pre- and/or post-synaptic neurons, respectively. The analysis reveals that STDP processes of this type, modeled by a single ordinary differential equation, may ensure efficient, yet coarse, phase-locking of spikes in the system to a given reference phase. The precision of the phase locking, i.e. the amplitude of relative phase deviations from the reference, depends on the natural frequencies of the oscillators and, additionally, on the parameters of the STDP law. These deviations can be optimized by appropriate tuning of the gains (i.e. the sensitivity to spike-timing mismatches) of the STDP mechanism. However, as we demonstrate, such deviations cannot be made arbitrarily small either by mere tuning of STDP gains or by adjusting synaptic weights. Thus, if accurate phase-locking in the system is required, an additional tuning mechanism is generally needed. We found that adding a very simple adaptation dynamics, in the form of slow fluctuations of the baseline in the STDP mechanism, enables accurate phase tuning in the system with arbitrarily high precision. Adaptation operating at a slow time scale may be associated with extracellular structures such as the extracellular matrix and glia. Thus the findings may suggest a possible role of the latter in regulating synaptic transmission in neuronal circuits.
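The role of the slow baseline adaptation can be understood by analogy with proportional-plus-integral control: a proportional phase correction alone leaves a residual offset set by the frequency mismatch and the gain, while a slowly integrating baseline absorbs the mismatch entirely. The reduction below is an illustrative caricature with assumed gains and frequencies, not the paper's equations.

```python
import math

def wrap(x):
    """Wrap a phase difference into (-pi, pi]."""
    return (x + math.pi) % (2 * math.pi) - math.pi

def lock(omega, omega_ref, gain, eta_b, steps=200000, dt=1e-3):
    """Phase oscillator pulled toward a reference oscillator.

    gain  -- proportional correction (STDP-like sensitivity to timing error)
    eta_b -- rate of the slow baseline adaptation (0 disables it)
    Returns the residual phase deviation after the transient.
    """
    phi, phi_ref, b = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = wrap(phi - phi_ref)
        phi += dt * (omega - gain * err - b)  # proportional phase correction
        b += dt * eta_b * err                 # slow baseline integrates the error
        phi_ref += dt * omega_ref
    return wrap(phi - phi_ref)

# proportional correction alone: residual offset = (omega - omega_ref) / gain
res_no_adapt = lock(omega=10.0, omega_ref=9.5, gain=5.0, eta_b=0.0)
# with slow adaptation the baseline absorbs the mismatch and the offset vanishes
res_adapt = lock(omega=10.0, omega_ref=9.5, gain=5.0, eta_b=2.0)
print(abs(res_no_adapt), abs(res_adapt))
```

In this reduction the residual without adaptation is exactly the frequency mismatch divided by the gain, so no finite gain can remove it, while the integrating baseline converges to the mismatch itself and drives the deviation to zero, matching the qualitative conclusion of the abstract.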