
    Nonlinear Hebbian learning as a unifying principle in receptive field formation

    The development of sensory receptive fields has been modeled in the past by a variety of approaches, including normative models such as sparse coding or independent component analysis, and bottom-up models such as spike-timing-dependent plasticity or the Bienenstock-Cooper-Munro model of synaptic plasticity. Here we show that this variety of approaches can be unified under a single common principle, namely Nonlinear Hebbian Learning. When Nonlinear Hebbian Learning is applied to natural images, receptive field shapes are strongly constrained by the input statistics and preprocessing, but exhibit only modest variation across different choices of nonlinearities in neuron models or synaptic plasticity rules. Neither overcompleteness nor sparse network activity is necessary for the development of localized receptive fields. The analysis of alternative sensory modalities, such as auditory models or V2 development, leads to the same conclusions. In all examples, receptive fields can be predicted a priori by reformulating an abstract model as nonlinear Hebbian learning. Thus nonlinear Hebbian learning and natural statistics can account for many aspects of receptive field formation across models and sensory modalities.
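The core rule the abstract appeals to can be sketched in a few lines: a weight update proportional to the input times a nonlinear function of the neuron's output, with divisive normalization to keep the weights bounded. This is an illustrative numpy sketch (toy data, hypothetical learning rate and nonlinearity), not the authors' exact model:

```python
import numpy as np

rng = np.random.default_rng(0)

def nonlinear_hebb(X, f, eta=0.01, epochs=20):
    """Fit one neuron's weight vector with the rule dw ~ eta * f(w.x) * x,
    renormalizing ||w|| after each step so the weights stay bounded."""
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            y = w @ x                   # neuron output before the nonlinearity
            w += eta * f(y) * x         # nonlinear Hebbian update
            w /= np.linalg.norm(w)      # divisive normalization
    return w

# Toy data: sparse (Laplacian) sources mixed linearly, standing in for
# whitened natural-image patches.
X = rng.laplace(size=(300, 8)) @ rng.normal(size=(8, 8))
w = nonlinear_hebb(X, f=lambda y: y**3)   # cubic nonlinearity as one choice
```

Swapping `f` (e.g. a rectifier or a sigmoid derivative) is the sense in which the abstract's different models become instances of one rule.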

    Selectivity and Metaplasticity in a Unified Calcium-Dependent Model

    A unified, biophysically motivated Calcium-Dependent Learning model has been shown to account for various rate-based and spike-timing-dependent paradigms for inducing synaptic plasticity. Here, we investigate the properties of this model for a multi-synapse neuron that receives inputs with different spike-train statistics. In addition, we present a physiological form of metaplasticity, an activity-driven regulation mechanism that is essential for the robustness of the model. A neuron thus implemented develops stable and selective receptive fields, given various input statistics.
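A minimal rate-based sketch of the kind of rule the abstract describes: a BCM-style update whose modification threshold slides with recent postsynaptic activity (metaplasticity), stabilizing learning. All parameter values here are illustrative assumptions, not the calcium model's fitted constants:

```python
import numpy as np

def bcm_step(w, x, theta, eta=0.005, tau=50.0):
    """One BCM-style update with a sliding modification threshold
    (metaplasticity): LTP when the postsynaptic rate y exceeds theta,
    LTD below it; theta tracks the running average of y**2."""
    y = max(float(w @ x), 0.0)                       # rectified postsynaptic rate
    w = np.clip(w + eta * y * (y - theta) * x, 0.0, None)
    theta = theta + (y**2 - theta) / tau             # sliding threshold
    return w, theta

# Drive a 4-synapse neuron with two input patterns; with the sliding
# threshold the weights should stay bounded and become selective
# rather than saturating.
rng = np.random.default_rng(1)
patterns = np.array([[1.0, 1.0, 0.0, 0.0],
                     [0.0, 0.0, 0.5, 0.5]])
w, theta = np.full(4, 0.5), 1.0
for _ in range(2000):
    w, theta = bcm_step(w, patterns[rng.integers(2)], theta)
```

The quadratic dependence of `theta` on the output rate is what prevents runaway potentiation: once the favored pattern drives the neuron hard, the threshold overtakes the response and the rule turns depressive.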

    Dual coding with STDP in a spiking recurrent neural network model of the hippocampus.

    The firing rate of single neurons in the mammalian hippocampus has been demonstrated to encode a range of spatial and non-spatial stimuli. It has also been demonstrated that the phase of firing, with respect to the theta oscillation that dominates the hippocampal EEG during stereotypical learning behaviour, correlates with an animal's spatial location. These findings have led to the hypothesis that the hippocampus operates using a dual (rate and temporal) coding system. To investigate the phenomenon of dual coding in the hippocampus, we examine a spiking recurrent network model with theta-coded neural dynamics and an STDP rule that mediates rate-coded Hebbian learning when pre- and post-synaptic firing is stochastic. We demonstrate that this plasticity rule can generate both symmetric and asymmetric connections between neurons that fire at concurrent or successive theta phases, respectively, and can subsequently produce both pattern completion and sequence prediction from partial cues. This unifies previously disparate auto- and hetero-associative network models of hippocampal function and provides them with a firmer basis in modern neurobiology. Furthermore, the encoding and reactivation of activity in mutually exciting Hebbian cell assemblies demonstrated here is believed to represent a fundamental mechanism of cognitive processing in the brain.

    Analog Spiking Neuromorphic Circuits and Systems for Brain- and Nanotechnology-Inspired Cognitive Computing

    Human society now faces grand challenges: satisfying the growing demand for computing power while keeping energy consumption sustainable. With CMOS technology scaling approaching its end, innovations are required to tackle these challenges in a radically different way. Inspired by the emerging understanding of computation in the brain and by nanotechnology-enabled, biologically plausible synaptic plasticity, neuromorphic computing architectures are being investigated. A neuromorphic chip that combines CMOS analog spiking neurons with nanoscale resistive random-access memory (RRAM) devices used as electronic synapses can provide massive neural-network parallelism, high density, and online learning capability, and hence paves the path towards a promising solution for future energy-efficient real-time computing systems. However, existing silicon neuron approaches are either designed to faithfully reproduce biological neuron dynamics, and are hence incompatible with RRAM synapses, or require extensive peripheral circuitry to modulate a synapse, and are thus deficient in learning capability. As a result, they forfeit most of the density advantages gained by the adoption of nanoscale devices and fail to realize a functional computing system. This dissertation describes novel hardware architectures and neuron circuit designs that synergistically assemble the fundamental elements of brain-inspired computing. Versatile CMOS spiking neurons that combine integrate-and-fire dynamics, drive capability for dense passive RRAM synapses, dynamic biasing for adaptive power consumption, in situ spike-timing-dependent plasticity (STDP), and competitive learning in compact integrated circuit modules are presented. Real-world pattern learning and recognition tasks using the proposed architecture were demonstrated with circuit-level simulations. A test chip was implemented and fabricated to verify the proposed CMOS neuron and hardware architecture, and the subsequent chip measurement results successfully validated the idea. The work described in this dissertation realizes a key building block for large-scale integration of spiking neural network hardware and thus serves as a stepping stone towards next-generation energy-efficient brain-inspired cognitive computing systems.
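The in situ STDP on RRAM synapses described above can be caricatured behaviourally: a conductance that moves toward its upper bound on pre-before-post spike timing and toward its lower bound otherwise, with soft bounds keeping it in the device's range. A hypothetical sketch (device parameters invented for illustration, not measured from any chip):

```python
import math

def rram_stdp_update(g, dt_ms, g_min=1e-6, g_max=1e-4,
                     a_plus=0.05, a_minus=0.05, tau=20.0):
    """Behavioural model of one in-situ STDP event on an RRAM synapse:
    conductance g (siemens) moves toward g_max when the presynaptic spike
    precedes the postsynaptic one (dt_ms = t_post - t_pre >= 0), and toward
    g_min otherwise. Soft bounds model the device's conductance range."""
    if dt_ms >= 0:
        g += a_plus * math.exp(-dt_ms / tau) * (g_max - g)    # potentiate
    else:
        g -= a_minus * math.exp(dt_ms / tau) * (g - g_min)    # depress
    return min(max(g, g_min), g_max)

g_up = rram_stdp_update(5e-5, 2.0)     # pre-before-post: conductance rises
g_down = rram_stdp_update(5e-5, -2.0)  # post-before-pre: conductance falls
```

The multiplicative `(g_max - g)` / `(g - g_min)` factors give the soft-bound behaviour typical of resistive devices, where updates shrink as the conductance approaches either rail.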

    Spike-timing dependent plasticity and the cognitive map

    Since the discovery of place cells – single pyramidal neurons that encode spatial location – it has been hypothesized that the hippocampus may act as a cognitive map of known environments. This putative function has been extensively modeled using auto-associative networks, which utilize rate-coded synaptic plasticity rules in order to generate strong bi-directional connections between concurrently active place cells that encode neighboring place fields. However, empirical studies using hippocampal cultures have demonstrated that the magnitude and direction of changes in synaptic strength can also be dictated by the relative timing of pre- and post-synaptic firing according to a spike-timing dependent plasticity (STDP) rule. Furthermore, electrophysiology studies have identified persistent “theta-coded” temporal correlations in place cell activity in vivo, characterized by phase precession of firing as the corresponding place field is traversed. It is not yet clear if STDP and theta-coded neural dynamics are compatible with cognitive map theory and previous rate-coded models of spatial learning in the hippocampus. Here, we demonstrate that an STDP rule based on empirical data obtained from the hippocampus can mediate rate-coded Hebbian learning when pre- and post-synaptic activity is stochastic and has no persistent sequence bias. We subsequently demonstrate that a spiking recurrent neural network that utilizes this STDP rule, alongside theta-coded neural activity, allows the rapid development of a cognitive map during directed or random exploration of an environment of overlapping place fields. Hence, we establish that STDP and phase precession are compatible with rate-coded models of cognitive map development.
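The abstract's central claim, that pair-based STDP under stochastic, sequence-unbiased firing reduces to rate-coded Hebbian learning, can be illustrated numerically: with a potentiation-dominated window, the expected weight drift from independent pre/post spike trains scales with the product of their firing rates. A sketch with illustrative constants (not the fitted hippocampal values):

```python
import numpy as np

def stdp_window(dt, a_plus=0.012, a_minus=0.01, tau=20.0):
    """Pair-based STDP window; dt = t_post - t_pre in ms. Pre-before-post
    (dt >= 0) potentiates, post-before-pre depresses. Constants are
    illustrative, chosen so potentiation slightly dominates."""
    dt = np.asarray(dt, dtype=float)
    mag = np.exp(-np.abs(dt) / tau)
    return np.where(dt >= 0, a_plus * mag, -a_minus * mag)

def mean_drift(rate_pre, rate_post, T=50_000.0, seed=0):
    """Net weight change from independent, uniformly random pre/post spike
    trains (rates in Hz, duration T in ms): with no sequence bias, the
    drift scales with the product of the firing rates."""
    rng = np.random.default_rng(seed)
    pre = np.sort(rng.uniform(0, T, int(rate_pre * T / 1000)))
    post = np.sort(rng.uniform(0, T, int(rate_post * T / 1000)))
    return float(stdp_window(post[:, None] - pre[None, :]).sum())
```

Because the trains carry no timing structure, only the integral of the window survives on average, which is exactly the rate-coded Hebbian term the abstract relies on.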

    Functional Brain Oscillations: How Oscillations Facilitate Information Representation and Code Memories

    The overall aim of the modelling work within this thesis is to lend theoretical evidence to empirical findings from the brain oscillations literature. We therefore hope to solidify and expand the notion that precise spike timing through oscillatory mechanisms facilitates communication, learning, information processing and information representation within the brain. The primary hypothesis of this thesis is that it can be shown computationally that neural de-synchronisations can allow information content to emerge. We do this using two neural network models, the first of which shows how differential rates of neuronal firing can indicate when a single item is being actively represented. The second model expands this notion by creating a complementary timing mechanism, thus enabling the emergence of qualitative temporal information when a pattern of items is being actively represented. The secondary hypothesis of this thesis is that it can also be shown computationally that oscillations might play a functional role in learning. Both of the models presented within this thesis propose a sparsely coded and fast-learning hippocampal region that engages in the binding of novel episodic information. The first model demonstrates how active cortical representations enable learning to occur in their hippocampal counterparts via a phase-dependent learning rule. The second model expands this notion, creating hierarchical temporal sequences to encode the relative temporal position of cortical representations. In both of these models we demonstrate how cortical brain oscillations might provide a gating function for the representation of information, whilst complementary hippocampal oscillations might provide distinct phasic reference points for learning.

    Decorrelation of Odor Representations via Spike Timing-Dependent Plasticity

    The non-topographical representation of odor quality space differentiates early olfactory representations from those in other sensory systems. Decorrelation among olfactory representations with respect to physical odorant similarities has been proposed to rely upon local feed-forward inhibitory circuits in the glomerular layer that decorrelate odor representations with respect to the intrinsically high-dimensional space of ligand–receptor potency relationships. A second stage of decorrelation is likely to be mediated by the circuitry of the olfactory bulb external plexiform layer. Computations in this layer, or in the analogous interneuronal network of the insect antennal lobe, are dependent on fast network oscillations that regulate the timing of mitral cell and projection neuron (MC/PN) action potentials; this suggests a largely spike timing-dependent metric for representing odor information, here proposed to be a precedence code. We first illustrate how the rate-coding metric of the glomerular layer can be transformed into a spike precedence code in MC/PNs. We then show how this mechanism of representation, combined with spike timing-dependent plasticity at MC/PN output synapses, can progressively decorrelate high-dimensional, non-topographical odor representations in third-layer olfactory neurons. Reducing MC/PN oscillations abolishes the spike precedence code and blocks this progressive decorrelation, demonstrating the learning network's selectivity for these sparsely synchronized MC/PN spikes even in the presence of temporally disorganized background activity. Finally, we apply this model to odor representations derived from calcium imaging in the honeybee antennal lobe, showing how odor learning progressively decorrelates odor representations and how the abolition of PN oscillations impairs odor discrimination.
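The rate-to-precedence transformation described above can be sketched as a simple latency code: within each oscillation cycle, more strongly driven mitral cells/projection neurons spike earlier. An illustrative transform only (not the full biophysical model; `cycle_ms` is an assumed cycle length):

```python
import numpy as np

def precedence_code(glomerular_rates, cycle_ms=25.0):
    """Map rate-coded glomerular activation onto MC/PN spike latencies
    within one fast-oscillation cycle: the most strongly driven cell
    spikes first (latency 0), the weakest spikes latest."""
    r = np.asarray(glomerular_rates, dtype=float)
    return cycle_ms * (1.0 - r / r.max())   # strong input -> short latency

# Four glomeruli with different activation levels -> a spike order
# (a precedence code) rather than a rate pattern.
spike_times = precedence_code([0.9, 0.3, 0.6, 0.1])
```

A downstream STDP rule then sees only the within-cycle spike order, which is what allows plasticity to operate on this timing metric instead of on raw rates.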