6 research outputs found

    Omission responses in local field potentials in rat auditory cortex

    Background: Non-invasive recordings of gross neural activity in humans often show responses to omitted stimuli in steady trains of identical stimuli. This has been taken as evidence for the neural coding of prediction or prediction error. However, evidence for such omission responses from invasive recordings of cellular-scale responses in animal models is scarce. Here, we sought to characterise omission responses using extracellular recordings in the auditory cortex of anaesthetised rats. We profiled omission responses across local field potentials (LFP), analogue multiunit activity (AMUA), and single/multi-unit spiking activity, using stimuli that were fixed-rate trains of acoustic noise bursts where 5% of bursts were randomly omitted.
    Results: Significant omission responses were observed in LFP and AMUA signals, but not in spiking activity. These omission responses had a lower amplitude and longer latency than burst-evoked sensory responses, and omission response amplitude increased as a function of the number of preceding bursts.
    Conclusions: Together, our findings show that omission responses are most robustly observed in LFP and AMUA signals (relative to spiking activity). This has implications for models of cortical processing that require many neurons to encode prediction errors in their spike output.
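The omission paradigm described in the abstract can be sketched in a few lines of code. The following Python sketch is purely illustrative: only the 5% omission probability comes from the study; the sample rate, burst rate, burst duration, and number of events are assumed for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters -- only the 5% omission probability comes
# from the abstract; everything else here is assumed for demonstration.
fs = 44100          # sample rate (Hz)
rate_hz = 2.0       # burst presentation rate (bursts per second)
burst_dur = 0.025   # burst duration (s)
n_events = 200      # number of burst slots in the train
p_omit = 0.05       # probability that any given burst is omitted

period = int(fs / rate_hz)          # samples between burst onsets
burst_len = int(fs * burst_dur)     # samples per noise burst
omitted = rng.random(n_events) < p_omit

signal = np.zeros(n_events * period)
for i in range(n_events):
    if not omitted[i]:
        start = i * period
        # Fill the slot with white noise; omitted slots stay silent.
        signal[start:start + burst_len] = rng.uniform(-1, 1, burst_len)
```

Because burst onsets stay on a fixed grid, an omission is a silent slot at a fully predictable time, which is what makes any neural response to it interpretable as prediction-related rather than stimulus-evoked.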

    Temporal regularity in audition

    No full text
    Sound, by its very nature, is a temporal phenomenon. Everything from the perception of pitch to the sense of closure at the end of a symphony relies on the brain's ability to integrate information over time. Ultimately, it is perception that enables the richness of our interactions with the world around us, and it is a remarkable feat that the brain can quickly and accurately sift through the flood of information entering the ears to construct a coherent yet dynamic internal representation of the external world. Underlying this feat, in part, is the brain's ability to rapidly detect temporal patterns over timescales that span orders of magnitude, from sub-milliseconds to tens of seconds. Of particular interest is the ability to detect rhythms, or sound patterns in the range of hundreds of milliseconds to seconds. This timescale is particularly fascinating because it is critical to the perception of rhythm and beat in music, an ability that comes surprisingly naturally to us despite its neural underpinnings being far from understood. It is not just musical beat perception that remains mysterious; even the brain's mechanisms for detecting simple temporal patterns at this timescale are still not known. The work in this thesis therefore explores two broad questions that are key to understanding temporal processing at the rhythm timescale: (1) what are the perceptual consequences of rhythmic temporal regularity in sound? and (2) how does the brain detect temporal regularities at the rhythm timescale?
    I combine human psychoacoustics, rodent electrophysiology, and computational modelling to demonstrate that rhythmic sound patterns are easier to detect than arrhythmic ones, that adaptation in the auditory system may be a mechanism by which information about temporal structure at the rhythm timescale is encoded and made available to higher structures, and that low-level auditory processing may play a substantial part in shaping complex percepts such as where and how clearly we feel the beat in music.

    Data from: Midbrain adaptation may set the stage for the perception of musical beat

    No full text
    The ability to spontaneously feel a beat in music is a phenomenon widely believed to be unique to humans. Though beat perception involves the coordinated engagement of sensory, motor, and cognitive processes in humans, the contribution of low-level auditory processing to the activation of these networks in a beat-specific manner is poorly understood. Here, we present evidence from a rodent model that midbrain pre-processing of sounds may already be shaping where the beat is ultimately felt. For the tested set of musical rhythms, on-beat sounds on average evoked higher firing rates than off-beat sounds, and this difference distinguished the beat interpretations most commonly perceived by human listeners from the alternatives. Basic firing rate adaptation provided a sufficient explanation for these results. Our findings suggest that midbrain adaptation, by encoding the temporal context of sounds, creates points of neural emphasis that may influence the perceptual emergence of a beat.
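The kind of firing rate adaptation invoked above can be illustrated with a minimal sketch: each sound's response is attenuated by an adaptation state that builds up with every sound and decays exponentially during silent gaps, so sounds preceded by longer gaps (e.g. candidate on-beat positions) evoke larger responses. The function name and parameter values (`tau`, `gain`) below are illustrative assumptions, not the model fitted in the study.

```python
import numpy as np

def adapted_responses(event_times, tau=0.2, gain=0.5):
    """Response to each sound in a sequence, attenuated by an adaptation
    state that increments with every sound and decays exponentially
    (time constant tau, seconds) during silent gaps. Illustrative only."""
    a = 0.0
    last_t = None
    out = []
    for t in event_times:
        if last_t is not None:
            a *= np.exp(-(t - last_t) / tau)   # recovery during the gap
        out.append(1.0 - a)                    # attenuated response
        a += gain * (1.0 - a)                  # adaptation increment
        last_t = t
    return np.array(out)

# A sound preceded by a long gap (a candidate on-beat position)
# recovers from adaptation and evokes a larger response:
r = adapted_responses([0.0, 0.1, 0.2, 0.8, 0.9])
```

In this toy sequence, the sound at 0.8 s follows a 0.6 s gap and therefore evokes a larger response than the closely spaced sounds before it, illustrating how temporal context alone can create points of neural emphasis.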

    Stimuli + gerbil electrophysiology and human tapping data

    No full text
    This .zip file contains spike PSTH and LFP data from the gerbil IC and human tapping data in Matlab format. Audio files of the stimuli are also included. See the README for full details.