
    Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans

    Speech is inherently multisensory, containing auditory information from the voice and visual information from the mouth movements of the talker. Hearing the voice is usually sufficient to understand speech; however, in noisy environments or when audition is impaired due to aging or disability, seeing mouth movements greatly improves speech perception. Although behavioral studies have firmly established this perceptual benefit, it is still not clear how the brain processes visual information from mouth movements to improve speech perception. To clarify this issue, I studied neural activity recorded from the brain surfaces of human subjects using intracranial electrodes, a technique known as electrocorticography (ECoG). First, I studied responses to noisy speech in the auditory cortex, specifically in the superior temporal gyrus (STG). Previous studies identified the anterior parts of the STG as unisensory, responding only to auditory stimuli. On the other hand, posterior parts of the STG are known to be multisensory, responding to both auditory and visual stimuli, which makes them a key region for audiovisual speech perception. I examined how these different parts of the STG respond to clear versus noisy speech. I found that noisy speech decreased the amplitude and increased the across-trial variability of the response in the anterior STG. However, possibly due to its multisensory composition, the posterior STG was not as sensitive to auditory noise as the anterior STG and responded similarly to clear and noisy speech. I also found that these two response patterns in the STG were separated by a sharp boundary demarcated by the posterior-most portion of Heschl's gyrus. Second, I studied responses to silent speech in the visual cortex. Previous studies demonstrated that the visual cortex shows response enhancement when the auditory component of speech is noisy or absent; however, it was not clear which regions of the visual cortex specifically show this enhancement, or whether it results from top-down modulation by a higher region. To test this, I first mapped the receptive fields of different regions in the visual cortex and then measured their responses to visual (silent) and audiovisual speech stimuli. I found that visual regions with central receptive fields show greater response enhancement to visual speech, possibly because these regions receive more visual information from mouth movements. I found similar response enhancement to visual speech in the frontal cortex, specifically in the inferior frontal gyrus and the premotor and dorsolateral prefrontal cortices, which have been implicated in speech reading in previous studies. I showed that these frontal regions display strong functional connectivity with visual regions that have central receptive fields during speech perception.
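
    The abstract does not include analysis code; as a minimal sketch of the kind of measure it describes (per-trial response amplitude and across-trial variability of an electrode's response), the NumPy example below uses simulated trial data. The trial counts, sampling rate, analysis window, and the use of the standard deviation of per-trial amplitudes as the variability measure are all assumptions, not the author's actual ECoG pipeline.

```python
import numpy as np

# Simulated single-electrode response amplitude: trials x time samples
# (assumed 100 trials, 1 s at 1000 Hz); real data would come from ECoG preprocessing.
rng = np.random.default_rng(0)
n_trials, n_samples = 100, 1000
evoked = np.exp(-((np.arange(n_samples) - 300) / 120.0) ** 2)   # idealized evoked response
trials = evoked + 0.5 * rng.standard_normal((n_trials, n_samples))

# Response amplitude: mean over an assumed 200-500 ms post-stimulus window, per trial.
window = slice(200, 500)
amplitudes = trials[:, window].mean(axis=1)

mean_amplitude = amplitudes.mean()                 # response amplitude
across_trial_variability = amplitudes.std(ddof=1)  # across-trial variability

print(f"mean amplitude: {mean_amplitude:.3f}")
print(f"across-trial variability (SD): {across_trial_variability:.3f}")
```

    Noisy speech, as described in the abstract, would correspond to a lower mean amplitude and a larger across-trial standard deviation in anterior STG electrodes.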

    Dynamic Construction of Stimulus Values in the Ventromedial Prefrontal Cortex

    Signals representing the value assigned to stimuli at the time of choice have been repeatedly observed in ventromedial prefrontal cortex (vmPFC). Yet it remains unknown how these value representations are computed from sensory and memory representations in more posterior brain regions. We used electroencephalography (EEG) while subjects evaluated appetitive and aversive food items to study how event-related responses modulated by stimulus value evolve over time. We found that value-related activity shifted from posterior to anterior, and from parietal to central to frontal sensors, across three major time windows after stimulus onset: 150–250 ms, 400–550 ms, and 700–800 ms. Exploratory localization of the EEG signal revealed a shifting network of activity moving from sensory and memory structures to areas associated with value coding, with stimulus-value activity localized to vmPFC only from 400 ms onwards. Consistent with these results, functional connectivity analyses also showed a causal flow of information from temporal cortex to vmPFC. Thus, although value signals are present as early as 150 ms after stimulus onset, the value signals in vmPFC appear relatively late in the choice process and seem to reflect the integration of incoming information from sensory and memory-related regions.
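
    As a hedged illustration of the time-window analysis the abstract describes (testing which sensors are modulated by stimulus value in the 150–250, 400–550, and 700–800 ms windows), the sketch below correlates per-trial window-averaged EEG amplitude with value ratings on toy data. The data shapes, sampling rate, and the simple per-sensor correlation are assumptions for illustration, not the study's actual analysis.

```python
import numpy as np

# Hypothetical single-subject data: trials x sensors x time samples (0-1000 ms at 1 kHz).
rng = np.random.default_rng(1)
n_trials, n_sensors, n_samples = 200, 64, 1000
eeg = rng.standard_normal((n_trials, n_sensors, n_samples))
value = rng.uniform(-2, 2, size=n_trials)          # per-trial stimulus value ratings

# Time windows reported in the abstract (in ms, assuming 1 sample = 1 ms).
windows = {"150-250": slice(150, 250),
           "400-550": slice(400, 550),
           "700-800": slice(700, 800)}

for name, win in windows.items():
    amp = eeg[:, :, win].mean(axis=2)               # mean amplitude: trials x sensors
    v = (value - value.mean()) / value.std()
    a = (amp - amp.mean(axis=0)) / amp.std(axis=0)
    r = (a * v[:, None]).mean(axis=0)               # Pearson r with value, per sensor
    best = int(np.abs(r).argmax())
    print(f"{name} ms: strongest value modulation at sensor {best}, r = {r[best]:.2f}")
```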

    Top-down effects on early visual processing in humans: a predictive coding framework

    An increasing number of human electroencephalography (EEG) studies examining the earliest component of the visual evoked potential, the so-called C1, have cast doubt on the previously prevalent notion that this component is impermeable to top-down effects. This article reviews the original studies that (i) described the C1, (ii) linked it to primary visual cortex (V1) activity, and (iii) suggested that its electrophysiological characteristics are exclusively determined by low-level stimulus attributes, particularly the spatial position of the stimulus within the visual field. We then describe conflicting evidence from animal studies and human neuroimaging experiments and provide an overview of recent EEG and magnetoencephalography (MEG) work showing that initial V1 activity in humans may be strongly modulated by higher-level cognitive factors. Finally, we formulate a theoretical framework for understanding top-down effects on early visual processing in terms of predictive coding.
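
    The framework the abstract invokes, predictive coding, is commonly summarized by a simple update loop in which early visual activity reflects the error between bottom-up input and a top-down prediction. The sketch below is a generic, textbook-style illustration of that idea, not the specific model proposed in the article; the layer sizes, learning rate, and iteration count are arbitrary assumptions.

```python
import numpy as np

# Minimal predictive-coding sketch (standard formulation, not the article's model):
# a higher level holds a prediction of V1 activity; V1 signals the prediction error,
# and the higher-level estimate is updated to reduce that error over iterations.
rng = np.random.default_rng(2)
sensory_input = rng.normal(1.0, 0.1, size=10)   # bottom-up drive to early visual units
prediction = np.zeros(10)                        # top-down prediction from a higher level
learning_rate = 0.2

for step in range(20):
    prediction_error = sensory_input - prediction   # signal carried by early visual responses
    prediction += learning_rate * prediction_error  # top-down estimate moves toward the input

print("remaining error:", float(np.abs(sensory_input - prediction).mean()))
```

    On this view, a C1 modulated by cognitive factors is expected whenever top-down predictions change the residual error that early visual cortex has to signal.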

    Human scalp potentials reflect a mixture of decision-related signals during perceptual choices

    Single-unit animal studies have consistently reported decision-related activity mirroring a process of temporal accumulation of sensory evidence to a fixed internal decision boundary. To date, our understanding of how the response patterns seen in single-unit data manifest themselves at the macroscopic level of brain activity obtained from human neuroimaging data remains limited. Here, we use single-trial analysis of human electroencephalography data to show that population responses on the scalp can capture choice-predictive activity that builds up gradually over time with a rate proportional to the amount of sensory evidence, consistent with the properties of a drift-diffusion-like process as characterized by computational modeling. Interestingly, at the time of choice, scalp potentials continue to appear parametrically modulated by the amount of sensory evidence rather than converging to a fixed decision boundary as predicted by our model. We show that trial-to-trial fluctuations in these response-locked signals exert leverage on behavior that is independent of the rate of evidence accumulation earlier in the trial. These results suggest that, in addition to accumulator signals, population responses on the scalp reflect the influence of other decision-related signals that continue to covary with the amount of evidence at the time of choice.
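
    The drift-diffusion-like process referred to in the abstract can be made concrete with a short simulation: evidence is accumulated with a drift rate proportional to its strength, plus noise, until a fixed boundary is reached. This is a generic textbook simulation, not the authors' fitted model; the boundary height, noise level, and time step are assumed values.

```python
import numpy as np

def simulate_ddm(evidence, boundary=1.0, noise_sd=0.1, dt=0.001, max_t=2.0, seed=0):
    """Simulate one drift-diffusion trial: drift rate proportional to sensory evidence,
    accumulation to a fixed boundary (all parameter values are illustrative assumptions)."""
    rng = np.random.default_rng(seed)
    drift = evidence                      # drift rate proportional to evidence strength
    x, t = 0.0, 0.0
    while abs(x) < boundary and t < max_t:
        x += drift * dt + noise_sd * np.sqrt(dt) * rng.standard_normal()
        t += dt
    choice = 1 if x >= boundary else 0    # which boundary was hit (or sign at timeout)
    return choice, t

# Stronger evidence should yield faster, more accurate decisions.
for evidence in (0.2, 0.8, 1.6):
    choice, rt = simulate_ddm(evidence, seed=42)
    print(f"evidence {evidence:.1f}: choice={choice}, RT={rt:.3f} s")
```

    The abstract's key observation is that, unlike the accumulator in this simulation, response-locked scalp potentials do not collapse onto a single fixed boundary value but remain graded with evidence strength at the time of choice.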

    Temporal precedence of emotion over attention modulations in the lateral amygdala: Intracranial ERP evidence from a patient with temporal lobe epilepsy

    Previous fMRI studies have reported mixed evidence for the influence of selective attention on amygdala responses to emotional stimuli, with some studies showing "automatic" emotional effects to threat-related stimuli without attention (or even without awareness), but other studies showing a gating of amygdala activity by selective attention with no response to unattended stimuli. We recorded intracranial local field potentials from the intact left lateral amygdala in a human patient prior to surgery for epilepsy and tested, with millisecond time resolution, for neural responses to fearful faces appearing at either task-relevant or task-irrelevant locations. Our results revealed an early emotional effect in the amygdala arising prior to, and independently of, attentional modulation. However, at a later latency, we found a significant modulation of the differential emotional response when attention was directed toward or away from fearful faces. These results suggest separate influences of emotion and attention on amygdala activation and may help reconcile previous discrepancies concerning the relative responsiveness of the human amygdala to emotional and attentional factors.
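
    The temporal-precedence claim rests on comparing condition effects in early versus late post-stimulus windows. The sketch below shows one simple way to frame such a comparison on toy local-field-potential epochs, using independent-samples t-tests on window-averaged amplitudes; the window boundaries, trial counts, and test choice are assumptions for illustration, not the study's analysis.

```python
import numpy as np
from scipy import stats

# Toy local-field-potential epochs from one amygdala contact: trials x time (0-600 ms at 1 kHz).
rng = np.random.default_rng(3)
n_trials, n_samples = 60, 600
fearful = rng.standard_normal((n_trials, n_samples)) + np.linspace(0, 0.8, n_samples)
neutral = rng.standard_normal((n_trials, n_samples))

# Assumed early and late analysis windows (ms); the study's actual latencies may differ.
windows = {"early (50-150 ms)": slice(50, 150), "late (300-500 ms)": slice(300, 500)}
for name, win in windows.items():
    t, p = stats.ttest_ind(fearful[:, win].mean(axis=1), neutral[:, win].mean(axis=1))
    print(f"{name}: emotion effect t = {t:.2f}, p = {p:.3g}")
```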

    Temporal characteristics of the influence of punishment on perceptual decision making in the human brain

    Perceptual decision making is the process by which information from sensory systems is combined and used to influence our behavior. In addition to the sensory input, this process can be affected by other factors, such as reward and punishment for correct and incorrect responses. To investigate the temporal dynamics of how monetary punishment influences perceptual decision making in humans, we collected electroencephalography (EEG) data during a perceptual categorization task in which the punishment level for incorrect responses was parametrically manipulated across blocks of trials. Behaviorally, we observed improved accuracy for high relative to low punishment levels. Using multivariate linear discriminant analysis of the EEG, we identified multiple punishment-induced discriminating components with spatially distinct scalp topographies. Compared with components related to sensory evidence, components discriminating punishment levels appeared later in the trial, suggesting that punishment primarily affects late, postsensory, decision-related processing. Crucially, the amplitude of these punishment components across participants was predictive of the size of the behavioral improvements induced by punishment. Finally, trial-by-trial changes in prestimulus oscillatory activity in the alpha and gamma bands were good predictors of the amplitude of these components. We discuss these findings in the context of increased motivation and attention resulting from increased punishment, which in turn yields improved decision-related processing.
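
    The multivariate linear discriminant approach named in the abstract can be illustrated with scikit-learn: a linear discriminant is trained to separate high- from low-punishment trials using per-sensor EEG features from a post-stimulus window, and its weight vector gives the scalp topography of the discriminating component. The toy data, window choice, and cross-validation setup below are assumptions for illustration, not the authors' single-trial pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Toy single-trial EEG features: mean amplitude per sensor in one post-stimulus window.
rng = np.random.default_rng(4)
n_trials, n_sensors = 300, 64
labels = rng.integers(0, 2, size=n_trials)            # 0 = low punishment, 1 = high punishment
features = rng.standard_normal((n_trials, n_sensors))
features[labels == 1, :8] += 0.5                       # inject a weak class difference

# Linear discriminant separating punishment levels; accuracy estimated by cross-validation.
lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, features, labels, cv=5)
print(f"cross-validated discrimination accuracy: {scores.mean():.2f}")

# The spatial weights (scalp topography of the discriminating component) come from the fit.
lda.fit(features, labels)
print("top-weighted sensors:", np.argsort(np.abs(lda.coef_[0]))[::-1][:5])
```

    In the study, applying such a discriminant at successive time windows is what supports the claim that punishment-related components emerge later than sensory-evidence components.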