
    Goal-directed attention alters the tuning of object-based representations in extrastriate cortex

    Humans survive in environments that contain a vast quantity and variety of visual information, all of which must be represented within a limited number of brain networks. The brain therefore requires mechanisms for selecting only a relevant fraction of perceived information for in-depth processing, where neural representations of that information may be actively maintained and used for goal-directed behavior. Object-based attention is crucial for goal-directed behavior, yet remains poorly understood. In this study, we therefore investigated how neural representations of visual object information are guided by selective attention. The magnitude of activation in human extrastriate cortex has been shown to be modulated by attention; however, object-based attention is unlikely to be fully explained by a localized gain mechanism. We therefore measured information coded in spatially distributed patterns of brain activity with fMRI while human participants performed a task requiring selective processing of a relevant visual object category that differed across conditions. Using pattern classification and spatial correlation techniques, we found that the direction of selective attention is implemented as a shift in the tuning of object-based representations within extrastriate cortex. In contrast, representations within lateral prefrontal cortex (PFC) coded for the attention condition rather than for the concrete object category. In sum, our findings are consistent with a model of object-based selective attention in which representations coded within extrastriate cortex are tuned to favor goal-relevant information, guided by more abstract representations within lateral PFC.
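The correlation-based pattern classification named in this abstract can be sketched in a few lines. Everything below is synthetic: the voxel count, noise level, and the two category labels are placeholders for illustration, not the study's stimuli or fMRI data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "voxel patterns" for two object categories; purely synthetic.
n_voxels = 50
templates = {"faces": rng.normal(size=n_voxels),
             "houses": rng.normal(size=n_voxels)}

def noisy(pattern, noise=0.5):
    """A measured pattern: the underlying category pattern plus scan noise."""
    return pattern + rng.normal(scale=noise, size=pattern.shape)

# Independent halves of the data: one to define reference patterns, one to test.
train_patterns = {label: noisy(p) for label, p in templates.items()}
test_patterns = {label: noisy(p) for label, p in templates.items()}

def classify(pattern, references):
    """Label a test pattern by its highest Pearson correlation with the
    training patterns (correlation-based pattern classification)."""
    r = {label: np.corrcoef(pattern, ref)[0, 1]
         for label, ref in references.items()}
    return max(r, key=r.get)

predicted = {label: classify(p, train_patterns)
             for label, p in test_patterns.items()}
accuracy = np.mean([pred == true for true, pred in predicted.items()])
```

A tuning shift of the kind reported would show up in this framework as an attention-dependent change in which reference pattern a given response correlates with most strongly.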

    Neural integration in body perception

    The perception of other people is instrumental in guiding social interactions. For example, the appearance of the human body cues a wide range of inferences regarding sex, age, health, and personality, as well as emotional state and intentions, which influence social behavior. To date, most neuroscience research on body perception has aimed to characterize the functional contribution of segregated patches of cortex in the ventral visual stream. In light of the growing prominence of network architectures in neuroscience, the current article reviews neuroimaging studies that measure functional integration between different brain regions during body perception. The review demonstrates that body perception is not restricted to processing in the ventral visual stream but instead reflects a functional alliance between the ventral visual stream and extended neural systems associated with action perception, executive functions, and theory of mind. Overall, these findings demonstrate how body percepts are constructed through interactions in distributed brain networks and underscore that functional segregation and integration should be considered together when formulating neurocognitive theories of body perception. Insight from such an updated model of body perception generalizes to inform the organizational structure of social perception and cognition more generally and also informs disorders of body image, such as anorexia nervosa, which may rely on atypical integration of body-related information.

    Cathodal transcranial direct current stimulation of the extrastriate visual cortex modulates implicit anti-fat bias in male, but not female, participants.

    Explicit negative attitudes towards obese individuals are well documented and seem to modulate the activity of perceptual areas, such as the Extrastriate Body Area (EBA) in the lateral occipito-temporal cortex, which is critical for body-shape perception. Nevertheless, it is still unclear whether EBA serves a role in implicit weight-stereotypical bias, thus reflecting stereotypical trait attribution on the basis of perceptual cues. Here, we used an Implicit Association Test (IAT) to investigate whether applying transcranial direct current stimulation (tDCS) over the bilateral extrastriate visual cortex reduces pre-existing implicit weight-stereotypical associations (i.e. "Bad" with Fat and "Good" with Slim; valence-IAT). Furthermore, an aesthetic-IAT, which focused on body concepts related to aesthetic dimensions (i.e. "Ugly" and "Beauty"), was developed as a control condition. Anodal, cathodal, or sham tDCS (2 mA, 10 min) over the right and left lateral occipito-temporal (extrastriate visual) cortex was administered to 13 female and 12 male participants before they performed the IATs. Results showed that cathodal stimulation over the left extrastriate visual cortex reduced the weight bias for the evaluative dimensions (Bad vs. Good) compared with sham stimulation over the same hemisphere. Furthermore, the effect was specific to the polarity and hemisphere of stimulation. Importantly, tDCS affected the responses only in male participants, who showed a reliable weight bias in the sham condition, but not in female participants, who did not. The present results suggest that negative attitudes towards obese individuals may reflect neural signals from the extrastriate visual cortex.
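For readers unfamiliar with how an IAT effect is quantified: latencies from the stereotype-congruent and incongruent response pairings are compared and scaled by their pooled standard deviation (Greenwald's D score). The reaction times below are fabricated for the example, not data from this experiment.

```python
import numpy as np

# Fabricated reaction times (ms) for the two critical IAT blocks.
congruent = np.array([612, 655, 590, 701, 640.0])     # e.g. Fat+Bad / Slim+Good
incongruent = np.array([750, 802, 695, 820, 760.0])   # e.g. Fat+Good / Slim+Bad

# D score: latency difference between pairings, scaled by the pooled SD.
pooled_sd = np.std(np.concatenate([congruent, incongruent]), ddof=1)
d_score = (incongruent.mean() - congruent.mean()) / pooled_sd
# A positive D means faster responses in the stereotype-congruent pairing,
# i.e. an implicit weight bias on these toy numbers.
```

A tDCS-induced reduction of the bias, as reported here for cathodal stimulation in male participants, would correspond to a D score closer to zero.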

    Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans

    Speech is inherently multisensory, containing auditory information from the voice and visual information from the mouth movements of the talker. Hearing the voice is usually sufficient to understand speech; however, in noisy environments or when audition is impaired due to aging or disability, seeing mouth movements greatly improves speech perception. Although behavioral studies have well established this perceptual benefit, it is still not clear how the brain processes visual information from mouth movements to improve speech perception. To clarify this issue, I studied neural activity recorded from the brain surface of human subjects using intracranial electrodes, a technique known as electrocorticography (ECoG). First, I studied responses to noisy speech in the auditory cortex, specifically in the superior temporal gyrus (STG). Previous studies identified the anterior parts of the STG as unisensory, responding only to auditory stimuli. On the other hand, the posterior parts of the STG are known to be multisensory, responding to both auditory and visual stimuli, which makes them a key region for audiovisual speech perception. I examined how these different parts of the STG respond to clear versus noisy speech. I found that noisy speech decreased the amplitude and increased the across-trial variability of the response in the anterior STG. However, possibly due to its multisensory composition, the posterior STG was not as sensitive to auditory noise as the anterior STG and responded similarly to clear and noisy speech. I also found that these two response patterns in the STG were separated by a sharp boundary demarcated by the posterior-most portion of Heschl's gyrus. Second, I studied responses to silent speech in the visual cortex. Previous studies demonstrated that the visual cortex shows response enhancement when the auditory component of speech is noisy or absent; however, it was not clear which regions of the visual cortex specifically show this enhancement and whether it results from top-down modulation by a higher region. To test this, I first mapped the receptive fields of different regions in the visual cortex and then measured their responses to visual (silent) and audiovisual speech stimuli. I found that visual regions with central receptive fields show greater response enhancement to visual speech, possibly because these regions receive more visual information from mouth movements. I found similar response enhancement to visual speech in frontal cortex, specifically in the inferior frontal gyrus and the premotor and dorsolateral prefrontal cortices, which have been implicated in speechreading in previous studies. I showed that these frontal regions display strong functional connectivity during speech perception with visual regions that have central receptive fields.
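The two single-electrode measures contrasted in the first experiment, mean response amplitude and across-trial variability, can be illustrated on synthetic single-trial values; none of the numbers below are recorded ECoG data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic single-trial response amplitudes for one "anterior STG" electrode.
clear_trials = 1.0 + 0.2 * rng.normal(size=200)   # clear speech: strong, consistent
noisy_trials = 0.6 + 0.5 * rng.normal(size=200)   # noisy speech: weaker, more variable

# Mean amplitude and across-trial variability, the two measures compared above.
clear_mean, clear_sd = clear_trials.mean(), clear_trials.std(ddof=1)
noisy_mean, noisy_sd = noisy_trials.mean(), noisy_trials.std(ddof=1)

# The reported anterior-STG pattern: auditory noise lowers the mean response
# and raises its across-trial variability.
```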

    Temporal precedence of emotion over attention modulations in the lateral amygdala: Intracranial ERP evidence from a patient with temporal lobe epilepsy

    Previous fMRI studies have reported mixed evidence for the influence of selective attention on amygdala responses to emotional stimuli, with some studies showing "automatic" emotional effects for threat-related stimuli without attention (or even without awareness), but others showing a gating of amygdala activity by selective attention, with no response to unattended stimuli. We recorded intracranial local field potentials from the intact left lateral amygdala of a human patient prior to surgery for epilepsy and tested, with millisecond time resolution, for neural responses to fearful faces appearing at either task-relevant or task-irrelevant locations. Our results revealed an early emotional effect in the amygdala arising prior to, and independently of, attentional modulation. However, at a later latency, we found a significant modulation of the differential emotional response when attention was directed toward or away from fearful faces. These results suggest separate influences of emotion and attention on amygdala activation and may help reconcile previous discrepancies concerning the relative responsiveness of the human amygdala to emotional and attentional factors.

    A unified coding strategy for processing faces and voices

    Both faces and voices are rich in socially relevant information, which humans are remarkably adept at extracting, including a person's identity, age, gender, affective state, and personality. Here, we review accumulating evidence from behavioral, neuropsychological, electrophysiological, and neuroimaging studies suggesting that the cognitive and neural mechanisms engaged by perceiving faces or voices are highly similar, despite the very different nature of their sensory inputs. The similarity between the two mechanisms likely facilitates the multimodal integration of facial and vocal information during everyday social interactions. These findings emphasize a parsimonious principle of cerebral organization, where similar computational problems in different modalities are solved with similar solutions.

    Bilateral engagement of the occipito-temporal cortex in response to dance kinematics in experts

    Previous evidence has shown neuroplastic changes in brain anatomy and connectivity associated with the acquisition of professional visuomotor skills. Reduced hemispheric asymmetry has been found in the sensorimotor and visual areas of expert musicians and athletes compared with non-experts. Moreover, increased expertise with faces, bodies, and objects results in an enhanced engagement of the occipito-temporal cortex (OTC) during stimulus observation. The present study investigated whether intense and extended dance practice results in a more symmetric response of the OTC at an early stage of action processing. Expert ballet dancers and non-dancer controls were presented with videos depicting ballet steps during EEG recording. Observation of the moving dancer elicited a posterior N2 component that was larger over the left hemisphere in dancers than in controls. Source reconstruction (swLORETA) of this negativity showed engagement of the bilateral inferior and middle temporal regions in experts, whereas right-lateralized activity was found in controls. The dancers also showed an earlier P2 and an enhanced P300 response, indicating faster stimulus processing and subsequent recognition. This evidence suggests an expertise-related increase in the sensitivity of the OTC to body kinematics. We therefore speculate that long-term whole-body practice results in enriched and refined action processing.

    Auditory Selective Attention to Speech Modulates Activity in the Visual Word Form Area

    Selective attention to speech versus nonspeech signals in complex auditory input could produce top-down modulation of cortical regions previously linked to perception of spoken, and even visual, words. To isolate such top-down attentional effects, we contrasted 2 equally challenging active listening tasks, performed on the same complex auditory stimuli (words overlaid with a series of 3 tones). Instructions required selectively attending to either the speech signals (in service of rhyme judgment) or the melodic signals (tone-triplet matching). Selective attention to speech, relative to attention to melody, was associated with blood oxygenation level-dependent (BOLD) increases during functional magnetic resonance imaging (fMRI) in left inferior frontal gyrus, temporal regions, and the visual word form area (VWFA). Further investigation of the activity in visual regions revealed overall deactivation relative to baseline rest for both attention conditions. Topographic analysis demonstrated that while attending to melody drove deactivation equivalently across all fusiform regions of interest examined, attending to speech produced a regionally specific modulation: deactivation of all fusiform regions except the VWFA. Results indicate that selective attention to speech can topographically tune extrastriate cortex, leading to increased activity in the VWFA relative to surrounding regions, in line with the well-established connectivity between areas related to spoken and visual word perception in skilled readers.

    Object Repetition Leads to Local Increases in the Temporal Coordination of Neural Responses

    Experience with visual objects leads to later improvements in identification speed and accuracy ("repetition priming") but generally leads to reductions in neural activity in single-cell recording studies in animals and fMRI studies in humans. Here we use event-related, source-localized MEG (ER-SAM) to evaluate the possibility that priming-related changes in neural activity in occipital, temporal, and prefrontal cortex correspond to more temporally coordinated and synchronized activity, reflected in local increases in the amplitude of low-frequency activity fluctuations (i.e. evoked power) that are time-locked to stimulus onset. Subjects (N = 17) identified pictures of objects that were either novel or repeated during the session. Tests in two separate low-frequency bands (theta/alpha: 5–15 Hz; beta: 15–35 Hz) revealed increases in evoked power (5–15 Hz) for repeated stimuli in the right fusiform gyrus, with the earliest significant increases observed 100–200 ms after stimulus onset. Increases with stimulus repetition were also observed in striate/extrastriate cortex (15–35 Hz) by 200–300 ms post-stimulus, along with a trend for a similar pattern in right lateral prefrontal cortex (5–15 Hz). Our results suggest that experience-dependent reductions in neural activity may support improved behavioral identification through more coordinated, synchronized activity at low frequencies, constituting a mechanism for more efficient neural processing with experience.
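Evoked power, as used above, is computed by averaging trials before measuring band-limited power, so that only activity with a consistent phase across trials (i.e. time-locked to stimulus onset) contributes. A minimal sketch on synthetic data follows; the sampling rate, amplitudes, and trial counts are assumptions for illustration, not the study's recording parameters.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

rng = np.random.default_rng(2)
fs = 600.0                      # sampling rate (Hz), arbitrary for the demo
t = np.arange(0, 0.5, 1 / fs)   # 0-500 ms post-stimulus

def make_trials(n_trials, locked_amp):
    """A 10 Hz response phase-locked to stimulus onset, plus trial noise."""
    signal = locked_amp * np.sin(2 * np.pi * 10 * t)
    return signal + rng.normal(size=(n_trials, t.size))

novel = make_trials(60, locked_amp=0.5)
repeated = make_trials(60, locked_amp=1.0)   # more coordinated, time-locked activity

def evoked_power(trials, band=(5, 15)):
    """Average across trials first, then measure power in the band: only
    phase-consistent (evoked) activity survives the averaging."""
    erf = trials.mean(axis=0)
    sos = butter(4, band, btype="band", fs=fs, output="sos")
    return np.mean(sosfiltfilt(sos, erf) ** 2)

novel_power = evoked_power(novel)
repeated_power = evoked_power(repeated)
```

The repetition effect reported in the abstract corresponds to `repeated_power` exceeding `novel_power` in the relevant band and latency window.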

    High baseline activity in inferior temporal cortex improves neural and behavioral discriminability during visual categorization

    Spontaneous firing is a ubiquitous property of neural activity in the brain. Recent literature suggests that this baseline activity plays a key role in perception; however, it is not known how baseline activity contributes to neural coding and behavior. Here, by recording from single neurons in the inferior temporal cortex of monkeys performing a visual categorization task, we thoroughly explored the relationship between baseline activity, the evoked response, and behavior. Specifically, we found that a low-frequency (<8 Hz) oscillation in the spike train, occurring prior to and phase-locked to stimulus onset, was correlated with increased gamma power and neuronal baseline activity. This enhancement of baseline activity was followed by an increase in neural selectivity and response reliability and, eventually, higher behavioral performance.
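Phase-locking of a pre-stimulus low-frequency oscillation, as described above, is commonly quantified as inter-trial coherence (ITC): the length of the mean unit phase vector across trials. A self-contained sketch on synthetic traces follows; the frequency band, trial count, and noise level are assumptions for illustration, not recording parameters from this study.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

rng = np.random.default_rng(3)
fs = 1000.0
t = np.arange(-0.5, 0.0, 1 / fs)   # pre-stimulus window (s)

def make_trials(n_trials, locked):
    """A 6 Hz oscillation whose phase is fixed relative to upcoming stimulus
    onset (locked) or random from trial to trial (unlocked), plus noise."""
    phases = np.zeros(n_trials) if locked else rng.uniform(0, 2 * np.pi, n_trials)
    return (np.sin(2 * np.pi * 6 * t + phases[:, None])
            + rng.normal(scale=0.5, size=(n_trials, t.size)))

def itc(trials, band=(4, 8)):
    """Inter-trial coherence: 1 = identical phase on every trial, ~0 = random."""
    sos = butter(3, band, btype="band", fs=fs, output="sos")
    phase = np.angle(hilbert(sosfiltfilt(sos, trials, axis=1), axis=1))
    return np.abs(np.exp(1j * phase).mean(axis=0)).mean()

locked_itc = itc(make_trials(40, locked=True))
unlocked_itc = itc(make_trials(40, locked=False))
```

A pre-stimulus oscillation of the kind reported would show high ITC relative to stimulus onset, as `locked_itc` does here.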