53 research outputs found

    Effects of Multimodal Load on Spatial Monitoring as Revealed by ERPs

    While the role of selective attention in filtering out irrelevant information has been studied extensively, its characteristics and neural underpinnings when multiple environmental stimuli must be processed in parallel are far less well understood. Building upon a dual-task paradigm that induced spatial awareness deficits for the contralesional hemispace in right hemisphere-damaged patients, we investigated the electrophysiological correlates of multimodal load during spatial monitoring in healthy participants. The position of briefly presented, lateralized targets had to be reported either in isolation (single task) or together with a concurrent visual or auditory task that recruited additional attentional resources (dual task). This top-down manipulation of attentional load, without any change in sensory stimulation, modulated the amplitude of the first positive ERP response (P1) and shifted its neural generators, with a suppression of the signal in early visual areas during both visual and auditory dual tasks. Furthermore, later contralateral N2 components elicited by left targets were particularly influenced by the concurrent visual task and were related to increased activation of the supramarginal gyrus. These results suggest that the right hemisphere is particularly affected by load manipulations, and they confirm its crucial role in subserving automatic orienting of spatial attention and in monitoring both hemispaces.
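
    The load effect described above rests on a standard ERP measurement: single trials are averaged into a per-condition ERP and the P1 is quantified in an early post-stimulus window. The sketch below illustrates such a single- versus dual-task comparison on made-up data; the sampling rate, time windows, group size, and array names are assumptions for illustration, not the authors' pipeline.

```python
# Illustrative sketch: comparing P1 mean amplitude between single- and dual-task
# load conditions. Hypothetical data and parameters; not the authors' analysis.
import numpy as np
from scipy import stats

FS = 500                        # sampling rate in Hz (assumed)
BASELINE = (-0.2, 0.0)          # baseline window in s (assumed)
P1_WINDOW = (0.08, 0.13)        # assumed P1 latency range in s
times = np.arange(-0.2, 0.6, 1 / FS)

def p1_mean_amplitude(epochs):
    """Mean P1 amplitude from a (n_trials, n_samples) array of epoched EEG."""
    erp = epochs.mean(axis=0)                           # trial average -> ERP
    base = (times >= BASELINE[0]) & (times < BASELINE[1])
    win = (times >= P1_WINDOW[0]) & (times <= P1_WINDOW[1])
    erp = erp - erp[base].mean()                        # baseline correction
    return erp[win].mean()

# One value per subject and load condition (fake data for illustration).
rng = np.random.default_rng(0)
single_task = np.array([p1_mean_amplitude(rng.normal(0, 5, (120, times.size)))
                        for _ in range(20)])
dual_task = np.array([p1_mean_amplitude(rng.normal(0, 5, (120, times.size)))
                      for _ in range(20)])

t, p = stats.ttest_rel(single_task, dual_task)          # paired load comparison
print(f"P1 single vs dual task: t = {t:.2f}, p = {p:.3f}")
```
    A reliable reduction of the dual-task values relative to the single-task values would mirror the load-related P1 suppression reported above.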

    The Second-Agent Effect: Communicative Gestures Increase the Likelihood of Perceiving a Second Agent

    Background: Beyond providing cues about an agent’s intention, communicative actions convey information about the presence of a second agent towards whom the action is directed (second-agent information). In two psychophysical studies we investigated whether the perceptual system makes use of this information to infer the presence of a second agent when dealing with impoverished and/or noisy sensory input. Methodology/Principal Findings: Participants observed point-light displays of two agents (A and B) performing separate actions. In the Communicative condition, agent B’s action was performed in response to a communicative gesture by agent A. In the Individual condition, agent A’s communicative action was replaced with a non-communicative action. Participants performed a simultaneous masking yes-no task, in which they were asked to detect the presence of agent B. In Experiment 1, we investigated whether criterion c was lowered in the Communicative condition compared to the Individual condition, thus reflecting a variation in perceptual expectations. In Experiment 2, we manipulated the congruence between A’s communicative gesture and B’s response, to ascertain whether the lowering of c in the Communicative condition reflected a truly perceptual effect. Results demonstrate that information extracted from communicative gestures influences the concurrent processing of biological motion by prompting perception of a second agent (second-agent effect). Conclusions/Significance: We propose that this finding is best explained within a Bayesian framework.
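
    For readers unfamiliar with the signal-detection quantities mentioned above, the criterion c (response bias) and sensitivity d' are conventionally derived from hit and false-alarm rates via the inverse-normal transform. The sketch below uses hypothetical trial counts and is purely illustrative; it is not the authors' analysis code.

```python
# Minimal sketch of the signal-detection quantities referred to in the abstract:
# sensitivity d' and criterion c from hit and false-alarm rates in a yes-no task.
from scipy.stats import norm

def dprime_and_c(hits, misses, false_alarms, correct_rejections):
    """Sensitivity d' and criterion c, with a 0.5-count correction for extreme rates."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa
    c = -0.5 * (z_hit + z_fa)     # more negative c = more liberal "yes" criterion
    return d_prime, c

# Hypothetical trial counts for the two conditions (not the study's data).
print("Communicative:", dprime_and_c(70, 30, 20, 80))
print("Individual:   ", dprime_and_c(65, 35, 15, 85))
```
    A lower (more negative) c in the Communicative than in the Individual condition, with d' unchanged, would indicate a more liberal tendency to report agent B rather than a change in sensitivity, which is the signature the study tested for.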

    Changes in Early Cortical Visual Processing Predict Enhanced Reactivity in Deaf Individuals

    Individuals with profound deafness rely critically on vision to interact with their environment. Improvement of visual performance as a consequence of auditory deprivation is assumed to result from cross-modal changes occurring in late stages of visual processing. Here we measured reaction times and event-related potentials (ERPs) in profoundly deaf adults and hearing controls during a speeded visual detection task, to assess to what extent the enhanced reactivity of deaf individuals could reflect plastic changes in the early cortical processing of the stimulus. We found that deaf subjects were faster than hearing controls at detecting the visual targets, regardless of their location in the visual field (peripheral or peri-foveal). This behavioural facilitation was associated with ERP changes starting from the first detectable response in the striate cortex (C1 component) at about 80 ms after stimulus onset, and in the P1 complex (100–150 ms). In addition, we found that P1 peak amplitudes predicted the response times in deaf subjects, whereas in hearing individuals visual reactivity and ERP amplitudes correlated only at later stages of processing. These findings show that long-term auditory deprivation can profoundly alter visual processing from the earliest cortical stages. Furthermore, our results provide the first evidence of a co-variation between modified brain activity (cortical plasticity) and behavioural enhancement in this sensory-deprived population.
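
    The reported amplitude-to-behaviour link is, at its core, a between-subject correlation between an early ERP measure and reaction time. A minimal sketch of that kind of analysis follows, with hypothetical windows, subject counts, and data; it is not the study's code.

```python
# Illustrative sketch: does per-subject P1 peak amplitude predict mean reaction time?
# All names, parameters, and data here are hypothetical.
import numpy as np
from scipy import stats

FS = 500                         # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.4, 1 / FS)
P1_WINDOW = (0.10, 0.15)         # assumed P1 latency range in s

def p1_peak(erp):
    """Peak amplitude inside the P1 window of a single-subject ERP."""
    win = (times >= P1_WINDOW[0]) & (times <= P1_WINDOW[1])
    return erp[win].max()

rng = np.random.default_rng(1)
erps = rng.normal(0, 2, (15, times.size))    # 15 subjects x samples (fake ERPs)
mean_rts = rng.normal(350, 40, 15)           # mean detection RTs in ms (fake)

p1_amps = np.array([p1_peak(e) for e in erps])
r, p = stats.pearsonr(p1_amps, mean_rts)     # amplitude-RT covariation
print(f"P1 peak amplitude vs RT: r = {r:.2f}, p = {p:.3f}")
```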

    Predictions not commands: active inference in the motor system


    Occipital sleep spindles predict sequence learning in a visuo-motor task.

    STUDY OBJECTIVES: The brain appears to use internal models to successfully interact with its environment via active predictions of future events. Both internal models and the predictions derived from them are based on previous experience. However, it remains unclear how previously encoded information is maintained to support this function, especially in the visual domain. In the present study, we hypothesized that sleep consolidates newly encoded spatio-temporal regularities to improve predictions afterwards. METHODS: We tested this hypothesis using a novel sequence-learning paradigm that aimed to dissociate perceptual from motor learning. We recorded behavioral performance and high-density electroencephalography (EEG) in male human participants during initial training and during testing two days later, following an experimental night of sleep (n = 16, including high-density EEG recordings) or wakefulness (n = 17). RESULTS: Our results show that sleep-dependent behavioral improvements correlated with sleep-spindle activity specifically over occipital cortices. Moreover, event-related potential (ERP) responses indicate a shift of attention away from predictable to unpredictable sequences after sleep, consistent with an enhanced automaticity in the processing of predictable sequences. CONCLUSIONS: These findings suggest a sleep-dependent improvement in the prediction of visual sequences, likely related to visual cortex reactivation during sleep spindles. Considering that controls in our experiments did not fully exclude oculomotor contributions, future studies will need to address the extent to which these effects depend on purely perceptual versus oculomotor sequence learning.
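
    Relating spindle activity to behaviour presupposes that spindle events are first detected in the sleep EEG. A common approach, sketched below under assumed parameters (sigma band, amplitude threshold, duration limits), is to threshold the Hilbert envelope of the sigma-band-filtered signal; this is an illustration, not the study's detection algorithm.

```python
# Illustrative sigma-band (~12-15 Hz) spindle detection via Hilbert-envelope
# thresholding. Band, threshold, and duration limits are assumed values.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 250                                     # sampling rate in Hz (assumed)

def detect_spindles(eeg, fs=FS, band=(12.0, 15.0), thresh_sd=2.0,
                    min_dur=0.5, max_dur=2.0):
    """Return (start, end) sample indices of candidate sigma-band spindle events."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    sigma = filtfilt(b, a, eeg)              # sigma-band filtered signal
    envelope = np.abs(hilbert(sigma))        # instantaneous amplitude envelope
    above = envelope > envelope.mean() + thresh_sd * envelope.std()
    # Locate contiguous supra-threshold runs and keep spindle-like durations.
    padded = np.concatenate(([False], above, [False]))
    edges = np.diff(padded.astype(int))
    starts, ends = np.where(edges == 1)[0], np.where(edges == -1)[0]
    return [(s, e) for s, e in zip(starts, ends)
            if min_dur * fs <= (e - s) <= max_dur * fs]

rng = np.random.default_rng(2)
fake_occipital_eeg = rng.normal(0, 10, FS * 60)   # one minute of fake EEG
print(len(detect_spindles(fake_occipital_eeg)), "candidate spindles found")
```
    Per-subject spindle counts or densities from such a detector could then be correlated with the overnight change in task performance, which is the kind of spindle-behaviour relationship the abstract reports.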

    Zur Wirksamkeit von Typhus-Paratyphus-Adsorbat-Impfstoffen [On the efficacy of typhoid-paratyphoid adsorbate vaccines]
