12 research outputs found

    Multisensory Integration and Attention in Autism Spectrum Disorder: Evidence from Event-Related Potentials

    Successful integration of simultaneously perceived sensory signals is crucial for social behavior. Recent findings indicate that this multisensory integration (MSI) can be modulated by attention. Theories of Autism Spectrum Disorder (ASD) suggest that MSI is affected in this population, but it remains unclear to what extent this is related to impairments in attentional capacity. In the present study, event-related potentials (ERPs) following emotionally congruent and incongruent face-voice pairs were measured in 23 high-functioning adult individuals with ASD and 24 age- and IQ-matched controls. MSI was studied while the participants' attention was manipulated. ERPs were measured at typical auditory and visual processing peaks, namely P2 and N170. While controls showed MSI during both divided-attention and easy selective-attention tasks, individuals with ASD showed MSI during easy selective-attention tasks only. It was concluded that individuals with ASD are able to process multisensory emotional stimuli, but that this processing is modulated differently by attention mechanisms, especially those associated with divided attention. This atypical interaction between attention and MSI is also relevant to treatment strategies: training of multisensory attentional control may be more beneficial than conventional sensory integration therapy.

    Intermodal attention affects the processing of the temporal alignment of audiovisual stimuli

    The temporal asynchrony between inputs to different sensory modalities has been shown to be a critical factor influencing the interaction between such inputs. We used scalp-recorded event-related potentials (ERPs) to investigate the effects of attention on the processing of audiovisual multisensory stimuli as the temporal asynchrony between the auditory and visual inputs varied across the audiovisual integration window (i.e., up to 125 ms). Randomized streams of unisensory auditory stimuli, unisensory visual stimuli, and audiovisual stimuli (consisting of the temporally proximal presentation of the visual and auditory stimulus components) were presented centrally while participants attended to either the auditory or the visual modality to detect occasional target stimuli in that modality. ERPs elicited by each of the contributing sensory modalities were extracted by signal processing techniques from the combined ERP waveforms elicited by the multisensory stimuli. This was done for each of the five different 50-ms subranges of stimulus onset asynchrony (SOA: e.g., V precedes A by 125–75 ms, by 75–25 ms, etc.). The extracted ERPs for the visual inputs of the multisensory stimuli were compared with each other and with the ERPs to the unisensory visual control stimuli, separately when attention was directed to the visual or to the auditory modality. The results showed that the attention effect on the right-hemisphere visual P1 was largest when auditory and visual stimuli were temporally aligned. In contrast, the N1 attention effect was smallest at this latency, suggesting that attention may play a role in the processing of the relative temporal alignment of the constituent parts of multisensory stimuli. At longer latencies an occipital selection negativity for the attended versus unattended visual stimuli was also observed, but this effect did not vary as a function of SOA, suggesting that by that latency a stable representation of the auditory and visual stimulus components had been established.
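    The extraction step described above follows the logic of an additive-model subtraction: the multisensory (AV) waveform minus the unisensory auditory (A) waveform estimates the visual (V) contribution, and its deviation from the unisensory V waveform estimates the multisensory interaction. A minimal sketch in plain Python, using synthetic stand-in waveforms rather than the study's data:

```python
import math

# Hypothetical averaged ERP traces (arbitrary microvolt units), 1 kHz
# sampling over a 500-ms epoch. These sinusoids are stand-ins chosen
# purely for illustration, not real electrophysiological data.
n = 500
t = [i / 1000 for i in range(n)]
erp_a = [2.0 * math.sin(2 * math.pi * 4 * x) for x in t]   # unisensory auditory
erp_v = [1.5 * math.sin(2 * math.pi * 6 * x) for x in t]   # unisensory visual
inter = [0.3 * math.sin(2 * math.pi * 10 * x) for x in t]  # interaction term

# Under the additive model the multisensory waveform is A + V plus any
# interaction; here we build AV with a known interaction for illustration.
erp_av = [a + v + i for a, v, i in zip(erp_a, erp_v, inter)]

# "Extract" the visual contribution by subtracting the auditory ERP from
# the combined AV response; the residual difference from the unisensory
# visual ERP estimates the multisensory interaction.
extracted_v = [av - a for av, a in zip(erp_av, erp_a)]
msi = [ev - v for ev, v in zip(extracted_v, erp_v)]

# The recovered residual matches the interaction term we built in.
assert all(abs(m - i) < 1e-9 for m, i in zip(msi, inter))
```

In practice the same subtraction would be applied per SOA subrange and per attention condition before comparing the extracted visual ERPs.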

    No rapid audiovisual recalibration in adults on the autism spectrum

    Autism spectrum disorders (ASD) are characterized by difficulties in social cognition, but are also associated with atypicalities in sensory and perceptual processing. Several groups have reported that autistic individuals show reduced integration of socially relevant audiovisual signals, which may contribute to the higher-order social and cognitive difficulties observed in autism. Here we use a newly devised technique to study instantaneous adaptation to audiovisual asynchrony in autism. Autistic and typical participants were presented with sequences of brief visual and auditory stimuli, varying in asynchrony over a wide range, from 512 ms auditory-lead to 512 ms auditory-lag, and judged whether they seemed to be synchronous. Typical adults showed strong adaptation effects, with trials preceded by an auditory-lead needing more auditory-lead to seem simultaneous, and vice versa. However, autistic observers showed little or no adaptation, although their simultaneity curves were as narrow as those of the typical adults. This result supports recent Bayesian models that predict reduced adaptation effects in autism. As rapid audiovisual recalibration may be fundamental for the optimisation of speech comprehension, recalibration problems could render language processing more difficult in autistic individuals, hindering social communication.
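    The trial-by-trial recalibration effect described above can be illustrated with a crude estimate of the point of subjective simultaneity (PSS), split by the direction of the previous trial's asynchrony. The trial data and the `pss_by_previous_trial` helper below are hypothetical, a sketch of the logic rather than the study's actual procedure (which would typically fit a full psychometric curve per condition):

```python
# Each trial is (asynchrony_ms, judged_simultaneous). Negative asynchrony =
# auditory-lead, positive = auditory-lag. We split trials by the sign of the
# PREVIOUS trial's asynchrony and estimate a crude PSS for each subset as the
# mean asynchrony of the trials judged simultaneous.
def pss_by_previous_trial(trials):
    after_lead, after_lag = [], []
    for prev, cur in zip(trials, trials[1:]):
        asyn, simult = cur
        if not simult:
            continue
        (after_lead if prev[0] < 0 else after_lag).append(asyn)
    mean = lambda xs: sum(xs) / len(xs) if xs else float("nan")
    return mean(after_lead), mean(after_lag)

# Toy data mimicking the typical-adult pattern: after an auditory-lead trial,
# simultaneity judgments cluster at slightly negative asynchronies
# (recalibration toward the lead), and vice versa after an auditory-lag trial.
trials = [(-200, False), (-30, True), (100, False), (20, True),
          (-150, False), (-40, True), (150, False), (30, True)]
lead_pss, lag_pss = pss_by_previous_trial(trials)

# Rapid recalibration shows up as the PSS shifting toward the previous
# trial's asynchrony; the autistic group in the study showed little such shift.
print(lead_pss < lag_pss)  # prints True for this toy data
```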

    Neural correlates of socio-emotional perception in 22q11.2 deletion syndrome.

    BACKGROUND: Social impairments are described as a common feature of the 22q11.2 deletion syndrome (22q11DS). However, the neural correlates underlying these impairments are largely unknown in this population. In this study, we investigated the neural substrates of socio-emotional perception. METHODS: We used event-related functional magnetic resonance imaging (fMRI) to explore neural activity in individuals with 22q11DS and healthy controls during the visualization of stimuli varying in social (social or non-social) or emotional (positive or negative valence) content. RESULTS: Neural hyporesponsiveness in regions of the default mode network (inferior parietal lobule, precuneus, posterior and anterior cingulate cortex and frontal regions) in response to social versus non-social images was found in the 22q11DS population compared to controls. A similar pattern of activation for positive and negative emotional processing was observed in the two groups. No correlation between neural activation and social functioning was observed in patients with 22q11DS. Finally, no social × valence interaction impairment was found in patients. CONCLUSIONS: Our results indicate atypical neural correlates of social perception in 22q11DS that appear to be independent of valence processing. Abnormalities in the social perception network may underlie the social impairments observed in individuals with 22q11DS.

    The development of spontaneous facial responses to others’ emotions in infancy: An EMG study

    Viewing facial expressions often evokes facial responses in the observer. These spontaneous facial reactions (SFRs) are believed to play an important role in social interactions. However, their developmental trajectory and the underlying neurocognitive mechanisms are still poorly understood. In the current study, 4- and 7-month-old infants were presented with facial expressions of happiness, anger, and fear. Electromyography (EMG) was used to measure activation in muscles relevant for forming these expressions: zygomaticus major (smiling), corrugator supercilii (frowning), and frontalis (forehead raising). The results indicated no selective activation of the facial muscles for the expressions in 4-month-old infants. For 7-month-old infants, evidence for selective facial reactions was found especially for happy faces (leading to increased zygomaticus major activation) and fearful faces (leading to increased frontalis activation), while angry faces did not elicit a clear differential response. This suggests that emotional SFRs may be the result of complex neurocognitive mechanisms which lead to partial mimicry but are also likely to be influenced by evaluative processes. Such mechanisms seem to undergo important developments at least until the second half of the first year of life.
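    The selectivity criterion described above (the expression-congruent muscle showing the largest activation) can be sketched with hypothetical baseline-corrected EMG values; the numbers below are illustrative only, chosen to mirror the reported 7-month-old pattern (selective responses for happy and fearful faces, none for angry faces):

```python
# Hypothetical mean baseline-corrected EMG activation (arbitrary units)
# per facial expression and muscle; not the study's actual data.
emg = {
    "happy": {"zygomaticus": 0.42, "corrugator": 0.05, "frontalis": 0.08},
    "fear":  {"zygomaticus": 0.03, "corrugator": 0.10, "frontalis": 0.35},
    "anger": {"zygomaticus": 0.07, "corrugator": 0.09, "frontalis": 0.10},
}

# Which muscle is congruent with (i.e., forms) each expression.
congruent = {"happy": "zygomaticus", "fear": "frontalis", "anger": "corrugator"}

def is_selective(expression):
    """A response counts as selective if the expression-congruent muscle
    shows the largest activation for that expression."""
    acts = emg[expression]
    return max(acts, key=acts.get) == congruent[expression]

for expr in emg:
    print(expr, is_selective(expr))  # happy True, fear True, anger False
```

A real analysis would of course test these contrasts statistically across infants rather than comparing single means, but the congruency logic is the same.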