
    The effects of stereo disparity on the behavioural and electrophysiological correlates of audio-visual motion in depth

    Motion is represented by low-level signals, such as size expansion in vision or loudness changes in audition. Visual and auditory signals from the same object or event may be integrated, facilitating detection. We explored behavioural and electrophysiological correlates of congruent and incongruent audio-visual depth motion in conditions where auditory level changes, visual expansion, and visual disparity cues were manipulated. In Experiment 1, participants discriminated auditory motion direction whilst viewing looming or receding, 2D or 3D, visual stimuli. Responses were faster and more accurate for congruent than for incongruent audio-visual cues, and the congruency effect (i.e., the difference between incongruent and congruent conditions) was larger for visual 3D cues than for 2D cues. In Experiment 2, event-related potentials (ERPs) were recorded during presentation of the 2D and 3D, looming and receding, audio-visual stimuli while participants detected an infrequent deviant sound. Our main finding was that audio-visual congruity was affected by retinal disparity at an early processing stage (135–160 ms) over occipito-parietal scalp. Topographic analyses suggested that similar brain networks were activated for the 2D and 3D congruity effects, but that cortical responses were stronger in the 3D condition. Differences between congruent and incongruent conditions were observed in the 140–200 ms, 220–280 ms, and 350–500 ms windows after stimulus onset.
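    The congruency effect described above is simple arithmetic: mean incongruent response time minus mean congruent response time, computed per depth-cue condition. As a minimal worked illustration, here is a hedged Python sketch; the condition names and reaction-time values are hypothetical, not data from the study.

```python
# Hypothetical mean reaction times in ms (illustrative values, not study data).
mean_rt = {
    ("2D", "congruent"): 512.0,
    ("2D", "incongruent"): 530.0,
    ("3D", "congruent"): 505.0,
    ("3D", "incongruent"): 548.0,
}

def congruency_effect(depth_cue):
    """Congruency effect = incongruent RT - congruent RT for one depth cue."""
    return mean_rt[(depth_cue, "incongruent")] - mean_rt[(depth_cue, "congruent")]

for cue in ("2D", "3D"):
    print(f"{cue} congruency effect: {congruency_effect(cue):.1f} ms")
# A larger value for 3D than for 2D would mirror the pattern reported above.
```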

    Effects of stimulus duration on audio-visual synchrony perception

    The integration of visual and auditory inputs in the human brain occurs only if the components are perceived in temporal proximity, that is, when the intermodal time difference falls within the so-called subjective synchrony range. We used the midpoint of this range to estimate the point of subjective simultaneity (PSS). We measured the PSS for audio-visual (AV) stimuli in a synchrony judgment task, in which subjects judged each AV stimulus using three response categories (audio first, synchronous, video first). The relevant stimulus manipulation was the duration of the auditory and visual components. Results for unimodal auditory and visual stimuli have shown that the perceived onset shifts to relatively later positions with increasing stimulus duration. These unimodal shifts should be reflected in changing PSS values when AV stimuli with different durations of the auditory and visual components are used. The results for 17 subjects indeed showed a significant shift of the PSS for different duration combinations of the stimulus components. Because the shifts were approximately equal for duration changes in either component, no net shift of the PSS was observed as long as the durations of the two components were equal. This result indicates the need to appropriately account for unimodal timing effects when quantifying intermodal synchrony perception.
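    The abstract defines the PSS as the midpoint of the subjective synchrony range obtained from the three-category judgments. The sketch below shows one plausible way to compute such a midpoint, assuming the range is delimited where the proportion of "synchronous" responses crosses a 50% criterion; the SOA grid, response proportions, and criterion are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

# Hypothetical proportions of "synchronous" responses per SOA
# (negative SOA = audio first, positive = video first). Illustrative values only.
soa = np.array([-300.0, -200.0, -100.0, 0.0, 100.0, 200.0, 300.0])  # ms
p_sync = np.array([0.05, 0.30, 0.80, 0.95, 0.85, 0.40, 0.10])

def pss_midpoint(soa, p_sync, criterion=0.5):
    """PSS as the midpoint of the SOA interval where p('synchronous')
    exceeds `criterion`; boundaries found by linear interpolation."""
    above = p_sync >= criterion
    i = np.argmax(above)                         # first point above criterion
    j = len(above) - 1 - np.argmax(above[::-1])  # last point above criterion
    left = np.interp(criterion, p_sync[i - 1:i + 1], soa[i - 1:i + 1])
    right = np.interp(criterion, p_sync[j + 1:j - 1:-1], soa[j + 1:j - 1:-1])
    return (left + right) / 2.0

print(f"PSS ~ {pss_midpoint(soa, p_sync):.1f} ms")
```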

    Audiovisual Segregation in Cochlear Implant Users

    It has traditionally been assumed that cochlear implant users perform atypically in audiovisual tasks by default. However, a recent study that combined an auditory task with visual distractors suggests that only those cochlear implant users who are not proficient at recognizing speech sounds may show abnormal audiovisual interactions. The present study aims to reinforce this notion by investigating the audiovisual segregation abilities of cochlear implant users in a visual task with auditory distractors. Speechreading was assessed in two groups of cochlear implant users (proficient and non-proficient at sound recognition), as well as in normal controls. A visual speech recognition task (i.e., speechreading) was administered either in silence or in combination with three types of auditory distractors: (i) noise, (ii) reversed speech, and (iii) non-altered speech. Cochlear implant users proficient at speech recognition performed like normal controls in all conditions, whereas non-proficient users showed significantly different audiovisual segregation patterns in both speech conditions. These results confirm that normal-like audiovisual segregation is possible in highly skilled cochlear implant users and, consequently, that proficient and non-proficient CI users cannot be lumped into a single group. This important feature must be taken into account in further studies of audiovisual interactions in cochlear implant users.

    Disentangling stimulus plausibility and contextual congruency: Electrophysiological evidence for differential cognitive dynamics

    Expectancy mechanisms are routinely used by the cognitive system in stimulus processing and in anticipation of appropriate responses. Electrophysiological research has documented negative shifts of brain activity when expectancies are violated within a local stimulus context (e.g., reading an implausible word in a sentence) or more globally between consecutive stimuli (e.g., a narrative of images with an incongruent ending). In this EEG study, we examine the interaction between expectancies operating at the level of stimulus plausibility and at the more global level of contextual congruency, to provide evidence for, or against, a dissociation of the underlying processing mechanisms. We asked participants to verify the congruency of pairs of cross-modal stimuli (a sentence and a scene), which varied in plausibility. ANOVAs on ERP amplitudes in selected windows of interest show that congruency violation has longer-lasting (from 100 to 500 ms) and more widespread effects than plausibility violation (from 200 to 400 ms). We also observed critical interactions between these factors, whereby incongruent and implausible pairs elicited stronger negative shifts than their congruent counterparts, both early on (100–200 ms) and between 400 and 500 ms. Our results suggest that the integration mechanisms are sensitive to both global and local effects of expectancy in a modality-independent manner. Overall, we provide novel insights into the interdependence of expectancies during meaning integration of cross-modal stimuli in a verification task.
    Funding: Fundação para a Ciência e a Tecnologia [SFRH/BPD/88374/2012, PTDC/PSI-PCO/110734/2009, UID/BIM/04773/2013 CBMR 1334, PEst-OE/EQB/LA0023/2013, UID/PSI/00050/2013]; Leverhulme Trust [ECF-2014-205]; Max Planck Institute for Psycholinguistics; Donders Institute for Brain, Cognition and Behaviour. Licence: http://creativecommons.org/licenses/by/4.0
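    As a toy illustration of the windows-of-interest analysis described above (mean ERP amplitude per subject in a time window, followed by a statistical contrast), here is a hedged Python sketch. The sampling rate, array shapes, placeholder data, and the use of a simple paired t-test in place of the full factorial ANOVA are all assumptions made for brevity, not the authors' pipeline.

```python
import numpy as np
from scipy.stats import ttest_rel

fs = 1000                        # assumed sampling rate (Hz)
n_subj, n_time = 24, 600         # hypothetical subjects x samples from onset
rng = np.random.default_rng(0)   # placeholder data instead of real EEG
erp_congruent = rng.normal(size=(n_subj, n_time))
erp_incongruent = rng.normal(size=(n_subj, n_time))

def window_mean(erp, t_start_ms, t_end_ms):
    """Mean amplitude per subject within a window of interest (ms from onset)."""
    a, b = int(t_start_ms * fs / 1000), int(t_end_ms * fs / 1000)
    return erp[:, a:b].mean(axis=1)

# Paired contrast in the 400-500 ms window (one cell of the reported design).
t, p = ttest_rel(window_mean(erp_incongruent, 400, 500),
                 window_mean(erp_congruent, 400, 500))
print(f"400-500 ms congruency contrast: t = {t:.2f}, p = {p:.3f}")
```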

    Multisensory Integration and Attention in Autism Spectrum Disorder: Evidence from Event-Related Potentials

    Successful integration of simultaneously perceived sensory signals is crucial for social behavior. Recent findings indicate that this multisensory integration (MSI) can be modulated by attention. Theories of Autism Spectrum Disorder (ASD) suggest that MSI is affected in this population, while it remains unclear to what extent this is related to impairments in attentional capacity. In the present study, event-related potentials (ERPs) following emotionally congruent and incongruent face-voice pairs were measured in 23 high-functioning adults with ASD and 24 age- and IQ-matched controls. MSI was studied while the attention of the participants was manipulated. ERPs were measured at typical auditory and visual processing peaks, namely P2 and N170. While controls showed MSI during both divided-attention and easy selective-attention tasks, individuals with ASD showed MSI during easy selective-attention tasks only. It was concluded that individuals with ASD are able to process multisensory emotional stimuli, but that this processing is modulated differently by attention mechanisms in these participants, especially mechanisms associated with divided attention. This atypical interaction between attention and MSI is also relevant to treatment strategies, with training of multisensory attentional control possibly being more beneficial than conventional sensory integration therapy.

    Sensory information in perceptual-motor sequence learning: visual and/or tactile stimuli

    Sequence learning in serial reaction time (SRT) tasks has been investigated mostly with unimodal stimulus presentation. This approach disregards the possibility that sequence acquisition may be guided by multiple sources of sensory information simultaneously. In the current study we trained participants in an SRT task with visual-only, tactile-only, or bimodal (visual and tactile) stimulus presentation. Sequence performance for the bimodal and visual-only training groups was similar, and both performed better than the tactile-only training group. In a subsequent transfer phase, participants from all three training groups were tested in conditions with visual, tactile, and bimodal stimulus presentation. Sequence performance of the visual-only and bimodal training groups was again highly similar across these identical stimulus conditions, indicating that the addition of tactile stimuli did not benefit the bimodal training group. Additionally, comparisons across identical stimulus conditions in the transfer phase showed that the poorer sequence performance of the tactile-only group during training probably did not reflect a difference in sequence learning but rather a difference in the expression of sequence knowledge.

    Intermodal attention affects the processing of the temporal alignment of audiovisual stimuli

    The temporal asynchrony between inputs to different sensory modalities has been shown to be a critical factor influencing the interaction between such inputs. We used scalp-recorded event-related potentials (ERPs) to investigate the effects of attention on the processing of audiovisual multisensory stimuli as the temporal asynchrony between the auditory and visual inputs varied across the audiovisual integration window (i.e., up to 125 ms). Randomized streams of unisensory auditory stimuli, unisensory visual stimuli, and audiovisual stimuli (consisting of the temporally proximal presentation of the visual and auditory stimulus components) were presented centrally while participants attended to either the auditory or the visual modality to detect occasional target stimuli in that modality. ERPs elicited by each of the contributing sensory modalities were extracted by signal-processing techniques from the combined ERP waveforms elicited by the multisensory stimuli. This was done for each of five 50-ms subranges of stimulus onset asynchrony (SOA; e.g., V precedes A by 125–75 ms, by 75–25 ms, etc.). The extracted ERPs for the visual inputs of the multisensory stimuli were compared with each other and with the ERPs to the unisensory visual control stimuli, separately for attention directed to the visual or to the auditory modality. The results showed that the attention effect on the right-hemisphere visual P1 was largest when the auditory and visual stimuli were temporally aligned, whereas the N1 attention effect was smallest at this alignment, suggesting that attention may play a role in the processing of the relative temporal alignment of the constituent parts of multisensory stimuli. At longer latencies, an occipital selection negativity for attended versus unattended visual stimuli was also observed, but this effect did not vary as a function of SOA, suggesting that by that latency a stable representation of the auditory and visual stimulus components had been established.
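    The abstract does not detail the signal-processing technique used to extract the unisensory ERPs from the combined waveforms. One common, much simpler stand-in is subtraction under an additivity assumption (AV ≈ A + V, hence V ≈ AV − A); the sketch below illustrates only that idea, with hypothetical array shapes and placeholder data, and should not be read as the authors' method.

```python
import numpy as np

n_ch, n_time = 32, 700          # hypothetical channels x time samples
rng = np.random.default_rng(1)  # placeholder data instead of real ERPs
erp_av = rng.normal(size=(n_ch, n_time))  # averaged multisensory (AV) ERP
erp_a = rng.normal(size=(n_ch, n_time))   # averaged unisensory auditory ERP

def extract_visual_erp(erp_av, erp_a):
    """Estimate the visual contribution to the multisensory ERP by
    subtraction, assuming the unisensory responses sum linearly
    (AV = A + V). Nonlinear interactions and SOA-dependent latency
    shifts, which a real analysis must handle, are ignored here."""
    return erp_av - erp_a

erp_v = extract_visual_erp(erp_av, erp_a)  # one estimate per SOA subrange in practice
```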

    The Role of Sensorimotor Difficulties in Autism Spectrum Conditions

    In addition to difficulties in social communication, current diagnostic criteria for autism spectrum conditions (ASC) also incorporate sensorimotor difficulties: repetitive motor movements and atypical reactivity to sensory input (APA, 2013). This paper explores whether sensorimotor difficulties are associated with the development and maintenance of symptoms in ASC. Firstly, studies have shown difficulties in coordinating sensory input into the effective planning and execution of movement in ASC. Secondly, studies have shown associations between sensory reactivity, motor coordination, and core ASC symptoms, suggesting that these areas each strongly influence the development of social and communication skills. Thirdly, studies have begun to demonstrate that sensorimotor difficulties in ASC could account for reduced social attention early in development, with a cascading effect on later social, communicative, and emotional development. These results suggest that sensorimotor difficulties contribute not only to non-social difficulties such as narrow circumscribed interests, but also to the development of social behaviours such as effectively coordinating eye contact with speech and gesture, interpreting others' behaviour, and responding appropriately. Further research is needed to explore the link between sensory and motor difficulties in ASC and their contribution to the development and maintenance of ASC.