    Audiovisual temporal integration in reverberant environments

    Abstract

    With teleconferencing becoming more accessible as a communication platform, researchers are working to understand how human perception interacts with this unfamiliar environment. Given the enclosed space of a teleconference room, along with the physical separation between the user, microphone, and speakers, the transmitted audio often becomes mixed with reverberant auditory components from the room. As a result, the audio can be perceived as smeared in time, which can degrade the user experience and perceived quality. Other challenges remain as well: during encoding, compression, and transmission, the audio and video streams are typically processed separately, so the signals are rarely perfectly aligned and synchronous. Timing thus affects both reverberation and audiovisual synchrony, and the two challenges may well be interdependent. This study explores the temporal integration of audiovisual continuous speech and speech syllables, along with a non-speech event, across a range of asynchrony levels under different reverberation conditions. Non-reverberant stimuli are compared to stimuli with recorded reverberation added. Findings reveal that reverberation does not affect the temporal integration of continuous speech. However, reverberation does influence the temporal integration of the isolated speech syllables and the action-oriented event, with perceived subjective synchrony skewed towards audio-lead asynchrony and away from the more common audio-lag direction. Furthermore, less time is spent on simultaneity judgements for the longer sequences as the temporal offsets grow and when reverberation is introduced, suggesting that both asynchrony and reverberation add to the demands of the task.