    The integration of continuous audio and visual speech in a cocktail-party environment depends on attention

    In noisy environments, our ability to understand speech benefits greatly from seeing the speaker's face. This is attributed to the brain's ability to integrate audio and visual information, a process known as multisensory integration. In addition, selective attention plays an enormous role in what we understand, the so-called cocktail-party phenomenon. But how attention and multisensory integration interact remains incompletely understood, particularly in the case of natural, continuous speech. Here, we addressed this issue by analyzing EEG data recorded from participants who undertook a multisensory cocktail-party task using natural speech. To assess multisensory integration, we modeled the EEG responses to the speech in two ways. The first assumed that audiovisual speech processing is simply a linear combination of audio speech processing and visual speech processing (i.e., an A + V model), while the second allowed for the possibility of audiovisual interactions (i.e., an AV model). Applying these models to the data revealed that EEG responses to attended audiovisual speech were better explained by an AV model, providing evidence for multisensory integration. In contrast, unattended audiovisual speech responses were best captured by an A + V model, suggesting that multisensory integration is suppressed for unattended speech. Follow-up analyses revealed some limited evidence for early multisensory integration of unattended AV speech, with no integration occurring at later levels of processing. We take these findings as evidence that the integration of natural audio and visual speech occurs at multiple levels of processing in the brain, each of which can be differentially affected by attention.
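    A rough sense of the A + V versus AV model comparison described above can be given with a forward (encoding-model) analysis. The sketch below is only an illustration of that general logic, not the authors' pipeline: the feature and EEG names (env_*, motion_*, eeg_*) are hypothetical, synthetic data stand in for real recordings, and it uses plain ridge regression on time-lagged stimulus features while skipping the cross-validation a real analysis would require.

        import numpy as np
        from sklearn.linear_model import Ridge
        from scipy.stats import pearsonr

        N_LAGS = 32  # roughly 250 ms of stimulus-response lags at a 128 Hz sampling rate

        def design(*features):
            # Design matrix of time-lagged copies of each 1-D stimulus feature
            # (np.roll wraps at the edges, which is acceptable for a sketch).
            return np.column_stack([np.roll(f, k) for f in features for k in range(N_LAGS)])

        # Synthetic stand-ins for the recordings: env_* is a speech envelope, motion_* a
        # visual motion feature, eeg_* one EEG channel, in audio-only (a), visual-only (v)
        # and audiovisual (av) conditions.
        rng = np.random.default_rng(0)
        T = 5000
        env_a, motion_v = rng.standard_normal(T), rng.standard_normal(T)
        env_av, motion_av = rng.standard_normal(T), rng.standard_normal(T)
        k_a, k_v = rng.standard_normal(16), rng.standard_normal(16)
        eeg_a = np.convolve(env_a, k_a, mode="same")
        eeg_v = np.convolve(motion_v, k_v, mode="same")
        eeg_av = (np.convolve(env_av, k_a, mode="same")
                  + np.convolve(motion_av, k_v, mode="same")
                  + rng.standard_normal(T))

        # Unisensory encoding models and a model fitted jointly on the audiovisual data.
        trf_a = Ridge(alpha=1.0).fit(design(env_a), eeg_a)
        trf_v = Ridge(alpha=1.0).fit(design(motion_v), eeg_v)
        trf_av = Ridge(alpha=1.0).fit(design(env_av, motion_av), eeg_av)

        # A + V prediction: sum of the unisensory models applied to the AV stimulus features.
        pred_additive = trf_a.predict(design(env_av)) + trf_v.predict(design(motion_av))
        # AV prediction: the jointly fitted model, which can depart from the unisensory sum.
        pred_av = trf_av.predict(design(env_av, motion_av))

        print("A + V model r =", pearsonr(pred_additive, eeg_av)[0])
        print("AV model r    =", pearsonr(pred_av, eeg_av)[0])

    On this purely additive synthetic data the two models perform similarly; the logic of the comparison is that a reliably better cross-validated fit for the AV model on real attended-speech EEG is taken as evidence of audiovisual interaction.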

    Interaction between Space and Effectiveness in Multisensory Integration: Behavioral and Perceptual Measures

    Previous research has described several core principles of multisensory integration. These include the spatial principle, which relates the integrative product to the physical location of the stimuli, and the principle of inverse effectiveness, in which minimally effective stimuli elicit the greatest multisensory gains when combined. In the vast majority of prior studies, these principles have been studied in isolation, with little attention to their interrelationships and possible interactions. Recent neurophysiological studies in our laboratory have begun to examine these interactions within individual neurons in animal models, work that we extend here into the realm of human performance and perception. To test these interactions, we conducted a psychophysical experiment in which 51 participants were tasked with judging the location of a target stimulus. Target stimuli were visual flashes and auditory noise bursts presented either alone or in combination at four locations and at two intensities. Multisensory combinations were always spatially coincident. A significant effect was found for response times and a marginal effect was found for accuracy, such that the degree of multisensory gain changed as a function of the interaction between space and effectiveness. These results provide further evidence for a strong interrelationship between the multisensory principles in dictating human performance.
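    The multisensory gain referred to above is commonly quantified as enhancement relative to the most effective unisensory condition, computed separately for each location and intensity. The sketch below is a hedged illustration of that formula, not the study's analysis code; the column names and the synthetic trial table are hypothetical.

        import numpy as np
        import pandas as pd

        def gain(df):
            # Multisensory enhancement for one location x intensity cell: percent change in
            # accuracy and response time of the AV condition relative to the better
            # (more accurate / faster) of the two unisensory conditions.
            by_mod = df.groupby("modality").agg(acc=("correct", "mean"), rt=("rt", "mean"))
            best_uni_acc = by_mod.loc[["A", "V"], "acc"].max()
            best_uni_rt = by_mod.loc[["A", "V"], "rt"].min()
            return pd.Series({
                "acc_gain_pct": 100 * (by_mod.loc["AV", "acc"] - best_uni_acc) / best_uni_acc,
                "rt_gain_pct": 100 * (best_uni_rt - by_mod.loc["AV", "rt"]) / best_uni_rt,
            })

        # Hypothetical trial table: one row per trial with location (degrees of azimuth),
        # intensity, modality (A, V or AV), accuracy (0/1) and response time in seconds.
        rng = np.random.default_rng(1)
        n = 4000
        trials = pd.DataFrame({
            "location": rng.choice([-20, -7, 7, 20], n),
            "intensity": rng.choice(["low", "high"], n),
            "modality": rng.choice(["A", "V", "AV"], n),
            "correct": rng.integers(0, 2, n),
            "rt": rng.gamma(4, 0.1, n) + 0.2,
        })
        print(trials.groupby(["location", "intensity"]).apply(gain))

    Under inverse effectiveness, gains should be larger in the low-intensity cells, and under the spatial principle they should vary with location; an interaction of the kind reported above appears as intensity-dependent differences in how gain changes across locations.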

    Multisensory Interactions across Spatial Location and Temporal Synchrony

    The process of integrating information across sensory modalities is highly dependent upon a number of stimulus characteristics, including spatial and temporal coincidence, as well as effectiveness. Typically, these properties have been studied in isolation, but recent evidence suggests that they are interactive. This study focuses on interactions between the spatial location and temporal synchrony of stimuli. Participants were presented with simple audiovisual stimuli at parametrically varied locations and with parametrically varied stimulus onset asynchronies (SOAs). Participants performed spatial localization and perceived simultaneity (PSS) tasks, and accuracies and response times were measured. Accuracy of spatial localization depended upon spatial location, with no effect of SOA and no interaction; however, the response time (RT) analysis showed both an effect of SOA and an interaction: more peripheral presentations showed greater slowing of RT in asynchronous conditions, and fewer violations of the race model. In the PSS task, effects of SOA and spatial location were found in the responses, as well as an interaction between the two. Peripheral stimuli were more likely to be judged as synchronous, a difference seen particularly at long SOAs. These results suggest that the commonly studied principles of integration are indeed interactive, and that these interactions have measurable behavioral implications.
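    The race-model violations mentioned above are usually assessed with Miller's inequality: the cumulative RT distribution in the audiovisual condition is compared against the sum of the two unisensory cumulative distributions, and points where it exceeds that bound count as violations. The sketch below is a minimal, hypothetical illustration on synthetic response times, not the study's analysis code.

        import numpy as np

        def ecdf(rts, t):
            # Empirical cumulative distribution of response times evaluated at times t.
            return np.searchsorted(np.sort(rts), t, side="right") / len(rts)

        # Synthetic stand-ins for one condition's response times (seconds).
        rng = np.random.default_rng(2)
        rt_a = rng.gamma(5, 0.07, 300) + 0.15   # auditory-only
        rt_v = rng.gamma(5, 0.08, 300) + 0.15   # visual-only
        rt_av = rng.gamma(5, 0.05, 300) + 0.15  # audiovisual

        t = np.linspace(0.2, 1.0, 81)
        race_bound = np.minimum(ecdf(rt_a, t) + ecdf(rt_v, t), 1.0)  # Miller's upper bound
        violation = ecdf(rt_av, t) - race_bound  # positive values violate the race model

        print("max violation:", violation.max())
        print("violated at any t:", bool((violation > 0).any()))

    Fewer or smaller positive values of this difference for peripheral, asynchronous presentations would correspond to the pattern of fewer race-model violations reported above.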