The neural basis of audio-visual integration and adaptation

Abstract

The brain integrates or segregates audio-visual signals effortlessly in everyday life. To do so, it needs to infer the causal structure by which the signals were generated. Although behavioural studies have extensively characterized causal inference in audio-visual perception, its neural mechanisms remain largely unexplored. This thesis sheds light on these neural processes and demonstrates how the brain adapts to both dynamic and long-term changes in the environmental statistics of audio-visual signals. In Chapter 1, I introduce the causal inference problem and demonstrate how spatial audio-visual signals are integrated at the behavioural and neural levels. In Chapter 2, I describe the methodological foundations for the following empirical chapters. In Chapter 3, I present the neural mechanisms of explicit causal inference and the representations of audio-visual space along the human cortical hierarchy. Chapter 4 reveals that the brain uses the recent past to adapt to a dynamically changing environment. In Chapter 5, I discuss the neural substrates of encoding auditory space and its adaptive changes in response to spatially conflicting visual signals. Finally, in Chapter 6, I summarize the findings of the thesis and its contributions to the literature, and outline directions for future research.