
    The neural basis of audiovisual integration

    Our perception is continuous and unified. Yet sensory information reaches the brain through separate senses and must be combined to create that unified percept. Interactions between sensory modalities occur even at primary cortical levels, but the purpose of these interactions and the kind of information they transmit remain largely unknown. The current thesis aimed to reveal the interactions between auditory pitch and visual size in polar coordinates, two modality-specific stimulus features that have robust topographic representations in the human brain. In Chapter 1, I present the background of cross-modal interactions in early sensory cortices and of the pitch-size relationship. In Chapter 2, we explored the pitch-size relationship in a speeded classification task and, in Chapter 3, at the level of functional Magnetic Resonance Imaging activation patterns. In Chapter 4, we investigated the effects of actively learning a specific pitch-size mapping during a single session on the speeded classification task. In Chapter 5, we extended learning over multiple sessions and examined learning effects with behavioural and neural measures. Finally, in Chapter 6, I summarize the findings of the thesis and its contributions to the literature, and outline directions for future research.

    Temporal dynamics in the multisensory brain

    PhD thesis. In this work, I investigate the mechanisms by which the brain manages temporal coherence between sensory signals. An overview of relevant literature is given, and current theories about how sensory signals are combined in brain and behaviour are introduced. Key unknowns about the temporal dynamics of auditory-visual integration are identified and addressed in four investigations. In the first study, I assess whether cues to the onset of an auditory-visual pair affect sensitivity to their temporal asynchrony. It is shown that regularly timed cues shorten the temporal window of integration compared with irregular cues, demonstrating that attention can affect how sensory signals are bound. In the second experiment, speech-like asynchronous stimuli are presented for an extended duration whilst perceptual simultaneity is monitored; in this manner, the time-course of temporal adaptation is tracked. Adaptation occurs when the presented asynchrony is visual-leading, but not when it is auditory-leading, which may suggest that temporal recalibration in the auditory-leading direction is not a consequence of adaptation. In the third investigation, the neural correlates of the time-course of temporal adaptation are measured. Increased activity in frontal and parietal areas occurred during perceptual asynchrony; this replicates previous work and further supports the view that these regions provide top-down modulation of the mechanisms of temporal simultaneity. Increased activity is present in the posterior cingulate cortex whilst the brain is maintaining an adapted state, compared with during adaptation; this region may act as a conflict monitor and compensator for temporal asynchrony. Lastly, I investigate the extent to which a highly prevalent inhibitory neurotransmitter affects performance in a multisensory behavioural task. There is a possible correlation between the concentration of gamma-aminobutyric acid in the parietal lobe and the overall strength of integration effects. Finally, the impact and future directions of this work are discussed in the context of current literature. Funded by the Wellcome Trust and the Institute of Neuroscience.

    Functional imaging studies of visual-auditory integration in man.

    This thesis investigates the central nervous system's ability to integrate visual and auditory information from the sensory environment into a unified conscious perception. It develops the possibility that the principle of functional specialisation may be applicable in the multisensory domain. The first aim was to establish the neuroanatomical location at which visual and auditory stimuli are integrated in sensory perception. The second was to investigate the neural correlates of visual-auditory synchronicity, which would be expected to play a vital role in establishing which visual and auditory stimuli should be perceptually integrated. Four functional Magnetic Resonance Imaging studies identified brain areas specialised for: the integration of dynamic visual and auditory cues derived from the same everyday environmental events (Experiment 1); discriminating relative synchronicity between dynamic, cyclic, abstract visual and auditory stimuli (Experiments 2 and 3); and the aesthetic evaluation of visually and acoustically perceived art (Experiment 4). Experiment 1 provided evidence that the posterior temporo-parietal junction may be an important site of crossmodal integration. Experiment 2 revealed, for the first time, significant activation of the right anterior frontal operculum (aFO) when visual and auditory stimuli cycled asynchronously. Experiment 3 confirmed and developed this observation, as the right aFO was activated only during crossmodal (visual-auditory), but not intramodal (visual-visual, auditory-auditory), asynchrony. Experiment 3 also demonstrated bilateral activation of the amygdala during crossmodal synchrony. Experiment 4 revealed the neural correlates of supramodal, contemplative, aesthetic evaluation within the medial fronto-polar cortex; activity at this locus varied parametrically with the degree of subjective aesthetic beauty, for both visual art and musical extracts. The most robust finding of this thesis is that activity in the right aFO increases when concurrently perceived visual and auditory stimuli deviate from crossmodal synchrony, which may veto the crossmodal integration of unrelated stimuli into unified conscious perception.

    The neural basis of audio-visual integration and adaptation

    The brain integrates or segregates audio-visual signals effortlessly in everyday life. To do so, it needs to infer the causal structure by which the signals were generated. Although behavioural studies have extensively characterized causal inference in audio-visual perception, the underlying neural mechanisms remain barely explored. The current thesis sheds light on these neural processes and demonstrates how the brain adapts to dynamic as well as long-term changes in the environmental statistics of audio-visual signals. In Chapter 1, I introduce the causal inference problem and demonstrate how spatial audiovisual signals are integrated at the behavioural as well as the neural level. In Chapter 2, I describe the methodological foundations for the following empirical chapters. In Chapter 3, I present the neural mechanisms of explicit causal inference and the representations of audio-visual space along the human cortical hierarchy. Chapter 4 reveals that the brain is able to use the recent past to adapt to a dynamically changing environment. In Chapter 5, I discuss the neural substrates of encoding auditory space and its adaptive changes in response to spatially conflicting visual signals. Finally, in Chapter 6, I summarize the findings of the thesis and its contributions to the literature, and I outline directions for future research.

    Exploring the neural entrainment to musical rhythms and meter: a steady-state evoked potential approach

    Doctoral thesis completed in cotutelle with the Université catholique de Louvain, Belgium (Faculty of Medicine, Institute of Neuroscience). The ability to perceive a regular beat in music and synchronize to it is a widespread human skill. Fundamental to musical behavior, beat and meter refer to the perception of periodicities while listening to musical rhythms, and usually involve spontaneous entrainment to move on these periodicities. However, the neural mechanisms underlying entrainment to beat and meter in humans remain unclear. The present work tests a novel experimental approach, inspired by the steady-state evoked potential method, to explore the neural dynamics supporting the perception of rhythmic inputs. Using electroencephalography (EEG) in healthy individuals, neural responses to beat and meter were recorded in various contexts: (1) mental imagery of a meter applied endogenously to an auditory stimulus, (2) spontaneous induction of a beat from musical rhythmic patterns, (3) multisensory interaction, and (4) sensorimotor synchronization. Taken together, our results support the view that entrainment and resonance phenomena underpin the processing of musical rhythms in the human brain. Furthermore, our results suggest that this novel approach could help investigate the link between the phenomenology of musical beat and meter and neurophysiological evidence of a bias towards periodicities arising under certain circumstances in the nervous system. Hence, entrainment to music provides an original framework to explore general entrainment phenomena occurring at various levels, from the inter-neural to the inter-individual level.

    Multisensory Networks in Primary and Association Cortices in Cat


    Change blindness: eradication of gestalt strategies

    Arrays of eight texture-defined rectangles were used as stimuli in a one-shot change blindness (CB) task in which there was a 50% chance that one rectangle would change orientation between two successive presentations separated by an interval. CB was eliminated by cueing the target rectangle in the first stimulus, reduced by cueing in the interval, and unaffected by cueing in the second presentation. This supports the idea that a representation was formed that persisted through the interval before being 'overwritten' by the second presentation (Landman et al., 2003, Vision Research 43, 149-164). Another possibility is that participants used some kind of grouping or Gestalt strategy. To test this, we changed the spatial position of the rectangles in the second presentation by shifting them along imaginary spokes (by ±1 degree) emanating from the central fixation point. There was no significant difference in performance between this and the standard task [F(1,4)=2.565, p=0.185]. This may suggest two things: (i) Gestalt grouping is not used as a strategy in these tasks, and (ii) further weight is given to the argument that objects may be stored in and retrieved from a pre-attentional store during this task.
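    The reported statistic [F(1,4)=2.565, p=0.185] can be checked numerically: an F statistic with (1, d) degrees of freedom equals the square of a t statistic with d degrees of freedom, so its p-value is the two-sided tail of that t distribution, which has a closed form for d = 4. A minimal sketch in Python (standard library only; the function name is ours, not from the paper):

```python
import math

def p_from_F_1_4(F):
    """Two-sided p-value for an F statistic with (1, 4) degrees of freedom.

    Uses the identity F(1, d) = t(d)**2 together with the closed-form CDF
    of the t distribution with d = 4 degrees of freedom:
        P(T <= t) = 1/2 + (3/4) * (s - s**3 / 3),  where s = t / sqrt(t**2 + 4)
    so p = 2 * (1 - CDF(t)) simplifies to 1 - 1.5*s + 0.5*s**3.
    """
    t = math.sqrt(F)
    s = t / math.sqrt(t * t + 4.0)
    return 1.0 - 1.5 * s + 0.5 * s ** 3

# The reported result is internally consistent:
print(round(p_from_F_1_4(2.565), 3))  # → 0.185
```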

    The role of multisensory integration in the bottom-up and top-down control of attentional object selection

    Selective spatial attention and multisensory integration have been traditionally considered as separate domains in psychology and cognitive neuroscience. However, theoretical and methodological advancements in the last two decades have paved the way for studying different types of interactions between spatial attention and multisensory integration. In the present thesis, two types of such interactions are investigated. In the first part of the thesis, the role of audiovisual synchrony as a source of bottom-up bias in visual selection was investigated. In six out of seven experiments, a variant of the spatial cueing paradigm was used to compare attentional capture by visual and audiovisual distractors. In another experiment, single-frame search arrays were presented to investigate whether multisensory integration can bias spatial selection via salience-based mechanisms. Behavioural and electrophysiological results demonstrated that the ability of visual objects to capture attention was enhanced when they were accompanied by noninformative auditory signals. They also showed evidence for the bottom-up nature of these audiovisual enhancements of attentional capture by revealing that these enhancements occurred irrespective of the task-relevance of visual objects. In the second part of this thesis, four experiments are reported that investigated the spatial selection of audiovisual relative to visual objects and the guidance of their selection by bimodal object templates. Behavioural and ERP results demonstrated that the ability of task-irrelevant target-matching visual objects to capture attention was reduced during search for audiovisual as compared to purely visual targets, suggesting that bimodal search is guided by integrated audiovisual templates. 
However, the observation that unimodal target-matching visual events retained some ability to capture attention indicates that bimodal search is controlled to some extent by modality-specific representations of task-relevant information. In summary, the present thesis has contributed to our knowledge of how attention is controlled in real-life environments by demonstrating that spatial selective attention can be biased towards bimodal objects via salience-driven as well as goal-based mechanisms.

    Contextual signals in visual cortex: How sounds, state, and task setting shape how we see

    What we see is not always what we get. Even though the light that hits the retina might convey the same images, how visual information is processed and what we eventually do with it depend on many contextual factors. In this thesis, we show in a series of experiments how the sensory processing of the same visual input in the visual cortex of mice is affected by internal state, movements, other senses, and the task being performed. We found that recurrent activity originating within higher visual areas modulates activity in the primary visual cortex (V1) and selectively amplifies weak compared with strong sensory-evoked responses. Second, visual stimuli evoked similar early activity in V1, but later activity strongly depended on whether mice were trained to report the visual stimuli, and on the specific task. Specifically, adding a second modality to the task demands extended the temporal window during which V1 was causally involved in visual perception. Third, we report that not only visual stimuli but also sounds led to strong responses in V1, composed of distinct auditory-related and motor-related activity. Finally, we studied the role of the posterior parietal cortex in an audiovisual change detection task. Despite extensive single-neuron and population-level encoding of task-relevant visual and auditory stimuli, as well as of upcoming behavioral responses, optogenetic inactivation did not affect task performance. Whereas these contextual factors have previously been studied in isolation, we obtain a more integrated understanding of how factors beyond visual information determine what we actually see.
