
    Deficient auditory emotion processing but intact emotional multisensory integration in alexithymia

    Alexithymia has been associated with emotion recognition deficits in both the auditory and visual domains. Although emotions are inherently multimodal in daily life, little is known about abnormalities of emotional multisensory integration (eMSI) in relation to alexithymia. Here, we employed an emotional Stroop-like audiovisual task while recording event-related potentials (ERPs) in individuals with high alexithymia levels (HA) and low alexithymia levels (LA). During the task, participants had to indicate whether a voice was spoken with a sad or angry prosody while ignoring a simultaneously presented static face, which could be either emotionally congruent or incongruent with the voice. We found that HA performed worse and showed higher P2 amplitudes than LA, independent of emotion congruency. Furthermore, difficulties in identifying and describing feelings were positively correlated with the P2 component, and P2 correlated negatively with behavioral performance. Bayesian statistics showed no group differences in eMSI or in the classical integration-related ERP components (N1 and N2). Although individuals with alexithymia indeed showed deficits in auditory emotion recognition, as indexed by decreased performance and higher P2 amplitudes, the present findings suggest an intact capacity to integrate emotional information from multiple channels in alexithymia. Our work provides valuable insights into the relationship between alexithymia and the neuropsychological mechanisms of emotional multisensory integration.
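    As a rough illustration of how a P2 amplitude can be quantified from epoched EEG, the sketch below uses the open-source MNE-Python library; the file name, condition label, electrode, and time window are illustrative assumptions, not the authors' pipeline.

        # Illustrative sketch (not the authors' pipeline): extract a P2 mean amplitude with MNE-Python.
        import mne

        # Hypothetical epochs file and condition label
        epochs = mne.read_epochs("sub-01_audiovisual-epo.fif")
        evoked = epochs["angry_voice"].average()

        # P2 is typically a fronto-central positivity around 150-250 ms after stimulus onset
        p2 = evoked.copy().pick(["Cz"]).crop(tmin=0.15, tmax=0.25)
        p2_mean_uv = p2.data.mean() * 1e6  # average over the window, converted from volts to microvolts

        print(f"P2 mean amplitude at Cz: {p2_mean_uv:.2f} uV")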

    An Object-Based Interpretation of Audiovisual Processing

    Visual cues help listeners follow conversation in a complex acoustic environment. Many audiovisual research studies focus on how sensory cues are combined to optimize perception, either in terms of minimizing the uncertainty in the sensory estimate or maximizing intelligibility, particularly in speech understanding. From an auditory perception perspective, a fundamental question that has not been fully addressed is how visual information aids the ability to select and focus on one auditory object in the presence of competing sounds in a busy auditory scene. In this chapter, audiovisual integration is presented from an object-based attention viewpoint. In particular, it is argued that a stricter delineation of the concepts of multisensory integration versus binding would facilitate a deeper understanding of the nature of how information is combined across senses. Furthermore, using an object-based theoretical framework to distinguish binding as a distinct form of multisensory integration generates testable hypotheses with behavioral predictions that can account for different aspects of multisensory interactions. In this chapter, classic multisensory illusion paradigms are revisited and discussed in the context of multisensory binding. The chapter also describes multisensory experiments that focus on addressing how visual stimuli help listeners parse complex auditory scenes. Finally, it concludes with a discussion of the potential mechanisms by which audiovisual processing might resolve competition between concurrent sounds in order to solve the cocktail party problem.

    Multi-Sensory Human-Food Interaction


    Multisensory Associative Learning and Multisensory Integration

    Human multisensory experiences of the world rely on a combination of top-down and bottom-up influences, a process that changes throughout development. The present study explored the relationship between multisensory associative learning and multisensory integration using electroencephalography (EEG) and behavioural measures. While EEG activity was recorded, participants were exposed to novel pairings of non-sociolinguistic audiovisual stimuli of varying presentation probability while performing a detection task. The same stimuli were then used in another detection task, followed by an analogous behavioural speeded-response task, both of which kept presentation probabilities equal and tested for multisensory integration. Significant relationships were found between late measures of associative learning (fronto-central and occipital areas) and both early and late indices of multisensory integration (frontal and centro-parietal areas, respectively). Furthermore, a significant relationship was found between the behavioural index and the early neural index of multisensory integration. These results highlight the influence of higher-order processes, namely learned associations, on multisensory integration.

    The neural basis of audio-visual integration and adaptation

    The brain integrates or segregates audio-visual signals effortlessly in everyday life. In order to do so, it needs to infer the causal structure by which the signals were generated. Although behavioural studies have extensively characterized causal inference in audio-visual perception, the underlying neural mechanisms remain barely explored. The current thesis sheds light on these neural processes and demonstrates how the brain adapts to dynamic as well as long-term changes in the environmental statistics of audio-visual signals. In Chapter 1, I introduce the causal inference problem and demonstrate how spatial audio-visual signals are integrated at the behavioural as well as the neural level. In Chapter 2, I describe the methodological foundations for the following empirical chapters. In Chapter 3, I present the neural mechanisms of explicit causal inference and the representations of audio-visual space along the human cortical hierarchy. Chapter 4 reveals that the brain is able to use the recent past to adapt to a dynamically changing environment. In Chapter 5, I discuss the neural substrates of encoding auditory space and its adaptive changes in response to spatially conflicting visual signals. Finally, in Chapter 6, I summarize the findings of the thesis and its contributions to the literature, and I outline directions for future research.
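    For readers unfamiliar with the framework, the causal inference problem referred to here is commonly formalized in Bayesian terms (a general sketch from the literature, not the thesis' specific model): given auditory and visual measurements x_A and x_V, the observer computes the posterior probability that they share a common cause (C = 1),

        P(C{=}1 \mid x_A, x_V) = \frac{P(x_A, x_V \mid C{=}1)\, P(C{=}1)}{P(x_A, x_V \mid C{=}1)\, P(C{=}1) + P(x_A, x_V \mid C{=}2)\, \bigl(1 - P(C{=}1)\bigr)}

    and integrates the signals (for example by reliability-weighted averaging) when a common cause is probable, while segregating them otherwise. The prior P(C=1) is one natural locus for the kind of adaptation to environmental statistics that the thesis investigates.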

    Crossmodal correspondences: A tutorial review


    An object's smell in the multisensory brain: how our senses interact during olfactory object processing

    Object perception is a remarkable and fundamental cognitive ability that allows us to interpret and interact with the world we are living in. In our everyday life, we constantly perceive objects, mostly without being aware of it and through several senses at the same time. Although it might seem that object perception is accomplished without any effort, the underlying neural mechanisms are anything but simple. How we perceive objects in the world surrounding us is the result of a complex interplay of our senses. The aim of the present thesis was to explore, by means of functional magnetic resonance imaging, how our senses interact when we perceive an object’s smell in a multisensory setting where the amount of sensory stimulation increases, as well as in a unisensory setting where we perceive an object’s smell in isolation. In Study I, we sought to determine whether and how multisensory object information influences the processing of olfactory object information in the posterior piriform cortex (PPC), a region linked to olfactory object encoding. In Study II, we expanded our search for integration effects during multisensory object perception to the whole brain, because previous research has demonstrated that multisensory integration is accomplished by a network of early sensory cortices and higher-order multisensory integration sites. We specifically aimed to determine whether there are cortical regions that process multisensory object information independently of which senses, and how many senses, the information arises from. In Study III, we sought to unveil how our senses interact during olfactory object perception in a unisensory setting. Previous studies have shown that even in such unisensory settings, olfactory object processing is not exclusively accomplished by regions within the olfactory system but instead engages a more widespread network of brain regions, such as regions belonging to the visual system. We aimed to determine what this visual engagement represents; that is, whether areas of the brain that are principally concerned with processing visual object information also hold neural representations of olfactory object information, and if so, whether these representations are similar for smells and pictures of the same objects. In Study I, we demonstrated that assisting inputs from vision and hearing increase the processing of olfactory object information in the PPC, and that the more assisting input we receive, the more the processing is enhanced. As this enhancement occurred only for matching inputs, it likely reflects integration of multisensory object information. Study II provided evidence for convergence of multisensory object information in the form of a non-linear response enhancement in the inferior parietal cortex: activation increased for bimodal compared to unimodal stimulation, and increased even further for trimodal compared to bimodal stimulation. As this multisensory response enhancement occurred independently of the congruency of the incoming signals, it likely reflects a process of relating the incoming sensory information streams to each other. Finally, Study III revealed that regions of the ventral visual object stream are engaged in the recognition of an object’s smell and represent olfactory object information in the form of distinct neural activation patterns. While the visual system encodes information about both visual and olfactory objects, it appears to keep information from the two sensory modalities separate by representing smells and pictures of objects differently. Taken together, the studies included in this thesis reveal that olfactory object perception is a multisensory process engaging a widespread network of early sensory as well as higher-order cortical regions, even when we do not find ourselves in a multisensory setting but exclusively perceive an object’s smell.
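    The abstract does not state the exact statistical criterion behind the "non-linear response enhancement" in Study II; two common formalizations in multisensory fMRI work (given here only as illustrations, not necessarily the criterion used in the thesis) are the max criterion and the super-additivity test on the condition-wise response estimates,

        \beta_{\mathrm{multi}} > \max_i \beta_{\mathrm{uni}_i} \quad\text{(max criterion)}, \qquad \beta_{\mathrm{multi}} > \sum_i \beta_{\mathrm{uni}_i} \quad\text{(super-additivity)},

    i.e., the multimodal response must exceed either the largest unimodal response or the sum of the unimodal responses, rather than following from linear summation alone.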