
    Dynamic emotion perception and prior expectancy.

    Social interactions require the ability to rapidly perceive emotion from a variety of incoming dynamic, multisensory cues. Prior expectations reduce the processing of incoming emotional information and direct attention to cues that align with what is expected. Studies to date have investigated the prior-expectancy effect using static emotional images, even though dynamic stimuli offer greater ecological validity. The objective of this study was to create a novel functional magnetic resonance imaging (fMRI) paradigm to examine the influence of prior expectations on naturalistic emotion perception. To this end, we developed a dynamic emotion perception task consisting of audio-visual videos that carry emotional information either congruent or incongruent with prior expectations. The results show that emotional congruency was associated with activity in prefrontal regions, the amygdala, and the putamen, whereas emotional incongruency was associated with activity in the temporoparietal junction and mid-cingulate gyrus. Supported by the behavioural results, our findings suggest that prior expectations are reinforced through repeated experience and learning, whereas unexpected emotions may rely on fast change-detection processes. The results of the current study are compatible with the notion that the ability to automatically detect unexpected changes in complex dynamic environments allows for adaptive behaviour in potentially advantageous or threatening situations.

    Auditory conflict and congruence in frontotemporal dementia.

    Impaired analysis of signal conflict and congruence may contribute to diverse socio-emotional symptoms in the frontotemporal dementias; however, the underlying mechanisms have not been defined. Here we addressed this issue in patients with behavioural variant frontotemporal dementia (bvFTD; n = 19) and semantic dementia (SD; n = 10) relative to healthy older individuals (n = 20). We created auditory scenes in which the semantic and emotional congruity of constituent sounds were independently probed; associated tasks controlled for auditory perceptual similarity, scene parsing and semantic competence. Neuroanatomical correlates of auditory congruity processing were assessed using voxel-based morphometry. Relative to healthy controls, both the bvFTD and SD groups showed impaired semantic and emotional congruity processing (after taking auditory control task performance into account) and reduced affective integration of sounds into scenes. Grey matter correlates of auditory semantic congruity processing were identified in distributed regions encompassing prefrontal, parieto-temporal and insular areas, and correlates of auditory emotional congruity in partly overlapping temporal, insular and striatal regions. Our findings suggest that decoding of auditory signal relatedness may probe a generic cognitive mechanism and neural architecture underpinning frontotemporal dementia syndromes.

    Effects of affective and emotional congruency on facial expression processing under different task demands.

    Contextual influences on responses to facial expressions of emotion were studied using a context-target paradigm that made it possible to distinguish the effects of affective congruency (context and target of the same/different valence: positive or negative) from those of emotional congruency (context and target representing the same/different emotion: anger, fear, happiness). Sentences describing anger-, fear- or happiness-inducing events and faces expressing each of these emotions were used as contexts and targets, respectively. Between-valence comparisons (context and target of similar/different valence) revealed affective congruency effects, whereas within-valence comparisons (context and target of similar valence and the same/different emotion) revealed emotional congruency effects. In Experiment 1, with an evaluative task, no evidence of emotional congruency and only limited evidence of affective congruency were found. In Experiment 2, with an emotion recognition task, effects of both affective and emotional congruency were observed: angry and fearful faces were recognized faster in emotionally congruent contexts. In Experiment 3 participants were asked explicitly to judge the emotional congruency of the target faces. Emotional congruency effects were again found, with faster judgments of angry and fearful faces in the corresponding emotional contexts. Moreover, judgments of angry expressions were faster and more accurate in happy contexts than in anger contexts: participants found it easier to decide that angry faces did not match a happy context than to judge that they did match an anger context. These results suggest that facial expressions of positive and negative emotions differ in how they are discriminated and integrated with their contexts. Specifically, compared with positive expressions, contextual integration of negative expressions seems to require a double check of both the valence and the specific emotion category of the expression and the context.

    Funding: Spanish Ministerio de Ciencia e Innovación (MINECO/FEDER); Comunidad Autónoma de Madrid.
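    As a purely illustrative aside, the between-valence versus within-valence logic of this paradigm can be made concrete with a small sketch. The Python snippet below is a hypothetical reconstruction, not the authors' materials: the emotion set, valence mapping and function name are all assumptions.

        # Hypothetical sketch of the congruency coding described in the abstract.
        # The emotion labels and valence mapping are assumptions, not taken
        # from the study's actual stimulus set.
        VALENCE = {"anger": "negative", "fear": "negative", "happiness": "positive"}

        def classify_trial(context_emotion: str, target_emotion: str) -> dict:
            """Label one context-target pair.

            Affective congruency: context and target share valence.
            Emotional congruency: they additionally express the same emotion
            (meaningful within a valence, per the within-valence comparisons).
            """
            same_valence = VALENCE[context_emotion] == VALENCE[target_emotion]
            return {
                "affective_congruent": same_valence,
                "emotional_congruent": same_valence
                and context_emotion == target_emotion,
            }

        # An anger context paired with a fearful face is affectively congruent
        # (both negative) but emotionally incongruent (different emotions).
        print(classify_trial("anger", "fear"))
        # -> {'affective_congruent': True, 'emotional_congruent': False}

    Under these assumptions, the paper's affective congruency effects correspond to contrasts over the first flag and its emotional congruency effects to contrasts over the second flag within same-valence trials.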