
    Gaze cuing of attention in snake phobic women: the influence of facial expression

    Only a few studies have investigated whether animal phobics exhibit attentional biases in contexts where no phobic stimuli are present. Among these, recent studies provided evidence for a bias toward facial expressions of fear and disgust in animal phobics. Such findings may be due to the fact that these expressions could signal the presence of a phobic object in the surroundings. To test this hypothesis and further investigate attentional biases for emotional faces in animal phobics, we conducted an experiment using a gaze-cuing paradigm in which participants' attention was driven by the task-irrelevant gaze of a centrally presented face. We employed dynamic negative facial expressions of disgust, fear and anger and found an enhanced gaze-cuing effect in snake phobics as compared to controls, irrespective of facial expression. These results provide evidence of a general hypervigilance in animal phobics in the absence of phobic stimuli, and indicate that research on specific phobias should not be limited to symptom provocation paradigms.
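    The gaze-cuing effect reported above is conventionally quantified as the mean reaction-time difference between incongruent trials (target opposite the gazed-at location) and congruent trials (target at the gazed-at location). A minimal sketch, assuming millisecond reaction times; the function name and the example values are illustrative, not taken from the study:

```python
from statistics import mean

def gaze_cuing_effect(congruent_rts, incongruent_rts):
    """Gaze-cuing effect in ms: mean RT on incongruent trials minus
    mean RT on congruent trials. Larger positive values indicate a
    stronger attentional shift toward the gazed-at location."""
    return mean(incongruent_rts) - mean(congruent_rts)

# Fabricated example RTs in milliseconds:
effect = gaze_cuing_effect([410, 395, 420], [445, 430, 450])
```

    An "enhanced" effect in phobics, as in the study, would correspond to a larger value of this difference relative to controls.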

    Depression-related difficulties disengaging from negative faces are associated with sustained attention to negative feedback during social evaluation and predict stress recovery

    The present study aimed to clarify: 1) the presence of depression-related attention bias related to a social stressor, 2) its association with depression-related attention biases as measured under standard conditions, and 3) their association with impaired stress recovery in depression. A sample of 39 participants reporting a broad range of depression levels completed a standard eye-tracking paradigm in which they had to engage/disengage their gaze with/from emotional faces. Participants then underwent a stress induction (i.e., giving a speech), in which their eye movements to false emotional feedback were measured, and stress reactivity and recovery were assessed. Depression level was associated with longer times to engage/disengage attention with/from negative faces under standard conditions and with sustained attention to negative feedback during the speech. These depression-related biases were associated with each other and mediated the association between depression level and self-reported stress recovery, predicting lower recovery from stress after giving the speech.

    Variation in normal mood state influences sensitivity to dynamic changes in emotional expression

    Acknowledgements: We would like to thank Dr Douglas Martin for providing useful comments on an earlier draft. Thanks also to Kostadin Karavasilev for help with some of the data collection.

    How major depressive disorder affects the ability to decode multimodal dynamic emotional stimuli

    Most studies investigating the processing of emotions in depressed patients reported impairments in the decoding of negative emotions. However, these studies adopted static stimuli (mostly stereotypical facial expressions corresponding to basic emotions) which do not reflect the way people experience emotions in everyday life. For this reason, this work proposes to investigate the decoding of emotional expressions in patients affected by Recurrent Major Depressive Disorder (RMDDs) using dynamic audio/video stimuli. RMDDs' performance is compared with the performance of patients with Adjustment Disorder with Depressed Mood (ADs) and healthy control subjects (HCs). The experiments involve 27 RMDDs (16 with acute depression - RMDD-A, and 11 in a compensation phase - RMDD-C), 16 ADs and 16 HCs. The ability to decode emotional expressions is assessed through an emotion recognition task based on short audio (without video), video (without audio) and audio/video clips. The results show that AD patients are significantly less accurate than HCs in decoding fear, anger, happiness, surprise and sadness. RMDD-As are significantly less accurate than HCs in decoding happiness, sadness and surprise. Finally, no significant differences were found between HCs and RMDD-Cs. The different communication channels and the types of emotion play a significant role in limiting the decoding accuracy.

    Brief mindfulness training enhances cognitive control in socioemotional contexts: Behavioral and neural evidence.

    In social contexts, the dynamic nature of others' emotions places unique demands on attention and emotion regulation. Mindfulness, characterized by heightened and receptive moment-to-moment attending, may be well-suited to meet these demands. In particular, mindfulness may support more effective cognitive control in social situations via efficient deployment of top-down attention. To test this, a randomized controlled study examined effects of mindfulness training (MT) on behavioral and neural (event-related potentials [ERPs]) responses during an emotional go/no-go task that tested cognitive control in the context of emotional facial expressions that tend to elicit approach or avoidance behavior. Participants (N = 66) were randomly assigned to four brief (20 min) MT sessions or to structurally equivalent book-learning control sessions. Relative to the control group, MT led to improved discrimination of facial expressions, as indexed by d-prime, as well as more efficient cognitive control, as indexed by response time and accuracy, particularly for those evidencing poorer discrimination and cognitive control at baseline. MT also produced better conflict monitoring of behavioral goal-prepotent response tendencies, as indexed by larger No-Go N200 ERP amplitudes, particularly for those with smaller No-Go amplitudes at baseline. Overall, findings are consistent with MT's potential to enhance deployment of early top-down attention to better meet the unique cognitive and emotional demands of socioemotional contexts, particularly for those with greater opportunity for change. Findings also suggest that early top-down attention deployment could be a cognitive mechanism correspondent to the present-oriented attention commonly used to explain regulatory benefits of mindfulness more broadly.
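    The d-prime index mentioned above is the standard signal-detection measure of discrimination: the z-transformed hit rate minus the z-transformed false-alarm rate. A minimal sketch, assuming trial counts from a go/no-go task; the function name and the log-linear correction (which avoids infinite z-scores at rates of 0 or 1) are illustrative choices, not details from the study:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).
    Rates use a log-linear correction (add 0.5 to each count,
    1 to each denominator) so z() stays finite at perfect scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Fabricated example: 45 hits, 5 misses, 5 false alarms, 45 correct rejections
sensitivity = d_prime(45, 5, 5, 45)
```

    Higher d' means better separation of targets from non-targets; a d' of 0 corresponds to chance-level discrimination.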

    Facial expression aftereffect revealed by adaptation to emotion-invisible dynamic bubbled faces

    Visual adaptation is a powerful tool to probe the short-term plasticity of the visual system. Adapting to local features such as oriented lines can distort our judgment of subsequently presented lines, the tilt aftereffect. The tilt aftereffect is believed to be processed at low levels of the visual cortex, such as V1. Adaptation to faces, on the other hand, can produce significant aftereffects in high-level traits such as identity, expression, and ethnicity. However, whether face adaptation necessitates awareness of face features is debatable. In the current study, we investigated whether facial expression aftereffects (FEAE) can be generated by partially visible faces. We first generated partially visible faces using the bubbles technique, in which the face was seen through randomly positioned circular apertures, and selected the bubbled faces for which the subjects were unable to identify happy or sad expressions. When the subjects adapted to static displays of these partial faces, no significant FEAE was found. However, when the subjects adapted to a dynamic video display of a series of different partial faces, a significant FEAE was observed. In both conditions, subjects could not identify facial expression in the individual adapting faces. These results suggest that our visual system is able to integrate unrecognizable partial faces over a short period of time and that the integrated percept affects our judgment of subsequently presented faces. We conclude that FEAE can be generated by partial faces with few facial expression cues, implying that our cognitive system fills in the missing parts during adaptation, or that subcortical structures are activated by the bubbled faces without conscious recognition of emotion during adaptation.

    Interactions between visceral afferent signaling and stimulus processing

    Visceral afferent signals to the brain influence thoughts, feelings and behaviour. Here we highlight the findings of a set of empirical investigations in humans concerning body-mind interaction that focus on how feedback from states of autonomic arousal shapes cognition and emotion. There is a longstanding debate regarding the contribution of the body to mental processes. Recent theoretical models broadly acknowledge the role of (autonomically mediated) physiological arousal in emotional, social and motivational behaviours, yet the underlying mechanisms are only partially characterized. Neuroimaging is overcoming this shortfall: first, by demonstrating correlations between autonomic change and discrete patterns of evoked and task-independent neural activity; second, by mapping the central consequences of clinical perturbations in autonomic response; and third, by probing how dynamic fluctuations in peripheral autonomic state are integrated with perceptual, cognitive and emotional processes. Building on the notion that an important source of the brain's representation of physiological arousal is derived from afferent information from arterial baroreceptors, we have exploited the phasic nature of these signals to show their differential contribution to the processing of emotionally salient stimuli. This recent work highlights the facilitation at neural and behavioral levels of fear and threat processing, which contrasts with the more established observations of the inhibition of central pain processing during baroreceptor activation. The implications of this body-brain-mind axis are discussed.

    The Curvilinear Relationship between Age and Emotional Aperture: The Moderating Role of Agreeableness

    The capability to correctly recognize collective emotion expressions (i.e., emotional aperture) is crucial for effective social and work-related interactions. Yet little is known about the antecedents of this ability. The present study therefore aims to shed new light onto key aspects that may promote or diminish an individual's emotional aperture. We examine the role of age for this ability in an online sample of 181 participants (with an age range of 18 to 72 years, located in Germany), and we investigate agreeableness as a key contingency factor. Among individuals with lower agreeableness, on the one hand, our results indicate a curvilinear relationship between age and emotional aperture, such that emotional aperture remains at a relatively high level until these individuals' middle adulthood (with a slight increase until their late 30s) and declines afterwards. Individuals with higher agreeableness, on the other hand, exhibit relatively high emotional aperture irrespective of their age. Together, these findings offer new insights for the emerging literature on emotional aperture, illustrating that specific demographic and personality characteristics may jointly shape such collective emotion recognition.

    The perception of emotion in artificial agents

    Given recent technological developments in robotics, artificial intelligence and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to establish an understanding of whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how people perceive emotions expressed by an artificial agent, such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Besides accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.