9 research outputs found

    The Role of Sensorimotor Processes in Pain Empathy

    Pain is a salient, aversive sensation that motivates avoidance, but it also has a strong social signaling function. Numerous studies have shown that regions of the nervous system active in association with first-hand pain are also active in response to the pain of others. When witnessing somatic pain, such as seeing bodies in painful situations, significant activations occur not only in areas related to the processing of negative emotions, but also in neuronal structures engaged in somatosensation and the control of skeletal muscles. These empathy-related sensorimotor activations are selectively reviewed in this article, with a focus on studies using electrophysiological methods and paradigms investigating responses to somatic pain. Convergent evidence from these studies shows that these activations (1) occur at multiple levels of the nervous system, from the spinal cord up to the cerebral cortex, (2) are best conceptualized as activations of a defensive system, in line with the role of pain in protecting the body from injury, and (3) contribute to establishing a matching of psychological states between the sufferer and the observer, which ultimately supports empathic understanding and motivates prosocial action. Future research should thus focus on how these sensorimotor responses relate to higher-order empathic responses, including affective sharing and emotion regulation, and how they motivate approach-related prosocial behaviors aimed at alleviating the pain and suffering of others.

    Functionally analogous body- and animacy-responsive areas are present in the dog (Canis familiaris) and human occipito-temporal lobe

    Comparing the neural correlates of socio-cognitive skills across species provides insights into the evolution of the social brain and has revealed face- and body-sensitive regions in the primate temporal lobe. Although from a different lineage, dogs share convergent visuo-cognitive skills with humans and have a temporal lobe which evolved independently in carnivorans. We investigated the neural correlates of face and body perception in dogs (N = 15) and humans (N = 40) using functional MRI. Combining univariate and multivariate analysis approaches, we found functionally analogous occipito-temporal regions involved in the perception of animate entities and bodies in both species, and face-sensitive regions in humans. Although unpredicted, we also observed neural representations of faces compared to inanimate objects, and of dog compared to human bodies, in dog olfactory regions. These findings shed light on the evolutionary foundations of human and dog social cognition and the predominant role of the temporal lobe.

    Dogs Rely On Visual Cues Rather Than On Effector-Specific Movement Representations to Predict Human Action Targets

    The ability to predict others' actions is one of the main pillars of social cognition. We investigated the processes underlying this ability by pitting motor representations of the observed movements against visual familiarity. In two pre-registered eye-tracking experiments, we measured the gaze arrival times of 16 dogs (Canis familiaris) who observed videos of a human or a conspecific executing the same goal-directed actions. On the first trial, when the human agent performed human-typical movements outside dogs' specific motor repertoire, dogs' gaze arrived at the target object anticipatorily (i.e., before the human touched the target object). When the agent was a conspecific, dogs' gaze arrived at the target object reactively (i.e., upon or after touch). When the human agent performed unusual movements more closely related to the dogs' motor possibilities (e.g., crawling instead of walking), dogs' gaze arrival times were intermediate between the other two conditions. In a replication experiment with slightly different stimuli, dogs' looks to the target object were neither significantly predictive nor reactive, irrespective of the agent. However, when including looks at the target object that were not preceded by looks to the agents, on average dogs looked anticipatorily and sooner at the human agent's action target than at the conspecific's. Looking time and pupil size analyses suggest that the dogs' attention was captured more by the dog agent. These results suggest that visual familiarity with the observed action and the saliency of the agent had a stronger influence on the dogs' looking behaviour than effector-specific movement representations in anticipating action targets.

    Do dogs preferentially encode the identity of the target object or the location of others' actions?

    The ability to make sense of and predict others' actions is foundational for many socio-cognitive abilities. Dogs (Canis familiaris) constitute interesting comparative models for the study of action perception due to their marked sensitivity to human actions. We tested companion dogs (N = 21) in two screen-based eye-tracking experiments, adopting a task previously used with human infants and apes, to assess which aspects of an agent's action dogs consider relevant to the agent's underlying intentions. An agent was shown repeatedly acting upon the same one of two objects, positioned in the same location. We then presented the objects in swapped locations, and the agent approached the objects centrally (Experiment 1) or approached the old object in the new location or the new object in the old location (Experiment 2). Dogs' anticipatory fixations and looking times did not reflect an expectation that agents should have continued approaching the same object or the same location as witnessed during the brief familiarization phase; this contrasts with some findings with infants and apes, but aligns with findings in younger infants before they have sufficient motor experience with the observed action. However, dogs' pupil dilation and latency to make an anticipatory fixation suggested that, if anything, dogs expected the agents to keep approaching the same location rather than the same object, and their looking times showed sensitivity to the animacy of the agents. We conclude that dogs, lacking motor experience with the observed actions of grasping or kicking performed by a human or inanimate agent, might interpret such actions as directed toward a specific location rather than a specific object. Future research will need to further probe the suitability of anticipatory looking as a measure of dogs' socio-cognitive abilities, given differences between the visual systems of dogs and primates.

    The Human Factor: Behavioral and Neural Correlates of Humanized Perception in Moral Decision Making

    The extent to which people regard others as full-blown individuals with mental states (“humanization”) seems crucial for their prosocial motivation towards them. Previous research has shown that decisions about moral dilemmas in which one person can be sacrificed to save multiple others do not consistently follow utilitarian principles. We hypothesized that this behavior can be explained by the potential victim’s perceived humanness and an ensuing increase in vicarious emotions and emotional conflict during decision making. Using fMRI, we assessed neural activity underlying moral decisions that affected fictitious persons who had or had not been experimentally humanized. In implicit priming trials, participants either engaged in mentalizing about these persons (Humanized condition) or not (Neutral condition). In subsequent moral dilemmas, participants had to decide about sacrificing these persons’ lives in order to save the lives of numerous others. Humanized persons were sacrificed less often, and the activation pattern during decisions about them indicated increased negative affect, emotional conflict, vicarious emotions, and behavioral control (pgACC/mOFC, anterior insula/IFG, aMCC, and precuneus/PCC). In addition, we found enhanced effective connectivity between aMCC and anterior insula, which suggests increased emotion regulation during decisions affecting humanized victims. These findings highlight the importance of others’ perceived humanness for prosocial behavior, with aversive affect and other-related concern when imagining harming more “human-like” persons acting against purely utilitarian decisions.

    Event-related potentials of automatic imitation are modulated by ethnicity during stimulus processing, but not during motor execution

    This study investigated neural processes underlying automatic imitation and its modulation by ethnically diverse hand stimuli (Black, White) using event-related brain potentials (ERPs). Automatic imitation relies on motor stimulus-response compatibility (SRC), i.e., response conflict caused by motoric (in)congruency between task-irrelevant hand stimuli and the required response. Our novel task aimed to separate two distinct neuro-cognitive processing stages of automatic imitation and its modulation by ethnicity: the stage of stimulus processing (i.e., perception), comprising presentation of stimulus ethnicity and SRC, and the stage of response execution (i.e., action). Effects of ethnicity were observed in ERPs at different stages of stimulus processing, namely during presentation of ethnicity (LPP) and SRC (N190, P3). ERPs at response execution, the Pre-Motion Positivity (PMP) and the Reafferent Potential (RAP), were only sensitive to congruency. The N190 results may index visual self-other distinction, while the neural time course of P3 and PMP variation could reflect a dynamic decision process linking perception to action, with motor initiation reflected in the PMP component. The PMP might further index motor-related self-other distinction regardless of ethnicity. Importantly, overt motor execution was not influenced by ethnically diverse stimuli, which suggests generalizability of the automatic imitation effect across ethnicities.

    Emotional Egocentricity Bias Across the Life-Span

    In our daily lives, we often have to quickly estimate the emotions of our conspecifics in order to have successful social interactions. While this estimation process seems quite easy when we are ourselves in a neutral or equivalent emotional state, it has recently been shown that when our emotional state is incongruent with that of another person, our judgments can be biased. This phenomenon, introduced to the literature as the Emotional Egocentricity Bias (EEB), has been found to occur in young adults and, to a greater extent, in children. However, how the EEB changes across the life-span from adolescence to old age has been largely unexplored. In this study, we recruited 114 female participants subdivided into four cohorts (adolescents, young adults, middle-aged adults, older adults) to examine age-related changes in the EEB. Participants completed a recently developed paradigm which, by means of visuo-tactile stimulation that elicits conflicting feelings in paired participants, allows valid and reliable assessment of the EEB. Results highlighted a U-shaped relation between age and the EEB, revealing enhanced emotional egocentricity in adolescents and older adults compared to young and middle-aged adults. These results are in line with the neuroscientific literature, which has recently shown that overcoming the EEB is associated with greater activation of a portion of the parietal lobe, namely the right Supramarginal Gyrus (rSMG). This area reaches full maturation by the end of adolescence and undergoes early age-related decline. Thus, the age-related changes in the EEB could possibly be due to the life-span development of the rSMG. This study is the first to show the quadratic relation between age and the EEB and sets a milestone for further research exploring the neural correlates of the life-span development of the EEB. Future studies are needed in order to generalize these results to the male population and to explore gender differences related to the aging of socio-emotional processes.

    Neurobiological differences in mental rotation and instrument interpretation in airline pilots

    Airline pilots and similar professions require reliable spatial cognition abilities, such as mental imagery of static and moving three-dimensional objects in space. A well-known task to investigate these skills is the Shepard and Metzler mental rotation task (SMT), which is also frequently used during pre-assessment of pilot candidates. Despite the intuitive relationship between real-life spatial cognition and the SMT, several studies have challenged its predictive value. Here we report on a novel instrument interpretation task (IIT), based on a realistic attitude indicator used in modern aircraft, that was designed to bridge the gap between the abstract SMT and a cockpit environment. We investigated 18 professional airline pilots using fMRI. No significant correlation was found between SMT and IIT task accuracies. Contrasting the two tasks revealed higher activation in the fusiform gyrus, angular gyrus, and medial precuneus for the IIT, whereas the SMT elicited significantly stronger activation in pre- and supplementary motor areas, as well as the lateral precuneus and superior parietal lobe. Our results show that SMT skills per se are not sufficient to predict task accuracy during (close to) real-life instrument interpretation. While there is substantial overlap of activation across the task conditions, we found important differences between instrument interpretation and non-aviation-based mental rotation.

    The effect of sleep restriction on empathy for pain: An fMRI study in younger and older adults

    Age and sleep both affect emotional functioning. Since sleep patterns change over the lifespan, we investigated the effects of short sleep and age on empathic responses. In a randomized cross-over experimental design, healthy young and older volunteers (n = 47 aged 20–30 years and n = 39 aged 65–75 years) underwent functional magnetic resonance imaging (fMRI) after normal sleep or a night of sleep restricted to 3 hours. During fMRI, participants viewed pictures of needles pricking a hand (pain) or Q-tips touching a hand (control), a well-established paradigm for investigating empathy for pain. There was no main effect of sleep restriction on empathy. However, age and sleep interacted such that sleep restriction increased unpleasantness in older but not in young participants. Irrespective of sleep condition, older participants showed increased activity in the angular gyrus, superior temporal sulcus, and temporo-parietal junction compared to young participants. Speculatively, this could indicate that the older individuals adopted a more cognitive approach in response to others’ pain. Our findings suggest that caution regarding generalizability across age groups is needed in further studies of the effects of sleep on social cognition and emotion.