26 research outputs found

    Reading others' emotions: Evidence from event-related potentials

    This Thesis aimed at investigating, by means of the event-related potentials (ERP) technique, some relevant aspects of the human ability to read others’ emotions and to empathize with others’ affective states. Social and affective neuroscience has largely studied faces and facial expressions since they represent relevant “pieces of information” guiding individuals during interaction. Their importance stems from the fact that they provide unique information about identity, gender, age, trustworthiness, and attractiveness, while also conveying emotions. In Chapter 1, I introduce the reader to the contents of this Thesis, in particular the ability to “read” others’ facial expressions and to empathize with others’ affective states. In Chapter 2, I offer an overview of current knowledge on how humans process faces in general and facial expressions in particular. I propose a theoretical excursus from Bruce and Young’s cognitive model (1986) to a recent simulative model of the recognition of emotional facial expressions by Wood and colleagues (2016), which considers facial mimicry helpful in discriminating between subtle emotions. In Chapters 3 and 4, I present two closely related studies (Experiments 1 and 2, respectively), since both aimed at testing a functional link between the visual system and facial mimicry/sensorimotor simulation during the processing of facial expressions of emotions. In these two studies, ERPs, by virtue of their high temporal resolution, allowed me to track the time course of the hypothesized influence of mimicry/simulation on the stages of visual analysis of facial expressions. 
The aim of Experiment 1 was to explore the potential connection between facial mimicry and the early stage of the construction of visual percepts of facial expressions, while Experiment 2 investigated whether and how facial mimicry could interact with later stages of visual processing, focusing on the construction of visual working memory representations of facial expressions of emotions, and also monitoring whether this process could depend on the degree of the observers’ empathy. For both studies, the results strongly suggest that mimicry may influence early and later stages of the visual processing of faces and facial expressions. In the second part of my Thesis, I introduce the reader to the construct of empathy, dealing with its multifaceted nature and the role of different factors in the modulation of an empathic response, especially to others’ pain (Chapter 5). In Chapters 6 and 7, I discuss two ERP studies (Experiments 3 and 4a), with one behavioral study included as a control (Experiment 4b), investigating the empathic reaction to others’ pain as a function of different variables that could play a role in daily life. Experiment 3 investigated the role of prosodic information in neural empathic responses to others’ pain. Results from this study demonstrated that prosodic information can enhance the human ability to share others’ pain by acting transversely on the two main empathy components, experience sharing and mentalizing. The aim of Experiment 4a was to study whether the physical distance between an observer and an individual in a particular affective state, induced by a painful stimulation, is a critical factor in modulating the magnitude of the empathic neural reaction in the observer. By manipulating the perceived physical distance of face stimuli, I observed a moderating effect on empathic ERP reactions as a function of the perceived physical distance of faces. 
Results of Experiment 4b clarified that the critical factor triggering differential empathic reactions in the two groups of Experiment 4a was not the likelihood of identifying the faces of the two sizes but the perceived physical distance. Finally, in Chapter 8, a general discussion highlights the main findings presented in this Thesis and provides suggestions for future research extending the topics debated in the previous Chapters.

    Out of sight out of mind: Perceived physical distance between the observer and someone in pain shapes observer's neural empathic reactions

    Social and affective relations may shape empathy for others' affective states. Previous studies have also revealed that people tend to form very different mental representations of stimuli on the basis of their physical distance. In this regard, embodied cognition and embodied simulation propose that different physical distances between individuals activate different interpersonal processing modes, such that close physical distance tends to activate the interpersonal processing mode typical of socially and affectively close relationships. In Experiment 1, two groups of participants were administered a pain decision task involving upright and inverted face stimuli that were painfully or neutrally stimulated, and we monitored their neural empathic reactions by means of the event-related potentials (ERP) technique. Crucially, participants were presented with face stimuli of one of two possible sizes in order to manipulate retinal size and perceived physical distance, roughly corresponding to the close and far portions of social distance. ERP modulations compatible with an empathic reaction were observed only in the group exposed to face stimuli appearing at a close social distance from the participants. This reaction was absent in the group exposed to smaller stimuli, corresponding to face stimuli observed from a far social distance. In Experiment 2, a different group of participants was engaged in a match-to-sample task involving the two sizes of upright face stimuli from Experiment 1, to test whether the modulation of the neural empathic reaction observed in Experiment 1 could be ascribed to differences in the ability to identify faces of the two different sizes. Results suggested that face stimuli of the two sizes were equally identifiable. In line with the Construal Level and Embodied Simulation theoretical frameworks, we conclude that perceived physical distance, like social and affective distance, may shape empathy.
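The empathic ERP reaction described above is conventionally quantified as a difference wave: the trial-averaged response to painful stimulation minus the response to neutral stimulation. A minimal numpy sketch of that logic, with simulated data (the trial counts, epoch length, and effect size here are illustrative assumptions, not the study's actual parameters):

```python
import numpy as np

def erp_difference_wave(painful_trials, neutral_trials):
    """Average each condition across trials, then subtract.

    painful_trials, neutral_trials: arrays of shape (n_trials, n_timepoints)
    holding baseline-corrected single-trial EEG epochs (microvolts).
    Returns the painful-minus-neutral difference wave, a common index
    of a neural empathic reaction.
    """
    return painful_trials.mean(axis=0) - neutral_trials.mean(axis=0)

# Toy example: 10 trials of 100 time points per condition, with a
# simulated 1-microvolt enhancement for the painful condition.
rng = np.random.default_rng(0)
painful = rng.normal(1.0, 0.5, size=(10, 100))
neutral = rng.normal(0.0, 0.5, size=(10, 100))
diff = erp_difference_wave(painful, neutral)
print(diff.shape)  # (100,)
```

In a design like the one above, this difference wave would be computed separately for each group (close vs. far perceived distance) and its amplitude compared in the latency windows of interest.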

    “Mind the plexiglass” - An invisible barrier is enough to reduce neural empathic responses to pain and touch

    Empathy may be shaped by the socio-affective relationship between individuals, such that neural empathic reactions are magnified for affectively close others compared to strangers. In a recent investigation, we demonstrated that the perceived physical distance between actors can shape people’s empathic reactions towards a person in pain. However, the underlying mechanism through which manipulating physical distance interferes with empathic responses is still to be understood. One possible mechanism refers to the notion of ‘interaction space’, the shared reaching space of two individuals. Within this framework, the sharing of affective states might be sensitive to the physical distance between individuals. This study aimed at investigating, using the ERP technique, whether the neural empathic reactions to observed faces, either gently touched or painfully stimulated, perceived within the interaction space could be modulated by the presence of a transparent physical barrier, which prevents the possibility of interacting with the partner without altering either the quality or the low-level characteristics of the observed stimuli. We designed an ERP study in which participants were exposed to faces stimulated by either a needle or a Q-tip, both in a condition in which they sat directly in front of the screen (no-plexiglass condition) and in a second, critical condition in which a transparent plexiglass panel was interposed between them and the screen (plexiglass condition). We expected a moderating effect on ERP empathic reactions as a function of the presence of the barrier, such that under the plexiglass condition participants would show neural empathic reactions of lower magnitude. Further, given that the plexiglass prevents the possibility of reaching the other person, we hypothesized that the barrier would impact the empathic reactions regardless of the type of stimulation observed. 
Results confirmed our hypothesis, showing that the presence of a physical barrier decreased the ERP amplitude, suggesting a reduction of the observer’s neural empathic reaction.

    Neural measures of the causal role of observers' facial mimicry on visual working memory for facial expressions

    Simulation models of facial expressions propose that sensorimotor regions may increase the clarity of facial expression representations in extrastriate areas. We monitored the event-related potential marker of visual working memory (VWM) representations, namely the sustained posterior contralateral negativity (SPCN), also termed contralateral delay activity, while participants performed a change detection task involving to-be-memorized faces with different intensities of anger. In one condition, participants could freely use their facial mimicry during the encoding/VWM maintenance of the faces, while in a different condition their facial mimicry was blocked by a gel. Notably, SPCN amplitude was reduced for faces in the blocked-mimicry condition compared to the free-mimicry condition. This modulation interacted with participants’ empathy levels, such that only participants with medium-to-high empathy scores showed this reduction of SPCN amplitude when their mimicry was blocked. SPCN amplitude was larger for full expressions than for neutral and subtle expressions, while subtle expressions elicited lower SPCN amplitudes than neutral faces. These findings provide evidence of a functional link between mimicry and VWM for faces and further shed light on how this memory system may receive feedback from sensorimotor regions during the processing of facial expressions.
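The SPCN named above is a lateralized component: it is computed as the voltage at posterior electrodes contralateral to the memorized item minus the voltage at the ipsilateral electrodes, averaged over the retention interval. A minimal sketch of that computation (the electrode pair PO7/PO8, the sample window, and the toy data are assumptions for illustration, not the study's recording montage):

```python
import numpy as np

def spcn_amplitude(left_chan, right_chan, target_side, window):
    """Sustained posterior contralateral negativity (SPCN).

    left_chan, right_chan: 1-D arrays (n_timepoints,) holding the averaged
    ERP at a left/right posterior electrode pair (e.g. PO7/PO8).
    target_side: 'left' or 'right', the side of the to-be-memorized face.
    window: (start, stop) sample indices covering the retention interval.
    Returns the mean contralateral-minus-ipsilateral voltage in the window;
    more negative values index a stronger VWM trace.
    """
    contra, ipsi = ((right_chan, left_chan) if target_side == 'left'
                    else (left_chan, right_chan))
    start, stop = window
    return (contra[start:stop] - ipsi[start:stop]).mean()

# Toy check: a flat 2-microvolt contralateral negativity for a
# right-side target (contralateral hemisphere = left electrode).
t = np.zeros(500)
print(spcn_amplitude(t - 2.0, t, 'right', (100, 400)))  # -2.0
```

Comparing this quantity between the free- and blocked-mimicry conditions is the kind of contrast the reported SPCN reduction refers to.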

    Intra-individual behavioural and neural signatures of audience effects and interactions in a mirror-game paradigm

    We often perform actions while observed by others, yet the behavioural and neural signatures of audience effects remain understudied. Performing actions while being observed has been shown to result in more emphasized movements in musicians and dancers, as well as during communicative actions. Here, we investigate the behavioural and neural mechanisms of observed actions in relation to individual actions performed in isolation and to interactive joint actions. Movement kinematics and EEG were recorded in 42 participants (21 pairs) during a mirror-game paradigm in which participants produced improvised movements alone, while observed by a partner, or while synchronizing movements with the partner. Participants produced the largest movements when being observed, and observed actors and interacting dyads produced slower and less variable movements than when acting alone. At the neural level, we observed increased mu suppression during interaction, and to a lesser extent during observed actions, relative to individual actions. Moreover, we observed increased widespread functional brain connectivity during observed actions relative to both individual and interactive actions, suggesting increased intra-individual monitoring and action-perception integration as a result of audience effects. These results suggest that observed actors take observers into account in their action plans by increasing self-monitoring; at the behavioural level, observed actions are similar to emergent interactive actions, characterized by slower and more predictable movements.
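Mu suppression, as used above, is typically expressed as the log ratio of mu-band (roughly 8–13 Hz) power over central electrodes during action relative to a baseline period, with values below zero indicating suppression. A minimal numpy sketch of that index under these standard assumptions (the band limits, sampling rate, and simulated signal are illustrative, not the study's exact analysis settings):

```python
import numpy as np

def mu_suppression(signal, baseline, fs, band=(8.0, 13.0)):
    """Mu-suppression index: log ratio of mu-band power in an active
    epoch relative to a baseline epoch (values < 0 indicate suppression).

    signal, baseline: 1-D EEG arrays from a central electrode (e.g. C3/C4).
    fs: sampling rate in Hz.
    """
    def band_power(x):
        freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
        psd = np.abs(np.fft.rfft(x)) ** 2
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].sum()
    return np.log(band_power(signal) / band_power(baseline))

# Toy check: a 10 Hz oscillation whose amplitude halves during "action",
# so mu power drops to a quarter and the index is log(0.25).
fs = 250
t = np.arange(fs * 2) / fs
baseline = np.sin(2 * np.pi * 10 * t)
action = 0.5 * baseline
print(round(mu_suppression(action, baseline, fs), 3))  # -1.386
```

The condition contrast reported above (interaction > observed > individual) would correspond to increasingly negative values of this index.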

    Altering sensorimotor simulation impacts early stages of facial expression processing depending on individual differences in alexithymic traits

    Simulation models of facial expressions suggest that posterior visual areas and the brain areas underpinning sensorimotor simulation might interact to improve facial expression processing. According to these models, facial mimicry, a manifestation of sensorimotor simulation, may contribute to the visual processing of facial expressions by influencing its early stages. The aim of this study was to assess whether and how sensorimotor simulation influences early stages of face processing, also investigating its relationship with alexithymic traits, given that previous studies have suggested that individuals with high levels of alexithymic traits (vs. individuals with low levels) tend to use sensorimotor simulation to a lesser extent. We monitored the P1 and N170 components of the event-related potential (ERP) in participants performing a fine discrimination task on facial expressions and, as a control condition, animals. In half of the experiment, participants could freely use their facial mimicry, whereas in the other half their facial mimicry was blocked by a gel. Our results revealed that only individuals with low (compared to high) alexithymic traits showed a larger modulation of the P1 amplitude as a function of the mimicry manipulation, selectively for facial expressions (but not for animals), while we did not observe any modulation of the N170. Given the null results at the behavioural level, we interpreted the P1 modulation as compensatory visual processing in individuals with low levels of alexithymia under conditions of interference with sensorimotor processing, providing preliminary evidence in favor of sensorimotor simulation models.
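Component modulations like the P1 effect above are measured by extracting the signed peak (or mean) amplitude of the averaged waveform within a component-specific latency window at posterior electrodes. A minimal sketch of peak extraction (the latency windows and the toy waveform are assumptions for illustration; actual windows are chosen per study):

```python
import numpy as np

def component_amplitude(erp, times, window, polarity):
    """Signed peak amplitude of an ERP component in a latency window.

    erp: 1-D averaged waveform (microvolts); times: matching array in ms.
    window: (start_ms, stop_ms), e.g. roughly (80, 130) for P1 or
    (130, 200) for N170 at posterior sites (assumed windows).
    polarity: +1 for positive components (P1), -1 for negative (N170).
    """
    mask = (times >= window[0]) & (times <= window[1])
    segment = erp[mask]
    return segment.max() if polarity > 0 else segment.min()

# Toy waveform at 1 ms resolution with a 4-microvolt P1-like deflection.
times = np.arange(-100, 400)
erp = np.zeros_like(times, dtype=float)
erp[(times >= 90) & (times <= 110)] = 4.0
print(component_amplitude(erp, times, (80, 130), +1))  # 4.0
```

Comparing this amplitude between the free- and blocked-mimicry halves of the experiment, separately for low- and high-alexithymia groups, yields the kind of interaction the study reports.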

    Shared Attention Amplifies the Neural Processing of Emotional Faces

    Sharing an experience, without communicating, affects people's subjective perception of the experience, often by intensifying it. We investigated the neural mechanisms underlying shared attention in an EEG study in which participants attended to and rated the intensity of emotional faces, simultaneously or independently. Participants performed the task in three experimental conditions: (a) alone; (b) simultaneously, next to each other in pairs, without receiving feedback on each other's responses (shared without feedback); and (c) simultaneously while receiving that feedback (shared with feedback). We focused on two face-sensitive ERP components: the amplitude of the N170 was greater in the "shared with feedback" condition than in the alone condition, reflecting a top-down effect of shared attention on the structural encoding of faces, whereas the EPN was greater in both shared-context conditions than in the alone condition, reflecting enhanced attention allocation to the processing of the emotional content of faces, modulated by the social context. Taken together, these results suggest that shared attention amplifies the neural processing of faces, regardless of the valence of facial expressions.