Neural foundations of cooperative social interactions
The embodied-embedded-enactive-extended (4E) approach to cognition suggests that interaction with the world is a crucial component of our cognitive processes. We spend most of our time interacting with other people; therefore, studying cognition without interaction is incomplete. Until recently, however, social neuroscience focused mainly on isolated human and animal brains, leaving interaction unexplored. To fill this gap, we studied interacting participants, focusing on both intra- and inter-brain (hyperscanning) neural activity. In the first study, we invited dyads to perform a visual task in both a cooperative and a competitive context while we measured EEG. We found that mid-frontal activity around 200-300 ms after receiving monetary rewards was sensitive to social context and differed between cooperative and competitive situations. In the second study, we asked participants to coordinate their movements with each other and with a robotic partner. We found significantly stronger EEG amplitudes at frontocentral electrodes when people interacted with a robotic partner. Lastly, we performed a comprehensive literature review and the first meta-analysis in the emerging field of hyperscanning, which validated it as a method to study social interaction. Taken together, our results show that adding a second participant (human or AI/robotic) advances our understanding of human cognition. We learned that activity at frontocentral electrodes is sensitive to social context and to the type of partner (human or robotic). In both studies, the participants' interaction was required to reveal these novel neural processes involved in action monitoring. Similarly, studying inter-brain neural activity allows the exploration of new aspects of cognition. Many cognitive functions involved in successful social interactions are accompanied by neural synchrony between brains, suggesting an extended form of cognition.
Emotions, behaviour and belief regulation in an intelligent guide with attitude
Abstract unavailable; please refer to the PDF.
The impact of voice on trust attributions
Trust and speech are both essential aspects of human interaction. On the one hand, trust is necessary for vocal communication to be meaningful. On the other hand, humans have developed ways to infer someone's trustworthiness from their voice, as well as to signal their own. Yet research on trustworthiness attributions to speakers is scarce and contradictory, and very often relies on explicit measures, which do not predict actual trusting behaviour. Measuring behaviour, however, is essential for an accurate representation of trust. This thesis contains five experiments examining the influence of various voice characteristics, including accent, prosody, emotional expression and naturalness, on trusting behaviours towards virtual players and robots. The experiments use the "investment game", a method derived from game theory that allows implicit trustworthiness attributions to be measured over time, as their main methodology. Results show that standard accents, high pitch, slow articulation rate and smiling voice generally increase trusting behaviours towards a virtual agent, and that a synthetic voice generally elicits higher trustworthiness judgments towards a robot. The findings also suggest that different voice characteristics influence trusting behaviours with different temporal dynamics. Furthermore, when the actual behaviour of the various speaking agents was modified to be more or less trustworthy, people's trusting behaviours developed over time accordingly. People also reinforce their trust towards speakers they deem particularly trustworthy when those speakers prove indeed trustworthy, but punish them when they do not. This suggests that trusting behaviours might also be influenced by the congruency of first impressions with the actual experience of a speaker's trustworthiness, a "congruency effect". This has important implications for Human-Machine Interaction, for example in assessing users' reactions to speaking machines that might not always function properly. Taken together, the results suggest that voice influences trusting behaviour, that first impressions of a speaker's trustworthiness based on vocal cues might not be indicative of future trusting behaviours, and that trust should therefore be measured dynamically.
The end of stigma? Understanding the dynamics of legitimisation in the context of TV series consumption
This research contributes to prior work on stigmatisation by examining stigmatisation and legitimisation as social processes in the context of TV series consumption. Using in-depth interviews, we show that the dynamics of legitimisation are complex and accompanied by the reproduction of existing stigmas and the creation of new ones.
The Simulation of Smiles (SIMS) model: Embodied simulation and the meaning of facial expression
Recent application of theories of embodied or grounded cognition to the recognition and interpretation of facial expressions of emotion has led to an explosion of research in psychology and the neurosciences. However, despite the accelerating number of reported findings, it remains unclear how the many component processes of emotion and their neural mechanisms actually support embodied simulation. Equally unclear is what triggers the use of embodied simulation versus perceptual or conceptual strategies in determining meaning. The present article integrates behavioral research from social psychology with recent research in the neurosciences in order to provide coherence to the extant and future research on this topic. The roles of several of the brain's reward systems, as well as the amygdala, somatosensory cortices, and motor centers, are examined. These are then linked to behavioral and brain research on facial mimicry and eye gaze. Articulation of the mediators and moderators of facial mimicry and gaze is particularly useful in guiding interpretation of relevant findings from the neurosciences. Finally, a model of the processing of the smile, the most complex of the facial expressions, is presented as a means to illustrate how to advance the application of theories of embodied cognition in the study of facial expression of emotion.