Face2face: advancing the science of social interaction
Face-to-face interaction is core to human sociality and its evolution, and provides the environment in which most human communication occurs. Research into the full complexity of face-to-face interaction requires a multi-disciplinary, multi-level approach, illuminating from different perspectives how we and other species interact. This special issue showcases a wide range of approaches, bringing together detailed studies of naturalistic social-interactional behaviour with larger-scale analyses for generalization, and investigations of the socially contextualized cognitive and neural processes that underpin the behaviour we observe. We suggest that this integrative approach will propel the science of face-to-face interaction forwards, leading to new paradigms and to novel, more ecologically grounded and comprehensive insights into how we interact with one another and with artificial agents, how differences in psychological profiles might affect interaction, and how the capacity to socially interact develops and has evolved in humans and other species. This theme issue takes a first step in this direction, aiming to break down disciplinary boundaries and to emphasize the value of illuminating the many facets of face-to-face interaction. This article is part of a discussion meeting issue 'Face2face: advancing the science of social interaction'.
Social top-down response modulation (STORM): a model of the control of mimicry in social interaction
As a distinct feature of human social interaction, spontaneous mimicry has been widely investigated in the past decade. Research suggests that mimicry is a subtle and flexible social behaviour that plays an important role in communication and affiliation. However, fundamental questions about why and how people mimic remain open. In this paper, we evaluate past theories from social psychology and cognitive neuroscience of why people mimic and of the brain systems that implement mimicry. By reviewing recent behavioural and neuroimaging studies on the control of mimicry by social signals, we conclude that the subtlety and sophistication of mimicry in social contexts reflect a social top-down response modulation (STORM) that increases one's social advantage, a mechanism most likely implemented by medial prefrontal cortex (mPFC). We suggest that this STORM account of mimicry is important for our understanding of social behaviour and social cognition, and provides implications for future research in autism.
Action understanding requires the left inferior frontal cortex.
Numerous studies have established that inferior frontal cortex is active when hand actions are planned, imagined, remembered, imitated, and even observed. Furthermore, it has been proposed that these activations reflect a process of simulating the observed action to allow it to be understood and thus fully perceived. However, direct evidence for a perceptual role of left inferior frontal cortex is rare, and linguistic or motor contributions to the reported activations have not been ruled out. We used repetitive transcranial magnetic stimulation (rTMS) over inferior frontal gyrus during a perceptual weight-judgement task to test the hypothesis that this region contributes to action understanding. rTMS at this site impaired judgements of the weight of a box lifted by a person, but not judgements of the weight of a bouncing ball or of stimulus duration, and rTMS at control sites had no impact. This demonstrates that the integrity of left inferior frontal gyrus is necessary for accurate perceptual judgements about other people's actions.
Social seeking declines in young adolescents
The desire to engage with others is an important motivational force throughout our lifespan. It is known that social behaviour and preferences change from childhood to adulthood, but whether this change is linked with any change in social motivation is not known. We evaluated 255 typically developing participants aged 4–20 years on a behavioural paradigm, 'Choose a Movie' (CAM). On every trial, participants had a choice between viewing social or non-social movies presented with different levels of effort (key presses/screen touches required). Hence, participants chose not only the movie they would watch but also how much effort they would make. The difference between the effort levels of the chosen and not-chosen stimuli quantifies the motivation to seek the chosen stimulus. The task could be used with all age groups with minimal adaptation, allowing comparison between groups. Results showed that children (4–8 years), older adolescents (12–16 years) and young adults (17–20 years) made more effort to look at social movies. Counterintuitively, this preference was not seen in young adolescents (around 9–12 years), giving a U-shaped developmental trajectory. We present the first evidence for non-monotonic developmental change in social motivation in typical participants.
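The effort-difference measure described in this abstract can be illustrated with a short sketch. This is not the authors' analysis code: the trial fields and the summary function below are assumptions for illustration only.

```python
def mean_effort_paid(trials, movie_type):
    """Mean extra effort accepted on trials where `movie_type` was chosen:
    effort of the chosen option minus effort of the rejected option.
    Larger values suggest stronger motivation to seek that movie type.
    (Illustrative only; field names are assumed, not taken from the paper.)"""
    paid = [t["chosen_effort"] - t["rejected_effort"]
            for t in trials if t["choice"] == movie_type]
    return sum(paid) / len(paid) if paid else 0.0


# Hypothetical data: a participant who accepts extra effort for social movies.
trials = [
    {"choice": "social", "chosen_effort": 3, "rejected_effort": 1},
    {"choice": "social", "chosen_effort": 2, "rejected_effort": 2},
    {"choice": "nonsocial", "chosen_effort": 1, "rejected_effort": 3},
]
social_score = mean_effort_paid(trials, "social")        # 1.0
nonsocial_score = mean_effort_paid(trials, "nonsocial")  # -2.0
```

A positive score for social movies, as here, would indicate that the participant was on average willing to pay extra effort to view them.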
Effects of Being Watched on Eye Gaze and Facial Displays of Typical and Autistic Individuals During Conversation
Communication with others relies on coordinated exchanges of social signals, such as eye gaze and facial displays. However, this can only happen when partners are able to see each other. Although previous studies report that autistic individuals have difficulties in planning eye gaze and making facial displays during conversation, evidence from real-life dyadic tasks is scarce and mixed. Across two studies, we investigate how the eye gaze and facial displays of typical and high-functioning autistic individuals are modulated by the belief in being seen and by the potential to show true gaze direction. Participants were recorded with an eye-tracking and video-camera system while they completed a structured Q&A task with a confederate in three social contexts: pre-recorded video, video-call and face-to-face. Typical participants gazed less at the confederate and produced more facial displays when they were being watched and when they were speaking. Contrary to our hypotheses, eye gaze and facial motion patterns in the autistic participants were overall similar to those of the typical group. This suggests that high-functioning autistic participants are able to use eye gaze and facial displays as social signals. Future studies will need to investigate to what extent this reflects spontaneous behaviour or the use of compensation strategies.
Social signalling as a framework for second-person neuroscience
Despite the recent increase in second-person neuroscience research, it is still hard to understand which neurocognitive mechanisms underlie real-time social behaviours. Here, we propose that social signalling can help us understand social interactions both at the single- and two-brain level in terms of social signal exchanges between senders and receivers. First, we show how subtle manipulations of being watched provide an important tool to dissect meaningful social signals. We then focus on how social signalling can help us build testable hypotheses for second-person neuroscience, with the examples of imitation and gaze behaviour. Finally, we suggest that linking neural activity to specific social signals will be key to fully understanding the neurocognitive systems engaged during face-to-face interactions.
Neural responses when learning spatial and object sequencing tasks via imitation
Humans often learn new things via imitation. Here we draw on studies of imitation in children to characterise the brain system(s) involved in the imitation of different sequence types, using functional magnetic resonance imaging. On each trial, healthy adult participants learned one of two rule types governing the sequencing of three pictures: a motor-spatial rule (in the spatial task) or an object-based rule (in the cognitive task). Sequences were learned via one of three demonstration types: a video of a hand selecting items in the sequence using a joystick (Hand condition), a computer display highlighting each item in order (Ghost condition), or a text-based demonstration of the sequence (Text condition). Participants then used a joystick to execute the learned sequence. Patterns of activation during demonstration observation suggest specialisation for object-based imitation in inferior frontal gyrus, specialisation for spatial sequences in anterior intraparietal sulcus (IPS), and a general preference for imitation in middle IPS. Adult behavioural performance contrasted with that of children in previous studies, indicating that they experienced more difficulty with the cognitive task, while neuroimaging results support the engagement of different neural regions when solving these tasks. Further study is needed on whether children's differential performance is related to delayed IPS maturation.
Head Nodding and Hand Coordination Across Dyads in Different Conversational Contexts
Patrick Falk, Roser Cañigueral, Jamie A Ward et al., 03 November 2023, PREPRINT (Version 1) available at Research Square [https://doi.org/10.21203/rs.3.rs-3526068/v1]
This paper explores what different patterns of head nodding and hand-movement coordination mean in conversation, by recording and analysing interpersonal coordination as it naturally occurs in social interactions. Understanding when, and at which frequencies, these movement behaviours occur can help us answer how and why we use these signals. Here we use high-resolution motion capture to examine three different types of two-person conversation involving different types of information-sharing, in order to explore the potential meaning and coordination of head-nodding and hand-motion signals. We also test whether the tendency to engage in fast or slow nodding is a fixed personality trait that differs between individuals.
Our results show coordinated slow nodding only in a picture-description task, which implies that this behaviour is not a universal signal of affiliation but is context-driven. We also find robust fast nodding in the two contexts where novel information is exchanged. For hand movement, we find hints of low-frequency coordination during one-way information sharing, but no consistent signalling during information recall. Finally, we show that nodding is consistently driven by context but is not a useful measure of individual differences in social skills. We interpret these results in terms of theories of nonverbal communication and consider how these methods will help advance automated analyses of human conversation behaviours.
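The frequency analysis of nodding behaviour described above can be sketched in a few lines. This is a minimal illustration, not the authors' pipeline: it assumes a head-pitch angle trace from motion capture (a numpy array at a known sample rate) and simply picks the dominant frequency of its power spectrum, which could then be compared against slow and fast nodding bands.

```python
import numpy as np


def dominant_nod_frequency(pitch, fs):
    """Dominant frequency (Hz) of a head-pitch trace.
    pitch: 1-D array of pitch angles; fs: sample rate in Hz.
    Illustrative sketch only: a real analysis would window the data,
    handle noise, and separate slow vs fast nodding bands."""
    detrended = pitch - np.mean(pitch)            # remove the DC offset
    power = np.abs(np.fft.rfft(detrended)) ** 2   # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(detrended), d=1.0 / fs)
    return freqs[np.argmax(power[1:]) + 1]        # skip the residual DC bin
```

For example, a participant whose head pitch oscillates mainly at 2.5 Hz would yield a dominant frequency near 2.5; where the boundary between "slow" and "fast" nodding lies is a choice of the analyst, not something fixed by this sketch.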
Autistic adults benefit from and enjoy learning via social interaction as much as neurotypical adults do
Background:
Autistic people show poor processing of social signals, i.e. information about the social world. But how do they learn via social interaction?
Methods:
68 neurotypical adults and 60 autistic adults learned about obscure items (e.g. exotic animals) over Zoom: (i) in a live video-call with the teacher, (ii) from a recorded learner-teacher interaction video and (iii) from a recorded teacher-alone video. Data were analysed via analysis of variance and multi-level regression models.
Results:
Live teaching provided the best learning condition, with no difference between groups. Enjoyment was the strongest predictor of learning: both groups enjoyed the live interaction significantly more than the other conditions and reported similar anxiety levels across conditions.
Limitations:
Some of the autistic participants were self-diagnosed; however, further analysis where these participants were excluded showed the same results. Recruiting participants over online platforms may have introduced bias in our sample. Future work should investigate learning in social contexts via diverse sources (e.g. schools).
Conclusions:
These findings advocate for a distinction between learning about the social versus learning via the social: cognitive models of autism should be revisited to consider social interaction not just as a puzzle to decode but rather as a medium through which people, including neurodiverse groups, learn about the world around them.
Trial registration: Part of this work was pre-registered before data collection (https://doi.org/10.17605/OSF.IO/5PGA).
A Simple Method for Synchronising Multiple IMUs Using the Magnetometer
This paper presents a novel method to synchronise multiple IMU (inertial measurement unit) devices using their onboard magnetometers. An external electromagnetic pulse creates a known event that is measured by the magnetometer of each IMU and in turn used to synchronise the devices. Applied to 4 IMU devices, the method decreases their de-synchronisation over a 1 hour recording from 270 ms, when using only the RTC (real-time clock), to 40 ms. It is proposed that this can be further improved to approximately 3 ms by increasing the magnetometer's sample frequency from 25 Hz to 300 Hz.
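The synchronisation idea can be sketched as follows. This is a minimal numpy illustration under stated assumptions (spike-detection threshold, data layout), not the paper's implementation: each device records magnetometer samples against its own local clock, the shared electromagnetic pulse appears as a spike in field magnitude, and the detected spike times give per-device clock offsets.

```python
import numpy as np


def detect_pulse_time(timestamps, mag_xyz, threshold_sigma=6.0):
    """Timestamp at which the magnetometer field magnitude first spikes
    above baseline; this spike is taken as the shared sync event.
    (Threshold choice is an assumption, not from the paper.)"""
    magnitude = np.linalg.norm(mag_xyz, axis=1)
    baseline = np.median(magnitude)
    noise = np.std(magnitude)
    above = np.nonzero(magnitude > baseline + threshold_sigma * noise)[0]
    if above.size == 0:
        raise ValueError("no pulse found above threshold")
    return timestamps[above[0]]


def align_clocks(devices):
    """devices: list of (timestamps, mag_xyz) per IMU.
    Returns per-device offsets (seconds): adding offset[i] to device i's
    local timestamps maps them onto the first device's clock."""
    pulse_times = [detect_pulse_time(t, m) for t, m in devices]
    return [pulse_times[0] - pt for pt in pulse_times]
```

Note that at the 25 Hz rate mentioned in the abstract, one sample period is 40 ms, which bounds how precisely the spike can be located in time; this is consistent with the paper's point that raising the magnetometer sample frequency should tighten the synchronisation.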