685 research outputs found

    Using Wearable Sensors to Measure Interpersonal Synchrony in Actors and Audience Members During a Live Theatre Performance

    Get PDF
    Studying social interaction in real-world settings is of increasing importance to social cognitive researchers. Theatre provides an ideal opportunity to study rich face-to-face interactions in a controlled yet natural setting. Here we collaborated with Flute Theatre to investigate interpersonal synchrony between actors and actors, actors and audience, and audience members within a live theatrical setting. Our 28 participants consisted of 6 actors and 22 audience members, 5 of whom were audience participants in the show. The performance was a compilation of acting, popular science talks and demonstrations, and an audience participation period. Interpersonal synchrony was measured using inertial measurement unit (IMU) wearable accelerometers worn on the heads of participants, while audio-visual recordings captured everything that occurred on the stage. Participants also completed post-show self-report questionnaires on their engagement with the scientists' and the actors' performances. Cross Wavelet Transform (XWT) and Wavelet Coherence Transform (WCT) analyses were conducted to extract synchrony at different frequencies and were paired with the audio-visual data. Findings revealed that XWT and WCT analyses are useful methods for extracting the multiple types of synchronous activity that occur when people perform or watch a live performance together. We also found that audience members with higher ratings on questionnaire items such as the strength of their emotional response to the performance, or how empowered they felt by the performance, showed a high degree of interpersonal synchrony with actors during the acting segments of the performance. We further found that audience members rated the scientists' performance higher than the actors' performance on questions related to their emotional response to the performance as well as how uplifted, empowered, and connected to social issues they felt.
This shows the types of potent connections audience members can have with live performances. Additionally, our findings highlight the importance of the performance context for audience engagement, in our case a theatre performance as part of public engagement with science rather than a stand-alone theatre performance. In sum, we conclude that interdisciplinary real-world paradigms are an important and understudied route to understanding in-person social interactions.
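The XWT/WCT pipeline described in the abstract can be sketched as follows. This is a minimal, self-contained illustration on synthetic head-motion traces; the sampling rate, frequency bands, Morlet parameter and smoothing windows are assumptions for the sketch, not the study's actual analysis settings:

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    """Continuous wavelet transform with a Morlet wavelet (minimal sketch)."""
    out = np.empty((len(scales), len(x)), dtype=complex)
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1)
        # complex oscillation under a Gaussian envelope, scaled by 1/sqrt(s)
        wav = np.exp(1j * w0 * t / s - 0.5 * (t / s) ** 2) / np.sqrt(s)
        out[i] = np.convolve(x, wav, mode="same")
    return out

def boxcar_rows(a, wins):
    """Smooth each scale row in time with a boxcar roughly one scale wide."""
    out = np.empty_like(a)
    for i, w in enumerate(wins):
        out[i] = np.convolve(a[i], np.ones(w) / w, mode="same")
    return out

def xwt_wct(x, y, scales):
    Wx, Wy = morlet_cwt(x, scales), morlet_cwt(y, scales)
    xwt = Wx * np.conj(Wy)                        # cross wavelet transform
    wins = [max(3, int(round(s))) for s in scales]
    num = np.abs(boxcar_rows(xwt, wins)) ** 2
    den = boxcar_rows(np.abs(Wx) ** 2, wins) * boxcar_rows(np.abs(Wy) ** 2, wins)
    return xwt, num / den                         # coherence lies in [0, 1]

# Synthetic "head motion": two noisy traces sharing a 1 Hz rhythm (100 Hz sampling)
fs = 100
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
a = np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(t.size)
b = np.sin(2 * np.pi * t + 0.5) + 0.3 * rng.standard_normal(t.size)
freqs = np.array([1.0, 2.0, 4.0])                # frequencies of interest, Hz
scales = fs * 6.0 / (2 * np.pi * freqs)          # Morlet scale per frequency
xwt, wct = xwt_wct(a, b, scales)
# coherence at the shared 1 Hz rhythm stays high across the recording,
# even though the two traces are phase-shifted
```

The cross wavelet transform exposes common power and relative phase at each time-frequency point, while the coherence normalises it so that a shared rhythm is detected regardless of amplitude differences between the two wearers.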

    onsetsync: An R Package for Onset Synchrony Analysis

    Get PDF

    Automatic Context-Driven Inference of Engagement in HMI: A Survey

    Full text link
    An integral part of seamless human-human communication is engagement, the process by which two or more participants establish, maintain, and end their perceived connection. To develop successful human-centered human-machine interaction applications, automatic engagement inference is therefore one of the tasks required to achieve engaging interactions between humans and machines and to make machines attuned to their users, thereby enhancing user satisfaction and technology acceptance. Several factors contribute to engagement state inference, including the interaction context and the interactants' behaviours and identities. Indeed, engagement is a multi-faceted and multi-modal construct that requires high accuracy in the analysis and interpretation of contextual, verbal and non-verbal cues. Thus, developing an automated and intelligent system that accomplishes this task has so far proven challenging. This paper presents a comprehensive survey of previous work on engagement inference for human-machine interaction, covering interdisciplinary definitions, engagement components and factors, publicly available datasets, ground-truth assessment, and the most commonly used features and methods, serving as a guide for the development of future human-machine interaction interfaces with reliable context-aware engagement inference capability. An in-depth review across embodied and disembodied interaction modes, and an emphasis on the interaction context in which engagement perception modules are integrated, set the presented survey apart from existing surveys.

    CPG-based Controllers can Trigger the Emergence of Social Synchrony in Human-Robot Interactions

    Get PDF
    Synchronization is an indissociable part of social interactions between humans, especially in gestural communication. With the emergence of social robotics and assistance robots, it becomes paramount for robots to be socially accepted and for humans to be able to connect with them. As a consequence, synchronization mechanisms should be inherent to any robot controller, allowing adaptation to the interacting partner's rhythm as needed. In this paper, plastic Central Pattern Generators (CPGs) have been implemented in the joints of the robot Pepper, which has to learn to wave back at a human partner. Results show that the CPG-based controller leads to adaptive waving synchronized with the human partner, thus demonstrating that CPG-based controllers can achieve synchronization.
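As a concrete illustration of the mechanism, a plastic CPG can be modelled as an adaptive-frequency Hopf oscillator whose intrinsic frequency is itself a state variable and drifts toward the rhythm of a forcing signal. This is a generic model from the CPG literature, not a reproduction of the controller used on Pepper; the gains, frequencies and integration step below are assumptions for the sketch:

```python
import numpy as np

def adaptive_hopf(forcing, dt, omega0, gamma=8.0, mu=1.0, eps=2.0):
    """Adaptive-frequency Hopf oscillator (generic plastic-CPG model).
    `omega` adapts online so the limit cycle entrains to the forcing."""
    x, y, omega = 1.0, 0.0, omega0
    history = np.empty(len(forcing))
    for k, F in enumerate(forcing):
        r2 = x * x + y * y
        # Hopf dynamics: attract to a limit cycle of radius sqrt(mu),
        # rotate at omega, and perturb with the external rhythm F
        dx = gamma * (mu - r2) * x - omega * y + eps * F
        dy = gamma * (mu - r2) * y + omega * x
        # frequency plasticity: nudge omega based on the forcing's phase
        domega = -eps * F * y / max(np.sqrt(r2), 1e-9)
        x, y, omega = x + dt * dx, y + dt * dy, omega + dt * domega
        history[k] = omega
    return history

dt = 0.001
t = np.arange(0, 60, dt)
target = 2 * np.pi * 1.5                 # partner waves at 1.5 Hz
omega0 = 2 * np.pi * 1.2                 # CPG starts at 1.2 Hz
omegas = adaptive_hopf(np.sin(target * t), dt, omega0)
# over the simulated minute, the oscillator's frequency drifts toward
# the partner's 1.5 Hz rhythm, producing synchronized waving
```

Because the adaptation acts on the frequency itself rather than only the phase, the oscillator remains entrained even after the forcing is removed, which is one reason adaptive CPGs are attractive for rhythmic human-robot interaction.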

    Towards modelling group-robot interactions using a qualitative spatial representation

    Get PDF
    This paper tackles the problem of finding a suitable qualitative representation for robots to reason about activity spaces where they carry out tasks while interacting with a group of people. The Qualitative Spatial model for Group Robot Interaction (QS-GRI) defines Kendon-formations depending on: (i) the relative location of the robot with respect to the other individuals involved in the interaction; (ii) the individuals' orientation; (iii) the shared peri-personal distance; and (iv) the role of the individuals (observer, main character or interactive). The evolution of Kendon-formations is also studied, that is, how one formation is transformed into another. These transformations can depend on the role that the robot has and on the number of people involved.
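To make the idea of a qualitative formation model concrete, the following toy classifier labels a two-agent arrangement using only relative distance and heading, echoing Kendon's vis-a-vis, L-shape and side-by-side arrangements. The thresholds and radius are illustrative assumptions, not the QS-GRI definitions:

```python
import math

def classify_pair(p1, h1, p2, h2, social_radius=1.2):
    """Toy qualitative classifier for a two-agent Kendon arrangement.
    p1, p2: (x, y) positions; h1, h2: heading angles in radians.
    Thresholds are illustrative, not the QS-GRI model's regions."""
    if math.dist(p1, p2) > 2 * social_radius:
        return "no formation"          # outside shared peri-personal space
    # heading difference folded into [0, 180] degrees
    diff = abs((math.degrees(h1 - h2) + 180) % 360 - 180)
    if diff > 135:
        return "vis-a-vis"             # roughly facing each other
    if diff > 45:
        return "L-shape"               # roughly perpendicular
    return "side-by-side"              # facing the same way

print(classify_pair((0, 0), 0.0, (1.0, 0), math.pi))   # prints "vis-a-vis"
```

A full model would also need the individuals' roles and whether each agent actually faces the shared space, but even this skeleton shows how metric sensor data collapses into a small set of qualitative states a robot can reason over.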