4 research outputs found

    Differences in Spontaneous Interactions of Autistic Children in an Interaction With an Adult and Humanoid Robot

    Robots are promising tools for promoting engagement of autistic children in interventions and thereby increasing the number of learning opportunities. However, designing deliberate robot behavior aimed at engaging autistic children remains challenging. Our current understanding of which interactions with a robot, or facilitated by a robot, are particularly motivating to autistic children is limited to qualitative reports with small sample sizes. Translating insights from these reports into design is difficult due to the large individual differences among autistic children in their needs, interests, and abilities. To address these issues, we conducted a descriptive study and report on an analysis of how 31 autistic children spontaneously interacted with a humanoid robot and an adult within the context of a robot-assisted intervention, as well as which individual characteristics were associated with the observed interactions. For this analysis, we used video recordings of autistic children engaged in a robot-assisted intervention, recorded as part of the DE-ENIGMA database. The results showed that the autistic children frequently and spontaneously engaged in exploratory and functional interactions with the robot, as well as in interactions with the adult that were elicited by the robot. In particular, we observed autistic children frequently initiating interactions aimed at making the robot perform a certain action. Autistic children with stronger language ability, better social functioning, and fewer autism spectrum-related symptoms initiated more functional interactions with the robot and more robot-elicited interactions with the adult. We conclude that the children's individual characteristics, in particular language ability, can be indicative of which types of interaction they are more likely to find interesting. Taking these characteristics into account in the design of deliberate robot behavior, coupled with giving autistic children more autonomy over the robot's behavior, appears promising for promoting engagement and facilitating more learning opportunities.

    Things that Make Robots Go HMMM: Heterogeneous Multilevel Multimodal Mixing to Realise Fluent, Multiparty, Human-Robot Interaction

    Fluent, multi-party, human-robot interaction calls for the mixing of deliberate conversational behaviour and reactive, semi-autonomous behaviour. In this project, we worked on a novel, state-of-the-art setup for realising such interactions. We approach this challenge from two sides. On the one hand, a dialogue manager requests deliberative behaviour and sets parameters on ongoing (semi-)autonomous behaviour. On the other hand, the robot control software needs to translate and mix these deliberative and bottom-up behaviours into consistent and coherent motion. The two need to collaborate to create behaviour that is fluent, naturally varied, and well integrated. The resulting challenge is that this behaviour needs to conform, at the same time, to high-level requirements and to the content and timing set by the dialogue manager. We tackled this challenge by designing a framework that can mix these two types of behaviour, using AsapRealizer, a Behaviour Markup Language realiser. We call this Heterogeneous Multilevel Multimodal Mixing (HMMM). Our framework is showcased in a scenario revolving around a robot receptionist that is able to interact with multiple users.
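
    As a rough illustration of the mixing idea described in the abstract above, the sketch below shows how one-shot deliberate requests from a dialogue manager could be combined with continuously running semi-autonomous behaviour through simple per-modality priorities. All class and method names are invented for illustration and do not correspond to the actual AsapRealizer, BML, or HMMM implementation.

```python
# Conceptual sketch of priority-based behaviour mixing, loosely inspired by
# the HMMM idea above. All names are hypothetical; this is NOT the
# AsapRealizer or BML API.
from dataclasses import dataclass


@dataclass
class Behaviour:
    """One behaviour contribution for a single modality."""
    modality: str   # e.g. "gaze", "speech", "gesture"
    value: float    # target parameter value for that modality
    priority: int   # deliberate requests outrank background behaviour


class BehaviourMixer:
    """Mixes deliberate requests with ongoing semi-autonomous behaviour."""

    def __init__(self) -> None:
        self.autonomous: dict[str, Behaviour] = {}   # always-on background behaviour
        self.requests: list[Behaviour] = []          # one-shot deliberate requests

    def set_autonomous(self, behaviour: Behaviour) -> None:
        # The dialogue manager can also retune parameters of ongoing behaviour.
        self.autonomous[behaviour.modality] = behaviour

    def request(self, behaviour: Behaviour) -> None:
        self.requests.append(behaviour)

    def mix(self) -> dict[str, float]:
        # Start from the background values, then let higher-priority
        # deliberate requests override their modality.
        output = {m: b.value for m, b in self.autonomous.items()}
        for req in sorted(self.requests, key=lambda b: b.priority):
            if req.priority >= self.autonomous.get(req.modality, req).priority:
                output[req.modality] = req.value
        self.requests.clear()
        return output


if __name__ == "__main__":
    mixer = BehaviourMixer()
    mixer.set_autonomous(Behaviour("gaze", value=0.0, priority=1))       # idle gaze ahead
    mixer.set_autonomous(Behaviour("breathing", value=0.5, priority=1))  # idle breathing
    # Dialogue manager asks the robot to look towards the current speaker.
    mixer.request(Behaviour("gaze", value=0.8, priority=10))
    print(mixer.mix())  # -> {'gaze': 0.8, 'breathing': 0.5}
```

    In the actual framework, the realiser additionally has to keep the resulting motion coherent in timing and form across modalities, which this toy override scheme does not attempt to capture.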

    Predictable robots for autistic children: variance in robot behaviour, idiosyncrasies in autistic children's characteristics, and child–robot engagement

    Predictability is important to autistic individuals, and robots have been suggested as a way to meet this need, as they can be programmed to be predictable as well as to elicit social interaction. The effectiveness of robot-assisted interventions designed for social skill learning presumably depends on the interplay between robot predictability, engagement in learning, and the individual differences between autistic children. To better understand this interplay, we report on a study in which 24 autistic children participated in a robot-assisted intervention. We manipulated the variance in the robot’s behaviour as a way to vary predictability, and measured the children’s behavioural engagement and visual attention, as well as their individual factors. We found that when the robot is less predictable, children continue to engage in the activity behaviourally, but may over time pay less visual attention to activity-relevant locations and instead increasingly look away from the activity. Ultimately, this could negatively influence learning, in particular for tasks with a visual component. Furthermore, severity of autistic features and expressive language ability had a significant impact on behavioural engagement. We consider our results preliminary evidence that robot predictability is an important factor in keeping children in a state where learning can occur.

    "Are you sad, Cozmo?" How humans make sense of a home robot's emotion displays

    This paper explores how humans interpret displays of emotion produced by a social robot in real-world situated interaction. Taking a multimodal conversation analytic approach, we analyze video data of families interacting with a Cozmo robot in their homes. Focusing on one happy and one sad robot animation, we study, on a turn-by-turn basis, how participants respond to audible and visible robot behavior designed to display emotion. We show how emotion animations are consequential for interactional progressivity: while displays of happiness typically move the interaction forward, displays of sadness regularly lead humans to reconsider previous actions. Furthermore, in making sense of the robot animations, people may move beyond the designer’s reported intentions, actually broadening the opportunities for their subsequent engagement. We discuss how sadness functions as an interactional "rewind button" and how the inherent vagueness of emotion displays can be deployed in design. Funding agencies: Swedish Research Council [2016-00827]