
    An evaluation of an adaptive learning system based on multimodal affect recognition for learners with intellectual disabilities

    Artificial intelligence tools for education (AIEd) have been used to automate the provision of learning support to mainstream learners. One of the most innovative approaches in this field is the use of data and machine learning to detect a student's affective state, in order to move them out of negative states that inhibit learning and into positive states such as engagement. Despite their obvious potential to provide the personalisation that would give extra support to learners with intellectual disabilities, little work on AIEd systems that use affect recognition currently addresses this group. Our system used multimodal sensor data and machine learning first to identify three affective states linked to learning (engagement, frustration, boredom) and second to determine the presentation of learning content so that the learner is maintained in an optimal affective state and the rate of learning is maximised. To evaluate this adaptive learning system, 67 participants aged between 6 and 18 years, each acting as their own control, took part in a series of sessions using the system. Sessions alternated between using both affect detection and learning achievement to drive the selection of learning content (intervention) and using learning achievement alone (control). Lack of boredom was the state with the strongest link to achievement, and both frustration and engagement were positively related to achievement. There was significantly more engagement and less boredom in intervention sessions than in control sessions, but no significant difference in achievement. These results suggest that engagement does increase when activities are tailored to the personal needs and emotional state of the learner, and that the system was promoting affective states that in turn promote learning. However, longer exposure is necessary to determine the effect on learning.
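
    The abstract describes the system only at a high level (multimodal affect detection feeding the selection of learning content). The Python below is a purely illustrative sketch of how such an affect-driven selection loop could be wired together, not the authors' implementation; all names (Affect, Activity, classify_affect, select_next_activity) and the difficulty heuristic are assumptions introduced here for illustration.

    # Hypothetical sketch of an affect-aware content-selection loop.
    # All identifiers and the difficulty heuristic are assumptions for
    # illustration; the paper does not specify its implementation.
    from dataclasses import dataclass
    from enum import Enum
    from typing import Sequence


    class Affect(Enum):
        ENGAGEMENT = "engagement"
        FRUSTRATION = "frustration"
        BOREDOM = "boredom"


    @dataclass
    class Activity:
        name: str
        difficulty: int  # e.g. 1 (easiest) .. 5 (hardest)


    def classify_affect(sensor_features: Sequence[float]) -> Affect:
        """Placeholder for a trained multimodal affect classifier."""
        raise NotImplementedError("substitute a trained model here")


    def select_next_activity(affect: Affect, achievement: float,
                             activities: list[Activity]) -> Activity:
        """Choose content to keep the learner in a productive state:
        ease off when frustrated, raise the challenge when bored,
        otherwise follow recent achievement (assumed in 0..1)."""
        if affect is Affect.FRUSTRATION:
            target = 1  # drop difficulty to reduce frustration
        elif affect is Affect.BOREDOM:
            target = 5  # raise difficulty to re-engage
        else:
            target = max(1, min(5, round(achievement * 5)))
        return min(activities, key=lambda a: abs(a.difficulty - target))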

    “Can I be more social with a chatbot?”: social connectedness through interactions of autistic adults with a conversational virtual human

    The development of AI systems that function as communicators (i.e. conversational agents) has opened the opportunity to rethink AI's place within people's social worlds and the process of sense-making between humans and machines, especially for people with autism, who may stand to benefit from such interactions. The current study explores the interactions of six autistic and six non-autistic adults with a conversational virtual human (CVH/conversational agent/chatbot) over 1-4 weeks. Using semi-structured interviews, conversational chatlogs and post-study online questionnaires, we present findings from a thematic analysis related to human-chatbot interaction, chatbot humanization/dehumanization and the chatbot's autistic/non-autistic traits. Findings suggest that although autistic users are willing to converse with the chatbot, there are no indications of relationship development with it. Our analysis also highlighted autistic users' expectations of empathy from the chatbot. Non-autistic users, in contrast, tried to stretch the conversational agent's abilities by continuously testing its conversational and cognitive skills. Moreover, non-autistic users were content with Kuki's basic conversational skills, whereas autistic participants expected more in-depth conversations, as they trusted Kuki more. The findings offer insights into a new human-chatbot interaction model specifically for users with autism, with a view to supporting them through companionship and social connectedness.

    Affective Communication for Socially Assistive Robots (SARs) for Children with Autism Spectrum Disorder: A Systematic Review

    Research on affective communication for socially assistive robots has been conducted to enable physical robots to perceive, express, and respond to emotions. However, the use of affective computing in social robots has been limited, especially when social robots are designed for children, and particularly for those with autism spectrum disorder (ASD). Social robots are based on cognitive-affective models, which allow them to communicate with people following social behaviors and rules. However, interactions between a child and a robot may differ from those with an adult, or may change when the child has an emotional deficit. In this study, we systematically reviewed studies related to computational models of emotions for children with ASD. We used the Scopus, WoS, Springer, and IEEE Xplore databases to answer research questions related to the definition, interaction, and design of computational models supported by theoretical psychology approaches from 1997 to 2021. Our review found 46 articles; not all of the studies considered children or those with ASD. This research was funded by VRIEA-PUCV, grant number 039.358/202