
    Measuring Engagement in Robot-Assisted Autism Therapy: A Cross-Cultural Study

    During occupational therapy for children with autism, it is often necessary to elicit and maintain engagement for the children to benefit from the session. Recently, social robots have been used for this; however, existing robots lack the ability to autonomously recognize the children’s level of engagement, which is necessary when choosing an optimal interaction strategy. Progress in automated engagement reading has been impeded in part due to a lack of studies on child-robot engagement in autism therapy. While it is well known that there are large individual differences in autism, little is known about how these vary across cultures. To this end, we analyzed the engagement of children (age 3–13) from two different cultural backgrounds: Asia (Japan, n = 17) and Eastern Europe (Serbia, n = 19). The children participated in a 25 min therapy session during which we studied the relationship between the children’s behavioral engagement (task-driven) and different facets of affective engagement (valence and arousal). Although our results indicate that there are statistically significant differences in engagement displays in the two groups, it is difficult to make any causal claims about these differences due to the large variation in age and behavioral severity of the children in the study. However, our exploratory analysis reveals important associations between target engagement and perceived levels of valence and arousal, indicating that these can be used as a proxy for the children’s engagement during the therapy. We provide suggestions on how this can be leveraged to optimize social robots for autism therapy, while taking into account cultural differences.
    Funding: MEXT Grant-in-Aid for Young Scientists B (grant no. 16763279); Chubu University Grant I (grant no. 27IS04I, Japan); European Union, HORIZON 2020 (grant agreement no. 701236, ENGAGEME); European Commission, Framework Programme for Research and Innovation, Marie Sklodowska-Curie Actions (Individual Fellowship); European Commission, Framework Programme for Research and Innovation, Marie Sklodowska-Curie Actions (grant agreement no. 688835, DE-ENIGMA)
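The association the study reports (perceived valence and arousal as a proxy for behavioral engagement) is, at its core, a correlation between annotation tracks. A minimal sketch of such a check is below; the arrays, their length, and the [-1, 1] scale are illustrative assumptions, not the study's actual coding scheme or data.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between two equal-length annotation tracks."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-frame annotations on a [-1, 1] scale (made-up values)
engagement = [0.2, 0.5, 0.7, 0.9, 0.4, 0.1]
valence    = [0.1, 0.4, 0.6, 0.8, 0.5, 0.0]
arousal    = [0.3, 0.2, 0.8, 0.7, 0.6, 0.2]

print(round(pearson_r(engagement, valence), 3))
print(round(pearson_r(engagement, arousal), 3))
```

A strong positive coefficient on held-out annotations would support using the affective tracks as an engagement proxy; the study's own analysis is more elaborate than this sketch.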

    Requirements for Robotic Interpretation of Social Signals “in the Wild”: Insights from Diagnostic Criteria of Autism Spectrum Disorder

    The last few decades have seen widespread advances in technological means to characterise observable aspects of human behaviour such as gaze or posture. Among others, these developments have also led to significant advances in social robotics. At the same time, however, social robots are still largely evaluated in idealised or laboratory conditions, and it remains unclear whether the technological progress is sufficient to let such robots move “into the wild”. In this paper, we characterise the problems that a social robot in the real world may face, and review the technological state of the art in terms of addressing these. We do this by considering what it would entail to automate the diagnosis of Autism Spectrum Disorder (ASD). Just as for social robotics, ASD diagnosis fundamentally requires the ability to characterise human behaviour from observable aspects. However, therapists provide clear criteria regarding what to look for. As such, ASD diagnosis is a situation that is both relevant to real-world social robotics and comes with clear metrics. Overall, we demonstrate that even with relatively clear therapist-provided criteria and current technological progress, the need to interpret covert behaviour cannot yet be fully addressed. Our discussions have clear implications for ASD diagnosis, but also for social robotics more generally. For ASD diagnosis, we provide a classification of criteria based on whether or not they depend on covert information and highlight present-day possibilities for supporting therapists in diagnosis through technological means. For social robotics, we highlight the fundamental role of covert behaviour, show that the current state of the art is unable to characterise this, and emphasise that future research should tackle this explicitly in realistic settings.

    From Robot-Assisted Intervention to New Generation of Autism Screening: an Engineering Implementation Beyond the Technical Approach

    Autism spectrum disorder (ASD) is a neurodevelopmental disorder that affects people from birth, whose symptoms appear in the early developmental period. ASD diagnosis is usually performed through several sessions of behavioral observation, exhaustive screening, and manual behavior coding. The early detection of ASD signs in naturalistic behavioral observation may be improved through Socially Assistive Robotics (SAR) and technology-based tools for automated behavior assessment. Robot-assisted tools using Child-Robot Interaction (CRI) theories have been of interest in intervention for children with Autism Spectrum Disorder (CwASD), yielding faster and more significant gains from diagnosis and therapeutic intervention when compared with classical methods. Additionally, using computer vision to analyze the child's behaviors and automated video coding to summarize the responses would help clinicians reduce the delay in ASD diagnosis. Despite the growth of research related to SAR, achieving a plausible Robot-Assisted Diagnosis (RAD) for CwASD remains a considerable challenge for the clinical and robotics communities. The work of specialists in ASD diagnosis is hard and labor-intensive, as the condition's manifestations are inherently heterogeneous and make the process more difficult. This complexity may be the main reason for the slow progress in the development of SAR for diagnostic purposes. There is also still a lack of guidelines on how to select appropriate robotic features, such as appearance, morphology, and autonomy level, and on how to design and implement the robot's role in the CRI. Thus, this Ph.D. thesis provides a comprehensive robot-assisted intervention for CwASD to assess autism risk factors for diagnostic purposes. More specifically, two studies were conducted to analyze and validate the system's performance. Through statistical data analysis, different behavior patterns of the CwASD group were identified, which suggests that these patterns can be used to detect autism risk factors through robot-based interventions. To increase the scope of this research, a theoretical conceptualization of the pervasive version of the multimodal environment was described, and a participatory design methodology was designed and implemented with the Colombian autism community, providing a set of guidelines regarding the design of a social robotic device suitable for robot-assisted intervention for CwASD.

    Emotions Recognition in people with Autism using Facial Expressions and Machine Learning Techniques: Survey

    Recently, many studies have focused on recognizing and detecting emotions in people with autism. The main goal of this paper is to survey the different studies concerned with the emotional state of people with autism. The survey includes two parts. The first focuses on studies that use facial expressions to recognize and detect emotions, as facial expressions are an important affective channel used to express different patterns of emotion. The second part focuses on technical methods such as machine learning, deep learning, and other algorithms employed to analyze and determine the facial behaviors of people with autism. To identify the optimal solution, a comparison of current emotion-detection systems is presented in this paper.

    Mapping Robots to Therapy and Educational Objectives for Children with Autism Spectrum Disorder

    The aim of this study was to increase knowledge of the therapy and educational objectives that professionals work on with children with autism spectrum disorder (ASD) and to identify corresponding state-of-the-art robots. Focus group sessions (n = 9) with ASD professionals (n = 53) from nine organisations were carried out to create an overview of objectives, followed by a systematic literature study to identify state-of-the-art robots matching these objectives. Professionals identified many ASD objectives (n = 74) in 9 different domains. State-of-the-art robots addressed 24 of these objectives in 8 domains. Robots can potentially be applied to a large scope of objectives for children with ASD. This objectives overview functions as a base to guide the development of robot interventions for these children.

    The DREAM Dataset: Supporting a data-driven study of autism spectrum disorder and robot enhanced therapy

    We present a dataset of behavioral data recorded from 61 children diagnosed with Autism Spectrum Disorder (ASD). The data was collected during a large-scale evaluation of Robot Enhanced Therapy (RET). The dataset covers over 3000 therapy sessions and more than 300 hours of therapy. Half of the children interacted with the social robot NAO supervised by a therapist. The other half, constituting a control group, interacted directly with a therapist. Both groups followed the Applied Behavior Analysis (ABA) protocol. Each session was recorded with three RGB cameras and two RGBD (Kinect) cameras, providing detailed information of children’s behavior during therapy. This public release of the dataset comprises body motion, head position and orientation, and eye gaze variables, all specified as 3D data in a joint frame of reference. In addition, metadata including participant age, gender, and autism diagnosis (ADOS) variables are included. We release this data with the hope of supporting further data-driven studies towards improved therapy methods as well as a better understanding of ASD in general.
    License: CC BY 4.0
    Project: DREAM - Development of robot-enhanced therapy for children with autism spectrum disorders
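Data of the kind described (3D body, head, and gaze variables in a joint frame of reference) lends itself to simple per-session summaries. The sketch below is hypothetical: the record layout, field names, and values are assumptions for illustration only, not the DREAM release's actual file format.

```python
from math import sqrt

# Hypothetical session record: lists of 3D points in a shared frame of
# reference, mimicking the variables described in the release (actual
# DREAM field names and file formats may differ).
session = {
    "eye_gaze": [(0.1, 0.0, 1.0), (0.2, 0.1, 0.9), (0.0, 0.0, 1.0)],
    "head_position": [(0.0, 1.2, 0.5), (0.01, 1.21, 0.5), (0.02, 1.19, 0.5)],
}

def mean_point(points):
    """Centroid of a list of 3D points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def mean_dispersion(points):
    """Average Euclidean distance from the centroid: a crude movement measure."""
    c = mean_point(points)
    return sum(sqrt(sum((p[i] - c[i]) ** 2 for i in range(3)))
               for p in points) / len(points)

for name, track in session.items():
    print(name, round(mean_dispersion(track), 4))
```

Per-session dispersion of head position or gaze is one plausible starting feature for the data-driven studies the release aims to support.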

    An Analysis of Robot-Assisted Social-Communication Instruction for Young Children with Autism Spectrum Disorders

    Social and communication deficits are a core feature of Autism Spectrum Disorders (ASD) and impact an individual's ability to be a full participant in their school environment and community. The increase in the number of students with ASD in schools, combined with the use of ineffective interventions, has created a critical need for quality social-communication instruction in schools for this population. Technology-based interventions, like robots, have the potential to greatly impact students with disabilities, including students with ASD, who tend to show increased interest and engagement in technology-based tasks and materials. While research on the use of robots with these learners is limited, these technologies have been successfully used to teach basic social-communication skills. The purpose of this study was to examine the effects of a social-communication intervention for young children with ASD that is rooted in evidence-based practices and utilizes a surrogate interactive robot as the primary interventionist. This study utilized a multiple baseline design across behaviors to determine the impact of the robot-assisted intervention on the manding, tacting, and intraverbal skills of four 3-year-old students with ASD. The researchers found that this intervention was effective in increasing the rate of all three target behaviors.
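A multiple baseline design compares response rates across phases for each behavior. The rate computation itself can be sketched as below; the per-session counts and session lengths are made-up illustrative values, not the study's data.

```python
# Hypothetical per-session (count, minutes) pairs for one target behavior
# (e.g., mands); values are illustrative only.
phases = {
    "baseline":     [(2, 10), (1, 10), (2, 10)],
    "intervention": [(6, 10), (9, 10), (11, 10)],
}

def rate_per_min(sessions):
    """Mean responses per minute across a phase's sessions."""
    return sum(count / minutes for count, minutes in sessions) / len(sessions)

for phase, sessions in phases.items():
    print(phase, round(rate_per_min(sessions), 2))
```

In a multiple baseline study, an effect is inferred visually when the rate rises after, and only after, the intervention is introduced for each behavior in turn.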

    Spontaneous Facial Behavior Computing in Human Machine Interaction with Applications in Autism Treatment

    Digital devices and computing machines such as computers, hand-held devices and robots are becoming an important part of our daily life. To build affect-aware intelligent Human-Machine Interaction (HMI) systems, scientists and engineers have aimed to design interfaces that can emulate face-to-face communication. Such HMI systems are capable of detecting and responding to users' emotions and affective states. One of the main challenges in producing such an intelligent system is to design a machine that can automatically compute spontaneous behaviors of humans in real-life settings. Since humans' facial behaviors contain important non-verbal cues, this dissertation studies facial actions and behaviors in HMI systems. The two main objectives of this dissertation are: 1) capturing, annotating and computing spontaneous facial expressions in a Human-Computer Interaction (HCI) system and releasing a database that allows researchers to study the dynamics of facial muscle movements in both posed and spontaneous data; 2) developing and deploying a robot-based intervention protocol for autism therapeutic applications and modeling facial behaviors of children with high-functioning autism in a real-world Human-Robot Interaction (HRI) system. Because of the lack of data for analyzing the dynamics of spontaneous facial expressions, my colleagues and I introduced and released a novel database called Denver Intensity of Spontaneous Facial Actions (DISFA). DISFA describes facial expressions using the Facial Action Coding System (FACS), a gold-standard technique which annotates facial muscle movements in terms of a set of defined Action Units (AUs). This dissertation also introduces an automated system for recognizing DISFA's facial expressions and the dynamics of AUs in a single image or a sequence of facial images. Results illustrate that our automated system is capable of computing AU dynamics with high accuracy (overall reliability ICC = 0.77).
    In addition, this dissertation investigates and computes the dynamics and temporal patterns of both spontaneous and posed facial actions, which can be used to automatically infer the meaning of facial expressions. Another objective of this dissertation is to analyze and compute the facial behaviors (i.e. eye gaze and head orientation) of individuals in a real-world HRI system. Because children with Autism Spectrum Disorder (ASD) show interest toward technology, we designed and conducted a set of robot-based games to study and foster the socio-behavioral responses of children diagnosed with high-functioning ASD. Computing gaze direction and head orientation patterns illustrates how individuals with ASD regulate their facial behaviors differently (compared to typically developing children) when interacting with a robot. In addition, studying the behavioral responses of participants during the different phases of this study (i.e. baseline, intervention and follow-up) reveals that, overall, a robot-based therapy setting can be a viable approach for helping individuals with autism.
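The reliability figure quoted above (ICC = 0.77) is a standard intraclass correlation between the automated system's AU intensities and human coding. One common variant, ICC(3,1) for consistency between k fixed raters over n targets, can be computed as below; the sample ratings are illustrative, not DISFA annotations.

```python
def icc_3_1(ratings):
    """ICC(3,1): two-way mixed, single-measure, consistency.

    `ratings` is an n x k matrix (rows = targets, columns = raters).
    """
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between targets
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between raters
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Illustrative AU-intensity ratings (0-5 scale) from two hypothetical coders
ratings = [[0, 1], [2, 2], [3, 4], [5, 5], [1, 1]]
print(round(icc_3_1(ratings), 3))
```

The dissertation does not specify which ICC form underlies the 0.77 figure; ICC(3,1) is shown here only as a representative computation.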