    A Pilot Study on Facial Expression Recognition Ability of Autistic Children Using Ryan, a Rear-Projected Humanoid Robot

    Rear-projected robots use computer graphics to create facial animations and project them onto a mask, displaying the robot's facial cues and expressions. Robots of this type are becoming commercially available, though more research is required to understand how they can be used effectively as socially assistive robotic agents. This paper presents the results of a pilot study comparing the facial expression recognition abilities of children with Autism Spectrum Disorder (ASD) and typically developing (TD) children using a rear-projected humanoid robot called Ryan. Six children with ASD and six TD children participated in this research, in which Ryan showed them six basic expressions (i.e., anger, disgust, fear, happiness, sadness, and surprise) at different intensity levels. Participants were asked to identify the expressions portrayed by Ryan. The results of our study show no general impairment in the expression recognition ability of the ASD group compared to the TD control group; however, both groups showed deficiencies in identifying disgust and fear. Increasing the intensity of Ryan's facial expressions significantly improved recognition accuracy, and both groups recognized the expressions demonstrated by Ryan with high average accuracy.

    Studying Facial Expression Recognition and Imitation Ability of Children with Autism Spectrum Disorder in Interaction with a Social Robot

    Children with Autism Spectrum Disorder (ASD) have limited abilities in recognizing non-verbal elements of social interactions such as facial expressions [1]. They also show deficiencies in imitating facial expressions in social situations. In this Master's thesis, we focus on studying the ability of children with ASD to recognize facial expressions and to imitate them using a rear-projected expressive humanoid robot called Ryan. Recent studies show that social robots such as Ryan have great potential for autism therapy. We designed and conducted three studies: first, to evaluate the ability of children with ASD to recognize facial expressions presented to them through different methods (i.e., robot versus video), and second, to determine the effect of various methods on the facial expression imitation performance of children with ASD using Reinforcement Learning (RL). In the first study, we compared the facial expression recognition ability of children with ASD with that of Typically Developing (TD) children using Ryan. Overall, the results did not show a significant difference between the performance of the ASD and TD groups in expression recognition. The study revealed a significant effect of increasing the expression intensity level on recognition accuracy, and showed that both groups performed significantly worse in recognizing the fear and disgust expressions. The second study focused on the effect of context on the facial expression recognition ability of children with ASD compared to their TD peers. Its results showed higher overall performance for TD children than for the ASD group. Within the TD group, fear, and within the ASD group, sadness, were recognized with the lowest accuracy relative to the average accuracy of the other expressions. This study did not show a significant difference between the groups; however, background category had a significant effect in both groups: recognition accuracy was significantly higher for negative backgrounds than for positive backgrounds at the 20% intensity level for the fear and sadness expressions. In the third study, we designed an active learning method using an RL algorithm to identify and adapt to individual differences in expression imitation under different conditions. We implemented the RL method first to identify the effective imitation method based on each individual's performance and preferences, and second to make online adaptations and adjustments based on the effective method for each individual. The results showed that the active learning method could successfully identify and adjust the session based on each participant's strengths and preferences, and that each participant responded differently to each method, both in general and for each expression.
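The thesis abstract does not detail the RL algorithm, but the adaptive loop it describes can be sketched as a simple bandit-style learner that scores each presentation method by the imitation performance it elicits. The method names, the 0-to-1 reward scale, and the epsilon-greedy policy below are illustrative assumptions, not the thesis's actual implementation.

```python
import random

class MethodSelector:
    """Bandit-style selector over imitation-presentation methods.

    Hypothetical sketch: method names, reward scale (a 0..1 imitation
    score), and the epsilon-greedy policy are assumptions for
    illustration, not the algorithm used in the thesis.
    """

    def __init__(self, methods, epsilon=0.1, learning_rate=0.3):
        self.q = {m: 0.0 for m in methods}  # running value estimate per method
        self.epsilon = epsilon              # exploration probability
        self.lr = learning_rate

    def choose(self):
        # Explore occasionally; otherwise exploit the best-scoring method.
        if random.random() < self.epsilon:
            return random.choice(list(self.q))
        return max(self.q, key=self.q.get)

    def update(self, method, reward):
        # Move the value estimate toward the observed imitation score.
        self.q[method] += self.lr * (reward - self.q[method])
```

In a session, such a learner would alternate between probing each presentation method and settling on whichever yields the best imitation scores for that particular child, which matches the per-individual adaptation the abstract describes.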

    Nyku: A Social Robot for Children With Autism Spectrum Disorders

    The continued growth of Autism Spectrum Disorders (ASD) around the world has spurred the development of new therapeutic methods to increase the positive outcomes of an ASD diagnosis. It is widely agreed that early detection and intervention lead to greatly improved outcomes for individuals living with these disorders. Among these new therapeutic methods, Robot-Assisted Therapy (RAT) has become an active area of study. Recent work has shown that high-functioning children with ASD have an affinity for interacting with robots rather than humans, likely because a robotic system presents a less complex set of communication modes than the complex non-verbal communication present in human-to-human interaction. As such, the Computer Vision and Robotics Lab at the University of Denver has embarked on developing a social robot for children with ASD. This thesis presents the design of this social robot, Nyku (Figure 1). It begins with an investigation of the needs of children with ASD, what existing therapies help with, and what roles, if any, a robot can play in these treatment plans. From the literature examined, it is clear that robots designed specifically for children with ASD share a core set of goals, despite the varied nature of the disorder's spectrum. These goals aim to reduce the stress of non-verbal communication that may occur during standard therapies, and to provide capabilities that reinforce typical areas of weakness in the social repertoire of a person with ASD, such as posture mimicry and eye contact. A goal of this thesis is to show the methodology behind arriving at these design goals so that future designers may follow and improve upon them. Nyku's hardware and software design requirements draw from this foundation. Using this needs-first design methodology allows for informed design, such that the final product is actually useful to the ASD population.
In this work, the information collected is used to design the mechanical components of Nyku: its Body, Neck & Head, and Omni-wheel Base. As with all robots, the mechanical needs in turn generate electronics requirements, which are also presented. To tie these systems together, the control architecture is then implemented. Notably, this thesis results in a novel kinematic model of the spherical manipulation system in the Omni-wheel Base. This solution is presented in detail, along with the testing conducted to ensure the model's accuracy. To conclude, overall progress on Nyku is summarized alongside suggestions for continuing the work. Here, the engineering work is compared against the design goals it aims to fulfill, to ensure that the work has stayed on track; this examination also maps out the future steps needed to optimize Nyku for reliable performance during therapeutic sessions. Finally, a therapeutic plan is proposed given Nyku's hardware capabilities and the needs of children with ASD, against the background of modern therapeutic methods.
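The abstract names a novel kinematic model for the Omni-wheel Base without giving its details. As background, the generic inverse-kinematic relation for omni-wheels driving a sphere can be sketched as follows: the sphere's surface velocity at each contact point, projected onto the wheel's drive direction, must equal that wheel's rim speed. All geometry in the sketch (contact points, drive directions, radii) is invented for illustration and is not Nyku's actual configuration.

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    """Dot product of two 3-vectors."""
    return sum(x * y for x, y in zip(a, b))

def wheel_speeds(omega_sphere, contacts, drive_dirs, wheel_radius):
    """Inverse kinematics for omni-wheels rolling on a sphere.

    The sphere-surface velocity at each contact point, omega x r_contact,
    projected onto the wheel's drive direction, equals the wheel's rim
    speed (wheel angular rate * wheel_radius). Geometry here is a
    hypothetical stand-in, not Nyku's measured configuration.
    """
    speeds = []
    for r_c, d in zip(contacts, drive_dirs):
        v_contact = cross(omega_sphere, r_c)  # surface velocity at contact
        speeds.append(dot(v_contact, d) / wheel_radius)
    return speeds

# Example: one wheel touching a unit sphere at (1, 0, 0), driving
# tangentially along +y, with the sphere spinning about z at 2 rad/s.
example = wheel_speeds((0.0, 0.0, 2.0), [(1.0, 0.0, 0.0)],
                       [(0.0, 1.0, 0.0)], wheel_radius=0.5)
```

Stacking this relation for three wheels yields a 3x3 Jacobian mapping the sphere's angular velocity to the three wheel rates, which is the usual starting point for controlling a ball-manipulating base of this kind.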