Using Robots as Therapeutic Agents to Teach Children with Autism Recognize Facial Expression
Background: Recognizing and mimicking facial expressions are important cues for building rapport and relationships in human-human communication. Individuals with Autism Spectrum Disorder (ASD) often have deficits in recognizing and mimicking social cues such as facial expressions. In the last decade, several studies have shown that individuals with ASD exhibit superior engagement toward objects, and particularly toward robots (both humanoid and non-humanoid). However, the majority of these studies have focused on robot appearance and engineering design concepts, and very little research has examined the effectiveness of robots in therapeutic and treatment applications. In fact, the critical question of how robots can help individuals with autism practice and learn social communication skills and apply them in their daily interactions has not yet been addressed.
Objective: In a multidisciplinary research study, we explored whether robot-based therapeutic sessions can be effective and to what extent they can improve the social experiences of children with ASD. We developed and executed a robot-based, multi-session therapeutic protocol consisting of three phases (baseline, intervention, and human-validation sessions) that can serve as a treatment mechanism for individuals with ASD.
Methods: We recruited seven children (2 female, 5 male), aged 6-13 years (mean = 10.14 years), diagnosed with High Functioning Autism (HFA). We employed NAO, an autonomous programmable humanoid robot, to interact with the children in a series of social games over several sessions. We captured all visual and audio communication between NAO and the child using multiple cameras. All capturing devices were connected to a monitoring system outside the study room, where a coder observed and annotated the child's responses online. In every session, NAO asked the child to identify the prototypic facial expression (i.e., happy, sad, angry, or neutral) shown in five different photos. In the baseline sessions we assessed each child's prior knowledge of emotion and facial expression concepts. In the intervention sessions, NAO provided verbal feedback (if needed) to help the child identify the facial expression. After the intervention sessions, we included two human-validation sessions (with no feedback) to evaluate how well the child could apply the learned concepts when NAO was replaced with a human.
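As an illustration of how a session could be scored from the coder's annotations, the following minimal Python sketch computes a per-session recognition rate as the percentage of the five photos identified correctly. The function and data names are hypothetical; this is not the authors' actual coding software.

```python
# Illustrative sketch only: hypothetical per-session scoring from coder annotations.
EXPRESSIONS = {"happy", "sad", "angry", "neutral"}

def session_recognition_rate(ground_truth, child_answers):
    """Percentage of photos whose expression the child named correctly.

    ground_truth  -- labels of the photos shown by NAO, e.g. ["happy", "sad", ...]
    child_answers -- coder-annotated labels of the child's responses
    """
    assert len(ground_truth) == len(child_answers)
    correct = sum(
        1 for truth, answer in zip(ground_truth, child_answers)
        if truth in EXPRESSIONS and truth == answer
    )
    return 100.0 * correct / len(ground_truth)

# Example: 4 of 5 photos identified correctly -> 80.0
print(session_recognition_rate(
    ["happy", "sad", "angry", "neutral", "happy"],
    ["happy", "sad", "angry", "sad", "happy"],
))
```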
Results: The following table presents the mean and standard deviation (STD) of facial expression recognition rates for all subjects across the three phases of our study. In our experiment, six out of seven subjects had a baseline recognition rate lower than 80%, and we observed high variation (STD) between subjects.
Facial Expression Recognition Rate (%)
             Baseline        Intervention    Human-Validation
Mean (STD)   69.52 (36.28)   85.83 (20.54)   94.28 (15.11)
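For clarity, the sketch below shows how per-subject phase rates could be summarized as mean (STD), as in the table above. The per-subject values are placeholders, not the study data, and the paper may use a sample rather than population STD.

```python
# Illustrative sketch only: summarizing per-subject rates per phase as mean (STD).
# The rate lists below are hypothetical placeholders, not the reported data.
import statistics

phase_rates = {
    "Baseline":         [20.0, 55.0, 60.0, 75.0, 80.0, 95.0, 100.0],
    "Intervention":     [60.0, 75.0, 85.0, 90.0, 90.0, 100.0, 100.0],
    "Human-Validation": [80.0, 90.0, 95.0, 95.0, 100.0, 100.0, 100.0],
}

for phase, rates in phase_rates.items():
    mean = statistics.mean(rates)
    std = statistics.pstdev(rates)  # population STD; sample STD would use stdev()
    print(f"{phase}: {mean:.2f} ({std:.2f})")
```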
Conclusions: The results demonstrate the effectiveness of NAO for teaching and improving facial expression recognition (FER) skills in children with ASD. More specifically, the low baseline FER rate (69.52%) with high variability (STD = 36.28) shows that, overall, participants had difficulty recognizing expressions. The results of the intervention phase confirm that NAO can reliably teach children to recognize facial expressions (higher accuracy with lower STD). Interestingly, in the human-validation phase the children recognized the basic facial expressions with even higher accuracy (94.28%) and very limited variability (STD = 15.11). These results indicate that robot-based feedback and intervention with a customized protocol can improve the learning capabilities and social skills of children with ASD.