
    Musical Robots For Children With ASD Using A Client-Server Architecture

    Presented at the 22nd International Conference on Auditory Display (ICAD-2016). People with Autistic Spectrum Disorders (ASD) are known to have difficulty recognizing and expressing emotions, which affects their social integration. Leveraging and integrating recent advances in interactive robots and music therapy, we have designed musical robots that can facilitate social and emotional interactions of children with ASD. The robots communicate with children with ASD while detecting their emotional states and physical activities, and then generate real-time sonification based on the interaction data. Because we envision multiple robots being used with children, we have adopted a client-server architecture: each robot and sensing device acts as a terminal, while the sonification server processes all the data and generates harmonized sonification. After describing our goals for the use of sonification, we detail the system architecture and ongoing research scenarios. We believe that the present paper offers a new perspective on sonification applications for assistive technologies.
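    The terminal/server split described in this abstract can be made concrete with a small sketch. The paper does not publish its implementation, so the class names, data fields, and the pitch/tempo mapping below are all hypothetical illustrations of the described architecture, not the authors' code:

```python
# Hypothetical sketch of the client-server split described above:
# each robot or sensing device acts as a terminal that reports interaction
# data, and a central server merges all streams into one sonification decision.

from dataclasses import dataclass

@dataclass
class TerminalReading:
    terminal_id: str     # which robot or sensing device sent the data
    arousal: float       # detected emotional arousal, 0.0-1.0 (assumed scale)
    activity: float      # physical activity level, 0.0-1.0 (assumed scale)

class SonificationServer:
    """Aggregates readings from all terminals and maps them to sound."""

    def __init__(self) -> None:
        self.latest: dict[str, TerminalReading] = {}

    def receive(self, reading: TerminalReading) -> None:
        # Keep only the most recent reading per terminal.
        self.latest[reading.terminal_id] = reading

    def harmonized_parameters(self) -> dict[str, float]:
        """Average across terminals, then map to pitch/tempo (assumed mapping)."""
        if not self.latest:
            return {"pitch_hz": 220.0, "tempo_bpm": 60.0}
        n = len(self.latest)
        arousal = sum(r.arousal for r in self.latest.values()) / n
        activity = sum(r.activity for r in self.latest.values()) / n
        # Illustrative mapping: calmer states -> lower pitch and slower tempo.
        return {"pitch_hz": 220.0 + 440.0 * arousal,
                "tempo_bpm": 60.0 + 80.0 * activity}

server = SonificationServer()
server.receive(TerminalReading("robot-1", arousal=0.5, activity=0.25))
server.receive(TerminalReading("robot-2", arousal=0.5, activity=0.75))
print(server.harmonized_parameters())  # {'pitch_hz': 440.0, 'tempo_bpm': 100.0}
```

    In a real deployment the `receive` calls would arrive over the network from each robot, but the aggregation logic is the architectural point: terminals stay simple while the server owns the harmonized output.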

    Feasibility of a smartphone application to identify young children at risk for Autism Spectrum Disorder in a low-income community setting in South Africa

    Introduction and aims: More than 90% of children with Autism Spectrum Disorder (ASD) live in low- and middle-income countries (LMIC), where there is a great need for culturally appropriate, scalable and effective early identification and intervention tools. Smartphone technology and applications (‘apps’) may potentially play an important role in this regard. The Autism&Beyond iPhone App was designed as a potential screening tool for ASD risk in children aged 12-72 months. Here we investigated the technical feasibility and cultural acceptability of a smartphone app to determine risk for ASD in children aged 12-72 months in a naturalistic, low-income South African community setting.

    Methodology: 37 typically-developing African children and their parents/carers were recruited from community centres in Khayelitsha Township, Cape Town, South Africa. We implemented a mixed-methods design, collecting both quantitative and qualitative data from participants in two stages. In stage 1, we collected quantitative data. With appropriate ethics approval and consent, parents completed a short technology questionnaire about their familiarity with and access to smartphones, the internet and apps, followed by electronic iPhone-based demographic and ASD-related questionnaires. Next, children were shown three short videos of 30 s each and a mirror stimulus on a study smartphone. The smartphone's front-facing (“selfie”) camera recorded video of the child's facial expressions and head movement. Automated computer algorithms quantified positive emotions and time attending to stimuli. We validated the automatic coding by a) comparing the computer-generated analysis to human coding of facial expressions in a random sample (N=9), and b) comparing automated analysis of the South African data (N=33) with a matched American sample (N=33). In stage 2, a subset of families were invited to participate in focus group discussions to provide qualitative data on the accessibility, acceptability, and cultural appropriateness of the app in their local community.

    Results: Most parents (64%) owned a smartphone, of which all (100%) were Android based, and many used apps (45%). Human-automated coding showed excellent correlation for positive emotion (ICC=0.95, 95% CI 0.81-0.99), and no statistically significant differences were observed between the South African and American samples in % time attending to the video stimuli. South African children, however, smiled less at the Toys&Rhymes video (SA mean (SD) = 14% (24); USA mean (SD) = 31% (34); p=0.05) and the Bunny video (SA mean (SD) = 12% (17); USA mean (SD) = 30% (27); p=0.006). Analysis of focus group data indicated that parents/carers found the app relatively easy to use and would recommend it to others in their community provided the app and data transfer were free.

    Conclusion: The results from this pilot study suggest the app is technically accurate, accessible and culturally acceptable to families from a low-resource environment in South Africa. Given the differences in positive emotional response between the groups, careful consideration should be given to identifying suitable stimuli if % time smiling is to be used as a global marker for autism risk across cultures and environments.
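    The human-versus-automated validation in this study is reported as an intraclass correlation coefficient. The abstract does not say which ICC variant or software was used, so the following is only a sketch of one common variant, ICC(2,1) (two-way random effects, absolute agreement, single rater), with made-up ratings:

```python
def icc2_1(ratings: list[list[float]]) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `ratings` has one row per subject; each row holds the k raters' scores
    (here k=2 would correspond to human coder vs automated algorithm).
    """
    n = len(ratings)            # number of subjects
    k = len(ratings[0])         # number of raters
    grand = sum(sum(row) for row in ratings) / (n * k)
    subj_means = [sum(row) / k for row in ratings]
    rater_means = [sum(row[j] for row in ratings) / n for j in range(k)]

    # Partition the total sum of squares into subjects, raters, and error.
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in subj_means)
    ss_cols = n * sum((m - grand) ** 2 for m in rater_means)
    ss_err = ss_total - ss_rows - ss_cols

    msr = ss_rows / (n - 1)               # between-subjects mean square
    msc = ss_cols / (k - 1)               # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))    # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

print(icc2_1([[1, 1], [2, 2], [3, 3]]))  # perfect agreement -> 1.0
# Near-agreement example (hypothetical % smiling scores, human vs automated):
print(round(icc2_1([[14, 15], [30, 28], [12, 13], [40, 41]]), 3))
```

    With two raters whose scores track each other closely relative to the between-child spread, the coefficient lands near 1, which is the pattern the study reports (ICC=0.95).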

    A Pilot Study on Facial Expression Recognition Ability of Autistic Children Using Ryan, a Rear-Projected Humanoid Robot

    Rear-projected robots use computer graphics technology to create facial animations and project them onto a mask to show the robot's facial cues and expressions. These types of robots are becoming commercially available, though more research is required to understand how they can be effectively used as socially assistive robotic agents. This paper presents the results of a pilot study comparing the facial expression recognition abilities of children with Autism Spectrum Disorder (ASD) with those of typically developing (TD) children, using a rear-projected humanoid robot called Ryan. Six children with ASD and six TD children participated in this research, in which Ryan showed them six basic expressions (i.e. anger, disgust, fear, happiness, sadness, and surprise) at different intensity levels. Participants were asked to identify the expressions portrayed by Ryan. The results of our study show no general impairment in the expression recognition ability of the ASD group compared to the TD control group; however, both groups showed deficiencies in identifying disgust and fear. Increasing the intensity of Ryan's facial expressions significantly improved expression recognition accuracy, and both groups recognized the expressions demonstrated by Ryan with high average accuracy.

    Constraints in the design of activities focusing on emotion recognition for children with ASD using robotic tools

    The Robótica-Autismo project, presented in this paper, aims to identify the main aspects to be considered when working with robots and children with ASD (Autism Spectrum Disorders). Several constraints are identified, such as the type of robot, the type of skills that should be developed, the criteria for inclusion in and exclusion from the target group, which procedures should be followed during the sessions, and how to analyze the obtained results. In the end, a well-established methodology is achieved in order to accomplish the goal of using a robot as a mediator between children with ASD and other human partners.

    The authors are grateful to the Portuguese Foundation for Science and Technology, FCT - Fundação para a Ciência e a Tecnologia, for funding through the R&D project reference RIPD/ADAlI09407/2009 and the scholarship SFRH/BD/71600/2010. This work is also supported by a QREN initiative, from UE/FEDER (Fundo Europeu de Desenvolvimento Regional) funds through the "Programa Operacional Factores de Competitividade - COMPETE".

    Evaluation of a robot-assisted therapy for children with autism and intellectual disability

    It is well established that robots can be suitable assistants in the care and treatment of children with Autism Spectrum Disorder (ASD). However, the majority of the research focuses on stand-alone interventions and high-functioning individuals, and success is evaluated via qualitative analysis of videos recorded during the interaction. In this paper, we present a preliminary evaluation of our ongoing research on integrating robot-assisted therapy into the treatment of children with ASD and Intellectual Disability (ID), which is the most common case. The experiment described here integrates robot-assisted imitation training into the standard treatment of six hospitalised children with various levels of ID, who were engaged by a robot in imitative tasks and whose progress was assessed via a quantitative psychodiagnostic tool. Results show success in the training and encourage the use of a robotic assistant in the care of children with ASD and ID, with the exception of those with profound ID, who may need a different approach.

    SoundFields: A Virtual Reality Game Designed to Address Auditory Hypersensitivity in Individuals with Autism Spectrum Disorder

    Individuals with autism spectrum disorder (ASD) are characterised as having impairments in social-emotional interaction and communication, alongside displaying repetitive behaviours and interests. Additionally, they frequently experience difficulties in processing sensory information, with particular prevalence in the auditory domain. Often triggered by everyday environmental sounds, auditory hypersensitivity can provoke self-regulatory fear responses such as crying and isolation from sounds. This paper presents SoundFields, an interactive virtual reality game designed to address this area by integrating exposure-based therapy techniques into game mechanics and delivering target auditory stimuli to the player via binaurally rendered spatial audio. A pilot study was conducted with six participants diagnosed with ASD who displayed hypersensitivity to specific sounds, to evaluate the use of SoundFields as a tool to reduce levels of anxiety associated with identified problematic sounds. Over the course of the investigation participants played the game weekly for four weeks, and all participants actively engaged with the virtual reality (VR) environment and enjoyed playing the game. Following this period, a comparison of pre- and post-study measurements showed a significant decrease in anxiety linked to the target auditory stimuli. The study results therefore suggest that SoundFields could be an effective tool for helping individuals with autism manage auditory hypersensitivity.
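    Binaural rendering of the kind mentioned above typically derives interaural time and level differences from the virtual source's azimuth. The game's actual audio engine is not described in the abstract, so as a purely illustrative sketch, Woodworth's spherical-head approximation of the interaural time difference (ITD) can be written as:

```python
import math

# Assumed constants for the spherical-head model (not from the paper):
HEAD_RADIUS_M = 0.0875    # average adult head radius in metres
SPEED_OF_SOUND = 343.0    # m/s in air at roughly 20 degrees C

def interaural_time_difference(azimuth_deg: float) -> float:
    """Woodworth's approximation of the ITD, in seconds, for a source at
    the given azimuth (0 deg = straight ahead, 90 deg = directly to one side).
    ITD = (r / c) * (theta + sin(theta)) for azimuth theta in radians."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

print(interaural_time_difference(0.0))                 # 0.0: no delay ahead
print(round(interaural_time_difference(90.0) * 1e6))   # 656 microseconds
```

    A spatial audio engine applies this delay (plus a level difference and spectral filtering) between the two ears, which is what lets the listener localize the target sound inside the VR scene.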

    Development of application-based learning for autistic students

    Poor levels of physical activity, cognitive ability, and social interaction are demonstrated in autistic children. These learning barriers in the school environment must be explicitly managed, including in terms of designing objectives, materials, methods, tools, and evaluations to achieve the actual learning objectives. Due to the complex learning barriers in autistic children, conventional learning must be combined with assistive technology. Based on these problems, this research aims to develop a drag-and-drop game application equipped with basic motion animation instructions. The research and development method uses the ADDIE model, which consists of five stages. The effectiveness test was carried out with a one-group pretest-posttest design, with treatment for three days and a duration of 105 minutes. Based on the results of this overall trial, a score of 81.1% was obtained, which, according to the classification table of product effectiveness scores, falls into the category of very valid for use in learning. The effectiveness test obtained a two-tailed significance value of 0.000 < 0.05, meaning Hₐ is accepted and the treatment has a significant effect on student learning outcomes. This development concludes that although the developed application can be used well, teachers' ability to manage classrooms remains dominant in maintaining and regulating the learning focus of children with autism. Therefore, further research can develop and combine assistive technology with learning models for physical education that can be applied and adapted to both group and individual physical activities for autistic children.
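    The one-group pretest-posttest comparison reported above corresponds to a paired-samples t-test on each child's before/after scores. A minimal sketch, using hypothetical scores rather than the study's data, looks like this:

```python
import math
from statistics import mean, stdev

def paired_t(pre: list[float], post: list[float]) -> tuple[float, int]:
    """Paired-samples t statistic and degrees of freedom for a
    pretest-posttest design: t = mean(d) / (sd(d) / sqrt(n))."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1

# Hypothetical pretest/posttest scores for six students (not the study's data):
pre = [50, 55, 60, 52, 58, 54]
post = [70, 68, 75, 66, 74, 71]
t, df = paired_t(pre, post)
print(round(t, 1), df)  # 15.6 5
```

    A t value this large with 5 degrees of freedom yields a two-tailed p-value far below 0.05, which is the "Sig. (2-tailed) 0.000 < 0.05" pattern the abstract describes; the alternative hypothesis Hₐ (the treatment changes scores) is accepted.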

    Expressive visual text-to-speech as an assistive technology for individuals with autism spectrum conditions.

    Adults with Autism Spectrum Conditions (ASC) experience marked difficulties in recognising the emotions of others and responding appropriately. The clinical characteristics of ASC mean that face-to-face or group interventions may not be appropriate for this clinical group. This article explores the potential of a new interactive technology, converting text to emotionally expressive speech, to improve emotion processing ability and attention to faces in adults with ASC. We demonstrate a method for generating a near-videorealistic avatar (XpressiveTalk), which can produce a video of a face uttering inputted text in a large variety of emotional tones. We then demonstrate that general population adults can correctly recognize the emotions portrayed by XpressiveTalk. Adults with ASC are significantly less accurate than controls, but still above chance levels, in inferring emotions from XpressiveTalk. Both groups are significantly more accurate when inferring sad emotions from XpressiveTalk compared to the original actress, and rate these expressions as significantly more preferred and realistic. The potential applications of XpressiveTalk as an assistive technology for adults with ASC are discussed.

    This research was conducted during an international research internship towards an MSc (Res) degree at Maastricht University, funded by Erasmus. This research also received support from the Centre for Psychology, Behaviour and Achievement, Coventry University, UK; the Autism Research Trust; the Medical Research Council UK; and the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care East of England at Cambridgeshire and Peterborough NHS Foundation Trust.
