
    Using a humanoid robot as the promoter of interaction with children in the context of educational games

    Society should care about those with special needs. Part of proper care involves the development of new technologies and devices aimed at improving their quality of life. Research conducted at universities on this subject should be followed by the industrial development of commercial products, and governmental institutions may play an important role by establishing conditions that ensure the results are made available to those who need them. This paper presents the details of a system, still at an early research stage, aimed at helping children with Autism Spectrum Disorder (ASD). It uses ZECA, a Zeno R-50 humanoid robot, as the promoter of interaction with children, teaching colours and geometric figures in the context of two educational game scenarios: identification of five geometric figures and identification of five colours. So far, the system has been tested in a school environment with typically developing children in order to validate the experimental setup and the game design. The results of these tests allowed the system to be optimized before starting work in elementary schools with children with ASD, which is the next step in the research. The authors also express their acknowledgments to COMPETE: POCI-01-0145-FEDER-007043 and FCT – Fundação para a Ciência e Tecnologia within the Project Scope UID/CEC/00319/2013. This work is funded by CIEd – Research Centre on Education, projects UID/CED/1661/2013 and UID/CED/1661/2016, Institute of Education, University of Minho, through national funds of FCT/MCTES-PT.
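    As an illustration of the kind of game scenario described above, the minimal sketch below shows one prompt-answer-feedback round for the colour or shape identification games. The item lists follow the abstract (five shapes, five colours); the speech callback and answer-capture function are hypothetical stand-ins, not ZECA's actual API.

import random

SHAPES = ["circle", "square", "triangle", "rectangle", "star"]
COLOURS = ["red", "green", "blue", "yellow", "orange"]

def play_round(say, items, get_child_answer):
    """One prompt-answer-feedback round; `say` is the robot's speech callback."""
    target = random.choice(items)
    say(f"Can you show me the {target}?")
    answer = get_child_answer()            # e.g. card shown to the robot's camera
    if answer == target:
        say("Well done!")
        return True
    say(f"Almost! This one is the {target}.")
    return False

# Dummy demo: print instead of speaking, always answer "red".
play_round(print, COLOURS, lambda: "red")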

    Applications of Robotics for Autism Spectrum Disorder: a Scoping Review

    Robotic therapies are receiving growing interest in the autism field, especially for improving children's social skills and enhancing traditional human interventions. In this work, we conduct a scoping review of the literature on robotics for autism, providing the largest review of this field from the last five years. Our work underlines the need to better characterize participants and to increase sample sizes. It is also important to develop homogeneous training protocols so that results can be analysed and compared. Nevertheless, 7 out of the 10 randomized controlled trials reported a significant impact of robotic therapy. Overall, robot autonomy, adaptability and personalization, as well as more standardized outcome measures, were identified as the most critical issues to address in future research.

    How to build a supervised autonomous system for robot-enhanced therapy for children with autism spectrum disorder

    Robot-Assisted Therapy (RAT) has successfully been used to improve social skills in children with autism spectrum disorder (ASD) through remote control of the robot in so-called Wizard of Oz (WoZ) paradigms. However, there is a need to increase the autonomy of the robot, both to lighten the burden on human therapists (who have to remain in control and, importantly, supervise the robot) and to provide a consistent therapeutic experience. This paper seeks to provide insight into increasing the autonomy level of social robots in therapy to move beyond WoZ. With the final aim of improved human-human social interaction for the children, this multidisciplinary research seeks to facilitate the use of social robots as tools in clinical situations by addressing the challenge of increasing robot autonomy. We introduce the clinical framework in which the developments are tested, alongside initial data obtained from patients in the first phase of the project using a WoZ set-up mimicking the targeted supervised-autonomy behaviour. We further describe the implemented system architecture capable of providing the robot with supervised autonomy.
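    A minimal sketch of the supervised-autonomy idea described above: the robot proposes the next therapy action and a human supervisor approves or vetoes it before execution. All names and the trivial decision policy are assumptions for illustration, not the paper's actual architecture.

from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Action:
    name: str            # e.g. "prompt_imitation", "give_reward"
    target: str = ""

def propose_action(child_engaged: bool) -> Action:
    # trivial stand-in for the robot's autonomous decision policy
    return Action("prompt_imitation") if child_engaged else Action("re_engage", "greeting")

def supervised_step(child_engaged: bool,
                    supervisor_approves: Callable[[Action], bool]) -> Optional[Action]:
    proposal = supervisor_proposal = propose_action(child_engaged)
    if supervisor_approves(supervisor_proposal):   # therapist remains in the loop
        return proposal                            # executed by the robot
    return None                                    # vetoed: therapist takes over

# Example: approve everything except re-engagement prompts.
print(supervised_step(True,  lambda a: a.name != "re_engage"))   # Action(name='prompt_imitation', ...)
print(supervised_step(False, lambda a: a.name != "re_engage"))   # None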

    Assessing the Effectiveness of Automated Emotion Recognition in Adults and Children for Clinical Investigation

    Recent success stories in automated object and face recognition, partly fuelled by deep learning artificial neural network (ANN) architectures, have led to the advancement of biometric research platforms and, to some extent, the resurrection of Artificial Intelligence (AI). In line with this general trend, interdisciplinary approaches have been taken to automate the recognition of emotions in adults and children for the benefit of various applications, such as the identification of children's emotions prior to a clinical investigation. Within this context, it turns out that automating emotion recognition is far from straightforward, with several challenges arising for both science (e.g., methodology underpinned by psychology) and technology (e.g., the iMotions biometric research platform). In this paper, we present a methodology, an experiment and interesting findings, which raise the following research questions for the recognition of emotions and attention in humans: a) the adequacy of well-established techniques such as the International Affective Picture System (IAPS), b) the adequacy of state-of-the-art biometric research platforms, and c) the extent to which emotional responses may differ between children and adults. Our findings, and first attempts to answer some of these research questions, are based on a mixed sample of adults and children who took part in the experiment, resulting in a statistical analysis of numerous variables. These relate to participants' responses, captured both automatically and interactively, to a sample of IAPS pictures.

    Motion and emotion estimation for robotic autism intervention.

    Robots have recently emerged as a novel approach to treating autism spectrum disorder (ASD). A robot can be programmed to interact with children with ASD in order to reinforce positive social skills in a non-threatening environment. In prior work, robots were employed in interaction sessions with children with ASD, but their sensory and learning abilities were limited, while a human therapist was heavily involved in “puppeteering” the robot. The objective of this work is to create the next-generation autism robot that includes several new interactive and decision-making capabilities not found in prior technology. Two of the main features that this robot would need to have are the ability to quantitatively estimate the patient’s motion performance and the ability to correctly classify their emotions. This would allow for the potential diagnosis of autism and help autistic patients practice their skills. Therefore, in this thesis, we engineered components for a human-robot interaction system and confirmed them in experiments with the robots Baxter and Zeno, the sensors Empatica E4 and Kinect, and the open-source pose estimation software OpenPose. The Empatica E4 wristband is a wearable device that collects physiological measurements in real time from a test subject. Measurements were collected from ASD patients during human-robot interaction activities. Using these data and labels of attentiveness from a trained coder, a classifier was developed that predicts the patient’s level of engagement. The classifier outputs this prediction to a robot or supervising adult, allowing decisions during intervention activities to keep the attention of the patient with autism. The CMU Perceptual Computing Lab’s OpenPose software package enables body, face, and hand tracking using an RGB camera (e.g., web camera) or an RGB-D camera (e.g., Microsoft Kinect). Integrating OpenPose with a robot allows the robot to collect information on user motion intent and perform motion imitation. In this work, we developed such a teleoperation interface with the Baxter robot. Finally, a novel algorithm, called Segment-based Online Dynamic Time Warping (SoDTW), and an associated metric are proposed to help in the diagnosis of ASD. Social Robot Zeno, a childlike robot developed by Hanson Robotics, was used to test this algorithm and metric. Using the proposed algorithm, it is possible to classify a subject’s motion into different speeds or to use the resulting SoDTW score to evaluate the subject’s abilities.
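    For context, the sketch below shows classical dynamic time warping between a reference motion segment and an observed one; SoDTW as proposed in the thesis extends this idea to segmented, online operation and is not reproduced here. Sequences are assumed to be lists of joint-angle or keypoint vectors, e.g., extracted with OpenPose.

import numpy as np

def dtw_distance(ref, obs):
    """Return a length-normalised DTW alignment cost between two feature sequences."""
    ref, obs = np.asarray(ref, float), np.asarray(obs, float)
    n, m = len(ref), len(obs)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(ref[i - 1] - obs[j - 1])   # local distance between poses
            cost[i, j] = d + min(cost[i - 1, j],           # insertion
                                 cost[i, j - 1],           # deletion
                                 cost[i - 1, j - 1])       # match
    return cost[n, m] / (n + m)   # lower score = closer imitation of the reference

# Toy example: a slowed-down copy of the reference still aligns closely.
reference = [[0.0, 0.0], [1.0, 0.5], [2.0, 1.0]]
observed = [[0.0, 0.0], [0.5, 0.25], [1.0, 0.5], [2.0, 1.0]]
print(dtw_distance(reference, observed))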

    Behavioural attentiveness patterns analysis – detecting distraction behaviours

    The capacity to remain focused on a task can be crucial in some circumstances. In general, this ability is intrinsic to human social interaction and is naturally used in any social context. Nevertheless, some individuals have difficulty remaining concentrated on an activity, resulting in a short attention span. Children with Autism Spectrum Disorder (ASD) are a notable example of such individuals. ASD is a group of complex developmental disorders of the brain. Individuals affected by this disorder are characterized by repetitive patterns of behaviour, restricted activities or interests, and impairments in social communication. The use of robots has already been shown to encourage the development of social interaction skills that children with ASD often lack. However, most of these systems are controlled remotely and cannot adapt automatically to the situation, and even those that are more autonomous still cannot perceive whether or not the user is paying attention to the instructions and actions of the robot. Following this trend, this dissertation is part of a research project that has been under development for some years. In this project, the robot ZECA (Zeno Engaging Children with Autism) from Hanson Robotics is used to promote interaction with children with ASD, helping them to recognize emotions and to acquire new knowledge, in order to promote social interaction and communication with others. The main purpose of this dissertation is to determine whether the user is distracted during an activity. In the future, the objective is to interface this system with ZECA so that it can adapt its behaviour according to the individual's affective state during an emotion imitation activity. In order to recognize human distraction behaviours and capture the user's attention, several patterns of distraction, as well as systems to automatically detect them, have been developed. One of the most widely used distraction-pattern detection methods is based on the measurement of head pose and eye gaze. The present dissertation proposes a system based on a Red Green Blue (RGB) camera, capable of detecting distraction patterns (head pose, eye gaze, blink frequency, and the user's position relative to the camera) during an activity, and then classifying the user's state using a machine learning algorithm. Finally, the proposed system is evaluated in a controlled laboratory environment in order to verify whether it is capable of detecting the patterns of distraction. The results of these preliminary tests allowed some system constraints to be detected and validated the system's adequacy for later use in an intervention setting.
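    A hedged sketch of the classification stage described above: per-window head pose, gaze, blink and position features (however they are extracted from the RGB stream) are fed to an off-the-shelf classifier. The feature names, toy values and the choice of an SVM are assumptions, not the dissertation's exact pipeline.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# one row per time window: [head_yaw, head_pitch, gaze_x, gaze_y, blink_rate, dist_to_camera]
X_train = np.array([[ 2.0, -1.0, 0.1, 0.0, 12.0, 0.6],
                    [35.0, -5.0, 0.8, 0.3,  4.0, 0.9]])
y_train = np.array([0, 1])   # 0 = attentive, 1 = distracted (labels from a human coder)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)

# Classify a new window; a large head yaw and strong gaze offset suggest distraction.
print(clf.predict([[30.0, -4.0, 0.7, 0.2, 5.0, 0.85]]))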

    Effect of sensory-based technologies on atypical sensory responses of children with Autism Spectrum Disorder: A systematic review

    © 2021 ACM, Inc. This is the accepted manuscript version of an article published in final form at https://doi.org/10.1145/3485768.3485782. Atypical sensory responses are one of the most common issues observed in Autism Spectrum Disorder (ASD), affecting the development of a child's capability for social interaction, independent living and learning. In the past two decades, there has been a growing number of studies of technology-based interventions for the atypical sensory responses of individuals with ASD. However, their effects and limitations have not been fully examined. This systematic review investigates the effects of sensory-based technologies (SBTs) on atypical sensory responses of children with ASD. Publications reporting on the use of an SBT as an intervention tool were retrieved from four academic databases: “PubMed”, “IEEE Xplore”, “ACM Digital Library” and “Web of Science”. The search finally yielded 18 articles. The results indicated an emerging trend of studies investigating the effects of SBTs on atypical sensory responses over the past decade. Challenges and limitations were found across the studies, mainly because they adopted different methods and indicators, had small sample sizes, and used varying experimental designs. The findings were that the use of SBTs could effectively improve auditory and visual recognition, as well as other behavioural outcomes such as attention, in children with ASD. Future development of SBTs could further integrate more advanced techniques, such as machine learning, in order to widen the scope of SBT usage to help more children with ASD.

    Enhance the Language Ability of Humanoid Robot NAO through Deep Learning to Interact with Autistic Children

    Autism spectrum disorder (ASD) is a life-long neurological disability, and a cure has not yet been found. ASD begins early in childhood and lasts throughout a person’s life. Through early intervention, many actions can be taken to improve the quality of life of children. Robots are one of the best choices for accompanying children with autism. However, most robot dialogue systems use traditional techniques to produce responses and cannot produce meaningful answers when the conversation has not been recorded in a database. The main contribution of our work is the incorporation of a conversation model into an actual robot system for supporting children with autism. We present the use of a neural network model as the generative conversational agent, aimed at generating meaningful and coherent dialogue responses given the dialogue history. The proposed model shares an embedding layer between the encoding and decoding processes. The model differs from the canonical Seq2Seq model, in which the encoder output is used only to set up the initial state of the decoder, in order to avoid favoring short and unconditional responses with high prior probability. To improve sensitivity to context, we changed the input method of the model to better adapt to the utterances of children with autism. We adopted transfer learning to make the proposed model learn the characteristics of dialogue with autistic children and to address the problem of an insufficient dialogue corpus. Experiments showed that the proposed method was superior to the canonical Seq2Seq model and the GAN-based dialogue model in both automatic evaluation indicators and human evaluation, pushing the BLEU precision to 0.23, the greedy matching score to 0.69, the embedding average score to 0.82, the vector extrema score to 0.55, the skip-thought score to 0.65, the KL divergence score to 5.73, and the EMD score to 12.21.
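    A minimal PyTorch sketch of a Seq2Seq model whose encoder and decoder share one embedding layer, as described above. Layer sizes and vocabulary are illustrative assumptions, and the conditioning shown uses the canonical initial-state hand-off for brevity, whereas the paper's model deliberately avoids relying on that alone.

import torch
import torch.nn as nn

class SharedEmbeddingSeq2Seq(nn.Module):
    def __init__(self, vocab_size=8000, emb_dim=256, hidden_dim=512):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)        # shared by both sides
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        _, h = self.encoder(self.embedding(src_ids))              # encode dialogue history
        dec_out, _ = self.decoder(self.embedding(tgt_ids), h)     # generate conditioned response
        return self.out(dec_out)                                  # next-token logits

model = SharedEmbeddingSeq2Seq()
logits = model(torch.randint(0, 8000, (2, 10)), torch.randint(0, 8000, (2, 12)))
print(logits.shape)   # torch.Size([2, 12, 8000])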