13 research outputs found

    ANDROID-BASED ROBOTICS TRAINING TO FOSTER INNOVATION AND CREATIVITY AT SMP 11 BANDUNG

    This training aims to create an Arduino robot controlled from an Android device. The work began with a Proteus simulation designed to test the robot's performance. Next, the robot was assembled by combining a motor driver, wheel motors, and an Arduino board. Finally, the robot was integrated with Android to control its navigation system. This Android-based Arduino robot training is introduced to students in the robotics workshop at SMP 11 Bandung as a medium for teaching physics or electronics. Through this training, students are expected to develop greater innovation and creativity in the subjects taught.
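
    The abstract gives no implementation details, so the following is only a minimal sketch of the kind of control loop such a robot typically runs. It is written in Arduino-style C++ and assumes a Bluetooth serial module (e.g. HC-05) bridged to the Arduino's hardware serial port and an L298N-style motor driver on hypothetical PWM pins; the single-character command protocol is likewise an assumption, not the paper's design.

        #include <Arduino.h>

        // Hypothetical PWM pin assignments for an L298N-style motor driver (not from the paper).
        const int LEFT_FWD = 5, LEFT_BWD = 6, RIGHT_FWD = 9, RIGHT_BWD = 10;

        void drive(int leftFwd, int leftBwd, int rightFwd, int rightBwd) {
          analogWrite(LEFT_FWD, leftFwd);
          analogWrite(LEFT_BWD, leftBwd);
          analogWrite(RIGHT_FWD, rightFwd);
          analogWrite(RIGHT_BWD, rightBwd);
        }

        void setup() {
          pinMode(LEFT_FWD, OUTPUT);  pinMode(LEFT_BWD, OUTPUT);
          pinMode(RIGHT_FWD, OUTPUT); pinMode(RIGHT_BWD, OUTPUT);
          Serial.begin(9600);  // Bluetooth module assumed to be bridged to the hardware serial port
        }

        void loop() {
          if (Serial.available() > 0) {
            char cmd = Serial.read();  // one character per command sent from the Android app
            switch (cmd) {
              case 'F': drive(200, 0, 200, 0); break;  // forward
              case 'B': drive(0, 200, 0, 200); break;  // backward
              case 'L': drive(0, 150, 150, 0); break;  // turn left
              case 'R': drive(150, 0, 0, 150); break;  // turn right
              case 'S': drive(0, 0, 0, 0);     break;  // stop
            }
          }
        }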

    Hybrid approach to promote social interaction with children with autism spectrum disorder

    Understanding the emotional state of others is paramount for successful human interaction. Individuals with Autism Spectrum Disorder (ASD) have impairments in social communication and, consequently, have difficulty interpreting others' states of mind. To tackle this issue, researchers have proposed technological solutions to assist children with ASD, particularly in imitation and emotion recognition tasks. Social robots and Objects with Playware Technology (OPT) have both been employed as intervention tools for children with ASD. This work presents a hybrid approach that combines the two technologies (robots and OPT) with the goal of promoting social interaction with children with ASD. Moreover, a new OPT device was developed to be used as an add-on to human-robot interaction with children with ASD in two emotion recognition tasks (recognize and storytelling). A pilot study was conducted with children with ASD to evaluate the proposed method. All children successfully participated in the activities. Moreover, children gazed significantly longer towards the OPT during the storytelling scenario, in which the OPT device displayed visual cues, supporting the idea that a visual cue may be fundamental in helping children with ASD understand requests and tasks. Funding: FCT - Fundação para a Ciência e a Tecnologia (SFRH/BD/133314/2017).

    Graded cueing feedback in robot-mediated imitation practice for children with autism spectrum disorders

    A pilot study was conducted examining the effects of a humanoid robot giving the minimum required feedback (graded cueing) during an imitation game played with a child with an autism spectrum disorder (ASD). Twelve high-functioning participants with ASD, ages 7 to 10, each played “Copy-Cat” with a Nao robot 5 times over the span of 2.5 weeks. While the graded cueing model was not exercised to its fullest, using graded cueing-style feedback resulted in a non-decreasing trend in imitative accuracy compared to a non-adaptive condition in which participants always received the same, most descriptive feedback whenever they made a mistake. These trends show promise for future work with robots encouraging autonomy in special-needs populations.
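
    The abstract does not specify the prompting logic, so the following is only an illustrative sketch of how a graded-cueing policy is commonly structured: feedback starts at the least specific prompt and escalates one level only after a failed imitation attempt, resetting once the child succeeds. The prompt texts and levels are hypothetical, not taken from the paper.

        #include <cstddef>
        #include <iostream>
        #include <string>
        #include <vector>

        // Hypothetical prompt levels, ordered from least to most specific (not from the paper).
        const std::vector<std::string> kPrompts = {
            "Try again!",                        // level 0: neutral encouragement
            "Look at my arms and try again.",    // level 1: direct attention
            "Raise your left arm like this.",    // level 2: verbal description
            "Watch me do it slowly, then copy."  // level 3: full demonstration
        };

        class GradedCueing {
         public:
          // Returns the prompt to give after an attempt; escalates only on failure.
          std::string feedback(bool imitationCorrect) {
            if (imitationCorrect) {
              level_ = 0;                        // success: reset to minimal prompting
              return "Great job!";
            }
            std::string prompt = kPrompts[level_];
            if (level_ + 1 < kPrompts.size()) ++level_;  // failure: escalate specificity
            return prompt;
          }

         private:
          std::size_t level_ = 0;
        };

        int main() {
          GradedCueing cueing;
          // Simulated sequence of attempts: two failures, then a success.
          for (bool correct : {false, false, true}) {
            std::cout << cueing.feedback(correct) << "\n";
          }
          return 0;
        }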

    A Preliminary Study on Effectiveness of a Standardized Multi-Robot Therapy for Improvement in Collaborative Multi-Human Interaction of Children with ASD

    This research article presents a preliminary longitudinal study examining improvement in the multi-human communication of children with Autism Spectrum Disorder (ASD) through a standardized multi-robot therapy. The research is based on a three-stage framework: 1) Human-Human Interaction, Stage 1 (HHI-S1), 2) Human-Robot Interaction, Stage 2 (HRI-S2), and 3) Human-Human Interaction, Stage 3 (HHI-S3). All three stages of the therapy use two command sets: 1) control commands and 2) evaluation commands (auditory commands, visual commands, and combinations of both). Multiple robots are introduced to encourage multi-human communication and discourage isolation in children with ASD. The robotic therapy in Stage 2 targets the child's joint attention, treated as a key parameter for a multi-human communication scenario. The improvement in joint attention results in better command following in a triadic multi-human communication scenario in Stage 3 compared to Stage 1. The proposed intervention was tested on 8 subjects with ASD over 10 sessions spanning two and a half months (10 weeks). Each human-human interaction session (Stages 1 and 3) consisted of 14 cues, whereas 18 cues were presented by each robot during human-robot interaction (Stage 2). The results indicate an overall 86% improvement in the social communication skills of the children with ASD in the multi-human scenario. The results and the effectiveness of the therapy were further validated using the Childhood Autism Rating Scale (CARS) score.

    Behavioural attentiveness patterns analysis – detecting distraction behaviours

    The capacity to remain focused on a task can be crucial in some circumstances. In general, this ability is intrinsic to human social interaction and is naturally used in any social context. Nevertheless, some individuals have difficulty remaining concentrated on an activity, resulting in a short attention span. Children with Autism Spectrum Disorder (ASD) are a notable example of such individuals. ASD is a group of complex developmental disorders of the brain. Individuals affected by this disorder are characterized by repetitive patterns of behaviour, restricted activities or interests, and impairments in social communication. The use of robots has already been shown to encourage the development of social interaction skills lacking in children with ASD. However, most of these systems are controlled remotely and cannot adapt automatically to the situation, and even those that are more autonomous still cannot perceive whether or not the user is paying attention to the instructions and actions of the robot. Following this trend, this dissertation is part of a research project that has been under development for some years. In this project, the robot ZECA (Zeno Engaging Children with Autism) from Hanson Robotics is used to promote interaction with children with ASD, helping them to recognize emotions and to acquire new knowledge, in order to promote social interaction and communication with others. The main purpose of this dissertation is to determine whether the user is distracted during an activity. In the future, the objective is to interface this system with ZECA so that it can adapt its behaviour according to the individual's affective state during an emotion imitation activity. In order to recognize human distraction behaviours and capture the user's attention, several patterns of distraction, as well as systems to detect them automatically, have been developed. One of the most commonly used distraction pattern detection methods is based on the measurement of head pose and eye gaze. This dissertation proposes a system based on a Red Green Blue (RGB) camera, capable of detecting the distraction patterns (head pose, eye gaze, blink frequency, and the user's position relative to the camera) during an activity and then classifying the user's state using a machine learning algorithm. Finally, the proposed system is evaluated in a controlled laboratory environment in order to verify whether it is capable of detecting the patterns of distraction. The results of these preliminary tests revealed some system constraints and validated the system's adequacy for later use in an intervention setting.
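
    The abstract names the features (head pose, eye gaze, blink frequency, position towards the camera) but not the classifier, so the following is only an illustrative sketch of how a simple rule-based attention label could be produced from such features. It assumes the per-frame features have already been extracted by an upstream RGB face tracker; the feature units and thresholds are assumptions, not values from the dissertation.

        #include <cmath>
        #include <iostream>

        // Per-frame features assumed to come from an upstream RGB face tracker (hypothetical units).
        struct AttentionFeatures {
          double headYawDeg;      // head rotation away from the camera, in degrees
          double headPitchDeg;    // head tilt up/down, in degrees
          double gazeOffsetDeg;   // angular distance between gaze direction and the robot/screen
          double blinksPerMin;    // blink frequency over a sliding window
          double faceOffsetNorm;  // horizontal face offset from the image centre, normalised to [0, 1]
        };

        enum class AttentionState { Attentive, Distracted };

        // Simple rule-based labelling; thresholds are illustrative, not from the dissertation.
        AttentionState classify(const AttentionFeatures& f) {
          int distractionVotes = 0;
          if (std::fabs(f.headYawDeg) > 30.0 || std::fabs(f.headPitchDeg) > 25.0) ++distractionVotes;
          if (f.gazeOffsetDeg > 20.0) ++distractionVotes;
          if (f.blinksPerMin > 40.0) ++distractionVotes;   // unusually frequent blinking
          if (f.faceOffsetNorm > 0.4) ++distractionVotes;  // user drifted away from the camera
          return distractionVotes >= 2 ? AttentionState::Distracted : AttentionState::Attentive;
        }

        int main() {
          AttentionFeatures frame{35.0, 5.0, 28.0, 12.0, 0.1};  // example frame: head and gaze turned away
          std::cout << (classify(frame) == AttentionState::Distracted ? "distracted" : "attentive") << "\n";
          return 0;
        }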

    Nyku: A Social Robot for Children With Autism Spectrum Disorders

    The continued growth of Autism Spectrum Disorders (ASD) around the world has spurred the development of new therapeutic methods to increase the positive outcomes of an ASD diagnosis. It is widely agreed that early detection and intervention lead to greatly improved outcomes for individuals living with the disorder. Among these new therapeutic methods, Robot-Assisted Therapy (RAT) has become an active area of study. Recent works have shown that high-functioning children with ASD have an affinity for interacting with robots rather than humans. It is proposed that this is due to the less complex set of communication modes present in a robotic system, as opposed to the complex non-verbal communication present in human-to-human interaction. As such, the Computer Vision and Robotics Lab at the University of Denver has embarked on developing a social robot for children with ASD. This thesis presents the design of this social robot, Nyku (Figure 1). It begins with an investigation of the needs of children with ASD, what existing therapies help with, and what roles, if any, a robot can play in these treatment plans. From the literature examined, it is clear that robots designed specifically for children with ASD have a core set of goals, despite the varied nature of the disorder's spectrum. These goals aim to reduce the stress of non-verbal communication that may occur during standard therapies, as well as to provide capabilities that reinforce typical areas of weakness in an ASD person's social repertoire, such as posture mimicry and eye contact. A goal of this thesis is to show the methodology behind arriving at these design goals so that future designers may follow and improve upon them. Nyku's hardware and software design requirements draw from this foundation. Using this needs-first design methodology allows for informed design, such that the final product is actually useful to the ASD population. In this work, the information collected is used to design the mechanical components of Nyku: the Body, Neck & Head, and Omni-wheel Base. As with all robots, the mechanical needs then give rise to electronics requirements, which are in turn presented. To tie these systems together, the control architecture is implemented in code. Notably, this thesis results in a novel kinematic model of the spherical manipulation system present in the Omni-wheel Base. This solution is presented in detail, along with the testing conducted to ensure the model's accuracy. To complete the thesis, overall progress on Nyku is highlighted alongside suggestions for continuing the work. Here, the engineering work is compared against the design goals it aims to fulfill, to ensure that the work has stayed on track. This examination also maps out the future steps needed to optimize the engineering work on Nyku for reliable performance during therapeutic sessions. Finally, a therapeutic plan is proposed, given the hardware capabilities of Nyku and the needs of children with ASD, against the background of modern therapeutic methods.
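
    The thesis's novel kinematic model is not reproduced in the abstract. As a generic illustration only, the sketch below applies the standard no-slip relation for omni-wheels driving a sphere: each wheel's surface speed must equal the sphere's surface velocity at the contact point, projected onto the wheel's rolling direction. The three-wheel equatorial layout, sphere radius, and wheel radius are assumptions and not Nyku's actual geometry.

        #include <array>
        #include <cmath>
        #include <iostream>

        // Minimal 3-D vector helpers.
        struct Vec3 { double x, y, z; };
        Vec3 cross(const Vec3& a, const Vec3& b) {
          return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
        }
        double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

        const double kPi = 3.14159265358979323846;
        const double kSphereRadius = 0.15;  // sphere (body) radius [m] (assumed)
        const double kWheelRadius = 0.03;   // omni-wheel radius [m] (assumed)

        int main() {
          // Assumed layout: three contact points spaced 120 degrees apart on the sphere's equator,
          // each wheel rolling along the vertical tangent direction at its contact point.
          std::array<Vec3, 3> contacts{}, rollDirs{};
          for (int i = 0; i < 3; ++i) {
            double a = 2.0 * kPi * i / 3.0;
            contacts[i] = {kSphereRadius * std::cos(a), kSphereRadius * std::sin(a), 0.0};
            rollDirs[i] = {0.0, 0.0, 1.0};  // tangent to the sphere at an equatorial contact
          }

          Vec3 omega{1.0, 0.0, 0.0};  // desired sphere angular velocity: tilt about x at 1 rad/s
          for (int i = 0; i < 3; ++i) {
            // No-slip condition: wheel surface speed = sphere surface velocity along the roll direction.
            double surfaceSpeed = dot(cross(omega, contacts[i]), rollDirs[i]);
            std::cout << "wheel " << i << ": " << surfaceSpeed / kWheelRadius << " rad/s\n";
          }
          return 0;
        }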