
    Motion and emotion estimation for robotic autism intervention.

    Robots have recently emerged as a novel approach to treating autism spectrum disorder (ASD). A robot can be programmed to interact with children with ASD in order to reinforce positive social skills in a non-threatening environment. In prior work, robots were employed in interaction sessions with children with ASD, but their sensory and learning abilities were limited, and a human therapist was heavily involved in “puppeteering” the robot. The objective of this work is to create a next-generation autism robot with several interactive and decision-making capabilities not found in prior technology. Two of the main features this robot would need are the ability to quantitatively estimate the patient’s motion performance and the ability to correctly classify their emotions. These would support the potential diagnosis of autism and help autistic patients practice their skills. In this thesis, we therefore engineered components for a human-robot interaction system and validated them in experiments with the robots Baxter and Zeno, the sensors Empatica E4 and Kinect, and the open-source pose estimation software OpenPose. The Empatica E4 wristband is a wearable device that collects physiological measurements from a test subject in real time. Measurements were collected from patients with ASD during human-robot interaction activities. Using these data and attentiveness labels from a trained coder, a classifier was developed that predicts the patient’s level of engagement. The classifier outputs this prediction to a robot or supervising adult, supporting decisions during intervention activities that keep the attention of the patient with autism. The CMU Perceptual Computing Lab’s OpenPose software package enables body, face, and hand tracking using an RGB camera (e.g., a web camera) or an RGB-D camera (e.g., Microsoft Kinect). Integrating OpenPose with a robot allows the robot to collect information on user motion intent and perform motion imitation; in this work, we developed such a teleoperation interface with the Baxter robot. Finally, a novel algorithm, Segment-based Online Dynamic Time Warping (SoDTW), and an associated metric are proposed to help in the diagnosis of ASD. The social robot Zeno, a childlike robot developed by Hanson Robotics, was used to test this algorithm and metric. Using the proposed algorithm, it is possible to classify a subject’s motion into different speeds or to use the resulting SoDTW score to evaluate the subject’s abilities.
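    The abstract above does not spell out the SoDTW algorithm, so the following is only a minimal illustrative sketch, in Python, of the general idea it builds on: comparing a recorded motion trajectory (e.g., per-frame OpenPose keypoints) against labelled reference motions with standard dynamic time warping and picking the nearest label. The function names, the numpy dependency, and the synthetic circular trajectories are assumptions for illustration, not taken from the thesis.

        # Illustrative sketch only -- not the thesis's SoDTW implementation.
        # Classifies a motion trajectory by speed using plain dynamic time warping
        # (DTW) against labelled reference trajectories. Each trajectory is assumed
        # to be an array of shape (frames, features), e.g. OpenPose keypoints.
        import numpy as np

        def dtw_distance(a, b):
            """Standard DTW cost between two trajectories of shape (frames, features)."""
            n, m = len(a), len(b)
            cost = np.full((n + 1, m + 1), np.inf)
            cost[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    d = np.linalg.norm(a[i - 1] - b[j - 1])   # frame-to-frame distance
                    cost[i, j] = d + min(cost[i - 1, j],       # advance a only
                                         cost[i, j - 1],       # advance b only
                                         cost[i - 1, j - 1])   # advance both
            return cost[n, m]

        def classify_speed(query, references):
            """Return the label of the reference trajectory closest to the query under DTW."""
            return min(references, key=lambda label: dtw_distance(query, references[label]))

        # Hypothetical usage with synthetic 2-D circular "wrist" trajectories
        # traced at three different speeds over the same time window.
        t = np.linspace(0.0, 2.0 * np.pi, 120)
        references = {
            "slow":   np.column_stack([np.sin(0.5 * t), np.cos(0.5 * t)]),
            "normal": np.column_stack([np.sin(1.0 * t), np.cos(1.0 * t)]),
            "fast":   np.column_stack([np.sin(2.0 * t), np.cos(2.0 * t)]),
        }
        query = np.column_stack([np.sin(1.1 * t), np.cos(1.1 * t)])
        print(classify_speed(query, references))   # nearest reference by DTW; here "normal"

    The segment-based, online variant named in the abstract would presumably process the trajectory incrementally rather than warping a full recording at once, but those details are not given above.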

    Adapting a humanoid robot for use with children with profound and multiple disabilities

    Despite all the developments in information technology (IT) for people with disabilities, few interventions have been designed for people with profound and multiple disabilities, as there is little incentive for companies to design and manufacture technology purely for a group of consumers without much buying power. A possible solution is therefore to identify mainstream technology that, with adaptation, could serve the purposes required by those with profound and multiple disabilities. Because of its ability to engage the attention of young children with autism, the role of a humanoid robot was investigated. After viewing a demonstration, teachers of pupils with profound and multiple disabilities described actions they wished the robot to make in order to help nominated pupils achieve learning objectives. They proposed a much wider range of suggestions for using the robot than it could currently provide. The adaptations they required fell into two groups: either increasing the methods through which the robot could be controlled or increasing the range of behaviours the robot could emit. These were met in a variety of ways, but most would require a degree of programming expertise beyond that possessed by most schoolteachers.

    Robot-Mediated Interviews with Children: What do potential users think?

    Luke Wood, Hagen Lehmann, Kerstin Dautenhahn, Ben Robins, Austen Rayner, and Dag Syrdal, ‘Robot-Mediated Interviews with Children: What do potential users think?’, paper presented at the 50th Annual Convention of the Society for the Study of Artificial Intelligence and the Simulation of Behaviour, 1–4 April 2014, London, UK.
    When police officers are conducting interviews with children, some of the disclosures can be quite shocking. This can make it difficult for an officer to maintain their composure without subtly indicating their shock to the child, which can in turn impede the information acquisition process. Using a robotic interviewer could eliminate this problem, as the behaviours and expressions of the robot can be consciously controlled. To date, research investigating the potential of Robot-Mediated Interviews has focused on establishing whether children will respond to robots in an interview scenario and, if so, how well. The results of these studies indicate that children will talk to a robot in an interview scenario in a similar way to how they talk to a human interviewer. However, in order to test whether this approach would work in a real-world setting, it is important to establish what the experts (e.g. specialist child interviewers) would require from the system. To determine the needs of the users, we conducted a user panel with a group of potential real-world users to gather their views of our current system and find out what they would require for the system to be useful to them. The user group we worked with consisted of specialist child protection police officers based in the UK. The findings from this panel suggest that a Robot-Mediated Interviewing system would need to be more flexible than our current system in order to respond to unpredictable situations and paths of investigation. This paper gives an insight into what real-world users would need from a Robot-Mediated Interviewing system.

    Human-centred design methods: developing scenarios for robot assisted play informed by user panels and field trials

    This article describes the user-centred development of play scenarios for robot assisted play, as part of the multidisciplinary IROMEC project that develops a novel robotic toy for children with special needs. The project investigates how robotic toys can become social mediators, encouraging children with special needs to discover a range of play styles, from solitary to collaborative play (with peers, carers/teachers, parents, etc.). This article explains the developmental process of constructing relevant play scenarios for children with different special needs. Results are presented from consultation with a panel of experts (therapists, teachers, parents) who advised on the play needs of the various target user groups and who helped investigate how robotic toys could be used as a play tool to assist in the children’s development. Examples from experimental investigations are provided which have informed the development of scenarios throughout the design process. We conclude by pointing out the potential benefit of this work to a variety of research projects and applications involving human–robot interactions.

    Physical extracurricular activities in educational child-robot interaction

    In an exploratory study on educational child-robot interaction, we investigate the effect of alternating a learning activity with an additional shared activity. Our aim is to enhance and enrich the relationship between child and robot by introducing "physical extracurricular activities". This enriched relationship might ultimately influence the way the child and robot interact with the learning material. We use qualitative measurement techniques to evaluate the effect of the additional activity on the child-robot relationship. We also explore how these metrics can be integrated into a highly exploratory cumulative score for the relationship between child and robot. This cumulative score suggests a difference in the overall child-robot relationship between children who engage in a physical extracurricular activity with the robot and children who only engage in the learning activity with the robot. Comment: 5th International Symposium on New Frontiers in Human-Robot Interaction 2016 (arXiv:1602.05456).

    A Pilot Study with a Novel Setup for Collaborative Play of the Humanoid Robot KASPAR with children with autism

    This article describes a pilot study in which a novel experimental setup, involving an autonomous humanoid robot, KASPAR, participating in a collaborative, dyadic video game, was implemented and tested with children with autism, all of whom had impairments in playing socially and communicating with others. The children alternated between playing the collaborative video game with a neurotypical adult and playing the same game with the humanoid robot, being exposed to each condition twice. The equipment and experimental setup were designed to observe whether the children would engage in more collaborative behaviours while playing the video game and interacting with the adult than when performing the same activities with the humanoid robot. The article describes the development of the experimental setup and its first evaluation in a small-scale exploratory pilot study. The purpose of the study was to gain experience with the operational limits of the robot as well as the dyadic video game, to determine what changes should be made to the systems, and to gain experience with analyzing the data from this study in order to conduct a more extensive evaluation in the future. Based on our observations of the children’s experiences in playing the cooperative game, we determined that while the children enjoyed both playing the game and interacting with the robot, the game should be made simpler to play as well as more explicitly collaborative in its mechanics. Also, the robot should be more explicit in its speech as well as more structured in its interactions. Results show that the children found the activity to be more entertaining, appeared more engaged in playing, and displayed better collaborative behaviours with their partners (for the purposes of this article, ‘partner’ refers to the human or robotic agent which interacts with the children with autism; we are not using the term’s other meanings that refer to specific relationships or emotional involvement between two individuals) in the second sessions of playing with human adults than during their first sessions. One way of explaining these findings is that the children’s intermediary play session with the humanoid robot impacted their subsequent play session with the human adult. However, another longer and more thorough study would have to be conducted in order to better interpret these findings. Furthermore, although the children with autism were more interested in and entertained by the robotic partner, the children showed more examples of collaborative play and cooperation while playing with the human adult.

    Therapeutic and educational objectives in robot assisted play for children with autism

    DOI: 10.1109/ROMAN.2009.5326251
    This article is a methodological paper that describes the therapeutic and educational objectives that were identified during the design process of a robot aimed at robot assisted play. The work described in this paper is part of the IROMEC project (Interactive Robotic Social Mediators as Companions), which recognizes the important role of play in child development and targets children who are prevented from or inhibited in playing. The project investigates the role of an interactive, autonomous robotic toy in therapy and education for children with special needs. This paper specifically addresses the therapeutic and educational objectives related to children with autism. In recent years, robots have already been used to teach basic social interaction skills to children with autism. The added value of the IROMEC robot is that play scenarios have been developed taking children's specific strengths and needs into consideration and covering a wide range of objectives in children's developmental areas (sensory, communication and interaction, motor, cognitive, and social and emotional). The paper describes children's developmental areas and illustrates how different experiences and interactions with the IROMEC robot are designed to target objectives in these areas.

    Design Research on Robotic Products for School Environments

    Advancements in robotic research have led to the design of a number of robotic products that can interact with people. In this research, a school environment was selected for a practical test of robotic products. For this, the robot "Tiro" was built with the aim of supporting the learning activities of children, and the possibility of applying robotic products was then tested through example lessons using Tiro. To do this, the robot design process and a user-centred HRI evaluation framework were studied, and on the basis of these understandings, robotic products were observed in the field. Three different field studies were conducted, and interactions between children and robotic products were investigated. As a result, it was possible to understand how emotional interaction and verbal interaction affect the development of social relationships. Early results on this, along with coding schemes for video protocol analysis, were obtained. In this preliminary study, the findings are summarized and several design implications derived from grouping these insights are suggested. These will help robot designers grasp how various factors of robotic products may be adopted in the everyday lives of people. Keywords: Robotic Products Design, HRI Evaluation, User-Centered HRI.

    Developing a protocol and experimental setup for using a humanoid robot to assist children with autism to develop visual perspective taking skills

    Visual Perspective Taking (VPT) is the ability to see the world from another person's perspective, taking into account what they see and how they see it, drawing upon both spatial and social information. Children with autism often find it difficult to understand that other people might have perspectives, viewpoints, beliefs and knowledge that are different from their own, which is a fundamental aspect of VPT. In this research we aimed to develop a methodology to assist children with autism in developing their VPT skills using a humanoid robot, and we present results from our first long-term pilot study. The games we devised were implemented with the Kaspar robot and, to our knowledge, this is the first attempt to improve the VPT skills of children with autism through playing and interacting with a humanoid robot. We describe in detail the standard pre- and post-assessments that we performed with the children in order to measure their progress, as well as the inclusion criteria derived from the results for future studies in this field. Our findings suggest that some children may benefit from this approach to learning about VPT, which shows that this approach merits further investigation.