15 research outputs found

    Assistive robotics: research challenges and ethics education initiatives

    Assistive robotics is a fast-growing field aimed at helping caregivers in hospitals, rehabilitation centers and nursing homes, as well as at empowering people with reduced mobility at home, so that they can autonomously carry out their activities of daily living. The need to function in dynamic, human-centered environments poses new research challenges: robotic assistants need friendly interfaces, must be highly adaptable and customizable, compliant and intrinsically safe around people, and able to handle deformable materials. Beyond these technical challenges, assistive robotics also raises ethical ones, which have led to the emergence of a new discipline: roboethics. Several institutions are developing regulations and standards, and many ethics education initiatives include content on human-robot interaction and human dignity in assistive situations. In this paper, the state of the art in assistive robotics is briefly reviewed, and educational materials from a university course on Ethics in Social Robotics and AI focusing on the assistive context are presented.

    Social robots in educational contexts: developing an application in enactive didactics

    Due to advances in sensor and actuator technology, robots are becoming more and more common in everyday life, and many of the areas in which they are introduced demand close physical and social contact. In the last ten years the use of robots has also increasingly spread to the field of didactics, starting with their use as tools in STEM education. With the advancement of social robotics, the use of robots in didactics has been extended to tutoring situations in which these “socially aware” robots interact mainly with children, for example in language-learning classes. In this paper we give a brief overview of how robots have been used in such settings until now, which makes it apparent that the majority of applications are not grounded in didactic theory. Recognizing this shortcoming, we propose a theory-driven approach to the use of educational robots, centred on the idea that the combination of enactive didactics and social robotics holds great promise for a variety of tutoring activities in educational contexts. After defining our “Enactive Robot Assisted Didactics” approach, we give an outlook on how the use of humanoid robots can advance it. On this basis, at the end of the paper, we describe a concrete, currently ongoing implementation of this approach, which we are realizing with Softbank Robotics’ Pepper robot during university lectures.

    A framework for using humanoid robots in the school learning environment

    With predictions that robotics and efficient machine learning will be the building blocks of the Fourth Industrial Revolution, countries need to adopt a long-term strategy to deal with the potential challenges of automation, and education must be at the center of this strategy. Education must provide students with a grounding in certain skills, such as computational thinking and an understanding of robotics, which are likely to be required in many future roles. Targeting an acknowledged gap in existing humanoid robot research in the school learning environment, we present a multidisciplinary framework that integrates four perspectives: the technological, the pedagogical, the efficacy of humanoid robots, and a consideration of the ethical implications of using humanoid robots. Further, this paper presents a proposed application, an evaluation, and a case study of how the framework can be used.

    Socially Assistive Robot Enabled Home-Based Care for Supporting People with Autism

    The growing number of people diagnosed with Autism Spectrum Disorder (ASD) is an issue of concern in Australia and many other countries. In order to improve the engagement, reciprocity, productivity and usefulness of people with ASD in a home-based environment, the authors report on a nine-month Australian home-based care trial of a socially assistive robot (Lucy) supporting two young adults with autism. This work demonstrates that by marrying the personhood of people with ASD with Lucy's human-like communication modalities, potentially positive outcomes can be achieved in terms of engagement, productivity, usefulness and reciprocity. Lucy also provided respite to the participants' carers (e.g., parents) in their day-to-day living.

    Human-centred design methods : developing scenarios for robot assisted play informed by user panels and field trials

    This article describes the user-centred development of play scenarios for robot-assisted play, as part of the multidisciplinary IROMEC project that develops a novel robotic toy for children with special needs. The project investigates how robotic toys can become social mediators, encouraging children with special needs to discover a range of play styles, from solitary to collaborative play (with peers, carers/teachers, parents, etc.). This article explains the developmental process of constructing relevant play scenarios for children with different special needs. Results are presented from consultation with a panel of experts (therapists, teachers, parents) who advised on the play needs of the various target user groups and who helped investigate how robotic toys could be used as a play tool to assist in the children's development. Examples from experimental investigations are provided which have informed the development of scenarios throughout the design process. We conclude by pointing out the potential benefit of this work to a variety of research projects and applications involving human-robot interaction.

    Action estimation using a theory of mind as applied on the humanoid robot SURALP

    Explanations of human consciousness have existed for a very long time; Theory of Mind (ToM) is one of the contemporary ones. This theory states that humans have specialized brain regions for understanding the beliefs and intentions of others, and an inherent ability to make inferences from visual data once an action is observed. Understanding and anticipating human actions based on visual data can thus be explained in the context of ToM: it is proposed that a specialized brain region estimates the intentions of others from the observed movements of an actor. This region possesses a Forward Model (FM) which simulates the consequences of intentions; simulated intentions are compared with observed movements to estimate the actor's action. This thesis implements such an action estimation model on a humanoid robot platform. Doing so requires a computational model of the part of the human brain which estimates intentions, and one such model has been proposed in the literature: it provides an algorithm explaining how an FM can be used within a loop for action estimation. The motivation for the implementation is twofold: to program a humanoid robot platform so that it anticipates the movements of a human actor in order to assist him/her, and to provide a platform on which ToM accounts of action estimation can be tested. In this thesis the implementation is carried out on SURALP (Sabanci University ReseArch Laboratory Platform), with a Kinect as the visual input device. Various tests, which probe the capabilities and limitations of the computational model, were completed successfully.
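    The FM-in-a-loop idea above can be sketched in a few lines: simulate the trajectory each candidate intention would produce, then pick the intention whose simulation best matches the observed movement. The goal positions, the straight-line "reach" dynamics, and all names below are illustrative assumptions, not the thesis's actual model.

```python
import numpy as np

# Candidate intentions and the 2-D goal position each would reach for
# (hypothetical values for illustration).
GOALS = {"reach_cup": np.array([0.4, 0.2]),
         "reach_phone": np.array([0.1, 0.5])}

def forward_model(start, goal, steps=10):
    """Toy forward model: simulate a straight-line reach toward a goal."""
    return np.linspace(start, goal, steps)

def estimate_intention(observed):
    """Compare each intention's simulated trajectory against the
    observed one; return the intention with the smallest mean error."""
    start = observed[0]
    errors = {name: np.mean(np.linalg.norm(
                  forward_model(start, g, len(observed)) - observed, axis=1))
              for name, g in GOALS.items()}
    return min(errors, key=errors.get)

# Observed hand trajectory heading toward the cup, with small noise
# standing in for Kinect measurement error.
rng = np.random.default_rng(2)
obs = np.linspace([0.0, 0.0], [0.4, 0.2], 10) + rng.normal(0, 0.01, (10, 2))
print(estimate_intention(obs))  # → reach_cup
```

    In the actual SURALP setting the forward model would simulate full body dynamics rather than straight lines, but the compare-and-select loop has this shape.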

    How technology applied to music-therapy and sound-based activities addresses motor and social skills in autistic children

    Autism affects how people perceive and make sense of the world around them. Autism is a spectrum condition which impacts people in different ways. Also referred to as Autism Spectrum Disorder (ASD), it is characterized by challenges in the domains of social, cognitive and motor functioning, which differ in severity. Previous research suggests that music can have cognitive, psychosocial, behavioural, and motor benefits in this population. We systematically review the use of technology in music therapy and related sound-based activities to improve the motor and social skills of autistic children. In May 2020 we conducted a systematic search on music therapy and musical activities for autistic children in research databases including Science Direct, APA PsycNet, Cochrane, IEEE and Web of Science, to collect relevant studies. We initially collected 5179 papers, of which only 27 studies were identified as suitable for the scope of this review. In the paper, we analyse and describe key characteristics of each project. We then highlight the commonalities, strengths and limitations of existing work, and identify implications for future interaction design.

    Studying Eye Gaze of Children with Autism Spectrum Disorders in Interaction with a Social Robot

    Children with Autism Spectrum Disorders (ASD) experience deficits in verbal and nonverbal communication skills, including motor control, emotional facial expressions, and eye gaze attention. In this thesis, we focus on studying the feasibility and effectiveness of using a social robot, called NAO, to model and improve the social responses and behaviors of children with autism, through two protocols designed and developed for this purpose. Since eye contact and gaze responses are important non-verbal cues in human social communication, and since the majority of individuals with ASD have difficulty regulating their gaze responses, this thesis mostly focuses on this area. In Protocol 1, which analyzes eye gaze duration and shifting frequency, we designed two social games (NAO Spy and Find the Suspect) and recruited 21 subjects (14 ASD and seven Typically Developing (TD) children) between 7 and 17 years old to interact with NAO. All sessions were recorded with cameras and the videos were used for analysis. In particular, we manually annotated the children's eye gaze direction (gaze averted '0' or gaze at robot '1') in every frame of the videos within two social contexts (child speaking and child listening). Gaze fixation and gaze-shifting frequency were analyzed, and both patterns improved or changed significantly: more than half of the participants increased their eye-contact duration and decreased their gaze shifting during both games. The results confirm that the TD group shows more gaze fixation while listening (71%) than while speaking (37%), whereas there is no significant difference between the average gaze fixations of the ASD group across the two contexts. Beyond these statistical measures, we statistically modeled the gaze responses of both groups using Markov models, namely the Hidden Markov Model (HMM) and the Variable-order Markov Model (VMM). Markov-based modeling allows us to analyze the sequence of gaze directions of the ASD and TD groups in the two conversational contexts (child speaking and child listening). Our experiments show that for the child-speaking segments, an HMM can distinguish the gaze patterns of the TD and ASD groups accurately (79%). In addition, to evaluate the effect of gaze history on gaze responses, the VMM technique was employed to model sequences of different lengths. The VMM results demonstrate that, in general, a first-order model (VMM with order D=1) can reliably represent the differences between the gaze patterns of the TD and ASD groups, and that the VMM is more reliable and accurate at modeling the gaze responses of child-listening sessions than child-speaking ones. Protocol 2 contains five sub-sessions targeting the intervention of different social skills: verbal communication, joint attention, eye gaze attention, and facial expression recognition/imitation. The objective of this protocol is to provide intervention sessions based on the needs of children diagnosed with ASD. Each participant therefore attended three baseline sessions at the start of the study to evaluate his/her existing social skills and behavioral responses. The behavioral responses of every child were recorded in each intervention session, with feedback focused on improving the social skills the child lacked: for example, if a child was not good at recognizing facial expressions, we gave feedback on what each facial expression looks like and asked the child to recognize the expressions correctly, while giving no feedback on other social skills. Our experimental results show that customizing the human-robot interaction in this way improves the social skills of children with ASD.
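    The first-order case described above (VMM with D=1) can be sketched as follows, using synthetic 0/1 gaze sequences in place of the study's annotated video frames; the transition probabilities and sequence lengths are invented for illustration, not taken from the thesis.

```python
import numpy as np

def fit_markov(seqs, k=2, alpha=1.0):
    """Estimate a first-order transition matrix from 0/1 gaze sequences
    (0 = gaze averted, 1 = gaze at robot), with Laplace smoothing."""
    counts = np.full((k, k), alpha)
    for s in seqs:
        for a, b in zip(s[:-1], s[1:]):
            counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def log_likelihood(seq, T):
    """Log-probability of a gaze sequence under a transition matrix."""
    return sum(np.log(T[a, b]) for a, b in zip(seq[:-1], seq[1:]))

rng = np.random.default_rng(0)
def sample(T, n=200):
    s = [1]
    for _ in range(n - 1):
        s.append(rng.choice(2, p=T[s[-1]]))
    return s

# Hypothetical generating processes: "TD-like" sequences hold fixation
# longer; "ASD-like" sequences shift gaze more often.
T_td  = np.array([[0.7, 0.3], [0.1, 0.9]])
T_asd = np.array([[0.5, 0.5], [0.4, 0.6]])
td_model  = fit_markov([sample(T_td)  for _ in range(10)])
asd_model = fit_markov([sample(T_asd) for _ in range(10)])

# Classify a held-out sequence by which fitted model explains it better.
test_seq = sample(T_td)
pred = ("TD" if log_likelihood(test_seq, td_model)
        > log_likelihood(test_seq, asd_model) else "ASD")
print(pred)  # → TD
```

    The HMM used in the thesis adds a hidden state layer on top of this; the classify-by-likelihood step is the same.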

    Nyku: A Social Robot for Children With Autism Spectrum Disorders

    The continued growth of Autism Spectrum Disorders (ASD) around the world has spurred the development of new therapeutic methods to increase the positive outcomes of an ASD diagnosis. It is agreed that early detection and intervention lead to greatly improved outcomes for individuals living with the disorder. Among these new therapeutic methods, Robot-Assisted Therapy (RAT) has become a hot area of study. Recent works have shown that high-functioning ASD children have an affinity for interacting with robots rather than humans, proposed to be because a robotic system presents a less complex set of communication modes than the complex non-verbal communication present in human-to-human interaction. As such, the Computer Vision and Robotics Lab at the University of Denver has embarked on developing a social robot for children with ASD. This thesis presents the design of this social robot, Nyku (Figure 1). It begins with an investigation of what the needs of ASD children are, what existing therapies help with, and what roles, if any, a robot can play in these treatment plans. From the literature examined, it is clear that robots designed specifically for ASD children share a core set of goals, despite the varied nature of the disorder's spectrum. These goals aim to reduce the stress of non-verbal communication that may occur during standard therapies, as well as to provide capabilities that reinforce typical areas of weakness in an ASD person's social repertoire, such as posture mimicry and eye contact. A goal of this thesis is to show the methodology behind arriving at these design goals so that future designers may follow and improve upon them. Nyku's hardware and software design requirements draw from this foundation. Using this needs-first design methodology allows for informed design, such that the final product is actually useful to the ASD population.
In this work, the information collected is used to design Nyku's mechanical components: the body, the neck & head, and the omni-wheel base. As with all robots, the mechanical needs then spawn electronics requirements, which are in turn presented, and the control architecture that ties these systems together is coded. Notably, this thesis results in a novel kinematic model of the spherical manipulation system present in the omni-wheel base. This solution is presented in detail, along with the testing conducted to ensure the model's accuracy. To complete the thesis, overall progress on Nyku is summarized alongside suggestions for continuing the work: the engineering work is compared against the design goals it tries to fulfill, to ensure that the work has stayed on track, and future steps needed to optimize Nyku for reliable performance during therapeutic sessions are mapped out. Finally, a therapeutic plan is proposed, given Nyku's hardware capabilities and the needs of ASD children, against the background of modern therapeutic methods.
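    The thesis's novel model for the spherical system is not reproduced here, but the flavor of omni-wheel kinematics can be shown with the standard inverse kinematics of a conventional three-omni-wheel base: map a desired body velocity to individual wheel speeds. The wheel radius, spacing, and mounting angles below are assumed values, not Nyku's.

```python
import numpy as np

def wheel_speeds(vx, vy, omega, R=0.05, L=0.15,
                 wheel_angles=(0.0, 2 * np.pi / 3, 4 * np.pi / 3)):
    """Inverse kinematics of a three-omni-wheel base: convert a desired
    body velocity (vx, vy, in m/s) and rotation rate omega (rad/s) into
    wheel angular velocities (rad/s). R = wheel radius, L = distance
    from the platform centre to each wheel."""
    speeds = []
    for a in wheel_angles:
        # Project the body velocity onto this wheel's rolling direction,
        # then add the rotational contribution.
        v_wheel = -np.sin(a) * vx + np.cos(a) * vy + L * omega
        speeds.append(v_wheel / R)
    return np.array(speeds)

# Sanity check: pure rotation turns all three wheels at the same rate.
print(wheel_speeds(0.0, 0.0, 1.0))  # → [3. 3. 3.]
```

    For pure translation the three wheel speeds sum to zero, since the mounting directions are spaced 120° apart; Nyku's spherical system adds a second layer mapping such wheel motions onto rotations of the driven sphere.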

    Charlie: A New Robot Prototype for Improving Communication and Social Skills in Children with Autism and a New Single-point Infrared Sensor Technique for Detecting Breathing and Heart Rate Remotely

    This research delivers a new interactive game-playing robot named CHARLIE and a novel technique for remotely detecting breathing and heart rate using a single-point thermal infrared (IR) sensor. The robot is equipped with a head, two arms (each with two degrees of freedom), and a camera. We trained a human-hands classifier and used it along with a standard face classifier to create two autonomous interactive games: single-player ("Imitate Me, Imitate You") and two-player ("Pass the Pose"). Further, we developed and implemented a suite of new interactive games in which the robot is teleoperated by remote control. Each of these features has been tested and validated through a field study including eight children diagnosed with autism and speech delays. Results from that study show that significant improvements in speech and social skills can be obtained when using CHARLIE with the methodology described herein. Moreover, gains in communication and social interaction are observed to generalize from child-to-robot to co-present others through the scaffolding of communication skills with the systematic approach developed for the study. Additionally, we present a new IR system that continuously targets the sub-nasal region of the face and measures subtle temperature changes corresponding to breathing and cardiac pulse. This research makes four novel contributions: (1) a low-cost, field-tested robot for use in autism therapy, (2) a suite of interactive robot games, (3) a hand classifier created for performing hand detection during the interactive games, and (4) an IR sensor system which remotely collects temperatures and computes breathing and heart rate. The interactive robot CHARLIE is physically designed to be aesthetically appealing to young children between three and six years of age: the hard wood-and-metal robot body is covered with a bright green, fuzzy material and additional padding so that it appears toylike and soft.
Additionally, several structural features were included to ensure safety during interactive play and to enhance the robustness of the robot. Because children with autism spectrum disorder (ASD) often enjoy exploring new or interesting objects with their hands, the robot must be able to withstand a moderate amount of physical manipulation without causing injury to the child or damaging the robot or its components. CHARLIE plays five distinct interactive games that are designed to be entertaining to young children, appeal to children of varying developmental ability, and promote increased speech and social skill through imitation and turn-taking. Turning to remote breathing and heart rate detection: stress is a compounding factor in autism therapy which can inhibit progress toward specific therapeutic goals. The ability to non-invasively detect physical indicators of increasing stress, especially when they can be correlated to specific activities and measured in terms of length and frequency, can relay important metrics about the antecedents that cause stress for a particular child and can be used to help automate the evaluation of a child's progress between sessions. Further, collecting and measuring critical physiological indicators such as breathing and heart rate can enable robots to adjust their behavior based on the perceived emotional, psychological or physical state of their user. The utility and acceptance of robots can be further increased when they are able to learn typical physiological patterns and use them as a baseline for identifying anomalies or possible warning signs of problems in their human users. We present a new technique for remotely collecting and analyzing breathing and heart rates in real time using an autonomous, low-cost infrared (IR) sensor system.
This is accomplished by continuously targeting a high-precision IR sensor at the sub-nasal region, tracking changes in skin-surface temperature, and employing a sinusoidal curve-fitting function, the Fast Fourier Transform (FFT), and the Discrete Wavelet Transform (DWT) to extract the breathing and heart rate from the recorded temperatures.
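    The FFT stage of such a pipeline can be sketched as follows, using a synthetic sub-nasal temperature trace in place of real IR readings; the sampling rate, baseline drift, breathing frequency, and noise level are all assumed values for illustration.

```python
import numpy as np

fs = 10.0                      # assumed sensor sampling rate, Hz
t = np.arange(0, 60, 1 / fs)   # one minute of readings
f_breath = 0.25                # ground-truth breathing rate: 15 breaths/min

# Synthetic sub-nasal temperature: slow baseline drift + a breathing
# oscillation + sensor noise (stands in for real IR measurements).
temp = (34.0 + 0.01 * t + 0.3 * np.sin(2 * np.pi * f_breath * t)
        + 0.05 * np.random.default_rng(1).normal(size=t.size))

# Remove the linear drift, take the real FFT, and pick the dominant
# frequency within a plausible breathing band (0.1-0.7 Hz, i.e.
# roughly 6-42 breaths per minute).
x = temp - np.polyval(np.polyfit(t, temp, 1), t)
freqs = np.fft.rfftfreq(x.size, d=1 / fs)
spectrum = np.abs(np.fft.rfft(x))
band = (freqs >= 0.1) & (freqs <= 0.7)
breath_hz = freqs[band][np.argmax(spectrum[band])]
print(round(breath_hz * 60))   # → 15 breaths per minute
```

    Heart rate can be recovered the same way from a higher frequency band (roughly 0.8-3 Hz); the curve-fitting and DWT steps mentioned in the abstract refine this basic spectral estimate.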