    Video prototyping of dog-inspired non-verbal affective communication for an appearance constrained robot

    This paper presents results from a video human-robot interaction (VHRI) study in which participants viewed a video of an appearance-constrained Pioneer robot using dog-inspired affective cues to communicate affinity and relationship with its owner and a guest, through proxemics, body movement and orientation, and camera orientation. The findings suggest that even with the limited modalities for non-verbal expression offered by a Pioneer robot, which does not have a dog-like appearance, these cues were effective for non-verbal affective communication.

    Developing an engagement and social interaction model for a robotic educational agent

    Effective educational agents should accomplish four essential goals during a student's learning process: 1) monitor engagement, 2) re-engage when appropriate, 3) teach novel tasks, and 4) improve retention. In this dissertation, we address all of these objectives through the use of a teaching device (computer, tablet, or virtual reality game) and a robotic educational agent. We begin by developing and validating an engagement model based on the interactions between the student and the teaching device. This model uses time, performance, and/or eye gaze to determine the student's level of engagement. We then create a framework for implementing verbal and nonverbal (gestural) behaviors on a humanoid robot and evaluate how it is perceived and how effective it is for social interaction. These verbal and nonverbal behaviors are applied throughout the learning scenario to re-engage the students when the engagement model deems it necessary. Finally, we describe and validate the complete educational system, which uses the engagement model to activate the behavioral strategies embedded in the robot when learning a new task, and we follow up this study to evaluate student retention when using the system. The outcome of this research is an educational system that effectively monitors student engagement, applies behavioral strategies, teaches novel tasks, and improves student retention to achieve individualized learning.
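    The abstract does not specify how the engagement model combines time, performance, and eye gaze; the sketch below is one plausible illustration of such a model, with invented feature names, weights, and thresholds, not the dissertation's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class InteractionSample:
    """One observation of the student while using the teaching device."""
    seconds_since_last_action: float  # idle time on the current task
    task_accuracy: float              # fraction of recent answers correct, in [0, 1]
    gaze_on_screen: float             # fraction of time gaze was on the device, in [0, 1]

def engagement_score(sample: InteractionSample) -> float:
    """Combine time, performance, and eye gaze into a single score in [0, 1].

    The weights and the 60-second idle normalisation are illustrative assumptions.
    """
    idle_penalty = min(sample.seconds_since_last_action / 60.0, 1.0)
    timeliness = 1.0 - idle_penalty
    return 0.4 * timeliness + 0.3 * sample.task_accuracy + 0.3 * sample.gaze_on_screen

def should_reengage(sample: InteractionSample, threshold: float = 0.5) -> bool:
    """Trigger the robot's re-engagement behaviors when the score drops below a threshold."""
    return engagement_score(sample) < threshold
```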

    User Experience Design and Evaluation of Persuasive Social Robot As Language Tutor At University: Design And Learning Experiences From Design Research

    Human-Robot Interaction (HRI) is a developing field in which research and innovation are progressing. One domain on which HRI research has focused is education. Various studies have designed social robots with design guidelines derived from user preferences, context, and technology to help students and teachers improve their learning and teaching experience. Language learning has become popular in education as students gain opportunities to study subjects of interest in any language at their preferred universities around the world, which motivates research on using social robots for language learning and teaching. In this context, this thesis explores the design of a language tutoring robot for students learning Finnish at university. In language learning, motivation, the learning experience, context, and user preferences are important considerations. The thesis focuses on students learning Finnish through a language tutoring social robot at Tampere University. A design research methodology is used to design a persuasive language tutoring social robot that teaches Finnish to international students at Tampere University, and to derive design guidelines and a future tutoring robot design together with their benefits. Elias Robot, a language tutoring application designed by Curious Technologies, a Finnish EdTech company, was used in an explorative user study. The study combined the Pepper social robot with the Elias Robot application running on a mobile device, and was conducted at the university with three male and four female student participants. Its aim was to gather design requirements based on learning experiences with a social robot tutor. Building on the findings of this study and of the design research, the future language tutoring social robot was co-created in a co-design workshop. Drawing on the field study, the user study, technology acceptance model findings, design research findings, and student interviews, the persuasive social robot language tutor was designed. The findings revealed that all multimodalities are required for efficient tutoring by persuasive social robots, and that social robots foster students' motivation to learn the language. The design implications are discussed, and the design of the social robot tutor is presented through design scenarios.

    Conveying Audience Emotions through Humanoid Robot Gestures to an Orchestra during a Live Musical Exhibition

    In the last twenty years, robotics has been applied in many heterogeneous contexts. Among them, the use of humanoid robots during musical concerts has been proposed and investigated by many authors. In this paper, we propose a contribution in the area of robotics applied to music: a system for conveying audience emotions to an orchestra during a live musical exhibition by means of a humanoid robot. In particular, we provide all spectators with a mobile app through which they can select a specific color while listening to a piece of music (act). Each color is mapped to an emotion, and the audience preferences are then processed in order to select the next act to be played. This decision, based on the overall emotion felt by the audience, is then communicated by the robot to the orchestra through body gestures. Our first results show that spectators enjoy this kind of interactive musical performance; these results are encouraging for further investigation.
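    As an illustration of the vote-aggregation step described above, the following is a minimal sketch; the color-to-emotion mapping and the act names are invented placeholders, not the mapping used by the authors.

```python
from collections import Counter

# Illustrative color-to-emotion mapping (assumed, not the authors' actual mapping).
COLOR_TO_EMOTION = {
    "red": "excitement",
    "blue": "calm",
    "yellow": "joy",
    "purple": "melancholy",
}

# Hypothetical repertoire: each candidate act is tagged with the emotion it expresses.
ACT_BY_EMOTION = {
    "excitement": "Act II - Allegro",
    "calm": "Act III - Adagio",
    "joy": "Act IV - Rondo",
    "melancholy": "Act V - Lento",
}

def select_next_act(audience_colors: list[str]) -> tuple[str, str]:
    """Map each spectator's color choice to an emotion, take the majority emotion,
    and return it together with the act the robot should announce to the orchestra."""
    emotions = [COLOR_TO_EMOTION[c] for c in audience_colors if c in COLOR_TO_EMOTION]
    overall_emotion, _ = Counter(emotions).most_common(1)[0]
    return overall_emotion, ACT_BY_EMOTION[overall_emotion]

if __name__ == "__main__":
    votes = ["red", "blue", "red", "yellow", "red"]
    emotion, act = select_next_act(votes)
    print(f"Audience emotion: {emotion} -> next act: {act}")
```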

    Robot Assistive Therapy Strategies for Children with Autism

    Background: Autism spectrum disorder (ASD) is a category of neurodevelopmental disorders characterized by persistent deficits in social communication and social interaction across multiple contexts, as well as restricted, repetitive patterns of behaviour, interests, or activities. Social robots offer clinicians new ways to interact and work with people with ASD. Robot-Assisted Training (RAT) is a growing body of research in HRI, which studies how robots can assist and enhance human skills during a task-centred interaction. RAT systems have a wide range of applications for children with ASD. Aims: In a pilot RCT with an experimental group and a control group, the research aims will be: to assess group differences in repetitive and maladaptive behaviours (RMBs), affective states, and task performance across sessions and within each group; to assess the perception of family relationships in the two groups before and after robot interaction; and to develop a robotic app capable of running Raven's Progressive Matrices (RPM), a test typically used to measure general human intelligence, comparing the accuracy of the data captured by the robot with that of the test administered by psychologists. Material and Methods: Patients with a mild or moderate level of ASD will be enrolled in the study, which will last 3 years. The sample size is 60 patients (30 in the experimental group and 30 in the control group), as indicated by an evaluation of the estimated enrolment time. Inclusion criteria will be the following: eligibility of children confirmed using the Autism Diagnostic Observation Schedule-2; age ≄ 7 years; clinician judgment during a clinical psychology evaluation; written parental consent approved by the local ethics committee. The study will be conducted over 10 weeks for each participant, with the pretest and post-test conducted during the first and last weeks of the study. The training will be provided over the intermediate eight weeks, with one session per week, for a total of 8 sessions. Baseline and follow-up evaluations include: the socioeconomic status of families, assessed using the Hollingshead scale; the Social Communication Questionnaire (SCQ), used to screen communication skills and social functioning in children with ASD; the Vineland Adaptive Behavior Scales, 2nd edition (VABS), used to assess children's capabilities in dealing with everyday life; and the severity and variety of children's repetitive behaviours, assessed using the Repetitive Behavior Scale-Revised (RBS-R). Moreover, the perception of family relationships will be assessed with the Portfolio for the validation of parental acceptance and refusal (PARENTS). Expected Results: 1) improved communication skills; 2) reduced repetitive and maladaptive behaviours; 3) a more positive perception of family relationships; 4) improved performance. Conclusions: Robot-Assisted Training aims to train and enhance users' physical or cognitive skills through interaction, rather than to assist users in completing a task; its target is therefore to enhance user performance by providing personalized and targeted assistance that maximizes training and learning effects. Robotic systems can be used to manage therapy sessions, gather and analyse data such as interactions with the patient, and generate useful information in the form of reports and graphs; they are thus a powerful tool for the therapist to check the patient's progress and facilitate diagnosis.

    Conversational affective social robots for ageing and dementia support

    Socially assistive robots (SAR) hold significant potential to assist older adults and people with dementia in human engagement and clinical contexts by supporting mental health and independence at home. While SAR research has recently experienced prolific growth, long-term trust, clinical translation and patient benefit remain immature. Affective human-robot interactions are unresolved, and the deployment of robots with conversational abilities is fundamental for robustness and human-robot engagement. In this paper, we review the state of the art within the past two decades, design trends, and current applications of conversational affective SAR for ageing and dementia support. A horizon scan of AI voice technology for healthcare, including ubiquitous smart speakers, is further introduced to address current gaps inhibiting home use. We discuss the role of user-centred approaches in the design of voice systems, including the capacity to handle communication breakdowns for effective use by target populations. We summarise the state of development in interactions using speech and natural language processing, which forms a baseline for longitudinal health monitoring and cognitive assessment. Drawing from this foundation, we identify open challenges and propose future directions to advance conversational affective social robots for: 1) user engagement, 2) deployment in real-world settings, and 3) clinical translation.

    Applications of Robotics for Autism Spectrum Disorder: a Scoping Review

    Robotic therapies are receiving growing interest in the autism field, especially for improving children's social skills and enhancing traditional human interventions. In this work, we conduct a scoping review of the literature on robotics for autism, providing the largest review of this field over the last five years. Our work underlines the need to better characterize participants and to increase sample sizes. It is also important to develop homogeneous training protocols so that results can be analysed and compared. Nevertheless, 7 out of the 10 randomized controlled trials reported a significant impact of robotic therapy. Overall, robot autonomy, adaptability and personalization, as well as more standardized outcome measures, were identified as the most critical issues to address in future research.

    The use of social robots with children and young people on the autism spectrum: A systematic review and meta-analysis

    Background: Robot-mediated interventions show promise in supporting the development of children on the autism spectrum. Objectives: In this systematic review and meta-analysis, we summarize key features of the available evidence on robot interventions for children and young people on the autism spectrum aged up to 18 years old, and consider their efficacy for specific domains of learning. Data sources: PubMed, Scopus, EBSCOhost, Google Scholar, Cochrane Library, ACM Digital Library, and IEEE Xplore. Grey literature was also searched using PsycExtra, OpenGrey, British Library EThOS, and the British Library Catalogue. Databases were searched from inception until April 6th, 2021. Synthesis methods: Searches undertaken across seven databases yielded 2145 articles. Forty studies met our review inclusion criteria, of which 17 were randomized controlled trials. The methodological quality of studies was assessed with the Quality Assessment Tool for Quantitative Studies. A narrative synthesis summarised the findings, and a meta-analysis was conducted with 12 RCTs. Results: Most interventions used humanoid robotic platforms (67%), were predominantly based in clinics (37%), followed by home, school and laboratory environments (17% each), and targeted improving social and communication skills (77%). Focusing on the most common outcomes, a random-effects meta-analysis of RCTs showed that robot-mediated interventions significantly improved social functioning (g = 0.35 [95% CI 0.09 to 0.61]; k = 7). By contrast, robots did not improve emotional (g = 0.63 [95% CI -1.43 to 2.69]; k = 2) or motor outcomes (g = -0.10 [95% CI -1.08 to 0.89]; k = 3), but the numbers of trials were very small. Meta-regression revealed that age accounted for almost one-third of the variance in effect sizes, with greater benefits being found in younger children. Conclusions: Overall, our findings support the use of robot-mediated interventions for autistic children and youth, and we propose several recommendations for future research to aid learning and enhance implementation in everyday settings. PROSPERO registration: Our methods were preregistered in the PROSPERO database (CRD42019148981).
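    For readers unfamiliar with how such pooled effects are computed, the sketch below implements a standard DerSimonian-Laird random-effects pooling of study effect sizes; the input values are invented for illustration and are not the review's data.

```python
import math

def random_effects_pool(effects: list[float], variances: list[float]) -> tuple[float, tuple[float, float]]:
    """Pool study effect sizes (e.g. Hedges' g) with a DerSimonian-Laird random-effects model.

    Returns the pooled effect and its 95% confidence interval.
    """
    k = len(effects)
    w = [1.0 / v for v in variances]                              # fixed-effect weights
    fixed = sum(wi * g for wi, g in zip(w, effects)) / sum(w)
    q = sum(wi * (g - fixed) ** 2 for wi, g in zip(w, effects))   # heterogeneity statistic Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                            # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]                # random-effects weights
    pooled = sum(wi * g for wi, g in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Invented example data for three hypothetical trials (not the review's studies).
print(random_effects_pool([0.20, 0.45, 0.38], [0.04, 0.06, 0.05]))
```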

    Gestures in human-robot interaction

    Gestures consist of movements of body parts and are a means of communication that conveys information or intentions to an observer. Therefore, they can be used effectively in human-robot interaction, or more generally in human-machine interaction, as a way for a robot or a machine to infer meaning. In order for people to use gestures intuitively and to understand gestures performed by robots, it is necessary to define mappings between gestures and their associated meanings -- a gesture vocabulary. A human gesture vocabulary defines which gestures a group of people would intuitively use to convey information, while a robot gesture vocabulary defines which robot gestures are deemed fitting for a particular meaning. Effective use of these vocabularies depends on gesture recognition, that is, the classification of body motion into discrete gesture classes using pattern recognition and machine learning. This thesis addresses both research areas, presenting the development of gesture vocabularies as well as gesture recognition techniques, with a focus on hand and arm gestures. As a prerequisite for intuitive human-robot interaction and a precursor to gesture recognition, attentional models for humanoid robots were developed. A method for defining gesture vocabularies for humans and robots, based on user observations and surveys, is explained, and experimental results are presented. As a result of the robot gesture vocabulary experiment, an evolutionary approach for the refinement of robot gestures is introduced, based on interactive genetic algorithms. A robust and well-performing gesture recognition algorithm based on dynamic time warping has been developed. Most importantly, it employs one-shot learning, meaning that it can be trained with a small number of training samples and used in real-life scenarios, lowering the effect of environmental constraints and gesture features. Finally, an approach for learning the relation between self-motion and pointing gestures is presented.
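    As a rough illustration of the recognition approach summarised above (dynamic time warping combined with one-shot, nearest-template classification), the sketch below is a simplified assumption, not the thesis's actual implementation; the trajectory data and class labels are invented.

```python
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Dynamic time warping distance between two gesture trajectories.

    Each trajectory has shape (T, D): T time steps of D-dimensional joint or hand positions.
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m])

def classify(gesture: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """One-shot nearest-template classification: a single stored example per gesture class."""
    return min(templates, key=lambda label: dtw_distance(gesture, templates[label]))

# Hypothetical usage: one template recorded per class, then used to label a new observation.
rng = np.random.default_rng(0)
templates = {"wave": rng.random((30, 3)), "point": rng.random((25, 3))}
observed = templates["wave"] + 0.01 * rng.standard_normal((30, 3))
print(classify(observed, templates))  # expected to print "wave"
```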