40 research outputs found

    Video prototyping of dog-inspired non-verbal affective communication for an appearance constrained robot

    Original article can be found at: http://ieeexplore.ieee.org
    This paper presents results from a video human-robot interaction (VHRI) study in which participants viewed a video of an appearance-constrained Pioneer robot using dog-inspired affective cues to communicate affinity and relationship with its owner and a guest through proxemics, body movement and orientation, and camera orientation. The findings suggest that even with the limited modalities for non-verbal expression offered by a Pioneer robot, which does not have a dog-like appearance, these cues were effective for non-verbal affective communication.

    Constraints in the design of activities focusing on emotion recognition for children with ASD using robotic tools

    The Robótica-Autismo project, presented in this paper, aims to identify the main aspects to be considered when working with robots and children with ASD (Autism Spectrum Disorders). Several constraints are identified, such as the type of robot, the type of skills that should be developed, the criteria for inclusion in and exclusion from the target group, which procedures should be followed during the sessions, and how to analyze the obtained results. In the end, a well-established methodology is achieved in order to accomplish the goal of using a robot as a mediator between children with ASD and other human partners. The authors are grateful to the Portuguese Foundation for Science and Technology, FCT - Fundação para a Ciência e a Tecnologia, for funding through the R&D project reference RIPD/ADAlI09407/2009 and the scholarship SFRH/BD/71600/2010. This work is also supported by a QREN initiative, from UE/FEDER (Fundo Europeu de Desenvolvimento Regional) funds, through the "Programa Operacional Factores de Competitividade - COMPETE".

    What could assistance robots learn from assistance dogs?

    These studies are part of our broader project that aims at revealing relevant aspects of human-dog interactions, which could help to develop and test robot social behaviour. We suggest that the cooperation between assistance dogs and their disabled owners could serve as a model for designing successful assistance robot–human interactions. In Study 1, we analysed the behaviour of 32 assistance dog–owner dyads performing a fetch-and-carry task. In addition to important typical behaviours (attracting attention, eye-contact, comprehending pointing gestures), we found differences depending on how experienced the dyad was and whether or not the owner used a wheelchair. In Study 2, we investigated the reactions of a subsample of dogs to unforeseen difficulties during a retrieving task. We revealed different types of communicative and displacement behaviours; importantly, the dogs showed a strong commitment to executing the insoluble task, or at least their behaviours lent them a “busy” appearance, which can attenuate the owners’ disappointment. We suggest that assistance robots should communicate their inability to solve a problem using simple behaviours (non-verbal vocalisation, orientation alternation), and/or could show displacement behaviours rather than simply not performing the task. In sum, we propose that assistance dogs’ communicative behaviours and problem-solving strategies could inspire the development of the relevant functions and social behaviours of assistance robots.

    Assistive Robot with an AI-Based Application for the Reinforcement of Activities of Daily Living: Technical Validation with Users Affected by Neurodevelopmental Disorders

    In this work, we present the first technical validation study of an assistive robotic platform designed to assist people with neurodevelopmental disorders. The platform, called LOLA2, is equipped with an artificial intelligence-based application to reinforce the learning of activities of daily living in people with neurodevelopmental problems. LOLA2 integrates a ROS-based navigation system and a user interface through which healthcare professionals and their patients interact with it. Technically, we have embedded all these modules into an NVIDIA Jetson Xavier board, together with an artificial intelligence agent for online action detection (OAD). The OAD approach provides a detailed report on how well users perform a set of daily life activities that they are learning or reinforcing. The entire human–robot interaction process for working with users with neurodevelopmental disorders was designed by a multidisciplinary team. Its main features include the ability to control the robot with a joystick, a graphical user interface that shows video tutorials for the activities to be reinforced or learned, and the ability to monitor users' progress as they complete tasks. The main objective of the assistive robotic platform LOLA2 is to provide a system that allows therapists to track how well users understand and perform daily tasks. This paper focuses on the technical validation of the proposed platform and its application. To this end, we carried out a set of tests with four users with neurodevelopmental problems and special physical conditions, under the supervision of the corresponding therapeutic personnel. We present detailed results of all interventions with end users, analyzing the usability, effectiveness, and limitations of the proposed technology.
    During its initial technical validation with real users, LOLA2 detected the actions of users with disabilities with high precision, distinguishing four assigned daily actions accurately, although some actions were more challenging due to the users' physical limitations. Overall, the presence of the robot in the therapy sessions received excellent feedback from medical professionals as well as patients. This study demonstrates that the developed robot is capable of assisting and monitoring people with neurodevelopmental disorders in performing their daily living tasks. This research was funded by project AIRPLANE, with reference PID2019-104323RB-C31, of Spain’s Ministry of Science and Innovation.
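    The online action detection (OAD) pipeline described above can be pictured as a classifier sliding over a live frame stream and accumulating per-activity counts for the therapist's report. The sketch below is a hypothetical illustration, not the authors' system: the real LOLA2 runs a learned OAD model on the Jetson Xavier, whereas here a toy majority-vote rule stands in for the classifier, and the activity names are invented.

```python
from collections import Counter, deque

def classify_window(window):
    """Toy stand-in for a learned OAD model: return the most frequent
    frame-level label in the current window."""
    return Counter(window).most_common(1)[0][0]

def online_action_report(frame_labels, window_size=4):
    """Slide a fixed-size window over a stream of frame labels and count
    detections per activity, mimicking a per-activity progress report."""
    window = deque(maxlen=window_size)
    detections = Counter()
    for label in frame_labels:
        window.append(label)
        if len(window) == window_size:  # classify once the window is full
            detections[classify_window(window)] += 1
    return dict(detections)

# Hypothetical frame stream covering two daily activities.
stream = ["wash_hands"] * 6 + ["brush_teeth"] * 5
print(online_action_report(stream))
```

    In a deployed system the window would hold visual features rather than labels, and the counts would feed the report the therapist reviews after each session.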

    Socially assistive robotics: robot exercise trainer for older adults

    Physical activity has tremendous benefits for older adults. A report from the World Health Organization notes that a lack of physical activity contributes to around 3.2 million premature deaths annually worldwide. Research also shows that regular exercise helps older adults by improving their physical fitness, immune system, sleep, and stress levels, not to mention the many health problems it reduces, such as diabetes, cardiovascular disease, dementia, obesity, and joint pain. The research reported in this paper introduces a Socially Assistive Robot (SAR) that engages, coaches, assesses, and motivates older adults in physical exercises recommended by the National Health Service (NHS) in the UK. With the rise in the population of older adults, which is expected to triple by 2050, this SAR aims to improve the quality of life for a significant proportion of the population. To assess the proposed robot exercise trainer, an observational user evaluation with 17 participants was conducted. Participants were generally happy with the proposed platform as a means of encouraging them to do regular exercise correctly.

    Interactive Robots: Therapy Robots

    Robots are becoming increasingly common in many areas of human life as technology advances. Considering their usage areas, robots appear in a wide range of roles, from entertainment to psychotherapy. In addition to their role in facilitating human life, their use in the health field has recently become quite remarkable. In this study, interactive robots are evaluated in general, and their use in the mental health field is discussed on a large scale. Accordingly, the primary purpose of this study is to examine the need for the development of interactive and therapy robots, their areas of use, and studies on their effectiveness, as well as the therapy robots that are generally accepted in the relevant literature. The examination shows that interactive robots are classified into six groups: social, entertainment, educational, rehabilitation, sex, and therapy robots. In the related literature, Eliza, Woebot, Youper, Wysa, Simsensei Kiosk, Paro, NeCoRo, Kaspar, Bandit, and Pepper have generally been accepted as therapy robots. The results of the reviewed studies demonstrate the effectiveness and usage of interactive therapy robots in therapy for different groups and needs, especially for disadvantaged individuals. On the other hand, more research on the effectiveness of robots is needed. Considering the effects on mental health and quality of life, the use of robots in therapy is important, and its widespread adoption could have a significant positive effect in the field.

    CLARA: Building a Socially Assistive Robot to Interact with Elderly People

    Although the global population is aging, the proportion of potential caregivers is not keeping pace. It is necessary for society to adapt to this demographic change, and new technologies are a powerful resource for achieving this. New tools and devices can help to ease independent living and alleviate the workload of caregivers. Among them, socially assistive robots (SARs), which assist people with social interactions, are an interesting tool for caregivers thanks to their proactivity, autonomy, interaction capabilities, and adaptability. This article describes the different design and implementation phases of a SAR, the CLARA robot, from both a physical and a software point of view, from 2016 to 2022. During this period, the design methodology evolved from traditional approaches based on technical feasibility to user-centered co-creative processes. The cognitive architecture of the robot, CORTEX, keeps its core idea of using an inner representation of the world to enable inter-procedural dialogue between perceptual, reactive, and deliberative modules. However, CORTEX also evolved by incorporating components that use non-functional properties to maximize efficiency through adaptability. The robot has been employed in several projects for different uses in hospitals and retirement homes. This paper describes the main outcomes of the functional and user experience evaluations of these experiments. This work has been partially funded by the EU ECHORD++ project (FP7-ICT-601116), the European Union’s Horizon 2020 Research and Innovation Programme under grant agreement No 825003 (DIH-HERO SUSTAIN), the RoQME and MiRON Integrated Technical Projects funded, in turn, by the EU RobMoSys project (H2020-732410), the project RTI2018-099522-B-C41, funded by the Gobierno de España and FEDER funds, the AT17-5509-UMA and UMA18-FEDERJA-074 projects funded by the Junta de Andalucía, and the ARMORI (CEIATECH-10) and B1-2021_26 projects funded by the University of Málaga.
    Partial funding for open access charge: Universidad de Málaga.

    Comparison of Human Social Brain Activity During Eye-Contact With Another Human and a Humanoid Robot

    Robot design to simulate interpersonal social interaction is an active area of research, with applications in therapy and companionship. Neural responses to eye-to-eye contact in humans have recently been employed to determine the neural systems that are active during social interactions. Whether eye contact with a social robot engages the same neural systems remains an open question. Here, we employ a similar approach to compare human-human and human-robot social interactions. We assume that if human-human and human-robot eye contact elicit similar neural activity in the human, then the perceptual and cognitive processing is also the same for human and robot; that is, the robot is processed similarly to a human. However, if the neural effects differ, then the perceptual and cognitive processing is assumed to differ as well. In this study, neural activity was compared for human-to-human and human-to-robot conditions using near-infrared spectroscopy for neural imaging and a robot (Maki) with eyes that blink and move right and left. Eye contact was confirmed by eye-tracking in both conditions. Increased neural activity was observed in human social systems, including the right temporoparietal junction and the dorsolateral prefrontal cortex, during human-human eye contact but not human-robot eye contact. This suggests that the type of human-robot eye contact used here is not sufficient to engage the right temporoparietal junction in the human. This study establishes a foundation for future research into human-robot eye contact to determine how elements of robot design and behavior impact human social processing within this type of interaction, and it may offer a method for capturing difficult-to-quantify components of human-robot interaction, such as social engagement.

    Artificial Vision Algorithms for Socially Assistive Robot Applications: A Review of the Literature

    Today, computer vision algorithms are very important for many fields and applications, such as closed-circuit television security, health-status monitoring, recognizing a specific person or object, and robotics. On this topic, the present paper provides a recent review of the literature on computer vision algorithms (recognition and tracking of faces, bodies, and objects) oriented towards socially assistive robot applications. The performance, frames-per-second (FPS) processing speed, and hardware used to run the algorithms are highlighted by comparing the available solutions. Moreover, this paper provides general information for researchers interested in knowing which vision algorithms are available, enabling them to select the one most suitable for inclusion in their robotic system applications. Beca Conacyt Doctorado No. de CVU: 64683.
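    Benchmarking the surveyed algorithms typically comes down to timing a detector over a fixed batch of frames and reporting throughput. The snippet below is a generic sketch, not tied to any particular algorithm from the review: `detect` is an invented placeholder standing in for a real face/body/object detector, and the synthetic "frames" are just lists of numbers.

```python
import time

def detect(frame):
    """Placeholder workload standing in for a real detection pass
    (e.g. a cascade classifier or a CNN forward pass)."""
    return sum(frame) % 2 == 0

def measure_fps(frames, detector):
    """Run the detector over every frame and return throughput in FPS."""
    start = time.perf_counter()
    for frame in frames:
        detector(frame)
    elapsed = time.perf_counter() - start
    return len(frames) / elapsed if elapsed > 0 else float("inf")

frames = [list(range(1000)) for _ in range(200)]  # synthetic "frames"
print(f"{measure_fps(frames, detect):.0f} FPS")
```

    Reported FPS figures only make sense alongside the hardware used, which is why the review pairs each algorithm's speed with its execution platform.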

    Applications of Affective Computing in Human-Robot Interaction: state-of-art and challenges for manufacturing

    The introduction of collaborative robots aims to make production more flexible, promoting greater interaction between humans and robots, also from a physical point of view. However, working closely with a robot may create stressful situations for the operator, which can negatively affect task performance. In Human-Robot Interaction (HRI), robots are expected to be socially intelligent, i.e., capable of understanding and reacting appropriately to human social and affective cues. This ability can be achieved by implementing affective computing, which concerns the development of systems able to recognize, interpret, process, and simulate human affects. Social intelligence is essential for robots to establish natural interaction with people in several contexts, including the manufacturing sector with the emergence of Industry 5.0. In order to take full advantage of human-robot collaboration, the robotic system should be able to perceive the psycho-emotional and mental state of the operator through different sensing modalities (e.g., facial expressions, body language, voice, or physiological signals) and adapt its behaviour accordingly. The development of socially intelligent collaborative robots in the manufacturing sector can lead to a symbiotic human-robot collaboration, raising several research challenges that still need to be addressed. The goals of this paper are: (i) providing an overview of affective computing implementations in HRI; (ii) analyzing the state of the art on this topic in different application contexts (e.g., healthcare, service applications, and manufacturing); (iii) highlighting research challenges for the manufacturing sector.
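    One common way to combine estimates from the multiple sensing modalities mentioned above is late fusion: each modality produces its own affect score, and the system merges them with a weighted average. The sketch below is purely illustrative; the modality names, scores, and weights are invented, and a real system would learn the weights or use a more sophisticated fusion model.

```python
def late_fusion(scores, weights):
    """Weighted average of per-modality affect scores in [0, 1]."""
    total = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total

# Hypothetical per-modality stress estimates and fusion weights.
modality_scores = {"face": 0.8, "voice": 0.6, "physio": 0.9}
modality_weights = {"face": 0.5, "voice": 0.2, "physio": 0.3}

stress = late_fusion(modality_scores, modality_weights)
print(f"fused stress estimate: {stress:.2f}")
```

    A collaborative robot could, for instance, slow down or increase its clearance from the operator when the fused estimate exceeds a threshold.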