
    How Do You Like Me in This: User Embodiment Preferences for Companion Agents

    We investigate the relationship between the embodiment of an artificial companion and user perception of and interaction with it. In a Wizard of Oz study, 42 users interacted with one of two embodiments, a physical robot or a virtual agent on a screen, through a role-play of secretarial tasks in an office, with the companion providing essential assistance. Findings showed that participants in both condition groups, when given the choice, would prefer to interact with the robot companion, mainly for its greater physical or social presence. Subjects also found the robot less annoying and talked to it more naturally. However, this preference for the robotic embodiment is not reflected in the users’ actual rating of the companion or their interaction with it. We reflect on this contradiction and conclude that in a task-based context a user focuses much more on a companion’s behaviour than on its embodiment. This underlines the feasibility of our efforts to create companions that migrate between embodiments while maintaining a consistent identity from the user’s point of view.

    A First Step toward the Automatic Understanding of Social Touch for Naturalistic Human–Robot Interaction

    Social robots should be able to automatically understand and respond to human touch. The meaning of touch depends not only on its form but also on the context in which it takes place. To gain more insight into the factors relevant to interpreting the meaning of touch within a social context, we elicited touch behaviors by letting participants interact with a robot pet companion in different affective scenarios. In a contextualized lab setting, participants (n = 31) acted as if they were coming home in different emotional states (i.e., stressed, depressed, relaxed, and excited) without being given specific instructions on the kinds of behaviors they should display. Based on video footage of the interactions and interviews, we explored the use of touch behaviors, the expressed social messages, and the expected robot pet responses. Results show that emotional state influenced the social messages that were communicated to the robot pet as well as the expected responses. Furthermore, it was found that multimodal cues were used to communicate with the robot pet; that is, participants often talked to the robot pet while touching it and making eye contact. Additionally, the findings of this study indicate that the categorization of touch behaviors into discrete touch gesture categories based on dictionary definitions is not a suitable approach to capture the complex nature of touch behaviors in less controlled settings. These findings can inform the design of a behavioral model for robot pet companions, and future directions for interpreting touch behaviors in less controlled settings are discussed.

    Socially intelligent robots that understand and respond to human touch

    Touch is an important nonverbal form of interpersonal interaction which is used to communicate emotions and other social messages. As interactions with social robots are likely to become more common in the near future, these robots should also be able to engage in tactile interaction with humans. Therefore, the aim of the research presented in this dissertation is to work towards socially intelligent robots that can understand and respond to human touch. To become a socially intelligent actor, a robot must be able to sense, classify and interpret human touch and respond to it in an appropriate manner. To this end we present work that addresses different parts of this interaction cycle. The contributions of this dissertation are the following. We have made a touch gesture dataset available to the research community and have presented benchmark results. Furthermore, we have sparked interest in the new field of social touch recognition by organizing a machine learning challenge and have pinpointed directions for further research. Also, we have exposed potential difficulties for the recognition of social touch in more naturalistic settings. Moreover, the findings presented in this dissertation can help to inform the design of a behavioral model for robot pet companions that can understand and respond to human touch. Additionally, we have focused on the requirements for tactile interaction with robot pets for health care applications.

    A systematic comparison of affective robot expression modalities


    Comics, robots, fashion and programming: outlining the concept of actDresses

    This paper concerns the design of physical languages for controlling and programming robotic consumer products. For this purpose we explore basic theories of semiotics represented in the two separate fields of comics and fashion, and how these could be used as resources in the development of new physical languages. Based on these theories, the design concept of actDresses is defined and supplemented by three example scenarios of how the concept can be used for controlling, programming, and predicting the behaviour of robotic systems.

    DESIGN AND EVALUATION OF A NONVERBAL COMMUNICATION PLATFORM BETWEEN ASSISTIVE ROBOTS AND THEIR USERS

    Assistive robotics will become integral to the everyday lives of a human population that is increasingly mobile, older, urban-centric and networked. The overwhelming demands on healthcare delivery alone will compel the adoption of assistive robotics. How will we communicate with such robots, and how will they communicate with us? This research makes the case for a relatively 'artificial' mode of nonverbal human-robot communication that is non-disruptive, non-competitive, and non-invasive, one we envision will be willingly invited into our private and working lives over time. This research proposes a nonverbal communication (NVC) platform conveyed by familiar lights and sounds, and elaborated here are experiments with our NVC platform in a rehabilitation hospital. This NVC is embedded into the Assistive Robotic Table (ART), developed within our lab, that supports the well-being of an expanding population of older adults and those with limited mobility. The broader aim of this research is to afford people robot assistants that exist and interact with them in the recesses, rather than in the foreground, of their intimate and social lives. With support from our larger research team, I designed and evaluated several alternative modes of nonverbal robot communication with the objective of establishing a nonverbal, human-robot communication loop that evolves with users and can be modified by users. The study was conducted with 10-13 clinicians -- doctors and occupational, physical, and speech therapists -- at a local rehabilitation hospital through three iterative design and evaluation phases and a final usability study session. For our test case at a rehabilitation hospital, medical staff iteratively refined our NVC platform, stated a willingness to use our platform, and declared NVC a desirable research path. In addition, these clinicians provided the requirements for human-robot interaction (HRI) in clinical settings, suggesting great promise for our mode of human-robot communication for this and other applications and environments involving intimate HRI.

    Social touch gesture recognition using random forest and boosting on distinct feature sets

    Touch is a primary nonverbal communication channel used to communicate emotions and other social messages. Despite its importance, this channel remains largely unexplored in the affective computing field, as much more focus has been placed on visual and aural channels. In this paper, we investigate the possibility of automatically discriminating between different social touch types. We propose five distinct feature sets for describing touch behaviours captured by a grid of pressure sensors. These features are then combined using the Random Forest and Boosting methods to categorize the touch gesture type. The proposed methods were evaluated on both the HAART (7 gesture types over different surfaces) and the CoST (14 gesture types over the same surface) datasets made available by the Social Touch Gesture Challenge 2015. Performances well above chance level were achieved: 67% accuracy on the HAART and 59% on the CoST testing datasets.
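    The pipeline described above, extracting summary features from pressure-sensor frame sequences and feeding them to a Random Forest, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature names, the 8x8 grid size, the contact threshold, and the two synthetic gesture classes ("pat" vs. "stroke") are all assumptions made for the example, and the real feature sets and datasets (HAART, CoST) are far richer.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def extract_features(frames):
        """Summarize a touch gesture recorded as (T, 8, 8) pressure frames
        into a small feature vector (hypothetical feature choices)."""
        mean_pressure = frames.mean()
        max_pressure = frames.max()
        contact_area = (frames > 0.1).mean()          # fraction of active taxels
        temporal_var = frames.mean(axis=(1, 2)).var()  # variation of total pressure over time
        duration = float(frames.shape[0])              # number of frames
        return np.array([mean_pressure, max_pressure,
                         contact_area, temporal_var, duration])

    rng = np.random.default_rng(0)
    # Synthetic stand-in data: "pat" = short, firm contact; "stroke" = long, light contact.
    X, y = [], []
    for _ in range(40):
        pat = rng.uniform(0.5, 1.0, size=(10, 8, 8))
        stroke = rng.uniform(0.0, 0.3, size=(60, 8, 8))
        X.extend([extract_features(pat), extract_features(stroke)])
        y.extend(["pat", "stroke"])

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, y)
    ```

    A Boosting classifier (e.g. scikit-learn's GradientBoostingClassifier) could be swapped in for the Random Forest with the same feature vectors; the abstract's key point is that gesture discrimination rests on the descriptive power of the extracted feature sets.
    
    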
