
    Studying People's Emotional Responses to Robot's Movements

    As interaction between robots and humans deepens, the emotional rapport between the two is becoming ever more important. The interpretation of emotion expressed by robots has been widely studied with humanoids and animal-like robots, which try to mimic the biological beings people are used to interacting with. Considering the uncanny valley issue and the practical and theoretical questions related to implementing bio-inspired robots, it may be asked whether object-like robots can also express emotions, so that people can satisfactorily interact with robots that have functional, not necessarily bio-inspired, shapes. This paper presents a set of case studies aimed at identifying body features that allow emotion projection from an object-like robot body. The study was done in two phases: a pilot experiment and a formal trial. The results show that it is possible to project different emotions by exploiting the angular and linear velocity of the robot.

    Robots showing emotions: Emotion representation with no bio-inspired body

    Robots should be able to represent emotional states to interact with people as social agents. There are cases where robots cannot have bio-inspired bodies, for instance because the task to be performed requires a special shape, as with home cleaners, package carriers, and many others. In these cases, emotional states have to be represented by exploiting movements of the body. In this paper, we present a set of case studies aimed at identifying specific values to convey emotion through changes in linear and angular velocity, which might be applied to different non-anthropomorphic bodies. This work originates from some of the most widely considered theories of emotion expression and from emotion coding for people. We show that people can recognize some emotional expressions better than others, and we propose some directions for expressing emotions by exploiting only bio-neutral movement.
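    The second abstract describes conveying emotion purely through changes in a robot's linear and angular velocity. The idea can be sketched as a lookup from an emotion label to a motion profile; note that the velocity values below are illustrative assumptions for the sketch, not the specific values identified in the studies, and `MotionProfile` and `profile_for` are hypothetical names.

    ```python
    # Hypothetical sketch: mapping emotion labels to linear/angular velocity
    # profiles for a non-anthropomorphic robot. All numeric values are
    # illustrative assumptions, not results from the papers.
    from dataclasses import dataclass

    @dataclass
    class MotionProfile:
        linear_velocity: float   # m/s
        angular_velocity: float  # rad/s

    # Rough intuition: high-arousal emotions get faster, more turbulent
    # motion; low-arousal emotions get slow, smooth motion.
    EMOTION_PROFILES = {
        "happy": MotionProfile(linear_velocity=0.6, angular_velocity=1.2),
        "angry": MotionProfile(linear_velocity=0.8, angular_velocity=2.0),
        "sad":   MotionProfile(linear_velocity=0.1, angular_velocity=0.2),
        "calm":  MotionProfile(linear_velocity=0.3, angular_velocity=0.3),
    }

    def profile_for(emotion: str) -> MotionProfile:
        """Return the motion profile for an emotion, with a neutral default."""
        return EMOTION_PROFILES.get(emotion, MotionProfile(0.3, 0.5))
    ```

    In a real controller these profiles would feed the robot's velocity commands (e.g. a ROS `geometry_msgs/Twist` message), modulated over time rather than held constant.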