
    A motion system for social and animated robots

    This paper presents an innovative motion system that is used to control the motions and animations of a social robot. The social robot Probo is used to study Human-Robot Interaction (HRI), with a special focus on Robot-Assisted Therapy (RAT). When used for therapy, it is important that a social robot is able to create an "illusion of life" so as to become a believable character that can communicate with humans. The design of the motion system in this paper is based on insights from the animation industry. It combines operator-controlled animations with low-level autonomous reactions such as attention and emotional state. The motion system has a Combination Engine, which combines motion commands that are triggered by a human operator with motions that originate from different units of the cognitive control architecture of the robot. This results in an interactive robot that seems alive and has a certain degree of "likeability". The Godspeed Questionnaire Series is used to evaluate the animacy and likeability of the robot in China, Romania and Belgium.
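    The abstract names a Combination Engine without giving implementation details. As a rough illustration only, the Python sketch below shows one way such an engine might blend operator-triggered motion commands with low-level autonomous reactions (attention, emotional state) into a single set of joint targets; the class, joint names, weights and blending rule are assumptions, not the actual Probo implementation.

        # Hypothetical sketch of a "Combination Engine" that merges operator-triggered
        # animation targets with autonomous reactions such as attention and emotion.
        # All joint names, weights, and the blending rule are illustrative assumptions.

        from dataclasses import dataclass
        from typing import Dict, List

        @dataclass
        class MotionCommand:
            source: str                  # e.g. "operator", "attention", "emotion"
            targets: Dict[str, float]    # joint name -> target angle (radians)
            weight: float                # relative influence of this command

        def combine(commands: List[MotionCommand]) -> Dict[str, float]:
            """Blend all active motion commands into one joint-target set
            using a per-joint weighted average."""
            accum: Dict[str, float] = {}
            norm: Dict[str, float] = {}
            for cmd in commands:
                for joint, angle in cmd.targets.items():
                    accum[joint] = accum.get(joint, 0.0) + cmd.weight * angle
                    norm[joint] = norm.get(joint, 0.0) + cmd.weight
            return {joint: accum[joint] / norm[joint] for joint in accum}

        # Example: an operator-triggered nod blended with an autonomous gaze shift.
        operator_nod = MotionCommand("operator", {"neck_pitch": 0.4}, weight=1.0)
        attention = MotionCommand("attention", {"neck_yaw": -0.2, "neck_pitch": 0.1}, weight=0.5)
        print(combine([operator_nod, attention]))

    A weighted average is only one possible combination policy; priority-based overriding or layered animation blending would fit the same interface.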

    Design and Evaluation of the LOPES Exoskeleton Robot for Interactive Gait Rehabilitation

    This paper introduces a newly developed gait rehabilitation device. The device, called LOPES, combines a freely translatable and 2-D-actuated pelvis segment with a leg exoskeleton containing three actuated rotational joints: two at the hip and one at the knee. The joints are impedance controlled to allow bidirectional mechanical interaction between the robot and the training subject. Evaluation measurements show that the device allows both a "patient-in-charge" and a "robot-in-charge" mode, in which the robot is controlled either to follow or to guide a patient, respectively. Electromyography (EMG) measurements (one subject) on eight important leg muscles show that free walking in the device strongly resembles free treadmill walking, an indication that the device can offer task-specific gait training. The possibilities and limitations of using the device as a gait measurement tool are also shown; at the moment, position measurements are not accurate enough for inverse-dynamical gait analysis.
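    Impedance control of the joints is only named in the abstract. As a minimal sketch, a joint-level impedance law of the form tau = K*(q_des - q) + B*(qd_des - qd) is shown below, with stiffness lowered for "patient-in-charge" and raised for "robot-in-charge" operation; the gain values and mode presets are assumptions, not the gains used in LOPES.

        # Minimal joint-level impedance controller sketch:
        #   tau = K * (q_des - q) + B * (qd_des - qd)
        # Gain values and the two mode presets are illustrative assumptions.

        def impedance_torque(q, qd, q_des, qd_des, K, B):
            """Return joint torque for a spring-damper (impedance) law."""
            return K * (q_des - q) + B * (qd_des - qd)

        # "Patient-in-charge": low stiffness, the robot mostly follows the subject.
        # "Robot-in-charge": high stiffness, the robot guides the subject along q_des.
        MODES = {
            "patient_in_charge": {"K": 5.0,  "B": 0.5},   # Nm/rad, Nms/rad
            "robot_in_charge":   {"K": 80.0, "B": 4.0},
        }

        gains = MODES["robot_in_charge"]
        tau = impedance_torque(q=0.10, qd=0.0, q_des=0.25, qd_des=0.5, **gains)
        print(f"commanded knee torque: {tau:.2f} Nm")

    The same law supports bidirectional interaction: with low gains the subject can push the joint away from its reference with little resistance, while high gains make the robot enforce the reference trajectory.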

    In good company? Perception of movement synchrony of a non-anthropomorphic robot

    Copyright: © 2015 Lehmann et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Recent technological developments like cheap sensors and the decreasing costs of computational power have brought the possibility of robotic home companions within reach. In order to be accepted, it is vital for these robots to be able to participate meaningfully in social interactions with their users and to make them feel comfortable during these interactions. In this study we investigated how people respond to a situation where a companion robot is watching its user. Specifically, we tested the effect of robotic behaviours that are synchronised with the actions of a human. We evaluated the effects of these behaviours on the robot's likeability and perceived intelligence using an online video survey. The robot used was Care-O-bot®3, a non-anthropomorphic robot with a limited range of expressive motions. We found that even minimal, positively synchronised movements during an object-oriented task were interpreted by participants as engagement and created a positive disposition towards the robot. However, even negatively synchronised movements of the robot led to more positive perceptions of the robot, as compared to a robot that does not move at all. The results emphasise a) the powerful role that robot movements in general can have on participants' perception of the robot, and b) that synchronisation of body movements can be a powerful means to enhance the positive attitude towards a non-anthropomorphic robot. Peer reviewed.

    Human-robot communication through life-sustaining physiological phenomena and the design of a body-emotion model

    In this dissertation, we focus on the physiological phenomena of robots as an expressive modality of their inner states and discuss the effectiveness of a robot expressing physiological phenomena, which are indispensable for living. We designed a body-emotion model showing the relationship between a) emotion as the inner state of the robot and b) physiological phenomena as physical changes, and we discuss communication between humans and robots through involuntary physiological expression based on the model. In recent years, various robots for use in mental health care and communication support in medical/nursing care have been developed. The purpose of these systems is to enable communication between a robot and patients through an active approach by the robot using sound and body movement. In contrast to conventional approaches, our research is based on involuntary emotional expression through the physiological phenomena of the robot. Physiological phenomena, including breathing, heartbeat, and body temperature, are essential functions for life activities and are closely related to the inner state of humans, because physiological phenomena are caused by the emotional reaction of the limbic system transmitted via the autonomic nervous system. In human-robot communication through physical contact, we consider physiological phenomena to be one of the most important nonverbal modalities of the inner state, as involuntary expressions. First, focusing on the robot's expression of physiological phenomena, we proposed the body-emotion model (BEM), which concerns the relationship between the inner state of robots and their involuntary physical reactions. We then proposed a stuffed-toy robot system, BREAR, which has a mechanical structure to express breathing, heartbeat, temperature and bodily movement. The experimental results showed that heartbeat, breathing and body temperature can express the robot's living state and that breathing speed is highly related to the robot's arousal. We reviewed the experimental results and emotion-generation mechanisms and discussed the design of the robot based on the BEM. Based on our verification results, we determined that a BEM design that proceeds from perception of the external situation, through matching with memory and a change of the autonomic nervous parameter, to the representation of physiological phenomena, grounded in the relationship between the autonomic nervous system and emotional arousal, is effective. Second, we discussed indirect communication between humans and robots through physiological phenomena (breathing, heartbeat and body temperature) that express the robot's emotions. We set up a situation in which the robot and the user jointly attend to emotional content and evaluated whether the user's emotional response to the content and the user's impression of the relationship with the robot were changed by the robot's physiological expressions. The results suggest that the robot's physiological expression makes the user's own emotions during the experience more excited or suppressed and that the robot's expression increases impressions of closeness and sensitivity. Last, we discussed the future perspective of human-robot communication through physiological phenomena.
    Regarding the representation of the robot's sense of life, it is thought that the user's recognition that the robot is alive improves not only the moral effect on the understanding of the finiteness of life but also attachment to the robot in long-term communication. Regarding the emotional expression mechanism based on life, it is expected that the robot can display a complicated internal state close to that of humans by combining intentionally expressed emotions with involuntary emotional expressions. If a robot can combine realistic voluntary expressions, such as facial expressions and body movements, with real involuntary expressions, conveying both its real intentions and deception, it can be said that the robot has a more complicated internal state than that of a pet. By using a robot that expresses a living state through physiological phenomena, the effect on mental care can be expected to exceed that of animal therapy, and we expect to provide care and welfare support in place of human beings.
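    The abstract outlines the BEM pipeline (perception, matching with memory, change of the autonomic nervous parameter, physiological representation) without implementation detail. As a toy rendering of the final expression stage only, the sketch below maps a single autonomic arousal value to breathing rate, heart rate and body temperature set-points; the linear mappings and numeric ranges are assumptions, not values from the dissertation.

        # Toy sketch of the expression stage of a body-emotion model (BEM):
        # an autonomic "arousal" parameter in [0, 1] is mapped to breathing rate,
        # heart rate, and body temperature set-points for a stuffed-toy robot.
        # The linear mappings and numeric ranges are illustrative assumptions.

        def physiological_setpoints(arousal: float) -> dict:
            """Map an arousal level (0 = calm, 1 = highly aroused) to set-points."""
            arousal = max(0.0, min(1.0, arousal))
            return {
                "breaths_per_min": 10 + 20 * arousal,    # calm ~10, excited ~30
                "heart_bpm":       60 + 60 * arousal,    # calm ~60, excited ~120
                "body_temp_c":     36.0 + 1.5 * arousal, # mild warming with arousal
            }

        # Example: a startling stimulus raises arousal, and the robot's involuntary
        # expressions (faster breathing and heartbeat, warmer body) follow.
        print(physiological_setpoints(0.2))   # relaxed state
        print(physiological_setpoints(0.9))   # aroused state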

    Different discussions on roboethics and information ethics based on different contexts (Ba). Discussions on robots, informatics and life in the information era in Japanese bulletin board forums and mass media

    In this paper, I will analyze what sort of invisible reasons lie behind the differences in discussions on roboethics and IE (Information Ethics) in Japan and "Western" cultures, focusing on (1) the recent trends of research in roboethics in "Western" cultures, and (2) the tendencies in the portrayal of robots, ICTs, informatics, and life in the information era reflected in newspaper reports and talks on BBSs in Japan. As we will see in this paper, Japanese people have difficulty in understanding some of the key concepts used in the fields of roboethics and IE (Information Ethics), such as "autonomy" or "responsibility (of robots)". This difficulty appears to derive from different types of discussions based on different cultural contexts (Ba), in which the majority of people in each culture are provided with a certain sort of shared/normalized frames of narratives. In my view, and according to some Japanese critics and authors, the sense of "reality" of Japanese people is strongly related to "emotional sensitivity to things/persons/events in life" or "direct, non-mediated, intuitive awareness/knowing" (Izutsu, 2001). These tendencies in Japanese minds seem to influence their limited interest in "abstract" discussions as well as in straightforward emotional expressions with regard to robots and ICTs.

    Design of a Huggable Social Robot with Affective Expressions Using Projected Images

    We introduce Pepita, a caricatured huggable robot capable of sensing and conveying affective expressions by means of tangible gesture recognition and projected avatars. This study covers the design criteria, implementation and performance evaluation of the different characteristics of the form and function of this robot. The evaluation involves: (1) an exploratory study of the different features of the device, (2) design and performance evaluation of sensors for affective interaction employing touch, and (3) design and implementation of affective feedback using projected avatars. Results showed that the hug detection worked well for the intended application and that the affective expressions made with projected avatars were appropriate for this robot. The questionnaires analyzing users' perception provide us with insights to guide future designs of similar interfaces.
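    The abstract reports that hug detection "worked well" but does not describe the algorithm. Purely as an illustration of the kind of scheme a touch-sensing huggable robot might use, the sketch below classifies a hug when enough pressure sensors report simultaneous contact; the sensor count, threshold and helper names are hypothetical, not Pepita's actual design.

        # Simple hug-detection sketch for a huggable robot with several pressure
        # sensors distributed over its body. Sensor count, pressure threshold and
        # the decision rule are illustrative assumptions.

        from typing import List

        def is_hug(pressure_readings: List[float],
                   min_active_sensors: int = 3,
                   pressure_threshold: float = 0.4) -> bool:
            """Classify a hug when enough sensors report pressure at the same time."""
            active = sum(1 for p in pressure_readings if p >= pressure_threshold)
            return active >= min_active_sensors

        # Example frames from a hypothetical 6-sensor torso array, values in [0, 1].
        print(is_hug([0.1, 0.0, 0.2, 0.1, 0.0, 0.0]))   # light touch  -> False
        print(is_hug([0.6, 0.7, 0.5, 0.1, 0.8, 0.2]))   # enveloping contact -> True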