Getting to know Pepper: Effects of people’s awareness of a robot’s capabilities on their trust in the robot
© 2018 Association for Computing Machinery. This work investigates how people’s awareness of a social robot’s capabilities relates to their trust in the robot to handle different tasks. We present a user study relating knowledge at different quality levels to participants’ ratings of trust. Secondary school pupils were asked to rate their trust in the robot after three types of exposure: a video demonstration, a live interaction, and a programming task. The study revealed that the pupils’ trust increased across different domains after each session, indicating that the more awareness users have of a robot, the more they trust it.
Automatic Replication of Teleoperator Head Movements and Facial Expressions on a Humanoid Robot
Robotic telepresence aims to create a physical presence for a remotely located human (teleoperator) by reproducing their verbal and nonverbal behaviours (e.g. speech, gestures, facial expressions) on a robotic platform. In this work, we propose a novel teleoperation system that replicates facial expressions of emotion (neutral, disgust, happiness, and surprise) and head movements on the fly on the humanoid robot Nao. A robot’s expression of emotion is constrained by its physical and behavioural capabilities. As the Nao robot has a static face, we use the LEDs located around its eyes to reproduce the teleoperator’s expressions of emotion. Using a web camera, we computationally detect the facial action units and measure the head pose of the operator. The emotion to be replicated is inferred from the detected action units by a neural network. Simultaneously, the measured head motion is smoothed and bounded to the robot’s physical limits by a constrained-state Kalman filter. To evaluate the proposed system, we conducted a user study in which 28 participants displayed facial expressions and head movements while being recorded by a web camera. Subsequently, 18 external observers viewed the recorded clips via an online survey and assessed the quality of the robot’s replication of the participants’ behaviours. Our results show that the proposed teleoperation system can successfully communicate emotions and head movements, with high agreement among the external observers (ICC_E = 0.91, ICC_HP = 0.72). This work was funded by the EPSRC under its IDEAS Factory Sandpits call on Digital Personhood (Grant Ref: EP/L00416X/1).
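The smoothing-and-bounding step described above can be sketched as a constant-velocity Kalman filter whose angle estimate is clipped to the robot’s joint range. This is a minimal illustration of the general technique, not the authors’ implementation: the noise covariances, frame rate, and the ~2.08 rad yaw limit for Nao’s head are all assumed values.

```python
import numpy as np

YAW_LIMIT = 2.08  # approx. Nao HeadYaw range in radians (assumed)
dt = 1.0 / 30.0   # camera frame rate (assumed 30 fps)

F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity dynamics
H = np.array([[1.0, 0.0]])             # we observe only the angle
Q = np.eye(2) * 1e-4                   # process noise (assumed)
R = np.array([[1e-2]])                 # measurement noise (assumed)

x = np.zeros((2, 1))  # state: [angle, angular velocity]
P = np.eye(2)

def step(measurement):
    """One predict/update cycle, then clamp the angle to joint limits."""
    global x, P
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the new yaw measurement
    y = np.array([[measurement]]) - H @ x_pred
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x = x_pred + K @ y
    P = (np.eye(2) - K @ H) @ P_pred
    # Constrain: clip the smoothed angle to the physical range
    x[0, 0] = np.clip(x[0, 0], -YAW_LIMIT, YAW_LIMIT)
    return x[0, 0]

# Noisy yaw measurements, the last of which exceed the joint limit
for z in [0.1, 0.5, 1.5, 2.5, 3.0]:
    smoothed = step(z)
    assert -YAW_LIMIT <= smoothed <= YAW_LIMIT
```

The filter damps frame-to-frame jitter from the pose estimator, while the clip guarantees the command sent to the robot never exceeds what the head joint can physically reach.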
Expressing Robot Personality through Talking Body Language
Social robots must master the nuances of human communication as a means of conveying an effective message and generating trust. Non-verbal cues are well known to be very important in human interaction, and a social robot should therefore produce body language coherent with its discourse. In this work, we report on a system that endows a humanoid robot with the ability to adapt its body language to the sentiment of its speech. A combination of talking beat gestures with emotional cues such as eye lighting, body posture, or voice intonation and volume permits a rich variety of behaviors. The developed approach is not purely reactive, and it readily allows a kind of personality to be assigned to the robot. We present several videos of the robot in two different scenarios, showing discreet and histrionic personalities. This work has been partially supported by the Basque Government (IT900-16 and Elkartek 2018/00114) and the Spanish Ministry of Economy and Competitiveness (RTI 2018-093337-B-100, MINECO/FEDER, EU).
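The kind of sentiment-to-cue mapping described above can be sketched as a simple function from a sentiment score to a bundle of expressive parameters. This is an illustrative assumption, not the authors’ system: the function name, parameter names, and numeric ranges are all invented for demonstration.

```python
def cues_for_sentiment(score: float) -> dict:
    """Map a speech sentiment score in [-1, 1] to expressive cues:
    eye colour, posture openness, voice pitch/volume, and gesture size."""
    score = max(-1.0, min(1.0, score))
    positive = (score + 1.0) / 2.0  # rescale to [0, 1]
    return {
        # red eye LEDs for negative sentiment, green for positive
        "eye_rgb": (int(255 * (1 - positive)), int(255 * positive), 0),
        # closed posture when negative, open when positive
        "posture_openness": 0.3 + 0.7 * positive,
        # lower, quieter voice when negative
        "pitch_shift": 0.9 + 0.2 * positive,
        "volume": 0.5 + 0.4 * positive,
        # stronger beat gestures for more extreme sentiment, either way
        "beat_gesture_amplitude": 0.4 + 0.6 * abs(score),
    }

print(cues_for_sentiment(0.8))   # warm, open, animated
print(cues_for_sentiment(-0.8))  # cold, closed, still animated
```

Because the gesture amplitude scales with the magnitude of the score rather than its sign, neutral speech yields restrained gestures while strongly positive or negative speech yields emphatic ones, which is one way a "discreet vs. histrionic" personality dial could be layered on top.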
Communicating Dominance in a Nonanthropomorphic Robot Using Locomotion
Dominance is a key aspect of interpersonal relationships. To what extent do nonverbal indicators related to dominance status translate to a nonanthropomorphic robot? An experiment (N = 25) addressed whether a mobile robot's motion style can influence people's perceptions of its status. Using concepts from improv theater literature, we developed two motion styles across three scenarios (robot makes lateral motions, approaches, and departs) to communicate a robot's dominance status through nonverbal expression. In agreement with the literature, participants described a motion style that was fast, in the foreground, and more animated as higher status than a motion style that was slow, in the periphery, and less animated. Participants used fewer negative emotion words to describe the robot with the purportedly high-status movements versus the purportedly low-status movements, but used more negative emotion words to describe the robot when it made departing motions in either style. This result provides evidence that guidelines from improvisational theater for using nonverbal expression to perform interpersonal status can be applied to influence perception of a nonanthropomorphic robot's status, suggesting that useful models for more complicated behaviors might similarly be derived from performance literature and theory.
Ambient Lights Influence Perception and Decision-Making
Today's computers are becoming ever more versatile. They are used in various applications, such as education, entertainment, and information services. In other words, computers are often required not only to present information to users but also to communicate with them socially. Previous studies explored the design of ambient light displays and suggested that such systems can convey information to people in the periphery of their attention without distracting them from their primary work. However, they mainly focused on using ambient lights to convey specific information; it remains unclear whether and how such lights can influence people's perception and decision-making. To explore this, we performed three experiments using a ping-pong game, the Ultimatum game, and the Give-Some game, in which we attached an LED strip to the front bottom of a computer monitor and had it display a set of light expressions. Our evaluation of the results suggested that expressive lights do affect human perception and decision-making. Participants liked and anthropomorphized the computer more when it displayed light animations. In particular, they perceived the computer as more positive and friendlier when it displayed green, low-intensity light animations, while red, high-intensity light animations were perceived as more negative and hostile. They consequently behaved with more tolerance and cooperation toward the computer when it was positive than when it was negative. The findings open up possibilities for the design of ambient light systems in various applications where human-machine interaction is needed.
Design of a Huggable Social Robot with Affective Expressions Using Projected Images
We introduce Pepita, a caricatured huggable robot capable of sensing and conveying affective expressions by means of tangible gesture recognition and projected avatars. This study covers the design criteria, implementation, and performance evaluation of the different characteristics of the form and function of this robot. The evaluation involves: (1) an exploratory study of the different features of the device, (2) design and performance evaluation of sensors for affective interaction employing touch, and (3) design and implementation of affective feedback using projected avatars. Results showed that the hug detection worked well for the intended application and that the affective expressions made with projected avatars were appropriate for this robot. The questionnaires analyzing users’ perception provide insights to guide future designs of similar interfaces.