8 research outputs found
I Can See Your Aim: Estimating User Attention From Gaze For Handheld Robot Collaboration
This paper explores the estimation of user attention in the setting of a
cooperative handheld robot: a robot designed to behave as a handheld tool but
one that possesses a degree of task knowledge. We use a tool-mounted gaze
tracking system which, after being modelled via a pilot study, serves as a
proxy for estimating the user's attention. This information is then used to
cooperate with users
in a task of selecting and engaging with objects on a dynamic screen. Via a
video game setup, we test various degrees of robot autonomy from fully
autonomous, where the robot knows what it has to do and acts, to no autonomy
where the user is in full control of the task. Our results measure performance
and subjective metrics and show how the attention model benefits the
interaction and preference of users.Comment: this is a corrected version of the one that was published at IROS
201
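The abstract does not detail how gaze is turned into an attention estimate; a minimal dwell-count sketch of using gaze samples as a proxy for the attended on-screen object might look like the following (the function name, pixel radius, and hit threshold are all hypothetical, not from the paper):

```python
import math

def attended_object(gaze_points, objects, radius=50.0, min_hits=3):
    """Estimate the attended object by counting how many gaze samples
    fall within `radius` pixels of each object's centre.

    gaze_points: list of (x, y) gaze coordinates in screen pixels.
    objects: dict mapping object name -> (x, y) centre position.
    Returns the best-supported object name, or None if no object
    accumulates at least `min_hits` nearby samples.
    """
    hits = {name: 0 for name in objects}
    for gx, gy in gaze_points:
        for name, (ox, oy) in objects.items():
            if math.hypot(gx - ox, gy - oy) <= radius:
                hits[name] += 1
    best = max(hits, key=hits.get)
    return best if hits[best] >= min_hits else None
```

Requiring a minimum number of nearby samples, rather than reacting to single fixations, is one simple way to keep such a proxy robust to tracker noise and brief saccades across a dynamic screen.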
Rebellion and Obedience: The Effects of Intention Prediction in Cooperative Handheld Robots
Within this work, we explore intention inference for user actions in the
context of a handheld robot setup. Handheld robots share the shape and
properties of handheld tools while being able to process task information and
aid manipulation. Here, we propose an intention prediction model to enhance
cooperative task solving. The model derives intention from the user's gaze
pattern which is captured using a robot-mounted remote eye tracker. The
proposed model runs in real time and predicts actions reliably up to 1.5 s
before they are executed. We assess the model in an assisted pick-and-place
task and show how the robot's obedience to, or rebellion against, the user's
intention affects cooperation with the robot.

Comment: submitted to IROS 2019. arXiv admin note: substantial text overlap
with arXiv:1810.0646
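The abstract does not specify the intention model's features or classifier; a toy stand-in that announces a predicted target once it dominates a sliding window of gaze fixations could be sketched as follows (class name, window size, and threshold are illustrative assumptions only):

```python
from collections import deque

class IntentionPredictor:
    """Toy gaze-based intention predictor (not the paper's model).

    Keeps the last `window` fixated-target labels; when one target
    accounts for at least `threshold` of the window, it is returned
    as the predicted intention, otherwise None.
    """

    def __init__(self, window=30, threshold=0.6):
        self.samples = deque(maxlen=window)  # recent fixation labels
        self.threshold = threshold

    def observe(self, fixated_target):
        """Record one fixation sample and return the current prediction."""
        self.samples.append(fixated_target)
        counts = {}
        for t in self.samples:
            counts[t] = counts.get(t, 0) + 1
        target, n = max(counts.items(), key=lambda kv: kv[1])
        if n / len(self.samples) >= self.threshold:
            return target
        return None
```

Because the prediction is available as soon as the fixation share crosses the threshold, such a scheme can fire before the hand movement starts, which is the kind of lead time (here, up to 1.5 s in the paper's results) that lets the robot either obey or deliberately override the inferred intention.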