This paper addresses the problem of simultaneously exploring an unknown
object to model its shape, using tactile sensors on robotic fingers, while also
improving finger placement to optimise grasp stability. In many situations, a
robot will have only a partial camera view of the near side of an observed
object, whose far side remains occluded. We show how a first grasp attempt,
based on an initial guess of the overall object shape, yields tactile glances
of the far side of the object, which enable the shape estimate, and consequently
the successive grasps, to be improved. We propose a grasp
exploration approach using a probabilistic representation of shape, based on
Gaussian Process Implicit Surfaces. This representation enables the initial
partial vision data to be augmented with additional data from successive tactile
glances, and is combined with a probabilistic estimate of grasp quality to refine
grasp configurations.
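For illustration only, and not the authors' implementation, a GPIS of this kind can be fitted with an off-the-shelf Gaussian process regressor: fused camera and tactile contact points are given a target value of zero, with assumed interior and exterior points constraining the sign of the implicit function, so that the zero level set of the posterior mean approximates the surface and the posterior variance flags poorly observed regions such as the occluded far side. All point sets and parameters below are hypothetical.

```python
# Minimal sketch (assumed, not the authors' code): a GP implicit surface fitted
# to fused camera and tactile points with scikit-learn.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def fit_gpis(surface_pts, interior_pts, exterior_pts, length_scale=0.05):
    """Fit f(x) with f = 0 on the surface, f < 0 inside, f > 0 outside."""
    X = np.vstack([surface_pts, interior_pts, exterior_pts])
    y = np.concatenate([
        np.zeros(len(surface_pts)),    # depth / contact points lie on the surface
        -np.ones(len(interior_pts)),   # e.g. an assumed interior point (centroid)
        np.ones(len(exterior_pts)),    # free-space points around the object
    ])
    gp = GaussianProcessRegressor(kernel=RBF(length_scale), alpha=1e-4)
    return gp.fit(X, y)

# Partial view from the camera, later augmented with tactile glances from a grasp.
camera_pts = np.random.rand(200, 3) * 0.1    # hypothetical near-side points
tactile_pts = np.random.rand(4, 3) * 0.1     # hypothetical far-side contacts
gpis = fit_gpis(np.vstack([camera_pts, tactile_pts]),
                interior_pts=np.array([[0.05, 0.05, 0.05]]),
                exterior_pts=np.random.rand(20, 3) * 0.4 + 0.15)
mean, std = gpis.predict(np.random.rand(5, 3) * 0.1, return_std=True)
# mean ~ signed field (zero level set = shape estimate); std = shape uncertainty.
```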
When choosing the next set of finger placements, a bi-objective optimisation
method is used to jointly maximise grasp quality and improve the shape
representation across successive grasp attempts.
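Likewise for illustration only, the next-grasp choice could trade off the two objectives roughly as follows; the candidate contacts, the grasp-quality metric grasp_quality_fn, and the weighted scalarisation are assumptions rather than the paper's exact bi-objective formulation.

```python
# Illustrative bi-objective selection: rank candidate finger placements by a
# grasp-quality score and by the shape information they would gain, here
# approximated by the GPIS posterior std at the candidate contact points.
import numpy as np

def objectives(gpis, candidate_contacts, grasp_quality_fn):
    """candidate_contacts: list of (n_fingers, 3) arrays, one per candidate grasp."""
    quality, info_gain = [], []
    for contacts in candidate_contacts:
        _, std = gpis.predict(contacts, return_std=True)
        quality.append(grasp_quality_fn(contacts))  # assumed probabilistic quality metric
        info_gain.append(std.sum())                 # shape uncertainty the grasp would probe
    return np.asarray(quality), np.asarray(info_gain)

def select_next_grasp(quality, info_gain, weight=0.5):
    # Simple weighted scalarisation of the two normalised objectives; a Pareto-set
    # selection over the same two objectives would be an alternative.
    q = (quality - quality.min()) / (np.ptp(quality) + 1e-9)
    g = (info_gain - info_gain.min()) / (np.ptp(info_gain) + 1e-9)
    return int(np.argmax(weight * q + (1.0 - weight) * g))
```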
Experimental results show that the proposed approach yields stable grasp
configurations more efficiently than a baseline method, while also producing an
improved shape estimate of the grasped object.

Comment: IEEE Robotics and Automation Letters. Preprint Version. Accepted
February, 202