Our brains are able to exploit coarse physical models of fluids to solve
everyday manipulation tasks. There has been considerable interest in developing
such a capability in robots so that they can autonomously manipulate fluids
while adapting to different conditions. In this paper, we investigate the problem of
adaptation to liquids with different characteristics. We develop a simple
calibration task (stirring with a stick) that enables rapid inference of the
parameters of the liquid from RGB data. We perform the inference in the space
of simulation parameters rather than on physically accurate parameters. This
facilitates prediction and optimization tasks since the inferred parameters may
be fed directly to the simulator. We demonstrate that our "stirring" learner
performs better than a learner calibrated with pouring actions. We show
that our method is able to infer properties of three different liquids --
water, glycerin and gel -- and present experimental results by executing
stirring and pouring actions on a UR10. We believe that decoupling the
training actions from the goal task is an important step towards simple,
autonomous learning of the behavior of different fluids in unstructured
environments.

Comment: Presented at the Modeling the Physical World: Perception, Learning, and Control Workshop (NeurIPS) 201