Learning dexterous manipulation skills is a long-standing challenge in
computer graphics and robotics, especially when the task involves complex and
delicate interactions between the hands, tools and objects. In this paper, we
focus on chopsticks-based object relocation tasks, which are common yet
demanding. The key to successful chopsticks skills is a steady grip on the
sticks that still permits delicate maneuvers. We automatically discover
physically valid chopsticks holding poses via Bayesian Optimization (BO) and
Deep Reinforcement Learning (DRL); this approach handles multiple gripping
styles and hand morphologies without requiring any example data. Given as input the
discovered gripping poses and desired objects to be moved, we build
physics-based hand controllers to accomplish relocation tasks in two stages.
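The pose-discovery stage couples a sample-efficient search over holding poses with a physics-based stability evaluation. As a rough illustration only, here is a minimal Bayesian-optimization loop with a Gaussian-process surrogate and an expected-improvement acquisition; the 2D pose parameterization, the analytic stability score, and all hyperparameters below are placeholder assumptions, not the paper's implementation (the paper evaluates grip stability with DRL in simulation):

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def stability_score(x):
    # Placeholder stand-in for the physics-based grip-stability evaluation;
    # in the actual system a DRL controller would try to hold the pose.
    return -((x[0] - 0.6) ** 2 + (x[1] - 0.4) ** 2)

def rbf(A, B, length=0.2):
    # Squared-exponential kernel between two point sets.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length ** 2)

def expected_improvement(Xc, X, y):
    # GP posterior mean/variance at candidates Xc, then the EI acquisition.
    K = rbf(X, X) + 1e-6 * np.eye(len(X))
    Ks = rbf(Xc, X)
    Kinv = np.linalg.inv(K)
    mu = Ks @ Kinv @ y
    var = 1.0 - np.einsum("ij,jk,ik->i", Ks, Kinv, Ks)
    sd = np.sqrt(np.clip(var, 1e-12, None))
    z = (mu - y.max()) / sd
    cdf = 0.5 * (1.0 + np.array([math.erf(v / math.sqrt(2)) for v in z]))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return sd * (z * cdf + pdf)

# A hypothetical 2D pose space (e.g., grip height and finger spread),
# normalized to the unit square: random initial samples, then BO steps.
X = rng.uniform(0, 1, size=(5, 2))
y = np.array([stability_score(x) for x in X])
for _ in range(20):
    cand = rng.uniform(0, 1, size=(256, 2))
    x_next = cand[np.argmax(expected_improvement(cand, X, y))]
    X = np.vstack([X, x_next])
    y = np.append(y, stability_score(x_next))

best = X[np.argmax(y)]  # best-scoring holding pose found so far
```

In the full system the score would come from rolling out a simulated grip rather than a closed-form function, and the discovered pose would then be refined and held by the DRL controller.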
First, kinematic trajectories are synthesized for the chopsticks and hand in a
motion planning stage. The key components of our motion planner include a
grasping model to select suitable chopsticks configurations for grasping the
object, and a trajectory optimization module to generate collision-free
chopsticks trajectories. We then train physics-based hand controllers, again
through DRL, to track the kinematic trajectories produced by the motion
planner. We demonstrate the capabilities of our framework by relocating objects
of various shapes and sizes, in diverse gripping styles and holding positions
for multiple hand morphologies. Our system achieves faster learning and more
robust control than vanilla systems that attempt to learn chopsticks-based
skills without a gripping-pose optimization module and/or without a kinematic
motion planner.
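The trajectory-optimization module of the motion planner can be illustrated in miniature: deform a straight-line path so that it stays smooth while clearing an obstacle. Everything below (the 2D setting, a single circular obstacle, the penalty weight, and finite-difference gradient descent) is an illustrative assumption, not the paper's planner:

```python
import numpy as np

# Toy stand-in for a chopsticks-tip trajectory: connect start to goal in 2D
# while keeping clearance from one circular obstacle (all values assumed).
start, goal = np.array([0.0, 0.0]), np.array([1.0, 0.0])
obstacle, radius = np.array([0.5, 0.05]), 0.15

def cost(pts):
    path = np.vstack([start, pts.reshape(-1, 2), goal])
    smooth = np.sum(np.diff(path, axis=0) ** 2)         # short, even segments
    d = np.linalg.norm(path - obstacle, axis=1)
    collide = np.sum(np.maximum(radius - d, 0.0) ** 2)  # penetration penalty
    return smooth + 200.0 * collide

# Initialize interior waypoints on the straight line, then descend the cost
# with a crude finite-difference gradient.
n = 10
pts = np.linspace(start, goal, n + 2)[1:-1].ravel()
c0 = cost(pts)
eps, lr = 1e-5, 0.002
for _ in range(800):
    base = cost(pts)
    g = np.zeros_like(pts)
    for i in range(len(pts)):
        p = pts.copy()
        p[i] += eps
        g[i] = (cost(p) - base) / eps
    pts -= lr * g

path = np.vstack([start, pts.reshape(-1, 2), goal])
clearance = np.linalg.norm(path - obstacle, axis=1).min()
```

The optimized waypoints bend around the obstacle while the endpoints stay fixed; the paper's planner solves the analogous problem for full chopsticks configurations and hands the resulting trajectories to the DRL tracking controllers.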