Data-Driven Grasp Synthesis - A Survey
We review the work on data-driven grasp synthesis and the methodologies for
sampling and ranking candidate grasps. We divide the approaches into three
groups based on whether they synthesize grasps for known, familiar, or unknown
objects. This structure allows us to identify common object representations and
perceptual processes that facilitate the employed data-driven grasp synthesis
technique. In the case of known objects, we concentrate on the approaches that
are based on object recognition and pose estimation. In the case of familiar
objects, the techniques use some form of similarity matching to a set of
previously encountered objects. Finally, for the approaches dealing with unknown
objects, the core part is the extraction of specific features that are
indicative of good grasps. Our survey provides an overview of the different
methodologies and discusses open problems in the area of robot grasping. We
also draw a parallel to the classical approaches that rely on analytic
formulations.

Comment: 20 pages, 30 figures, submitted to IEEE Transactions on Robotics
Learning Continuous Grasping Function with a Dexterous Hand from Human Demonstrations
We propose to learn to generate grasping motion for manipulation with a
dexterous hand using implicit functions. With continuous time inputs, the model
can generate a continuous and smooth grasping plan. We name the proposed model
Continuous Grasping Function (CGF). CGF is learned via generative modeling with
a Conditional Variational Autoencoder using 3D human demonstrations. We first
convert the large-scale human-object interaction trajectories to robot
demonstrations via motion retargeting, and then use these demonstrations to
train CGF. During inference, we perform sampling with CGF to generate different
grasping plans in the simulator and select the successful ones to transfer to
the real robot. By training on diverse human data, our CGF allows
generalization to manipulate multiple objects. Compared to previous planning
algorithms, CGF is more efficient and achieves significant improvement on
success rate when transferred to grasping with the real Allegro Hand. Our
project page is at https://jianglongye.com/cgf.
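The CGF idea above — a generative decoder conditioned on a latent code, an object encoding, and a continuous time input, queried repeatedly to produce smooth candidate trajectories — can be sketched minimally as follows. This is an illustrative assumption, not the authors' implementation: the shapes, the random stand-in weights, and the names `decode` and `sample_plan` are all hypothetical, and a real system would use a trained Conditional VAE decoder rather than random matrices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: latent code, object condition, hidden layer,
# and a 22-DoF hand pose output (all assumed, not from the paper).
LATENT_DIM, COND_DIM, HIDDEN, POSE_DIM = 8, 16, 32, 22

# Random weights stand in for a trained CVAE decoder.
W1 = rng.normal(0, 0.1, (LATENT_DIM + COND_DIM + 1, HIDDEN))
W2 = rng.normal(0, 0.1, (HIDDEN, POSE_DIM))

def decode(z, cond, t):
    """Map (latent, object condition, continuous time t) -> hand pose at t."""
    x = np.concatenate([z, cond, [t]])
    return np.tanh(x @ W1) @ W2

def sample_plan(cond, n_steps=50):
    """Sample one latent code, then query the decoder at continuous times
    in [0, 1] to obtain one smooth trajectory of hand poses."""
    z = rng.normal(size=LATENT_DIM)          # one latent per grasping plan
    ts = np.linspace(0.0, 1.0, n_steps)
    return np.stack([decode(z, cond, t) for t in ts])

obj_feature = rng.normal(size=COND_DIM)               # stand-in object encoding
plans = [sample_plan(obj_feature) for _ in range(4)]  # diverse candidate plans
print(len(plans), plans[0].shape)
```

Because time enters the decoder as a continuous scalar, the trajectory can be evaluated at any resolution; sampling several latents yields the diverse candidate plans that, in the paper's pipeline, would be filtered in simulation before transfer to the real hand.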