Dexterous Manipulation Graphs
We propose the Dexterous Manipulation Graph as a tool to address in-hand
manipulation and reposition an object inside a robot's end-effector. This graph
is used to plan a sequence of manipulation primitives so as to bring the object
to the desired end pose. The sequence of primitives is then translated into
robot motions that move the object held by the end-effector. We use a dual-arm
robot with parallel grippers to test our method on a real system and show
successful planning and execution of in-hand manipulation.
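The planning step described in this abstract, searching a graph of manipulation primitives for a sequence that reaches the goal pose, can be sketched as a plain shortest-path search. This is a minimal illustration, not the paper's actual DMG construction: the graph, node names, and primitives below are hypothetical.

```python
from collections import deque

def plan_primitives(graph, start, goal):
    """Breadth-first search for a shortest sequence of primitives.
    graph: dict mapping a grasp configuration to a list of
    (primitive, next_configuration) edges."""
    queue = deque([(start, [])])
    visited = {start}
    while queue:
        config, plan = queue.popleft()
        if config == goal:
            return plan
        for primitive, nxt in graph.get(config, []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append((nxt, plan + [primitive]))
    return None  # goal unreachable from start

# Hypothetical toy graph: rotate/slide primitives between grasp poses
toy = {
    "edge_grasp": [("rotate", "corner_grasp"), ("slide", "center_grasp")],
    "corner_grasp": [("slide", "center_grasp")],
    "center_grasp": [],
}
print(plan_primitives(toy, "edge_grasp", "center_grasp"))  # -> ['slide']
```

A real DMG would also encode gripper constraints on each edge, but the search over primitive sequences has the same shape.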
RealDex: Towards Human-like Grasping for Robotic Dexterous Hand
In this paper, we introduce RealDex, a pioneering dataset capturing authentic
dexterous hand grasping motions infused with human behavioral patterns,
enriched by multi-view and multimodal visual data. Utilizing a teleoperation
system, we seamlessly synchronize human-robot hand poses in real time. This
collection of human-like motions is crucial for training dexterous hands to
mimic human movements more naturally and precisely. RealDex holds immense
promise for advancing humanoid robots in automated perception, cognition, and
manipulation in real-world scenarios. Moreover, we introduce a cutting-edge
dexterous grasping motion generation framework, which aligns with human
experience and enhances real-world applicability through effectively utilizing
Multimodal Large Language Models. Extensive experiments have demonstrated the
superior performance of our method on RealDex and other open datasets. The
complete dataset and code will be made available upon the publication of this
work.
A Grasping-centered Analysis for Cloth Manipulation
Compliant and soft hands have gained a lot of attention in the past decade
because of their ability to adapt to the shape of the objects, increasing their
effectiveness for grasping. However, when it comes to grasping highly flexible
objects such as textiles, we face the converse problem: it is the object that
adapts to the shape of the hand or gripper. In this context, classic grasp
analysis or grasping taxonomies are not suitable for describing textile objects
grasps. This work proposes a novel definition of textile object grasps that
abstracts from the robotic embodiment or hand shape and recovers concepts from
the early neuroscience literature on hand prehension skills. This framework
enables us to identify which grasps have been used in the literature so far to
perform robotic cloth manipulation, and allows for a precise definition of all
the tasks that have been tackled in terms of manipulation primitives based on
regrasps. In addition, we also review what grippers have been used. Our
analysis shows that the vast majority of cloth manipulations have relied on
only one type of grasp; at the same time, we identify several tasks that need a
greater variety of grasp types to be executed successfully. Our framework is
generic, provides a classification of cloth manipulation primitives, and can
inspire gripper design and benchmark construction for cloth manipulation.
Comment: 13 pages, 4 figures, 4 tables. Accepted for publication at IEEE
Transactions on Robotics.
Dexterous Functional Grasping
While there have been significant strides in dexterous manipulation, most work
has been limited to benchmark tasks like in-hand reorientation, which are of
limited utility in the real world. The main benefit of dexterous hands over
two-fingered ones is their ability to pick up tools and other objects
(including thin ones) and grasp them firmly to apply force. However, this task
requires both a complex understanding of functional affordances and precise
low-level control. While prior work obtains affordances from human data, this
approach doesn't scale to low-level control. Similarly, simulation training
cannot give the robot an understanding of real-world semantics. In this paper,
we aim to combine the best of both worlds to accomplish functional grasping for
in-the-wild objects. We use a modular approach: first, affordances are obtained
by matching corresponding regions of different objects; then a low-level
policy trained in simulation is run to grasp the object. We propose a novel
application of
eigengrasps to reduce the search space of RL using a small amount of human data
and find that it leads to more stable and physically realistic motion. We find
that the eigengrasp action space beats baselines in simulation, outperforms
hardcoded grasping in the real world, and matches or outperforms a trained
human teleoperator. Result visualizations and videos are at
https://dexfunc.github.io/
Comment: In CoRL 2023. Website at https://dexfunc.github.io
Robust Grasp with Compliant Multi-Fingered Hand
As robots find more and more applications in unstructured environments, the need for grippers able to grasp and manipulate a large variety of objects has brought consistent attention to the use of multi-fingered hands. The hardware development and control of these devices have become one of the most active research subjects in the field of grasping and dexterous manipulation. Despite a large number of publications on grasp planning, grasping frameworks that strongly depend on information collected by touching the object have begun receiving attention only in recent years. This thesis focuses on the development of a controller for a robotic system, composed of a 7-dof collaborative arm and a 16-dof torque-controlled multi-fingered hand, that successfully and robustly grasps various objects. The robustness of the grasp is increased through active interaction between the object and the arm/hand system. Algorithms that rely on the kinematic model of the arm/hand system and its compliance characteristics are proposed and tested in real grasping applications. The obtained results underline the importance of taking advantage of information from hand-object contacts, which is necessary to achieve human-like abilities in grasping tasks.
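The compliant interaction this abstract describes is commonly realized as a Cartesian impedance law mapped through the arm/hand Jacobian. The following is a generic sketch under stated assumptions; the gains, Jacobian, and function name are hypothetical, not the thesis controller.

```python
import numpy as np

def compliant_grasp_torques(jacobian, stiffness, damping, x_err, xdot):
    """Cartesian impedance law tau = J^T (K @ x_err - D @ xdot):
    a restoring wrench toward the contact target, damped by the
    end-effector velocity, mapped to joint torques."""
    wrench = stiffness @ x_err - damping @ xdot
    return jacobian.T @ wrench

# Toy example: identity Jacobian, 1 cm position error along x, at rest
J = np.eye(3)
K = 100.0 * np.eye(3)   # stiffness gains (hypothetical, N/m)
D = 10.0 * np.eye(3)    # damping gains (hypothetical, N*s/m)
tau = compliant_grasp_torques(J, K, D, np.array([0.01, 0.0, 0.0]), np.zeros(3))
# tau -> [1.0, 0.0, 0.0]: a gentle push along x toward the contact
```

Lower stiffness lets the fingers yield to the object during contact, which is the "active interaction" that makes the grasp robust to pose uncertainty.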
Improved GelSight Tactile Sensor for Measuring Geometry and Slip
A GelSight sensor uses an elastomeric slab covered with a reflective membrane
to measure tactile signals. It measures 3D geometry and contact force
information with high spatial resolution, and has enabled many challenging
robot tasks. A previous sensor, based on a semi-specular membrane, produces
high resolution but limited geometric accuracy. In this paper, we describe a
new design of GelSight for robot grippers, using a Lambertian membrane and a
new illumination system, which gives greatly improved geometric
accuracy while retaining the compact size. We demonstrate its use in measuring
surface normals and reconstructing height maps using photometric stereo. We
also use it for the task of slip detection, using a combination of information
about relative motions on the membrane surface and the shear distortions. Using
a robotic arm and a set of 37 everyday objects with varied properties, we find
that the sensor can detect translational and rotational slip in general cases,
and can be used to improve the stability of the grasp.
Comment: IEEE/RSJ International Conference on Intelligent Robots and Systems
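The photometric-stereo step mentioned in this abstract, recovering surface normals from images taken under several known lights, reduces for a Lambertian membrane to a per-pixel least-squares solve. A minimal sketch under that Lambertian assumption; the light directions and synthetic pixel below are made up for illustration, not the sensor's calibration.

```python
import numpy as np

def photometric_stereo(intensities, light_dirs):
    """Recover per-pixel normals for a Lambertian surface.
    intensities: (L, P) stack of P pixels under L lights;
    light_dirs: (L, 3) unit light directions.
    Solves I = L @ (albedo * n) for each pixel by least squares."""
    g, *_ = np.linalg.lstsq(light_dirs, intensities, rcond=None)  # (3, P)
    albedo = np.linalg.norm(g, axis=0)
    normals = g / np.maximum(albedo, 1e-9)  # unit normals, one per column
    return normals, albedo

# Synthetic check: one pixel with a known upward-facing normal
n = np.array([0.0, 0.0, 1.0])
L = np.array([[0.0, 0.0, 1.0],
              [0.6, 0.0, 0.8],
              [0.0, 0.6, 0.8]])            # three hypothetical light directions
I = (L @ n)[:, None]                       # Lambertian shading, albedo = 1
est, _ = photometric_stereo(I, L)
assert np.allclose(est[:, 0], n, atol=1e-6)
```

Integrating the recovered normal field (e.g. by a Poisson solve) then yields the height map the paper reconstructs; that step is omitted here.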