A bistable soft gripper with mechanically embedded sensing and actuation for fast closed-loop grasping
Soft robotic grippers have been shown to be highly effective at grasping
unstructured objects with simple sensing and control strategies. However, they
are still limited by their speed, sensing capabilities, and actuation mechanisms,
which has restricted their use in highly dynamic grasping tasks. This
paper presents a soft robotic gripper with tunable bistable properties for
sensor-less dynamic grasping. The bistable mechanism allows us to store
arbitrarily large strain energy in the soft system which is then released upon
contact. The mechanism also provides flexibility in the choice of actuation,
since the grasping and sensing phases are completely passive. The theoretical
background of the mechanism is presented, with finite element analysis providing
insight into the design parameters. Finally, we experimentally demonstrate
sensor-less dynamic grasping of an unknown object within 0.02 seconds,
including the time to sense and actuate.
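As an illustrative aside (not taken from the paper), bistability is commonly modeled with a double-well strain-energy potential; snap-through releases the energy barrier stored between the two stable states. A minimal sketch, assuming a quartic potential V(x) = k(x^2 - d^2)^2 with hypothetical parameters k and d:

```python
import numpy as np

# Double-well potential: two stable minima at x = +/- d (e.g. the
# gripper's open and closed states), separated by a barrier at x = 0.
# k and d are illustrative tuning parameters, not values from the paper.
def potential(x, k=1.0, d=1.0):
    return k * (x**2 - d**2) ** 2

x = np.linspace(-1.5, 1.5, 3001)
V = potential(x)

# Energy released at snap-through: barrier height above the minima.
barrier = V[np.argmin(np.abs(x))] - V.min()
```

Tuning d (the offset of the stable states) scales the barrier as k*d^4, which is one way a "tunable" bistable element can store more or less strain energy before release.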
Finding antipodal point grasps on irregularly shaped objects
Two-finger antipodal point grasping of arbitrarily shaped smooth 2-D and 3-D objects is considered. An object function is introduced that maps a finger contact space to the object surface. Conditions are developed to identify the feasible grasping region, F, in the finger contact space. A “grasping energy function”, E, is introduced which is proportional to the distance between two grasping points. The antipodal points correspond to critical points of E in F. Optimization and/or continuation techniques are used to find these critical points. In particular, global optimization techniques are applied to find the “maximal” or “minimal” grasp. Further, modeling techniques are introduced for representing 2-D and 3-D objects using B-spline curves and spherical product surfaces.
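A minimal numerical sketch of this idea, assuming an ellipse as the smooth 2-D object; the grid search, tolerances, and variable names are illustrative, not the paper's method:

```python
import numpy as np

# Smooth 2-D object: ellipse p(t) = (a cos t, b sin t).
a, b = 2.0, 1.0
t = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)

p = np.stack([a * np.cos(t), b * np.sin(t)], axis=1)   # surface points
n = np.stack([b * np.cos(t), a * np.sin(t)], axis=1)   # outward normals
n /= np.linalg.norm(n, axis=1, keepdims=True)

# All candidate finger-contact pairs (t_i, t_j).
d = p[:, None, :] - p[None, :, :]                      # connecting vectors
dist = np.linalg.norm(d, axis=2)
np.fill_diagonal(dist, np.nan)
dn = d / dist[..., None]

# Feasible (antipodal) region: opposing contact normals, with the
# connecting line aligned with the normals.
opposing = np.linalg.norm(n[:, None, :] + n[None, :, :], axis=2) < 0.05
aligned = np.abs(dn[..., 0] * n[:, None, 1] - dn[..., 1] * n[:, None, 0]) < 0.05
feasible = opposing & aligned & ~np.isnan(dist)

# Grasping energy E ~ squared distance between the two grasping points;
# its extrema over the feasible region are the maximal/minimal grasps.
E = np.where(feasible, dist**2, np.nan)
max_grasp = np.sqrt(np.nanmax(E))   # major-axis endpoints, distance ~ 2a
min_grasp = np.sqrt(np.nanmin(E))   # minor-axis endpoints, distance ~ 2b
```

For the ellipse, the only antipodal critical points are the axis endpoints, so the maximal and minimal grasps recover spans of roughly 2a and 2b.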
Human Robot Interface for Assistive Grasping
This work describes a new human-in-the-loop (HitL) assistive grasping system
for individuals with varying levels of physical capabilities. We investigated
the feasibility of using four potential input devices with our assistive
grasping system interface, using able-bodied individuals to define a set of
quantitative metrics that could be used to assess an assistive grasping system.
We then used these measurements to create a generalized benchmark for
evaluating the effectiveness of an arbitrary input device in a HitL grasping
system. The four input devices were a mouse, a speech recognition device, an
assistive switch, and a novel sEMG device developed by our group that was
connected either to the forearm or behind the ear of the subject. These
preliminary results provide insight into how different interface devices
perform for generalized assistive grasping tasks and also highlight the
potential of sEMG-based control for severely disabled individuals.
Comment: 8 pages, 21 figures
More Than a Feeling: Learning to Grasp and Regrasp using Vision and Touch
For humans, the process of grasping an object relies heavily on rich tactile
feedback. Most recent robotic grasping work, however, has been based only on
visual input, and thus cannot easily benefit from feedback after initiating
contact. In this paper, we investigate how a robot can learn to use tactile
information to iteratively and efficiently adjust its grasp. To this end, we
propose an end-to-end action-conditional model that learns regrasping policies
from raw visuo-tactile data. This model -- a deep, multimodal convolutional
network -- predicts the outcome of a candidate grasp adjustment, and then
executes a grasp by iteratively selecting the most promising actions. Our
approach requires neither calibration of the tactile sensors, nor any
analytical modeling of contact forces, thus reducing the engineering effort
required to obtain efficient grasping policies. We train our model with data
from about 6,450 grasping trials on a two-finger gripper equipped with GelSight
high-resolution tactile sensors on each finger. Across extensive experiments,
our approach outperforms a variety of baselines at (i) estimating grasp
adjustment outcomes, (ii) selecting efficient grasp adjustments for quick
grasping, and (iii) reducing the amount of force applied at the fingers, while
maintaining competitive performance. Finally, we study the choices made by our
model and show that it has successfully acquired useful and interpretable
grasping behaviors.
Comment: 8 pages. Published in IEEE Robotics and Automation Letters (RAL).
Website: https://sites.google.com/view/more-than-a-feelin
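The iterative selection loop described above can be sketched as follows; `predict_success` is a toy stand-in for the learned visuo-tactile model, and the candidate sampling, step count, and confidence threshold are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_success(grasp, action):
    # Toy stand-in for the action-conditional model: scores a candidate
    # adjustment by how close the adjusted grasp lands to an optimum at
    # the origin (unknown to the policy). Returns a value in (0, 1].
    adjusted = grasp + action
    return float(np.exp(-np.sum(adjusted**2)))

def regrasp(grasp, n_steps=5, n_candidates=64):
    for _ in range(n_steps):
        # Sample candidate grasp adjustments (dx, dy, d_theta).
        candidates = rng.normal(scale=0.3, size=(n_candidates, 3))
        scores = [predict_success(grasp, a) for a in candidates]
        best = candidates[int(np.argmax(scores))]
        grasp = grasp + best             # execute most promising adjustment
        if max(scores) > 0.95:           # confident enough: stop regrasping
            break
    return grasp

final = regrasp(np.array([0.8, -0.5, 0.4]))
```

The point of the structure is that no contact-force model or sensor calibration appears anywhere: the loop only needs a model that scores candidate adjustments from raw observations.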
