Realtime State Estimation with Tactile and Visual Sensing. Application to Planar Manipulation
Accurate and robust object state estimation enables successful object
manipulation. Visual sensing is widely used to estimate object poses. However,
in a cluttered scene or in a tight workspace, the robot's end-effector often
occludes the object from the visual sensor. The robot then loses visual
feedback and must fall back on open-loop execution.
In this paper, we integrate both tactile and visual input using a framework
for solving the SLAM problem, incremental smoothing and mapping (iSAM), to
provide a fast and flexible solution. Visual sensing provides global pose
information but is noisy in general, whereas contact sensing is local, but its
measurements are more accurate relative to the end-effector. By combining them,
we aim to exploit their advantages and overcome their limitations. We explore
the technique in the context of a pusher-slider system. We adapt iSAM's
measurement cost and motion cost to the pushing scenario, and use an
instrumented setup to evaluate the estimation quality with different object
shapes, on different surface materials, and under different contact modes.
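The core fusion idea the abstract describes, combining a noisy but global visual pose with an accurate but local tactile measurement, can be sketched as the information-weighted product of two Gaussian estimates, which is the same weighting a factor graph such as iSAM applies at each measurement factor. This is a minimal sketch, not the paper's implementation; the poses and covariance values are illustrative.

```python
import numpy as np

def fuse_pose_estimates(z_vis, cov_vis, z_tac, cov_tac):
    """Fuse two Gaussian pose estimates in information (inverse-
    covariance) form: the fused mean is pulled toward whichever
    source is more certain in each dimension."""
    info_vis = np.linalg.inv(cov_vis)
    info_tac = np.linalg.inv(cov_tac)
    cov_fused = np.linalg.inv(info_vis + info_tac)
    z_fused = cov_fused @ (info_vis @ z_vis + info_tac @ z_tac)
    return z_fused, cov_fused

# Planar object pose (x, y, theta): vision is globally consistent but
# coarse; tactile contact is precise relative to the end-effector.
z_vis = np.array([0.52, 0.31, 0.10])
z_tac = np.array([0.50, 0.30, 0.12])
cov_vis = np.diag([1e-2, 1e-2, 5e-2])   # coarse visual uncertainty
cov_tac = np.diag([1e-4, 1e-4, 1e-3])   # tight tactile uncertainty

pose, cov = fuse_pose_estimates(z_vis, cov_vis, z_tac, cov_tac)
```

The fused estimate lands close to the tactile measurement while retaining the visual prior's global anchoring, and its covariance is tighter than either input's.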
Zero-shot sim-to-real transfer of tactile control policies for aggressive swing-up manipulation
This paper aims to show that robots equipped with a vision-based tactile
sensor can perform dynamic manipulation tasks without prior knowledge of all
the physical attributes of the objects to be manipulated. For this purpose, a
robotic system is presented that is able to swing up poles of different masses,
radii and lengths, to an angle of 180 degrees, while relying solely on the
feedback provided by the tactile sensor. This is achieved by developing a novel
simulator that accurately models the interaction of a pole with the soft
sensor. A feedback policy that is conditioned on a sensory observation history,
and which has no prior knowledge of the physical features of the pole, is then
learned in the aforementioned simulation. When evaluated on the physical
system, the policy is able to swing up a wide range of poles that differ
significantly in their physical attributes without further adaptation. To the
authors' knowledge, this is the first work where a feedback policy from
high-dimensional tactile observations is used to control the swing-up
manipulation of poles in closed loop.
Comment: Accompanying video: https://youtu.be/4rG-o2Cz3-
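The policy described above is conditioned on a history of tactile observations, which is what lets it infer latent physical parameters (mass, radius, length) it is never told directly. A minimal sketch of such a history-conditioned policy wrapper follows; `policy_fn` stands in for the learned network and is hypothetical, as are the dimensions.

```python
from collections import deque
import numpy as np

class HistoryPolicy:
    """Feed a fixed-length window of past tactile observations to a
    policy function, so the policy can implicitly identify object
    properties from how the contact signal evolves over time."""

    def __init__(self, policy_fn, history_len=10, obs_dim=16):
        self.policy_fn = policy_fn
        # Pre-fill with zeros so the stacked input has a fixed size
        # from the first control step onward.
        self.buffer = deque([np.zeros(obs_dim)] * history_len,
                            maxlen=history_len)

    def act(self, obs):
        self.buffer.append(np.asarray(obs, dtype=float))
        stacked = np.concatenate(self.buffer)  # history as one vector
        return self.policy_fn(stacked)

# Dummy linear "network" standing in for the trained policy.
rng = np.random.default_rng(0)
w = rng.standard_normal(160) * 0.01
policy = HistoryPolicy(lambda x: float(w @ x), history_len=10, obs_dim=16)

actions = [policy.act(rng.standard_normal(16)) for _ in range(5)]
```

Because the buffer has a fixed `maxlen`, the oldest observation is discarded automatically at every step, keeping the policy input size constant.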
Innovative robot hand designs of reduced complexity for dexterous manipulation
This thesis investigates the mechanical design of robot hands to sensibly reduce the system complexity in terms of the number of actuators and sensors, and control needs for performing grasping and in-hand manipulations of unknown objects.
Human hands are known to be the most complex, versatile, and dexterous manipulators in nature, able to perform tasks ranging from sophisticated surgery to a wide variety of daily activities (e.g. preparing food, changing clothes, playing instruments). However, why human hands can perform such fascinating tasks still eludes complete comprehension.
Since at least the end of the sixteenth century, scientists and engineers have tried to match the sensory and motor functions of the human hand. As a result, many contemporary humanoid and anthropomorphic robot hands have been developed to closely replicate the appearance and dexterity of human hands, in many cases using sophisticated designs that integrate multiple sensors and actuators---which make them prone to error and difficult to operate and control, particularly under uncertainty.
In recent years, several simplification approaches and solutions have been proposed to develop more effective and reliable dexterous robot hands. These techniques, which have been based on using underactuated mechanical designs, kinematic synergies, or compliant materials, to name some, have opened up new ways to integrate hardware enhancements to facilitate grasping and dexterous manipulation control and improve reliability and robustness.
Following this line of thought, this thesis studies four robot hand hardware aspects for enhancing grasping and manipulation, with a particular focus on dexterous in-hand manipulation. Namely: i) the use of passive soft fingertips; ii) the use of rigid and soft active surfaces in robot fingers; iii) the use of robot hand topologies to create particular in-hand manipulation trajectories; and iv) the decoupling of grasping and in-hand manipulation by introducing a reconfigurable palm.
In summary, the findings from this thesis provide important notions for understanding the significance of mechanical and hardware elements in the performance and control of human manipulation. These findings show great potential for developing robust, easily programmable, and economically viable robot hands capable of performing dexterous manipulations under uncertainty, while exhibiting a valuable subset of functions of the human hand.
A Robust Controller for Stable 3D Pinching using Tactile Sensing
This paper proposes a controller for stable grasping of unknown-shaped
objects by two robotic fingers with tactile fingertips. The grasp is stabilised
by rolling the fingertips on the contact surface and applying a desired
grasping force to reach an equilibrium state. The validation is both in
simulation and on a fully-actuated robot hand (the Shadow Modular Grasper)
fitted with custom-built optical tactile sensors (based on the BRL TacTip). The
controller requires the orientations of the contact surfaces, which are
estimated by regressing a deep convolutional neural network over the tactile
images. Overall, the grasp system is demonstrated to achieve stable equilibrium
poses on various objects ranging in shape and softness, with the system being
robust to perturbations and measurement errors. This approach also has promise
to extend beyond grasping to stable in-hand object manipulation with multiple
fingers.
Comment: 8 pages, 10 figures, 1 appendix. Accepted for publication in IEEE
Robotics and Automation Letters and in IEEE/RSJ International Conference on
Intelligent Robots and Systems (IROS 2021). Supplemental video:
https://youtu.be/rfQesw3FDA
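The control idea in the abstract, rolling the fingertips on the contact surface while regulating the grasp force toward a desired value, can be sketched as two coupled proportional laws acting on the CNN's orientation estimate and the measured force. This is an illustrative one-degree-of-freedom sketch under simple assumptions, not the paper's controller; gains and targets are made up.

```python
def pinch_control(theta_est, force_meas, theta_goal=0.0,
                  force_goal=2.0, k_roll=0.5, k_force=0.3):
    """One step of a toy stable-pinch controller: roll the fingertip
    to reduce the estimated surface-orientation error, and squeeze
    to drive the measured grasp force to the desired value."""
    roll_cmd = -k_roll * (theta_est - theta_goal)      # rolling action
    squeeze_cmd = k_force * (force_goal - force_meas)  # force regulation
    return roll_cmd, squeeze_cmd

# Simulate convergence of a toy 1-DoF contact under the controller.
theta, force = 0.4, 0.5   # initial misalignment (rad) and force (N)
for _ in range(50):
    d_theta, d_force = pinch_control(theta, force)
    theta += d_theta
    force += d_force
```

With these gains both errors contract geometrically each step, so the toy contact settles at the aligned, desired-force equilibrium the abstract describes.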
Object Manipulation and Grip-Force Control Using Tactile Sensors
This dissertation describes a new type of tactile sensor and an improved dynamic tactile sensing approach that provides a regularly updated, accurate estimate of the minimum applied force needed to control gripper manipulation. A pre-slip sensing algorithm is proposed and implemented on a two-finger robot gripper. An algorithm that can discriminate between types of contact surface and recognize objects at the contact stage is also proposed. A technique for recognizing objects using tactile sensor arrays, and a method based on quadric surface parameters for classifying grasped objects, are described. Tactile arrays can recognize surface types on contact, making it possible for a tactile system to recognize translation, rotation, and scaling of an object independently.
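Pre-slip sensing of the kind the dissertation describes typically exploits the fact that incipient slip shows up as high-frequency micro-vibrations in the tactile signal before gross slip occurs. The following is a hedged sketch of that idea, not the dissertation's algorithm: high-pass the signal by first-differencing and flag windows whose RMS exceeds a threshold, at which point a gripper would raise its grip force. Window size and threshold are illustrative.

```python
import numpy as np

def preslip_alarm(signal, window=8, threshold=0.05):
    """Flag windows of a tactile time series whose high-frequency
    energy (RMS of the first difference) suggests incipient slip."""
    d = np.diff(signal)  # crude high-pass: removes the static load
    alarms = []
    for i in range(0, len(d) - window + 1, window):
        rms = np.sqrt(np.mean(d[i:i + window] ** 2))
        alarms.append(rms > threshold)
    return alarms

# Quiet hold followed by the onset of micro-slip vibration.
t = np.linspace(0, 1, 129)
quiet = 1.0 + 0.001 * np.sin(2 * np.pi * 5 * t[:65])
vibrate = 1.0 + 0.2 * np.sin(2 * np.pi * 40 * t[65:])
alarms = preslip_alarm(np.concatenate([quiet, vibrate]))
```

The alarm stays off during the quiet hold and trips once the vibration begins, which is the trigger a minimum-force grip controller needs.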
Soft Fingertips with Tactile Sensing and Active Deformation for Robust Grasping of Delicate Objects
Soft fingertips have shown significant adaptability for grasping a wide range of object shapes thanks to their elasticity. This ability can be enhanced to grasp soft, delicate objects by adding touch sensing. However, in these cases, the complete restraint and robustness of the grasp have proved challenging, as exerting additional forces on a fragile object can result in damage. This paper presents a novel soft fingertip design for delicate objects based on the concept of embedded air cavities, which allow the dual ability of adaptive sensing and active shape changing. The pressurized air cavities act as soft tactile sensors that control gripper position from internal pressure variation, and active fingertip deformation is achieved by applying positive pressure to these cavities, which then keeps a delicate object securely in position by form closure, despite externally applied forces. We demonstrate this improved grasping capability by comparing the displacement of grasped delicate objects exposed to high-speed motions. Results show that passive soft fingertips fail to restrain fragile objects at accelerations as low as 0.1 m/s²; in contrast, with the proposed fingertips, delicate objects remain completely secure even at accelerations above 5 m/s².
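The dual role of the air cavities, passive pressure sensing while closing and active pressurization once contact is made, suggests a simple two-mode control loop. The sketch below is an assumption-laden illustration of that logic, not the paper's controller; the pressure values (in kPa) and thresholds are invented.

```python
def cavity_gripper_step(pressure, baseline, state, contact_delta=0.5,
                        lock_pressure=30.0):
    """One step of a toy dual-mode air-cavity fingertip controller.
    In 'closing' mode the cavity is a passive sensor: a pressure rise
    above the baseline signals contact and stops the approach. The
    controller then switches to 'locking' and commands positive
    pressure so the fingertip deforms around the object (form closure).
    Returns (next_state, pressure_command or None)."""
    if state == "closing":
        if pressure - baseline > contact_delta:  # contact detected
            return "locking", lock_pressure
        return "closing", None
    return "locking", lock_pressure

state, cmd = "closing", None
trace = []
for p in [10.0, 10.1, 10.2, 11.5, 12.0]:  # readings while closing
    state, cmd = cavity_gripper_step(p, baseline=10.0, state=state)
    trace.append(state)
```

The mode switch is one-way: once contact has raised the cavity pressure past the threshold, the controller keeps commanding the lock pressure rather than squeezing harder, which is what protects the fragile object.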
Proprioceptive Learning with Soft Polyhedral Networks
Proprioception is the "sixth sense" that detects limb postures with motor
neurons. It requires a natural integration between the musculoskeletal systems
and sensory receptors, which is challenging among modern robots that aim for
lightweight, adaptive, and sensitive designs at a low cost. Here, we present
the Soft Polyhedral Network with an embedded vision for physical interactions,
capable of adaptive kinesthesia and viscoelastic proprioception by learning
kinetic features. This design enables passive adaptations to omni-directional
interactions, visually captured by a miniature high-speed motion tracking
system embedded inside for proprioceptive learning. The results show that the
soft network can infer real-time 6D forces and torques with accuracies of
0.25/0.24/0.35 N and 0.025/0.034/0.006 Nm in dynamic interactions. We also
incorporate viscoelasticity in proprioception during static adaptation by
adding a creep and relaxation modifier to refine the predicted results. The
proposed soft network combines simplicity in design, omni-adaptation, and
proprioceptive sensing with high accuracy, making it a versatile solution for
robotics at a low cost with more than 1 million use cycles for tasks such as
sensitive and competitive grasping, and touch-based geometry reconstruction.
This study offers new insights into vision-based proprioception for soft robots
in adaptive grasping, soft manipulation, and human-robot interaction.
Comment: 20 pages, 10 figures, 2 tables, submitted to the International
Journal of Robotics Research for review
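The "creep and relaxation modifier" mentioned above addresses a known property of soft materials: under a held deformation, the true contact force decays over time, so a purely elastic force predictor over-estimates during static contact. A common way to model this, sketched here under assumed constants (the time constant and long-term ratio would be fitted to the sensor and are not from the paper), is a first-order exponential relaxation.

```python
import numpy as np

def relaxation_modifier(f_pred, t_static, tau=2.0, ratio=0.8):
    """Correct an elastic force prediction for stress relaxation
    during static contact: the force decays exponentially from the
    instantaneous prediction toward a long-term fraction of it."""
    f_inf = ratio * f_pred                       # fully relaxed force
    return f_inf + (f_pred - f_inf) * np.exp(-t_static / tau)

f0 = relaxation_modifier(1.0, 0.0)       # at contact onset: no correction
f_long = relaxation_modifier(1.0, 100.0) # long hold: relaxed force
```

At contact onset the modifier leaves the prediction untouched; as the static hold lengthens, the output decays toward the long-term fraction, mirroring the viscoelastic behavior the abstract says the modifier captures.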