Shape and Pose Recovery from Planar Pushing
Tactile exploration refers to the use of physical interaction to infer object properties. In this work, we study the feasibility of recovering the shape and pose of a movable object from observing a series of contacts. In particular, we approach the problem of estimating the shape and trajectory of a planar object lying on a frictional surface, and being pushed by a frictional probe. The probe, when in contact with the object, makes observations of the location of contact and the contact normal.
Our approach draws inspiration from the SLAM problem, where noisy observations of the location of landmarks are used to reconstruct and locate a static environment. In tactile exploration, analogously, we can think of the object as a rigid but moving environment, and of the pusher as a sensor that reports contact points on the boundary of the object.
A key challenge to tactile exploration is that, unlike visual feedback, sensing by touch is intrusive in nature: the object moves by the action of sensing. In the 2D version of the problem that we study in this paper, the well-understood mechanics of planar frictional pushing provides a motion model that plays the role of odometry. The conjecture we investigate in this paper is whether the models of frictional pushing are sufficiently descriptive to simultaneously estimate the shape and pose of an object from the cumulative effect of a sequence of pushes.
National Science Foundation (U.S.) (Award IIS-1427050)
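The SLAM analogy above treats each contact as a landmark observation on the object's boundary. As a minimal sketch of that idea (the function name and frames are illustrative assumptions, not the paper's implementation), a contact point and normal measured in the world frame can be mapped into the object frame under a pose hypothesis, so that contacts accumulated over many pushes can be compared against a single shape estimate:

```python
import numpy as np

def world_to_object(contact_pt, contact_normal, pose):
    """Hypothetical helper: transform a measured contact point and
    contact normal from the world frame into the object frame, given
    a planar pose hypothesis (x, y, theta). In a SLAM-style
    formulation, this is how contacts gathered across pushes are
    expressed in one common (object) frame."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])           # object-to-world rotation
    p_obj = R.T @ (np.asarray(contact_pt) - np.array([x, y]))
    n_obj = R.T @ np.asarray(contact_normal)  # normals rotate, no translation
    return p_obj, n_obj

# A probe touching world point (1, 0) on an object at pose
# (0.5, 0, pi/2) sees that contact at (0, -0.5) in the object frame.
p, n = world_to_object([1.0, 0.0], [1.0, 0.0], [0.5, 0.0, np.pi / 2])
```

In an estimator, the residual between such transformed contacts and the current boundary estimate would drive the joint update of shape and pose.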
Pose and motion from contact
In the absence of vision, grasping an object often relies on tactile feedback from the fingertips. As the finger pushes the object, the fingertip can feel the contact point move. If the object is known in advance, from this motion the finger may infer the location of the contact point on the object and thereby the object pose. This paper primarily investigates the problem of determining the pose (orientation and position) and motion (velocity and angular velocity) of a planar object with known geometry from such contact motion generated by pushing. A dynamic analysis of pushing yields a nonlinear system that relates through contact the object pose and motion to the finger motion. The contact motion on the fingertip thus encodes certain information about the object pose. Nonlinear observability theory is employed to show that such information is sufficient for the finger to “observe” not only the pose but also the motion of the object. Therefore a sensing strategy can be realized as an observer of the nonlinear dynamical system. Two observers are subsequently introduced. The first observer, based on the result of [15], has its “gain” determined by the solution of a Lyapunov-like equation; it can be activated at any time instant during a push. The second observer, based on Newton’s method, solves for the initial (motionless) object pose from three intermediate contact points during a push. Under the Coulomb friction model, the paper copes with support friction in the plane and/or contact friction between the finger and the object. Extensive simulations have been done to demonstrate the feasibility of the two observers. Preliminary experiments (with an Adept robot) have also been conducted. A contact sensor has been implemented using strain gauges.
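The observability argument referenced above has a standard form. As a hedged sketch (the paper's specific state vector and contact-output map are not reproduced here), pushing yields a control-affine system with state $x$, pushing input $u$, and contact measurement $y$, and local observability holds when the differentials of repeated Lie derivatives of the output span the full state space:

```latex
\dot{x} = f(x, u), \qquad y = h(x), \qquad
\operatorname{rank}\,\mathrm{span}\bigl\{\mathrm{d}h,\;
\mathrm{d}L_f h,\; \mathrm{d}L_f^2 h,\; \dots\bigr\} = n
```

Satisfying this rank condition is what licenses the construction of the two observers described in the abstract.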
Realtime State Estimation with Tactile and Visual Sensing. Application to Planar Manipulation
Accurate and robust object state estimation enables successful object
manipulation. Visual sensing is widely used to estimate object poses. However,
in a cluttered scene or in a tight workspace, the robot's end-effector often
occludes the object from the visual sensor. The robot then loses visual
feedback and must fall back on open-loop execution.
In this paper, we integrate both tactile and visual input using a framework
for solving the SLAM problem, incremental smoothing and mapping (iSAM), to
provide a fast and flexible solution. Visual sensing provides global pose
information but is noisy in general, whereas contact sensing is local, but its
measurements are more accurate relative to the end-effector. By combining them,
we aim to exploit their advantages and overcome their limitations. We explore
the technique in the context of a pusher-slider system. We adapt iSAM's
measurement cost and motion cost to the pushing scenario, and use an
instrumented setup to evaluate the estimation quality with different object
shapes, on different surface materials, and under different contact modes.
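The complementarity described above (visual: global but noisy; tactile: local but accurate) is the core of the fusion. As an illustrative sketch, not the paper's iSAM implementation, the principle for a single pose variable reduces to information-form fusion of two independent Gaussian measurements, each weighted by its inverse covariance; iSAM applies the same weighting across a whole graph of such factors:

```python
import numpy as np

def fuse(z_vis, cov_vis, z_tac, cov_tac):
    """Illustrative information-form fusion of two independent
    measurements of the same quantity. Visual sensing supplies a
    global but noisy reading (large covariance); tactile sensing
    supplies a locally accurate one (small covariance). The fused
    estimate weights each by its inverse covariance."""
    I_vis = np.linalg.inv(cov_vis)
    I_tac = np.linalg.inv(cov_tac)
    cov = np.linalg.inv(I_vis + I_tac)
    mean = cov @ (I_vis @ z_vis + I_tac @ z_tac)
    return mean, cov

# Noisy visual fix at (1.0, 0.0); accurate tactile fix at (1.2, 0.1).
mean, cov = fuse(np.array([1.0, 0.0]), np.eye(2) * 0.09,
                 np.array([1.2, 0.1]), np.eye(2) * 0.01)
# Fused estimate is [1.18, 0.09], pulled strongly toward the
# low-covariance tactile measurement.
```

A full smoothing-and-mapping solution additionally chains these factors over time through a pushing motion model, which is what the abstract's adapted measurement and motion costs provide.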
Self-supervised 6D Object Pose Estimation for Robot Manipulation
To teach robots skills, it is crucial to obtain data with supervision. Since
annotating real world data is time-consuming and expensive, enabling robots to
learn in a self-supervised way is important. In this work, we introduce a robot
system for self-supervised 6D object pose estimation. Starting from modules
trained in simulation, our system is able to label real world images with
accurate 6D object poses for self-supervised learning. In addition, the robot
interacts with objects in the environment to change the object configuration by
grasping or pushing objects. In this way, our system is able to continuously
collect data and improve its pose estimation modules. We show that the
self-supervised learning improves object segmentation and 6D pose estimation
performance, and consequently enables the system to grasp objects more
reliably. A video showing the experiments can be found at
https://youtu.be/W1Y0Mmh1Gd8.
Comment: Accepted to International Conference on Robotics and Automation (ICRA), 202