    Pose and motion from contact

    In the absence of vision, grasping an object often relies on tactile feedback from the fingertips. As the finger pushes the object, the fingertip can feel the contact point move. If the object is known in advance, the finger may infer from this motion the location of the contact point on the object, and thereby the object pose. This paper primarily investigates the problem of determining the pose (orientation and position) and motion (velocity and angular velocity) of a planar object with known geometry from such contact motion generated by pushing. A dynamic analysis of pushing yields a nonlinear system that relates, through contact, the object pose and motion to the finger motion. The contact motion on the fingertip thus encodes information about the object pose. Nonlinear observability theory is employed to show that this information is sufficient for the finger to “observe” not only the pose but also the motion of the object; a sensing strategy can therefore be realized as an observer of the nonlinear dynamical system. Two observers are subsequently introduced. The first observer, based on the result of Gauthier et al. (1992), has its “gain” determined by the solution of a Lyapunov-like equation; it can be activated at any time instant during a push. The second observer, based on Newton’s method, solves for the initial (motionless) object pose from three intermediate contact points during a push. Under the Coulomb friction model, the paper copes with support friction in the plane and/or contact friction between the finger and the object. Extensive simulations demonstrate the feasibility of the two observers. Preliminary experiments (with an Adept robot) have also been conducted, using a contact sensor implemented with strain gauges.
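    To make the second observer concrete, here is a minimal sketch, assuming a hypothetical elliptical object (semi-axes A, B) in place of the paper's general known geometry: Newton's method is run on the constraint that each of three measured world-frame contact points must lie on the object boundary, with the pose q = (x, y, theta) as the unknown. All names and numeric values below are illustrative assumptions, not the paper's implementation.

        import numpy as np

        A, B = 0.05, 0.03  # assumed ellipse semi-axes (m); the paper allows general known shapes

        def boundary(p):
            # Implicit ellipse equation: zero exactly on the boundary.
            return (p[0] / A) ** 2 + (p[1] / B) ** 2 - 1.0

        def residual(q, contacts):
            # One residual per measured world-frame contact point: mapped into the
            # object frame with pose q = (x, y, theta), it must satisfy the
            # boundary equation.
            x, y, th = q
            R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
            return np.array([boundary(R.T @ (c - np.array([x, y]))) for c in contacts])

        def newton_pose(q0, contacts, iters=20, eps=1e-6):
            # Plain Newton iteration with a finite-difference Jacobian. It needs an
            # initial guess reasonably close to the true pose, since three boundary
            # constraints generally admit multiple pose solutions.
            q = np.asarray(q0, dtype=float)
            for _ in range(iters):
                r = residual(q, contacts)
                if np.linalg.norm(r) < 1e-12:
                    break
                J = np.zeros((3, 3))
                for j in range(3):
                    dq = np.zeros(3)
                    dq[j] = eps
                    J[:, j] = (residual(q + dq, contacts) - r) / eps
                q = q - np.linalg.solve(J, r)
            return q

        # Synthetic check: sample three boundary points under a known pose, recover it.
        true_q = np.array([0.10, 0.20, 0.30])
        phis = np.array([0.3, 1.7, 2.9])
        pts_obj = np.stack([A * np.cos(phis), B * np.sin(phis)], axis=1)
        c, s = np.cos(true_q[2]), np.sin(true_q[2])
        contacts = pts_obj @ np.array([[c, -s], [s, c]]).T + true_q[:2]
        print(newton_pose(true_q + 0.01, contacts))  # ~ [0.10, 0.20, 0.30]

    With noisy contact measurements one would instead minimize the same residuals in a least-squares sense rather than solving them exactly.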

    More than a Million Ways to Be Pushed: A High-Fidelity Experimental Dataset of Planar Pushing

    Pushing is a motion primitive useful for handling objects that are too large, too heavy, or too cluttered to be grasped. It is at the core of much of robotic manipulation, in particular when physical interaction is involved. It seems reasonable, then, to wish for robots to understand how pushed objects move. In reality, however, robots often rely on approximations that yield models that are computable, but also restricted and inaccurate. How closely do those models match reality? How reasonable are the assumptions they are based on? To help answer these questions, and to build a better experimental understanding of pushing, we present a comprehensive, high-fidelity dataset of planar pushing experiments. The dataset contains timestamped poses of a circular pusher and a pushed object, as well as forces at the interaction. We vary the push interaction along 6 dimensions: surface material, shape of the pushed object, contact position, pushing direction, pushing speed, and pushing acceleration. An industrial robot automates the data capture along precisely controlled position-velocity-acceleration trajectories of the pusher, which give dense samples of positions and forces of uniform quality. We finish the paper by characterizing the variability of friction and evaluating the most common assumptions and simplifications made by models of frictional pushing in robotics.
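    As a rough picture of how a 6-dimensional experimental sweep like this is enumerated, the toy sketch below takes the Cartesian product of candidate levels; every level value here is an assumed placeholder, not the dataset's actual experimental conditions.

        from itertools import product

        # Assumed placeholder levels for the six varied dimensions.
        surfaces   = ["abs", "plywood"]            # surface material
        shapes     = ["rect1", "ellip1", "tri1"]   # pushed-object shape
        contact_ps = [0.0, 0.25, 0.5]              # contact position along the edge (normalized)
        directions = [-0.5, 0.0, 0.5]              # pushing direction (rad, w.r.t. edge normal)
        speeds     = [0.02, 0.05, 0.10]            # pusher speed (m/s)
        accels     = [0.0, 0.5]                    # pusher acceleration (m/s^2)

        grid = list(product(surfaces, shapes, contact_ps, directions, speeds, accels))
        print(len(grid), "distinct push conditions in this toy grid")  # 324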

    Friction Variability in Planar Pushing Data: Anisotropic Friction and Data-collection Bias

    Friction plays a key role in manipulating objects. Most of what we do with our hands, and most of what robots do with their grippers, is based on the ability to control frictional forces. This paper aims to better understand the variability and predictability of planar friction. In particular, we focus on the analysis of a recent dataset on planar pushing by Yu et al. [1], devised to create a data-driven footprint of planar friction. We show how a significant fraction of the observed unconventional phenomena, e.g., stochasticity and multi-modality, can be explained by combining the effects of material non-homogeneity, anisotropy of friction, and biases due to data-collection dynamics, hinting that the variability is explainable but inevitable in practice. We introduce an anisotropic friction model and conduct simulation experiments comparing it with more standard isotropic friction models. Anisotropic friction between the object and the supporting surface causes initial conditions to converge during automated data collection. Numerical results confirm that the anisotropic friction model explains the bias in the dataset and the apparent stochasticity in the outcome of a push. The fact that the data-collection process can itself introduce biases into the collected datasets, degrading models trained on them, calls attention to data-collection dynamics.
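    To make the isotropic/anisotropic distinction concrete, here is a minimal sketch of one simple anisotropic Coulomb variant (an illustrative assumption, not necessarily the paper's exact model): the friction coefficient takes different principal values along two axes, so the kinetic friction force is generally not anti-parallel to the sliding velocity; that lateral component is what drives the drift that biases repeated pushes.

        import numpy as np

        def friction_force(v, fn, mu_x=0.30, mu_y=0.25):
            # Kinetic friction for sliding velocity v (2-vector) under normal force fn.
            # With mu_x != mu_y the force picks up a component lateral to the motion;
            # setting mu_x == mu_y recovers the standard isotropic Coulomb model.
            speed = np.linalg.norm(v)
            if speed < 1e-12:
                return np.zeros(2)  # sticking case: needs a stiction law, omitted here
            vhat = v / speed
            return -fn * np.array([mu_x * vhat[0], mu_y * vhat[1]])

        # A diagonal slide feels a force that is not opposite to the velocity:
        v = np.array([1.0, 1.0]) / np.sqrt(2)
        f = friction_force(v, fn=1.0)
        lateral_deg = np.degrees(np.arctan2(f[1], f[0]) - np.arctan2(-v[1], -v[0]))
        print(f, lateral_deg)  # ~5 degrees of lateral deviation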

    Localization and Manipulation of Small Parts Using GelSight Tactile Sensing

    Robust manipulation and insertion of small parts can be challenging because of the small tolerances typically involved. The key to robust control of these kinds of manipulation interactions is accurate tracking and control of the parts involved. Typically, this is accomplished using visual servoing or force-based control. However, these approaches have drawbacks. Instead, we propose a new approach that uses tactile sensing to accurately localize the pose of a part grasped in the robot hand. Using a feature-based matching technique in conjunction with a newly developed tactile sensing technology known as GelSight, which has much higher resolution than competing methods, we synthesize high-resolution height maps of object surfaces. These high-resolution tactile maps allow us to localize small parts held in a robot hand very accurately. We quantify localization accuracy in benchtop experiments and experimentally demonstrate the practicality of the approach in the context of a small-parts insertion problem.
    Funding: National Science Foundation (U.S.) (NSF Grant No. 1017862); United States. National Aeronautics and Space Administration (NASA Grant No. NNX13AQ85G); United States. Office of Naval Research (ONR Grant No. N000141410047).
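    A minimal sketch of the localization idea follows. The paper itself uses feature-based matching; FFT phase correlation here is a simpler stand-in under the same premise, and all names are illustrative: registering a stored template height map of the part against a freshly captured tactile height map yields the part's displacement in the gripper.

        import numpy as np

        def phase_correlation(template, height_map):
            # Return the (row, col) shift to apply to height_map so it aligns
            # with template, found as the peak of the normalized cross-power
            # spectrum (standard phase correlation).
            F1 = np.fft.fft2(template)
            F2 = np.fft.fft2(height_map)
            cross = F1 * np.conj(F2)
            cross /= np.abs(cross) + 1e-12
            corr = np.fft.ifft2(cross).real
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            # Wrap shifts larger than half the image size to negative offsets.
            return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

        # Synthetic check with a known circular shift of a random "height map".
        rng = np.random.default_rng(0)
        tmpl = rng.random((64, 64))
        moved = np.roll(tmpl, shift=(5, -3), axis=(0, 1))
        print(phase_correlation(tmpl, moved))  # -> (-5, 3)

    A full in-hand pose estimate would also search over rotations, e.g., by repeating the registration on rotated templates.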

    Shape and Pose Recovery from Planar Pushing

    Tactile exploration refers to the use of physical interaction to infer object properties. In this work, we study the feasibility of recovering the shape and pose of a movable object from observing a series of contacts. In particular, we approach the problem of estimating the shape and trajectory of a planar object lying on a frictional surface and being pushed by a frictional probe. The probe, when in contact with the object, observes the location of contact and the contact normal. Our approach draws inspiration from the SLAM problem, where noisy observations of the locations of landmarks are used to reconstruct and locate a static environment. In tactile exploration, analogously, we can think of the object as a rigid but moving environment, and of the pusher as a sensor that reports contact points on the boundary of the object. A key challenge in tactile exploration is that, unlike visual feedback, sensing by touch is intrusive in nature: the object moves by the action of sensing. In the 2D version of the problem studied in this paper, the well-understood mechanics of planar frictional pushing provides a motion model that plays the role of odometry. The conjecture we investigate is that the models of frictional pushing are sufficiently descriptive to simultaneously estimate the shape and pose of an object from the cumulative effect of a sequence of pushes.
    Funding: National Science Foundation (U.S.) (Award IIS-1427050).
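    To spell out the SLAM analogy, here is a minimal sketch of the measurement model under assumed notation (all names are illustrative): boundary points play the role of landmarks expressed in the object frame, the pose maps them into the world, and each sensed contact contributes a residual; the pushing mechanics supplies the odometry-like factor linking consecutive poses.

        import numpy as np

        def world_from_object(pose, p_obj):
            # Map a boundary point from the object frame to the world frame.
            x, y, th = pose
            c, s = np.cos(th), np.sin(th)
            return np.array([[c, -s], [s, c]]) @ p_obj + np.array([x, y])

        def contact_residual(pose, p_obj, c_meas):
            # Difference between predicted and measured contact point. Stacking
            # these over a sequence of pushes gives the least-squares objective a
            # SLAM-style solver would minimize jointly over poses and shape points.
            return world_from_object(pose, p_obj) - c_meas

        pose = np.array([0.1, 0.2, 0.3])
        p_obj = np.array([0.05, 0.0])               # a boundary "landmark" in the object frame
        c_meas = world_from_object(pose, p_obj)     # noise-free measurement
        print(contact_residual(pose, p_obj, c_meas))  # -> [0. 0.]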

    Generative and predictive models for robust manipulation

    Probabilistic modelling of manipulation skills, perception, and uncertainty poses many challenges at different stages of a typical robot manipulation pipeline. This thesis is about devising algorithms and strategies for improving robustness in object manipulation skills acquired from demonstration and derived from learnt physical models in non-prehensile tasks such as pushing. Manipulation skills can be made robust in different ways: first, by improving time performance for grasp synthesis; second, by employing active perceptual strategies that exploit generated grasp action hypotheses to more efficiently gather task-relevant information for grasp generation; and finally, by exploiting predictive uncertainty in learnt physical models. Hence, robust manipulation skills emerge from the interplay of a triad of capabilities: generative modelling for action synthesis, active perception, and learning and exploiting uncertainty in physical interactions. This thesis addresses these problems by:
    • showing how parametric models for approximating multimodal distributions can be used as a computationally faster method for generative grasp synthesis;
    • exploiting generative methods for dexterous grasp synthesis and investigating how active vision strategies can improve grasp execution safety and success rate while using fewer camera views of an object for grasp generation;
    • outlining methods to model and exploit predictive uncertainty from learnt forward models to achieve robust, uncertainty-averse non-prehensile manipulation, such as push manipulation.
    In particular, the thesis: (i) presents a framework for generative grasp synthesis with applications for real-time grasp synthesis suitable for multi-fingered robot hands; (ii) describes a sensorisation method for under-actuated hands, such as the Pisa/IIT SoftHand, which allows the aforementioned grasp synthesis framework to be deployed on this type of robotic hand; (iii) provides an active vision approach for view selection that uses generative grasp synthesis methods to perform perceptual predictions in order to improve grasp performance, taking into account grasp execution safety and contact information; and (iv) finally, going beyond prehensile skills, provides an approach to model and exploit predictive uncertainty from learnt physics applied to push manipulation. Experimental results are presented in simulation and on real robot platforms to validate the proposed methods.
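    As a minimal sketch of uncertainty-averse push selection (an assumed setup, not the thesis's exact model): an ensemble of learnt forward models predicts the object displacement for each candidate push, disagreement across the ensemble serves as predictive uncertainty, and candidates are scored by distance to the goal plus an uncertainty penalty. All names and values are illustrative.

        import numpy as np

        def select_push(candidates, ensemble, goal, beta=1.0):
            # Pick the candidate push whose ensemble-mean outcome is closest to the
            # goal, penalizing candidates the ensemble disagrees on (high variance).
            best, best_cost = None, np.inf
            for a in candidates:
                preds = np.stack([m(a) for m in ensemble])   # (n_models, state_dim)
                mean, var = preds.mean(axis=0), preds.var(axis=0).sum()
                cost = np.linalg.norm(mean - goal) + beta * var
                if cost < best_cost:
                    best, best_cost = a, cost
            return best

        # Toy usage: three "models" that roughly agree, two candidate pushes.
        ensemble = [lambda a, k=k: a * (1.0 + 0.05 * k) for k in range(3)]
        candidates = [np.array([0.1, 0.0]), np.array([0.0, 0.2])]
        print(select_push(candidates, ensemble, goal=np.array([0.1, 0.0])))  # -> [0.1 0. ]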
