
    Multi-contact haptic exploration and grasping with tactile sensors

    Haptic exploration has received a great deal of attention of late thanks to the variety of commercially available tactile sensors. While the majority of previous works consider control of a single contact point at a time, we tackle simultaneous control of multiple contact points on several links. In addition, we use information from the existing tactile signals to increase the number of points in contact. We demonstrate the usefulness of this form of control for speeding up exploration and scanning and for compliantly grasping unknown objects. Our controller requires knowledge only of the parts of the robot on which contact is desirable, and needs no model of the environment beyond the robot itself. We validate the algorithm in a set of experiments using a robotic arm and a hand covered with tactile sensors. In a grasping application, the active adaptation of the fingers to the shape of the object ensures that the hand encloses the object with multiple contact points. We show that this improves the robustness of the grasp compared to simple enclosing strategies. When combined with an exploration strategy, our multi-contact approach makes efficient use of the tactile sensors on the whole surface of the robotic fingers, and enables the robot to rapidly explore complex, non-convex shapes while maintaining low contact forces. It is robust to variation in the approach angle and to changes in the geometry and orientation of the object.
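    As a rough illustration of the control idea described above (not the authors' actual controller), the sketch below regulates the contact force on each link independently: links not yet in contact keep approaching the surface, links pressing too hard back off, so the hand settles into many low-force contacts. All names, gains, and the force/normal interface are assumptions; a real system would map these commands through the robot's kinematics.

```python
# Minimal sketch of per-link contact-force regulation for
# multi-contact exploration. Illustrative assumptions throughout.
import numpy as np

DESIRED_FORCE = 0.5   # target contact force per link (N), kept low
GAIN = 0.02           # proportional gain (m/s per N)

def link_commands(tactile_forces, contact_normals):
    """Return a Cartesian velocity command for each link.

    tactile_forces: (n_links,) measured normal force per link (N)
    contact_normals: (n_links, 3) surface normal pointing toward each
        link; for links not yet in contact, an assumed approach
        direction is used so every link is driven toward the surface.
    """
    commands = np.zeros((len(tactile_forces), 3))
    for i, (f, n) in enumerate(zip(tactile_forces, contact_normals)):
        # Positive error -> move along -normal (press toward surface);
        # negative error -> back off. Links with zero measured force
        # keep approaching, which increases the number of contacts.
        error = DESIRED_FORCE - f
        commands[i] = -GAIN * error * np.asarray(n)
    return commands

if __name__ == "__main__":
    forces = np.array([0.0, 0.4, 1.2])   # free, light, and hard contact
    normals = np.array([[0.0, 0.0, 1.0]] * 3)
    print(link_commands(forces, normals))
```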

    Learning to Represent Haptic Feedback for Partially-Observable Tasks

    The sense of touch, being the earliest sensory system to develop in a human body [1], plays a critical part in our daily interaction with the environment. Many manipulation interactions require incorporating haptic feedback in order to successfully complete a task, yet manually designing a feedback mechanism can be extremely challenging. In this work, we consider manipulation tasks that need to incorporate tactile sensor feedback in order to modify a provided nominal plan. To handle partial observability, we present a new framework that models the task as a partially observable Markov decision process (POMDP) and learns an appropriate representation of haptic feedback which can serve as the state for the POMDP model. The model, parametrized by deep recurrent neural networks, utilizes variational Bayes methods to optimize the approximate posterior. Finally, we build on deep Q-learning to select the optimal action in each state without access to a simulator. We test our model on a PR2 robot for multiple tasks of turning a knob until it clicks.
    Comment: IEEE International Conference on Robotics and Automation (ICRA), 2017
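    In heavily simplified form, the architecture described above can be pictured as a recurrent encoder that compresses the haptic history into a latent belief state, plus a Q-head that scores discrete actions from that state. The sketch below shows only this forward pass; all dimensions are assumptions, and the paper's variational (reconstruction + KL) training objective is omitted, so this is not the paper's exact model.

```python
# Minimal sketch: recurrent belief encoder + Q-head for a POMDP.
import torch
import torch.nn as nn

class HapticQNet(nn.Module):
    def __init__(self, obs_dim=32, latent_dim=64, n_actions=5):
        super().__init__()
        self.encoder = nn.GRU(obs_dim, latent_dim, batch_first=True)
        self.q_head = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, n_actions),
        )

    def forward(self, haptic_seq):
        # haptic_seq: (batch, time, obs_dim) raw tactile/force readings
        _, h = self.encoder(haptic_seq)   # h: (1, batch, latent_dim)
        belief = h.squeeze(0)             # learned stand-in for the POMDP state
        return self.q_head(belief)        # one Q-value per action

if __name__ == "__main__":
    net = HapticQNet()
    q = net(torch.randn(2, 50, 32))       # two 50-step haptic traces
    action = q.argmax(dim=1)               # greedy action selection
    print(q.shape, action)
```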

    Haptic Exploration of Unknown Objects for Robust In-Hand Manipulation

    Human-like robot hands provide the flexibility to manipulate the variety of objects found in unstructured environments. Knowledge of object properties and of the motion trajectory is required, but often not available, in real-world manipulation tasks. Although it is possible to grasp and manipulate unknown objects, an uninformed grasp leads to inferior stability, accuracy, and repeatability of the manipulation. A central challenge of in-hand manipulation in unstructured environments is therefore to acquire this information safely and efficiently. We propose an in-hand manipulation framework that assumes no prior information about the object or the motion; instead, it extracts the object properties through a novel haptic exploration procedure and learns the motion from demonstration using dynamical movement primitives. We evaluate our approach in unknown-object manipulation experiments using a human-like robot hand. The results show that haptic exploration significantly improves the robustness and accuracy of the manipulation compared to the virtual spring framework, a baseline method widely used for grasping unknown objects.
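    For readers unfamiliar with dynamical movement primitives, the sketch below integrates a one-dimensional DMP: a spring-damper system pulled toward the goal, shaped by a phase-dependent forcing term that, in learning from demonstration, would be fit to the demonstrated trajectory. The gains and the zero forcing term here are illustrative assumptions, not the paper's parameters.

```python
# Minimal sketch of a one-dimensional dynamical movement primitive (DMP).
import numpy as np

def rollout_dmp(x0, goal, forcing, tau=1.0, dt=0.01, K=100.0, D=20.0, alpha=4.0):
    """Integrate a DMP from x0 toward goal, shaped by forcing(s)."""
    x, v, s = x0, 0.0, 1.0
    path = [x]
    for _ in range(int(tau / dt)):
        f = forcing(s) * (goal - x0)            # phase-dependent shaping
        a = (K * (goal - x) - D * v + f) / tau  # spring-damper + forcing
        v += a * dt
        x += v * dt / tau
        s += -alpha * s * dt / tau              # canonical phase decay
        path.append(x)
    return np.array(path)

if __name__ == "__main__":
    # Zero forcing gives a plain point attractor toward the goal;
    # a learned forcing term would reproduce the demonstrated shape.
    traj = rollout_dmp(x0=0.0, goal=1.0, forcing=lambda s: 0.0)
    print(traj[0], traj[-1])   # starts at 0, converges near 1
```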

    Visuo-Haptic Grasping of Unknown Objects through Exploration and Learning on Humanoid Robots

    This thesis addresses the grasping of unknown objects by humanoid robots. Visual information is combined with haptic exploration to generate grasp hypotheses. In addition, a grasp metric is learned from simulated training data; it estimates the probability of success of each grasp hypothesis and selects the hypothesis with the highest estimated probability. The selected hypothesis is then used to grasp the object with a reactive control strategy. The two core contributions of this work are the haptic exploration of unknown objects and the grasping of unknown objects by means of a novel data-driven grasp metric.
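    The selection step described above amounts to scoring each grasp hypothesis with the learned metric and taking the argmax. The sketch below shows only that structure; the learned_metric stand-in is a hypothetical heuristic replacing the model that the thesis trains on simulated grasp outcomes.

```python
# Minimal sketch: rank grasp hypotheses by a learned success metric.
from dataclasses import dataclass
import math

@dataclass
class GraspHypothesis:
    position: tuple   # approach point on the object (m), illustrative
    approach: tuple   # approach direction (unit vector), illustrative

def learned_metric(h: GraspHypothesis) -> float:
    """Placeholder for the learned model: maps a hypothesis to an
    estimated probability of grasp success in [0, 1]."""
    # Hypothetical heuristic only: prefer top-down approaches.
    return 1.0 / (1.0 + math.exp(-5.0 * (-h.approach[2])))

def select_grasp(hypotheses):
    # Execute the hypothesis with the highest estimated success.
    return max(hypotheses, key=learned_metric)

if __name__ == "__main__":
    candidates = [
        GraspHypothesis((0.1, 0.0, 0.2), (0.0, 0.0, -1.0)),  # top-down
        GraspHypothesis((0.1, 0.1, 0.1), (1.0, 0.0, 0.0)),   # sideways
    ]
    print(select_grasp(candidates))
```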

    Active Clothing Material Perception using Tactile Sensing and Deep Learning

    Humans represent and discriminate objects in the same category by their properties, and an intelligent robot should be able to do the same. In this paper, we build a robot system that can autonomously perceive object properties through touch. We work on the common object category of clothing. The robot moves under the guidance of an external Kinect sensor and squeezes the clothes with a GelSight tactile sensor, then recognizes 11 properties of the clothing from the tactile data. These properties include physical properties, like thickness, fuzziness, softness, and durability, and semantic properties, like wearing season and preferred washing method. We collect a dataset of 153 varied items of clothing and conduct 6,616 robot exploration iterations on them. To extract useful information from the high-dimensional sensory output, we apply convolutional neural networks (CNNs) to the tactile data for recognizing the clothing properties, and to the Kinect depth images for selecting exploration locations. Experiments show that, using the trained neural networks, the robot can autonomously explore unknown clothes and learn their properties. This work proposes a new framework for active tactile perception that combines vision and touch, and it has the potential to enable robots to help humans with varied clothing-related housework.
    Comment: ICRA 2018, accepted
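    One plausible shape for the property-recognition network described above is a shared convolutional trunk over a GelSight tactile frame with one classification head per clothing property. The layer sizes, input dimensions, and the four property heads below are assumptions for brevity, not the paper's architecture.

```python
# Minimal sketch: shared CNN trunk + one head per clothing property.
import torch
import torch.nn as nn

PROPERTY_CLASSES = {"thickness": 5, "fuzziness": 4, "softness": 3,
                    "season": 4}   # 4 of the 11 properties, assumed class counts

class TactilePropertyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.heads = nn.ModuleDict(
            {name: nn.Linear(32, n) for name, n in PROPERTY_CLASSES.items()}
        )

    def forward(self, tactile_image):
        feat = self.trunk(tactile_image)   # shared tactile features
        return {name: head(feat) for name, head in self.heads.items()}

if __name__ == "__main__":
    net = TactilePropertyNet()
    logits = net(torch.randn(1, 3, 128, 128))   # one tactile frame
    print({k: v.shape for k, v in logits.items()})
```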