557 research outputs found

    Visuo-Haptic Grasping of Unknown Objects through Exploration and Learning on Humanoid Robots

    This thesis addresses the grasping of unknown objects by humanoid robots. To this end, visual information is combined with haptic exploration to generate grasp hypotheses. In addition, a grasp metric is learned from simulated training data; it estimates the success probability of each grasp hypothesis and selects the one with the highest estimated probability of success. The selected grasp is then executed using a reactive control strategy. The two core contributions of this work are, first, the haptic exploration of unknown objects and, second, the grasping of unknown objects by means of a novel data-driven grasp metric.
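
    The selection step described above lends itself to a short illustration. The sketch below scores a set of grasp hypotheses with a learned metric and picks the highest-scoring one; the GraspMetric class, its logistic form, and the 4-D feature vectors are assumptions for illustration, not the thesis's actual models.

```python
import numpy as np

class GraspMetric:
    """Stand-in for a metric trained on simulated grasp outcomes."""

    def __init__(self, weights: np.ndarray):
        self.weights = weights  # learned from simulated training data

    def success_probability(self, features: np.ndarray) -> float:
        # Logistic model as a placeholder for the trained regressor.
        return float(1.0 / (1.0 + np.exp(-features @ self.weights)))

def select_best_grasp(metric: GraspMetric, hypotheses: list) -> int:
    """Return the index of the hypothesis with the highest estimated success."""
    scores = [metric.success_probability(h) for h in hypotheses]
    return int(np.argmax(scores))

# Usage: score three candidate grasps described by 4-D feature vectors.
rng = np.random.default_rng(0)
metric = GraspMetric(weights=rng.normal(size=4))
candidates = [rng.normal(size=4) for _ in range(3)]
print("executing grasp hypothesis", select_best_grasp(metric, candidates))
```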

    More Than a Feeling: Learning to Grasp and Regrasp using Vision and Touch

    For humans, the process of grasping an object relies heavily on rich tactile feedback. Most recent robotic grasping work, however, has been based only on visual input, and thus cannot easily benefit from feedback after initiating contact. In this paper, we investigate how a robot can learn to use tactile information to iteratively and efficiently adjust its grasp. To this end, we propose an end-to-end action-conditional model that learns regrasping policies from raw visuo-tactile data. This model -- a deep, multimodal convolutional network -- predicts the outcome of a candidate grasp adjustment, and then executes a grasp by iteratively selecting the most promising actions. Our approach requires neither calibration of the tactile sensors, nor any analytical modeling of contact forces, thus reducing the engineering effort required to obtain efficient grasping policies. We train our model with data from about 6,450 grasping trials on a two-finger gripper equipped with GelSight high-resolution tactile sensors on each finger. Across extensive experiments, our approach outperforms a variety of baselines at (i) estimating grasp adjustment outcomes, (ii) selecting efficient grasp adjustments for quick grasping, and (iii) reducing the amount of force applied at the fingers, while maintaining competitive performance. Finally, we study the choices made by our model and show that it has successfully acquired useful and interpretable grasping behaviors.
    Comment: 8 pages. Published in IEEE Robotics and Automation Letters (RAL). Website: https://sites.google.com/view/more-than-a-feelin
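
    The iterative selection loop described in the abstract can be sketched compactly. In the sketch below, predict_success is a toy stand-in for the paper's deep multimodal convolutional network, and the candidate-sampling and stopping rules are assumptions for illustration:

```python
import numpy as np

def predict_success(observation: np.ndarray, action: np.ndarray) -> float:
    """Toy stand-in for the learned action-conditional outcome model."""
    return 0.5 * (float(np.tanh(observation @ action)) + 1.0)

def regrasp(observation, sample_action, execute,
            max_steps: int = 5, n_candidates: int = 16,
            confidence: float = 0.9):
    """Iteratively select and apply the most promising grasp adjustment."""
    for _ in range(max_steps):
        candidates = [sample_action() for _ in range(n_candidates)]
        scores = [predict_success(observation, a) for a in candidates]
        best = int(np.argmax(scores))
        observation = execute(candidates[best])  # adjust grasp, re-observe
        if scores[best] >= confidence:           # confident enough to lift
            break
    return observation

# Usage with toy stand-ins for sensing and actuation:
rng = np.random.default_rng(1)
regrasp(observation=rng.normal(size=8),
        sample_action=lambda: rng.normal(size=8),
        execute=lambda a: rng.normal(size=8))
```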

    The implications of embodiment for behavior and cognition: animal and robotic case studies

    In this paper, we will argue that if we want to understand the function of the brain (or the control in the case of robots), we must understand how the brain is embedded into the physical system, and how the organism interacts with the real world. While embodiment has often been used in its trivial meaning, i.e. 'intelligence requires a body', the concept has deeper and more important implications, concerned with the relation between physical and information (neural, control) processes. A number of case studies are presented to illustrate the concept. These involve animals and robots and are concentrated around locomotion, grasping, and visual perception. A theoretical scheme that can be used to embed the diverse case studies will be presented. Finally, we will establish a link between the low-level sensory-motor processes and cognition. We will present an embodied view on categorization, and propose the concepts of 'body schema' and 'forward models' as a natural extension of the embodied approach toward first representations.
    Comment: Book chapter in W. Tschacher & C. Bergomi, ed., 'The Implications of Embodiment: Cognition and Communication', Exeter: Imprint Academic, pp. 31-5
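
    As a concrete, deliberately minimal reading of the 'forward model' idea mentioned above, the sketch below predicts the next sensory state from the current state and a motor command and learns from the prediction error; the linear dynamics and all names are illustrative assumptions, not the chapter's formulation:

```python
import numpy as np

class ForwardModel:
    """Toy linear forward model: next sensory state from state + motor command."""

    def __init__(self, n_state: int, n_motor: int, lr: float = 0.01):
        self.A = np.zeros((n_state, n_state))  # state transition (learned)
        self.B = np.zeros((n_state, n_motor))  # motor influence (learned)
        self.lr = lr

    def predict(self, state: np.ndarray, motor: np.ndarray) -> np.ndarray:
        return self.A @ state + self.B @ motor

    def update(self, state, motor, observed_next) -> float:
        # Gradient step on the squared prediction error.
        error = observed_next - self.predict(state, motor)
        self.A += self.lr * np.outer(error, state)
        self.B += self.lr * np.outer(error, motor)
        return float(error @ error)

# Usage: one prediction/correction cycle.
fm = ForwardModel(n_state=3, n_motor=2)
err = fm.update(np.ones(3), np.array([0.5, -0.5]),
                observed_next=np.array([1.0, 0.9, 1.1]))
```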

    Autonomous active exploration for tactile sensing in robotics

    The sense of touch permits humans to directly touch, feel and perceive the state of their surrounding environment. For an exploration task, humans normally reduce uncertainty by actively moving their hands and fingers towards more interesting locations. This active exploration is a sophisticated procedure that involves sensing and perception processes. In robotics, the sense of touch also plays an important role in the development of intelligent systems capable of safely exploring and interacting with their environment. However, robust and accurate sensing and perception methods, crucial to exploit the benefits offered by the sense of touch, still represent a major research challenge in robotics. This research work develops a novel method for sensing and perception in robotics using the sense of touch. This active Bayesian perception method, biologically inspired by humans, demonstrates its superiority over a passive perception modality, achieving accurate tactile perception with a biomimetic fingertip sensor. The accurate results are accomplished by accumulating evidence through interaction with the environment, and by actively moving the biomimetic fingertip sensor towards better locations to improve perception, as humans do. A contour-following exploration, commonly used by humans to extract object shape, was used to validate the proposed method on simulated and real objects. The exploration procedure demonstrated the ability of the tactile sensor to interact autonomously, performing active movements to improve perception of the contour of the objects being explored, in a natural way as humans do. The thesis also investigates how combining the experience acquired during an exploration task with the active Bayesian perception process affects perception and decision making. This investigation, based on two novel sensorimotor control strategies (SMC1 and SMC2), improved the speed and accuracy of the exploration task. To exploit the benefits of the control strategies in a realistic exploration, a forward model and a confidence factor had to be learned. For that reason, a novel method based on the combination of Predicted Information Gain (PIG) and Dynamic Bayesian Networks (DBN) enabled online, adaptive learning of the forward model and confidence factor, improving the performance of the exploration task for both sensorimotor control strategies. Overall, the novel methods presented in this thesis, validated in simulated and real environments, proved to be robust, accurate and suitable for robots performing autonomous active perception and exploration using the sense of touch.
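
    The core loop of active Bayesian perception can be illustrated in a few lines: accumulate evidence with Bayes' rule over successive contacts, reposition the sensor between contacts, and decide once the belief crosses a confidence threshold. The likelihood table, threshold value, and repositioning rule below are illustrative assumptions rather than the thesis's exact formulation:

```python
import numpy as np

def active_bayesian_perception(likelihood, observe, move,
                               threshold: float = 0.95,
                               max_contacts: int = 20):
    """Accumulate tactile evidence until the belief crosses `threshold`.

    likelihood[c, z] is P(observation z | perceptual class c).
    Returns the decided class and the final belief.
    """
    n_classes = likelihood.shape[0]
    belief = np.full(n_classes, 1.0 / n_classes)  # uniform prior
    for _ in range(max_contacts):
        z = observe()                    # one tactile contact
        belief = belief * likelihood[:, z]
        belief /= belief.sum()           # Bayes update
        if belief.max() >= threshold:    # enough evidence: decide
            break
        move(int(belief.argmax()))       # active step: reposition sensor
    return int(belief.argmax()), belief

# Usage with a 2-class, 2-observation toy sensor model:
rng = np.random.default_rng(2)
L = np.array([[0.7, 0.3],
              [0.2, 0.8]])
decided, belief = active_bayesian_perception(
    L, observe=lambda: int(rng.choice(2, p=L[1])), move=lambda c: None)
```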