
    Active contour following to explore object shape with robot touch

    In this work, we present an active tactile perception approach for contour following based on a probabilistic framework. Tactile data were collected using a biomimetic fingertip sensor. We propose a control architecture that implements a perception-action cycle for the exploratory procedure, which allows the fingertip to react to tactile contact whilst regulating the applied contact force. In addition, the fingertip is actively repositioned to an optimal position to ensure accurate perception. The method is trained off-line and then tested on-line by following the contours of several different test shapes. We then implement object recognition based on the extracted shapes. Our active approach is compared with a passive approach, demonstrating that active perception is necessary for successful contour following and hence shape recognition.
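    The perception-action cycle described above can be sketched as a recursive Bayesian update over discrete edge-angle hypotheses, with a belief threshold that triggers the next action. This is a minimal illustration: the Gaussian likelihood, noise level, and threshold are invented stand-ins for the paper's offline-trained tactile model.

    ```python
    import numpy as np

    # Hypothetical sketch: recursive Bayesian perception over discrete
    # edge-orientation hypotheses, stopping once the belief is confident.

    rng = np.random.default_rng(0)
    angles = np.arange(0, 360, 10)      # candidate edge orientations (deg)
    true_angle = 120.0

    def likelihood(obs, hypotheses, sigma=20.0):
        # Gaussian measurement model (assumed, not from the paper)
        return np.exp(-0.5 * ((obs - hypotheses) / sigma) ** 2)

    posterior = np.full(angles.size, 1.0 / angles.size)   # uniform prior
    for tap in range(50):
        obs = true_angle + rng.normal(0.0, 20.0)          # noisy tactile reading
        posterior *= likelihood(obs, angles)
        posterior /= posterior.sum()
        if posterior.max() > 0.99:                        # confident enough to act
            break

    estimate = angles[np.argmax(posterior)]
    ```

    In a contour-following loop, the angle estimate would then drive the fingertip's repositioning before the next tap.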

    Shear-invariant Sliding Contact Perception with a Soft Tactile Sensor

    Manipulation tasks often require robots to be continuously in contact with an object. Therefore, tactile perception systems need to handle continuous contact data. Shear deformation causes the tactile sensor to output path-dependent readings, in contrast to discrete contact readings. As such, in some continuous-contact tasks, sliding can be regarded as a disturbance over the sensor signal. Here we present a shear-invariant perception method based on principal component analysis (PCA) which outputs the required information about the environment despite sliding motion. A compliant tactile sensor (the TacTip) is used to investigate continuous tactile contact. First, we evaluate the method offline using test data collected whilst the sensor slides over an edge. Then, the method is used within a contour-following task applied to 6 objects with varying curvatures; all contours are successfully traced. The method demonstrates generalisation capabilities and could underlie a more sophisticated controller for challenging manipulation or exploration tasks in unstructured environments. A video showing the work described in the paper can be found at https://youtu.be/wrTM61-pieU
    Comment: Accepted in ICRA 201
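    One way to read the PCA idea is that sliding injects a large, path-dependent shear component into the sensor features; PCA isolates that high-variance direction, and projecting onto the remaining component yields a reading largely invariant to sliding. The sketch below uses synthetic 2-D features, not the TacTip pipeline.

    ```python
    import numpy as np

    # Illustrative sketch (not the paper's pipeline): synthetic features where
    # one axis is dominated by path-dependent shear. PCA finds that
    # high-variance shear direction; the orthogonal component is the
    # shear-invariant signal of interest.

    rng = np.random.default_rng(1)
    n = 500
    shear = rng.uniform(-5, 5, n)             # large, path-dependent component
    edge_pose = 0.3 * rng.standard_normal(n)  # small, task-relevant component
    X = np.column_stack([shear, edge_pose])

    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # rows of Vt = principal axes
    invariant = Xc @ Vt[1]          # project onto the low-variance axis
    ```

    The projection `invariant` tracks the task-relevant component while being nearly uncorrelated with the shear disturbance.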

    Realtime State Estimation with Tactile and Visual Sensing: Application to Planar Manipulation

    Accurate and robust object state estimation enables successful object manipulation. Visual sensing is widely used to estimate object poses. However, in a cluttered scene or in a tight workspace, the robot's end-effector often occludes the object from the visual sensor. The robot then loses visual feedback and must fall back on open-loop execution. In this paper, we integrate both tactile and visual input using a framework for solving the SLAM problem, incremental smoothing and mapping (iSAM), to provide a fast and flexible solution. Visual sensing provides global pose information but is noisy in general, whereas contact sensing is local, but its measurements are more accurate relative to the end-effector. By combining them, we aim to exploit their advantages and overcome their limitations. We explore the technique in the context of a pusher-slider system. We adapt iSAM's measurement cost and motion cost to the pushing scenario, and use an instrumented setup to evaluate the estimation quality with different object shapes, on different surface materials, and under different contact modes.
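    The vision/touch trade-off the abstract describes can be illustrated with a single precision-weighted fusion step: a far simpler stand-in for the iSAM factor graph used in the paper, with all numbers invented for illustration.

    ```python
    # Toy 1-D fusion of a global-but-noisy visual pose estimate with a
    # local-but-precise tactile one, weighted by inverse variance
    # (information form). Not iSAM itself -- just the underlying intuition.

    true_pose = 1.00                      # 1-D object pose (m)
    visual = true_pose + 0.08             # global but noisy  (sigma_v = 0.05)
    tactile = true_pose - 0.005           # local but precise (sigma_t = 0.005)

    w_v = 1 / 0.05**2                     # information weight of vision
    w_t = 1 / 0.005**2                    # information weight of touch
    fused = (w_v * visual + w_t * tactile) / (w_v + w_t)
    ```

    The fused estimate sits close to the precise tactile reading while the visual term keeps the estimate anchored globally, which is the advantage the combined system exploits.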

    Learning from sensory predictions for autonomous and adaptive exploration of object shape with a tactile robot

    Humans use information from sensory predictions, together with current observations, for the optimal exploration and recognition of their surrounding environment. In this work, two novel adaptive perception strategies are proposed for accurate and fast exploration of object shape with a robotic tactile sensor. These strategies, called (1) adaptive weighted prior and (2) adaptive weighted posterior, combine tactile sensory predictions and current sensor observations to autonomously adapt the accuracy and speed of active Bayesian perception in object exploration tasks. Sensory predictions, obtained from a forward model, use a novel Predicted Information Gain method. These predictions are used by the tactile robot to analyse ‘what would have happened’ if certain decisions ‘would have been made’ at previous decision times. The accuracy of predictions is evaluated and controlled by a confidence parameter, to ensure that the adaptive perception strategies rely more on predictions when they are accurate, and more on current sensory observations otherwise. This work is systematically validated with the recognition of angle and position data extracted from the exploration of object shape, using a biomimetic tactile sensor and a robotic platform. The exploration task implements the contour following procedure used by humans to extract object shape with the sense of touch. The validation process compares the adaptive weighted strategies against active perception alone. The adaptive approach achieved higher angle accuracy (2.8 deg) than active perception (5 deg). The position accuracy was similar for all perception methods (0.18 mm). The reaction time, or number of tactile contacts needed by the tactile robot to make a decision, was improved by the adaptive perception (1 tap) over active perception (5 taps).
The results show that the adaptive perception strategies can enable future robots to adapt their performance, while improving the trade-off between accuracy and reaction time, for tactile exploration, interaction and recognition tasks.
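    The confidence-controlled blending of predictions and observations can be sketched as a convex mixture of two beliefs. The distributions, the confidence value, and the mixing rule below are invented for illustration and are not the paper's exact formulation.

    ```python
    import numpy as np

    # Hedged sketch of the "adaptive weighted prior" idea: blend a
    # forward-model prediction with the belief from the current observation,
    # with mixing controlled by a confidence parameter in [0, 1].

    predicted = np.array([0.7, 0.2, 0.1])    # forward-model prediction
    observed  = np.array([0.5, 0.3, 0.2])    # belief from current tap alone
    confidence = 0.8                          # high when past predictions proved accurate

    blended = confidence * predicted + (1 - confidence) * observed
    blended /= blended.sum()                  # keep a valid probability distribution
    ```

    When predictions have been accurate, `confidence` is high and the blended belief leans on the prediction, letting the robot decide after fewer taps; when they have not, the observation term dominates.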

    Tactile Mapping and Localization from High-Resolution Tactile Imprints

    This work studies the problem of shape reconstruction and object localization using a vision-based tactile sensor, GelSlim. The main contributions are the recovery of local shapes from contact, an approach to reconstruct the tactile shape of objects from tactile imprints, and an accurate method for object localization of previously reconstructed objects. The algorithms can be applied to a large variety of 3D objects and provide accurate tactile feedback for in-hand manipulation. Results show that by exploiting the dense tactile information we can reconstruct the shape of objects with high accuracy and do on-line object identification and localization, opening the door to reactive manipulation guided by tactile sensing. We provide videos and supplemental information in the project's website http://web.mit.edu/mcube/research/tactile_localization.html
    Comment: ICRA 2019, 7 pages, 7 figures. Website: http://web.mit.edu/mcube/research/tactile_localization.html Video: https://youtu.be/uMkspjmDbq

    Angle and position perception for exploration with active touch

    Over the past few decades, the design of robots has gradually improved, allowing them to perform complex tasks in interaction with the world. To behave appropriately, robots need to make perceptual decisions about their environment using their various sensory modalities. Even though robots are being equipped with progressively more accurate and advanced sensors, dealing with uncertainties from the world and from their sensory processes remains an unavoidable necessity for autonomous robotics. The challenge is to develop robust methods that allow robots to perceive their environment while managing uncertainty and optimising their decision making. These methods can be inspired by the way humans and animals actively direct their senses towards locations that reduce perceptual uncertainty [1]. For instance, humans not only use their hands and fingers for exploration and feature extraction, but also guide their movements according to what is being perceived [2]. This behaviour is also present in the animal kingdom, such as in rats, which actively explore the environment by appropriately moving their whiskers [3]. © 2013 Springer-Verlag Berlin Heidelberg

    Exploiting Sensor Symmetry for Generalized Tactile Perception in Biomimetic Touch


    Sensing and describing 3-D structure

    Discovering the three-dimensional structure of an object is important for a variety of robot tasks. Single-sensor systems, such as machine vision systems, cannot reliably compute three-dimensional structure in unconstrained environments. Active, exploratory tactile sensing can be used to complement passive stereo vision data to derive robust surface and feature descriptions of objects. The control for tactile sensing is provided by the vision system, which supplies regions of interest that the tactile system can explore. The descriptions of surfaces and features are accurate and can be used in a later matching phase against a model database of objects to identify the object and its position and orientation in space.