    Improved GelSight Tactile Sensor for Measuring Geometry and Slip

    A GelSight sensor uses an elastomeric slab covered with a reflective membrane to measure tactile signals. It measures 3D geometry and contact force information with high spatial resolution, and has successfully supported many challenging robot tasks. A previous sensor, based on a semi-specular membrane, produces high resolution but limited geometric accuracy. In this paper, we describe a new GelSight design for robot grippers, using a Lambertian membrane and a new illumination system, which gives greatly improved geometric accuracy while retaining the compact size. We demonstrate its use in measuring surface normals and reconstructing height maps using photometric stereo. We also use it for slip detection, combining information about relative motion on the membrane surface with the shear distortion. Using a robotic arm and a set of 37 everyday objects with varied properties, we find that the sensor can detect translational and rotational slip in general cases, and can be used to improve the stability of the grasp.
    Comment: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
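
    The photometric-stereo step named in this abstract is a standard technique: under a Lambertian reflectance assumption, per-pixel surface normals fall out of a linear least-squares fit against known illumination directions, and a height map follows by integrating the implied gradients. A minimal NumPy sketch (the naive cumulative integration stands in for the Poisson-style solvers typically used; all names are illustrative):

        import numpy as np

        def photometric_stereo(images, light_dirs):
            """Recover per-pixel surface normals from k grayscale images taken
            under k known light directions (Lambertian surface assumed).
            images: (k, h, w) array; light_dirs: (k, 3) array of unit vectors."""
            k, h, w = images.shape
            I = images.reshape(k, -1)                            # (k, h*w)
            # Lambertian model: I = L @ (albedo * n); least-squares solve for G.
            G, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)   # (3, h*w)
            albedo = np.linalg.norm(G, axis=0) + 1e-8
            normals = (G / albedo).T.reshape(h, w, 3)
            return normals, albedo.reshape(h, w)

        def integrate_heights(normals):
            """Height map from surface gradients p = -nx/nz, q = -ny/nz
            (crude row/column integration, averaged over the two paths)."""
            nz = np.clip(normals[..., 2], 1e-3, None)
            p = -normals[..., 0] / nz
            q = -normals[..., 1] / nz
            return 0.5 * (np.cumsum(p, axis=1) + np.cumsum(q, axis=0))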

    Dexterous manipulation of unknown objects using virtual contact points

    The manipulation of unknown objects is a problem of special interest in robotics, since exact models of the objects a robot interacts with are not always available. This paper presents a simple strategy for manipulating unknown objects using a robotic hand equipped with tactile sensors. The hand configurations that rotate an unknown object are computed using only tactile and kinematic information gathered during manipulation, by reasoning about the desired and actual positions of the fingertips. The desired fingertip positions are not physically reachable, since they lie in the interior of the manipulated object; they are therefore virtual positions with associated virtual contact points. The proposed approach was validated using three fingers of an anthropomorphic robotic hand (Allegro Hand), with the original fingertips replaced by tactile sensors (WTS-FT). In the experimental validation, several everyday objects with different shapes were successfully rotated without knowledge of their shape or any other physical property.
    Peer reviewed. Postprint (author's final draft).
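
    The virtual-contact-point idea can be illustrated with a toy control step (a sketch, not the paper's algorithm): the commanded fingertip target lies inside the object and so can never be reached, while the tactile force reading regulates how hard the finger actually presses toward it. The gains and names below are assumptions:

        import numpy as np

        def fingertip_step(p_real, p_virtual, f_measured, f_desired,
                           k_p=0.5, k_f=0.002):
            """One illustrative control step for a single finger.
            p_real, p_virtual: 3-vectors (actual and virtual fingertip positions).
            f_measured, f_desired: contact forces from the tactile sensor (N)."""
            error = p_virtual - p_real
            direction = error / (np.linalg.norm(error) + 1e-9)
            # Pull toward the virtual point; back off along the same direction
            # when the measured force exceeds the desired grasp force.
            step = k_p * error - k_f * (f_measured - f_desired) * direction
            return p_real + step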

    Tactile feedback display with spatial and temporal resolutions.

    We report the electronic recording of touch contact and pressure using an active-matrix pressure sensor array made of transparent zinc oxide thin-film transistors, and tactile feedback display using an array of diaphragm actuators made of an interpenetrating polymer elastomer network. Digital replay, editing and manipulation of the recorded touch events were demonstrated with both spatial and temporal resolution. Analog reproduction of the force is also shown to be possible using the polymer actuators, despite the high driving voltage. The ability to record, store, edit, and replay touch information adds an additional dimension to digital technologies and extends the capabilities of modern information exchange, with the potential to revolutionize physical learning, social networking, e-commerce, robotics, gaming, and medical and military applications.
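
    Functionally, the record-and-replay capability reduces to storing timestamped pressure frames from the transistor matrix and pushing them back to the actuator array on the original schedule. A minimal sketch, assuming a hypothetical drive_actuators callback and leaving out the high-voltage drive electronics:

        import time
        import numpy as np

        class TouchRecorder:
            """Store touch events as (timestamp, 2-D pressure map) frames so
            that playback preserves both the spatial pattern and its timing."""

            def __init__(self):
                self.frames = []

            def record(self, pressure_map):
                self.frames.append((time.monotonic(), np.asarray(pressure_map)))

            def replay(self, drive_actuators, speed=1.0):
                """drive_actuators(frame) is a hypothetical callback mapping
                each pressure value to an actuator drive level."""
                t0 = self.frames[0][0]
                start = time.monotonic()
                for t, frame in self.frames:
                    # Wait until this frame's (speed-scaled) offset has elapsed.
                    while time.monotonic() - start < (t - t0) / speed:
                        time.sleep(0.001)
                    drive_actuators(frame)

    Editing then amounts to slicing or transforming the frames list before replay.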

    Active Clothing Material Perception using Tactile Sensing and Deep Learning

    Humans represent and discriminate objects in the same category by their properties, and an intelligent robot should be able to do the same. In this paper, we build a robot system that can autonomously perceive object properties through touch, working on the common object category of clothing. The robot moves under the guidance of an external Kinect sensor and squeezes the clothes with a GelSight tactile sensor, then recognizes 11 properties of the clothing from the tactile data. These include physical properties, such as thickness, fuzziness, softness and durability, and semantic properties, such as wearing season and preferred washing method. We collect a dataset of 153 varied pieces of clothing and conduct 6616 robot exploration iterations on them. To extract useful information from the high-dimensional sensory output, we apply convolutional neural networks (CNNs) to the tactile data for recognizing the clothing properties, and to the Kinect depth images for selecting exploration locations. Experiments show that, using the trained neural networks, the robot can autonomously explore unknown clothes and learn their properties. This work proposes a new framework for active tactile perception with a combined vision-touch system, and has the potential to enable robots to help humans with varied clothing-related housework.
    Comment: Accepted at ICRA 2018
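
    The abstract does not specify the network architecture, so the following is only a plausible minimal sketch in PyTorch: a small CNN mapping a GelSight-style tactile image to 11 property outputs, treated as independent labels with a multi-label loss. All layer sizes are assumptions:

        import torch
        import torch.nn as nn

        class TactilePropertyNet(nn.Module):
            """Toy multi-label CNN: tactile image in, one logit per property out."""

            def __init__(self, n_properties=11):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
                    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(4),
                )
                self.head = nn.Linear(32 * 4 * 4, n_properties)

            def forward(self, x):                   # x: (batch, 3, H, W)
                return self.head(self.features(x).flatten(1))

        # Multi-label training step (one sigmoid per property via the loss):
        model = TactilePropertyNet()
        images = torch.randn(8, 3, 128, 128)        # stand-in tactile batch
        labels = torch.randint(0, 2, (8, 11)).float()
        loss = nn.BCEWithLogitsLoss()(model(images), labels)
        loss.backward()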

    Shear-invariant Sliding Contact Perception with a Soft Tactile Sensor

    Manipulation tasks often require robots to remain continuously in contact with an object, so tactile perception systems need to handle continuous contact data. Shear deformation causes the tactile sensor to output path-dependent readings, in contrast to discrete contact readings; in some continuous-contact tasks, sliding can therefore be regarded as a disturbance on the sensor signal. Here we present a shear-invariant perception method based on principal component analysis (PCA) which outputs the required information about the environment despite sliding motion. A compliant tactile sensor (the TacTip) is used to investigate continuous tactile contact. First, we evaluate the method offline using test data collected while the sensor slides over an edge. Then, the method is used within a contour-following task applied to 6 objects with varying curvature; all contours are successfully traced. The method demonstrates generalisation capabilities and could underlie a more sophisticated controller for challenging manipulation or exploration tasks in unstructured environments. A video showing the work described in the paper can be found at https://youtu.be/wrTM61-pieU
    Comment: Accepted at ICRA 2019
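
    The shear-invariance idea can be sketched with an ordinary PCA: fit it on frames collected while sliding, then keep only the principal components that track the contact geometry rather than the path-dependent shear. Which components to keep must be identified empirically; the indices, file name, and dimensions below are illustrative assumptions, not the paper's exact recipe:

        import numpy as np
        from sklearn.decomposition import PCA

        # Hypothetical training data: flattened pin displacements per frame,
        # collected while the sensor slides along a known edge.
        train_frames = np.load("sliding_contact_frames.npy")  # (n_samples, n_features)
        pca = PCA(n_components=5).fit(train_frames)

        def shear_invariant_features(frame, keep=(1, 2)):
            """Project one tactile frame onto the fitted components, keeping
            only those assumed to encode edge pose rather than shear."""
            z = pca.transform(frame.reshape(1, -1))[0]
            return z[list(keep)]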

    Development of an intelligent object for grasp and manipulation research

    Kõiva R, Haschke R, Ritter H. Development of an intelligent object for grasp and manipulation research. Presented at ICAR 2011, Tallinn, Estonia.
    In this paper we introduce a novel device, called iObject, which is equipped with tactile and motion-tracking sensors that allow the evaluation of human and robot grasping and manipulation actions. Contact location and contact force, object acceleration in space (6D) and orientation relative to the earth (3D magnetometer) are measured and transmitted wirelessly over a Bluetooth connection. By allowing human-human, human-robot and robot-robot comparisons to be made, iObject is a versatile tool for studying manual interaction. To demonstrate the efficiency and flexibility of iObject for the study of bimanual interactions, we report on a physiological experiment and evaluate the main parameters of the dual-handed manipulation task considered.
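
    The wireless stream described above bundles contact, inertial and magnetometer readings into each transmission. The real protocol is not documented in this abstract, so the packet layout below is purely a hypothetical illustration of how such a payload might be decoded:

        import struct

        # Assumed layout (little-endian floats): contact location (2), contact
        # force (1), linear + angular acceleration (6), magnetometer (3).
        PACKET_FMT = "<12f"
        PACKET_SIZE = struct.calcsize(PACKET_FMT)

        def parse_packet(payload: bytes) -> dict:
            """Decode one Bluetooth payload into named fields (layout assumed)."""
            v = struct.unpack(PACKET_FMT, payload[:PACKET_SIZE])
            return {
                "contact_xy": v[0:2],
                "contact_force": v[2],
                "accel_linear": v[3:6],
                "accel_angular": v[6:9],
                "magnetometer": v[9:12],
            }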