
    Active Clothing Material Perception using Tactile Sensing and Deep Learning

    Humans represent and discriminate objects within the same category by their properties, and an intelligent robot should be able to do the same. In this paper, we build a robot system that can autonomously perceive object properties through touch. We work on the common object category of clothing. The robot moves under the guidance of an external Kinect sensor and squeezes the clothes with a GelSight tactile sensor, then recognizes 11 properties of the clothing from the tactile data. These include physical properties, such as thickness, fuzziness, softness, and durability, and semantic properties, such as wearing season and preferred washing method. We collect a dataset of 153 varied pieces of clothing and conduct 6,616 robotic exploration iterations on them. To extract useful information from the high-dimensional sensory output, we apply convolutional neural networks (CNNs) to the tactile data for recognizing the clothing properties, and to the Kinect depth images for selecting exploration locations. Experiments show that, using the trained neural networks, the robot can autonomously explore unknown clothes and learn their properties. This work proposes a new framework for an active tactile perception system combining vision and touch, and has the potential to enable robots to help humans with a variety of clothing-related housework. Comment: accepted at ICRA 2018
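    The CNN-based property recognition described above rests on a few basic building blocks. The following is a minimal sketch, not the paper's actual network: a hand-written convolution, ReLU, and max-pooling pass over a toy "tactile image", with an assumed edge kernel standing in for learned filters.

    ```python
    import numpy as np

    def conv2d(img, kernel):
        """Valid-mode 2D cross-correlation of a single-channel image."""
        kh, kw = kernel.shape
        h, w = img.shape
        out = np.empty((h - kh + 1, w - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
        return out

    def max_pool(fmap, size=2):
        """Non-overlapping max pooling; trims edges that do not fit."""
        h, w = (fmap.shape[0] // size) * size, (fmap.shape[1] // size) * size
        f = fmap[:h, :w]
        return f.reshape(h // size, size, w // size, size).max(axis=(1, 3))

    # An assumed vertical-edge kernel, responding to ridges/folds in the image.
    edge = np.array([[-1., 0., 1.],
                     [-2., 0., 2.],
                     [-1., 0., 1.]])

    tactile = np.zeros((8, 8))
    tactile[:, 4:] = 1.0  # a step edge, like a fabric fold pressed into the gel
    fmap = max_pool(np.maximum(conv2d(tactile, edge), 0.0))  # conv + ReLU + pool
    ```

    A trained CNN stacks many such layers with learned kernels; the point here is only that each layer reduces a high-dimensional tactile image to a small grid of localized feature responses.
    
    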

    Improved GelSight Tactile Sensor for Measuring Geometry and Slip

    A GelSight sensor uses an elastomeric slab covered with a reflective membrane to measure tactile signals. It measures 3D geometry and contact force information with high spatial resolution, and has successfully supported many challenging robotic tasks. A previous sensor, based on a semi-specular membrane, produces high resolution but limited geometric accuracy. In this paper, we describe a new design of GelSight for robot grippers that uses a Lambertian membrane and a new illumination system, giving greatly improved geometric accuracy while retaining a compact size. We demonstrate its use in measuring surface normals and reconstructing height maps using photometric stereo. We also use it for slip detection, combining information about relative motion on the membrane surface with the shear distortions. Using a robotic arm and a set of 37 everyday objects with varied properties, we find that the sensor can detect translational and rotational slip in general cases and can be used to improve the stability of the grasp. Comment: IEEE/RSJ International Conference on Intelligent Robots and Systems
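    The photometric-stereo step mentioned above has a compact least-squares formulation. The sketch below is a generic single-pixel version under an assumed Lambertian model, not the sensor's calibrated pipeline: with k >= 3 known unit light directions, the albedo-scaled surface normal solves a small linear system.

    ```python
    import numpy as np

    def recover_normal(L, I):
        """Least-squares photometric stereo at one pixel.

        L: (k, 3) unit light directions; I: (k,) observed intensities.
        Under a Lambertian model I = rho * (L @ n), solving L g = I gives
        g = rho * n, so the norm of g is the albedo and its direction is n.
        """
        g, *_ = np.linalg.lstsq(L, I, rcond=None)
        rho = np.linalg.norm(g)
        return g / rho, rho

    # Synthetic check: a flat membrane patch facing the camera, albedo 0.8.
    s = 1.0 / np.sqrt(2.0)
    L = np.array([[0.0, 0.0, 1.0],
                  [s,   0.0, s],
                  [0.0, s,   s]])
    n_true, rho_true = np.array([0.0, 0.0, 1.0]), 0.8
    I = rho_true * (L @ n_true)          # ideal observations, no noise
    n_hat, rho_hat = recover_normal(L, I)
    ```

    Repeating this per pixel yields a normal map, which can then be integrated into the height map the sensor reports.
    
    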

    The role of fingerprints in the coding of tactile information probed with a biomimetic sensor

    In humans, the tactile perception of fine textures (spatial scale < 200 micrometers) is mediated by skin vibrations generated as the finger scans the surface. To establish the relationship between texture characteristics and subcutaneous vibrations, a biomimetic tactile sensor was designed whose dimensions match those of the fingertip. When the sensor surface is patterned with parallel ridges mimicking fingerprints, the spectrum of vibrations elicited by randomly textured substrates is dominated by one frequency, set by the ratio of the scanning speed to the inter-ridge distance. For human touch, this frequency falls within the optimal range of sensitivity of Pacinian afferents, which mediate the coding of fine textures. Thus, fingerprints may perform spectral selection and amplification of tactile information that facilitate its processing by specific mechanoreceptors. Comment: 25 pages, 11 figures, article + supporting material
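    The speed-to-spacing relation stated above is simple enough to compute directly. The numbers below are illustrative assumptions, not values from the paper: a scan speed and ridge spacing chosen so the elicited frequency lands near the reported sensitivity range of Pacinian afferents.

    ```python
    def elicited_frequency(scan_speed_mm_s, interridge_mm):
        """Dominant vibration frequency (Hz) = scanning speed / inter-ridge distance."""
        return scan_speed_mm_s / interridge_mm

    # Illustrative values: a ~125 mm/s finger scan over ~0.5 mm ridge spacing
    # yields 250 Hz, within the high-sensitivity band of Pacinian afferents.
    f = elicited_frequency(125.0, 0.5)
    ```
    
    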

    Deep Thermal Imaging: Proximate Material Type Recognition in the Wild through Deep Learning of Spatial Surface Temperature Patterns

    We introduce Deep Thermal Imaging, a new approach for close-range automatic recognition of materials that enhances how people and ubiquitous technologies understand their proximal environment. Our approach uses a low-cost mobile thermal camera integrated into a smartphone to capture thermal textures. A deep neural network classifies these textures into material types. This approach works effectively without the need for ambient light sources or direct contact with materials. Furthermore, the use of a deep learning network removes the need to handcraft the set of features for different materials. We evaluated the performance of the system by training it to recognise 32 material types in both indoor and outdoor environments. Our approach produced recognition accuracies above 98% on 14,860 images of 15 indoor materials and above 89% on 26,584 images of 17 outdoor materials. We conclude by discussing its potential for real-time use in HCI applications and future directions. Comment: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems
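    As a contrast to the deep network the paper uses, even a very simple classifier captures the underlying intuition that materials differ in their spatial temperature statistics. The sketch below uses entirely invented data, class names, and temperature ranges: it builds normalized temperature histograms and assigns a probe patch to the nearest class centroid.

    ```python
    import numpy as np

    def thermal_histogram(patch, bins=8, lo=20.0, hi=40.0):
        """Normalized histogram of surface temperatures (deg C) in a patch."""
        h, _ = np.histogram(patch, bins=bins, range=(lo, hi))
        return h / h.sum()

    def nearest_centroid(feature, centroids):
        """Label of the class centroid closest to the feature (Euclidean)."""
        dists = {name: np.linalg.norm(feature - c) for name, c in centroids.items()}
        return min(dists, key=dists.get)

    rng = np.random.default_rng(0)
    # Hypothetical classes: metal conducts heat away (cooler, uniform),
    # wood insulates (warmer, more spread). Numbers are illustrative only.
    metal = rng.normal(22.0, 0.5, (16, 16))
    wood = rng.normal(30.0, 2.0, (16, 16))
    centroids = {"metal": thermal_histogram(metal), "wood": thermal_histogram(wood)}

    probe = rng.normal(22.0, 0.5, (16, 16))   # an unseen metal-like patch
    label = nearest_centroid(thermal_histogram(probe), centroids)
    ```

    The deep network replaces the hand-chosen histogram features with learned spatial filters, which is what lets it scale to 32 material types.
    
    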

    Active Tactile Sensing for Texture Perception in Robotic Systems

    This thesis presents a comprehensive study of tactile sensing, particularly on the problem of active texture perception. It includes a brief introduction to tactile sensing technology and the neural basis of tactile perception, followed by a literature review of texture perception with tactile sensing. I propose a decoding and perception pipeline to tackle fine-texture classification/identification problems via active touching. Experiments are conducted using a 7-DOF robotic arm with a finger-shaped tactile sensor mounted on the end-effector to perform sliding/rubbing movements on multiple fabrics. Low-dimensional frequency features are extracted from the raw signals to form a perceptive feature space, where tactile signals are mapped and segregated into fabric classes. Fabric classes can be parameterized and simplified in the feature space using elliptical equations. Results from experiments with varied control parameters are compared and visualized to show that different exploratory movements have an apparent impact on the perceived tactile information. This implies the possibility of optimising the robotic movements to improve textural classification/identification performance.
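    The low-dimensional frequency features mentioned above are typically obtained with an FFT. The sketch below is a generic version, not the thesis pipeline: it recovers the dominant vibration frequency of a synthetic rubbing signal, with the sampling rate and texture frequency chosen as illustrative assumptions.

    ```python
    import numpy as np

    def dominant_frequency(signal, fs):
        """Peak frequency (Hz) of a 1-D tactile signal via the real FFT."""
        spectrum = np.abs(np.fft.rfft(signal - signal.mean()))  # drop DC offset
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        return freqs[np.argmax(spectrum)]

    # Assumed setup: a fabric with a 2 mm weave period rubbed at 40 mm/s
    # should vibrate at 20 Hz; sample the sensor at 1 kHz for one second.
    fs = 1000.0
    t = np.arange(0, 1.0, 1.0 / fs)
    rng = np.random.default_rng(1)
    signal = np.sin(2 * np.pi * 20.0 * t) + 0.1 * rng.normal(size=t.size)
    f_peak = dominant_frequency(signal, fs)
    ```

    A feature vector built from a few such spectral quantities (peak frequency, band energies) is what gets mapped into the elliptical class regions described above.
    
    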

    Hierarchical tactile sensation integration from prosthetic fingertips enables multi-texture surface recognition

    Multifunctional flexible tactile sensors could be useful for improving the control of prosthetic hands. To that end, highly stretchable liquid metal tactile sensors (LMS) were designed, manufactured via photolithography, and incorporated into the fingertips of a prosthetic hand. Three novel contributions were made with the LMS. First, individual fingertips were used to distinguish between different speeds of sliding contact with different surfaces. Second, differences in surface textures were reliably detected during sliding contact. Third, the capacity for hierarchical tactile sensor integration was demonstrated by using four LMS signals simultaneously to distinguish between ten complex multi-textured surfaces. Four machine learning algorithms were compared for their classification capabilities: k-nearest neighbors (KNN), support vector machine (SVM), random forest (RF), and neural network (NN). Time-frequency features of the LMS signals were extracted to train and test the algorithms. The NN generally performed best at speed and texture detection with a single finger, and achieved 99.2 ± 0.8% accuracy in distinguishing between ten different multi-textured surfaces using four LMSs from four fingers simultaneously. The capability for hierarchical multi-finger tactile sensation integration could be useful for providing a higher level of intelligence to artificial hands.
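    Of the four algorithms compared above, KNN is simple enough to sketch from scratch. The feature values and class names below are invented for illustration: two-band spectral energies standing in for the paper's time-frequency features.

    ```python
    from collections import Counter
    import numpy as np

    def knn_predict(X_train, y_train, x, k=3):
        """Majority vote among the k nearest training samples (Euclidean)."""
        dists = np.linalg.norm(X_train - x, axis=1)
        nearest = [y_train[i] for i in np.argsort(dists)[:k]]
        return Counter(nearest).most_common(1)[0][0]

    # Hypothetical (low-band, high-band) energy features for two textures:
    # smooth surfaces concentrate vibration energy at low frequencies,
    # rough ones push it higher.
    X = np.array([[0.90, 0.10], [0.80, 0.20], [0.85, 0.15],   # smooth
                  [0.20, 0.80], [0.10, 0.90], [0.25, 0.75]])  # rough
    y = ["smooth"] * 3 + ["rough"] * 3

    label = knn_predict(X, y, np.array([0.15, 0.85]))
    ```

    Concatenating such features from all four fingertips is the hierarchical-integration step: the classifier sees one longer vector per grasp instead of four independent ones.
    
    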