    Microgeometry capture using an elastomeric sensor

    We describe a system for capturing microscopic surface geometry. The system extends the retrographic sensor [Johnson and Adelson 2009] to the microscopic domain, demonstrating spatial resolution as small as 2 microns. In contrast to existing microgeometry capture techniques, the system is not affected by the optical characteristics of the surface being measured---it captures the same geometry whether the object is matte, glossy, or transparent. In addition, the hardware design allows for a variety of form factors, including a hand-held device that can be used to capture high-resolution surface geometry in the field. We achieve these results with a combination of improved sensor materials, illumination design, and reconstruction algorithm, as compared to the original sensor of Johnson and Adelson [2009]. National Science Foundation (U.S.) (Grant 0739255); National Institutes of Health (U.S.) (Contract 1-R01-EY019292-01)
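    The reconstruction at the heart of retrographic sensing recovers a surface normal at every pixel from the shading produced by several known light directions, then integrates the normals into a height field. Below is a minimal sketch of that idea, assuming three or more calibrated directional lights and an approximately Lambertian response of the coated membrane; it is an illustration only, not the authors' improved algorithm.

```python
import numpy as np

def photometric_stereo(images, light_dirs):
    """Recover per-pixel surface normals from images taken under known lights.

    images:     array of shape (k, H, W), one grayscale image per light
    light_dirs: array of shape (k, 3), unit light directions (assumed calibrated)
    Returns unit normals (H, W, 3) and height-field gradients (p, q).
    """
    k, H, W = images.shape
    I = images.reshape(k, -1)                  # (k, H*W) stacked intensities
    L = np.asarray(light_dirs, dtype=float)    # (k, 3)
    # Least-squares solve L @ g = I for g = albedo * normal at every pixel.
    g, *_ = np.linalg.lstsq(L, I, rcond=None)  # (3, H*W)
    albedo = np.linalg.norm(g, axis=0) + 1e-9
    n = (g / albedo).T.reshape(H, W, 3)        # unit normals
    # Height-field gradients p = -nx/nz, q = -ny/nz; these can then be
    # integrated into a height map, e.g. with a Poisson or FFT-based solver.
    p = -n[..., 0] / np.clip(n[..., 2], 1e-6, None)
    q = -n[..., 1] / np.clip(n[..., 2], 1e-6, None)
    return n, p, q
```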

    High Spatial Resolution BRDFs with Metallic Powders Using Wave Optics Analysis

    This manuscript completes the analysis of our SIGGRAPH 2013 paper "Fabricating BRDFs at High Spatial Resolution Using Wave Optics," in which photolithography was used to manipulate reflectance effects. While photolithography allows for precise reflectance control, it is costly to fabricate. Here we explore an inexpensive alternative to micro-fabrication, in the form of metallic powders. Such powders are readily available in a variety of particle sizes and morphologies. Using an analysis similar to the micro-fabrication paper, we provide guidelines for the relation between the particles' shape and size and the reflectance functions they can produce.
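    In the scalar (Fourier-optics) approximation used by this line of work, the far-field reflectance of a micro-structured surface is governed by the Fourier transform of the phase delay imposed by its height profile. The snippet below is a rough numerical sketch of that relation for a synthetic field of particle-like bumps; the wavelength, sample spacing, particle sizes, and hemispherical shapes are illustrative assumptions, not values from the paper.

```python
import numpy as np

wavelength = 0.5e-6            # assumed illumination wavelength (500 nm)
k = 2 * np.pi / wavelength
pixel = 0.25e-6                # assumed sample spacing on the surface (m)

# Synthetic height field: randomly placed hemispherical "particles".
rng = np.random.default_rng(0)
N = 512
h = np.zeros((N, N))
ys, xs = np.mgrid[0:N, 0:N]
for _ in range(200):
    cx, cy = rng.uniform(0, N, 2)
    r = rng.uniform(4, 12)                     # particle radius in pixels
    d2 = (xs - cx) ** 2 + (ys - cy) ** 2
    cap = np.clip(r ** 2 - d2, 0, None)
    h = np.maximum(h, np.sqrt(cap) * pixel)    # height in meters

# Reflection doubles the optical path, so the phase delay is 2*k*h.
field = np.exp(1j * 2 * k * h)
far_field = np.fft.fftshift(np.fft.fft2(field))
intensity = np.abs(far_field) ** 2             # ~ angular reflectance lobe
intensity /= intensity.sum()
```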

    HySenSe: A Hyper-Sensitive and High-Fidelity Vision-Based Tactile Sensor

    In this paper, to address the sensitivity and durability trade-off of Vision-based Tactile Sensors (VTSs), we introduce a hyper-sensitive and high-fidelity VTS called HySenSe. We demonstrate that by changing a single step in the fabrication of the gel layer of the GelSight sensor (the most well-known VTS), we can substantially improve its sensitivity and durability. Our experimental results clearly demonstrate that HySenSe outperforms a similar GelSight sensor in detecting textural details of various objects under identical experimental conditions and low interaction forces (<= 1.5 N). Comment: Accepted to IEEE Sensors 2022 Conference

    Active Clothing Material Perception using Tactile Sensing and Deep Learning

    Humans represent and discriminate objects within the same category by their properties, and an intelligent robot should be able to do the same. In this paper, we build a robot system that can autonomously perceive object properties through touch. We work on the common object category of clothing. The robot moves under the guidance of an external Kinect sensor, squeezes the clothes with a GelSight tactile sensor, and then recognizes 11 properties of the clothing from the tactile data. Those properties include physical properties, such as thickness, fuzziness, softness, and durability, and semantic properties, such as wearing season and preferred washing method. We collect a dataset of 153 varied pieces of clothing and conduct 6,616 robot exploration iterations on them. To extract useful information from the high-dimensional sensory output, we apply Convolutional Neural Networks (CNNs) to the tactile data for recognizing the clothing properties, and to the Kinect depth images for selecting exploration locations. Experiments show that, using the trained neural networks, the robot can autonomously explore unknown clothes and learn their properties. This work proposes a new framework for active tactile perception that combines vision and touch, and has the potential to enable robots to help humans with varied clothing-related housework. Comment: ICRA 2018 accepted
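    As a rough illustration of the learning component, the sketch below defines a small convolutional network that maps a tactile image to multi-label predictions over a set of clothing properties. The architecture, input resolution, and binary property targets are assumptions made for the example; the paper's actual networks, training data, and exploration-location model are not reproduced here.

```python
import torch
import torch.nn as nn

class TactilePropertyNet(nn.Module):
    """Toy CNN: tactile image -> one logit per clothing property."""

    def __init__(self, n_properties=11):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(128, n_properties)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.head(x)

# Illustrative multi-label training step on dummy data: each property is
# treated as an independent binary target, so BCEWithLogitsLoss is one
# simple choice (the paper's property labels may be multi-class instead).
model = TactilePropertyNet()
images = torch.randn(8, 3, 128, 128)          # batch of tactile images
targets = torch.randint(0, 2, (8, 11)).float()
loss = nn.BCEWithLogitsLoss()(model(images), targets)
loss.backward()
```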

    GelSlim: A High-Resolution, Compact, Robust, and Calibrated Tactile-sensing Finger

    This work describes the development of a high-resolution tactile-sensing finger for robot grasping. This finger, inspired by previous GelSight sensing techniques, features an integration that is slimmer, more robust, and more homogeneous in output than previous vision-based tactile sensors. To achieve a compact integration, we redesign the optical path from illumination source to camera by combining light guides and an arrangement of mirror reflections. We parameterize the optical path with geometric design variables and describe the tradeoffs between the finger thickness, the depth of field of the camera, and the size of the tactile sensing area. The sensor sustains the wear from continuous use -- and abuse -- in grasping tasks by combining tougher materials for the compliant soft gel, a textured fabric skin, a structurally rigid body, and a calibration process that maintains homogeneous illumination and contrast of the tactile images during use. Finally, we evaluate the sensor's durability along four metrics that track the signal quality during more than 3000 grasping experiments. Comment: RA-L pre-print, 8 pages
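    One of the tradeoffs mentioned above, between finger thickness (which limits the camera-to-gel distance) and depth of field, can be examined with a standard thin-lens approximation. The sketch below is a generic geometric-optics calculation with assumed numbers, not the GelSlim design procedure.

```python
def depth_of_field(f_mm, f_number, coc_mm, subject_mm):
    """Thin-lens depth-of-field estimate (all distances in millimetres).

    f_mm: focal length, f_number: aperture, coc_mm: acceptable circle of
    confusion on the sensor, subject_mm: focus distance from the lens.
    """
    hyperfocal = f_mm ** 2 / (f_number * coc_mm) + f_mm
    near = subject_mm * (hyperfocal - f_mm) / (hyperfocal + subject_mm - 2 * f_mm)
    if subject_mm >= hyperfocal:
        return near, float("inf")
    far = subject_mm * (hyperfocal - f_mm) / (hyperfocal - subject_mm)
    return near, far

# Illustrative numbers only: a short-focal-length lens stopped down, focused
# on a gel surface ~30 mm away, as a slim finger package might require.
near, far = depth_of_field(f_mm=3.0, f_number=2.8, coc_mm=0.005, subject_mm=30.0)
print(f"in-focus range: {near:.1f} mm to {far:.1f} mm ({far - near:.1f} mm deep)")
```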

    Lump detection with a GelSight sensor

    A GelSight sensor is a tactile sensing device comprising a clear elastomeric pad covered with a reflective membrane, coupled with optics to measure the membrane's deformations. When the pad is pressed against an object's surface, the membrane changes shape in accord with the mechanical and geometrical properties of the object. Since soft tissue is more compliant than hard tissue, one can detect an embedded lump by pressing the GelSight pad against the tissue surface and observing the hump that forms over the lump. We tested this system's sensitivity by constructing phantoms of soft rubber with hard embedded lumps. The system is quite sensitive; for example, it could detect a 2 mm lump at a depth of 5 mm. The sensor was more sensitive than previous tactile lump detectors. It was also better than human observers using their fingertips. Such a capability could help in tumor screening, and could augment the sensory information available in telemedicine or minimally invasive surgery. National Science Foundation (U.S.) (Grant 6922551)
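    The detection principle is that a hard lump under compliant tissue produces a local hump in the measured height map. A minimal sketch of that idea is shown below, assuming a reconstructed height map and a flat-press baseline are already available; the smoothing scale, height threshold, and area threshold are illustrative, not values from the paper.

```python
import numpy as np
from scipy import ndimage

def detect_humps(height_map, baseline, min_height_mm=0.05, min_area_px=50):
    """Flag local humps in a GelSight height map relative to a flat-press baseline.

    height_map, baseline: 2-D arrays of surface height in millimetres.
    Returns a list of (row, col, peak_height_mm) for each detected hump.
    """
    residual = height_map - baseline
    # Suppress pixel noise, then keep regions protruding above the threshold.
    smooth = ndimage.gaussian_filter(residual, sigma=3)
    mask = smooth > min_height_mm
    labels, n = ndimage.label(mask)
    humps = []
    for region in range(1, n + 1):
        area = np.sum(labels == region)
        if area < min_area_px:
            continue                      # ignore tiny noise blobs
        peak = np.where(labels == region, smooth, -np.inf)
        r, c = np.unravel_index(np.argmax(peak), peak.shape)
        humps.append((r, c, smooth[r, c]))
    return humps
```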

    Localization and Manipulation of Small Parts Using GelSight Tactile Sensing

    Robust manipulation and insertion of small parts can be challenging because of the small tolerances typically involved. The key to robust control of these kinds of manipulation interactions is accurate tracking and control of the parts involved. Typically, this is accomplished using visual servoing or force-based control. However, these approaches have drawbacks. Instead, we propose a new approach that uses tactile sensing to accurately localize the pose of a part grasped in the robot hand. Using a feature-based matching technique in conjunction with a newly developed tactile sensing technology known as GelSight, which has much higher resolution than competing methods, we synthesize high-resolution height maps of object surfaces. As a result of these high-resolution tactile maps, we are able to localize small parts held in a robot hand very accurately. We quantify localization accuracy in benchtop experiments and experimentally demonstrate the practicality of the approach in the context of a small-parts insertion problem. National Science Foundation (U.S.) (NSF Grant No. 1017862); United States. National Aeronautics and Space Administration (NASA Grant No. NNX13AQ85G); United States. Office of Naval Research (ONR Grant No. N000141410047)
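    As a hedged illustration of feature-based matching on tactile height maps, the sketch below treats two GelSight height maps as grayscale images, extracts ORB features with OpenCV, and estimates an in-plane rigid transform between them. The feature type, matcher, and estimator are generic stand-in choices, not the paper's pipeline.

```python
import cv2
import numpy as np

def estimate_inplane_pose(height_ref, height_obs):
    """Estimate an in-plane (rotation + translation + scale) transform between
    two GelSight height maps by matching local features.

    height_ref, height_obs: 2-D float arrays (height in mm). Returns a 2x3
    partial-affine matrix mapping reference pixel coordinates to observed ones.
    """
    def to_u8(h):
        h = h - h.min()
        return np.uint8(255 * h / (h.max() + 1e-9))

    orb = cv2.ORB_create(nfeatures=1000)
    k1, d1 = orb.detectAndCompute(to_u8(height_ref), None)
    k2, d2 = orb.detectAndCompute(to_u8(height_obs), None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:200]

    src = np.float32([k1[m.queryIdx].pt for m in matches])
    dst = np.float32([k2[m.trainIdx].pt for m in matches])
    M, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return M
```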

    Tracking objects with point clouds from vision and touch

    We present an object-tracking framework that fuses point cloud information from an RGB-D camera with tactile information from a GelSight contact sensor. GelSight can be treated as a source of dense local geometric information, which we incorporate directly into a conventional point-cloud-based articulated object tracker based on signed-distance functions. Our implementation runs at 12 Hz using an online depth reconstruction algorithm for GelSight and a modified second-order update for the tracking algorithm. We present data from hardware experiments demonstrating that the addition of contact-based geometric information significantly improves pose accuracy during contact, and provides robustness to occlusions of small objects by the robot's end effector.
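    To make the fusion idea concrete, the sketch below performs Gauss-Newton pose updates on point-to-SDF residuals, treating depth-camera points and GelSight contact points alike as observations of the tracked object's signed-distance field. The sphere SDF and the translation-only state are simplifying assumptions for illustration; the paper's tracker handles articulated poses and uses a modified second-order update.

```python
import numpy as np

def sphere_sdf(p, radius=0.05):
    """Stand-in signed-distance field: a sphere of given radius at the origin."""
    return np.linalg.norm(p, axis=-1) - radius

def sdf_gradient(p, sdf, eps=1e-5):
    """Numerical gradient of the SDF at points p, shape (N, 3)."""
    grad = np.zeros_like(p)
    for i in range(3):
        dp = np.zeros(3)
        dp[i] = eps
        grad[:, i] = (sdf(p + dp) - sdf(p - dp)) / (2 * eps)
    return grad

def gauss_newton_translation(points, sdf, iters=10):
    """Estimate the object translation t minimising sum_i sdf(p_i - t)^2 over
    fused camera + tactile points (translation-only for simplicity)."""
    t = np.zeros(3)
    for _ in range(iters):
        q = points - t
        r = sdf(q)                      # residual: signed distance per point
        J = -sdf_gradient(q, sdf)       # d r / d t
        H = J.T @ J + 1e-9 * np.eye(3)  # Gauss-Newton normal equations
        t = t - np.linalg.solve(H, J.T @ r)
    return t

# Usage: points from the depth camera and the GelSight contact patch, stacked.
rng = np.random.default_rng(1)
dirs = rng.normal(size=(500, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
true_t = np.array([0.01, -0.005, 0.02])
points = dirs * 0.05 + true_t           # noise-free sphere surface, shifted
print(gauss_newton_translation(points, sphere_sdf))  # ~ true_t
```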