
    Minsight: A Fingertip-Sized Vision-Based Tactile Sensor for Robotic Manipulation

    Intelligent interaction with the physical world requires perceptual abilities beyond vision and hearing; rich tactile sensing is essential for autonomous robots to dexterously manipulate unfamiliar objects or safely contact humans. Therefore, robotic manipulators need high-resolution touch sensors that are compact, robust, inexpensive, and efficient. The soft vision-based haptic sensor presented herein is a miniaturized and optimized version of the previously published sensor Insight. Minsight has the size and shape of a human fingertip and uses machine learning methods to output high-resolution maps of 3D contact force vectors at 60 Hz. Experiments confirm its excellent sensing performance, with a mean absolute force error of 0.07 N and a contact location error of 0.6 mm across its surface area. Minsight's utility is demonstrated in two robotic tasks on a 3-DoF manipulator. First, closed-loop force control enables the robot to track the movements of a human finger based only on tactile data. Second, the informative value of the sensor output is shown by detecting whether a hard lump is embedded within a soft elastomer, with an accuracy of 98%. These findings indicate that Minsight can give robots the detailed fingertip touch sensing needed for dexterous manipulation and physical human-robot interaction.
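    The closed-loop force control described in the abstract can be pictured as a simple proportional controller on the measured contact force. The sketch below is an illustrative assumption, not the paper's controller: the gain, force setpoint, and linear contact model are all invented for demonstration.

```python
# Minimal sketch of closed-loop fingertip force control.
# Gain, setpoint, and the linear contact model are illustrative assumptions.

def force_control_step(f_measured, f_desired, kp=0.5):
    """Proportional control: return a normal-velocity command that
    drives the fingertip toward the desired contact force."""
    return kp * (f_desired - f_measured)

def simulate(f_desired=0.5, stiffness=2.0, steps=50, dt=0.05):
    """Toy contact: measured force grows linearly with penetration depth.
    Returns the contact force after running the control loop."""
    depth = 0.0
    for _ in range(steps):
        f_measured = stiffness * max(depth, 0.0)
        v = force_control_step(f_measured, f_desired)
        depth += v * dt
    return stiffness * max(depth, 0.0)
```

    In this toy model the loop converges geometrically toward the force setpoint; a real controller would also handle sensor latency and the nonlinear elastomer response.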

    F-TOUCH Sensor: Concurrent Geometry Perception and Multi-axis Force Measurement


    A bio-inspired multi-functional tendon-driven tactile sensor and application in obstacle avoidance using reinforcement learning

    This paper presents a new bio-inspired tactile sensor that is multi-functional and has contact areas of different sensitivity. The TacTop area is highly sensitive and is used for object classification under direct contact. The TacSide area, in contrast, is less sensitive and is used to localize side contacts. By connecting tendons from the TacSide area to the TacTop area, the sensor can perform multiple detection functions through the same expression region. To process the mixed contact signals collected from the expression region with its numerous markers and pins, we build a modified DenseNet121 network that removes all fully connected layers and keeps the rest as a sub-network. The proposed model also contains a global average pooling layer with two branching networks to handle the different functions and provide accurate spatial translation of the extracted features. The experimental results demonstrate a high prediction accuracy of 98% for object perception and localization. Furthermore, the new tactile sensor is applied to obstacle avoidance: action skills are extracted from human demonstrations, and an action dataset is then generated for reinforcement learning to guide robots toward correct responses after contact detection. To evaluate the effectiveness of the proposed framework, several simulations are performed in the MuJoCo environment.
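    The two-branch head described above (truncated backbone, global average pooling, then separate classification and localization branches) can be sketched structurally as below. The feature-map size, number of classes, and random weights are assumptions for illustration; they are not the paper's trained model.

```python
import numpy as np

# Structural sketch of the two-branch head: global average pooling over the
# backbone's feature map, feeding two task-specific linear branches.
# Shapes and random weights are illustrative assumptions.
rng = np.random.default_rng(0)

def global_average_pool(features):
    """features: (C, H, W) feature map -> (C,) pooled descriptor."""
    return features.mean(axis=(1, 2))

def two_branch_head(features, w_cls, w_loc):
    """Return (class scores, contact-location estimate) from one feature map."""
    pooled = global_average_pool(features)
    return w_cls @ pooled, w_loc @ pooled

C, H, W = 1024, 7, 7                    # typical DenseNet121 output (assumed)
features = rng.standard_normal((C, H, W))
w_cls = rng.standard_normal((10, C))    # 10 hypothetical object classes
w_loc = rng.standard_normal((2, C))     # (x, y) contact location
logits, location = two_branch_head(features, w_cls, w_loc)
```

    Replacing the fully connected layers with global average pooling keeps the head lightweight and lets both branches share the same spatially aggregated features.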

    Dense Tactile Force Estimation using GelSlim and inverse FEM

    In this paper, we present a new version of the tactile sensor GelSlim 2.0 with the capability to estimate the contact force distribution in real time. The sensor is vision-based and uses an array of markers to track deformations of a gel pad due to contact. A new hardware design makes the sensor more rugged and parametrically adjustable, and improves illumination. Leveraging the sensor's increased functionality, we propose to use the inverse finite element method (iFEM), a numerical method, to reconstruct the contact force distribution from marker displacements. The sensor is able to provide the force distribution of contact with high spatial density. Experiments and comparison with ground truth show that the reconstructed force distribution is physically reasonable, with good accuracy. A sequence of Kendama manipulations is shown with the corresponding displacement field (yellow) and force field (red). Video can be found on YouTube: https://youtu.be/hWw9A0ZBZuU
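    At its core, the inverse-FEM step inverts a linear map from nodal contact forces to observed marker displacements. The sketch below assumes a precomputed compliance matrix and uses Tikhonov-regularized least squares; the matrix here is a random stand-in, not a real FEM model, and the paper's actual solver details are not reproduced.

```python
import numpy as np

# Sketch of inverse-FEM force reconstruction: marker displacements u are
# modeled as u = A f, where A is a (precomputed) compliance matrix mapping
# nodal forces to displacements. Recover f by regularized least squares.
# A is a random stand-in here, not an actual FEM-derived matrix.

def reconstruct_forces(A, u, lam=1e-3):
    """Solve min_f ||A f - u||^2 + lam ||f||^2 in closed form."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ u)

rng = np.random.default_rng(1)
A = rng.standard_normal((60, 20))   # 60 marker displacements, 20 force nodes
f_true = rng.standard_normal(20)
u = A @ f_true                      # noiseless forward simulation
f_est = reconstruct_forces(A, u)
```

    The regularization term keeps the inversion stable when the displacement measurements are noisy or the compliance matrix is ill-conditioned, at the cost of a small bias in the recovered forces.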