
    TactileGCN: A Graph Convolutional Network for Predicting Grasp Stability with Tactile Sensors

    Tactile sensors provide useful contact data during interaction with an object, which can be used to learn to determine the stability of a grasp accurately. Most works in the literature represent tactile readings as plain feature vectors or matrix-like tactile images and use them to train machine learning models. In this work, we explore an alternative way of exploiting tactile information to predict grasp stability by leveraging graph-like representations of tactile data, which preserve the actual spatial arrangement of the sensor's taxels and their locality. In our experiments, we trained a Graph Neural Network to classify grasps as either stable or slippery. To train such a network and demonstrate its predictive capabilities for the problem at hand, we captured a novel dataset of approximately 5000 three-fingered grasps across 41 objects for training and 1000 grasps with 10 unknown objects for testing. Our experiments show that this novel approach can be effectively used to predict grasp stability.
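
    The abstract above describes representing each tactile frame as a graph over the sensor's taxels and feeding it to a graph convolutional network for binary stable/slippery classification. Below is a minimal sketch of that idea, assuming a PyTorch Geometric environment; the 4x4 taxel grid, the 4-connected edge construction, the layer widths, and the class labels are illustrative assumptions rather than the paper's actual configuration.

```python
# Minimal sketch (assumptions: 4x4 taxel pad, 4-connected edges, 2 GCN layers).
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv, global_mean_pool


def taxel_grid_graph(readings):
    """Build a graph whose nodes are taxels (feature = pressure reading)
    and whose edges connect spatially adjacent taxels on the pad."""
    rows, cols = readings.shape
    edges = []
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            if c + 1 < cols:                      # horizontal neighbour
                edges += [(i, i + 1), (i + 1, i)]
            if r + 1 < rows:                      # vertical neighbour
                edges += [(i, i + cols), (i + cols, i)]
    edge_index = torch.tensor(edges, dtype=torch.long).t().contiguous()
    x = readings.reshape(-1, 1).float()           # one scalar feature per taxel
    return Data(x=x, edge_index=edge_index)


class TactileGCN(torch.nn.Module):
    """Two GCN layers, global pooling, and a binary (stable/slippery) head."""

    def __init__(self, hidden=32):
        super().__init__()
        self.conv1 = GCNConv(1, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.head = torch.nn.Linear(hidden, 2)    # 2 classes: stable / slippery

    def forward(self, x, edge_index, batch):
        x = F.relu(self.conv1(x, edge_index))
        x = F.relu(self.conv2(x, edge_index))
        x = global_mean_pool(x, batch)            # one embedding per grasp graph
        return self.head(x)


if __name__ == "__main__":
    data = taxel_grid_graph(torch.rand(4, 4))     # fake tactile frame
    batch = torch.zeros(data.num_nodes, dtype=torch.long)
    logits = TactileGCN()(data.x, data.edge_index, batch)
    print(logits)                                 # unnormalised scores for the two classes
```

    Keeping the taxel adjacency explicit in the edge list is what distinguishes this representation from a flat feature vector: the graph convolutions only mix information between physically neighbouring taxels.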

    Multilevel control of an anthropomorphic prosthetic hand for grasp and slip prevention

    The success of grasping and manipulation tasks with commercial prosthetic hands depends mainly on the amputee's visual feedback, since these hands are provided with neither tactile sensors nor sophisticated control. As a consequence, slippage and object falls often occur. This article addresses the specific issue of enhancing the grasping and manipulation capabilities of existing prosthetic hands by changing the control strategy. For this purpose, it proposes a multilevel control architecture with two distinct levels: (1) a policy search learning algorithm combined with central pattern generators at the higher level and (2) a parallel force/position controller managing slippage events at the lower level. The control has been tested on an anthropomorphic robotic hand with prosthetic features (the IH2 hand) equipped with force sensors. Bi-digital and tri-digital grasping tasks with and without slip information have been carried out. The KUKA-LWR has been employed to perturb grasp stability and induce controlled slip events. The acquired data demonstrate that the proposed control has the potential to adapt to changes in the environment and guarantees grasp stability, avoiding object falls thanks to prompt detection of slippage events.
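
    The lower level described above combines parallel force/position control with slippage detection to keep the grasp stable. The sketch below shows one way such a loop could look in plain Python under stated assumptions: the gains, slip threshold, and force increment are hypothetical, and no IH2-specific API is used.

```python
# Minimal sketch of a parallel force/position loop with slip handling
# (gains, thresholds, and the force increment are illustrative assumptions).
class ParallelForcePositionController:
    def __init__(self, kp_pos=2.0, kp_force=0.05,
                 slip_threshold=0.8, grip_increment=0.2):
        self.kp_pos = kp_pos                  # proportional gain on finger position error
        self.kp_force = kp_force              # proportional gain on contact force error
        self.slip_threshold = slip_threshold  # |dF/dt| beyond which a slip event is flagged
        self.grip_increment = grip_increment  # extra force commanded after a slip
        self.force_ref = 1.0                  # nominal grasp force reference (N)
        self._last_force = 0.0

    def detect_slip(self, force, dt):
        """Flag a slip when the contact force drops faster than the threshold."""
        rate = (force - self._last_force) / dt
        self._last_force = force
        return rate < -self.slip_threshold

    def step(self, pos_ref, pos_meas, force_meas, dt):
        """Return the next finger position command, combining position and force terms."""
        if self.detect_slip(force_meas, dt):
            self.force_ref += self.grip_increment   # tighten the grasp promptly
        pos_term = self.kp_pos * (pos_ref - pos_meas)
        force_term = self.kp_force * (self.force_ref - force_meas)
        return pos_meas + dt * (pos_term + force_term)
```

    Reacting to a detected slip by raising the force reference, while leaving the position loop untouched, is the essence of a parallel force/position scheme: both error terms act on the same finger command, with the force term dominating when contact is at risk.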