265 research outputs found

    Generation of Tactile Data from 3D Vision and Target Robotic Grasps

    Tactile perception is a rich source of information for robotic grasping: it allows a robot to identify a grasped object and assess the stability of a grasp, among other things. However, the tactile sensor must come into contact with the target object in order to produce readings, so tactile data can only be obtained once real contact is made. We propose to overcome this restriction with a method that models the behaviour of a tactile sensor using 3D vision and grasp information as the stimulus. Our system regresses the quantified tactile response that would be experienced if a given grasp were performed on the object. We experiment with 16 items and 4 tactile data modalities to show that our proposal learns this task with low error. This work was supported in part by the Spanish Government and the FEDER Funds (BES-2016-078290, PRX19/00289, RTI2018-094279-B-100) and in part by the European Commission (COMMANDIA SOE2/P1/F0638), action supported by Interreg-V Sudoe.
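
    A minimal sketch of the kind of regression described above, assuming (hypothetically) that the partial point cloud has already been summarised as a fixed-length feature vector and the grasp is given as a 6-DoF pose; the layer sizes, input dimensions and tactile output size are illustrative assumptions, not the authors' architecture:

    # Hypothetical sketch: regress a tactile response vector from 3D vision
    # features and a target grasp pose. Shapes and layers are assumptions,
    # not the architecture used in the paper.
    import torch
    import torch.nn as nn

    class Vision2Tactile(nn.Module):
        def __init__(self, vision_dim=256, grasp_dim=7, taxels=24):
            super().__init__()
            # vision_dim: features extracted from the partial point cloud
            # grasp_dim: grasp pose, e.g. position (3) + quaternion (4)
            # taxels: number of tactile readings to regress
            self.net = nn.Sequential(
                nn.Linear(vision_dim + grasp_dim, 128), nn.ReLU(),
                nn.Linear(128, 64), nn.ReLU(),
                nn.Linear(64, taxels),
            )

        def forward(self, vision_feat, grasp_pose):
            return self.net(torch.cat([vision_feat, grasp_pose], dim=-1))

    model = Vision2Tactile()
    pred = model(torch.randn(8, 256), torch.randn(8, 7))   # batch of 8 grasps
    loss = nn.functional.mse_loss(pred, torch.randn(8, 24))  # regression target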

    Predicción de la Estabilidad en Tareas de Agarre Robótico con Información Táctil

    In robotic manipulation tasks it is of particular interest to detect whether a grasp is stable or whether, on the contrary, the grasped object slips between the fingers due to inadequate contact. Grasp instability is frequently the consequence of a poor pose of the robotic hand or gripper during execution and/or insufficient contact pressure while carrying out the task. The use of tactile information, and the way it is represented, is vital for predicting grasp stability. In this work, different methodologies for representing tactile information are presented and compared, as well as the learning methods best suited to each chosen tactile representation. This work was funded by the European Regional Development Fund (FEDER) and the Ministry of Economy, Industry and Competitiveness through the project RTI2018-094279-B-100 and the pre-doctoral grant BES-2016-078290, and also with the support of the European Commission and the Interreg V Sudoe programme through the project SOE2/P1/F0638.
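
    As an illustration of the kind of representational choice discussed above, here is a hedged sketch that prepares the same raw readings either as a flat feature vector for a classical learner or as a 2D tactile image for a convolutional model; the 4x6 taxel layout is an assumption, not the paper's sensor:

    # Hypothetical sketch of two tactile representations for stability learning.
    # The 4x6 taxel layout is an assumption, not the sensor used in the paper.
    import numpy as np

    def as_feature_vector(taxels: np.ndarray) -> np.ndarray:
        """Flatten raw pressure readings for classical learners (SVM, MLP...)."""
        return taxels.reshape(-1)

    def as_tactile_image(taxels: np.ndarray, rows=4, cols=6) -> np.ndarray:
        """Arrange readings on their physical grid so a CNN can exploit locality."""
        img = taxels.reshape(rows, cols).astype(np.float32)
        return (img - img.min()) / (img.max() - img.min() + 1e-8)  # scale to [0, 1]

    readings = np.random.rand(24)          # one snapshot of 24 taxels (assumed)
    vec = as_feature_vector(readings)      # shape (24,)
    img = as_tactile_image(readings)       # shape (4, 6), CNN-ready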

    Tactile-Driven Grasp Stability and Slip Prediction

    One of the challenges in robotic grasping tasks is detecting whether a grip is stable or not. A lack of stability during a manipulation operation usually causes the grasped object to slip due to poor contact forces. Frequently, an unstable grip is caused by an inadequate pose of the robotic hand, by insufficient contact pressure, or both. The use of tactile data is essential to check such conditions and, therefore, to predict the stability of a grasp. In this work, we present and compare different deep-learning-based methodologies for representing and processing tactile data for both stability and slip prediction. Work funded by the Spanish Ministries of Economy, Industry and Competitiveness and Science, Innovation and Universities through the grant BES-2016-078290 and the project RTI2018-094279-B-100, respectively, as well as the European Commission and FEDER funds through the COMMANDIA project (SOE2/P1/F0638), action supported by Interreg-V Sudoe.
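
    A minimal sketch of one such deep-learning pipeline, assuming tactile readings already arranged as small single-channel images; the input size, layer widths and the binary stable/slipping output are illustrative assumptions, not the architectures compared in the paper:

    # Hypothetical sketch: CNN that maps a tactile image to a stability label.
    # Input size (1, 4, 6) and layer widths are assumptions for illustration.
    import torch
    import torch.nn as nn

    class StabilityNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            )
            self.head = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 4 * 6, 64), nn.ReLU(),
                nn.Linear(64, 2),  # classes: stable vs. slipping
            )

        def forward(self, tactile_img):
            return self.head(self.features(tactile_img))

    logits = StabilityNet()(torch.randn(8, 1, 4, 6))  # batch of tactile images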

    Learning Spatio Temporal Tactile Features with a ConvLSTM for the Direction Of Slip Detection

    Robotic manipulators constantly have to deal with the complex task of detecting whether a grasp is stable or, in contrast, whether the grasped object is slipping. Recognising the type of slippage (translational or rotational) and its direction is more challenging than detecting stability alone, but at the same time more useful for correcting the aforementioned grasping issues. In this work, we propose a learning methodology for detecting the direction of a slip (seven categories) using spatio-temporal tactile features learnt from a single tactile sensor. Tactile readings are pre-processed and fed to a ConvLSTM that learns to detect these directions with just 50 ms of data. We have extensively evaluated the performance of the system and achieved high accuracy (82.56%) when detecting the direction of slip on unseen objects with familiar properties. Work funded by the Spanish Ministry of Economy, Industry and Competitiveness through the project DPI2015-68087-R (pre-doctoral grant BES-2016-078290) as well as the European Commission and FEDER funds through the COMMANDIA project (SOE2/P1/F0638), action supported by Interreg-V Sudoe.
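
    A hedged sketch of a ConvLSTM consuming a short window of tactile frames for 7-way slip-direction classification; the frame size, channel counts and number of frames per 50 ms window are assumptions, not the paper's setup:

    # Hypothetical sketch of a ConvLSTM cell processing a short window of
    # tactile frames for 7-way slip-direction classification.
    import torch
    import torch.nn as nn

    class ConvLSTMCell(nn.Module):
        def __init__(self, in_ch, hid_ch, k=3):
            super().__init__()
            # One convolution produces the input, forget, output and cell gates.
            self.conv = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)
            self.hid_ch = hid_ch

        def forward(self, x, state):
            h, c = state
            gates = self.conv(torch.cat([x, h], dim=1))
            i, f, o, g = torch.chunk(gates, 4, dim=1)
            c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
            h = torch.sigmoid(o) * torch.tanh(c)
            return h, c

    class SlipDirectionNet(nn.Module):
        def __init__(self, hid_ch=16, classes=7, frame=(4, 6)):
            super().__init__()
            self.cell = ConvLSTMCell(1, hid_ch)
            self.fc = nn.Linear(hid_ch * frame[0] * frame[1], classes)

        def forward(self, seq):  # seq: (batch, time, 1, H, W) tactile frames
            b, t, _, hgt, wdt = seq.shape
            h = seq.new_zeros(b, self.cell.hid_ch, hgt, wdt)
            c = torch.zeros_like(h)
            for step in range(t):
                h, c = self.cell(seq[:, step], (h, c))
            return self.fc(h.flatten(1))  # logits over 7 slip directions

    # e.g. a 50 ms window sampled as 5 tactile frames (assumed rate)
    logits = SlipDirectionNet()(torch.randn(2, 5, 1, 4, 6))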

    Non-Matrix Tactile Sensors: How Can Be Exploited Their Local Connectivity For Predicting Grasp Stability?

    Tactile sensors supply useful information during the interaction with an object that can be used to assess the stability of a grasp. Most previous works on this topic processed tactile readings as signals by computing hand-picked features. Some have processed these readings as images, computing characteristics on matrix-like sensors. In this work, we explore how non-matrix sensors (sensors whose taxels are not arranged exactly in a matrix) can also be processed as tactile images. In addition, we show that they can be used for predicting grasp stability by training a Convolutional Neural Network (CNN) with them. We captured over 2500 real three-fingered grasps on 41 everyday objects to train a CNN that exploits the local connectivity inherent in non-matrix tactile sensors, achieving a 94.2% F1-score when predicting stability.
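
    A hedged sketch of the kind of pre-processing this implies: scattering non-matrix taxel readings onto a small 2D grid according to their physical positions, so that a CNN (such as the stability sketch earlier in this list) can exploit local connectivity. The taxel coordinates and grid size below are invented for illustration:

    # Hypothetical sketch: build a tactile "image" from a sensor whose taxels
    # are not laid out as a matrix. Taxel positions and grid size are invented.
    import numpy as np

    # (x, y) positions of each taxel on the fingertip, in millimetres (assumed)
    TAXEL_XY = np.array([[2.0, 1.0], [5.5, 1.2], [9.0, 1.1],
                         [1.5, 4.8], [5.0, 5.0], [8.8, 4.9],
                         [3.2, 8.5], [6.8, 8.4]])

    def to_tactile_image(readings, grid=(8, 8)):
        """Rasterise taxel pressures onto a grid that preserves their layout."""
        img = np.zeros(grid, dtype=np.float32)
        xy = TAXEL_XY - TAXEL_XY.min(axis=0)
        xy = xy / xy.max(axis=0)                     # normalise positions to [0, 1]
        rows = np.round(xy[:, 1] * (grid[0] - 1)).astype(int)
        cols = np.round(xy[:, 0] * (grid[1] - 1)).astype(int)
        img[rows, cols] = readings                   # physical neighbours stay close
        return img

    image = to_tactile_image(np.random.rand(len(TAXEL_XY)))  # feed this to a CNN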

    Fast geometry-based computation of grasping points on three-dimensional point clouds

    Industrial and service robots deal with the complex task of grasping objects that have different shapes and are seen from diverse points of view. In order to perform grasps autonomously, the robot must calculate where to place its robotic hand to ensure that the grasp is stable. We propose a method to find the best pair of grasping points given a three-dimensional point cloud with a partial view of an unknown object. We use a set of straightforward geometric rules to explore the cloud and propose grasping points on the surface of the object. We then adapt the pair of contacts to the multi-fingered hand used in experimentation. After performing 500 grasps of different objects, we show that our approach is fast, taking an average of 17.5 ms to propose contacts, while attaining a grasp success rate of 85.5%. Moreover, the method is sufficiently flexible and stable to work with objects in changing environments, such as those confronted by industrial or service robots. This work was funded by the Spanish Ministry of Economy, Industry and Competitiveness through the project DPI2015-68087-R (pre-doctoral grant BES-2016-078290) as well as the European Commission and FEDER funds through the COMMANDIA project (SOE2/P1/F0638), action supported by Interreg-V Sudoe.
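
    A hedged sketch of one simple geometric rule of this kind (not the authors' exact rule set): scoring pairs of surface points whose normals are roughly antipodal and whose separation fits the gripper opening. Point normals and the gripper width are assumed to be given:

    # Hypothetical sketch: propose a pair of grasping points on a partial point
    # cloud by looking for roughly antipodal surface normals within the gripper
    # opening. This is an illustrative heuristic, not the paper's exact rules.
    import numpy as np

    def propose_grasp(points, normals, max_width=0.08, pairs=2000, seed=0):
        """points, normals: (N, 3) arrays; returns indices of the best pair."""
        rng = np.random.default_rng(seed)
        i = rng.integers(0, len(points), size=pairs)
        j = rng.integers(0, len(points), size=pairs)
        d = points[j] - points[i]
        width = np.linalg.norm(d, axis=1)
        ok = (width > 1e-4) & (width < max_width)    # must fit inside the gripper
        axis = d[ok] / width[ok, None]
        # Antipodal score: normals opposed to each other and aligned with the axis
        score = (-np.sum(normals[i[ok]] * normals[j[ok]], axis=1)
                 + np.abs(np.sum(normals[i[ok]] * axis, axis=1)))
        best = np.argmax(score)
        return i[ok][best], j[ok][best]

    pts = np.random.rand(500, 3) * 0.05                # fake 5 cm-wide partial cloud
    nrm = np.random.randn(500, 3)
    nrm /= np.linalg.norm(nrm, axis=1, keepdims=True)  # unit normals (assumed given)
    contact_a, contact_b = propose_grasp(pts, nrm)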

    3DCNN Performance in Hand Gesture Recognition Applied to Robot Arm Interaction

    In the past, methods for hand sign recognition have been successfully tested in Human Robot Interaction (HRI) using traditional methodologies based on static image features and machine learning. However, the recognition of gestures in video sequences is still an open problem, because current detection methods achieve low scores when the background is undefined or the scenario is unstructured. In recent years, deep learning techniques have been applied to tackle this problem. In this paper, we present a study in which we analyse the performance of a 3DCNN architecture for hand gesture recognition in an unstructured scenario. The system yields a score of 73% in both accuracy and F1. The aim of the work is the implementation of a system for commanding robots with gestures recorded on video in real scenarios. This work was funded by the Spanish Ministry of Economy, Industry and Competitiveness through the project DPI2015-68087-R and the pre-doctoral grant BES-2016-078290, and by the European Commission and FEDER funds through the COMMANDIA project (SOE2/P1/F0638), action supported by Interreg-V Sudoe.
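
    A minimal sketch of a 3DCNN gesture classifier of the kind analysed above, assuming short fixed-length clips of grayscale frames; the clip shape, layers and number of gesture classes are illustrative assumptions, not the evaluated architecture:

    # Hypothetical sketch: 3D convolutions over a short video clip to classify
    # hand gestures. Clip shape and class count are assumptions for illustration.
    import torch
    import torch.nn as nn

    class Gesture3DCNN(nn.Module):
        def __init__(self, classes=5):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool3d(2),                       # halve time and space
                nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool3d(1),               # global spatio-temporal pool
            )
            self.fc = nn.Linear(32, classes)

        def forward(self, clip):                       # clip: (batch, 1, T, H, W)
            return self.fc(self.features(clip).flatten(1))

    # e.g. a batch of two 16-frame, 64x64 grayscale clips
    logits = Gesture3DCNN()(torch.randn(2, 1, 16, 64, 64))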

    A Vision-Driven Collaborative Robotic Grasping System Tele-Operated by Surface Electromyography

    This paper presents a system that combines computer vision and surface electromyography techniques to perform grasping tasks with a robotic hand. In order to achieve a reliable grasping action, the vision-driven system is used to compute pre-grasping poses of the robotic system based on the analysis of three-dimensional object features. The human operator can then correct the pre-grasping pose of the robot using surface electromyographic signals from the forearm during wrist flexion and extension. Weak wrist flexions and extensions allow a fine adjustment of the robotic system to grasp the object and, finally, when the operator considers the grasping position optimal, a strong flexion is performed to initiate the grasp. The system has been tested with several subjects to check its performance, showing a grasping accuracy of around 95% of the attempted grasps, which improves by more than 13% on the grasping accuracy of previous experiments in which electromyographic control was not implemented. This work was funded by the Spanish Government’s Ministry of Economy, Industry and Competitiveness through the project DPI2015-68087-R, by the European Commission and FEDER funds through the COMMANDIA project (SOE2/P1/F0638), action supported by Interreg-V Sudoe, and by the University of Alicante through the project GRE16-20, Control Platform for a Robotic Hand based on Electromyographic Signals.
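
    A hedged sketch of how the weak/strong wrist-flexion logic described above could be distinguished from a surface EMG envelope; the channel layout, window length and thresholds are invented for illustration and are not the system's actual calibration:

    # Hypothetical sketch: classify a window of forearm sEMG into "adjust"
    # (weak flexion/extension) or "grasp" (strong flexion) commands using an
    # RMS envelope and two thresholds. Thresholds and rates are assumptions.
    import numpy as np

    WEAK_THRESHOLD = 0.05    # RMS level for fine pre-grasp adjustment (assumed)
    STRONG_THRESHOLD = 0.20  # RMS level that triggers the grasp (assumed)

    def rms_envelope(emg_window):
        """Root-mean-square amplitude of one sEMG channel window."""
        return float(np.sqrt(np.mean(np.square(emg_window))))

    def command_from_emg(flexor_window, extensor_window):
        flex, ext = rms_envelope(flexor_window), rms_envelope(extensor_window)
        if flex > STRONG_THRESHOLD:
            return "grasp"                      # strong flexion: close the hand
        if flex > WEAK_THRESHOLD:
            return "adjust_flexion"             # weak flexion: nudge pose one way
        if ext > WEAK_THRESHOLD:
            return "adjust_extension"           # weak extension: nudge the other way
        return "hold"                           # no intentional activity detected

    # 200 ms window at an assumed 1 kHz sampling rate
    cmd = command_from_emg(np.random.randn(200) * 0.1, np.random.randn(200) * 0.02)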

    Spanish Cell Therapy Network (TerCel) : 15 years of successful collaborative translational research

    In this article we summarize the 15-year experience of the Spanish Cell Therapy Network (TerCel), a successful collaborative public initiative funded by the Spanish government to support nationwide translational research in this important area. The network is currently formed by thirty-two research groups organized in three programs devoted to cardiovascular, neurodegenerative and immune-inflammatory diseases, respectively. Each program has three work packages focused on basic science, pre-clinical studies and clinical application. During this period, TerCel has helped boost translational research in cell therapy in Spain, setting up a network of Good Manufacturing Practice-certified cell manufacturing facilities and increasing the number of translational research projects, publications, patents and clinical trials of the participating groups, especially those carried out in collaboration. TerCel pays particular attention to public-private collaboration, which, for instance, has led to the development of the first allogeneic cell therapy product approved by the European Medicines Agency, Darvadstrocel. The current collaborative work focuses on the development of multicenter phase 2 and 3 trials that could translate these therapies to clinical practice for the benefit of patients.