12 research outputs found

    Tactile Mapping and Localization from High-Resolution Tactile Imprints

    Full text link
    This work studies the problem of shape reconstruction and object localization using a vision-based tactile sensor, GelSlim. The main contributions are the recovery of local shapes from contact, an approach to reconstruct the tactile shape of objects from tactile imprints, and an accurate method for localization of previously reconstructed objects. The algorithms apply to a large variety of 3D objects and provide accurate tactile feedback for in-hand manipulation. Results show that by exploiting the dense tactile information we can reconstruct the shape of objects with high accuracy and perform on-line object identification and localization, opening the door to reactive manipulation guided by tactile sensing. We provide videos and supplemental information on the project's website: http://web.mit.edu/mcube/research/tactile_localization.html.
    Comment: ICRA 2019, 7 pages, 7 figures. Website: http://web.mit.edu/mcube/research/tactile_localization.html Video: https://youtu.be/uMkspjmDbq
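    The abstract outlines a pipeline of local shape recovery from contact, fusion of imprints into a global tactile shape, and localization of new imprints against it. The paper's own algorithms are not reproduced here; the following is a minimal numpy sketch of that idea, in which the imprint-to-point-cloud conversion, the pixel size, and the brute-force nearest-neighbour pose scoring are illustrative assumptions rather than the authors' method.

```python
import numpy as np

def imprint_to_points(depth, pixel_size=0.001):
    """Convert a tactile depth imprint (H x W, metres of indentation)
    into 3-D points in the sensor frame; non-contact pixels are dropped."""
    h, w = depth.shape
    ys, xs = np.nonzero(depth > 0)
    return np.stack([(xs - w / 2) * pixel_size,
                     (ys - h / 2) * pixel_size,
                     depth[ys, xs]], axis=1)

def accumulate_shape(imprints, sensor_poses):
    """Fuse local contact patches into one object point cloud,
    given the sensor pose (R, t) for each imprint."""
    cloud = []
    for depth, (R, t) in zip(imprints, sensor_poses):
        pts = imprint_to_points(depth)
        cloud.append(pts @ R.T + t)          # sensor frame -> object frame
    return np.concatenate(cloud, axis=0)

def localize(imprint, model_cloud, candidate_poses):
    """Score candidate object poses by how well the new imprint's points
    land on the previously reconstructed model (nearest-neighbour error)."""
    pts = imprint_to_points(imprint)
    scores = []
    for R, t in candidate_poses:
        world = pts @ R.T + t
        d = np.min(np.linalg.norm(world[:, None, :] - model_cloud[None, :, :],
                                  axis=2), axis=1)
        scores.append(d.mean())
    best = int(np.argmin(scores))
    return candidate_poses[best], scores[best]
```

    The brute-force nearest-neighbour scoring keeps the sketch dependency-free; a real system would use a k-d tree or ICP-style refinement over the candidate poses.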

    3D Shape Perception from Monocular Vision, Touch, and Shape Priors

    Full text link
    Perceiving accurate 3D object shape is important for robots to interact with the physical world. Current research in this direction has relied primarily on visual observations. Vision, however useful, has inherent limitations due to occlusions and 2D-3D ambiguities, especially for perception with a monocular camera. Touch, in contrast, provides precise local shape information, though its efficiency for reconstructing the entire shape can be low. In this paper, we propose a novel paradigm that efficiently perceives accurate 3D object shape by incorporating visual and tactile observations, as well as prior knowledge of common object shapes learned from large-scale shape repositories. We use vision first, applying neural networks with learned shape priors to predict an object's 3D shape from a single-view color image. We then use tactile sensing to refine the shape: the robot actively touches the object regions where the visual prediction has high uncertainty. Our method efficiently builds the 3D shape of common objects from a color image and a small number of tactile explorations (around 10). Our setup is easy to apply and has the potential to help robots better perform grasping or manipulation tasks on real-world objects.
    Comment: IROS 2018. The first two authors contributed equally to this work.
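    The core loop described above (predict shape from vision, then touch where the prediction is most uncertain) can be illustrated with a toy occupancy-grid sketch. The voxel representation, binary entropy as the uncertainty measure, and the touch_oracle stand-in for the real tactile sensor are assumptions made for illustration, not the paper's implementation.

```python
import numpy as np

def entropy(p, eps=1e-9):
    """Per-voxel binary entropy of an occupancy-probability grid."""
    p = np.clip(p, eps, 1.0 - eps)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def pick_touch_target(occ_prob):
    """Index of the most uncertain voxel (highest entropy)."""
    return np.unravel_index(np.argmax(entropy(occ_prob)), occ_prob.shape)

def refine_with_touch(occ_prob, touch_oracle, n_touches=10, radius=1):
    """Actively refine a visual shape prediction with a handful of touches.
    `touch_oracle(idx, radius)` stands in for the real sensor: it returns the
    voxel indices in a small neighbourhood around `idx` and their true
    occupancies."""
    occ = occ_prob.copy()
    for _ in range(n_touches):
        idx = pick_touch_target(occ)
        neighbourhood, values = touch_oracle(idx, radius)
        for vox, v in zip(neighbourhood, values):
            occ[vox] = v                      # touched voxels become certain
    return occ
```

    Because touched voxels collapse to near-zero entropy, each iteration automatically moves on to the next most ambiguous region, which mirrors the roughly ten tactile explorations reported in the abstract.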

    Object exploration using vision and active touch

    Get PDF

    Tactile localization: dealing with uncertainty from the first touch

    Get PDF
    In this thesis we present an approach to tactile object localization for robotic manipulation which explicitly deals with uncertainty to overcome the locality of tactile sensing. To that end, we estimate full probability distributions over object pose. Moreover, given a 3D model of the object in question, our framework localizes from the first touch, meaning no physical exploration of the object is needed beforehand. Given a signal from the tactile sensor, we divide the estimation of a probability distribution over object pose into two main steps. First, before touching the object, we sample a dense set of poses of the object with respect to the sensor, simulate the signal the sensor would receive when touching the object at each of these poses, and train a similarity function between these signals. Second, while manipulating the object, we compare the signal coming from the sensor to the set of previously simulated ones, and the similarities between them give the discretized probability distribution over the possible poses of the object with respect to the sensor. We extend this work by analyzing the scenario where multiple tactile sensors touch the object at the same time, fusing the probability distributions coming from the individual sensors to obtain a better distribution. We present quantitative results for four objects. We also present the application of this approach in a larger system and an ongoing research direction towards tactile active perception.
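    The two-step procedure described in the abstract (compare the observed tactile signal against signals simulated at a dense set of candidate poses, then fuse per-sensor distributions) can be sketched as follows. The softmax normalization and the cosine-similarity stand-in for the learned similarity function are assumptions for illustration, not the thesis's exact formulation.

```python
import numpy as np

def pose_distribution(observed, simulated_signals, similarity):
    """Turn similarities between an observed tactile signal and signals
    simulated at a dense set of candidate poses into a discrete
    probability distribution over those poses."""
    scores = np.array([similarity(observed, s) for s in simulated_signals])
    scores = np.exp(scores - scores.max())     # softmax, numerically stable
    return scores / scores.sum()

def fuse_sensors(distributions):
    """Fuse per-sensor pose distributions (defined over the same pose grid)
    by taking their product and renormalising, assuming the sensors are
    conditionally independent given the object pose."""
    fused = np.prod(np.stack(distributions, axis=0), axis=0)
    return fused / fused.sum()

def cosine_similarity(a, b):
    """Simple stand-in for the learned similarity function."""
    a, b = np.ravel(a), np.ravel(b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))
```

    Because the pose set and the simulated signals are precomputed offline, only the similarity evaluations and the normalisation run at contact time, which is what makes first-touch localization feasible.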

    Tactile control based on Gaussian images and its application in bi-manual manipulation of deformable objects

    Get PDF
    In-hand robot manipulation of deformable objects is an open and key issue for the next generation of robots. Developing an adaptable and agile framework for tasks in which a robot grasps and manipulates different kinds of deformable objects is a main goal in the literature. Many works have proposed controlling such manipulation tasks using a model of the manipulated object. Although these techniques model deformations precisely, they are time-consuming, and using them in real environments is almost impossible because of the large variety of objects the robot may encounter. In this paper, we propose a model-independent framework to control the finger movements of the hands while the robot executes manipulation tasks with deformable objects. The technique is based on tactile images, which are obtained as a common interface for different tactile sensors, and uses servo-tactile control to stabilize the grasping points, avoid sliding, and adapt the contact configuration with respect to the position and magnitude of the applied force. Tactile images are obtained using a combination of dynamic Gaussians, which allows the creation of a common representation for tactile data from sensors with different technologies and resolutions. The framework was tested on different manipulation tasks in which the objects are deformed, without using a model of them.
    Research supported by the Spanish Ministry of Economy, European FEDER funds, the Valencia Regional Government and the University of Alicante through projects DPI2015-68087-R, PROMETEO/2013/085 and GRE 15-05. This work has also been supported by the French Government Research Program Investissements d'avenir, through the RobotEx Equipment of Excellence (ANR-10-EQPX-44) and the IMobS3 Laboratory of Excellence (ANR-10-LABX-16-01).
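    A rough sketch of the tactile-image idea described above: raw taxel readings from any sensor are rendered as a sum of 2-D Gaussians on a fixed-size grid, giving a common representation across sensor technologies and resolutions, and a simple image feature (here the centroid) can then feed a servo-tactile loop. The grid size, the Gaussian width, and the centroid feature are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def tactile_image(taxel_xy, forces, shape=(64, 64), sigma=2.0):
    """Render raw taxel readings as a tactile image: each contact becomes a
    2-D Gaussian whose amplitude is the measured force, so sensors with
    different layouts and resolutions map onto the same representation."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    img = np.zeros(shape)
    for (cx, cy), f in zip(taxel_xy, forces):   # taxel positions in pixel units
        img += f * np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
    return img

def image_centroid(img):
    """Centroid of the tactile image, usable as a feedback signal for a
    servo-tactile loop that keeps the contact centred on the fingertip."""
    total = img.sum() + 1e-12
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    return (float((xs * img).sum() / total), float((ys * img).sum() / total))
```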