
    Tactile Sensing with Accelerometers in Prehensile Grippers for Robots

    Several pneumatic grippers with accelerometers attached to their fingers have been developed and tested. The first gripper can classify the hardness of different cylinders, estimate the pneumatic pressure, monitor the position and speed of the gripper fingers, and study the phases of the grasping action and the influence of the relative position between the gripper and the cylinders. The other grippers manipulate and assess the firmness of eggplants and mangoes. To achieve gentle manipulation, these grippers employ fingers with several degrees of freedom in different configurations and include a membrane filled with a granular fluid, whose hardness can be controlled by means of the fluid's jamming transition. To assess the firmness of eggplants and mangoes while avoiding the influence of the relative position between product and gripper, firmness is estimated while the products are held by the fingers. The accelerometers perform better when the finger employs the granular fluid. The article presents methods for designing grippers capable of assessing the firmness of irregular products with accelerometers, and it studies the possibilities that accelerometers attached to different pneumatic robot gripper fingers offer as tactile sensors.
    This research is supported by the MANI-DACSA project (grant RTA2012-00062-C04-02), partially funded by the Spanish Government (Ministerio de Economía y Competitividad).
    Blanes Campos, C.; Mellado Arteche, M.; Beltrán Beltrán, P. (2016). Tactile Sensing with Accelerometers in Prehensile Grippers for Robots. Mechatronics 33:1-12. https://doi.org/10.1016/j.mechatronics.2015.11.007
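    The firmness-grading idea in this abstract lends itself to a short illustration. The sketch below is hypothetical and is not the authors' published signal-processing pipeline: it scores firmness as the RMS vibration of a fingertip accelerometer trace during contact, with the sampling rate, window length, and synthetic test signals all assumed.

```python
# Illustrative sketch only: a simple firmness score from a fingertip
# accelerometer trace; not the method published in the article.
import numpy as np

def firmness_score(accel_z, fs=1000.0, window_s=0.2):
    """Return an RMS vibration score over the contact window.

    accel_z : 1-D array of accelerometer samples (m/s^2) recorded while
              the finger contacts the product (hypothetical signal).
    fs      : sampling rate in Hz (assumed value).
    """
    n = int(window_s * fs)
    window = accel_z[:n] - np.mean(accel_z[:n])   # remove gravity/DC offset
    return float(np.sqrt(np.mean(window ** 2)))   # higher RMS ~ harder object

# Example with synthetic data: a stiff object rings longer than a soft one.
t = np.arange(0, 0.2, 1 / 1000.0)
stiff = np.exp(-20 * t) * np.sin(2 * np.pi * 180 * t)
soft = np.exp(-60 * t) * 0.3 * np.sin(2 * np.pi * 60 * t)
print(firmness_score(stiff), firmness_score(soft))
```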

    Probabilistic consolidation of grasp experience

    We present a probabilistic model for the joint representation of several sensory modalities and action parameters in a robotic grasping scenario. Our non-linear probabilistic latent variable model encodes relationships between grasp-related parameters, learns the importance of features, and expresses confidence in its estimates. The model learns associations between stable and unstable grasps that it experiences during an exploration phase. We demonstrate the applicability of the model for estimating grasp stability, correcting grasps, identifying objects based on tactile imprints, and predicting tactile imprints from object-relative gripper poses. We performed experiments on a real platform with both known and novel objects, i.e., objects the robot was trained with and previously unseen objects. Grasp correction had a 75% success rate on known objects and 73% on novel objects. By comparison, a traditional regression model succeeded in correcting grasps in only 38% of cases.
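    As a rough illustration of probabilistic grasp-stability estimation, the sketch below uses an off-the-shelf Gaussian process classifier as a stand-in for (and much simpler than) the paper's latent variable model; the feature set and training data are invented. A probabilistic classifier is chosen because, like the paper's model, its predicted probability doubles as a confidence estimate.

```python
# Illustrative sketch: probabilistic, non-linear grasp-stability prediction.
# Features and data are made up; this is not the model used in the paper.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
# Hypothetical training set: [mean tactile pressure, contact area, wrist torque]
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] - 0.3 * X[:, 2] + 0.2 * rng.normal(size=200)) > 0

model = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
model.fit(X, y)

# The predicted probability doubles as a confidence estimate for the grasp.
p_stable = model.predict_proba(rng.normal(size=(1, 3)))[0, 1]
print(f"P(stable grasp) = {p_stable:.2f}")
```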

    Reliable non-prehensile door opening through the combination of vision, tactile and force feedback

    Whereas vision and force feedback (either at the wrist or at the joint level) have received considerable attention in the robotic manipulation literature, the benefits that tactile sensors can provide when combined with vision and force have rarely been explored. In fact, there are situations in which vision and force feedback cannot guarantee robust manipulation. Vision is frequently subject to calibration errors, occlusions and outliers, whereas force feedback can only provide useful information along those directions that are constrained by the environment. In tasks where the visual feedback contains errors and the contact configuration does not constrain all the Cartesian degrees of freedom, vision and force sensors are not sufficient to guarantee successful execution. Many everyday tasks that do not require a firm grasp belong to this category, so it is important to develop strategies that deal robustly with these situations. In this article, a new framework for combining tactile information with vision and force feedback is proposed and validated on the task of opening a sliding door. Results show how the vision-tactile-force approach outperforms vision-force and force alone, in the sense that it corrects the vision errors while guaranteeing a suitable contact configuration.
    This research was partly supported by the Korea Science and Engineering Foundation under the WCU (World Class University) program funded by the Ministry of Education, Science and Technology, S. Korea (Grant No. R31-2008-000-10062-0); by the European Commission's Seventh Framework Programme FP7/2007-2013 under grant agreements 217077 (EYESHOTS project) and 248497 (TRIDENT Project); by Ministerio de Ciencia e Innovación (DPI-2008-06636 and DPI2008-06548-C03-01); by Fundació Caixa Castelló-Bancaixa (P1-1B2008-51 and P1-1B2009-50); and by Universitat Jaume I.
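    A minimal sketch of the kind of vision-tactile-force blending the article argues for, assuming a sliding-door task: the task-direction velocity comes from vision, a tactile contact-centroid offset corrects lateral drift caused by vision error, and a force error regulates contact along the constrained direction. All gains, frames, sensor interfaces and setpoints below are invented; this is not the controller described in the article.

```python
# Hedged sketch of a vision + tactile + force velocity command for a
# sliding-door task. All numbers and conventions are assumptions.
import numpy as np

K_TACTILE = 0.002   # m/s per unit of tactile centroid offset (assumed gain)
K_FORCE = 0.0005    # m/s per newton of normal-force error (assumed gain)
F_DESIRED = 5.0     # desired pushing force against the handle, in newtons

def velocity_command(v_vision, tactile_centroid_offset, f_normal):
    """Blend the three feedback sources into a Cartesian velocity (x, y, z).

    v_vision                : vision-based velocity along the door rail (m/s)
    tactile_centroid_offset : offset of the contact centroid on the sensor pad
    f_normal                : measured normal contact force (N)
    """
    vx = v_vision                                   # task direction (slide door)
    vy = -K_TACTILE * tactile_centroid_offset       # keep contact centred on pad
    vz = K_FORCE * (F_DESIRED - f_normal)           # maintain contact force
    return np.array([vx, vy, vz])

print(velocity_command(v_vision=0.05, tactile_centroid_offset=3.0, f_normal=4.2))
```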

    Data-Driven Grasp Synthesis - A Survey

    We review the work on data-driven grasp synthesis and the methodologies for sampling and ranking candidate grasps. We divide the approaches into three groups based on whether they synthesize grasps for known, familiar, or unknown objects. This structure allows us to identify common object representations and perceptual processes that facilitate the employed data-driven grasp synthesis technique. In the case of known objects, we concentrate on approaches based on object recognition and pose estimation. In the case of familiar objects, the techniques use some form of similarity matching against a set of previously encountered objects. Finally, for approaches dealing with unknown objects, the core part is the extraction of specific features that are indicative of good grasps. Our survey provides an overview of the different methodologies and discusses open problems in the area of robot grasping. We also draw a parallel to the classical approaches that rely on analytic formulations.
    Comment: 20 pages, 30 figures, submitted to IEEE Transactions on Robotics
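    The "sample and rank" pattern that runs through the survey can be shown in toy form. The scoring heuristic below (prefer approaches from above) and the 10 cm standoff are placeholders, not any method reviewed in the survey.

```python
# Toy sketch of sampling candidate grasps and ranking them with a score.
# The heuristic score and geometry are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)

def sample_candidate_grasps(object_centre, n=50):
    """Sample grasp approach points on a sphere around the object centre."""
    directions = rng.normal(size=(n, 3))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    return object_centre + 0.1 * directions         # 10 cm standoff (assumed)

def score(grasp, object_centre):
    """Placeholder score: prefer grasps that approach from above."""
    approach = object_centre - grasp
    approach /= np.linalg.norm(approach)
    return float(approach @ np.array([0.0, 0.0, -1.0]))

centre = np.array([0.4, 0.0, 0.05])
candidates = sample_candidate_grasps(centre)
best = max(candidates, key=lambda g: score(g, centre))
print("best approach point:", best)
```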

    Neuromorphic Computing Systems for Tactile Sensing Perception

    Touch sensing plays an important role in humans' daily life. Tasks like exploring, grasping and manipulating objects rely deeply on it. As such, robots and hand prostheses endowed with the sense of touch can manipulate objects better and more easily, and physically collaborate with other agents. Towards this goal, information about touched objects and surfaces has to be inferred from the raw data coming from the sensors. The orientation of edges, which is employed as a pre-processing stage in both artificial vision and touch, is a key cue for object discrimination. Inspired by the encoding of edges in human first-order tactile afferents, we developed a biologically inspired spiking architecture that mimics human tactile perception with computational primitives implementable on low-power subthreshold neuromorphic hardware. The network architecture uses three layers of leaky integrate-and-fire (LIF) neurons to distinguish different edge orientations of a bar pressed on the artificial skin of the iCub robot. We demonstrated that the network can learn the appropriate connectivity through unsupervised spike-based learning, and that the number and spatial distribution of sensitive areas within receptive fields are important for edge orientation discrimination. The unconstrained and random structure of the connectivity among layers can produce unbalanced activity in the output neurons, which are driven by a variable amount of synaptic input. We explored two mechanisms of synaptic normalization (weight normalization and homeostasis) and examined how each is useful during the learning and inference phases. With homeostasis and weight normalization, the network successfully discriminates 35 of 36 orientations (0° to 180°, in 5° increments). Beyond edge orientation discrimination, we modified the network architecture to classify six different touch modalities (poke, press, grab, squeeze, push, and rolling a wheel). We demonstrated the ability of the network to learn appropriate connectivity patterns for this classification, achieving a total accuracy of 88.3%. Furthermore, another application scenario, tactile object shape recognition, was considered because of its importance in robotic manipulation. We showed that a network architecture with two layers of spiking neurons discriminates the tactile object shapes with 100% accuracy after integrating an array of 160 piezoresistive tactile sensors onto which the object shapes are applied.
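    The leaky integrate-and-fire neuron at the core of the described architecture can be sketched in a few lines. The parameters below are generic textbook values rather than the thesis's tuned ones, and no spike-based learning rule is included.

```python
# Sketch of a single leaky integrate-and-fire (LIF) neuron.
# Parameters are generic assumptions, not those used in the thesis.
import numpy as np

def lif_simulate(input_current, dt=1e-3, tau=20e-3, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0, r_m=1.0):
    """Integrate a current trace; return membrane potentials and spike times."""
    v = v_rest
    potentials, spikes = [], []
    for step, i_in in enumerate(input_current):
        dv = (-(v - v_rest) + r_m * i_in) * (dt / tau)   # leaky integration
        v += dv
        if v >= v_thresh:                                # threshold crossing
            spikes.append(step * dt)
            v = v_reset                                  # reset after a spike
        potentials.append(v)
    return np.array(potentials), spikes

# Constant suprathreshold input produces regular spiking.
_, spike_times = lif_simulate(np.full(1000, 1.5))
print(f"{len(spike_times)} spikes in 1 s of simulated input")
```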

    Spatially Distributed Tactile Feedback for Kinesthetic Motion Guidance

    Apraxic stroke patients need to perform repetitive arm movements to regain motor functionality, but they struggle to process the visual feedback provided by typical virtual rehabilitation systems. Instead, we imagine a low-cost sleeve that can measure the movement of the upper limb and provide tactile feedback at key locations. The feedback provided by the tactors should guide the patient through a series of desired movements by allowing him or her to feel limb configuration errors at each instant in time. After discussing the relevant motion capture and actuator options, this paper describes the design and programming of our current prototype, a wearable tactile interface that uses magnetic motion tracking and shaftless eccentric mass motors. The sensors and actuators are attached to the sleeve of an athletic shirt with novel plastic caps, which also help focus the vibration on the user's skin. We connect the motors in current drive for improved performance, and we present a full parametric model of their in situ dynamic response (acceleration output given current input).
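    The "acceleration output given current input" response mentioned at the end can be illustrated with a generic second-order transfer function; the gain, natural frequency and damping below are invented for illustration and are not the paper's identified parameters.

```python
# Hedged sketch: a generic second-order model standing in for the paper's
# parametric motor response (acceleration out, drive current in).
import numpy as np
from scipy import signal

K = 30.0             # steady-state gain, (m/s^2) per ampere (assumed)
WN = 2 * np.pi * 80  # natural frequency ~80 Hz (assumed)
ZETA = 0.3           # damping ratio (assumed)

# a(s) / i(s) = K * wn^2 / (s^2 + 2*zeta*wn*s + wn^2)
motor = signal.TransferFunction([K * WN**2], [1.0, 2 * ZETA * WN, WN**2])

# Step the drive current to 0.1 A and inspect the resulting acceleration.
t = np.linspace(0, 0.1, 1000)
_, accel = signal.step(motor, T=t)
print(f"peak acceleration for a 0.1 A step: {0.1 * accel.max():.2f} m/s^2")
```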