
    Active Clothing Material Perception using Tactile Sensing and Deep Learning

    Humans represent and discriminate objects in the same category by their properties, and an intelligent robot should be able to do the same. In this paper, we build a robot system that autonomously perceives object properties through touch. We work on the common object category of clothing. The robot moves under the guidance of an external Kinect sensor and squeezes the clothes with a GelSight tactile sensor, then recognizes 11 properties of the clothing from the tactile data. These include physical properties, such as thickness, fuzziness, softness, and durability, and semantic properties, such as wearing season and preferred washing method. We collect a dataset of 153 varied pieces of clothing and conduct 6616 robot exploration iterations on them. To extract useful information from the high-dimensional sensory output, we apply Convolutional Neural Networks (CNNs) to the tactile data for recognizing clothing properties and to the Kinect depth images for selecting exploration locations. Experiments show that, using the trained neural networks, the robot can autonomously explore unknown clothes and learn their properties. This work proposes a new framework for active tactile perception with a combined vision-touch system and has the potential to enable robots to help humans with varied clothing-related housework. Comment: Accepted to ICRA 2018.
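
    As a rough illustration of the kind of pipeline described above, the sketch below shows a small multi-head CNN that maps a tactile image to the 11 clothing properties. It is a minimal PyTorch sketch, not the authors' network: the layer sizes, the three-way discretization of each property, and the input resolution are all assumptions.

        import torch
        import torch.nn as nn

        NUM_PROPERTIES = 11        # thickness, fuzziness, softness, durability, ... (from the abstract)
        CLASSES_PER_PROPERTY = 3   # assumed discretization, e.g. low / medium / high

        class TactilePropertyNet(nn.Module):
            """Shared convolutional features with one classification head per property."""
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
                    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                    nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.heads = nn.ModuleList(
                    nn.Linear(128, CLASSES_PER_PROPERTY) for _ in range(NUM_PROPERTIES)
                )

            def forward(self, x):
                z = self.features(x).flatten(1)          # (batch, 128)
                return [head(z) for head in self.heads]  # 11 sets of class logits

        model = TactilePropertyNet()
        logits = model(torch.randn(1, 3, 224, 224))      # one simulated tactile frame

    The Kinect-side network for choosing squeeze locations, also mentioned in the abstract, is omitted from this sketch.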

    Proprioceptive Learning with Soft Polyhedral Networks

    Proprioception is the "sixth sense" that detects limb postures with motor neurons. It requires a natural integration between the musculoskeletal system and sensory receptors, which is challenging for modern robots that aim for lightweight, adaptive, and sensitive designs at low cost. Here, we present the Soft Polyhedral Network with embedded vision for physical interactions, capable of adaptive kinesthesia and viscoelastic proprioception by learning kinetic features. This design enables passive adaptation to omni-directional interactions, visually captured by a miniature high-speed motion-tracking system embedded inside for proprioceptive learning. The results show that the soft network can infer real-time 6D forces and torques with accuracies of 0.25/0.24/0.35 N and 0.025/0.034/0.006 Nm in dynamic interactions. We also incorporate viscoelasticity into proprioception during static adaptation by adding a creep and relaxation modifier to refine the predicted results. The proposed soft network combines simplicity of design, omni-adaptation, and high-accuracy proprioceptive sensing, making it a versatile, low-cost solution for robotics, with more than 1 million use cycles, for tasks such as sensitive and competitive grasping and touch-based geometry reconstruction. This study offers new insights into vision-based proprioception for soft robots in adaptive grasping, soft manipulation, and human-robot interaction. Comment: 20 pages, 10 figures, 2 tables, submitted to the International Journal of Robotics Research for review.
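
    A minimal sketch of the proprioceptive-learning idea, assuming the embedded camera yields per-frame pose changes for a handful of fiducial markers. The marker count, input encoding, and layer widths below are illustrative assumptions, not the published model, and the viscoelastic creep/relaxation modifier is only noted in a comment.

        import torch
        import torch.nn as nn

        NUM_MARKERS = 4            # assumed number of tracked fiducials inside the soft network
        IN_DIM = NUM_MARKERS * 6   # assumed 6-DoF pose change per marker

        # Regressor from marker motion to a 6D wrench: Fx, Fy, Fz [N], Tx, Ty, Tz [Nm].
        wrench_net = nn.Sequential(
            nn.Linear(IN_DIM, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, 6),
        )

        marker_motion = torch.randn(1, IN_DIM)   # one frame of tracked marker motion
        wrench = wrench_net(marker_motion)       # dynamic-phase estimate
        # For static adaptation, a creep/relaxation term would be added on top of `wrench`
        # to account for viscoelasticity, as the abstract describes.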

    Object Manipulation and Grip Force Control Using Tactile Sensors (Objekt-Manipulation und Steuerung der Greifkraft durch Verwendung von Taktilen Sensoren)

    This dissertation describes a new type of tactile sensor and an improved version of the dynamic tactile sensing approach that provides a regularly updated, accurate estimate of the minimum applied force needed for controlling gripper manipulation. A pre-slip sensing algorithm is proposed and implemented on a two-finger robot gripper. An algorithm that discriminates between types of contact surface and recognizes objects at the contact stage is also proposed. A technique for recognizing objects using tactile sensor arrays and a method based on quadric surface parameters for classifying grasped objects are described. The tactile arrays can recognize surface types on contact, making it possible for the tactile system to recognize translation, rotation, and scaling of an object independently.
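
    The dissertation's pre-slip algorithm is not spelled out in the abstract, so the sketch below only illustrates one generic formulation of pre-slip grip control: tighten the grip whenever the tangential-to-normal force ratio approaches an assumed friction coefficient. The coefficient, margin, and force step are placeholder values, not values from the thesis.

        MU = 0.6           # assumed static friction coefficient of the grasped object
        MARGIN = 0.8       # act once the ratio exceeds 80% of MU
        FORCE_STEP = 0.2   # grip-force increment per control cycle, N (placeholder)

        def adjust_grip(normal_force, tangential_force, grip_command):
            """One control cycle: raise the grip command if slip is imminent."""
            if normal_force <= 0.0:
                return grip_command + FORCE_STEP       # no reliable contact reading: tighten cautiously
            ratio = abs(tangential_force) / normal_force
            if ratio > MARGIN * MU:                    # pre-slip condition
                grip_command += FORCE_STEP
            return grip_command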

    Doctor of Philosophy

    Tactile sensors are a class of sensors widely developed for the transduction of touch, force, and pressure in robotics, contact sensing, and gait analysis. These sensors measure and register interactions between contact surfaces and the surrounding environment. As they have gained use in robotics and gait analysis, there is a need for sensors that are ultra-flexible, highly reliable, and capable of measuring pressure and two-axis shear simultaneously; currently available sensors do not achieve all of these qualities. The goal of this work is to design and develop such a flexible tactile sensor array based on a capacitive sensing scheme, which we call the flexible tactile imager (FTI). The design can be easily multiplexed into a high-density array of 676 multi-fingered capacitors that measure pressure and two-axis shear simultaneously while maintaining flexibility and reliability. The normal- and shear-stress sensitivities of the FTI are 0.74/MPa and 79.5/GPa, respectively, and the resolvable displacement and velocity are as low as 60 ”m and 100 ”m/s, respectively. The FTI can detect pressure and shear contours of objects rolling on top of it and can measure the microdisplacements and microvelocities that are desirable during gait analysis.
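
    To make the quoted sensitivities concrete, the sketch below decodes one taxel of a capacitive pressure/shear sensor from fractional capacitance changes, using 0.74/MPa for normal stress and 79.5/GPa for shear. The four-capacitor differential layout per taxel is an assumption for illustration, not the FTI's documented electrode design.

        S_NORMAL = 0.74e-6   # fractional capacitance change per Pa (0.74 per MPa)
        S_SHEAR = 79.5e-9    # fractional capacitance change per Pa (79.5 per GPa)

        def decode_taxel(c1, c2, c3, c4):
            """Decode one taxel from the fractional capacitance changes of its four sub-capacitors."""
            normal = (c1 + c2 + c3 + c4) / 4.0 / S_NORMAL        # normal stress, Pa
            shear_x = ((c1 + c2) - (c3 + c4)) / 2.0 / S_SHEAR    # shear along x, Pa
            shear_y = ((c1 + c3) - (c2 + c4)) / 2.0 / S_SHEAR    # shear along y, Pa
            return normal, shear_x, shear_y

        print(decode_taxel(1.2e-3, 1.1e-3, 0.9e-3, 0.8e-3))      # illustrative readings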

    More Than a Feeling: Learning to Grasp and Regrasp using Vision and Touch

    For humans, the process of grasping an object relies heavily on rich tactile feedback. Most recent robotic grasping work, however, has been based only on visual input and thus cannot easily benefit from feedback after initiating contact. In this paper, we investigate how a robot can learn to use tactile information to iteratively and efficiently adjust its grasp. To this end, we propose an end-to-end action-conditional model that learns regrasping policies from raw visuo-tactile data. This model -- a deep, multimodal convolutional network -- predicts the outcome of a candidate grasp adjustment and then executes a grasp by iteratively selecting the most promising actions. Our approach requires neither calibration of the tactile sensors nor any analytical modeling of contact forces, thus reducing the engineering effort required to obtain efficient grasping policies. We train our model with data from about 6,450 grasping trials on a two-finger gripper equipped with GelSight high-resolution tactile sensors on each finger. Across extensive experiments, our approach outperforms a variety of baselines at (i) estimating grasp adjustment outcomes, (ii) selecting efficient grasp adjustments for quick grasping, and (iii) reducing the amount of force applied at the fingers while maintaining competitive performance. Finally, we study the choices made by our model and show that it has successfully acquired useful and interpretable grasping behaviors. Comment: 8 pages. Published in IEEE Robotics and Automation Letters (RA-L). Website: https://sites.google.com/view/more-than-a-feelin
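
    The action-conditional idea lends itself to a compact sketch: score each candidate grasp adjustment given the current visuo-tactile features and execute the best one. The feature and action dimensions, the scoring network, and the random inputs below are assumptions for illustration; the paper's actual model is a deep multimodal convolutional network over raw sensor data.

        import torch
        import torch.nn as nn

        FEATURE_DIM = 256   # assumed embedding of the current visuo-tactile observation
        ACTION_DIM = 4      # assumed adjustment parameters, e.g. (dx, dy, dtheta, dforce)

        outcome_model = nn.Sequential(            # predicts a grasp-success logit
            nn.Linear(FEATURE_DIM + ACTION_DIM, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )

        def select_action(features, candidate_actions):
            """Greedily pick the candidate adjustment with the highest predicted success."""
            tiled = features.expand(candidate_actions.shape[0], -1)
            scores = outcome_model(torch.cat([tiled, candidate_actions], dim=1)).squeeze(1)
            return candidate_actions[scores.argmax()]

        best = select_action(torch.randn(1, FEATURE_DIM), torch.randn(16, ACTION_DIM))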

    Development and Integration of Tactile Sensing System

    To grasp and manipulate complex objects, robots require information about the interaction between the end effector and the object. This work describes the integration of a low-cost 3-axis tactile sensing system into two different robotic systems and the measurement of some of these complex interactions. The sensor itself is small, lightweight, and compliant, so it can be integrated into a variety of end effectors and at various locations on them (e.g. wrapped around a finger). To improve usability and data collection, a custom interface board and a ROS (Robot Operating System) package were developed to read the sensor data and interface with the robots and grippers. Sensor data has been collected from four different tasks: (1) pick and place of non-conductive and conductive objects, (2) wrist-based manipulation, (3) peeling tape, and (4) human interaction with a grasped object. In the last task, a closed-loop controller adjusts the grip force on the grasped object while the human interacts with it.
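
    A minimal sketch of the closed-loop grip behaviour in the last task: nudge the grip command so the measured fingertip force tracks a target while a person interacts with the object. The target force, gain, and command limits are placeholder values, and the actual ROS interfaces of the described package are not reproduced here.

        TARGET_FORCE = 2.0            # desired fingertip normal force, N (placeholder)
        KP = 0.5                      # proportional gain (placeholder)
        MIN_CMD, MAX_CMD = 0.0, 10.0  # assumed limits on the grip command

        def grip_update(measured_force, grip_command):
            """One loop iteration: move the grip command toward the target contact force."""
            grip_command += KP * (TARGET_FORCE - measured_force)
            return min(max(grip_command, MIN_CMD), MAX_CMD)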

    Tactile force-sensing for dynamic gripping using piezoelectric force sensors

    Thesis (M. Tech.) -- Central University of Technology, Free State, 200

    Tactile Sensing for Robotic Applications

    This chapter provides an overview of tactile sensing in robotics and attempts to answer three basic questions: ‱ What is meant by tactile sensing? ‱ Why is tactile sensing important? ‱ How is tactile sensing achieved? The chapter is organized to answer these questions in sequence. Tactile sensing has often been equated with force sensing, which is not wholly true; to clarify such misconceptions, tactile sensing is defined in section 2. Why tactile sensing is important for robotics, and which parameters tactile sensors need to measure to successfully perform various tasks, are discussed in section 3. An overview of how tactile sensing has been achieved is given in section 4, where a number of technologies and transduction methods that have been used to improve the tactile sensing capability of robotic devices are discussed. The lack of any tactile analog to Complementary Metal Oxide Semiconductor (CMOS) or Charge-Coupled Device (CCD) optical arrays has often been cited as one of the reasons for the slow development of tactile sensing vis-à-vis other sense modalities such as vision. Our own contribution, the development of tactile sensing arrays using piezoelectric polymers and silicon micromachining, is an attempt toward achieving a tactile analog of CMOS optical arrays. The first-phase implementation of these tactile sensing arrays is discussed in section 5. Section 6 concludes the chapter with a brief discussion of the present status of tactile sensing and the challenges that remain to be solved.
    • 
