32 research outputs found

    Color-Coded Fiber-Optic Tactile Sensor for an Elastomeric Robot Skin

    Full text link
    The sense of touch is essential for reliable mapping between the environment and a robot that physically interacts with objects. Presumably, an artificial tactile skin would facilitate safe interaction of robots with the environment. In this work, we present our color-coded tactile sensor, which incorporates plastic optical fibers (POFs), transparent silicone rubber, and an off-the-shelf color camera. The processing electronics are placed away from the sensing surface to make the sensor robust to harsh environments. Contact localization is possible thanks to the lower number of light sources compared to the number of camera POFs. Classical machine learning techniques and a hierarchical classification scheme were used for contact localization. Specifically, we generated the mapping from stimulation to sensation of a robotic perception system using our sensor. We achieved a force sensing range of up to 18 N with a force resolution of around 3.6 N and a spatial resolution of 8 mm. The color-coded tactile sensor is suitable for tactile exploration and might enable further innovations in robust tactile sensing. Comment: Presented at ICRA 2019, Montreal
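    The hierarchical classification scheme mentioned above can be sketched as a coarse-to-fine, two-stage nearest-centroid lookup on a color feature vector. All names and centroid values below are illustrative assumptions, not the paper's actual data or pipeline:

    ```python
    import numpy as np

    # Hypothetical sketch of hierarchical contact localization:
    # stage 1 picks a sensor region, stage 2 picks a taxel within that region,
    # each by nearest-centroid matching on an RGB feature vector.
    region_centroids = np.array([[255, 0, 0], [0, 255, 0], [0, 0, 255]], float)
    taxel_centroids = {  # per-region taxel color centers (illustrative values)
        0: np.array([[250, 10, 10], [200, 40, 40]], float),
        1: np.array([[10, 250, 10], [40, 200, 40]], float),
        2: np.array([[10, 10, 250], [40, 40, 200]], float),
    }

    def localize(feature):
        """Return (region, taxel) indices for an RGB feature vector."""
        region = int(np.argmin(np.linalg.norm(region_centroids - feature, axis=1)))
        taxel = int(np.argmin(np.linalg.norm(taxel_centroids[region] - feature, axis=1)))
        return region, taxel
    ```

    A reddish reading such as `localize(np.array([245.0, 15.0, 12.0]))` would resolve first to the red-coded region and then to its closest taxel.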

    iCLAP: Shape Recognition by Combining Proprioception and Touch Sensing

    Get PDF
    The work presented in this paper was partially supported by the Engineering and Physical Sciences Research Council (EPSRC) Grant (Ref: EP/N020421/1) and the King’s-China Scholarship Council Ph.D. scholarship

    Visuospatial Integration: Paleoanthropological and Archaeological Perspectives

    Get PDF
    The visuospatial system integrates inner and outer functional processes, organizing spatial, temporal, and social interactions between the brain, body, and environment. These processes involve sensorimotor networks like the eye–hand circuit, which is especially important to primates, given their reliance on vision and touch as primary sensory modalities and the use of the hands in social and environmental interactions. At the same time, visuospatial cognition is intimately connected with memory, self-awareness, and simulation capacity. In the present article, we review issues associated with investigating visuospatial integration in extinct human groups through the use of anatomical and behavioral data gleaned from the paleontological and archaeological records. In modern humans, paleoneurological analyses have demonstrated noticeable and unique morphological changes in the parietal cortex, a region crucial to visuospatial management. Archaeological data provides information on hand–tool interaction, the spatial behavior of past populations, and their interaction with the environment. Visuospatial integration may represent a critical bridge between extended cognition, self-awareness, and social perception. As such, visuospatial functions are relevant to the hypothesis that human evolution is characterized by changes in brain–body–environment interactions and relations, which enhance integration between internal and external cognitive components through neural plasticity and the development of a specialized embodiment capacity. We therefore advocate the investigation of visuospatial functions in past populations through the paleoneurological study of anatomical elements and archaeological analysis of visuospatial behaviors

    Symbiotic human-robot collaborative assembly

    Get PDF

    Touch-Driven Control of a Dexterous Robot Arm

    No full text
    Robots have improved industrial processes, most recognizably in conveyor-belt assembly systems, and have the potential to bring even more benefits to our society in transportation, exploration of dangerous zones, the deep sea or even other planets, health care, and our everyday life. A major barrier to their escape from fenced industrial areas to environments co-shared with humans is their poor skill in physical interaction tasks, including manipulation of objects. While dexterity in manipulation is not affected by blindness in humans, it dramatically decreases in robots. With no visual perception, robot operations are limited to static environments, whereas the real world is a highly variant environment. In this thesis, we propose a different approach that considers controlling contact between a robot and the environment during physical interactions. However, current physical interaction control approaches are limited in terms of the range of tasks that can be performed. To allow robots to perform more tasks, we derive tactile features representing deformations of the mechanically compliant sensing surface of a tactile sensor and incorporate these features into a robot controller via touch-dependent and task-dependent tactile feature mapping matrices. As a first contribution, we show how image processing algorithms can be used to discover the underlying three-dimensional structure of a contact frame between an object and an array of pressure sensing elements with a mechanically compliant surface, attached onto a robot arm's end-effector interacting with this object. These algorithms produce as outputs the so-called tactile features. As a second contribution, we design a tactile servoing controller that combines these tactile features with a position/torque controller of the robot arm. It allows the end-effector of the arm to steer the contact frame in a desired manner by regulating errors in these features. Finally, as a last contribution, we extend this controller by adding a task description layer to address four common issues in robotics: exploration, manipulation, recognition, and co-manipulation of objects. Throughout this thesis, we emphasize developing algorithms that work not only with simulated robots but also with real ones. Thus, all these contributions have been evaluated in experiments conducted with at least one real robot. In general, this work aims to provide the robotics community with a unified framework that will allow robot arms to be more dexterous and autonomous. Preliminary work is proposed for extending this framework to perform tasks that involve multicontact control with multifingered robot hands
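    The tactile-servoing idea described above, regulating errors in tactile features through task-dependent mapping matrices, can be sketched as a proportional feature-space controller. The feature Jacobian, selector matrix, and gain below are illustrative placeholders, not quantities from the thesis:

    ```python
    import numpy as np

    # Sketch of a tactile servoing law: map a tactile feature error
    # e = s - s_des to an end-effector twist command through the
    # (assumed) feature Jacobian J_s, then mask out non-task DoFs
    # with a task-dependent selector matrix.
    def tactile_servo_twist(s, s_des, J_s, selector, gain=1.0):
        """Return a 6-DoF twist command that drives s toward s_des."""
        e = s - s_des
        twist = -gain * np.linalg.pinv(J_s) @ e  # least-squares inverse mapping
        return selector @ twist                   # keep only task-relevant DoFs
    ```

    With a 3-dimensional feature vector and a 6-DoF arm, `J_s` is 3x6 and `selector` is a 6x6 (often diagonal) matrix choosing which Cartesian directions the tactile task is allowed to command.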

    Tactile sensing in dexterous robot hands – review

    No full text
    Tactile sensing is an essential element of autonomous dexterous robot hand manipulation. It provides information about forces of interaction and surface properties at points of contact between the robot fingers and the objects. Recent advancements in robot tactile sensing have led to the development of many computational techniques that exploit this important sensory channel. This paper reviews the current state of the art in manipulation and grasping applications that involve an artificial sense of touch and discusses the pros and cons of each technique. The main issues of artificial tactile sensing are addressed. General requirements of a tactile sensor are briefly discussed and the main transduction technologies are analyzed. Twenty-eight tactile sensors, each integrated into a robot hand, are classified according to their transduction types and applications. Previously issued reviews focused on the hardware of tactile sensors, whereas we present an overview of algorithms and tactile feedback-based control systems that exploit the signals from these sensors. The applications of these algorithms include grasp stability estimation, tactile object recognition, tactile servoing, and force control. Drawing from advancements in tactile sensing technology and taking into consideration its drawbacks, this paper outlines possible new directions of research in dexterous manipulation

    VibroTouch: Active Tactile Sensor for Contact Detection and Force Sensing via Vibrations

    No full text
    Accurate and fast contact detection between a robot manipulator and objects is crucial for safe robot–object and human–robot interactions. Traditional collision detection techniques have relied on force–torque sensors and Coulomb friction cone estimation. However, the strain gauges used in conventional force sensors require low-noise, high-precision electronics to deliver the signal to the final user, and the Signal-to-Noise Ratio (SNR) of these devices remains an issue in light contact detection. On the other hand, Eccentric Rotating Mass (ERM) motors are very sensitive to subtle touch, as their vibrating resonant state is lost immediately upon contact; the vibration, in this case, plays a core role in triggering the tactile event. This project’s primary goal is to use generated and received vibrations to establish the scope of object properties that can be obtained through low-frequency generation on one end and Fourier analysis of the accelerometer data on the other. The main idea behind the system is the phenomenon of change in vibration propagation patterns depending on the grip properties. Moreover, the project aims to gather enough information on vibration feedback from objects of various properties and compare them. These data sets are further analyzed in terms of frequency and applied grip force correlations in order to prepare the ground for pattern extraction and recognition based on the physical properties of an object
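    The Fourier-analysis step described above can be sketched as locating the dominant peak of the accelerometer spectrum; a shift or attenuation of that peak would then signal contact. The sample rate and test signal here are assumed values, not from the paper:

    ```python
    import numpy as np

    FS = 1000.0  # Hz, assumed accelerometer sample rate

    def dominant_frequency(signal, fs=FS):
        """Return the frequency (Hz) of the strongest non-DC FFT bin."""
        windowed = signal * np.hanning(len(signal))  # reduce spectral leakage
        spectrum = np.abs(np.fft.rfft(windowed))
        spectrum[0] = 0.0                            # ignore the DC component
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        return freqs[int(np.argmax(spectrum))]
    ```

    Tracking this value over successive windows while the ERM motor is driven at a fixed frequency would expose the resonance change that the abstract associates with light contact.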

    DEEP VIBRO-TACTILE PERCEPTION FOR SIMULTANEOUS TEXTURE IDENTIFICATION, SLIP DETECTION, AND SPEED ESTIMATION

    No full text
    Autonomous dexterous manipulation relies on the ability to recognize an object and detect its slippage. Dynamic tactile signals are important for object recognition and slip detection. An object can be identified from the signals generated at contact points during tactile interaction. The use of vibrotactile sensors can increase the accuracy of texture recognition and preempt the slippage of a grasped object. In this work, we present a Deep Learning (DL) based method for simultaneous texture recognition and slip detection. The method detects slip and non-slip events, estimates the velocity, and discriminates textures, all within 17 ms. We evaluate the method on three objects grasped using an industrial gripper with accelerometers installed on its fingertips. A comparative analysis of convolutional neural networks (CNNs), feed-forward neural networks, and long short-term memory networks confirmed that deep CNNs have a higher generalization accuracy. We also evaluated the performance of the most accurate method for different signal bandwidths, which showed that a bandwidth of 125 Hz is enough to classify textures with 80% accuracy
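    The bandwidth experiment above (classifying textures from signals limited to 125 Hz) presupposes a band-limiting preprocessing step, which can be sketched as an FFT-domain low-pass filter. This is a generic illustration; the paper's actual preprocessing pipeline is not specified in the abstract:

    ```python
    import numpy as np

    def bandlimit(signal, fs, cutoff=125.0):
        """Zero out FFT bins above `cutoff` Hz and return the filtered signal."""
        spec = np.fft.rfft(signal)
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        spec[freqs > cutoff] = 0.0        # brick-wall low-pass in the frequency domain
        return np.fft.irfft(spec, n=len(signal))
    ```

    Feeding classifiers windows filtered at progressively lower cutoffs is one way to reproduce the kind of bandwidth-versus-accuracy comparison the abstract reports.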

    Tactile sensing in dexterous robot hands — Review

    No full text
    Tactile sensing is an essential element of autonomous dexterous robot hand manipulation. It provides information about forces of interaction and surface properties at points of contact between the robot fingers and the objects. Recent advancements in robot tactile sensing have led to the development of many computational techniques that exploit this important sensory channel. This paper reviews the current state of the art in manipulation and grasping applications that involve an artificial sense of touch and discusses the pros and cons of each technique. The main issues of artificial tactile sensing are addressed. General requirements of a tactile sensor are briefly discussed and the main transduction technologies are analyzed. Twenty-eight tactile sensors, each integrated into a robot hand, are classified according to their transduction types and applications. Previously issued reviews focused on the hardware of tactile sensors, whereas we present an overview of algorithms and tactile feedback-based control systems that exploit the signals from these sensors. The applications of these algorithms include grasp stability estimation, tactile object recognition, tactile servoing, and force control. Drawing from advancements in tactile sensing technology and taking into consideration its drawbacks, this paper outlines possible new directions of research in dexterous manipulation

    Touch driven controller and tactile features for physical interactions

    No full text
    We propose an approach that considers controlling contact between a robot and the environment during physical interactions. Current physical interaction control approaches are limited in terms of the range of tasks that can be performed. To allow robots to perform more tasks, we derive tactile features representing deformations of the mechanically compliant sensing surface of a tactile sensor and incorporate these features into a robot controller, akin to a visual servo, via touch- and task-dependent tactile feature mapping matrices. As a first contribution, we derive tactile features to localize a contact coordinate frame between an object and an array of pressure sensing elements, with a mechanically compliant surface, attached onto a robot arm end-effector interacting with the object. As a second contribution, we propose tactile projection matrices to design a tactile servoing controller that combines these tactile features with a Cartesian impedance controller of the robot arm. These matrices convert the proposed tactile features to balance not only normal forces but also torques about the sensor's axes. This allows the end-effector to steer the contact frame in a desired manner by regulating errors in the tactile features, addressing several common issues in robotics: exploration and co-manipulation