3,851 research outputs found

    Multimodal human hand motion sensing and analysis - a review


    Vision- and tactile-based continuous multimodal intention and attention recognition for safer physical human-robot interaction

    Employing skin-like tactile sensors on robots enhances both the safety and usability of collaborative robots by adding the capability to detect human contact. Unfortunately, simple binary tactile sensors alone cannot determine the context of the human contact -- whether it is a deliberate interaction or an unintended collision that requires safety manoeuvres. Many published methods classify discrete interactions using more advanced tactile sensors or by analysing joint torques. Instead, we propose to augment the intention recognition capabilities of simple binary tactile sensors by adding a robot-mounted camera for human posture analysis. Different interaction characteristics, including touch location, human pose, and gaze direction, are used to train a supervised machine learning algorithm to classify whether a touch is intentional or not, achieving an F1-score of 86%. We demonstrate that multimodal intention recognition is significantly more accurate than monomodal analyses with the collaborative robot Baxter. Furthermore, our method can also continuously monitor interactions that fluidly change between intentional and unintentional by gauging the user's attention through gaze. If a user stops paying attention mid-task, the proposed intention and attention recognition algorithm can activate safety features to prevent unsafe interactions. We also employ a feature reduction technique that reduces the number of inputs to five to achieve a more generalized low-dimensional classifier. This simplification both reduces the amount of training data required and improves real-world classification accuracy. It also renders the method potentially agnostic to the robot and touch sensor architectures while achieving a high degree of task adaptability.
    Comment: 11 pages, 8 figures, preprint under review
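    The abstract above reduces multimodal intention recognition to a small supervised classifier over a handful of interaction features. A minimal sketch of that idea, assuming synthetic data and a plain logistic-regression model (the five feature values and all numbers here are illustrative, not the paper's implementation):

```python
import math
import random

def f1_score(y_true, y_pred):
    """Harmonic mean of precision and recall for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

class IntentionClassifier:
    """Logistic regression over a low-dimensional feature vector,
    e.g. touch location, pose, and gaze features (names are hypothetical)."""

    def __init__(self, n_features=5, lr=0.5, epochs=200):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr
        self.epochs = epochs

    def _sigmoid(self, z):
        z = max(-60.0, min(60.0, z))  # clamp for numerical safety
        return 1.0 / (1.0 + math.exp(-z))

    def _prob(self, xi):
        return self._sigmoid(sum(w * x for w, x in zip(self.w, xi)) + self.b)

    def fit(self, X, y):
        # Plain per-sample gradient descent on the logistic loss.
        for _ in range(self.epochs):
            for xi, yi in zip(X, y):
                err = self._prob(xi) - yi
                self.w = [w - self.lr * err * x for w, x in zip(self.w, xi)]
                self.b -= self.lr * err

    def predict(self, X):
        return [1 if self._prob(xi) >= 0.5 else 0 for xi in X]

# Synthetic, well-separated data: intentional touches cluster around 1.0
# in each feature, unintentional around 0.0.
random.seed(0)
def sample(intentional):
    base = 1.0 if intentional else 0.0
    return [base + random.uniform(-0.3, 0.3) for _ in range(5)]

X = [sample(True) for _ in range(40)] + [sample(False) for _ in range(40)]
y = [1] * 40 + [0] * 40
clf = IntentionClassifier()
clf.fit(X, y)
pred = clf.predict(X)
```

    On this toy separable data the classifier fits cleanly; the paper's 86% F1 reflects real sensor data, which is far noisier.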

    Tactile sensors for robot handling

    First and second generation robots have been used cost effectively in high‐volume ‘fixed’ or ‘hard’ automated manufacturing/assembly systems. They are ‘limited‐ability’ devices using simple logic elements or primitive sensory feedback. However, in the unstructured environment of most manufacturing plants it is often necessary to locate, identify, orientate and position randomly presented components. Visual systems have been researched and developed to provide a coarse resolution outline of objects. More detailed and precise definition of parts is usually obtained by high resolution tactile sensing arrays. This paper reviews and discusses the current state of the art in tactile sensing.

    Under Pressure: Learning to Detect Slip with Barometric Tactile Sensors

    Despite the utility of tactile information, tactile sensors have yet to be widely deployed in industrial robotics settings -- part of the challenge lies in identifying slip and other key events from the tactile data stream. In this paper, we present a learning-based method to detect slip using barometric tactile sensors. Although these sensors have a low resolution, they have many other desirable properties, including high reliability and durability, a very slim profile, and a low cost. We are able to achieve slip detection accuracies of greater than 91% while being robust to the speed and direction of the slip motion. Further, we test our detector on two robot manipulation tasks involving common household objects and demonstrate successful generalization to real-world scenarios not seen during training. We show that barometric tactile sensing technology, combined with data-driven learning, is potentially suitable for complex manipulation tasks such as slip compensation.
    Comment: Submitted to the RoboTac Workshop at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'21), Prague, Czech Republic, Sept 27 - Oct 1, 2021
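    The abstract describes the learned detector only at a high level. As an illustration of the underlying signal problem, a very simple hand-crafted baseline (a hypothetical threshold on the variance of pressure differences in a sliding window; this is not the paper's method, and the threshold is an assumption) could look like:

```python
import math

def slip_score(pressure_window):
    """Variance of first differences over a window of barometric readings.
    Slip excites high-frequency pressure fluctuations that a stable
    grasp does not, so this variance spikes during slip."""
    diffs = [b - a for a, b in zip(pressure_window, pressure_window[1:])]
    mean = sum(diffs) / len(diffs)
    return sum((d - mean) ** 2 for d in diffs) / len(diffs)

def detect_slip(pressure_window, threshold=0.01):
    """Flag slip when the fluctuation score exceeds a fixed threshold."""
    return slip_score(pressure_window) > threshold

# Synthetic examples: a steady grasp with slow drift vs. an
# oscillating (slipping) pressure signal.
stable = [100.0 + 0.001 * i for i in range(50)]
slipping = [100.0 + 0.5 * math.sin(2.0 * i) for i in range(50)]
```

    A learned detector, as in the paper, replaces the fixed threshold with a model trained on labelled windows, which is what buys robustness to slip speed and direction.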

    Sensory augmentation with distal touch: The tactile helmet project

    The Tactile Helmet is designed to augment a wearer's senses with a long range sense of touch. Tactile specialist animals such as rats and mice are capable of rapidly acquiring detailed information about their environment from their whiskers by using task-sensitive strategies. Providing similar information about the nearby environment, in tactile form, to a human operator could prove invaluable for search and rescue operations, or for partially-sighted people. Two key aspects of the Tactile Helmet are sensory augmentation, and active sensing. A haptic display is used to provide the user with ultrasonic range information. This can be interpreted in addition to, rather than instead of, visual or auditory information. Active sensing systems "are purposive and information-seeking sensory systems, involving task specific control of the sensory apparatus" [1]. The integration of an accelerometer allows the device to actively gate the delivery of sensory information to the user, depending on their movement. Here we describe the hardware, sensory transduction and characterisation of the Tactile Helmet device, before outlining potential use cases and benefits of the system. © 2013 Springer-Verlag Berlin Heidelberg
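    The movement-contingent gating described above can be sketched as a simple rule: map ultrasonic range to vibration intensity, but deliver it only while the accelerometer reports head motion. All thresholds, units, and function names below are illustrative assumptions, not the device's actual firmware:

```python
import math

def head_is_moving(accel_xyz, gravity=9.81, threshold=0.5):
    """Gate on how far the acceleration magnitude deviates from rest,
    where only gravity is measured (threshold in m/s^2, assumed)."""
    magnitude = math.sqrt(sum(a * a for a in accel_xyz))
    return abs(magnitude - gravity) > threshold

def haptic_intensity(range_m, max_range_m=3.0):
    """Map ultrasonic range to vibration intensity in [0, 1]:
    closer obstacles produce stronger vibration."""
    clipped = max(0.0, min(range_m, max_range_m))
    return 1.0 - clipped / max_range_m

def gated_output(accel_xyz, range_m):
    """Deliver vibration only while the wearer is moving, in the spirit
    of active sensing: information accompanies self-motion."""
    return haptic_intensity(range_m) if head_is_moving(accel_xyz) else 0.0
```

    The design choice mirrors whisking animals: stimulation tied to the wearer's own movement is easier to interpret and less fatiguing than a continuous display.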