
    Visuo-Haptic Grasping of Unknown Objects through Exploration and Learning on Humanoid Robots

    This thesis addresses the grasping of unknown objects by humanoid robots. Visual information is combined with haptic exploration in order to generate grasp hypotheses. In addition, a grasp metric is learned from simulated training data; it rates the grasp hypotheses by their probability of success and selects the hypothesis with the highest estimated probability. The selected hypothesis is then used to grasp the object with a reactive control strategy. The two core contributions of the thesis are, first, the haptic exploration of unknown objects and, second, the grasping of unknown objects with a novel data-driven grasp metric.
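
    As a minimal, illustrative sketch of the general idea of a learned grasp metric (scoring grasp hypotheses by predicted success probability and picking the best one): the feature layout, the classifier choice (a random forest) and all names below are assumptions for illustration, not the implementation described in the thesis.

    # Hypothetical example: rank grasp hypotheses with a success-probability
    # model trained on simulated grasp outcomes.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Placeholder "simulated" training data: one feature vector per grasp
    # (e.g. wrist pose, finger spread, local surface statistics) and a 0/1
    # label indicating whether the simulated grasp succeeded.
    X_train = np.random.rand(500, 8)
    y_train = (X_train[:, 0] + X_train[:, 3] > 1.0).astype(int)

    metric = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)

    def select_best_grasp(hypotheses):
        """Return the hypothesis with the highest estimated success probability."""
        features = np.array([h["features"] for h in hypotheses])
        p_success = metric.predict_proba(features)[:, 1]
        best = int(np.argmax(p_success))
        return hypotheses[best], float(p_success[best])

    # Usage with three hypothetical grasp hypotheses:
    hypotheses = [{"features": np.random.rand(8)} for _ in range(3)]
    best_grasp, p = select_best_grasp(hypotheses)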

    Active Exploration Using Gaussian Random Fields and Gaussian Process Implicit Surfaces

    In this work we study the problem of exploring surfaces and building compact 3D representations of the environment surrounding a robot through active perception. We propose an online probabilistic framework that merges visual and tactile measurements using Gaussian Random Fields and Gaussian Process Implicit Surfaces. The system investigates incomplete point clouds in order to find a small set of regions of interest, which are then physically explored with a robotic arm equipped with tactile sensors. We show experimental results obtained using a PrimeSense camera, a Kinova Jaco2 robotic arm and Optoforce sensors in different scenarios. We then demonstrate how to use the online framework for object detection and terrain classification.
    Comment: 8 pages, 6 figures, external contents (https://youtu.be/0-UlFRQT0JI)
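
    A hedged sketch of the Gaussian Process Implicit Surface part of such a pipeline (the kernel, the off-surface labels and all data below are generic GPIS assumptions, not the authors' framework): fuse visual and tactile surface points in a single GP whose zero level set models the surface and whose predictive variance points to regions worth touching next.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    visual_pts  = np.random.rand(200, 3)   # placeholder points from a depth camera
    tactile_pts = np.random.rand(20, 3)    # placeholder points from tactile contacts
    surface     = np.vstack([visual_pts, tactile_pts])

    # GPIS training set: surface points map to 0, an interior point to -1,
    # and shifted exterior points to +1 to anchor the sign of the implicit function.
    X = np.vstack([surface, surface.mean(0, keepdims=True), surface + 0.5])
    y = np.concatenate([np.zeros(len(surface)), [-1.0], np.ones(len(surface))])

    gpis = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-3).fit(X, y)

    # Predictive variance highlights poorly observed regions: candidate targets
    # for the next tactile exploration action.
    query = np.random.rand(1000, 3)
    mean, std = gpis.predict(query, return_std=True)
    next_poke = query[np.argmax(std)]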

    It Sounds Cool: Exploring Sonification of Mid-Air Haptic Textures Exploration on Texture Judgments, Body Perception, and Motor Behaviour

    Ultrasonic mid-air haptic technology allows for the perceptual rendering of textured surfaces onto the user's hand. Unlike real textured surfaces, however, mid-air haptic feedback lacks implicit multisensory cues needed to reliably infer a texture's attributes (e.g., its roughness). In this paper, we combined mid-air haptic textures with congruent sound feedback to investigate how sonification could influence people's (1) explicit judgment of the texture attributes, (2) explicit sensations of their own hand, and (3) implicit motor behavior during haptic exploration. Our results showed that audio cues (presented solely or combined with haptics) influenced participants' judgment of the texture attributes (roughness, hardness, moisture and viscosity), produced some hand sensations (the feeling of one's hand being smoother, softer, looser, more flexible, colder, wetter and more natural), and changed participants' speed (moving faster or slower) while exploring the texture. We then conducted a principal component analysis to better understand and visualize these results and conclude with a short discussion on how audio-haptic associations can be used to create embodied experiences in emerging application scenarios in the metaverse.

    Simultaneous Tactile Exploration and Grasp Refinement for Unknown Objects

    This paper addresses the problem of simultaneously exploring an unknown object to model its shape, using tactile sensors on robotic fingers, while also improving finger placement to optimise grasp stability. In many situations, a robot will have only a partial camera view of the near side of an observed object, for which the far side remains occluded. We show how an initial grasp attempt, based on an initial guess of the overall object shape, yields tactile glances of the far side of the object which enable the shape estimate and consequently the successive grasps to be improved. We propose a grasp exploration approach using a probabilistic representation of shape, based on Gaussian Process Implicit Surfaces. This representation enables initial partial vision data to be augmented with additional data from successive tactile glances. This is combined with a probabilistic estimate of grasp quality to refine grasp configurations. When choosing the next set of finger placements, a bi-objective optimisation method is used to mutually maximise grasp quality and improve shape representation during successive grasp attempts. Experimental results show that the proposed approach yields stable grasp configurations more efficiently than a baseline method, while also yielding an improved shape estimate of the grasped object.
    Comment: IEEE Robotics and Automation Letters. Preprint Version. Accepted February, 202
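
    A toy sketch of the bi-objective selection step, using a simple weighted scalarisation over normalised scores; the paper's actual optimiser, grasp-quality measure and uncertainty measure are not reproduced here, and the function names below are hypothetical.

    import numpy as np

    def next_finger_placement(candidates, grasp_quality, shape_uncertainty, w=0.5):
        """Pick the candidate that jointly maximises estimated grasp quality and
        the shape uncertainty at the contact (touching uncertain regions also
        refines the shape model). Both scores are normalised to [0, 1]."""
        q = np.asarray([grasp_quality(c) for c in candidates], dtype=float)
        u = np.asarray([shape_uncertainty(c) for c in candidates], dtype=float)
        q = (q - q.min()) / (q.max() - q.min() + 1e-9)
        u = (u - u.min()) / (u.max() - u.min() + 1e-9)
        score = w * q + (1.0 - w) * u
        return candidates[int(np.argmax(score))]

    # Usage with hypothetical scoring functions over candidate contact points:
    candidates = [np.random.rand(3) for _ in range(50)]
    best = next_finger_placement(candidates,
                                 grasp_quality=lambda c: -np.linalg.norm(c - 0.5),
                                 shape_uncertainty=lambda c: c[2])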

    Embedded Object Detection and Mapping in Soft Materials Using Optical Tactile Sensing

    In this paper, we present a methodology that uses an optical tactile sensor for efficient tactile exploration of embedded objects within soft materials. The methodology consists of an exploration phase, where a probabilistic estimate of the location of the embedded objects is built using a Bayesian approach. The exploration phase is then followed by a mapping phase, which exploits the probabilistic map to reconstruct the underlying topography of the workspace by sampling in more detail the regions where embedded objects are expected. To demonstrate the effectiveness of the method, we tested our approach on an experimental setup that consists of a series of quartz beads located underneath a polyethylene foam, which prevents direct observation of the configuration and requires tactile exploration to recover the location of the beads. We show the performance of our methodology using ten different configurations of the beads, where the proposed approach is able to approximate the underlying configuration. We benchmark our results against a random sampling policy.
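
    The exploration phase can be illustrated with a toy Bayesian grid update; the grid size, the sensor hit/false-alarm rates and the probe-selection rule below are assumptions for illustration, not the parameters or sensor model of the paper.

    import numpy as np

    GRID = (20, 20)
    P_HIT_GIVEN_OBJ  = 0.9   # assumed true-positive rate of the tactile sensor
    P_HIT_GIVEN_FREE = 0.1   # assumed false-positive rate

    belief = np.full(GRID, 0.5)   # uninformative prior over object presence per cell

    def update(cell, hit):
        """Bayesian update of one cell's object-presence probability after probing it."""
        p = belief[cell]
        like_obj  = P_HIT_GIVEN_OBJ  if hit else 1 - P_HIT_GIVEN_OBJ
        like_free = P_HIT_GIVEN_FREE if hit else 1 - P_HIT_GIVEN_FREE
        belief[cell] = like_obj * p / (like_obj * p + like_free * (1 - p))

    def next_probe():
        """Probe where the belief is most uncertain (closest to 0.5)."""
        return np.unravel_index(np.argmin(np.abs(belief - 0.5)), GRID)

    # Usage: probe a cell, record whether the sensor registered an object, update.
    cell = next_probe()
    update(cell, hit=True)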

    "What was Molyneux's Question A Question About?"

    Molyneux asked whether a newly sighted person could distinguish a sphere from a cube by sight alone, given that she was antecedently able to do so by touch. This, we contend, is a question about general ideas. To answer it, we must ask (a) whether spatial locations identified by touch can be identified also by sight, and (b) whether the integration of spatial locations into an idea of shape persists through changes of modality. Posed this way, Molyneux’s Question goes substantially beyond question (a), about spatial locations, alone; for a positive answer to (a) leaves open whether a perceiver might cross-identify locations, but not be able to identify the shapes that collections of locations comprise. We further emphasize that MQ targets general ideas so as to distinguish it from corresponding questions about experiences of shape and about the property of tangible (vs. visual) shape. After proposing a generalized formulation of MQ, we extend earlier work (“Many Molyneux Questions,” Australasian Journal of Philosophy 2020) by showing that MQ does not admit a single answer across the board. Some integrative data-processes transfer across modalities; others do not. Seeing where and how such transfer succeeds and fails in individual cases has much to offer to our understanding of perception and its modalities.