3,207 research outputs found

    Feeling the Shape: Active Exploration Behaviors for Object Recognition With a Robotic Hand

    Get PDF
    Autonomous exploration in robotics is a crucial feature for achieving robust and safe systems capable of interacting with and recognizing their surrounding environment. In this paper, we present a method for object recognition using a three-fingered robotic hand that actively explores interesting object locations to reduce uncertainty. We present a novel probabilistic perception approach with a Bayesian formulation to iteratively accumulate evidence from robot touch. Exploration of better locations for perception is performed by familiarity and novelty exploration behaviors, which intelligently control the robot hand to move toward locations with low and high levels of interestingness, respectively. These are active behaviors that, similar to the exploratory procedures observed in humans, allow robots to autonomously explore locations they believe contain interesting information for recognition. The active behaviors are validated with object recognition experiments in both offline and real-time modes. Furthermore, the effects of inhibiting the active behaviors are analyzed with a passive exploration strategy. The results from the experiments demonstrate not only the accuracy of our proposed methods but also their benefits for active robot control to intelligently explore and interact with the environment.
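
    The iterative Bayesian accumulation of evidence described above can be sketched as a recursive belief update over object hypotheses. A minimal illustration, assuming a made-up likelihood table, feature alphabet, and decision threshold (none of these values come from the paper):

    ```python
    import numpy as np

    # Hypothetical likelihoods P(tactile feature | object), one row per object.
    # A real system would learn these from training contacts.
    likelihood = np.array([
        [0.6, 0.3, 0.1],   # object A
        [0.2, 0.5, 0.3],   # object B
        [0.1, 0.2, 0.7],   # object C
    ])
    objects = ["A", "B", "C"]

    def bayes_update(belief, feature):
        """One recursive Bayesian update from a single touch observation."""
        posterior = belief * likelihood[:, feature]
        return posterior / posterior.sum()

    belief = np.full(len(objects), 1.0 / len(objects))  # uniform prior
    contacts = [2, 2, 1, 2]                             # simulated contact features
    THRESHOLD = 0.8                                     # assumed decision threshold

    for t, obs in enumerate(contacts, start=1):
        belief = bayes_update(belief, obs)
        print(f"contact {t}: belief = {np.round(belief, 3)}")
        if belief.max() > THRESHOLD:    # stop exploring once uncertainty is low
            print("decision:", objects[int(belief.argmax())])
            break
    ```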

    Efficient Bayesian Exploration for Soft Morphology-Action Co-optimization

    Get PDF
    UK Agriculture and Horticulture Development Board (Project CP 172)

    Active Control for Object Perception and Exploration with a Robotic Hand

    Get PDF
    We present an investigation of active control for intelligent object exploration using touch with a robotic hand. First, uncertainty from the exploration is reduced by a probabilistic method based on the accumulation of evidence through interaction with an object of interest. Second, an intrinsic motivation approach allows the robot hand to perform intelligent active control of movements to explore interesting locations of the object. Passive and active perception and exploration were implemented in simulated and real environments to compare their benefits in accuracy and reaction time. The validation of the proposed method was performed with an object recognition task, using a robotic platform composed of a three-fingered robotic hand and a robot table. The results demonstrate that our method permits the robotic hand to achieve high accuracy for object recognition with low impact on the reaction time required to perform the task. These benefits make our method suitable for perception and exploration in autonomous robotics.
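
    One plausible reading of such an active controller, sketched below, is to probe the candidate location whose observation is expected to reduce belief entropy the most. The candidate observation models here are random placeholders, not the paper's formulation:

    ```python
    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def expected_information_gain(belief, obs_model):
        """Expected entropy reduction from probing one location.
        obs_model[i, f] = P(feature f | object i) at that location."""
        p_feature = belief @ obs_model       # marginal P(feature) at this location
        gain = entropy(belief)
        for f, pf in enumerate(p_feature):
            if pf > 0:
                post = belief * obs_model[:, f]
                gain -= pf * entropy(post / post.sum())
        return gain

    rng = np.random.default_rng(0)
    belief = np.array([0.5, 0.3, 0.2])       # current belief over 3 objects
    # Five candidate contact locations, each with a hypothetical observation model.
    candidates = [rng.dirichlet(np.ones(4), size=3) for _ in range(5)]

    gains = [expected_information_gain(belief, m) for m in candidates]
    best = int(np.argmax(gains))
    print(f"move hand to location {best} (expected gain {gains[best]:.3f} nats)")
    ```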

    Active Clothing Material Perception using Tactile Sensing and Deep Learning

    Full text link
    Humans represent and discriminate objects within the same category using their properties, and an intelligent robot should be able to do the same. In this paper, we build a robot system that can autonomously perceive object properties through touch. We work on the common object category of clothing. The robot moves under the guidance of an external Kinect sensor and squeezes the clothes with a GelSight tactile sensor, then recognizes 11 properties of the clothing from the tactile data. These properties include physical properties, such as thickness, fuzziness, softness, and durability, and semantic properties, such as wearing season and preferred washing method. We collect a dataset of 153 varied pieces of clothing and conduct 6,616 robot exploration iterations on them. To extract useful information from the high-dimensional sensory output, we apply Convolutional Neural Networks (CNNs) to the tactile data for recognizing the clothing properties, and to the Kinect depth images for selecting exploration locations. Experiments show that, using the trained neural networks, the robot can autonomously explore unknown clothes and learn their properties. This work proposes a new framework for active tactile perception with a combined vision-touch system, and it has the potential to enable robots to help humans with varied clothing-related housework. Comment: ICRA 2018 accepted.
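
    The property-recognition component lends itself to a compact sketch: a small CNN over a tactile image with 11 sigmoid outputs, one per property, since the labels are not mutually exclusive. The architecture, input resolution, and layer sizes below are illustrative assumptions, not the authors' network:

    ```python
    import torch
    import torch.nn as nn

    class TactilePropertyNet(nn.Module):
        """Small CNN mapping a GelSight-style tactile image to 11 independent
        property logits (multi-label, hence sigmoid rather than softmax)."""
        def __init__(self, n_properties: int = 11):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(128, n_properties)

        def forward(self, x):
            return self.head(self.features(x).flatten(1))

    model = TactilePropertyNet()
    tactile = torch.randn(8, 3, 128, 128)            # a batch of tactile images
    logits = model(tactile)                          # shape (8, 11)
    targets = torch.randint(0, 2, (8, 11)).float()   # dummy multi-label targets
    loss = nn.BCEWithLogitsLoss()(logits, targets)
    print(logits.shape, loss.item())
    ```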

    On Neuromechanical Approaches for the Study of Biological Grasp and Manipulation

    Full text link
    Biological and robotic grasp and manipulation are undeniably similar at the level of mechanical task performance. However, their underlying fundamental biological vs. engineering mechanisms are, by definition, dramatically different and can even be antithetical. Even our approach to each is diametrically opposite: inductive science for the study of biological systems vs. engineering synthesis for the design and construction of robotic systems. The past 20 years have seen several conceptual advances in both fields and the quest to unify them. Chief among them is the reluctant recognition that their underlying fundamental mechanisms may actually share limited common ground, while exhibiting many fundamental differences. This recognition is particularly liberating because it allows us to resolve and move beyond multiple paradoxes and contradictions that arose from the initial reasonable assumption of a large common ground. Here, we begin by introducing the perspective of neuromechanics, which emphasizes that real-world behavior emerges from the intimate interactions among the physical structure of the system, the mechanical requirements of a task, the feasible neural control actions to produce it, and the ability of the neuromuscular system to adapt through interactions with the environment. This allows us to articulate a succinct overview of a few salient conceptual paradoxes and contradictions regarding under-determined vs. over-determined mechanics, under- vs. over-actuated control, prescribed vs. emergent function, learning vs. implementation vs. adaptation, prescriptive vs. descriptive synergies, and optimal vs. habitual performance. We conclude by presenting open questions and suggesting directions for future research. We hope this frank assessment of the state of the art will encourage and guide these communities to continue to interact and make progress in these important areas.

    Design of a 3D-printed soft robotic hand with distributed tactile sensing for multi-grasp object identification

    Get PDF
    Tactile object identification is essential in environments where vision is occluded or when intrinsic object properties such as weight or stiffness must be discriminated. The robotic approach to this task has traditionally been to use rigid-bodied robots equipped with complex control schemes to explore different objects. However, whilst varying degrees of success have been demonstrated, these approaches are limited in their generalisability due to the complexity of the control schemes required to facilitate safe interactions with diverse objects. In this regard, Soft Robotics has garnered increased attention in the past decade due to the ability to exploit Morphological Computation through the agent's body, simplifying the task by conforming naturally to the geometry of the objects being explored. This represents a paradigm shift in robot design, since Soft Robotics seeks to take inspiration from biological solutions and embody adaptability in order to interact with the environment, rather than relying on centralised computation. In this thesis, we formulate, simplify, and solve an object identification task using Soft Robotic principles. We design an anthropomorphic hand with a human-like range of motion and compliance in both actuation and sensing. The range of motion is validated through the Feix GRASP taxonomy and the Kapandji Thumb Opposition test. The hand is monolithically fabricated using multi-material 3D printing to enable the exploitation of different material properties within the same body and to limit variability between samples. The hand's compliance facilitates adaptable grasping of a wide range of objects, and it features integrated distributed tactile sensing. We emulate the human approach of integrating information from multiple contacts and grasps of objects to discriminate between them. Two bespoke neural networks are designed to extract patterns from both the tactile data and the relationships between grasps to facilitate high classification accuracy.
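
    The integration of information across grasps can be sketched as encoding each grasp's tactile reading separately and pooling the embeddings before classification; the shapes and pooling choice below are placeholders, not the thesis's bespoke networks:

    ```python
    import torch
    import torch.nn as nn

    class MultiGraspClassifier(nn.Module):
        """Encode each grasp's tactile reading separately, then pool across
        grasps so the object prediction draws on evidence from all contacts."""
        def __init__(self, n_taxels: int = 64, n_objects: int = 10):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Linear(n_taxels, 128), nn.ReLU(),
                nn.Linear(128, 64), nn.ReLU(),
            )
            self.classifier = nn.Linear(64, n_objects)

        def forward(self, grasps):          # grasps: (batch, n_grasps, n_taxels)
            z = self.encoder(grasps)        # per-grasp embeddings
            pooled = z.mean(dim=1)          # order-invariant fusion across grasps
            return self.classifier(pooled)

    model = MultiGraspClassifier()
    readings = torch.randn(4, 3, 64)        # 4 objects, 3 grasps each
    print(model(readings).shape)            # torch.Size([4, 10])
    ```

    Mean pooling makes the prediction invariant to grasp order; a recurrent or attention layer over the grasp sequence would be an equally plausible fusion choice.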

    Active haptic shape recognition by intrinsic motivation with a robot hand

    Get PDF
    In this paper, we present an intrinsic motivation approach applied to haptics in robotics for tactile object exploration and recognition. Here, touch is used as the sensation process for contact detection, whilst proprioceptive information is used for the perception process. First, a probabilistic method is employed to reduce uncertainty present in tactile measurements. Second, the object exploration process is actively controlled by intelligently moving the robot hand towards interesting locations. The active behaviour performed with the robotic hand is achieved by an intrinsic motivation approach, which improved the accuracy of object recognition over the results obtained with a fixed sequence of exploration movements. The proposed method was validated in a simulated environment with a Monte Carlo method, whilst in the real environment a three-fingered robotic hand and various object shapes were employed. The results demonstrate that our method is robust and suitable for haptic perception in autonomous robotics.
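
    The Monte Carlo validation mentioned above amounts to simulating many exploration episodes and counting how often the accumulated belief picks the true object. A toy version, with an invented observation model:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_objects, n_features, n_contacts = 3, 4, 5
    # Hypothetical observation model P(feature | object) for the simulated hand.
    obs_model = rng.dirichlet(np.ones(n_features) * 0.5, size=n_objects)

    def run_episode(true_obj):
        """One simulated exploration episode with Bayesian evidence accumulation."""
        belief = np.full(n_objects, 1.0 / n_objects)
        for _ in range(n_contacts):
            f = rng.choice(n_features, p=obs_model[true_obj])  # simulated contact
            belief *= obs_model[:, f]
            belief /= belief.sum()
        return belief.argmax() == true_obj

    # Monte Carlo estimate of recognition accuracy over many episodes.
    trials = 2000
    hits = sum(run_episode(rng.integers(n_objects)) for _ in range(trials))
    print(f"estimated accuracy: {hits / trials:.3f}")
    ```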

    Learning from sensory predictions for autonomous and adaptive exploration of object shape with a tactile robot

    Get PDF
    Humans use information from sensory predictions, together with current observations, for the optimal exploration and recognition of their surrounding environment. In this work, two novel adaptive perception strategies are proposed for accurate and fast exploration of object shape with a robotic tactile sensor. These strategies, called (1) adaptive weighted prior and (2) adaptive weighted posterior, combine tactile sensory predictions and current sensor observations to autonomously adapt the accuracy and speed of active Bayesian perception in object exploration tasks. Sensory predictions, obtained from a forward model, use a novel Predicted Information Gain method. These predictions are used by the tactile robot to analyse ‘what would have happened’ if certain decisions ‘had been made’ at previous decision times. The accuracy of predictions is evaluated and controlled by a confidence parameter, to ensure that the adaptive perception strategies rely more on predictions when they are accurate, and more on current sensory observations otherwise. This work is systematically validated with the recognition of angle and position data extracted from the exploration of object shape, using a biomimetic tactile sensor and a robotic platform. The exploration task implements the contour-following procedure used by humans to extract object shape with the sense of touch. The validation process is performed with the adaptive weighted strategies and with active perception alone. The adaptive approach achieved higher angle accuracy (2.8 deg) than active perception (5 deg). The position accuracy was similar for all perception methods (0.18 mm). The reaction time, or number of tactile contacts needed by the tactile robot to make a decision, was improved by the adaptive perception (1 tap) over active perception (5 taps). The results show that the adaptive perception strategies can enable future robots to adapt their performance, while improving the trade-off between accuracy and reaction time, for tactile exploration, interaction and recognition tasks.
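
    One way to read the adaptive weighted posterior strategy is as a convex mixture of the forward model's prediction and the observation-driven belief, with the confidence parameter as the mixing weight. This exact rule is our guess at the abstract's meaning, not necessarily the paper's equation:

    ```python
    import numpy as np

    def bayes_posterior(prior, likelihood):
        post = prior * likelihood
        return post / post.sum()

    def adaptive_weighted_posterior(prior, likelihood, predicted, confidence):
        """Blend the forward model's predicted belief with the belief from the
        current observation. `confidence` in [0, 1] tracks how accurate past
        predictions have been: high values lean on the prediction, low values
        lean on the current sensor observation."""
        observed = bayes_posterior(prior, likelihood)
        mixed = confidence * predicted + (1.0 - confidence) * observed
        return mixed / mixed.sum()

    prior = np.array([0.4, 0.4, 0.2])        # belief over 3 angle hypotheses
    likelihood = np.array([0.7, 0.2, 0.1])   # P(current tap | hypothesis)
    predicted = np.array([0.8, 0.15, 0.05])  # forward-model prediction

    for c in (0.0, 0.5, 0.9):
        post = adaptive_weighted_posterior(prior, likelihood, predicted, c)
        print(f"confidence {c}: {np.round(post, 3)}")
    ```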
