
    Pose-Based Tactile Servoing: Controlled Soft Touch using Deep Learning

    This article describes a new way of controlling robots using soft tactile sensors: pose-based tactile servo (PBTS) control. The basic idea is to embed a tactile perception model for estimating the sensor pose within a servo control loop that is applied to local object features such as edges and surfaces. PBTS control is implemented with a soft curved optical tactile sensor (the BRL TacTip) using a convolutional neural network trained to be insensitive to shear. As a consequence, robust and accurate controlled motion over various complex 3D objects is attained. First, we review tactile servoing and its relation to visual servoing, before formalising PBTS control. Then, we assess tactile servoing over a range of regular and irregular objects. Finally, we reflect on the relation to visual servo control and discuss how controlled soft touch gives a route towards human-like dexterity in robots.
    Comment: A summary video is available at https://youtu.be/12-DJeRcfn0. *NL and JL contributed equally to this work.
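    The servo loop described above can be sketched as a simple proportional controller wrapped around a pose estimator. This is a minimal illustration, not the authors' implementation: `estimate_pose` stands in for the trained CNN, and the gain and poses are made-up values.

```python
import numpy as np

def pbts_step(estimate_pose, target_pose, gain=0.5):
    """One iteration of a pose-based tactile servo loop (sketch).

    `estimate_pose` is any callable returning the sensed (x, y, theta)
    pose of the local feature relative to the sensor; in the paper this
    role is played by a shear-insensitive CNN.
    """
    pose = np.asarray(estimate_pose())       # perceived feature pose
    error = np.asarray(target_pose) - pose   # servo error in the sensor frame
    velocity = gain * error                  # proportional velocity command
    return velocity

# Hypothetical usage: drive the sensor toward a reference pose over an edge.
reference = [0.0, 0.0, 0.0]          # desired contact pose (mm, mm, rad)
sensed = lambda: [1.0, -0.5, 0.1]    # stand-in for the perception model
cmd = pbts_step(sensed, reference)
```

    Repeating this step at the control rate moves the sensor along the object while keeping the contact pose near the reference.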

    Contact geometry and mechanics predict friction forces during tactile surface exploration

    When we touch an object, complex frictional forces are produced, aiding us in perceiving surface features that help to identify the object at hand, and also facilitating grasping and manipulation. However, even during controlled tactile exploration, sliding friction forces fluctuate greatly, and it is unclear how they relate to the surface topography or mechanics of contact with the finger. We investigated the sliding contact between the finger and different relief surfaces, using high-speed video and force measurements. Informed by these experiments, we developed a friction force model that accounts for surface shape and contact mechanical effects, and is able to predict sliding friction forces for different surfaces and exploration speeds. We also observed that local regions of disconnection between the finger and surface develop near high relief features, due to the stiffness of the finger tissues. Every tested surface had regions that were never contacted by the finger; we refer to these as "tactile blind spots". The results elucidate friction force production during tactile exploration, may aid efforts to connect sensory and motor function of the hand to properties of touched objects, and provide crucial knowledge to inform the rendering of realistic experiences of touch contact in virtual reality.
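    A common starting point for such models is that friction on compliant skin scales with real contact area rather than load alone. The sketch below shows that principle only; it is not the paper's model, and `shear_strength` and `k` are illustrative constants, not fitted values from the study.

```python
def sliding_friction(normal_force, shear_strength=5e3, k=1.2e-4):
    """Adhesive friction sketch for a soft fingertip (illustrative only).

    Hertzian contact gives real contact area ~ N^(2/3), so friction
    F = tau * A grows sublinearly with load: the effective friction
    coefficient decreases as you press harder.
    """
    area = k * normal_force ** (2.0 / 3.0)   # real contact area (m^2), Hertz-like
    return shear_strength * area             # friction force (N)
```

    The sublinear load dependence is one reason a constant Coulomb coefficient fits fingertip data poorly; surface shape effects, as studied in the paper, add further structure on top of this.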

    A comparison of learning with haptic and visual modalities.

    The impact of haptic feedback on the perception of unknown objects (10 without texture, 10 with texture, and 2 complex shapes) was examined. Using a point probe (a PHANTOM), three treatment groups of students (visual, haptic, and visual plus haptic feedback) explored a set of virtual objects. The visual treatment group observed the objects through a small circular aperture. Accuracy of perception, exploration time, and description of objects were compared for the three treatment groups. Participants included 45 visually normal undergraduate students distributed across the three treatment groups and 4 blind students composing a second haptic-only group. Results showed that, within the normally sighted students, the haptic and haptic plus visual groups were slightly slower in their explorations than the visual group. The haptic plus visual group was more accurate in identifying objects than the visual or haptic-only groups. The terms used by the haptic treatment group to describe the objects differed from those used by the visual and visual plus haptic groups, suggesting that these modalities are processed differently. There were no differences across the three groups for long-term memory of the objects. The haptic group was significantly more accurate in identifying the complex objects than the visual or visual plus haptic groups. The blind students using haptic feedback were not significantly different from the other haptic-only treatment group of normally sighted participants for accuracy, exploration pathways, and exploration times. The haptic-only group of participants spent more time exploring the back half of the virtual objects than the visual or visual plus haptic participants. This finding supports previous research showing that the use of the PHANTOM with haptic feedback tends to support the development of 3-dimensional understandings of objects.

    Determining object geometry with compliance and simple sensors


    Automation of train cab front cleaning with a robot manipulator

    In this letter we present a control and trajectory tracking approach for wiping the train cab front panels, using a velocity-controlled robotic manipulator and a force/torque sensor attached to its end effector, without using any surface model or vision-based surface detection. The control strategy consists of a simultaneous position and force controller, adapted from the operational space formulation, that aligns the cleaning tool with the surface normal, maintaining a set-point normal force, while simultaneously moving along the surface. The trajectory tracking strategy consists of specifying and tracking a two-dimensional path that, when projected onto the train surface, corresponds to the desired pattern of motion. We first validated our approach using the Baxter robot to wipe a highly curved surface with both spiral and raster scan motion patterns. Finally, we implemented the same approach in a scaled robot prototype, designed in-house, to wipe a 1/8-scale version of a train cab front using a raster scan pattern.
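    The simultaneous position/force idea above decomposes the end-effector command into a tangential part that tracks the wiping path and a normal part that regulates contact force. The sketch below shows that decomposition under made-up gains and set-point; it is not the authors' operational-space controller.

```python
import numpy as np

def wipe_command(pos, pos_ref, normal, f_meas, f_ref=5.0, kp=1.0, kf=0.01):
    """One hybrid position/force control step (sketch).

    Motion tangent to the surface tracks the 2-D path projected onto
    the panel; motion along the measured surface normal pushes the tool
    in or out to hold the contact force at `f_ref` (N).
    """
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)                # unit surface normal
    e_pos = np.asarray(pos_ref) - np.asarray(pos)
    e_tan = e_pos - n * (e_pos @ n)          # project position error onto surface
    v_tan = kp * e_tan                       # track the wiping pattern
    v_norm = kf * (f_ref - f_meas) * n       # regulate the normal contact force
    return v_tan + v_norm                    # commanded end-effector velocity

# Hypothetical step: flat panel with normal +z, force already at set-point,
# so the command is purely tangential motion toward the path point.
cmd = wipe_command([0, 0, 0], [1.0, 0.0, 0.5], [0, 0, 1], f_meas=5.0)
```

    Because the normal is taken from measurement rather than a surface model, the same loop follows curved panels without prior geometry, as in the letter.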

    Biomimetic Active Touch with Fingertips and Whiskers


    Robotic manipulators for single access surgery

    This thesis explores the development of cooperative robotic manipulators for enhancing surgical precision and patient outcomes in single-access surgery and, specifically, Transanal Endoscopic Microsurgery (TEM). During these procedures, surgeons manipulate a heavy set of instruments via a mechanical clamp inserted in the patient’s body through a surgical port, resulting in imprecise movements, increased patient risks, and increased operating time. Therefore, an articulated robotic manipulator with passive joints is initially introduced, featuring built-in position and force sensors in each joint and electronic joint brakes for instant lock/release capability. The articulated manipulator concept is further improved with motorised joints, evolving into an active tool holder. The joints allow the incorporation of advanced robotic capabilities such as ultra-lightweight gravity compensation and hands-on kinematic reconfiguration, which can optimise the placement of the tool holder in the operating theatre. Due to the enhanced sensing capabilities, the application of the active robotic manipulator was further explored in conjunction with advanced image guidance approaches such as endomicroscopy. Recent advances in probe-based optical imaging such as confocal endomicroscopy is making inroads in clinical uses. However, the challenging manipulation of imaging probes hinders their practical adoption. Therefore, a combination of the fully cooperative robotic manipulator with a high-speed scanning endomicroscopy instrument is presented, simplifying the incorporation of optical biopsy techniques in routine surgical workflows. Finally, another embodiment of a cooperative robotic manipulator is presented as an input interface to control a highly-articulated robotic instrument for TEM. 
    This master-slave interface alleviates the drawbacks of traditional master-slave devices, such as the need for clutching mechanisms to compensate for the mismatch between slave and master workspaces, and the lack of intuitive manipulation feedback (e.g., joint limits) to the user. To address those drawbacks, a joint-space robotic manipulator is proposed that emulates the kinematic structure of the flexible robotic instrument under control.
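    The "ultra-lightweight gravity compensation" mentioned above rests on a standard principle: command joint torques that exactly cancel the arm's own weight, so the user can reposition it by hand. A toy two-link planar version is sketched below; the masses and link lengths are made-up values, not the manipulator's parameters, with each link's mass lumped at its midpoint.

```python
import numpy as np

def gravity_torques(q, m=(2.0, 1.5), l=(0.3, 0.25), g=9.81):
    """Gravity-compensation torques for a planar two-link arm (sketch)."""
    q1, q2 = q
    m1, m2 = m
    l1, l2 = l
    # Torque on joint 2: gravity acting on link 2's centre of mass.
    tau2 = m2 * g * (l2 / 2) * np.cos(q1 + q2)
    # Torque on joint 1: link 1's own weight plus link 2 carried at distance l1.
    tau1 = (m1 * g * (l1 / 2) * np.cos(q1)
            + m2 * g * (l1 * np.cos(q1) + (l2 / 2) * np.cos(q1 + q2)))
    return np.array([tau1, tau2])
```

    With these torques applied, the arm behaves as if weightless, which is what makes hands-on kinematic reconfiguration of the tool holder practical.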

    Cognitive Reasoning for Compliant Robot Manipulation

    Physically compliant contact is a major element for many tasks in everyday environments. A universal service robot that is utilized to collect leaves in a park, polish a workpiece, or clean solar panels requires the cognition and manipulation capabilities to facilitate such compliant interaction. Evolution equipped humans with advanced mental abilities to envision physical contact situations and their resulting outcome, dexterous motor skills to perform the actions accordingly, as well as a sense of quality to rate the outcome of the task. In order to achieve human-like performance, a robot must provide the necessary methods to represent, plan, execute, and interpret compliant manipulation tasks. This dissertation covers those four steps of reasoning in the concept of intelligent physical compliance. The contributions advance the capabilities of service robots by combining artificial intelligence reasoning methods and control strategies for compliant manipulation. A classification of manipulation tasks is conducted to identify the central research questions of the addressed topic. Novel representations are derived to describe the properties of physical interaction. Special attention is given to wiping tasks which are predominant in everyday environments. It is investigated how symbolic task descriptions can be translated into meaningful robot commands. A particle distribution model is used to plan goal-oriented wiping actions and predict the quality according to the anticipated result. The planned tool motions are converted into the joint space of the humanoid robot Rollin' Justin to perform the tasks in the real world. In order to execute the motions in a physically compliant fashion, a hierarchical whole-body impedance controller is integrated into the framework. The controller is automatically parameterized with respect to the requirements of the particular task. Haptic feedback is utilized to infer contact and interpret the performance semantically. 
Finally, the robot is able to compensate for possible disturbances, as it plans additional recovery motions while effectively closing the cognitive control loop. Among other applications, the developed concept is applied in an actual space robotics mission, in which an astronaut aboard the International Space Station (ISS) commands Rollin' Justin to maintain a Martian solar panel farm in a mock-up environment. This application demonstrates the far-reaching impact of the proposed approach and the associated opportunities that emerge with the availability of cognition-enabled service robots.
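    The particle distribution model mentioned above can be illustrated with a minimal version: represent dirt as 2-D particles on the surface, sweep a tool of a given radius along a candidate path, and score the plan by the fraction of particles removed. This is a sketch of the general idea under assumed parameters, not the dissertation's actual model or quality metric.

```python
import numpy as np

rng = np.random.default_rng(0)

def wipe_quality(particles, path, radius=0.05):
    """Fraction of dirt particles removed by a wiping path (sketch)."""
    remaining = particles
    for p in path:
        d = np.linalg.norm(remaining - p, axis=1)
        remaining = remaining[d > radius]    # particles swept by the tool
    removed = len(particles) - len(remaining)
    return removed / len(particles)          # 1.0 means a perfectly clean surface

# Hypothetical setup: random dirt on a unit panel, scored against a raster path.
dirt = rng.uniform(0, 1, size=(500, 2))
raster = np.array([(x, y) for y in np.linspace(0, 1, 11)
                          for x in np.linspace(0, 1, 21)])
score = wipe_quality(dirt, raster)
```

    Evaluating this score before execution is what lets a planner compare candidate wiping actions and predict task quality from the anticipated result.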