
Contact-aware robust semi-autonomous teleoperation of mobile manipulators

Abstract

In the context of human-robot collaboration, cooperation, and teaming, mobile manipulators are widely used in applications involving environments that are unpredictable or hazardous for human operators, such as space operations, waste management, and search and rescue in disaster scenarios. In these applications, the manipulator's motion is controlled remotely by specialized operators. Teleoperation of manipulators is not a straightforward task and, in many practical cases, is a common source of failures. Typical issues during the remote control of manipulators include: control complexity that grows with the number of mechanical degrees of freedom; inadequate or incomplete feedback to the user (i.e., limited visualization or knowledge of the environment); and predefined motion directives that may be incompatible with constraints or obstacles imposed by the environment. In the latter case, part of the manipulator may become trapped or blocked by an obstacle, a failure that cannot be easily detected, isolated, or counteracted remotely.

While control complexity can be reduced by introducing motion directives or by abstracting the robot motion, the real-time constraint of the teleoperation task requires transferring as little data as possible over the system's network, which limits the number of physical sensors that can be used to model the environment. It is therefore fundamental to define alternative perceptive strategies that accurately characterize different interactions with the environment without relying on specific sensing technologies.

In this work, we present a novel approach for safe teleoperation that exploits model-based proprioceptive measurements of the robot dynamics to robustly identify unexpected collisions or contact events with the environment. Each identified collision is translated on-the-fly into a set of local motion constraints, so that the system redundancy can be exploited to compute intelligent control laws for automatic reaction, without requiring human intervention and while minimizing the disturbance to task execution (or, equivalently, the operator's effort). More precisely, the described system consists of two building blocks: a perceptive block, which detects unexpected interactions with the environment, and a control block, which performs the intelligent and autonomous reaction after the stimulus. The perceptive block is responsible for contact-event identification; in short, the approach is based on the claim that a sensorless collision detection method for robot manipulators can be extended to the field of mobile manipulators by embedding it within a statistical learning framework. The control block handles the intelligent and autonomous reaction once contact or impact with the environment occurs, and consists of a motion-abstraction controller with a prioritized set of constraints, in which the highest priority corresponds to reconfiguring the robot after a collision is detected; once all related dynamical effects have been compensated, the controller switches back to the basic control mode.
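To make the two building blocks more concrete, the following is a minimal illustrative sketch rather than the exact formulation developed in the thesis; the dynamic model, the observer gain $K_I$, and the task Jacobians $J_1$, $J_2$ are assumptions introduced here for illustration. A widely used sensorless collision detection scheme for manipulators is the generalized-momentum residual: assuming rigid-body dynamics $M(q)\ddot q + C(q,\dot q)\dot q + g(q) = \tau + \tau_{\mathrm{ext}}$ and a gain $K_I \succ 0$, the residual

\[
r(t) = K_I \left( p(t) - p(0) - \int_0^t \left( \tau + C^{\top}(q,\dot q)\,\dot q - g(q) + r \right) \mathrm{d}s \right),
\qquad p = M(q)\,\dot q ,
\]

obeys $\dot r = K_I\left(\tau_{\mathrm{ext}} - r\right)$, i.e., it is a filtered estimate of the external joint torque, and a contact event can be flagged whenever a component of $r$ exceeds a (possibly learned) threshold. For the control block, one classical velocity-level realization of a two-level priority stack, with the collision-reaction task $\dot x_1 = J_1 \dot q$ at the highest priority and the teleoperated motion task $\dot x_2 = J_2 \dot q$ below it, is

\[
\dot q = J_1^{\#}\,\dot x_1 + \left(J_2 N_1\right)^{\#}\left(\dot x_2 - J_2 J_1^{\#}\,\dot x_1\right),
\qquad N_1 = I - J_1^{\#} J_1 ,
\]

where $(\cdot)^{\#}$ denotes a pseudoinverse. Once the residual returns below the detection threshold, the reaction task is removed from the stack and the controller falls back to the basic teleoperation mode.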
