Interactive Force Control Based on Multimodal Robot Skin for Physical Human-Robot Collaboration

Abstract

This work proposes and realizes a control architecture that supports the deployment of large-scale robot skin in a Human-Robot Collaboration scenario. It is shown how whole-body tactile feedback can extend the capabilities of robots during dynamic interactions by providing information about multiple contacts across the robot's surface. Specifically, an uncalibrated skin system is used to implement stable force control while simultaneously handling the multi-contact interactions of a user. The system formulates control tasks for force control, tactile guidance, collision avoidance, and compliance, and fuses them with a multi-priority redundancy resolution strategy. The approach is evaluated on an omnidirectional mobile manipulator with dual arms covered with robot skin. Results are assessed under dynamic conditions, showing that multimodal tactile information enables robust force control while remaining responsive to a user's interactions.
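The abstract names, but does not detail, the multi-priority redundancy resolution strategy used to fuse the control tasks. A common realization of such a scheme is successive null-space projection at the velocity level, where each lower-priority task acts only in the null space of all higher-priority tasks. The sketch below is a minimal illustration under that assumption; the task list, Jacobians, and priority ordering (force control above collision avoidance) are hypothetical and not taken from the paper.

    # Minimal sketch of multi-priority redundancy resolution via successive
    # null-space projection (velocity level). Tasks, Jacobians, and priorities
    # are hypothetical; the paper's actual formulation may differ.
    import numpy as np

    def prioritized_joint_velocities(tasks, n_joints):
        """Fuse prioritized tasks (highest priority first) into a single
        joint-velocity command. Each task is a (J, xdot) pair: the task
        Jacobian and the desired task-space velocity. A lower-priority task
        is projected into the null space of all higher-priority tasks, so it
        cannot disturb them."""
        qdot = np.zeros(n_joints)
        N = np.eye(n_joints)            # null-space projector of tasks so far
        for J, xdot in tasks:
            Jbar = J @ N                # restrict task to the remaining null space
            Jbar_pinv = np.linalg.pinv(Jbar)
            qdot = qdot + Jbar_pinv @ (xdot - J @ qdot)
            N = N @ (np.eye(n_joints) - Jbar_pinv @ Jbar)
        return qdot

    # Hypothetical usage for one 7-DoF arm: a 6-D force-control task outranks
    # a 3-D collision-avoidance task derived from skin proximity cells.
    n = 7
    J_force, xdot_force = np.random.randn(6, n), np.zeros(6)
    J_avoid, xdot_avoid = np.random.randn(3, n), np.zeros(3)
    qdot = prioritized_joint_velocities([(J_force, xdot_force),
                                         (J_avoid, xdot_avoid)], n)

This recursive pseudoinverse formulation guarantees strict priorities at the cost of possible algorithmic singularities when task Jacobians overlap; weighted or damped variants trade strictness for smoothness.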
