Mobile and adaptive user interface for human-robot collaboration in assembly tasks

Abstract

The manufacturing sector is constantly looking for more efficient ways of production. Industry 4.0-related technologies such as augmented and mixed reality, connectivity and digitalisation, as well as the current trend of robotisation, have resulted in a number of technical solutions to support production in factories. The combination of human-robot collaboration and augmented reality shows good promise. The challenges in this case are the need to reconfigure the physical production layout and how to deliver digital instructions to the operator. This paper introduces a model for collaborative assembly tasks that uses a mobile user interface based on depth sensors and a projector. The novelty of this research comes from the adaptivity of the user interface, as it can be freely moved between tasks around the workstation based on the operator's needs and the requirements of the tasks. The ability to move the projection surface is achieved by detecting the surface position using ArUco markers and computing the required transformation of the projector image.
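The last sentence describes the core mechanism: locate the movable surface via ArUco markers and warp the projected instructions accordingly. The following is a minimal sketch of that idea using OpenCV's ArUco module and a camera-space homography; the marker IDs, corner layout, and function names are assumptions for illustration, and a complete system would additionally require projector-camera calibration, which is omitted here.

```python
import cv2
import numpy as np

# Sketch only (not the authors' implementation): detect ArUco markers that
# delimit the movable projection surface, then compute the homography that
# maps the instruction image onto that surface in camera coordinates.

# Marker dictionary and detector (OpenCV >= 4.7 ArUco API).
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())

def surface_homography(camera_frame, instruction_size):
    """Return a homography mapping the instruction image onto the detected
    surface, or None if the corner markers are not all visible.

    Assumes (for this sketch) that markers with IDs 0-3 sit at the surface
    corners in clockwise order starting from the top-left."""
    corners, ids, _ = detector.detectMarkers(camera_frame)
    if ids is None or len(ids) < 4:
        return None

    # Use the centre of each detected marker as one surface corner.
    centres = {}
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        centres[int(marker_id)] = marker_corners[0].mean(axis=0)
    if not all(i in centres for i in range(4)):
        return None

    w, h = instruction_size
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])   # instruction image corners
    dst = np.float32([centres[i] for i in range(4)])     # surface corners in camera space
    homography, _ = cv2.findHomography(src, dst)
    return homography

# Usage (hypothetical): warp the instructions before sending them to the projector.
# H = surface_homography(camera_frame, (1280, 800))
# warped = cv2.warpPerspective(instruction_image, H, projector_resolution)
```

Because the markers are re-detected continuously, the homography is recomputed whenever the operator repositions the surface, which is what makes the projected interface movable between tasks.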
