Intuitive Robot Teleoperation through Multi-Sensor Informed Mixed Reality Visual Aids
Authors
Alessandro Caniglia
Luciano Cantelli
Dario C. Guastella
Salvatore Livatino
Riccardo Mazza
Carmelo D. Melita
Giovanni Muscato
Gianluca Padula
Vincenzo Rinaldi
Publication date
8 February 2021
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Abstract
© 2021 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.

Mobile robotic systems have evolved to include sensors capable of describing robot status and the operating environment more accurately and reliably than ever before. Exploiting this possibility is challenged by effective sensor data use, because of the cognitive load an operator is exposed to due to the large amount of data and time-dependency constraints. This paper addresses this challenge in remote-vehicle teleoperation by proposing an intuitive way to present sensor data to users by means of mixed reality and visual aids within the user interface. We propose a method for organizing information presentation and a set of visual aids to facilitate visual communication of data in teleoperation control panels. The resulting sensor-information presentation appears coherent and intuitive, making it easier for an operator to catch and comprehend the meaning of the information. This increases situational awareness and speeds up decision-making. Our method is implemented on a real mobile robotic system operating outdoors, equipped with on-board internal and external sensors, GPS, and a reconstructed 3D graphical model provided by an assistant drone. Experimentation verified feasibility, while intuitive and comprehensive visual communication was confirmed through a qualitative assessment, which encourages further development.

Peer reviewed
Available Versions

University of Hertfordshire Research Archive
oai:uhra.herts.ac.uk:2299/2390... (last updated 20/02/2021)

University of Dundee Online Publications
oai:discovery.dundee.ac.uk:ope... (last updated 18/03/2023)

University of Hertfordshire Research Archive
oai:uhra.herts.ac.uk:2299/2389... (last updated 20/02/2021)

University of Dundee Online Publications
oai:discovery.dundee.ac.uk:pub... (last updated 22/03/2021)