    Teleoperated Service Robot with an Immersive Mixed Reality Interface

    Teleoperated service robots can perform more complex and precise tasks because they combine robot skills with human expertise. Communication between the operator and the robot is essential for remote operation and strongly affects system efficiency. Immersive interfaces are being used to enhance the teleoperation experience; however, latency can impair robot operation. Since remote visualization involves transmitting a large amount of video data, the challenge is to reduce communication instability. An efficient teleoperation system therefore needs an operation interface capable of visualizing the remote environment, controlling the robot, and responding quickly. This work presents the development of a service robot teleoperation system with an immersive mixed reality operation interface in which the operator can visualize either the real remote environment or a virtual 3D environment representing it. The virtual environment aims to reduce communication latency by reducing the amount of information sent over the network, and to improve user experience. The robot can perform navigation and simple tasks autonomously, or switch to teleoperated mode for more complex tasks. The system was developed using ROS, Unity 3D, and sockets so that it can be exported easily to different platforms. The experiments suggest that an immersive operation interface provides improved usability for the operator. Latency appears to improve when the virtual environment is used. The user experience seems to benefit from mixed reality techniques, which may lead to broader use of teleoperated service robot systems.
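    The bandwidth argument behind the virtual environment can be illustrated with a minimal sketch (the message format and function names here are hypothetical, not the authors' implementation): instead of streaming video frames, the robot side transmits only compact pose updates over a plain TCP socket, which a Unity-style client can use to animate a local 3D model.

```python
import json
import socket
import threading

def pose_message(x, y, theta):
    # Serialize a robot pose as compact newline-delimited JSON:
    # tens of bytes per update, versus ~100 KB+ for a compressed video frame.
    return (json.dumps({"x": x, "y": y, "theta": theta}) + "\n").encode()

def serve_pose(host="127.0.0.1", port=0):
    # Minimal TCP server streaming pose updates to one client
    # (stand-in for the ROS-side bridge; the 3D client would connect here).
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)

    def handle():
        conn, _ = srv.accept()
        for i in range(3):  # three example pose updates
            conn.sendall(pose_message(float(i), 0.0, 0.1 * i))
        conn.close()
        srv.close()

    threading.Thread(target=handle, daemon=True).start()
    return srv.getsockname()[1]  # actual port chosen by the OS

if __name__ == "__main__":
    port = serve_pose()
    client = socket.create_connection(("127.0.0.1", port))
    for line in client.makefile():
        print(json.loads(line))
```

    A real system would publish poses from a ROS topic callback and parse them in a Unity C# script, but the socket framing shown here carries over unchanged.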

    A natural interface for remote operation of underwater robots

    Nowadays, an increasing need for intervention robotic systems can be observed in all kinds of hazardous environments. In all these intervention systems, the human expert continues to play a central role from the decision-making point of view. For instance, in underwater domains, when manipulation capabilities are required, only commercially available Remotely Operated Vehicles can be used, normally built on master-slave architectures that place all the responsibility on the pilot. Thus, human-machine interfaces play a crucial role in current intervention systems. This paper presents a User Interface Abstraction Layer and introduces a new procedure to control an underwater robot vehicle through an intuitive and immersive interface, which shows the user only the most relevant information about the current mission. We conducted an experiment and found that user preference and performance were highest in the immersive condition with joystick navigation. This research was partly supported by the Spanish Ministry of Research and Innovation under DPI2011-27977-C03 (TRITON Project).
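    The core idea of a user-interface abstraction layer can be sketched as follows (a hypothetical illustration under assumed names, not the paper's actual code): every input device, whether a joystick or an immersive headset, is wrapped behind one interface that emits the same normalized commands, so the vehicle controller never depends on the concrete device.

```python
from abc import ABC, abstractmethod

class OperatorInterface(ABC):
    """Abstraction layer: any input device emits the same command tuple."""

    @abstractmethod
    def read_command(self):
        """Return (surge, yaw), each normalized to [-1.0, 1.0]."""

class JoystickInterface(OperatorInterface):
    def __init__(self, axis_source):
        # axis_source: callable returning raw (surge, yaw) axis values.
        self.axis_source = axis_source

    def read_command(self):
        raw_surge, raw_yaw = self.axis_source()
        clamp = lambda v: max(-1.0, min(1.0, v))
        return clamp(raw_surge), clamp(raw_yaw)

def drive(send_to_vehicle, interface):
    # The controller sees only the abstract interface, never the device.
    surge, yaw = interface.read_command()
    send_to_vehicle(surge, yaw)

if __name__ == "__main__":
    js = JoystickInterface(lambda: (1.5, -0.3))  # raw surge exceeds range
    drive(lambda s, y: print(s, y), js)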

    Design and evaluation of a natural interface for remote operation of underwater robots

    Nowadays, an increasing need for intervention robotic systems can be observed in all kinds of hazardous environments. In all these intervention systems, the human expert continues to play a central role from the decision-making point of view. For instance, in underwater domains, when manipulation capabilities are required, only commercially available Remotely Operated Vehicles can be used, normally built on master-slave architectures that place all the responsibility on the pilot. Thus, human-machine interfaces play a crucial role in current intervention systems. This paper presents a User Interface Abstraction Layer and introduces a new procedure to control an underwater robot vehicle through an intuitive and immersive interface, which shows the user only the most relevant information about the current mission. Finally, experiments were carried out to compare a traditional setup with the new procedure, demonstrating the reliability and feasibility of our approach. This research was partly supported by the Spanish Ministry of Research and Innovation under DPI2011-27977-C03 (TRITON Project).

    Evaluating Immersive Teleoperation Interfaces: Coordinating Robot Radiation Monitoring Tasks in Nuclear Facilities

    We present a virtual reality (VR) teleoperation interface for a ground-based robot, featuring dense 3D environment reconstruction and a low-latency video stream, with which operators can immersively explore remote environments. At the UK Atomic Energy Authority's (UKAEA) Remote Applications in Challenging Environments (RACE) facility, we applied the interface in a user study in which trained robotics operators completed simulated nuclear monitoring and decommissioning-style tasks to compare VR and traditional teleoperation interface designs. We found that operators in the VR condition took longer to complete the experiment, had fewer collisions, and rated the generated 3D map as more important than non-VR operators did. Additional physiological data suggested that VR operators had a lower objective cognitive workload during the experiment but also experienced increased physical demand. Overall, the presented results show that VR interfaces may benefit work patterns in teleoperation tasks within the nuclear industry, but further work is needed to investigate how such interfaces can be integrated into real-world decommissioning workflows.

    Intuitive Robot Teleoperation through Multi-Sensor Informed Mixed Reality Visual Aids

    © 2021 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 License; for more information, see https://creativecommons.org/licenses/by/4.0/. Mobile robotic systems have evolved to include sensors capable of describing robot status and the operating environment more accurately and reliably than ever before. Exploiting this sensor data effectively is challenging because of the cognitive load the operator is exposed to, due to the large amount of data and its time-dependency constraints. This paper addresses this challenge in remote-vehicle teleoperation by proposing an intuitive way to present sensor data to users through mixed reality and visual aids within the user interface. We propose a method for organizing information presentation and a set of visual aids to facilitate visual communication of data in teleoperation control panels. The resulting sensor-information presentation is coherent and intuitive, making it easier for an operator to grasp and comprehend the meaning of the information. This increases situational awareness and speeds up decision-making. Our method is implemented on a real mobile robotic system operating outdoors, equipped with on-board internal and external sensors and GPS, and complemented by a reconstructed 3D graphical model provided by an assistant drone. Experimentation verified feasibility, and the intuitive and comprehensive visual communication was confirmed through a qualitative assessment, which encourages further development. Peer reviewed
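    One way to picture how visual aids condense raw sensor streams is a small sketch (sensor names and thresholds here are invented for illustration, not taken from the paper): each reading is mapped to one of a few colour-coded states, so the operator scans a handful of cues instead of many raw numbers.

```python
# Three visual-aid states, from least to most urgent.
LEVELS = [("ok", "green"), ("caution", "amber"), ("critical", "red")]

def classify(value, caution, critical):
    """Map a raw sensor reading to a (label, colour) visual-aid state."""
    if value >= critical:
        return LEVELS[2]
    if value >= caution:
        return LEVELS[1]
    return LEVELS[0]

def panel(readings):
    # readings: {sensor_name: (value, caution_threshold, critical_threshold)}
    # Returns one colour-coded state per sensor for the operator's panel.
    return {name: classify(*args) for name, args in readings.items()}

if __name__ == "__main__":
    print(panel({
        "battery_temp_C": (71.0, 60.0, 80.0),  # above caution threshold
        "gps_hdop":       (0.9,  2.0, 5.0),    # nominal
        "tilt_deg":       (34.0, 20.0, 30.0),  # above critical threshold
    }))
```

    In a mixed reality interface, the colour states would drive overlays anchored to the robot model rather than a text panel, but the condensation step is the same.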