512 research outputs found

    Virtual Reality-Based Interface for Advanced Assisted Mobile Robot Teleoperation

    [EN] This work proposes a new interface for the teleoperation of mobile robots based on virtual reality that allows natural and intuitive interaction and cooperation between the human and the robot, which is useful in many situations, such as inspection tasks, the mapping of complex environments, etc. Contrary to previous works, the proposed interface does not seek realism of the virtual environment but provides the minimum necessary elements that allow the user to carry out the teleoperation task in a more natural and intuitive way. The teleoperation is carried out in such a way that the human user and the mobile robot cooperate synergistically to accomplish the task: the user guides the robot through the environment in order to benefit from human intelligence and adaptability, whereas the robot automatically avoids collisions with objects in the environment in order to benefit from its fast response. The latter is carried out using the well-known potential field-based navigation method. The efficacy of the proposed method is demonstrated through experimentation with the TurtleBot3 Burger mobile robot in both simulation and real-world scenarios. In addition, usability and presence questionnaires were conducted with users of different ages and backgrounds to demonstrate the benefits of the proposed approach. In particular, the results of these questionnaires show that the proposed virtual reality-based interface is intuitive, ergonomic and easy to use.
    This research was funded by the Spanish Government (Grant PID2020-117421RB-C21, funded by MCIN/AEI/10.13039/501100011033) and by the Generalitat Valenciana (Grant GV/2021/181).
    Solanes, JE.; Muñoz García, A.; Gracia Calandin, LI.; Tornero Montserrat, J. (2022). Virtual Reality-Based Interface for Advanced Assisted Mobile Robot Teleoperation. Applied Sciences. 12(12):1-22. https://doi.org/10.3390/app12126071
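
    The collision-avoidance behaviour mentioned above relies on the classical potential-field navigation method, in which the robot is attracted toward the operator's commanded goal and repelled by nearby obstacles. A minimal 2-D sketch of that general idea follows; the gains k_att, k_rep and the influence radius rho0 are illustrative assumptions, not values from the paper.

```python
import numpy as np

def potential_field_step(pos, goal, obstacles, k_att=1.0, k_rep=0.5, rho0=1.0):
    """One step of classical potential-field navigation (illustrative sketch)."""
    # Attractive force pulls the robot toward the goal.
    force = k_att * (goal - pos)
    for obs in obstacles:
        diff = pos - obs
        rho = np.linalg.norm(diff)
        # Repulsion acts only inside the influence radius rho0 (assumed value).
        if 0 < rho < rho0:
            force += k_rep * (1.0 / rho - 1.0 / rho0) * (1.0 / rho**2) * (diff / rho)
    return force  # steer the robot along this resultant vector

# Example: goal straight ahead, one obstacle just off the path.
print(potential_field_step(np.array([0.0, 0.0]),
                           np.array([5.0, 0.0]),
                           [np.array([2.0, 0.3])]))
```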

    Intuitive Robot Teleoperation through Multi-Sensor Informed Mixed Reality Visual Aids

    © 2021 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.
    Mobile robotic systems have evolved to include sensors capable of describing the robot's status and operating environment more accurately and reliably than ever before. Exploiting this sensor data effectively is challenging because of the cognitive load the operator is exposed to, due to the large amount of data and its time-dependency constraints. This paper addresses this challenge in remote-vehicle teleoperation by proposing an intuitive way to present sensor data to users by means of mixed reality and visual aids within the user interface. We propose a method for organizing information presentation and a set of visual aids to facilitate visual communication of data in teleoperation control panels. The resulting sensor-information presentation is coherent and intuitive, making it easier for an operator to grasp and comprehend the meaning of the information. This increases situational awareness and speeds up decision-making. Our method is implemented on a real mobile robotic system operating outdoors, equipped with on-board internal and external sensors, GPS, and a reconstructed 3D graphical model provided by an assistant drone. Experimentation verified feasibility, while intuitive and comprehensive visual communication was confirmed through a qualitative assessment, which encourages further developments. Peer reviewed.
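
    The abstract does not specify the data model behind these visual aids, but the organizing principle (grouping time-stamped sensor readings into prioritized panels so the operator scans the most urgent items first) can be sketched as below; every name and the priority scale are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SensorReading:
    source: str      # e.g. "GPS", "battery", "drone 3-D model" (examples)
    value: str       # rendered value or status text
    priority: int    # 0 = critical ... 2 = background (hypothetical scale)
    timestamp: float

@dataclass
class ControlPanel:
    readings: List[SensorReading] = field(default_factory=list)

    def visible_aids(self, max_items: int = 5) -> List[SensorReading]:
        # Show the most urgent, freshest readings first to limit the
        # operator's cognitive load; everything else stays in the background.
        ordered = sorted(self.readings, key=lambda r: (r.priority, -r.timestamp))
        return ordered[:max_items]
```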

    Virtual reality based multi-modal teleoperation using mixed autonomy

    The thesis presents a multi-modal teleoperation interface featuring an integrated virtual reality-based simulation augmented by sensors and image processing capabilities on board the remotely operated vehicle. The virtual reality interface fuses an existing VR model with live video feed and prediction states, thereby creating a multi-modal control interface. Virtual reality addresses the typical limitations of video-based teleoperation caused by signal lag and limited field of view, thereby allowing the operator to navigate in a continuous fashion. The vehicle incorporates an on-board computer and a stereo vision system to facilitate obstacle detection. A vehicle adaptation system with a priori risk maps and a real-state tracking system enables temporary autonomous operation of the vehicle for local navigation around obstacles and automatic re-establishment of the vehicle's teleoperated state. Because the vehicle and the operator each hold full autonomy in stages, the operation is referred to as mixed autonomy. Finally, the system provides real-time updates of the virtual environment based on anomalies encountered by the vehicle. The system effectively balances autonomy between the human operator and on-board vehicle intelligence. The reliability results of individual components, along with the overall system implementation and the results of the user study, help show that the VR-based multi-modal teleoperation interface is more adaptable and intuitive than other interfaces.
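
    The hand-off between teleoperation and temporary autonomy described above amounts to a small state machine: the vehicle takes over when its stereo vision detects an obstacle and returns control once the local detour completes. A hedged sketch of that control flow (state names and trigger flags are assumptions, not taken from the thesis):

```python
from enum import Enum, auto

class Mode(Enum):
    TELEOPERATED = auto()   # operator drives via the VR interface
    AUTONOMOUS = auto()     # on-board intelligence detours around obstacles

def update_mode(mode, obstacle_ahead, detour_complete):
    """Mixed-autonomy hand-off (illustrative sketch, not the thesis code).

    obstacle_ahead: stereo-vision obstacle detection flag (assumed input).
    detour_complete: vehicle has cleared the obstacle and re-localized.
    """
    if mode is Mode.TELEOPERATED and obstacle_ahead:
        return Mode.AUTONOMOUS      # vehicle temporarily takes over locally
    if mode is Mode.AUTONOMOUS and detour_complete:
        return Mode.TELEOPERATED    # control is re-established for the operator
    return mode
```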

    Preliminary Work on a Virtual Reality Interface for the Guidance of Underwater Robots

    The need for intervention in underwater environments has increased in recent years, but there is still a long way to go before AUVs (Autonomous Underwater Vehicles) will be able to cope with really challenging missions. Nowadays, the solution adopted is mainly based on remotely operated vehicle (ROV) technology. These ROVs are controlled from support vessels by using unnecessarily complex human–robot interfaces (HRIs). Therefore, it is necessary to reduce the complexity of these systems to make them easier to use and to reduce the stress on the operator. In this paper, as part of the TWIN roBOTs for the cooperative underwater intervention missions (TWINBOT) project, we present an HRI module that includes virtual reality (VR) technology. This contribution is an improvement on a preliminary study in this field also carried out by our laboratory. Hence, having made a concerted effort to improve usability, the HRI system designed for robot control tasks presented in this paper is substantially easier to use. In summary, the reliability and feasibility of this HRI module have been demonstrated through usability tests, including a very complete pilot study, which confirm the much friendlier and more intuitive properties of the final HRI module presented here.

    Performance and Usability Evaluation Scheme for Mobile Manipulator Teleoperation

    This article presents a standardized human–robot teleoperation interface (HRTI) evaluation scheme for mobile manipulators. Teleoperation remains the predominant control type for mobile manipulators in open environments, particularly for quadruped manipulators. However, mobile manipulators, especially quadruped manipulators, are relatively novel systems in industry compared to traditional machinery, and consequently no standardized interface evaluation method has been established for them. The proposed scheme is the first of its kind for evaluating mobile manipulator teleoperation. It comprises a set of robot motion tests, objective measures, subjective measures, and a prediction model to provide a comprehensive evaluation. The motion tests encompass locomotion, manipulation, and a combined test. The duration of each trial is collected as the response variable in the objective measures. Statistical tools, including the mean, standard deviation, and t-test, are used to cross-compare different predictor variables. Based on an extended Fitts' law, the prediction model employs the trial time and a mission difficulty index to forecast system performance in future missions. The subjective measures use the NASA Task Load Index (NASA-TLX) and the System Usability Scale (SUS) to assess workload and usability. Finally, the proposed scheme is implemented on a real-world quadruped manipulator with two widely used HRTIs: a gamepad and a wearable motion capture system.
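
    The prediction model extends Fitts' law, under which completion time grows linearly with an index of difficulty, T = a + b * ID. A minimal sketch of fitting such a model to trial durations follows; the sample data and the least-squares fit are illustrative, not the article's actual pipeline.

```python
import numpy as np

# Hypothetical trial data: mission difficulty index vs. completion time (s).
ID = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
T = np.array([12.1, 18.4, 25.0, 30.8, 37.5])

# Fitts'-style linear model T = a + b * ID, fitted by ordinary least squares.
A = np.vstack([np.ones_like(ID), ID]).T
(a, b), *_ = np.linalg.lstsq(A, T, rcond=None)

print(f"T = {a:.2f} + {b:.2f} * ID")
print("predicted time at ID = 6:", a + b * 6.0)
```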