Large-scale environment mapping and immersive human-robot interaction for agricultural mobile robot teleoperation
Remote operation is a crucial solution to problems encountered in
agricultural machinery operations. However, traditional video-streaming
control methods are limited by single-perspective views and the inability to
recover 3D information. To address these issues, our research
proposes a large-scale digital map reconstruction and immersive human-machine
remote control framework for agricultural scenarios. In our methodology, a DJI
unmanned aerial vehicle (UAV) was utilized for data collection, and a novel
video segmentation approach based on feature points was introduced. To tackle
texture richness variability, an enhanced Structure from Motion (SfM) using
superpixel segmentation was implemented. This method integrates the open
Multiple View Geometry (openMVG) framework along with Local Features from
Transformers (LoFTR). The enhanced SfM results in a point cloud map, which is
further processed through Multi-View Stereo (MVS) to generate a complete map
model. For control, a closed-loop TCP-based system for VR control and
positioning of agricultural machinery was introduced. Our system offers a
fully vision-based control method: once connected to the local area network,
operators can use VR for immersive remote control. The proposed
method enhances both the robustness and convenience of the reconstruction
process, thereby significantly facilitating operators in acquiring more
comprehensive on-site information and engaging in immersive remote control
operations. The code is available at: https://github.com/LiuTao1126/Enhance-SF
Evaluating Immersive Teleoperation Interfaces: Coordinating Robot Radiation Monitoring Tasks in Nuclear Facilities
We present a virtual reality (VR) teleoperation interface for a ground-based robot, featuring dense 3D environment reconstruction and a low-latency video stream, with which operators can immersively explore remote environments. At the UK Atomic Energy Authority's (UKAEA) Remote Applications in Challenging Environments (RACE) facility, we applied the interface in a user study where trained robotics operators completed simulated nuclear monitoring and decommissioning-style tasks to compare VR and traditional teleoperation interface designs. We found that operators in the VR condition took longer to complete the experiment, had fewer collisions, and rated the generated 3D map with higher importance when compared to non-VR operators. Additional physiological data suggested that VR operators had a lower objective cognitive workload during the experiment but also experienced increased physical demand. Overall, the presented results show that VR interfaces may benefit work patterns in teleoperation tasks within the nuclear industry, but further work is needed to investigate how such interfaces can be integrated into real-world decommissioning workflows.
Intuitive Robot Teleoperation through Multi-Sensor Informed Mixed Reality Visual Aids
© 2021 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/. Mobile robotic systems have evolved to include sensors capable of describing robot status and the operating environment more accurately and reliably than ever before. This possibility is challenged by effective sensor data exploitation, because of the cognitive load an operator is exposed to due to the large amount of data and time-dependency constraints. This paper addresses this challenge in remote-vehicle teleoperation by proposing an intuitive way to present sensor data to users by means of mixed reality and visual aids within the user interface. We propose a method for organizing information presentation and a set of visual aids to facilitate visual communication of data in teleoperation control panels. The resulting sensor-information presentation appears coherent and intuitive, making it easier for an operator to grasp and comprehend the meaning of the information. This increases situational awareness and speeds up decision-making. Our method is implemented on a real mobile robotic system operating outdoors, equipped with on-board internal and external sensors, GPS, and a reconstructed 3D graphical model provided by an assistant drone. Experimentation verified feasibility, while intuitive and comprehensive visual communication was confirmed through a qualitative assessment, which encourages further developments. Peer reviewed.
Exploring Robot Teleoperation in Virtual Reality
This thesis presents research on VR-based robot teleoperation, focusing on remote-environment visualisation in virtual reality, the effect of the reconstruction scale of the remote environment on the human operator's ability to control the robot, and the human operator's visual attention patterns when teleoperating a robot from virtual reality.
A VR-based robot teleoperation framework was developed that is compatible with various robotic systems and cameras, allowing teleoperation and supervised control with any ROS-compatible robot and visualisation of the environment through any ROS-compatible RGB and RGBD cameras. The framework includes mapping, segmentation, tactile exploration, and non-physically demanding VR interface navigation and controls through any Unity-compatible VR headset and controllers or haptic devices.
Point clouds are a common way to visualise remote environments in 3D, but they often have distortions and occlusions, making it difficult to accurately represent objects' textures. This can lead to poor decision-making during teleoperation if objects are inaccurately represented in the VR reconstruction. A study using an end-effector-mounted RGBD camera with OctoMap mapping of the remote environment was conducted to explore the remote environment with fewer point cloud distortions and occlusions while using a relatively small bandwidth. Additionally, a tactile exploration study proposed a novel method for visually presenting information about objects' materials in the VR interface, to improve the operator's decision-making and address the challenges of point cloud visualisation.
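OctoMap-style mapping of the kind used in the study above maintains per-voxel occupancy in log-odds form, which keeps repeated sensor updates cheap and bounded. A minimal sketch of the standard update rule (constants follow common OctoMap defaults, not the thesis code):

```python
import math

# Log-odds occupancy update as used by OctoMap-style voxel maps: each
# sensor hit/miss adds a fixed log-odds increment, and the value is
# clamped so cells can later be revised. Constants are common OctoMap
# defaults; this is an illustrative sketch, not the thesis code.
L_HIT = math.log(0.7 / 0.3)    # increment for an observed occupied cell
L_MISS = math.log(0.4 / 0.6)   # increment for an observed free cell
L_MIN, L_MAX = -2.0, 3.5       # clamping bounds

def update_cell(log_odds, hit):
    """Apply one hit/miss observation to a cell's log-odds value."""
    log_odds += L_HIT if hit else L_MISS
    return max(L_MIN, min(L_MAX, log_odds))

def occupancy_probability(log_odds):
    """Convert log-odds back to an occupancy probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-log_odds))
```

The clamping bounds are what keep the map revisable: a cell observed occupied many times saturates at L_MAX rather than growing without limit, so a few later "free" observations can still flip it.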
Two studies have been conducted to understand the effect of virtual world dynamic scaling on teleoperation flow. The first study investigated the use of rate mode control with constant and variable mapping of the operator's joystick position to the speed (rate) of the robot's end-effector, depending on the virtual world scale. The results showed that variable mapping allowed participants to teleoperate the robot more effectively but at the cost of increased perceived workload.
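The contrast between constant and variable rate-mode mapping in the first study can be pictured as follows; the gains and the inverse-proportional scaling law are illustrative assumptions, not the study's actual parameters:

```python
# Rate-mode control: joystick deflection in [-1, 1] commands end-effector
# speed. Under constant mapping the gain is fixed; under variable mapping
# the gain shrinks as the virtual world is magnified, so a zoomed-in view
# yields finer motion. Gains are illustrative, not the study's values.
BASE_RATE = 0.10  # m/s at full deflection, world scale 1.0

def constant_rate(joystick):
    """Fixed joystick-to-speed mapping, independent of VR scale."""
    return BASE_RATE * joystick

def variable_rate(joystick, world_scale):
    """Scale-dependent mapping: slow the robot as the scene is magnified
    so that on-screen end-effector motion feels consistent."""
    return BASE_RATE * joystick / world_scale
```

Under this sketch, doubling the virtual world scale halves the commanded speed for the same joystick deflection, which matches the intuition that magnified views call for finer motion.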
The second study compared how operators used the virtual world scale in supervised control, comparing the scales participants chose at the beginning and end of a 3-day experiment. The results showed that, as operators became better at the task, they as a group adopted a different virtual world scale, and that participants' prior video gaming experience also affected the scale they chose.
Similarly, the visual attention study investigated how the human operator's visual attention changes as they become better at teleoperating a robot using the framework.
The results revealed the most important objects in the VR-reconstructed remote environment, as indicated by operators' visual attention patterns, and showed how their visual priorities shifted as they became better at teleoperating the robot. The study also demonstrated that operators' prior video gaming experience affects their ability to teleoperate the robot and their visual attention behaviours.
Development and evaluation of mixed reality-enhanced robotic systems for intuitive tele-manipulation and telemanufacturing tasks in hazardous conditions
In recent years, with the rapid development of space exploration, deep-sea discovery, nuclear rehabilitation and management, and robotic-assisted medical devices, there is an urgent need for humans to interactively control robotic systems to perform increasingly precise remote operations. The value of medical telerobotic applications during the recent coronavirus pandemic has also been demonstrated and will grow in the future. This thesis investigates novel approaches to the development and evaluation of a mixed reality-enhanced telerobotic platform for intuitive remote teleoperation applications in dangerous and difficult working conditions, such as contaminated sites and undersea or extreme welding scenarios. This research aims to remove human workers from harmful working environments by equipping complex robotic systems with human intelligence and command/control via intuitive and natural human-robot interaction, including the implementation of MR techniques to improve the user's situational awareness, depth perception, and spatial cognition, which are fundamental to effective and efficient teleoperation.
The proposed robotic mobile manipulation platform consists of a UR5 industrial manipulator, 3D-printed parallel gripper, and customized mobile base, which is envisaged to be controlled by non-skilled operators who are physically separated from the robot working space through an MR-based vision/motion mapping approach. The platform development process involved CAD/CAE/CAM and rapid prototyping techniques, such as 3D printing and laser cutting. Robot Operating System (ROS) and Unity 3D are employed in the developing process to enable the embedded system to intuitively control the robotic system and ensure the implementation of immersive and natural human-robot interactive teleoperation.
This research presents an integrated motion/vision retargeting scheme based on a mixed reality subspace approach for intuitive and immersive telemanipulation. An imitation-based, velocity-centric motion mapping is implemented via the MR subspace to accurately track operator hand movements for robot motion control, enabling spatial velocity-based control of the robot tool center point (TCP). The proposed system allows precise manipulation of the end-effector position and orientation while readily adjusting the corresponding maneuvering velocity.
A mixed reality-based multi-view merging framework for immersive and intuitive telemanipulation of a complex mobile manipulator with integrated 3D/2D vision is presented. The proposed 3D immersive telerobotic schemes provide the users with depth perception through the merging of multiple 3D/2D views of the remote environment via MR subspace. The mobile manipulator platform can be effectively controlled by non-skilled operators who are physically separated from the robot working space through a velocity-based imitative motion mapping approach.
Finally, this thesis presents an integrated mixed reality and haptic feedback scheme for intuitive and immersive teleoperation of robotic welding systems. By incorporating MR technology, the user is fully immersed in a virtual operating space augmented by real-time visual feedback from the robot working space. The proposed mixed reality virtual fixture integration approach implements hybrid haptic constraints that guide the operator's hand movements along conical guidance to effectively align the welding torch for welding and constrain the welding operation within a collision-free area.
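One way to picture a conical virtual fixture like the one described above is as a spring force that activates only when the hand leaves a cone anchored at the weld target; the geometry and stiffness below are illustrative assumptions, not the thesis parameters:

```python
import math

# Conical virtual fixture: hand positions are free inside a cone whose
# apex sits at the weld target (origin, axis +z); positions outside the
# cone generate a spring-like haptic force magnitude pushing the hand
# back toward the cone surface. Geometry and stiffness are illustrative,
# not the thesis parameters.
HALF_ANGLE = math.radians(15.0)  # cone half-angle
STIFFNESS = 200.0                # N/m restoring gain

def fixture_force(x, y, z):
    """Restoring force magnitude for a hand at (x, y, z)."""
    if z <= 0.0:
        return 0.0  # behind the apex: no guidance in this sketch
    radial = math.hypot(x, y)           # distance from the cone axis
    allowed = z * math.tan(HALF_ANGLE)  # cone radius at this depth
    violation = max(0.0, radial - allowed)
    return STIFFNESS * violation
```

Inside the cone the operator feels nothing, so free motion is preserved; the constraint only pushes back once the torch would drift out of alignment.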
Overall, this thesis presents a complete telerobotic application that uses mixed reality and immersive elements to effectively translate the operator into the robot's space in an intuitive and natural manner. The results are thus a step forward in cost-effective and computationally efficient human-robot interaction research and technologies. The system presented is readily extensible to a range of potential applications beyond the robotic tele-welding and tele-manipulation tasks used to demonstrate, optimise, and prove the concepts.
Sensory Manipulation as a Countermeasure to Robot Teleoperation Delays: System and Evidence
In the field of robotics, robot teleoperation for remote or hazardous
environments has become increasingly vital. A major challenge is the lag
between command and action, negatively affecting operator awareness,
performance, and mental strain. Even with advanced technology, mitigating these
delays, especially in long-distance operations, remains challenging. Current
solutions largely focus on machine-based adjustments. Yet, there's a gap in
using human perceptions to improve the teleoperation experience. This paper
presents a unique method of sensory manipulation to help humans adapt to such
delays. Drawing from motor learning principles, it suggests that modifying
sensory stimuli can lessen the perception of these delays. Instead of
introducing new skills, the approach uses existing motor coordination
knowledge. The aim is to minimize the need for extensive training or complex
automation. A study with 41 participants explored the effects of altered haptic
cues in delayed teleoperations. These cues were sourced from advanced physics
engines and robot sensors. Results highlighted benefits like reduced task time
and improved perceptions of visual delays. Real-time haptic feedback
significantly contributed to reduced mental strain and increased confidence.
This research emphasizes human adaptation as a key element in robot
teleoperation, advocating for improved teleoperation efficiency via swift human
adaptation, rather than solely optimizing robots for delay adjustment.
Comment: Submitted to Scientific Reports
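One simple realisation of rendering haptic cues from a local prediction rather than from delayed remote sensing is to extrapolate the last two delayed measurements across the known delay; this constant-velocity sketch is an illustration of the idea, not the paper's physics-engine method:

```python
# Predicted haptic cue under delay: instead of rendering force from the
# delayed remote measurement, extrapolate the last two measurements to
# "now" using a constant-velocity model. This is one simple realisation
# of the idea of driving haptics from a local prediction, not the
# authors' physics-engine implementation.

def extrapolate(prev, curr, dt_meas, delay):
    """Linearly extrapolate a delayed signal across the known delay.

    prev, curr: the two most recent delayed measurements
    dt_meas:    time between those measurements (s)
    delay:      round-trip delay to bridge (s)
    """
    if dt_meas <= 0.0:
        return curr  # degenerate timing: fall back to the latest sample
    rate = (curr - prev) / dt_meas
    return curr + rate * delay
```

A stationary signal is returned unchanged, while a moving one is projected forward, so the rendered cue leads the delayed measurement by exactly the bridged delay.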