148 research outputs found

    Digital Cognitive Companions for Marine Vessels: On the Path Towards Autonomous Ships

    As in the automotive industry, industry and academia are making extensive efforts to create autonomous ships. The solutions are technology-intensive: many building blocks, often relying on AI technology, need to work together to create a complete system that is safe and reliable to use. Even when ships are fully unmanned, humans are still expected to guide them when unknown situations arise, and this will be done through teleoperation systems. In this thesis, methods are presented to enhance the capability of two building blocks that are important for autonomous ships: a positioning system and a system for teleoperation.

    The positioning system has been constructed not to rely on the Global Positioning System (GPS), since GPS can be jammed or spoofed. Instead, it uses Bayesian estimation to compare bottom-depth and magnetic-field measurements with known sea charts and magnetic-field maps in order to estimate the position. State-of-the-art techniques for this approach typically use high-resolution maps, but high-resolution terrain maps are available for hardly anywhere in the world. We therefore present a method that uses standard sea charts and compensates for their lower accuracy by fusing other domains, such as magnetic-field intensity and bearings to landmarks. Using data from a field trial, we showed that the fusion method using multiple domains was more robust than using only one domain.

    For the second building block, we first investigated how 3D and VR approaches could support the remote operation of unmanned ships over a low-throughput data connection, by comparing the respective graphical user interfaces (GUIs) with a Baseline GUI modelled on the interfaces currently applied in such contexts. Our findings show that both the 3D and VR approaches significantly outperform the traditional approach: 3D GUI and VR GUI users reacted better to potentially dangerous situations than Baseline GUI users and kept track of the surroundings more accurately. Building on this, we conducted a teleoperation user study using real-world data from a field trial in the archipelago, in which users assisted the positioning system by providing bearings to landmarks. The users experienced the tool as giving a good overview, and despite the low-throughput connection they managed, through the GUI, to significantly improve the positioning accuracy.
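    The positioning approach described above can be illustrated with a minimal Python sketch of a multi-domain measurement update in a particle filter, assuming Gaussian measurement noise, gridded depth and magnetic maps with a simple lookup interface, and operator-supplied bearings to charted landmarks. All function names, map interfaces, and noise parameters below are illustrative assumptions, not the implementation evaluated in the thesis.

import numpy as np

def gaussian_likelihood(measured, expected, sigma):
    # Unnormalised Gaussian likelihood of a scalar measurement given the
    # value predicted from the map at a hypothesised position.
    return np.exp(-0.5 * ((measured - expected) / sigma) ** 2)

def bearing_likelihood(measured_bearing, particle_xy, landmark_xy, sigma=np.deg2rad(3.0)):
    # Likelihood of an operator-reported bearing to a charted landmark,
    # with the angular error wrapped to (-pi, pi].
    dx = landmark_xy[0] - particle_xy[0]
    dy = landmark_xy[1] - particle_xy[1]
    expected = np.arctan2(dy, dx)
    error = np.angle(np.exp(1j * (measured_bearing - expected)))
    return np.exp(-0.5 * (error / sigma) ** 2)

def measurement_update(particles, weights, z_depth, z_mag,
                       depth_map, mag_map, sigma_depth=2.0, sigma_mag=50.0):
    # Re-weight position hypotheses (particles, an iterable of (x, y)) by how
    # well the measured bottom depth and magnetic field intensity agree with
    # the sea chart and magnetic map at each hypothesised position.
    # depth_map / mag_map are assumed to expose a lookup(x, y) method.
    for i, (x, y) in enumerate(particles):
        w = gaussian_likelihood(z_depth, depth_map.lookup(x, y), sigma_depth)
        w *= gaussian_likelihood(z_mag, mag_map.lookup(x, y), sigma_mag)
        weights[i] *= w
    return weights / np.sum(weights)    # normalise so the weights sum to one

    In a full system, these per-domain likelihood updates would be combined with a motion model that propagates the position hypotheses between measurements; fusing several weak domains in this way is what makes the estimate more robust than relying on any single one.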

    Development and evaluation of mixed reality-enhanced robotic systems for intuitive tele-manipulation and telemanufacturing tasks in hazardous conditions

    In recent years, with the rapid development of space exploration, deep-sea discovery, nuclear rehabilitation and management, and robot-assisted medical devices, there is an urgent need for humans to interactively control robotic systems to perform increasingly precise remote operations. The value of medical telerobotic applications during the recent coronavirus pandemic has also been demonstrated and will grow in the future. This thesis investigates novel approaches to the development and evaluation of a mixed reality-enhanced telerobotic platform for intuitive remote teleoperation in dangerous and difficult working conditions, such as contaminated sites and undersea or extreme welding scenarios. This research aims to remove human workers from harmful working environments by equipping complex robotic systems with human intelligence and command/control via intuitive and natural human-robot interaction, including the implementation of MR techniques to improve the user's situational awareness, depth perception, and spatial cognition, which are fundamental to effective and efficient teleoperation.

    The proposed robotic mobile manipulation platform consists of a UR5 industrial manipulator, a 3D-printed parallel gripper, and a customized mobile base, and is envisaged to be controlled by non-skilled operators who are physically separated from the robot working space through an MR-based vision/motion mapping approach. The platform development process involved CAD/CAE/CAM and rapid prototyping techniques, such as 3D printing and laser cutting. Robot Operating System (ROS) and Unity 3D are employed in the development process to enable intuitive control of the robotic system and to ensure immersive and natural human-robot interactive teleoperation.

    This research presents an integrated motion/vision retargeting scheme based on a mixed reality subspace approach for intuitive and immersive telemanipulation. An imitation-based, velocity-centric motion mapping is implemented via the MR subspace to accurately track operator hand movements for robot motion control, enabling spatial velocity-based control of the robot tool center point (TCP). The proposed system allows precise manipulation of end-effector position and orientation while readily adjusting the corresponding manoeuvring velocity. A mixed reality-based multi-view merging framework for immersive and intuitive telemanipulation of a complex mobile manipulator with integrated 3D/2D vision is also presented: the proposed 3D immersive telerobotic schemes provide users with depth perception through the merging of multiple 3D/2D views of the remote environment via the MR subspace, and the mobile manipulator platform can be effectively controlled by non-skilled operators physically separated from the robot working space through a velocity-based imitative motion mapping approach.

    Finally, this thesis presents an integrated mixed reality and haptic feedback scheme for intuitive and immersive teleoperation of robotic welding systems. By incorporating MR technology, the user is fully immersed in a virtual operating space augmented by real-time visual feedback from the robot working space. The proposed mixed reality virtual fixture integration approach implements hybrid haptic constraints that guide the operator's hand movements along a conical guidance, so as to effectively align the welding torch for welding and to constrain the welding operation within a collision-free area (a simplified sketch of the motion mapping and the conical fixture follows this abstract).
    Overall, this thesis presents a complete telerobotic application-space technology that uses mixed reality and immersive elements to effectively translate the operator into the robot's space in an intuitive and natural manner. The results are thus a step forward in cost-effective and computationally efficient human-robot interaction research and technologies. The system presented is readily extensible to a range of potential applications beyond the robotic tele-welding and tele-manipulation tasks used to demonstrate, optimise, and prove the concepts.
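    The velocity-centric motion mapping and the conical virtual fixture described above can be illustrated with a minimal Python sketch, assuming Cartesian velocity commands to the TCP at a fixed control rate. The scaling factor, speed limit, cone half-angle, and function names are illustrative assumptions rather than the parameters or implementation used in the thesis.

import numpy as np

def hand_to_tcp_velocity(hand_pos, prev_hand_pos, dt, scale=0.5, v_max=0.1):
    # Map the operator's hand displacement over one control cycle (tracked in
    # the MR subspace) to a scaled, saturated Cartesian TCP velocity command.
    v = scale * (np.asarray(hand_pos, dtype=float) - np.asarray(prev_hand_pos, dtype=float)) / dt
    speed = np.linalg.norm(v)
    if speed > v_max:                       # saturate for safe manoeuvring
        v *= v_max / speed
    return v

def conical_fixture_clamp(v_cmd, cone_axis, half_angle_deg=15.0):
    # Keep the commanded velocity direction inside a cone about the guidance
    # axis (e.g. pointing along the approach to the weld seam); commands
    # falling outside the cone are projected onto its boundary.
    axis = np.asarray(cone_axis, dtype=float)
    axis /= np.linalg.norm(axis)
    v_cmd = np.asarray(v_cmd, dtype=float)
    speed = np.linalg.norm(v_cmd)
    if speed < 1e-9:
        return v_cmd                        # no motion commanded
    direction = v_cmd / speed
    half_angle = np.deg2rad(half_angle_deg)
    if direction @ axis >= np.cos(half_angle):
        return v_cmd                        # already inside the cone
    # Decompose the direction into components along and orthogonal to the
    # axis, then rebuild it on the cone boundary closest to the command.
    perp = direction - (direction @ axis) * axis
    if np.linalg.norm(perp) < 1e-9:
        return speed * axis                 # anti-parallel command: snap to the axis
    perp /= np.linalg.norm(perp)
    boundary_dir = np.cos(half_angle) * axis + np.sin(half_angle) * perp
    return speed * boundary_dir

    In such a scheme, the clamped velocity command would be sent to the manipulator's velocity controller every control cycle, with the fixture axis derived from the weld-seam geometry in the robot's workspace.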