310 research outputs found

    Vehicle Teleoperation Interfaces

    Virtual and Mixed Reality in Telerobotics: A Survey

    Augmented Reality and Robotics: A Survey and Taxonomy for AR-enhanced Human-Robot Interaction and Robotic Interfaces

    This paper contributes to a taxonomy of augmented reality and robotics based on a survey of 460 research papers. Augmented and mixed reality (AR/MR) have emerged as a new way to enhance human-robot interaction (HRI) and robotic interfaces (e.g., actuated and shape-changing interfaces). Recently, an increasing number of studies in HCI, HRI, and robotics have demonstrated how AR enables better interactions between people and robots. However, research often remains focused on individual explorations and key design strategies, and research questions are rarely analyzed systematically. In this paper, we synthesize and categorize this research field along the following dimensions: 1) approaches to augmenting reality; 2) characteristics of robots; 3) purposes and benefits; 4) classification of presented information; 5) design components and strategies for visual augmentation; 6) interaction techniques and modalities; 7) application domains; and 8) evaluation strategies. We formulate key challenges and opportunities to guide and inform future research in AR and robotics.
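
    As a sketch of how the eight dimensions above could be applied, the following Python snippet codes one hypothetical surveyed paper along each dimension. The field names paraphrase the dimensions listed in the abstract; the example values are illustrative placeholders, not categories taken from the paper.

        # Minimal coding record for the eight taxonomy dimensions; values are hypothetical.
        from dataclasses import dataclass, field

        @dataclass
        class ARRoboticsCoding:
            approach_to_augmenting_reality: str                                 # 1) approach to augmenting reality
            robot_characteristics: str                                          # 2) characteristics of the robot
            purpose_and_benefit: str                                            # 3) purpose and benefit
            presented_information: str                                          # 4) classification of presented information
            visual_design_components: list[str] = field(default_factory=list)   # 5) design components for visual augmentation
            interaction_modalities: list[str] = field(default_factory=list)     # 6) interaction techniques and modalities
            application_domain: str = ""                                        # 7) application domain
            evaluation_strategy: str = ""                                       # 8) evaluation strategy

        # Hypothetical coding of a single surveyed paper:
        example = ARRoboticsCoding(
            approach_to_augmenting_reality="head-mounted display",
            robot_characteristics="mobile robot",
            purpose_and_benefit="communicating robot intent",
            presented_information="planned trajectory overlay",
            visual_design_components=["path overlay", "goal marker"],
            interaction_modalities=["gaze", "handheld controller"],
            application_domain="domestic service",
            evaluation_strategy="controlled user study",
        )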

    Comparison of Human Pilot (Remote) Control Systems in Multirotor Unmanned Aerial Vehicle Navigation

    This paper concerns the human pilot (remote) control systems used in UAV navigation. Demand for unmanned aerial vehicles (UAVs) is increasing tremendously in the aviation industry and in research. A UAV is a flying machine that operates with no pilot onboard and is controlled by ground-based operators. In this paper, a comparison is made between different proposed remote control systems and devices for navigating multirotor UAVs, such as hand-held controllers, gesture and body-posture techniques, and vision-based techniques. The reviews discussed in this paper draw on various research sources related to UAVs and their navigation systems. Every method has its pros and cons depending on the situation. At the end of the study, these methods are analyzed and the best method is chosen in terms of accuracy and efficiency.

    From teleoperation to the cognitive human-robot interface

    Robots are slowly moving from factories to mines, construction sites, public places and homes. This new type of robot or robotized working machine – the field and service robot (FSR) – should be capable of performing different kinds of tasks in unstructured, changing environments, not only among humans but also in continuous interaction with them. The main requirements for an FSR are mobility, advanced perception capabilities, high "intelligence" and easy interaction with humans. Although mobility and perception capabilities are no longer bottlenecks, they can nevertheless still be greatly improved. The main bottlenecks are intelligence and the human-robot interface (HRI). Despite huge efforts in "artificial intelligence" research, robots and computers are still very "stupid", and there are no major advancements on the horizon. This emphasizes the importance of the HRI. In subtasks where high-level cognition or intelligence is needed, the robot has to ask the operator for help. In addition to task commands and supervision, the HRI has to provide the possibility of exchanging information about the task and environment through continuous dialogue, and even methods for direct teleoperation. The thesis describes the development from teleoperation to service robot interfaces and analyses the usability aspects of both teleoperation/telepresence systems and robot interfaces based on high-level cognitive interaction. The analogy between the development of teleoperation interfaces and that of HRIs is also pointed out. Teleoperation and telepresence interfaces are studied on the basis of a set of experiments in which telepresence systems with different levels of enhancement were tested in different driving-type tasks. The study is concluded by comparing the usability aspects and the feeling of presence in a telepresence system. HRIs are studied with an experimental service robot, WorkPartner. Different kinds of direct teleoperation, dialogue and spatial information interfaces are presented and tested. The concepts of the cognitive interface and common presence are presented. Finally, the usability aspects of the human interface of a service robot are discussed and evaluated.
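
    To illustrate the interaction pattern described above, in which the robot asks the operator for help in subtasks that exceed its own cognition, here is a minimal Python sketch of such a control loop. The subtask names, confidence estimates and threshold are hypothetical; this is not the WorkPartner interface, only an illustration of the fallback from autonomy to dialogue and direct teleoperation.

        # Illustrative sketch of a cognitive HRI loop: the robot executes subtasks
        # autonomously, but asks the operator for help (dialogue) or hands over to
        # direct teleoperation when a subtask exceeds its competence.
        # Task names, confidence values and the threshold are hypothetical.

        def robot_confidence(subtask: str) -> float:
            """Placeholder competence estimate; a real robot would derive this
            from its perception and planning results."""
            return {"drive to gate": 0.9, "identify broken part": 0.3}.get(subtask, 0.5)

        def ask_operator(question: str) -> str:
            """Placeholder dialogue channel to the human operator."""
            return input(f"[robot -> operator] {question} ")

        def execute_autonomously(subtask: str) -> None:
            print(f"[robot] executing '{subtask}' autonomously")

        def teleoperate(subtask: str) -> None:
            print(f"[robot] handing '{subtask}' over to direct teleoperation")

        def run_mission(subtasks: list[str], threshold: float = 0.6) -> None:
            for subtask in subtasks:
                if robot_confidence(subtask) >= threshold:
                    execute_autonomously(subtask)
                else:
                    answer = ask_operator(f"I am unsure how to '{subtask}'. Guide me or take over?")
                    if answer.strip().lower() == "take over":
                        teleoperate(subtask)
                    else:
                        print(f"[robot] using operator guidance: {answer}")
                        execute_autonomously(subtask)

        run_mission(["drive to gate", "identify broken part"])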

    Large-scale environment mapping and immersive human-robot interaction for agricultural mobile robot teleoperation

    Remote operation is a crucial solution to problems encountered in agricultural machinery operations. However, traditional video-streaming control methods fall short because they offer only a single perspective and cannot provide 3D information. In light of these issues, our research proposes a large-scale digital map reconstruction and immersive human-machine remote control framework for agricultural scenarios. In our methodology, a DJI unmanned aerial vehicle (UAV) was utilized for data collection, and a novel video segmentation approach based on feature points was introduced. To tackle variability in texture richness, an enhanced Structure from Motion (SfM) pipeline using superpixel segmentation was implemented. This method integrates the open Multiple View Geometry (openMVG) framework with Local Features from Transformers (LoFTR). The enhanced SfM produces a point cloud map, which is further processed through Multi-View Stereo (MVS) to generate a complete map model. For control, a closed-loop system utilizing TCP for VR control and positioning of agricultural machinery was introduced. Our system offers a fully vision-based immersive control method: once connected to the local area network, operators can use VR for immersive remote control. The proposed method enhances both the robustness and the convenience of the reconstruction process, thereby significantly helping operators to acquire more comprehensive on-site information and to engage in immersive remote control operations. The code is available at: https://github.com/LiuTao1126/Enhance-SF
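
    The closed control loop over TCP can be sketched as follows (Python standard library only). The newline-delimited JSON message format, the port number and the simple unicycle pose update are assumptions made for illustration; they are not the authors' protocol or vehicle model. The VR side sends a velocity command each cycle and reads back the vehicle pose, which closes the loop.

        # Minimal sketch of a TCP closed control loop between a VR operator station
        # and a vehicle controller. Message format, port and pose model are hypothetical.
        import json
        import math
        import socket
        import threading
        import time

        HOST, PORT = "127.0.0.1", 15000  # hypothetical local test endpoint

        def vehicle_server() -> None:
            """Vehicle side: receive velocity commands over TCP, integrate a simple
            unicycle pose, and send the updated pose back to refresh the VR view."""
            pose = {"x": 0.0, "y": 0.0, "heading": 0.0}
            with socket.create_server((HOST, PORT)) as srv:
                conn, _ = srv.accept()
                with conn, conn.makefile("rb") as rx, conn.makefile("wb") as tx:
                    for line in rx:                      # one JSON command per line
                        cmd = json.loads(line)           # {"v": m/s, "w": rad/s, "dt": s}
                        pose["heading"] += cmd["w"] * cmd["dt"]
                        pose["x"] += cmd["v"] * math.cos(pose["heading"]) * cmd["dt"]
                        pose["y"] += cmd["v"] * math.sin(pose["heading"]) * cmd["dt"]
                        tx.write((json.dumps(pose) + "\n").encode())
                        tx.flush()

        def vr_operator_client(steps: int = 5) -> None:
            """VR side: send a velocity command each cycle and read back the
            vehicle pose, closing the control loop."""
            with socket.create_connection((HOST, PORT)) as sock, \
                 sock.makefile("rb") as rx, sock.makefile("wb") as tx:
                for _ in range(steps):
                    tx.write((json.dumps({"v": 1.0, "w": 0.1, "dt": 0.1}) + "\n").encode())
                    tx.flush()
                    print("pose from vehicle:", json.loads(rx.readline()))

        if __name__ == "__main__":
            threading.Thread(target=vehicle_server, daemon=True).start()
            time.sleep(0.2)  # sketch-level synchronization: let the server start listening
            vr_operator_client()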
    • …