
    Virtual and Mixed Reality in Telerobotics: A Survey


    MAR-CPS: Measurable Augmented Reality for Prototyping Cyber-Physical Systems

    Cyber-Physical Systems (CPSs) refer to engineering platforms that rely on the integration of physical systems with control, computation, and communication technologies. Autonomous vehicles are instances of CPSs that are rapidly growing with applications in many domains. Due to the integration of physical systems with computational sensing, planning, and learning in CPSs, hardware-in-the-loop experiments are an essential step for transitioning from simulations to real-world experiments. This paper proposes an architecture for rapid prototyping of CPSs that has been developed in the Aerospace Controls Laboratory at the Massachusetts Institute of Technology. This system, referred to as MAR-CPS (Measurable Augmented Reality for Prototyping Cyber-Physical Systems), includes physical vehicles and sensors, a motion capture technology, a projection system, and a communication network. The role of the projection system is to augment a physical laboratory space with 1) autonomous vehicles' beliefs and 2) a simulated mission environment, which in turn will be measured by physical sensors on the vehicles. The main focus of this method is on rapid design of planning, perception, and learning algorithms for autonomous single-agent or multi-agent systems. Moreover, the proposed architecture allows researchers to project a simulated counterpart of outdoor environments in a controlled, indoor space, which can be crucial when testing in outdoor environments is disfavored due to safety, regulatory, or monetary concerns. We discuss the issues related to the design and implementation of MAR-CPS and demonstrate its real-time behavior in a variety of problems in autonomy, such as motion planning, multi-robot coordination, and learning spatio-temporal fields.
    Boeing Company
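The measurable-AR cycle this abstract describes (motion capture supplies each vehicle's true pose, the simulated mission environment is sampled at that pose, and the result is both fed back as a sensor reading and projected into the lab space) can be sketched roughly as follows. All names, the stand-in field, and the belief update are illustrative assumptions, not the MAR-CPS API:

```python
import numpy as np

def simulated_field(xy):
    """Stand-in for the projected mission environment, e.g. a scalar
    spatio-temporal field the vehicles are learning (assumed shape)."""
    x, y = xy
    return float(np.exp(-(x**2 + y**2)))

def mar_cps_step(mocap_pose, belief, alpha=0.1):
    """One cycle: motion capture gives the vehicle's true pose, the
    simulated environment is 'measured' at that pose, the vehicle's
    belief is updated, and both are returned for floor projection."""
    measurement = simulated_field(mocap_pose)            # virtual sensor reading
    belief = (1 - alpha) * belief + alpha * measurement  # toy exponential filter
    return belief, {"pose": mocap_pose, "belief": belief}

# Vehicle approaching the field's peak over three motion-capture frames.
belief = 0.0
for pose in [(1.0, 0.0), (0.5, 0.0), (0.0, 0.0)]:
    belief, projection = mar_cps_step(pose, belief)
```

The point of the sketch is the data flow, not the filter: the "sensor" reads a simulated world registered to the real pose, which is what makes the augmented environment measurable.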

    An Application of Augmented Reality (AR) in the Teaching of an Arc Welding Robot

    Augmented Reality (AR) is an emerging technology that utilizes computer vision methods to overlay virtual objects onto the real-world scene so that they appear to co-exist with the real objects. Its main objective is to enhance the user's interaction with the real world by providing the right information needed to perform a certain task. Applications of this technology in manufacturing include maintenance, assembly, and telerobotics. In this paper, we explore the potential of teaching a robot to perform an arc welding task in an AR environment. We present the motivation, the features of a system using the popular ARToolkit package, and a discussion of the issues and implications of our research.
    Singapore-MIT Alliance (SMA)
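At the core of a marker-based overlay such as ARToolkit's is a pose-and-projection step: the tracked marker pose places virtual geometry (e.g. a planned weld path) in the camera frame, and the camera intrinsics map it onto the image. A minimal, toolkit-independent sketch of that math (the intrinsics and pose below are made-up example values):

```python
import numpy as np

def project_virtual_point(p_marker, R, t, K):
    """Project a 3-D point given in the marker's frame into image pixels:
    marker frame -> camera frame via the tracked pose (R, t), then
    perspective division via the camera intrinsics K."""
    p_cam = R @ np.asarray(p_marker, float) + t
    u, v, w = K @ p_cam
    return np.array([u / w, v / w])

# Pinhole intrinsics (focal length 800 px, principal point at 320, 240)
# and a marker sitting 2 m straight ahead of the camera, unrotated.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
px = project_virtual_point([0.0, 0.0, 0.0], np.eye(3),
                           np.array([0.0, 0.0, 2.0]), K)
# the marker origin lands at the principal point (320, 240)
```

In a full system, R and t come from the tracker's marker detection each frame; rendering every vertex of the virtual weld path through this mapping keeps it registered to the workpiece.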

    Virtual Reality-Based Interface for Advanced Assisted Mobile Robot Teleoperation

    This work proposes a new interface for the teleoperation of mobile robots based on virtual reality that allows a natural and intuitive interaction and cooperation between the human and the robot, which is useful for many situations, such as inspection tasks, the mapping of complex environments, etc. Contrary to previous works, the proposed interface does not seek the realism of the virtual environment but provides the minimum necessary elements that allow the user to carry out the teleoperation task in a more natural and intuitive way. The teleoperation is carried out in such a way that the human user and the mobile robot cooperate synergistically to properly accomplish the task: the user guides the robot through the environment in order to benefit from the intelligence and adaptability of the human, whereas the robot automatically avoids collisions with the objects in the environment in order to benefit from its fast response. The latter is carried out using the well-known potential field-based navigation method. The efficacy of the proposed method is demonstrated through experimentation with the Turtlebot3 Burger mobile robot in both simulation and real-world scenarios. In addition, usability and presence questionnaires were conducted with users of different ages and backgrounds to demonstrate the benefits of the proposed approach. In particular, the results of these questionnaires show that the proposed virtual reality-based interface is intuitive, ergonomic, and easy to use.
    This research was funded by the Spanish Government (Grant PID2020-117421RB-C21 funded by MCIN/AEI/10.13039/501100011033) and by the Generalitat Valenciana (Grant GV/2021/181).
    Solanes, JE.; Muñoz García, A.; Gracia Calandin, LI.; Tornero Montserrat, J. (2022). Virtual Reality-Based Interface for Advanced Assisted Mobile Robot Teleoperation. Applied Sciences. 12(12):1-22. https://doi.org/10.3390/app12126071
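The potential field-based navigation the abstract refers to is a standard textbook formulation: attraction toward the goal plus repulsion from nearby obstacles. A minimal sketch (the gains and influence radius are illustrative values, not the authors'):

```python
import numpy as np

def potential_field_step(robot, goal, obstacles,
                         k_att=1.0, k_rep=100.0, d0=1.0):
    """Return the net force on the robot: attraction to the goal plus
    repulsion from every obstacle closer than the influence radius d0."""
    robot, goal = np.asarray(robot, float), np.asarray(goal, float)
    force = k_att * (goal - robot)  # attractive component
    for obs in obstacles:
        diff = robot - np.asarray(obs, float)
        d = np.linalg.norm(diff)
        if 0.0 < d < d0:
            # Repulsion grows sharply as the robot nears the obstacle.
            force += k_rep * (1.0 / d - 1.0 / d0) / d**2 * (diff / d)
    return force

# With no obstacles the force points straight at the goal; an obstacle
# just above the straight-line path pushes the commanded motion below it.
f = potential_field_step([0.0, 0.0], [5.0, 0.0], [[0.5, 0.3]])
```

In a shared-control setup like the one described, the user's guidance would set the goal (or bias the attractive term) while the robot follows the resulting force, e.g. by clipping it to a velocity command, so collision avoidance stays automatic.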

    Intuitive Robot Teleoperation through Multi-Sensor Informed Mixed Reality Visual Aids

    © 2021 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.
    Mobile robotic systems have evolved to include sensors capable of truthfully describing robot status and the operating environment more accurately and reliably than ever before. Exploiting this sensor data effectively is challenging because of the cognitive load an operator is exposed to, due to the large amount of data and time-dependency constraints. This paper addresses this challenge in remote-vehicle teleoperation by proposing an intuitive way to present sensor data to users by means of mixed reality and visual aids within the user interface. We propose a method for organizing information presentation and a set of visual aids to facilitate visual communication of data in teleoperation control panels. The resulting sensor-information presentation appears coherent and intuitive, making it easier for an operator to catch and comprehend the meaning of the information. This increases situational awareness and speeds up decision-making. Our method is implemented on a real mobile robotic system operating outdoors, equipped with on-board internal and external sensors, GPS, and a reconstructed 3D graphical model provided by an assistant drone. Experimentation verified feasibility, while intuitive and comprehensive visual communication was confirmed through a qualitative assessment, which encourages further development.
    Peer reviewed

    The augmented reality framework: an approach to the rapid creation of mixed reality environments and testing scenarios

    Debugging errors during real-world testing of remote platforms can be time-consuming and expensive when the remote environment is inaccessible and hazardous, such as the deep sea. Pre-real-world testing facilities, such as Hardware-In-the-Loop (HIL), are often not available due to the time and expense necessary to create them. Testing facilities tend to be monolithic in structure and thus inflexible, making complete redesign necessary for slightly different uses. Redesign is simpler in the short term than creating the required architecture for a generic facility. This leads to expensive facilities, due to reinvention of the wheel, or worse, no testing facilities at all. Without adequate pre-real-world testing, integration errors can go undetected until real-world testing, where they are more costly to diagnose and rectify, especially when developing Unmanned Underwater Vehicles (UUVs). This thesis introduces a novel framework, the Augmented Reality Framework (ARF), for the rapid construction of virtual environments for Augmented Reality tasks such as Pure Simulation, HIL, Hybrid Simulation, and real-world testing. ARF's architecture is based on JavaBeans and is therefore inherently generic, flexible, and extendable. The aim is to increase the performance of constructing, reconfiguring, and extending virtual environments, and consequently to enable more mature and stable systems to be developed in less time, because previously undetectable faults are diagnosed earlier in the pre-real-world testing phase. This is only achievable if test harnesses can be created quickly and easily, which in turn allows the developer to visualise more system feedback, making faults easier to spot. Early fault detection and less wasted real-world testing lead to a more mature, stable, and less expensive system. ARF provides guidance on how to connect and configure user-made components, allowing rapid prototyping and complex virtual environments to be created quickly and easily.
In essence, ARF tries to provide intuitive construction guidance, similar in nature to LEGO® pieces, which can be easily connected to form useful configurations. ARF is demonstrated through case studies which show its flexibility and applicability to testing techniques such as HIL for UUVs. In addition, an informal study was carried out to assess the performance increases attributable to ARF's core concepts. In comparison to classical programming methods, ARF's average performance increase was close to 200%. The study showed that ARF was remarkably intuitive, since the test subjects were novices in ARF but experts in programming. ARF provides key contributions in the field of HIL testing of remote systems by providing more accessible facilities that allow new or modified testing scenarios to be created where it might not have been feasible to do so before. In turn, this leads to early detection of faults which in some cases would never have been detected before.