
    A Novel Framework for Mixed Reality–Based Control of Collaborative Robot: Development Study

    Background: Robotics applications are becoming essential in daily life, creating new possibilities in many fields, especially in collaborative environments. Collaborative robots hold tremendous potential because they can share a workspace with humans, so a framework built on state-of-the-art technology for collaborative robots is worthwhile for further research. Objective: This study presents the development of a novel framework for controlling a collaborative robot through mixed reality. Methods: The framework uses Unity and Unity Hub as the cross-platform game engine and project-management tool to design the mixed reality interface and the digital twin, the Windows Mixed Reality platform to render digital material on a holographic display, and the Azure mixed reality services to capture and expose digital information. A holographic device (HoloLens 2) executes the mixed reality–based collaborative system. Results: A thorough experiment validated the framework, which was successfully applied to implement a collaborative system with a 5–degree-of-freedom robot (xArm-5) in a mixed reality environment. The framework was stable and ran smoothly throughout the collaborative session; despite the distributed nature of the cloud services, the latency between issuing a command and its execution on the physical robot was negligible. Conclusions: Collaborative robots offer vital opportunities in telerehabilitation and teleoperation, as in other fields. The proposed framework was applied successfully in a collaborative session and can be extended to other similar applications for robust and more promising performance.
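The command path described above — an MR interface issuing commands that a cloud service relays to the physical robot — can be illustrated with a minimal latency-aware relay. This is a hedged sketch only: `CommandRelay`, `submit`, and `flush` are hypothetical names, not the paper's actual Unity/Azure pipeline.

```python
import time
from collections import deque
from typing import Callable

class CommandRelay:
    """Queues commands from an MR client and forwards them to a robot
    backend, recording per-command dispatch latency (illustrative only)."""

    def __init__(self, dispatch: Callable[[str], None]):
        self._dispatch = dispatch
        self._queue: deque = deque()
        self.latencies: list[float] = []

    def submit(self, command: str) -> None:
        # Timestamp at the moment the MR interface issues the command.
        self._queue.append((time.perf_counter(), command))

    def flush(self) -> None:
        # Forward queued commands to the robot backend; record latency.
        while self._queue:
            t0, command = self._queue.popleft()
            self._dispatch(command)
            self.latencies.append(time.perf_counter() - t0)

executed = []
relay = CommandRelay(executed.append)
relay.submit("move_joint 1 30deg")
relay.submit("open_gripper")
relay.flush()
print(executed)  # ['move_joint 1 30deg', 'open_gripper']
```

In a real deployment the `dispatch` callback would be replaced by a network call to the robot controller; the recorded latencies are what the paper's "negligible latency" claim would be measured against.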

    Debugging Quadrocopter Trajectories in Mixed Reality

    Debugging and monitoring robotic applications is an intricate and error-prone task. To this end, we propose a mixed-reality approach to facilitate this process in a concrete scenario. We connected the Microsoft HoloLens smart glasses to the Robot Operating System (ROS), which is used to control robots and visualize arbitrary flight data of a quadrocopter. We display holograms correctly in the real world by converting the internal tracking coordinates into coordinates provided by a motion-capturing system, and we describe the process of synchronizing the internal tracking with the motion capturing. Altogether, the combination of the HoloLens and the external tracking system shows promising preliminary results. Moreover, our approach can be extended to manipulate source code directly through its mixed-reality visualization, and it offers new interaction methods for debugging and developing robotic applications.
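The coordinate conversion mentioned above — mapping HoloLens internal tracking coordinates into the motion-capture frame — is typically solved by estimating a rigid transform from corresponding points. The sketch below uses the standard Kabsch/SVD method as an assumption; the paper's exact synchronization procedure may differ.

```python
import numpy as np

def align_frames(p_holo: np.ndarray, p_mocap: np.ndarray):
    """Estimate the rigid transform (R, t) with p_mocap ≈ R @ p_holo + t
    from N >= 3 corresponding 3D points (Kabsch/SVD method)."""
    c_h = p_holo.mean(axis=0)
    c_m = p_mocap.mean(axis=0)
    H = (p_holo - c_h).T @ (p_mocap - c_m)      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_m - R @ c_h
    return R, t

# Synthetic check: recover a known 90-degree yaw plus an offset.
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
t_true = np.array([0.5, -0.2, 1.0])
pts = np.random.default_rng(0).random((5, 3))
R, t = align_frames(pts, pts @ R_true.T + t_true)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

Once (R, t) is known, every hologram pose expressed in the HoloLens frame can be re-expressed in the mocap frame, which is what anchors the holograms correctly in the real world.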

    Mixed reality enhanced user interactive path planning for omnidirectional mobile robot

    This paper proposes a novel control system for the path planning of an omnidirectional mobile robot based on mixed reality. Most research on mobile robots is carried out in a completely real or a completely virtual environment; however, a real environment containing virtual objects has important practical applications. The proposed system can control the movement of the mobile robot in the real environment, as well as the interaction between the robot’s motion and virtual objects added to that environment. First, an interactive interface is presented on the mixed reality device HoloLens. The interface can display the map, path, control commands, and other information related to the mobile robot, and it can add virtual objects to the real map to enable real-time interaction between the robot and those objects. Then, the original path-planning algorithm, vector field histogram* (VFH*), is modified in its threshold, candidate-direction selection, and cost function to make it more suitable for scenes with virtual objects, reduce the number of calculations required, and improve safety. Experimental results demonstrated that the proposed method can generate the robot’s motion path according to the specific requirements of the operator and achieves good obstacle-avoidance performance.
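The three modified ingredients named above (threshold, candidate-direction selection, cost function) can be seen in a bare-bones VFH-style steering step. This is an illustrative sketch with assumed names and a fixed threshold and weights; the paper's modified VFH* uses an adaptive threshold and a richer cost function.

```python
import math

def vfh_choose_direction(obstacles, goal_dir, cur_dir,
                         sectors=36, threshold=1.0,
                         w_goal=2.0, w_turn=1.0):
    """Build a polar obstacle-density histogram, mask sectors above the
    threshold, and pick the free candidate direction minimizing a
    weighted cost toward the goal (minimal VFH-style sketch)."""
    width = 2 * math.pi / sectors
    hist = [0.0] * sectors
    for ang, dist in obstacles:               # density grows as 1/distance
        hist[int((ang % (2 * math.pi)) // width)] += 1.0 / max(dist, 1e-6)

    def ang_diff(a, b):                       # smallest angle between a, b
        return abs((a - b + math.pi) % (2 * math.pi) - math.pi)

    best, best_cost = None, math.inf
    for k in range(sectors):
        if hist[k] >= threshold:              # sector blocked
            continue
        cand = (k + 0.5) * width
        cost = w_goal * ang_diff(cand, goal_dir) + w_turn * ang_diff(cand, cur_dir)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best

# A wall of close obstacles straight ahead forces a detour.
wall = [(a, 0.5) for a in (-0.2, -0.1, 0.0, 0.1, 0.2)]
heading = vfh_choose_direction(wall, goal_dir=0.0, cur_dir=0.0)
print(heading is not None and abs(heading) > 0.2)  # True: steers around the wall
```

Virtual objects from the MR interface slot into this scheme naturally: they are simply appended to `obstacles` alongside real sensor readings before the histogram is built.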

    Programming Robots by Demonstration using Augmented Reality

    The world is living through the fourth industrial revolution, Industry 4.0, marked by the increasing intelligence and automation of manufacturing systems. Nevertheless, some tasks are too complex or too expensive to be fully automated; it would be more efficient if the machine could work with the human, not only sharing the same workspace but also acting as a useful collaborator. A possible solution to this problem lies in human–robot interaction systems: understanding the applications where they can be helpful and the challenges they face. In this context, better interaction between machines and operators can bring multiple benefits, such as less, better, and easier training, a safer environment for the operator, and the ability to solve problems more quickly. This dissertation is relevant because it is necessary to learn and implement the technologies that most contribute to simpler and more efficient work in industry. It proposes an industrial prototype of a human–machine interaction system based on Extended Reality (XR), whose objective is to enable an industrial operator without any programming experience to program a collaborative robot using the Microsoft HoloLens 2. The system is divided into two parts: the tracking system, which records the operator's hand movements, and the translator of the programming-by-demonstration system, which builds the program sent to the robot to execute the task. The monitoring and supervision system runs on the Microsoft HoloLens 2, programmed with the Unity platform and Visual Studio, while the core of the programming-by-demonstration system was developed in the Robot Operating System (ROS). The robots included in this interface are the Universal Robots UR5 (collaborative robot) and the ABB IRB 2600 (industrial robot); moreover, the interface was built so that other robots can easily be added.
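The first half of the pipeline above — turning a stream of tracked hand poses into a sparse robot program — can be sketched as a simple distance-based downsampling step. The function name and the 4.5 cm threshold are assumptions for illustration, not the dissertation's actual pipeline.

```python
import math

def demo_to_waypoints(hand_positions, min_step=0.045):
    """Reduce a recorded hand trajectory (3D points, metres) to sparse
    waypoints: keep a sample only once the hand has moved at least
    `min_step` from the last kept waypoint (illustrative sketch)."""
    if not hand_positions:
        return []
    waypoints = [hand_positions[0]]
    for p in hand_positions[1:]:
        if math.dist(p, waypoints[-1]) >= min_step:
            waypoints.append(p)
    return waypoints

# A slow drift along x: 11 samples spaced 1 cm apart.
traj = [(i * 0.01, 0.0, 0.0) for i in range(11)]
wps = demo_to_waypoints(traj)
print(len(traj), "->", len(wps))  # 11 -> 3
```

The resulting waypoint list is the kind of compact representation the translator component could then convert into motion commands for either the UR5 or the IRB 2600.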

    Recent Advancements in Augmented Reality for Robotic Applications: A Survey

    Robots are expanding from industrial applications into daily life, in areas such as medical robotics, rehabilitative robotics, social robotics, and mobile/aerial robotic systems. In recent years, augmented reality (AR) has been integrated into many robotic applications, including medical, industrial, human–robot interaction, and collaboration scenarios. In this work, AR for both medical and industrial robot applications is reviewed and summarized. For medical robot applications, we investigated the integration of AR in (1) preoperative and surgical task planning; (2) image-guided robotic surgery; (3) surgical training and simulation; and (4) telesurgery. AR for industrial scenarios is reviewed in (1) human–robot interactions and collaborations; (2) path planning and task allocation; (3) training and simulation; and (4) teleoperation control/assistance. In addition, the limitations and challenges are discussed. Overall, this article serves as a valuable resource for those working in AR and robotic research, offering insights into the recent state of the art and prospects for improvement.