16 research outputs found

    A natural interface for remote operation of underwater robots

    Nowadays, an increasing need for robotic intervention systems can be observed in all kinds of hazardous environments. In all these intervention systems, the human expert continues to play a central role in decision making. For instance, in underwater domains, when manipulation capabilities are required, only commercially available Remotely Operated Vehicles (ROVs) can be used, normally through master-slave architectures that place all responsibility on the pilot. Thus, the human-machine interface represents a crucial point in current intervention systems. This paper presents a User Interface Abstraction Layer and introduces a new procedure to control an underwater robot vehicle through a new intuitive and immersive interface, which shows the user only the most relevant information about the current mission. We conducted an experiment and found that user preference and performance were highest in the immersive condition with joystick navigation. This research was partly supported by the Spanish Ministry of Research and Innovation, DPI2011-27977-C03 (TRITON Project)

    Virtual reality interface for the guidance of underwater robots

    Final degree project in Video Game Design and Development. Code: VJ1241. Academic year: 2018/2019. The main motivation for this project was my interest in virtual reality: I am intrigued by the range of possibilities it offers and how it may evolve. I also wanted to build an interface that would be useful once finished. Thanks to Professor P. J. Sanz, who was willing to supervise a project of these characteristics; his recommendations and help throughout the development allowed us to orient this project towards HRI in underwater interventions

    A New Virtual Reality Interface for Underwater Intervention Missions

    Paper presented in IFAC-PapersOnLine, Volume 53, Issue 2, 2020, Pages 14600-14607. Nowadays, most underwater intervention missions are carried out by the well-known work-class ROVs (Remotely Operated Vehicles), equipped with teleoperated arms under human supervision. Thus, despite the appearance on the market of the first prototypes of so-called I-AUVs (Autonomous Underwater Vehicles for Intervention), the more mature technology of ROVs continues to be trusted. To bridge the gap between ROVs and incipient I-AUV technology, new research is in progress in our laboratory. In particular, new HRI (Human Robot Interaction) capabilities are being tested within a three-year Spanish coordinated project focused on cooperative underwater intervention missions. This work presents new results concerning a user interface that includes immersion capabilities through Virtual Reality (VR) technology. Notably, the new HRI module has been demonstrated in a pilot study in which users had to solve specific tasks, with minimal guidance and instructions, following a simple Problem Based Learning (PBL) scheme. Finally, although this is only work in progress, the results obtained are promising with respect to the friendliness and intuitiveness of the developed HRI module. Thus, some critical aspects, such as reducing the complexity, training time and cognitive fatigue of the ROV pilot, now seem more tractable

    Design and evaluation of a natural interface for remote operation of underwater robots

    Nowadays, an increasing need for robotic intervention systems can be observed in all kinds of hazardous environments. In all these intervention systems, the human expert continues to play a central role in decision making. For instance, in underwater domains, when manipulation capabilities are required, only commercially available Remotely Operated Vehicles (ROVs) can be used, normally through master-slave architectures that place all responsibility on the pilot. Thus, the human-machine interface represents a crucial point in current intervention systems. This paper presents a User Interface Abstraction Layer and introduces a new procedure to control an underwater robot vehicle through a new intuitive and immersive interface, which shows the user only the most relevant information about the current mission. Finally, experiments were carried out comparing a traditional setup with the new procedure, demonstrating the reliability and feasibility of our approach. This research was partly supported by the Spanish Ministry of Research and Innovation, DPI2011-27977-C03 (TRITON Project)

    Motion simulation of a hexapod robot in virtual reality environments

    The aim of this work is the development of an animation of a hexapod robot in virtual reality environments using the Unity software. The robot has 18 degrees of freedom, actuated by servo motors driven by a microcontroller. The work also describes the process required to transfer the 3D design of the hexapod robot, created with CAD (Computer Aided Design) software, to a virtual reality environment without losing the physical characteristics of the original design. Finally, the results obtained from the motion simulation and the responses of the hexapod robot to disturbances such as floor roughness, uneven ground and gravity are presented, allowing candidate robot designs to be evaluated before being built

    Design of a wearable fingertip haptic device for remote palpation: Characterisation and interface with a virtual environment

    © 2018 Tzemanaki, Al, Melhuish and Dogramadzi. This paper presents the development of a wearable Fingertip Haptic Device (FHD) that can provide cutaneous feedback via a Variable Compliance Platform (VCP). The FHD includes an inertial measurement unit, which tracks the motion of the user's finger, while its haptic functionality relies on two parameters: pressure in the VCP and its linear displacement towards the fingertip. The combination of these two features results in various conditions of the FHD, which emulate the stiffness properties of the remote object or surface. Such a device can be used in teleoperation, including virtual reality applications, where rendering the stiffness of different physical or virtual materials could provide a more realistic haptic perception to the user. The FHD stiffness representation is characterised in terms of the resulting pressure and force applied to the fingertip, created through the relationship of the two functional parameters: pressure and displacement of the VCP. The FHD was tested in a series of user studies to assess its potential to create a user perception of variable object stiffness. The viability of the FHD as a haptic device was further confirmed by interfacing users with a virtual environment. The virtual environment task required users to follow a virtual path, identify objects of different hardness on the path and navigate away from "no-go" zones. The task was performed with and without the variable compliance on the FHD. The results showed improved performance with the variable compliance provided by the FHD in all assessed categories, particularly in the ability to correctly distinguish between objects of different hardness
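The abstract above describes rendering stiffness by combining two VCP parameters, pressure and linear displacement. A minimal sketch of that idea follows; the linear mapping, the function name and the numeric ranges are assumptions for illustration, not the calibration reported in the paper:

```python
def stiffness_to_vcp_params(stiffness, p_range=(0.0, 50.0), d_range=(0.0, 8.0)):
    """Map a normalized target stiffness in [0, 1] to a
    (pressure_kPa, displacement_mm) pair for a variable compliance
    platform: stiffer targets get more pressure and more displacement."""
    s = min(max(stiffness, 0.0), 1.0)   # clamp to the valid range
    pressure = p_range[0] + s * (p_range[1] - p_range[0])
    displacement = d_range[0] + s * (d_range[1] - d_range[0])
    return pressure, displacement
```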

    A motion control method for a differential drive robot based on human walking for immersive telepresence

    Abstract. This thesis introduces an interface for controlling Differential Drive Robots (DDRs) in telepresence applications. Our goal is to enhance the immersive experience while reducing user discomfort when using Head Mounted Displays (HMDs) and body trackers. The robot is equipped with a 360° camera that captures the Robot Environment (RE). Users wear an HMD and use body trackers to navigate within a Local Environment (LE). Through a live video stream from the robot-mounted camera, users perceive the RE within a virtual sphere known as the Virtual Environment (VE). A proportional controller was employed to control the robot, enabling it to replicate the user's movements. The proposed method uses a chest tracker to control the telepresence robot and focuses on minimizing vection and rotations induced by the robot's motion by modifying the VE, for example by rotating and translating it. Experimental results demonstrate the accuracy of the robot in reaching target positions when controlled through the body-tracker interface. They also reveal an optimal VE size that effectively reduces VR sickness and enhances the sense of presence
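The proportional control scheme described above can be sketched in a few lines. This is an illustrative unicycle-model controller with invented gains and wheel base, not the thesis's actual implementation:

```python
import math

def ddr_velocity_command(robot_pose, target_pose, k_lin=0.5, k_ang=1.5):
    """Proportional controller: map the pose error between the robot and
    the tracker-derived target to linear and angular velocity commands."""
    x, y, theta = robot_pose
    tx, ty = target_pose
    dx, dy = tx - x, ty - y
    distance = math.hypot(dx, dy)                 # linear error
    heading_error = math.atan2(dy, dx) - theta    # angular error
    # wrap the angular error into [-pi, pi]
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    v = k_lin * distance        # forward speed proportional to distance
    w = k_ang * heading_error   # turn rate proportional to heading error
    return v, w

def to_wheel_speeds(v, w, wheel_base=0.3):
    """Convert unicycle commands (v, w) to left/right wheel speeds."""
    return v - w * wheel_base / 2, v + w * wheel_base / 2
```

A target straight ahead yields equal wheel speeds; a target off to one side produces a differential that turns the robot toward it.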

    A 360 VR and Wi-Fi Tracking Based Autonomous Telepresence Robot for Virtual Tour

    This study proposes a novel mobile robot teleoperation interface that demonstrates the applicability of a robot-aided remote telepresence system with a virtual reality (VR) device to a virtual tour scenario. To improve realism and provide an intuitive replica of the remote environment for the user interface, the implemented system automatically moves a mobile robot (viewpoint) while displaying a 360-degree live video streamed from the robot to a VR device (Oculus Rift). Upon the user choosing a destination from a given set of options, the robot generates a route based on a shortest path graph and travels along that route using a wireless signal tracking method based on measuring the direction of arrival (DOA) of radio signals. This paper presents an overview of the system and its architecture, and discusses implementation aspects. Experimental results show that the proposed system can move to the destination stably using the signal tracking method, and that, at the same time, the user can remotely control the robot through the VR interface
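The route generation step described above, a shortest path over a graph of tour waypoints, can be sketched with Dijkstra's algorithm. The waypoint names and edge costs below are invented for illustration; the paper's actual graph and costs are not given here:

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm over a weighted adjacency dict:
    graph[node] = [(neighbor, cost), ...]. Returns the node sequence
    from start to goal, or None if the goal is unreachable."""
    frontier = [(0.0, start, [start])]   # (cost so far, node, path)
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(frontier, (cost + edge_cost, neighbor, path + [neighbor]))
    return None

# Hypothetical tour graph: the two-hop route via "hall" (cost 2) beats
# the direct edge to "gallery" (cost 4).
tour = {"lobby": [("hall", 1.0), ("gallery", 4.0)],
        "hall": [("gallery", 1.0)],
        "gallery": []}
```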