2 research outputs found

    A Distributed Software Architecture for Collaborative Teleoperation based on a VR Platform and Web Application Interoperability

    Augmented Reality and Virtual Reality can provide a Human Operator (HO) with real help in completing complex tasks, such as robot teleoperation and cooperative teleassistance. Using appropriate augmentations, the HO can interact faster, more safely, and more easily with the remote real world. In this paper, we present an extension of an existing distributed software and network architecture for collaborative teleoperation based on networked human-scaled mixed reality and a mobile platform. The first teleoperation system was composed of a VR application and a Web application. However, the two systems could not be used together, making simultaneous control of a remote robot impossible. Our goal is to update the teleoperation system to enable heterogeneous collaborative teleoperation between the two platforms. An important feature of this interface is its support for different mobile platforms to control one or many robots.
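The abstract's central idea is that heterogeneous clients (a VR platform and a Web application) can share control of one robot if their commands pass through a common architecture. A minimal sketch of that pattern, assuming a shared hub that serializes commands from any client into one queue (all class and field names here are illustrative assumptions, not the paper's actual API):

```python
import json
import queue

class TeleoperationHub:
    """Hypothetical hub: serializes commands from heterogeneous clients
    (e.g. a VR application and a Web application) toward a single robot."""

    def __init__(self):
        # Thread-safe FIFO, so concurrent clients cannot interleave
        # partial commands.
        self._commands = queue.Queue()

    def submit(self, client_id, command):
        # Every client, regardless of platform, speaks the same
        # JSON message format.
        self._commands.put(json.dumps({"client": client_id, "cmd": command}))

    def next_command(self):
        # The robot controller consumes commands in arrival order.
        return json.loads(self._commands.get())

hub = TeleoperationHub()
hub.submit("vr-platform", {"move": [0.1, 0.0, 0.0]})
hub.submit("web-app", {"gripper": "close"})
first = hub.next_command()   # command from "vr-platform"
```

The serialization point is what makes collaboration between the two platforms safe: both clients may act at once, but the robot sees one ordered command stream.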

    Towards multimodal Human-Robot Interaction in large scale virtual environment

    No full text
    Human Operators (HO) of telerobotics systems may be able to achieve complex operations with robots. Designing usable and effective Human-Robot Interaction (HRI) is very challenging for system developers and human factors specialists. The search for new metaphors and techniques for HRI adapted to telerobotics systems has led to the conception of Multimodal HRI (MHRI). MHRI allows users to interact naturally and easily with robots thanks to the combination of multiple devices and an efficient Multimodal Management System (MMS). Such a system should bring a new user experience in terms of natural interaction, usability, efficiency, and flexibility to HRI systems. Good management of multimodality is therefore very important. Moreover, the MMS must be transparent to the user in order to be efficient and natural. Empirical evaluation is necessary to assess the quality of our MMS. We use an Empirical Evaluation Assistant (EEA) designed in the IBISC laboratory. The EEA permits rapid gathering of significant feedback about the usability of interaction during the development lifecycle, whereas HRI is classically evaluated by ergonomics experts only at the end of the development lifecycle. Results from a preliminary evaluation on robot teleoperation tasks, using the ARITI software framework for assisting the user in piloting the robot and the IBISC semi-immersive VR/AR platform EVR@, are given. They compare the use of a Flystick and Data Gloves for 3D interaction with the robot. They show that our MMS is functional, although the multimodality used in our experiments is not sufficient to provide efficient Human-Robot Interaction. The EVR@ SPIDAR force feedback will be integrated into our MMS to improve the user's efficiency.
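The abstract describes an MMS that combines events from several input devices (e.g. a Flystick and Data Gloves) into one interaction stream. A minimal sketch of one plausible fusion policy, assuming the MMS keeps the latest event per modality and resolves conflicts by a fixed priority order (the class, the priority rule, and all names are illustrative assumptions, not the paper's actual design):

```python
import time

class MultimodalManager:
    """Hypothetical Multimodal Management System (MMS) sketch: fuses the
    latest event from each input device into one interaction command."""

    def __init__(self, priority):
        self.priority = priority   # modalities, highest priority first
        self.latest = {}           # modality -> (timestamp, event)

    def receive(self, modality, event):
        # Each device reports events independently; only the most
        # recent event per modality is retained.
        self.latest[modality] = (time.monotonic(), event)

    def fuse(self):
        # Conflict resolution: the highest-priority modality that has
        # produced an event wins.
        for modality in self.priority:
            if modality in self.latest:
                return {"source": modality, "event": self.latest[modality][1]}
        return None

mms = MultimodalManager(priority=["data_glove", "flystick"])
mms.receive("flystick", {"pose": "point"})
mms.receive("data_glove", {"gesture": "grasp"})
cmd = mms.fuse()   # the data glove wins by priority
```

A priority rule is only one of many fusion strategies (recency, confidence weighting, or complementary fusion are alternatives); the abstract's point that the MMS must remain transparent to the user is about hiding exactly this arbitration from the operator.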