50,994 research outputs found

    Multimodal Man-machine Interface and Virtual Reality for Assistive Medical Systems

    The results of research on an intelligent multimodal man-machine interface and virtual reality tools for assistive medical systems, including computers and mechatronic systems (robots), are discussed. Gesture translation for people with disabilities, learning-by-showing technology, and a virtual operating room with 3D visualization are presented in this report; these results were announced at the International exhibition "Intelligent and Adaptive Robots–2005".

    Virtual Borders: Accurate Definition of a Mobile Robot's Workspace Using Augmented Reality

    We address the problem of interactively controlling the workspace of a mobile robot to ensure human-aware navigation. This is especially relevant for non-expert users living in human-robot shared spaces, e.g. home environments, since they want to keep control of their mobile robots, such as vacuum cleaning or companion robots. Therefore, we introduce virtual borders that are respected by a robot while performing its tasks. For this purpose, we employ an RGB-D Google Tango tablet as a human-robot interface, in combination with an augmented reality application, to flexibly define virtual borders. We evaluated our system with 15 non-expert users concerning accuracy, teaching time, and correctness, and compared the results with other baseline methods based on visual markers and a laser pointer. The experimental results show that our method achieves equally high accuracy while significantly reducing teaching time compared to the baseline methods. This holds for different border lengths, shapes, and variations in the teaching process. Finally, we demonstrated the correctness of the approach, i.e. the mobile robot changes its navigational behavior according to the user-defined virtual borders.
    Comment: Accepted at the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); supplementary video: https://youtu.be/oQO8sQ0JBR
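    The paper does not publish its implementation; as a minimal sketch of the underlying idea (all names and the ray-casting test here are illustrative assumptions, not the authors' code), a robot could test each candidate waypoint against a user-defined polygonal virtual border and reject waypoints outside the allowed workspace:

    ```python
    def point_in_polygon(point, polygon):
        """Ray-casting test: True if point lies inside the polygon.

        polygon is a list of (x, y) vertices in order.
        """
        x, y = point
        inside = False
        n = len(polygon)
        for i in range(n):
            x1, y1 = polygon[i]
            x2, y2 = polygon[(i + 1) % n]
            # Does a horizontal ray from (x, y) cross the edge (x1,y1)-(x2,y2)?
            if (y1 > y) != (y2 > y):
                x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                if x < x_cross:
                    inside = not inside
        return inside

    def filter_waypoints(waypoints, border):
        """Keep only the waypoints that the virtual border permits."""
        return [w for w in waypoints if point_in_polygon(w, border)]
    ```

    For a square border with corners (0,0) and (4,4), the waypoint (1,1) would be kept and (5,5) rejected; a planner could run this filter before committing to any navigation goal.
    
    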

    Multiplayer games with EV3 Mindstorm robots

    Our thesis is about creating an online augmented reality (AR) experience in which users control physical Lego EV3 Mindstorm robots to perform virtual tasks. The robots are equipped with front-mounted cameras to allow users to observe and interact with the virtual environment. Image processing is used to estimate the position and orientation of the robots, and the estimates are then used with a virtual environment to simulate an AR experience.
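    The abstract does not detail the image-processing step. One common approach, assumed here purely for illustration, is to track two markers on each robot (front and rear) and derive its position and heading from their centroids:

    ```python
    import math

    def robot_pose(front, rear):
        """Estimate a robot's pose from two tracked marker centroids.

        front, rear: (x, y) image coordinates of the front and rear markers.
        Returns (cx, cy, heading): the robot centre and its heading in
        radians, measured from the +x axis.
        """
        fx, fy = front
        rx, ry = rear
        cx, cy = (fx + rx) / 2.0, (fy + ry) / 2.0  # midpoint = robot centre
        heading = math.atan2(fy - ry, fx - rx)     # rear-to-front direction
        return cx, cy, heading
    ```

    With the front marker at (2, 0) and the rear at (0, 0), this yields centre (1, 0) and heading 0; the same poses could then drive the virtual robots in the AR scene.
    
    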

    Bio-inspired distributed sensors for the autonomous search of gas leak sources

    This work presents multiple small robots operating in an unhealthy industrial environment, responsible for detecting gases harmful to humans and avoiding possible harmful effects on the body. Mixed reality is used throughout: the environment and the gases are virtual, while the small robots are real. Essential components for the experiments, such as the gases and the BioCyber-Sensors, are virtual. The results establish great potential for applications in several areas, such as industry, biomedicine, and services. The entire system was developed on ROS (Robot Operating System), which makes it easy to diversify into different applications and multi-agent approaches. The main objective of the small robots is to guarantee a healthy work environment.
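    The search strategy itself is not spelled out in the abstract. A minimal bio-inspired sketch (an assumption for illustration, not the authors' controller) is a chemotaxis step in which a robot samples the virtual gas field around itself and moves toward the highest concentration:

    ```python
    def chemotaxis_step(pos, concentration, step=1.0):
        """Move one step toward the neighbouring cell with the highest
        virtual gas concentration (simple greedy gradient ascent).

        pos: (x, y) current position.
        concentration: function (x, y) -> float, the virtual gas field.
        """
        x, y = pos
        candidates = [(x + step, y), (x - step, y),
                      (x, y + step), (x, y - step), (x, y)]
        return max(candidates, key=lambda p: concentration(*p))

    # Example virtual field: a leak at the origin, decaying with distance.
    def leak_field(x, y):
        return 1.0 / (1.0 + x * x + y * y)
    ```

    Starting from (3, 0), repeated steps walk the robot to the leak at the origin and then hold position there; real sensor noise and multi-robot coordination would of course require more than this greedy rule.
    
    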

    Virtual reality interface for the guidance of underwater robots

    Final degree project in Video Game Design and Development. Course code: VJ1241. Academic year: 2018/2019. The main motivation for this project was my interest in virtual reality: I am intrigued by the range of possibilities it can offer and by how it can evolve. I also wanted to build an interface that would be useful once finished. Thanks to Professor P. J. Sanz, who was willing to supervise a project of these characteristics; with his recommendations and help throughout the development, we were able to orient this project toward HRI in underwater interventions.

    Goal Based Human Swarm Interaction for Collaborative Transport

    Human-swarm interaction is an important milestone for the introduction of swarm-intelligence-based solutions into real application scenarios. One of the main hurdles towards this goal is the creation of suitable interfaces for humans to convey the correct intent to multiple robots. As the size of the swarm increases, the complexity of dealing with explicit commands for individual robots becomes intractable, which makes it very challenging for a developer or operator to drive the robots through even the most basic tasks. In our work, we consider a different approach in which humans specify only the desired goal rather than issuing the individual commands necessary to accomplish the task. We explore this approach in a collaborative transport scenario, where the user chooses the target position of an object and a group of robots moves it, adapting to the environment. The main outcome of this thesis is the design and integration of a collaborative transport behavior for swarm robots with an augmented reality human interface. We implemented an augmented reality (AR) application in which a virtual object is displayed overlaid on a detected target object. Users can manipulate the virtual object to generate the goal configuration for the object. The designed centralized controller translates the goal position to the robots and synchronizes the state transitions. The whole system is tested on Khepera IV robots through the integration of a Vicon system and the ARGoS simulator.
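    The thesis does not reproduce its controller in the abstract. As a hedged sketch of what "translating the goal position to the robots" could look like (the function and the fixed grasp-offset scheme are assumptions, not the thesis code), a centralized controller might map the object's goal pose to per-robot target positions by rotating each robot's grasp offset into the world frame:

    ```python
    import math

    def robot_targets(goal_pos, goal_theta, offsets):
        """Translate an object goal pose into per-robot target positions.

        goal_pos: (x, y) desired object centre.
        goal_theta: desired object orientation in radians.
        offsets: per-robot grasp offsets (x, y) in the object's own frame.
        """
        gx, gy = goal_pos
        c, s = math.cos(goal_theta), math.sin(goal_theta)
        targets = []
        for ox, oy in offsets:
            # Rotate the offset into the world frame, then add the goal position.
            targets.append((gx + c * ox - s * oy, gy + s * ox + c * oy))
        return targets
    ```

    For a goal at (1, 1) rotated by 90 degrees, two robots gripping the object at (+1, 0) and (-1, 0) would be sent to roughly (1, 2) and (1, 0); each robot then only needs a local position controller to reach its target.
    
    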

    Mission planning and remote operated vehicle simulation in a virtual reality interface

    Virtual reality simulations are finding applications in a wide range of disciplines such as surgical simulation, electronics training, and crime scene investigation. During the Mars Pathfinder Mission, in summer 1997, NASA scientists unveiled a new application of virtual reality for the visualization of a planetary surface. The success of this application led to a more concentrated effort to use virtual reality visualization tools during future missions. The thrust of this effort was to develop a new interface that would allow scientists to interactively plan experiments to be performed by the mission robots. This thesis covers two of the primary aspects of implementing this system. The first was to develop a kinematic model of one of NASA's rovers for use in a virtual reality simulation. The second is the implementation of the tools required for the mission planning module, namely the interfaces that the scientists use to plan the experiments for the rover.
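    The thesis's actual rover model is not given in the abstract; NASA's rovers use a six-wheel rocker-bogie suspension, so the following is only a simplified differential-drive stand-in to illustrate what a kinematic update inside such a simulation looks like:

    ```python
    import math

    def diff_drive_step(x, y, theta, v_left, v_right, wheel_base, dt):
        """One Euler step of a differential-drive kinematic model.

        v_left, v_right: left/right wheel speeds.
        wheel_base: distance between the wheels.
        Returns the updated pose (x, y, theta).
        """
        v = (v_left + v_right) / 2.0              # forward speed
        omega = (v_right - v_left) / wheel_base   # turn rate
        return (x + v * math.cos(theta) * dt,
                y + v * math.sin(theta) * dt,
                theta + omega * dt)
    ```

    Equal wheel speeds drive the simulated rover straight ahead, while opposite speeds spin it in place; a mission-planning module could integrate steps like this to preview a planned traverse before commands are sent.
    
    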