    Bi-manual haptic interaction in virtual worlds

    In the field of Virtual Reality, force-feedback interfaces called haptic interfaces can simulate tactile and kinesthetic interactions. Bi-manual haptic interaction can immerse users in virtual worlds better than one-handed interaction, and it enables additional tasks such as parallel or precision tasks. Only a few studies deal specifically with bi-manual haptic interaction, and previous work mainly extends uni-manual techniques directly to two hands. The document reports a lack of bi-manual-specific management of the real and virtual workspaces, as well as a lack of generality in existing solutions using haptic interfaces. This study of bi-manual haptic interaction led to a framework that allows several haptic devices to be used simultaneously. The framework simulates a 3D virtual world coupled with a physical simulation. The authors developed new, specifically bi-manual haptic interaction techniques for controlling the camera, extending the virtual workspace through hybrid position/rate control, and assisting bi-manual pick-and-place tasks. The document also points out open issues such as collisions between haptic devices and the unification of two different haptic interfaces.
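The hybrid position/rate control mentioned above is a standard technique for extending a limited physical workspace: within a central zone of the device's range the mapping is direct (position control), and beyond that zone the overshoot drives a workspace offset (rate control). A minimal one-dimensional sketch follows; all names, gains, and thresholds are illustrative assumptions, not the authors' implementation.

```python
POSITION_ZONE = 0.8   # fraction of the device range used for direct position control
RATE_GAIN = 2.0       # velocity gain: virtual units/s per unit of overshoot

def hybrid_control(device_pos, offset, dt):
    """Map a normalized 1-D device position (-1..1) to a virtual position.

    Inside [-POSITION_ZONE, POSITION_ZONE] the mapping is direct
    (position control); beyond it, the overshoot past the boundary
    drives the workspace offset (rate control), letting the user
    reach arbitrarily far. Returns (virtual_pos, new_offset).
    """
    if abs(device_pos) <= POSITION_ZONE:
        return offset + device_pos, offset
    # Rate control: distance past the zone boundary becomes a velocity.
    boundary = POSITION_ZONE if device_pos > 0 else -POSITION_ZONE
    overshoot = device_pos - boundary
    offset += RATE_GAIN * overshoot * dt
    return offset + device_pos, offset
```

In a typical update loop, `hybrid_control` is called once per frame with the frame time `dt`; while the user holds the device past the zone boundary, the offset keeps drifting, scrolling the virtual workspace.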

    Virtual Reality-Based Interface for Advanced Assisted Mobile Robot Teleoperation

    This work proposes a new virtual reality-based interface for the teleoperation of mobile robots that allows natural, intuitive interaction and cooperation between the human and the robot, which is useful in many situations, such as inspection tasks and the mapping of complex environments. Contrary to previous works, the proposed interface does not seek realism in the virtual environment but provides the minimum elements necessary for the user to carry out the teleoperation task in a more natural and intuitive way. Teleoperation is carried out so that the human user and the mobile robot cooperate synergistically to accomplish the task: the user guides the robot through the environment, benefiting from human intelligence and adaptability, while the robot automatically avoids collisions with objects in the environment, benefiting from its fast response. The latter is achieved using the well-known potential field-based navigation method. The efficacy of the proposed method is demonstrated through experimentation with the Turtlebot3 Burger mobile robot in both simulation and real-world scenarios. In addition, usability and presence questionnaires were conducted with users of different ages and backgrounds to demonstrate the benefits of the proposed approach. In particular, the results of these questionnaires show that the proposed virtual reality-based interface is intuitive, ergonomic, and easy to use. This research was funded by the Spanish Government (Grant PID2020-117421RB-C21 funded by MCIN/AEI/10.13039/501100011033) and by the Generalitat Valenciana (Grant GV/2021/181). Solanes, JE.; Muñoz García, A.; Gracia Calandin, LI.; Tornero Montserrat, J. (2022). Virtual Reality-Based Interface for Advanced Assisted Mobile Robot Teleoperation. Applied Sciences. 12(12):1-22. https://doi.org/10.3390/app12126071
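The potential field method cited in the abstract combines an attractive force pulling the robot toward the operator's commanded goal with repulsive forces pushing it away from nearby obstacles. A minimal 2-D sketch of the classic formulation is shown below; the gains and influence radius are assumed values for illustration, not those used in the cited paper.

```python
import math

K_ATT = 1.0        # attractive gain toward the goal
K_REP = 0.5        # repulsive gain away from obstacles
D_INFLUENCE = 1.0  # obstacles farther than this exert no force

def potential_field_force(robot, goal, obstacles):
    """Return the resultant 2-D force (fx, fy) on the robot.

    The attractive component is proportional to the goal error; each
    obstacle within D_INFLUENCE adds a repulsive component that grows
    as the robot approaches it.
    """
    fx = K_ATT * (goal[0] - robot[0])
    fy = K_ATT * (goal[1] - robot[1])
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < D_INFLUENCE:
            # Repulsion magnitude grows sharply near the obstacle.
            mag = K_REP * (1.0 / d - 1.0 / D_INFLUENCE) / d ** 2
            fx += mag * dx / d
            fy += mag * dy / d
    return fx, fy
```

In a teleoperation loop of this kind, the goal would come from the user's VR input each cycle, and the resulting force would be converted into velocity commands for the robot, so the operator steers while the repulsive terms keep the platform clear of obstacles.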

    The Office of the Future: Virtual, Portable, and Global.

    Virtual reality has the potential to change the way we work. We envision the future office worker being able to work productively anywhere, using only portable standard input devices and immersive head-mounted displays. Virtual reality can enable this by allowing users to create working environments of their choice and by relieving them of physical-world limitations, such as constrained space or noisy surroundings. In this paper, we investigate opportunities and challenges for realizing this vision and discuss implications from recent findings on text entry in virtual reality as a core office task.