
    The Virtual Mitten: A novel interaction paradigm for visuo-haptic manipulation of objects using grip force

    In this paper, we propose a novel visuo-haptic interaction paradigm called the "Virtual Mitten" for simulating the 3D manipulation of objects. Our approach introduces an elastic handheld device that provides passive haptic feedback through the fingers, and a mitten interaction metaphor that enables users to grasp and manipulate objects. The grasping performed by the mitten is directly correlated with the grip force applied to the elastic device, and a supplementary pseudo-haptic feedback modulates the visual feedback of the interaction in order to simulate different haptic perceptions. The Virtual Mitten allows natural interaction and grants users extended freedom of movement compared with rigid devices that have limited workspaces. Our approach was evaluated in two experiments focusing on both subjective appreciation and perception. Our results show that participants were able to clearly perceive different levels of effort during basic manipulation tasks thanks to our pseudo-haptic approach. They could also rapidly learn how to achieve different actions with the Virtual Mitten, such as opening a drawer or pulling a lever. Taken together, our results suggest that our novel interaction paradigm could be used in a wide range of applications involving one- or two-hand haptic manipulation, such as virtual prototyping, virtual training, or video games.
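    The coupling the abstract describes — grip force driving grasp closure, with pseudo-haptic feedback scaling the visual motion to convey effort — can be sketched per frame as follows. The force threshold and the effort-to-ratio mapping are illustrative assumptions, not values from the paper:

```python
def mitten_step(grip_force, hand_delta, effort, grasp_threshold=2.0):
    """One frame of a Virtual-Mitten-style interaction (illustrative sketch).

    grip_force      -- force (N) read from the elastic handheld device
    hand_delta      -- tracked hand displacement this frame
    effort          -- simulated effort of the manipulated object (0 = light)
    grasp_threshold -- assumed grip force needed to close the virtual mitten
    """
    grasping = grip_force >= grasp_threshold
    # Pseudo-haptic feedback: scale the visual motion down as the simulated
    # effort rises, so a "heavy" drawer appears to move less per unit of
    # real hand motion (control/display ratio below 1).
    cd_ratio = 1.0 / (1.0 + effort)
    visual_delta = hand_delta * cd_ratio if grasping else 0.0
    return grasping, visual_delta
```

    A real implementation would run this inside the rendering loop, reading the elastic device each frame; the sketch only shows the mapping itself.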

    3D Multimodal Interaction with Physically-based Virtual Environments

    The virtual has become a huge field of exploration for researchers: it can assist the surgeon, help the prototyping of industrial objects, simulate natural phenomena, act as a fantastic time machine, or entertain users through games and movies. Far beyond mere visual rendering of the virtual environment, Virtual Reality aims at literally immersing the user in the virtual world. VR technologies simulate digital environments with which users can interact and, as a result, perceive through different modalities the effects of their actions in real time. The challenges are huge: the user's motions need to be captured and to have an immediate impact on the virtual world by modifying its objects in real time. In addition, the targeted immersion of the user is not only visual: auditory and haptic feedback need to be taken into account, merging all the user's sensory modalities into a multimodal response. The global objective of my research activities is to improve 3D interaction with complex virtual environments by proposing novel approaches for physically-based and multimodal interaction. I have laid the foundations of my work on designing interactions with complex virtual worlds, reflecting the increasing demands placed on the characteristics of virtual environments. My research can be described along three main research axes inherent to the 3D interaction loop: (1) the physically-based modeling of the virtual world, to take into account the complexity of virtual object behavior, topology modifications, and object interactions; (2) multimodal feedback, for combining the sensory modalities into a global response from the virtual world to the user; and (3) the design of body-based 3D interaction techniques and devices, for establishing the interfaces between the user and the virtual world. All these contributions can be gathered into a general framework spanning the 3D interaction loop.
By improving all the components of this framework, I aim to propose approaches that could be used in future virtual reality applications, but also more generally in other areas such as medical simulation, gesture training, robotics, virtual prototyping for industry, or web content.

    A Systematic Review of Weight Perception in Virtual Reality: Techniques, Challenges, and Road Ahead

    Weight is perceived through the combination of multiple sensory systems, and a wide range of factors – including touch, visual, and force senses – can influence the perception of heaviness. There have been remarkable advancements in the development of haptic interfaces throughout the years. However, a number of challenges limit the progress toward enabling humans to sense weight in virtual reality (VR). This article presents an overview of the factors that influence how weight is perceived and the phenomena that contribute to various types of weight illusions. A systematic review has been undertaken to assess the development of weight perception in VR, the underlying haptic technology that renders the mass of a virtual object, and the creation of weight perception through pseudo-haptics. We summarize the approaches from the perspective of haptic and pseudo-haptic cues that convey the sense of weight, such as force, skin deformation, vibration, inertia, control–display ratio, velocity, body gestures, and audio–visual representation. The design challenges are underlined, and research gaps are discussed, including accuracy and precision, weight discrimination, heavyweight rendering, and absolute weight simulation. This article is anticipated to aid in the development of more realistic weight perception in VR and to stimulate new research interest in this topic.
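    Among the pseudo-haptic cues listed above, the control–display ratio is the most commonly implemented: rendering a lifted object's motion slightly slower than the real hand makes it feel heavier. A minimal sketch of that idea, where the linear mass-to-ratio mapping and its clamping bounds are assumptions for illustration rather than values from the surveyed literature:

```python
def cd_ratio_for_mass(mass_kg, reference_kg=1.0, lo=0.5, hi=1.0):
    """Map a virtual object's mass to a control/display ratio (sketch).

    Objects at or below the reference mass move 1:1 with the hand;
    heavier objects are rendered with proportionally smaller visual
    displacement, clamped so the visual motion never drops below `lo`
    (too low a ratio breaks the illusion and becomes pure lag).
    """
    ratio = reference_kg / max(mass_kg, reference_kg)
    return max(lo, min(hi, ratio))

def displayed_position(hand_pos, ratio):
    # Visual position of the held object given the tracked hand position.
    return hand_pos * ratio
```

    The clamping reflects a design trade-off discussed throughout the pseudo-haptics literature: the ratio must deviate enough from 1:1 to be felt, but not so much that users notice the visual/proprioceptive mismatch.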

    Investigating Precise Control in Spatial Interactions: Proxemics, Kinesthetics, and Analytics

    Augmented and Virtual Reality (AR/VR) technologies have reshaped the way in which we perceive the virtual world. In fact, recent technological advancements provide experiences that make the physical and virtual worlds almost indistinguishable. However, the physical world affords subtle sensorimotor cues which we subconsciously utilize to perform simple and complex tasks in our daily lives. The lack of this affordance in existing AR/VR systems makes it difficult for them to achieve mainstream adoption over conventional 2D user interfaces. As a case in point, existing spatial user interfaces (SUI) lack the intuition to perform tasks in a manner that is perceptually familiar to the physical world. The broader goal of this dissertation lies in facilitating an intuitive spatial manipulation experience, specifically for motor control. We begin by investigating the role of proximity to an action on precise motor control in spatial tasks. We do so by introducing a new SUI called the Clock-Maker's Work-Space (CMWS), with the goal of enabling precise actions close to the body, akin to the physical world. On evaluating our setup against conventional mixed-reality interfaces, we find that the CMWS affords precise actions for bi-manual spatial tasks. We further compare our SUI with a physical manipulation task and observe similarities in user behavior across both tasks. We subsequently narrow our focus to studying precise spatial rotation. We utilize haptics, specifically force feedback (kinesthetics), to augment fine motor control in spatial rotational tasks. By designing three kinesthetic rotation metaphors, we evaluate precise rotational control with and without haptic feedback for 3D shape manipulation. Our results show that haptics-based rotation algorithms allow for precise motor control in 3D space and also help reduce hand fatigue.
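    The abstract does not specify the three kinesthetic rotation metaphors, but a common force-feedback aid for precise rotation is a virtual detent: a restoring torque that pulls the object toward the nearest snap angle. A hypothetical sketch of that general technique (the snap spacing and stiffness are illustrative assumptions, not taken from the dissertation):

```python
import math

def detent_torque(angle_rad, snap_deg=15.0, stiffness=0.8):
    """Restoring torque pulling toward the nearest multiple of snap_deg.

    The returned torque is proportional to the angular error, so the
    force-feedback device gently nudges the user's rotation onto precise
    increments while still allowing free motion between detents.
    """
    snap = math.radians(snap_deg)
    nearest = round(angle_rad / snap) * snap
    return -stiffness * (angle_rad - nearest)
```

    Sent to a kinesthetic device each frame, a torque profile like this yields the "click into place" feel that supports precise rotational control without constraining gross motion.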
In order to understand precise control in its truest form, we investigate orthopedic surgery training by analyzing bone-drilling tasks. We designed a hybrid physical-virtual simulator for bone-drilling training and collected physical data for analyzing the precise drilling action. We also developed a Laplacian-based performance metric to help expert surgeons evaluate residents' training progress across successive years of orthopedic residency.