405 research outputs found

    3D Multimodal Interaction with Physically-based Virtual Environments

    Get PDF
    The virtual has become a vast field of exploration for researchers: it can assist the surgeon, support the prototyping of industrial objects, simulate natural phenomena, serve as a fantastic time machine, or entertain users through games and movies. Far beyond the visual rendering of the virtual environment alone, Virtual Reality aims at literally immersing the user in the virtual world. VR technologies simulate digital environments with which users can interact and, as a result, perceive the effects of their actions in real time through different modalities. The challenges are huge: the user's motions need to be captured and to have an immediate impact on the virtual world by modifying its objects in real time. In addition, the targeted immersion of the user is not only visual: auditory and haptic feedback must also be taken into account, merging all the user's sensory modalities into a multimodal response. The global objective of my research activities is to improve 3D interaction with complex virtual environments by proposing novel approaches for physically-based and multimodal interaction. I have laid the foundations of my work on the design of interactions with complex virtual worlds, that is, virtual environments with increasingly demanding characteristics. My research can be described along three main research axes inherent to the 3D interaction loop: (1) the physically-based modeling of the virtual world, to take into account the complexity of virtual object behavior, topology modifications, and interactions between objects; (2) multimodal feedback, to combine the sensory modalities into a global response from the virtual world to the user; and (3) the design of body-based 3D interaction techniques and devices, involving motions of the head, both hands, the fingers, the legs, or even the whole body, to establish the interfaces between the user and the virtual world. All these contributions can be gathered in a general framework covering the 3D interaction loop. By improving all the components of this framework, I aim to propose approaches that could be used in future virtual reality applications, but also more generally in other areas such as medical simulation, gesture training, robotics, virtual prototyping for industry, or web content.

    Design and development of a VR system for exploration of medical data using haptic rendering and high quality visualization

    Get PDF
    [no abstract]

    Bi-manual haptic interaction in virtual worlds

    No full text
    In the Virtual Reality field, force-feedback interfaces called haptic interfaces can simulate tactile and kinesthetic interactions. Bi-manual haptic interaction can immerse users in virtual worlds better than one-handed interaction, and more tasks can be carried out, such as parallel or precision tasks. Only a few studies deal specifically with bi-manual haptic interaction, and previous work mainly extends uni-manual techniques directly to two hands. The document reports the lack of bi-manual-specific management of the real and virtual workspaces and the lack of genericity of solutions using haptic interfaces. The study of bi-manual haptic interaction led to a framework that allows several haptic devices to be used simultaneously. This framework simulates a 3D virtual world coupled with a physical simulation. We designed new, specifically bi-manual haptic interaction techniques that allow controlling the camera, extending the virtual workspace through hybrid position/rate control, and assisting bi-manual pick-and-place tasks. The document also points out issues such as collisions between haptic devices and the unification of two different haptic interfaces
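    As a rough illustration of the workspace-extension idea mentioned above, the sketch below shows one common form of hybrid position/rate control (a "bubble"-style scheme): the proxy follows the device one-to-one inside an inner zone and drifts at a velocity proportional to the overshoot outside it. The function, its parameters, and the bubble geometry are illustrative assumptions, not the thesis implementation.

    import numpy as np

    def hybrid_position_rate(device_pos, workspace_offset, dt,
                             bubble_radius=0.08, rate_gain=2.0):
        """One control step: returns (proxy_pos, updated workspace_offset).

        device_pos       -- device position in its physical workspace, centered at 0 (m)
        workspace_offset -- accumulated drift of the virtual workspace (m)
        """
        device_pos = np.asarray(device_pos, dtype=float)
        workspace_offset = np.asarray(workspace_offset, dtype=float)
        dist = np.linalg.norm(device_pos)
        if dist > bubble_radius:
            # Rate control: outside the bubble the virtual workspace drifts in the
            # pushing direction, proportionally to how far the bubble is exceeded.
            direction = device_pos / dist
            workspace_offset = workspace_offset + direction * rate_gain * (dist - bubble_radius) * dt
        # Position control: the proxy follows the device 1:1 on top of the drift.
        return workspace_offset + device_pos, workspace_offset

    With two devices, each hand would run its own instance of this loop, which is where the bi-manual workspace-management issues discussed in the document arise.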

    Animation, Simulation, and Control of Soft Characters using Layered Representations and Simplified Physics-based Methods

    Get PDF
    Realistic behavior of computer-generated characters is key to bringing virtual environments, computer games, and other interactive applications to life. The plausibility of a virtual scene is strongly influenced by the way objects move around and interact with each other. Traditionally, actions are limited to motion-capture-driven or pre-scripted motion of the characters. Physics enhances the sense of realism: physical simulation is required to make objects act as expected in real life. To make gaming and virtual environments truly immersive, it is crucial to simulate the response of characters to collisions and to produce secondary effects such as skin wrinkling and muscle bulging. Unfortunately, existing techniques cannot generally achieve these effects in real time, do not address the coupled response of a character's skeleton and skin to collisions, nor do they support artistic control. In this dissertation, I present interactive algorithms that enable physical simulation of deformable characters with high surface detail and support for intuitive deformation control. I propose a novel unified framework for real-time modeling of soft objects with skeletal deformations and surface deformation due to contact, and their interplay, for object surfaces with up to tens of thousands of degrees of freedom. I make use of layered models to reduce computational complexity. I introduce dynamic deformation textures, which map three-dimensional deformations in the deformable skin layer to a two-dimensional domain for extremely efficient parallel computation of the dynamic elasticity equations and optimized hierarchical collision detection. I also enhance layered models with responsive contact handling, to support the interplay between skeletal motion and surface contact and the resulting two-way coupling effects. Finally, I present dynamic morph targets, which enable intuitive control of dynamic skin deformations at run time by simply sculpting pose-specific surface shapes. The resulting framework enables real-time and directable simulation of soft articulated characters with frictional contact response, capturing the interplay between skeletal dynamics and complex, non-linear skin deformations
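    To make the idea of a dynamic deformation texture more concrete, here is a deliberately simplified sketch: per-texel displacements of the skin layer are stored in a 2D array so the elasticity update becomes a regular, data-parallel stencil operation. The spring-sheet dynamics, parameter names, and explicit integrator below are assumptions for illustration only and are far simpler than the dissertation's elasticity model.

    import numpy as np

    def step_deformation_texture(u, v, k=50.0, damping=2.0, mass=1.0,
                                 dt=1e-3, contact_force=None):
        """Advance a (H, W, 3) displacement texture u and velocity texture v.

        u holds the skin-layer displacement relative to the underlying core
        (skeletal) layer; contact_force, if given, is an (H, W, 3) texture of
        contact responses mapped into the same 2D parameterization.
        """
        # Discrete Laplacian over the texture domain couples neighboring texels
        # (a stand-in for the layer's internal elasticity); np.roll wraps at the
        # borders, i.e. a periodic parameterization is assumed here.
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        # -k*u pulls the skin back toward its rest offset from the core layer.
        force = k * lap - k * u - damping * v
        if contact_force is not None:
            force = force + contact_force
        v = v + dt * force / mass          # explicit (symplectic Euler) update
        u = u + dt * v
        return u, v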

    6D Frictional Contact for Rigid Bodies

    Get PDF
    We present a new approach to modeling contact between rigid objects that augments an individual Coulomb friction point-contact model with rolling and spinning friction constraints. Starting from the intersection volume, we compute a contact normal from the volume gradient. We compute a contact position from the first moment of the intersection volume, and approximate the extent of the contact patch from the second moment of the intersection volume. By incorporating knowledge of the contact patch into a point-contact Coulomb friction formulation, we produce a 6D constraint that provides appropriate limits on torques to accommodate displacement of the center of pressure within the contact patch, while also providing a rotational torque due to dry friction to resist spinning. A collection of examples demonstrates the power and benefits of this simple formulation
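    The moment computations described above can be sketched as follows, assuming the intersection volume has been voxelized into a set of sample points. The finite-difference volume gradient through the hypothetical overlap_volume callback, and all names and tolerances, are illustrative assumptions rather than the authors' formulation.

    import numpy as np

    def contact_from_intersection(pts, voxel_vol, overlap_volume=None, eps=1e-4):
        """Estimate contact quantities from voxel centers inside the intersection.

        pts            -- (N, 3) array of voxel centers inside the intersection volume
        voxel_vol      -- volume of one voxel
        overlap_volume -- optional callback: overlap_volume(translation) -> volume,
                          used to approximate the volume gradient by finite differences
        """
        pts = np.asarray(pts, dtype=float)
        volume = len(pts) * voxel_vol                 # zeroth moment: intersection volume
        center = pts.mean(axis=0)                     # first moment / volume: contact position
        # Second moment about the center: its principal values approximate the
        # spatial extent of the contact patch along its principal axes.
        cov = np.cov(pts.T) if len(pts) > 1 else np.zeros((3, 3))
        extents = np.sqrt(np.maximum(np.linalg.eigvalsh(cov), 0.0))
        normal = None
        if overlap_volume is not None:
            # Contact normal from the volume gradient: central differences of the
            # overlap volume as one body is translated along each axis.
            grad = np.array([(overlap_volume(d * eps) - overlap_volume(-d * eps)) / (2 * eps)
                             for d in np.eye(3)])
            normal = -grad / (np.linalg.norm(grad) + 1e-12)
        return volume, center, extents, normal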

    Research on real-time physics-based deformation for haptic-enabled medical simulation

    Full text link
    This study developed an effective visuo-haptic surgical engine that handles a variety of surgical manipulations in real time. Soft-tissue models are based on biomechanical experiments and continuum mechanics for greater accuracy. Such models will increase the realism of future training systems and of VR/AR/MR implementations for the operating room

    Haptic Media Scenes

    Get PDF
    The aim of this thesis is to apply new media phenomenology and enactive embodied cognition approaches to explain the role of haptic sensitivity and communication in personal computer environments for productivity. Prior theory has given little attention to the role of the haptic senses in influencing cognitive processes and does not frame the richness of haptic communication in interaction design, as haptic interactivity in HCI has historically tended to be designed and analyzed from a perspective on communication as transmission, sending and receiving haptic signals. The haptic sense may mediate not only contact confirmation and affirmation but also rich semiotic and affective messages; yet there is a strong contrast between this inherent ability of haptic perception and current-day support for such haptic communication interfaces. I therefore ask: How do the haptic senses (touch and proprioception) impact our cognitive faculties when mediated through digital and sensor technologies? How may these insights be employed in interface design to facilitate rich haptic communication? To answer these questions, I use theoretical close readings that embrace two research fields, new media phenomenology and enactive embodied cognition. The theoretical discussion is supported by neuroscientific evidence and tested empirically through case studies centered on digital art. I use these insights to develop the concept of the haptic figura, an analytical tool to frame the communicative qualities of haptic media. The concept gauges rich machine-mediated haptic interactivity and communication in systems with a material solution supporting active haptic perception and the mediation of semiotic and affective messages that are understood and felt. As such, the concept may function as a design tool for developers, but also for media critics evaluating haptic media. The tool is used to frame a discussion of the opportunities and shortcomings of haptic interfaces for productivity, differentiating between media systems for the hand and for the full body. The significance of this investigation lies in demonstrating that haptic communication is an underutilized element in personal computer environments for productivity, and in providing an analytical framework for a more nuanced understanding of haptic communication as enabling the mediation of a range of semiotic and affective messages, beyond notification and confirmation interactivity

    Virtual Reality Games for Motor Rehabilitation

    Get PDF
    This paper presents a fuzzy-logic-based method to track user satisfaction without the need for devices that monitor users' physiological conditions. User satisfaction is key to any product's acceptance; computer applications and video games provide a unique opportunity to tailor the environment to each user to better suit their needs. We have implemented a non-adaptive fuzzy logic model of emotion, based on the emotional component of the Fuzzy Logic Adaptive Model of Emotion (FLAME) proposed by El-Nasr, to estimate player emotion in Unreal Tournament 2004. In this paper we describe the implementation of this system and present the results of one of several play tests. Our research contradicts the current literature, which suggests that physiological measurements are needed; we show that it is possible to estimate user emotion with a software-only method
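    As a hedged illustration of the kind of fuzzy appraisal the paper builds on, the toy example below maps the appraised desirability of a game event to emotion intensities through triangular membership functions and two simple rules. The membership shapes, rule base, and event scoring are invented for illustration and are not taken from the paper or from FLAME itself.

    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def appraise(desirability):
        """desirability in [-1, 1]; e.g. scoring a frag might be +0.8, dying -0.9."""
        undesirable = tri(desirability, -1.5, -1.0, 0.0)   # fuzzify the appraisal
        neutral     = tri(desirability, -1.0,  0.0, 1.0)
        desirable   = tri(desirability,  0.0,  1.0, 1.5)
        # Rule base: desirable events raise joy, undesirable ones raise distress.
        return {"joy": desirable, "distress": undesirable, "calm": neutral}

    print(appraise(0.8))   # joy-dominant estimate for a positive in-game event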