67 research outputs found

    Patient-specific simulation environment for surgical planning and preoperative rehearsal

    Surgical simulation is common practice in the fields of surgical education and training. Numerous surgical simulators are available from commercial and academic organisations for the generic modelling of surgical tasks. However, a simulation platform has yet to be found that fulfils the key requirements for patient-specific surgical simulation of soft tissue, with an effective translation into clinical practice. Patient-specific modelling is possible, but to date has been time-consuming, and consequently costly, because data preparation can be technically demanding. This motivated the research developed herein, which addresses the main challenges of biomechanical modelling for patient-specific surgical simulation. A novel implementation of soft tissue deformation and estimation of the patient-specific intraoperative environment is achieved using a position-based dynamics approach. This modelling approach overcomes the limitations of traditional physically-based approaches by providing a simulation of patient-specific models with visual and physical accuracy, stability and real-time interaction. As the method is geometrically based, the simulation parameters are calibrated and the simulation framework is successfully validated through experimental studies. The capabilities of the simulation platform are demonstrated by the integration of different surgical planning applications relevant in the context of kidney cancer surgery. The simulation of pneumoperitoneum facilitates trocar placement planning and intraoperative surgical navigation. The implementation of deformable ultrasound simulation can assist surgeons in improving their scanning technique and defining an optimal procedural strategy. Furthermore, the simulation framework has the potential to support the development and assessment of hypotheses that cannot be tested in vivo. Specifically, the evaluation of feedback modalities, as a response to user-model interaction, demonstrates improved performance and justifies the need to integrate a feedback framework in the robot-assisted surgical setting.
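
    The abstract above names position-based dynamics (PBD) as the underlying deformation model. Purely as a point of reference, the following is a minimal sketch of a generic PBD step (predict positions, project distance constraints, update velocities); it is not the thesis' implementation, and all names and parameters are illustrative.

    import numpy as np

    def pbd_step(x, v, edges, rest_len, inv_mass, dt=1e-2, iters=10,
                 gravity=(0.0, -9.81, 0.0)):
        """One generic position-based dynamics step over distance constraints."""
        # 1. Predict positions from external forces (here only gravity).
        movable = (inv_mass > 0.0)[:, None]
        v = v + dt * np.asarray(gravity) * movable
        p = x + dt * v

        # 2. Project distance constraints (Gauss-Seidel style iterations).
        for _ in range(iters):
            for (i, j), l0 in zip(edges, rest_len):
                d = p[i] - p[j]
                dist = np.linalg.norm(d)
                w = inv_mass[i] + inv_mass[j]
                if dist < 1e-12 or w == 0.0:
                    continue
                corr = (dist - l0) / (dist * w) * d
                p[i] -= inv_mass[i] * corr
                p[j] += inv_mass[j] * corr

        # 3. Derive new velocities from the corrected positions and commit them.
        v = (p - x) / dt
        return p, v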

    Robust iso-surface tracking for interactive character skinning

    We present a novel approach to interactive character skinning which is robust to extreme character movements, handles skin contacts, and produces the effect of skin elasticity (sliding). Our approach builds on the idea of implicit skinning, in which the character is approximated by a 3D scalar field and mesh vertices are appropriately re-projected. Instead of being bound to an initial skinning solution used to initialize the shape at each time step, we use the skin mesh to directly track iso-surfaces of the field over time. The technical problems are two-fold: firstly, all contact surfaces generated between skin parts should be captured as iso-surfaces of the implicit field; secondly, the tracking method should capture elastic skin effects when the joints bend, and the skin must follow as the character returns to its rest shape. Our solutions include new composition operators enabling blending effects and local self-contact between implicit surfaces, as well as a tangential relaxation scheme derived from the as-rigid-as-possible energy to solve the tracking problem.
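
    The key ingredients mentioned above are field composition operators and the re-projection of mesh vertices onto iso-surfaces. As an orientation aid only, here is a minimal sketch of both ideas using generic operators (a clean union and a log-sum-exp blend) and a first-order projection; the paper's actual operators and its ARAP-derived tangential relaxation are more involved, and all names below are illustrative.

    import numpy as np

    def union(f1, f2):
        # Clean union of two field values: surfaces stay separate, giving contact.
        return np.maximum(f1, f2)

    def smooth_blend(f1, f2, k=0.2):
        # Soft-max style blend that bulges where the two fields overlap
        # (a generic stand-in, not the paper's operator).
        return k * np.log(np.exp(f1 / k) + np.exp(f2 / k))

    def project_to_iso(p, field, grad, iso=0.5, steps=5):
        # Move a vertex along the field gradient until field(p) reaches the iso-value.
        for _ in range(steps):
            g = grad(p)
            gn = float(np.dot(g, g))
            if gn < 1e-12:
                break
            p = p + (iso - field(p)) / gn * g   # first-order (Newton-style) step
        return p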

    Capturing Hands in Action using Discriminative Salient Points and Physics Simulation

    Hand motion capture is a popular research field that has recently gained more attention due to the ubiquity of RGB-D sensors. However, even the most recent approaches focus on the case of a single isolated hand. In this work, we focus on hands that interact with other hands or objects and present a framework that successfully captures motion in such interaction scenarios for both rigid and articulated objects. Our framework combines a generative model with discriminatively trained salient points to achieve a low tracking error, and with collision detection and physics simulation to achieve physically plausible estimates even in the case of occlusions and missing visual data. Since all components are unified in a single objective function which is almost everywhere differentiable, it can be optimized with standard optimization techniques. Our approach works for monocular RGB-D sequences as well as setups with multiple synchronized RGB cameras. For a qualitative and quantitative evaluation, we captured 29 sequences with a large variety of interactions and up to 150 degrees of freedom. (Accepted for publication in the International Journal of Computer Vision (IJCV) on 16.02.2016, submitted on 17.10.14; a combination of an ECCV'12 multi-camera RGB and a monocular RGB-D GCPR'14 hand tracking paper into a single framework, with several extensions, additional experiments and details.)
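
    The abstract stresses that all terms live in one almost-everywhere differentiable objective. Purely as an illustration of that structure (with toy placeholder terms, not the paper's actual energies), a weighted-sum objective can be handed to a standard gradient-based optimizer:

    import numpy as np
    from scipy.optimize import minimize

    # Toy stand-ins for the kinds of terms the abstract describes; each returns a
    # smooth (or almost-everywhere differentiable) penalty so the example runs.
    def data_term(theta, target):            # generative model vs. observed data
        return np.sum((theta - target) ** 2)

    def salient_point_term(theta, target):   # discriminatively detected salient points
        return np.sum(np.abs(theta - target))

    def collision_term(theta):               # penalize interpenetration
        return np.sum(np.maximum(0.0, -theta) ** 2)

    def total_energy(theta, target, w=(1.0, 0.5, 10.0)):
        return (w[0] * data_term(theta, target)
                + w[1] * salient_point_term(theta, target)
                + w[2] * collision_term(theta))

    theta0 = np.zeros(26)                    # e.g. a 26-DoF single-hand pose vector
    target = np.linspace(-0.5, 0.5, 26)      # synthetic "observation" for the demo
    result = minimize(total_energy, theta0, args=(target,), method="L-BFGS-B")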

    Senescence: An Aging based Character Simulation Framework

    The 'Senescence' framework is a character simulation plug-in for Maya that can be used for rigging and skinning muscle-deformer-based humanoid characters with support for aging. The framework was developed using Python, the Maya Embedded Language and PyQt. The main targeted users of this framework are the Character Technical Directors, Technical Artists, Riggers and Animators in the production pipeline of Visual Effects Studios. The characters simulated using 'Senescence' were studied through a survey to understand how well the intended age was perceived by the audience. The results of the survey could not reject one of our null hypotheses, which means that the difference between the simulated age groups of the character was not perceived well by the participants. However, there is a difference in the perception of the character's simulated age between an Animator and a Non-Animator. Therefore, the difference in the simulated character's age was perceived by an untrained audience, but the audience was unable to relate it to a specific age group.

    Animation, Simulation, and Control of Soft Characters using Layered Representations and Simplified Physics-based Methods

    Realistic behavior of computer generated characters is key to bringing virtual environments, computer games, and other interactive applications to life. The plausibility of a virtual scene is strongly influenced by the way objects move around and interact with each other. Traditionally, actions are limited to motion capture driven or pre-scripted motion of the characters. Physics enhances the sense of realism: physical simulation is required to make objects act as expected in real life. To make gaming and virtual environments truly immersive, it is crucial to simulate the response of characters to collisions and to produce secondary effects such as skin wrinkling and muscle bulging. Unfortunately, existing techniques cannot generally achieve these effects in real time, do not address the coupled response of a character's skeleton and skin to collisions, nor do they support artistic control. In this dissertation, I present interactive algorithms that enable physical simulation of deformable characters with high surface detail and support for intuitive deformation control. I propose a novel unified framework for real-time modeling of soft objects with skeletal deformations and surface deformation due to contact, and their interplay, for object surfaces with up to tens of thousands of degrees of freedom. I make use of layered models to reduce computational complexity. I introduce dynamic deformation textures, which map three dimensional deformations in the deformable skin layer to a two dimensional domain for extremely efficient parallel computation of the dynamic elasticity equations and optimized hierarchical collision detection. I also enhance layered models with responsive contact handling, to support the interplay between skeletal motion and surface contact and the resulting two-way coupling effects. Finally, I present dynamic morph targets, which enable intuitive control of dynamic skin deformations at run-time by simply sculpting pose-specific surface shapes. The resulting framework enables real-time and directable simulation of soft articulated characters with frictional contact response, capturing the interplay between skeletal dynamics and complex, non-linear skin deformations.
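
    The central device in this abstract, dynamic deformation textures, maps 3D skin-layer deformations to a 2D domain. The following is only a conceptual CPU sketch of that mapping (scatter per-vertex displacements into a 2D texture, then read them back by UV lookup); the dissertation's GPU-parallel elasticity solve and hierarchical collision detection are not reproduced, and all names are illustrative.

    import numpy as np

    def bake_displacements(uv, disp, res=(256, 256)):
        # Scatter per-vertex 3D displacements into a 2D "deformation texture",
        # averaging when several vertices fall into the same texel.
        tex = np.zeros((*res, 3))
        count = np.zeros(res)
        ij = (uv * np.array(res)).astype(int) % np.array(res)
        for (i, j), d in zip(ij, disp):
            tex[i, j] += d
            count[i, j] += 1
        nz = count > 0
        tex[nz] /= count[nz][:, None]
        return tex

    def apply_texture(verts, uv, tex):
        # Displace vertices by a nearest-texel lookup into the deformation texture.
        res = np.array(tex.shape[:2])
        ij = (uv * res).astype(int) % res
        return verts + tex[ij[:, 0], ij[:, 1]]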

    Efficient and Realistic Character Animation through Analytical Physics-based Skin Deformation

    Physics-based skin deformation methods can greatly improve the realism of character animation, but they require non-trivial training, intensive manual intervention, and heavy numerical calculation. Due to these limitations, they are generally time-consuming to implement, and it is difficult to achieve high runtime efficiency. In order to tackle the above limitations caused by the numerical calculations of physics-based skin deformation, we propose a simple and efficient analytical approach to physics-based skin deformations. Specifically, we (1) employ Fourier series to convert 3D mesh models into continuous parametric representations through a conversion algorithm, which greatly reduces data size and computing time while retaining high realism, and (2) introduce a partial differential equation (PDE)-based skin deformation model and obtain the first analytical solution to physics-based skin deformations, which overcomes the limitations of numerical calculations. Our approach is easy to use, highly efficient, and capable of creating physically realistic skin deformations.
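
    Step (1) above turns mesh geometry into a compact Fourier representation. As a rough illustration of that idea (fitting a truncated Fourier series to a single closed contour of sampled vertices, not the paper's full conversion algorithm or its PDE model), one could write:

    import numpy as np

    def fit_fourier(points, n_terms=8):
        # Fit a truncated Fourier series to a closed 3D contour of N samples:
        #     p(t) ~ sum_k c[k] * exp(2j*pi*k*t),  t in [0, 1)
        # applied independently to the x, y and z coordinates.
        N = len(points)
        t = np.arange(N) / N
        k = np.arange(-n_terms, n_terms + 1)
        basis = np.exp(2j * np.pi * np.outer(t, k))               # (N, 2*n_terms+1)
        coeffs, *_ = np.linalg.lstsq(basis, points.astype(complex), rcond=None)
        return k, coeffs                                          # coeffs: (2*n_terms+1, 3)

    def eval_fourier(k, coeffs, t):
        # Evaluate the continuous representation at arbitrary parameter values t.
        basis = np.exp(2j * np.pi * np.outer(t, k))
        return (basis @ coeffs).real

    In this toy setting a few dozen coefficients replace the raw vertex ring, giving a compact, continuous and analytically differentiable description of the sampled contour.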