
    Non-rigid Reconstruction with a Single Moving RGB-D Camera

    We present a novel non-rigid reconstruction method using a moving RGB-D camera. Current approaches use only the non-rigid part of the scene and completely ignore the rigid background, yet non-rigid parts often lack sufficient geometric and photometric information for tracking large frame-to-frame motion. Our approach uses the camera pose estimated from the rigid background for foreground tracking, which enables robust foreground tracking in situations where large frame-to-frame motion occurs. Moreover, we propose a multi-scale deformation graph that improves non-rigid tracking without compromising the quality of the reconstruction. We also contribute a synthetic dataset, made publicly available for evaluating non-rigid reconstruction methods, which provides frame-by-frame ground-truth geometry of the scene, the camera trajectory, and background/foreground masks. Experimental results show that our approach is more robust in handling large frame-to-frame motions and provides better reconstructions than state-of-the-art approaches.
    Comment: Accepted at the International Conference on Pattern Recognition (ICPR 2018).
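
    To illustrate the core idea described above, here is a minimal sketch (not the authors' code) of how a camera pose estimated from rigid-background correspondences can warm-start foreground tracking under large frame-to-frame motion. The Kabsch step is standard; the deformation-graph refinement it would feed into (e.g. a refine_non_rigid routine) is assumed and left out.

        import numpy as np

        def rigid_transform(src, dst):
            """Least-squares rigid transform (R, t) mapping src -> dst, both (N, 3)."""
            c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
            H = (src - c_src).T @ (dst - c_dst)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:        # guard against a reflection
                Vt[-1] *= -1
                R = Vt.T @ U.T
            t = c_dst - R @ c_src
            return R, t

        def init_foreground(bg_prev, bg_curr, graph_node_positions):
            """Warm-start foreground deformation-graph nodes with the background pose."""
            R, t = rigid_transform(bg_prev, bg_curr)          # background / camera motion
            return graph_node_positions @ R.T + t             # input to non-rigid refinement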

    Virtual and Augmented Reality Techniques for Minimally Invasive Cardiac Interventions: Concept, Design, Evaluation and Pre-clinical Implementation

    While less invasive techniques have been employed for some procedures, most intracardiac interventions are still performed under cardiopulmonary bypass, on the drained, arrested heart. Progress toward off-pump intracardiac interventions has been hampered by the lack of adequate visualization inside the beating heart. This thesis describes the development, assessment, and pre-clinical implementation of a mixed reality environment that integrates pre-operative imaging and modeling with surgical tracking technologies and real-time ultrasound imaging. The intra-operative echo images are augmented with pre-operative representations of the cardiac anatomy and with virtual models of the delivery instruments, tracked in real time using magnetic tracking technology. As a result, the otherwise context-less images can be interpreted within the anatomical context provided by the anatomical models. The virtual models assist the user with tool-to-target navigation, while real-time ultrasound ensures accurate positioning of the tool on target, providing the surgeon with sufficient information to "see" and manipulate instruments in the absence of direct vision. Several pre-clinical acute evaluation studies were conducted in vivo on swine models to assess the feasibility of the proposed environment in a clinical context. Following direct access inside the beating heart using the UCI, the proposed mixed reality environment was used to provide the visualization and navigation needed to position a prosthetic mitral valve on the native annulus, or to place a repair patch on a created septal defect, in vivo in porcine models. Following further development and seamless integration into the clinical workflow, we hope that the proposed mixed reality guidance environment may become a significant milestone toward enabling minimally invasive therapy on the beating heart.
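
    The technical heart of such an environment is composing coordinate frames so that a magnetically tracked instrument can be rendered in the same frame as the pre-operative model and the live ultrasound. The sketch below shows only that composition with homogeneous transforms; the transform names (model_T_tracker, tracker_T_tool) and the numeric values are illustrative assumptions, not the thesis software.

        import numpy as np

        def pose(R, t):
            """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
            T = np.eye(4)
            T[:3, :3], T[:3, 3] = R, t
            return T

        # Registration/calibration transforms (in practice obtained from tracker
        # calibration and image-to-model registration).
        model_T_tracker = pose(np.eye(3), np.array([0.0, 0.0, 50.0]))   # tracker base in model frame
        tracker_T_tool  = pose(np.eye(3), np.array([10.0, 5.0, 0.0]))   # live sensor reading

        # Instrument tip expressed in the pre-operative model / echo context.
        model_T_tool = model_T_tracker @ tracker_T_tool
        tip_in_model = model_T_tool @ np.array([0.0, 0.0, 0.0, 1.0])
        print(tip_in_model[:3])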

    Real-time tracking of 3D elastic objects with an RGB-D sensor

    This paper presents a method to track, in real time, a textureless 3D object that undergoes large deformations, such as elastic ones, as well as rigid motions, using the point cloud data provided by an RGB-D sensor. This solution is expected to be useful for enhanced manipulation by humanoid robotic systems. Our framework relies on a prior visual segmentation of the object in the image. The segmented point cloud is registered first in a rigid manner and then by non-rigidly fitting the mesh, relying on the Finite Element Method to model elasticity and on geometric point-to-point correspondences to compute the external forces exerted on the mesh. The real-time performance of the system is demonstrated on synthetic and real data involving challenging deformations and motions.
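
    A minimal sketch of the correspondence-driven step, not the paper's FEM implementation: after rigid alignment, nearest-point correspondences between the segmented cloud and the mesh vertices are turned into external forces, and the vertices are relaxed with a damped explicit update. The internal elastic (FEM) forces are omitted here, and the stiffness, damping, and dt values are made-up placeholders.

        import numpy as np

        def nearest_points(vertices, cloud):
            """For each mesh vertex (V, 3), the closest observed point in cloud (P, 3)."""
            d2 = ((vertices[:, None, :] - cloud[None, :, :]) ** 2).sum(-1)
            return cloud[d2.argmin(axis=1)]

        def non_rigid_step(vertices, velocities, cloud, stiffness=5.0, damping=0.9, dt=0.01):
            """One explicit integration step driven by point-to-point external forces."""
            forces = stiffness * (nearest_points(vertices, cloud) - vertices)
            velocities = damping * velocities + dt * forces
            return vertices + dt * velocities, velocities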

    Using Haar-like feature classifiers for hand tracking in tabletop augmented reality

    In this paper we propose a hand-interaction approach for tabletop Augmented Reality applications. We detect the user's hands using Haar-like feature classifiers and correlate their positions with the fixed markers on the table. This gives users the possibility to move, rotate, and resize the virtual objects located on the table with their bare hands.
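
    The detection step can be sketched with OpenCV's cascade classifier as below. The cascade file hand_cascade.xml stands for a hypothetical pre-trained hand classifier (OpenCV only ships face/eye cascades), and the correlation with the tabletop markers is left out; this is an illustration of the technique, not the authors' code.

        import cv2

        hand_cascade = cv2.CascadeClassifier("hand_cascade.xml")  # assumed trained offline
        cap = cv2.VideoCapture(0)

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            hands = hand_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            for (x, y, w, h) in hands:                  # one bounding box per detected hand
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.imshow("hands", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break

        cap.release()
        cv2.destroyAllWindows()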