32 research outputs found

    Stable Real-Time 3D Tracking Using Online and Offline Information


    Real-time augmented face

    This real-time augmented reality demonstration relies on our tracking algorithm described in V. Lepetit et al. (2003). This algorithm considers natural feature points, and thus does not require engineering of the environment. It merges the information from preceding frames, in traditional recursive tracking fashion, with that provided by a very limited number of reference frames. This combination results in a system that does not suffer from jitter or drift, and can deal with drastic changes. The tracker recovers the full 3D pose of the tracked object, allowing insertion of 3D virtual objects for augmented reality applications.
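    The key idea of combining recursive (online) and reference-frame (offline) information can be illustrated with a toy sketch. The weighted blend and all numbers below are illustrative assumptions, not the paper's actual joint minimisation; they only show why a drift-free keyframe measurement bounds the error of a drifting recursive tracker.

```python
import numpy as np

def fuse_pose(online_pose, keyframe_pose, w_online=0.7):
    """Blend the pose predicted from the preceding frame (smooth but
    prone to drift) with the pose measured against a reference keyframe
    (drift-free but jittery). A weighted average stands in for the
    paper's joint estimation."""
    online_pose = np.asarray(online_pose, dtype=float)
    keyframe_pose = np.asarray(keyframe_pose, dtype=float)
    return w_online * online_pose + (1.0 - w_online) * keyframe_pose

# Toy 1-D "pose": the recursive tracker drifts by 0.1 per frame,
# while the keyframe measurement always reports the true pose (0.0).
true_pose, drift = 0.0, 0.1
pure, fused = 0.0, 0.0
for _ in range(50):
    pure += drift                                  # drift accumulates unchecked
    fused = fuse_pose(fused + drift, true_pose)    # keyframe pulls the estimate back

# `pure` grows without bound; `fused` converges to a small fixed point.
```

    The recursion converges to w/(1-w) times the per-frame drift, so the fused error stays bounded no matter how long the sequence runs, which is exactly the jitter-and-drift trade-off the abstract describes.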

    Automatic 3D Facial Expression Analysis in Videos

    We introduce a novel framework for automatic 3D facial expression analysis in videos. Preliminary results demonstrate facial expression editing guided by facial expression recognition. We first build a 3D expression database to learn the expression space of a human face. The real-time 3D video data were captured by a camera/projector scanning system. From this database, we extract the geometry deformation independent of pose and illumination changes. All possible facial deformations of an individual form a nonlinear manifold embedded in a high-dimensional space. To combine the manifolds of different subjects, which vary significantly and are usually hard to align, we transfer the facial deformations in all training videos to one standard model. Lipschitz embedding then embeds the normalized deformation of the standard model in a low-dimensional generalized manifold. We learn a probabilistic expression model on the generalized manifold. To edit a facial expression of a new subject in 3D videos, the system searches over this generalized manifold for the optimal replacement with the 'target' expression, which is blended with the deformation in the previous frames to synthesize images of the new expression with the current head pose. Experimental results show that our method works effectively.
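    Lipschitz embedding, used above to obtain low-dimensional coordinates for the deformations, maps each point to its distances from a few reference subsets. A minimal sketch under that general definition (the reference sets and inputs here are illustrative, not the paper's data):

```python
import numpy as np

def lipschitz_embed(points, reference_sets):
    """Map each point x to (d(x, A_1), ..., d(x, A_k)), where
    d(x, A) = min_{a in A} ||x - a||. Each reference set A_j
    contributes one coordinate of the embedding."""
    points = np.asarray(points, dtype=float)
    emb = np.empty((len(points), len(reference_sets)))
    for j, A in enumerate(reference_sets):
        A = np.asarray(A, dtype=float)
        # distance from every point to its nearest member of A_j
        dists = np.linalg.norm(points[:, None, :] - A[None, :, :], axis=2)
        emb[:, j] = dists.min(axis=1)
    return emb
```

    In the paper's setting the input points would be high-dimensional normalized deformations of the standard model, and the embedding dimension is the (small) number of reference sets.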

    Stable Real-Time Interaction Between Virtual Humans and Real Scenes

    We present an augmented reality system that relies on purely passive techniques to solve the real-time registration problem. It can run on a portable PC and does not require engineering of the environment, for example by adding markers. To achieve this result, we have integrated robust computer vision techniques into a powerful VR framework. The resulting AR system allows us to produce complex rendering and animation of virtual human characters, and to blend them into the real world. The system tracks the 3D camera position by means of a natural features tracker, which, given a rough CAD model, can deal with complex 3D objects. The tracking method can handle both large camera displacements and aspect changes. We show that our system works in the cluttered environment of a real industrial facility and can, therefore, be used to enhance manufacturing and industrial processes.
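    The core operation of such a model-based tracker is projecting CAD-model points under a candidate camera pose and comparing them with observed image features; the pose that minimises the reprojection error registers the virtual content. A toy numpy sketch of that operation (the cube model, focal length, and identity rotation are illustrative assumptions, not the system's actual setup):

```python
import numpy as np

def project(points_3d, R, t, f=800.0, c=(320.0, 240.0)):
    """Pinhole projection of object-space points into the image:
    transform into the camera frame, then divide by depth."""
    cam = points_3d @ R.T + t
    return f * cam[:, :2] / cam[:, 2:3] + np.asarray(c)

def reprojection_error(t, R, pts_3d, observed, f=800.0, c=(320.0, 240.0)):
    """Mean pixel distance between projected model points and the
    observed feature locations -- the quantity a tracker minimises."""
    return np.linalg.norm(project(pts_3d, R, t, f, c) - observed, axis=1).mean()

# Toy "rough CAD model": corners of a unit cube seen from 5 units away.
model = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)],
                 dtype=float)
R = np.eye(3)
t_true = np.array([0.0, 0.0, 5.0])
obs = project(model, R, t_true)   # synthetic "detected" feature positions
```

    At the true pose the reprojection error vanishes, while even a small translation offset produces a clearly measurable pixel error, which is what makes the minimisation well-posed.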

    An Investigation of Model Bias in 3D Face Tracking
