
    Plenoptische Modellierung und Darstellung komplexer starrer Szenen (Plenoptic Modeling and Rendering of Complex Rigid Scenes)

    Image-Based Rendering is the task of generating novel views from existing images. In this thesis, several new methods for this problem are presented, designed to fulfil specific goals such as scalability and interactive rendering performance. First, the theory of the Plenoptic Function is introduced as the mathematical foundation of image formation. A new taxonomy is then introduced to categorise existing methods, and an extensive overview of known approaches is given. This is followed by a detailed analysis of the design goals and the requirements with regard to input data. It is concluded that, for perspectively correct image generation from sparse spatial sampling, geometry information about the scene is necessary. This leads to the design of three different Image-Based Rendering methods. The rendering results are analysed on different data sets; for this analysis, error metrics are defined to evaluate different aspects.
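    The Plenoptic Function mentioned in the abstract is conventionally written (following Adelson and Bergen) as a seven-parameter function; a sketch of that standard form, not a formula taken from the thesis itself:

    ```latex
    P = P(\theta, \phi, \lambda, t, V_x, V_y, V_z)
    ```

    Here $P$ is the radiance arriving at viewpoint $(V_x, V_y, V_z)$ from direction $(\theta, \phi)$ at wavelength $\lambda$ and time $t$; image-based rendering can then be viewed as resampling a sparsely sampled $P$ to synthesise new views.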

    Plenoptic Modeling of 3D Scenes with a Sensor-augmented Multi-Camera Rig

    We propose a system for robust modelling and visualisation of complex outdoor scenes from multi-camera image sequences and additional sensor information. A camera rig with one or more FireWire cameras is used in conjunction with a 3-axis rotation sensor to robustly obtain a calibration of the scene with an uncalibrated structure-from-motion approach. Dense depth maps are estimated with multi-viewpoint stereo, and the scene is visualised from a plenoptic representation.
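    The dense depth maps mentioned above come from multi-viewpoint stereo; as a minimal illustration of the underlying geometry, here is only the rectified two-view special case, where depth follows from disparity as Z = f·B/d. The focal length and baseline below are illustrative values, not calibration data from the paper:

    ```python
    import numpy as np

    def depth_from_disparity(disparity, focal_px, baseline_m):
        """Convert a disparity map (pixels) from a rectified stereo pair
        into a depth map (metres) via Z = f * B / d.

        Two-view special case only; the paper's multi-viewpoint stereo
        fuses many views. focal_px and baseline_m are illustrative.
        """
        disparity = np.asarray(disparity, dtype=np.float64)
        depth = np.full_like(disparity, np.inf)
        valid = disparity > 0  # zero disparity -> point at infinity
        depth[valid] = focal_px * baseline_m / disparity[valid]
        return depth

    # Example: 800 px focal length, 12 cm baseline, 16 px disparity -> 6 m
    d = depth_from_disparity([[16.0, 0.0]], focal_px=800.0, baseline_m=0.12)
    ```

    Invalid (zero) disparities are mapped to infinite depth rather than raising an error, which keeps the map dense for later visualisation.
    
    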

    Distributed Realtime Interaction and Visualization System

    A distributed real-time system for immersive visualization is presented, which uses distributed interaction for control. We focus on user tracking with fixed and pan-tilt-zoom cameras, synchronization of multiple interaction devices, and distributed synchronized visualization. The system uses only standard hardware and standard network protocols; furthermore, the real-time visualization runs entirely on consumer graphics hardware.

    Architecture and Tracking Algorithms for a Distributed Mobile Industrial AR System

    In Augmented Reality applications, a 3D object is registered with a camera, and visual augmentations of the object are rendered into the user's field of view with a head-mounted display. For correct rendering, the 3D pose of the user's view w.r.t. the 3D object must be registered and tracked in real time, which is a computationally intensive task. This contribution describes a distributed system that makes it possible to track the 3D camera pose and to render images on a lightweight mobile frontend user interface system. The frontend system is connected by WLAN to a backend server that takes over the computational burden of real-time tracking. We describe the system architecture and the tracking algorithms of our system.
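    The registration step the abstract describes amounts to projecting object-space points through the tracked camera pose. A minimal pinhole-camera sketch of that projection; the intrinsics K and the pose (R, t) below are illustrative values, not the system's actual calibration:

    ```python
    import numpy as np

    def project_point(K, R, t, X):
        """Project a 3D point X (object coordinates) into the image,
        given intrinsics K and the tracked pose (R, t):
        x_h = K (R X + t), followed by perspective division.
        """
        Xc = R @ np.asarray(X, dtype=np.float64) + t  # object -> camera frame
        x_h = K @ Xc                                  # homogeneous image point
        return x_h[:2] / x_h[2]                       # perspective division

    # Illustrative calibration: 800 px focal length, 640x480 image centre.
    K = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    R = np.eye(3)                   # identity rotation for illustration
    t = np.array([0.0, 0.0, 2.0])   # object 2 m in front of the camera
    uv = project_point(K, R, t, [0.1, 0.0, 0.0])
    ```

    With the pose tracked on the backend server, only K, R, t, and the augmentation geometry need to reach the lightweight frontend for rendering.
    
    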
