
    Handling photographic imperfections and aliasing in augmented reality

    In video see-through augmented reality, virtual objects are overlaid over images delivered by a digital video camera. One particular problem of this image mixing process is the fact that the visual appearance of the computer-generated graphics differs strongly from the real background image. In typical augmented reality systems, standard real-time rendering techniques are used for displaying virtual objects. These fast, but relatively simplistic methods create an artificial, almost "plastic-like" look for the graphical elements. In this paper, methods for incorporating two particular camera image effects in virtual overlays are described. The first effect is camera image noise, which is contained in the data delivered by the CCD chip used for capturing the real scene. The second effect is motion blur, which is caused by the temporal integration of color intensities on the CCD chip during fast movements of the camera or observed objects, resulting in a blurred camera image. Graphical objects rendered with standard methods contain neither image noise nor motion blur. This is one of the factors that make the virtual objects stand out from the camera image and contributes to the perceptual difference between real and virtual scene elements. Here, approaches for mimicking both camera image noise and motion blur in the graphical representation of virtual objects are proposed. An algorithm for generating a realistic imitation of image noise based on a camera calibration step is described. A rendering method which produces motion blur according to the current camera movement is presented. As a by-product of the described rendering pipeline, it becomes possible to perform a smooth blending between virtual objects and the camera image at their boundary. An implementation of the new rendering methods for virtual objects is described, which utilizes the programmability of modern graphics processing units (GPUs) and is capable of delivering real-time frame rates.
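    The two effects described above can be sketched on the CPU as follows. This is a minimal illustration, not the paper's GPU pipeline: the noise amplitude `sigma` stands in for the result of the camera calibration step (whose actual procedure the abstract does not give), and motion blur is approximated by temporal averaging of recent frames, mimicking intensity integration on the CCD.

    ```python
    import random

    def add_calibrated_noise(pixel_values, sigma, seed=None):
        """Add zero-mean Gaussian noise to imitate CCD image noise.

        `sigma` would come from a camera calibration step; here it is
        simply a parameter (an assumption for illustration).
        """
        rng = random.Random(seed)
        noisy = []
        for v in pixel_values:
            n = rng.gauss(0.0, sigma)
            noisy.append(min(255.0, max(0.0, v + n)))  # clamp to 8-bit range
        return noisy

    def motion_blur(frames):
        """Approximate motion blur by averaging the same pixel across
        recent frames, mimicking temporal integration on the CCD chip
        during fast camera movement."""
        k = len(frames)
        return [sum(px) / k for px in zip(*frames)]
    ```

    A real implementation would drive the blur direction and length from the current camera movement, as the paper proposes; the averaging above only conveys the temporal-integration idea.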

    Utilizing image guided surgery for user interaction in medical augmented reality

    The graphical overlay of additional medical information over the patient during a surgical procedure has long been considered one of the most promising applications of augmented reality. While many experimental systems for augmented reality in medicine have reached an advanced state and can deliver high-quality augmented video streams, they usually depend heavily on specialized dedicated hardware. Such dedicated system components, which were originally designed for engineering applications or VR research, are often ill-suited for clinical practice. We have described a novel medical augmented reality application, which is based almost exclusively on existing, commercially available, and certified medical equipment. In our system, a so-called image guided surgery device is used for tracking a webcam, which delivers the digital video stream of the physical scene that is augmented with the virtual information. In this paper, we show how the capability of the image guided surgery system for tracking surgical instruments can be harnessed for user interaction. Our method enables the user to define points and freely drawn shapes in 3-d and provides selectable menu items, which can be located in immediate proximity to the patient. This eliminates the need for conventional touchscreen- or mouse-based user interaction without requiring extra hardware such as external tracking systems or specialized 3-d input devices. Thus the surgeon can directly interact with the system, without the help of additional personnel. We demonstrate our new input method with an application for creating operation plan sketches directly on the patient in an augmented view.
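    The menu-selection idea described above can be sketched as a simple proximity test between the tracked instrument tip and 3-d menu anchors. All names, the millimeter radius, and the data layout are illustrative assumptions; the paper's actual interaction logic is not reproduced here.

    ```python
    from math import dist  # Euclidean distance, Python 3.8+

    def pick_menu_item(tip_position, menu_items, radius=10.0):
        """Return the label of the menu item whose 3-d anchor point is
        closest to the tracked instrument tip, if within `radius` (mm).

        `menu_items` maps labels to (x, y, z) anchor positions placed in
        immediate proximity to the patient.
        """
        best = None
        best_d = radius
        for label, pos in menu_items.items():
            d = dist(tip_position, pos)
            if d <= best_d:
                best, best_d = label, d
        return best
    ```

    For example, with menu anchors at known positions, holding the tip within the radius of one anchor selects it, and `None` is returned when the tip is away from all items.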

    A pointillism style for the non-photorealistic display of augmented reality scenes

    The ultimate goal of augmented reality is to provide the user with a view of the surroundings enriched by virtual objects. Practically all augmented reality systems rely on standard real-time rendering methods for generating the images of virtual scene elements. Although such conventional computer graphics algorithms are fast, they often fail to produce sufficiently realistic renderings. The use of simple lighting and shading methods, as well as the lack of knowledge about actual lighting conditions in the real surroundings, cause virtual objects to appear artificial. We have recently proposed a novel approach for generating augmented reality images. Our method is based on the idea of applying stylization techniques for reducing the visual realism of both the camera image and the virtual graphical objects. Special non-photorealistic image filters are applied to the camera video stream. The virtual scene elements are rendered using non-photorealistic rendering methods. Since both the camera image and the virtual objects are stylized in a corresponding way, they appear very similar. As a result, graphical objects can become indistinguishable from the real surroundings. Here, we present a new method for the stylization of augmented reality images. This approach generates a painterly "brush stroke" rendering. The resulting stylized augmented reality video frames look similar to paintings created in the "pointillism" style. We describe the implementation of the camera image filter and the non-photorealistic renderer for virtual objects. These components have been newly designed or adapted for this purpose. They are fast enough for generating augmented reality images in real-time and are customizable. The results obtained using our approach are very promising and show that it improves immersion in augmented reality.
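    The pointillism idea can be conveyed with a minimal CPU sketch: paint many small discs at random positions, each taking the color of the underlying source pixel. The paper's real-time filter is far more elaborate; function names, the grayscale image layout, and parameter values here are assumptions for illustration.

    ```python
    import random

    def pointillize(image, width, height, n_dots, dot_radius, seed=0):
        """Render a pointillism-style version of a row-major grayscale
        image by stamping colored discs sampled from the source."""
        rng = random.Random(seed)
        canvas = [255] * (width * height)  # start from a white canvas
        for _ in range(n_dots):
            cx, cy = rng.randrange(width), rng.randrange(height)
            color = image[cy * width + cx]  # sample the source color
            for y in range(max(0, cy - dot_radius), min(height, cy + dot_radius + 1)):
                for x in range(max(0, cx - dot_radius), min(width, cx + dot_radius + 1)):
                    if (x - cx) ** 2 + (y - cy) ** 2 <= dot_radius ** 2:
                        canvas[y * width + x] = color
        return canvas
    ```

    Applying the same stamping to both the camera frame and the rendered virtual objects is what makes the two layers blend, per the approach described above.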

    Real-time cartoon-like stylization of AR video streams on the GPU

    The ultimate goal of many applications of augmented reality is to immerse the user into the augmented scene, which is enriched with virtual models. In order to achieve this immersion, it is necessary to create the visual impression that the graphical objects are a natural part of the user’s environment. Producing this effect with conventional computer graphics algorithms is a complex task. Various rendering artifacts in the three-dimensional graphics create a noticeable visual discrepancy between the real background image and virtual objects. We have recently proposed a novel approach to generating an augmented video stream. With this new method, the output images are a non-photorealistic reproduction of the augmented environment. Special stylization methods are applied to both the background camera image and the virtual objects. This way the visual realism of both the graphical foreground and the real background image is reduced, so that they are less distinguishable from each other. Here, we present a new method for the cartoon-like stylization of augmented reality images, which uses a novel post-processing filter for cartoon-like color segmentation and high-contrast silhouettes. In order to make fast post-processing of rendered images possible, the programmability of modern graphics hardware is exploited. We describe an implementation of the algorithm using the OpenGL Shading Language. The system is capable of generating a stylized augmented video stream of high visual quality at real-time frame rates. As an example application, we demonstrate the visualization of dinosaur bone datasets in stylized augmented reality.
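    The two ingredients named above, flat color segmentation and high-contrast silhouettes, can be sketched on the CPU for a grayscale image. This is a rough stand-in for the paper's GLSL post-process: the band count, the simple horizontal-gradient edge test, and the threshold are illustrative assumptions.

    ```python
    def cartoonize(image, width, height, levels=4, edge_threshold=40):
        """Cartoon-like stylization sketch: quantize intensities into a
        few flat bands and draw black silhouette pixels where the
        horizontal gradient is large."""
        step = 256 // levels
        out = []
        for y in range(height):
            for x in range(width):
                v = image[y * width + x]
                q = (v // step) * step + step // 2  # center of the flat band
                # high-contrast silhouette via a simple gradient test
                if x + 1 < width and abs(image[y * width + x + 1] - v) > edge_threshold:
                    q = 0
                out.append(q)
        return out
    ```

    A shader version would evaluate the same per-pixel logic in a fragment program over the combined (camera plus virtual) frame, which is what makes real-time frame rates attainable.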

    A proposal for a simple average-based progressive taxation system

    This paper is a first theoretical presentation of a simple progressive taxation system. The system is based on two adaptations of one easily calculable formula that is based on the societal average income of the previous year. The system contributes to academic discussions as it is a novel approach. It is a progressive tax that does not discriminate against anyone: the progression increases continuously, and the increase in tax payment never exceeds the additional income. The analysis in the paper shows that the core advantage of the system is its simple, transparent and adaptable mechanism.
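    The abstract does not state the actual formula, so the following is only a hypothetical stand-in that preserves the stated properties: the effective rate grows continuously with income relative to last year's societal average, and it saturates well below 100% so that extra income is never taxed away entirely. All parameter values are invented for illustration.

    ```python
    def tax_due(income, avg_income_prev_year, base_rate=0.1, max_rate=0.4):
        """Illustrative average-based progressive tax (NOT the paper's
        formula): effective rate = base_rate * (income / last year's
        average income), capped at max_rate."""
        relative = income / avg_income_prev_year
        rate = min(max_rate, base_rate * relative)
        return income * rate
    ```

    With these placeholder parameters, someone earning exactly the previous year's average pays an effective 10%, someone at twice the average pays 20%, and the rate rises continuously in between, so the schedule is progressive without discrete brackets.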

    Reality Tooning: Fast Non-Photorealism for Augmented Video Streams (poster)

    Recently, we have proposed a novel approach to generating augmented video streams. The output images are a non-photorealistic reproduction of the augmented environment. Special stylization methods are applied to both the background camera image and the virtual objects. This way, the graphical foreground and the real background images are rendered in a similar style, so that they are less distinguishable from each other. Here, we present a new algorithm for the cartoon-like stylization of augmented reality images, which uses a novel post-processing filter for cartoon-like color segmentation and high-contrast silhouettes. In order to make fast post-processing of rendered images possible, the programmability of modern graphics hardware is exploited. The system is capable of generating a stylized augmented video stream at real-time frame rates.

    Noise thermometry in narrow 2D electron gas heat baths connected to a quasi-1D interferometer

    Thermal voltage noise measurements are performed in order to determine the electron temperature in nanopatterned channels of a GaAs/AlGaAs heterostructure at bath temperatures of 4.2 and 1.4 K. Two narrow two-dimensional (2D) heating channels, close to the transition to the one-dimensional (1D) regime, are connected by a quasi-1D quantum interferometer. Under dc current heating of the electrons in one heating channel, we perform cross-correlated noise measurements locally in the directly heated channel and nonlocally in the other channel, which is indirectly heated by hot electron diffusion across the quasi-1D connection. We observe the same functional dependence of the thermal noise on the heating current. The temperature dependence of the electron energy-loss rate is reduced compared to wider 2D systems. In the quantum interferometer, we observe decoherence due to the diffusion of hot electrons from the heating channel into the quasi-1D system, which causes a thermal gradient.
    Comment: 6 pages, 5 figures
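    The principle behind this kind of noise thermometry is the Johnson-Nyquist relation S_V = 4 k_B T R, from which the electron temperature follows once the voltage-noise power spectral density and the channel resistance are known. The sketch below shows only this inversion; the measurement details in the paper (e.g. cross-correlation to reject amplifier noise) are omitted, and the function name is an assumption.

    ```python
    K_B = 1.380649e-23  # Boltzmann constant, J/K (exact, SI 2019)

    def electron_temperature(noise_psd, resistance):
        """Invert the Johnson-Nyquist relation S_V = 4 * k_B * T * R to
        obtain the electron temperature T in kelvin.

        noise_psd  -- voltage-noise power spectral density, V^2/Hz
        resistance -- channel resistance, ohms
        """
        return noise_psd / (4 * K_B * resistance)
    ```

    For instance, a 1 kΩ channel at the 4.2 K bath temperature produces S_V = 4 k_B (4.2 K)(1 kΩ) ≈ 2.3e-19 V²/Hz; measuring an excess above this under dc heating reveals the elevated electron temperature.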