4 research outputs found

    Real-time transition texture synthesis for terrains

    Depicting the transitions where differing material textures meet on a terrain surface presents a unique set of challenges in the field of real-time rendering. Natural landscapes are inherently irregular, composed of complex interactions between many different material types with effectively endless detail and variation. Although consumer-grade graphics hardware grows more powerful with each successive generation, terrain texturing remains a trade-off between realism and the computational resources available. Technological constraints aside, there is still the challenge of generating the texture resources to represent terrain surfaces, which can often span many hundreds or even thousands of square kilometres. Producing such textures by hand is often impractical on a restricted budget of time and funding. This thesis presents two novel algorithms for generating texture transitions in real-time using automated processes. The first, Feature-Based Probability Blending (FBPB), automates the generation of transitions between material textures containing salient features. Because such features protrude through the terrain surface, FBPB ensures that their topography is maintained realistically at transitions. The transitions themselves are generated by a probabilistic process that also dynamically adds wear and tear, introducing high-frequency detail and irregularity along the transition contour. The second algorithm, Dynamic Patch Transitions (DPT), extends FBPB by applying the probabilistic transition approach to material textures that contain no salient features. By breaking texture space into a series of layered patches that are either rendered or discarded on a probabilistic basis, the resolution and irregularity of the transition contour are greatly increased. Used in conjunction with high-frequency detail techniques, such as alpha masking, DPT can produce endless, detailed, irregular transitions without the need for artistic input.
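    To make the probabilistic transition idea concrete, here is a minimal sketch; it is not the thesis's actual FBPB/DPT implementation, and the function and parameter names are invented for illustration. Each patch in a band around the material contour is assigned to one material or the other stochastically, with the probability rising across the band, so the resulting boundary comes out irregular rather than a straight blend line:

```python
import random

def probabilistic_transition(width, height, contour_y=0.5, band=0.2, seed=1):
    """Assign each patch in a grid to material 'A' or 'B'.

    Within a band around the contour the choice is stochastic, so the
    boundary is jagged and irregular instead of a smooth linear blend.
    Illustrative sketch only; names and default values are invented.
    """
    rng = random.Random(seed)
    grid = []
    for j in range(height):
        v = j / (height - 1)  # normalised vertical coordinate in [0, 1]
        row = []
        for i in range(width):
            # Probability of material B rises linearly across the band.
            t = (v - (contour_y - band / 2)) / band
            p_b = min(max(t, 0.0), 1.0)
            row.append("B" if rng.random() < p_b else "A")
        grid.append(row)
    return grid

if __name__ == "__main__":
    for row in probabilistic_transition(40, 12):
        print("".join(row))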

    The influence of olfaction on the perception of high-fidelity computer graphics

    The computer graphics industry constantly demands more realistic images and animations. However, producing such high-quality scenes can take a long time, even days, when rendering on a single PC. One approach that can be used to speed up rendering times is Visual Perception, which exploits the limitations of the Human Visual System, since the viewers of the results will be humans. Although there is a growing body of research into how haptics and sound may affect a viewer's perception in a virtual environment, the influence of smell has been largely ignored. The aim of this thesis is to address this gap and make smell an integral part of multi-modal virtual environments. In this work, we performed four major experiments with a total of 840 participants. The experiments used still images and animations, related and unrelated smells, and finally a multi-modal environment combining smell, sound and temperature. Besides this, we also investigated how long it takes an average person to adapt to a smell and what effect there may be when performing a task in the presence of a smell. The results of this thesis clearly show that a smell present in the environment firstly affects the perception of object quality within a rendered image, and secondly enables parts of the scene, or the whole animation, to be selectively rendered in high quality while the rest is rendered in lower quality without the viewer noticing the drop in quality. Such selective rendering in the presence of smell yields significant computational performance gains without any perceived loss in the quality of the image or animation.
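    The selective-rendering principle the abstract describes can be sketched in a few lines: pixels the viewer is predicted to attend to (for instance, near the source of a related smell cue) receive a full sample budget, while the rest are rendered cheaply. This is only a toy illustration under assumed names and values, not the experimental setup used in the thesis:

```python
def samples_per_pixel(saliency, high=64, low=4, threshold=0.5):
    """Toy selective-rendering budget: attended pixels (saliency above
    threshold) get the full per-pixel sample count; everything else is
    rendered at low quality. All names and values here are illustrative."""
    return [[high if s >= threshold else low for s in row] for row in saliency]

if __name__ == "__main__":
    # 4x4 saliency map with attention concentrated at the centre.
    saliency = [
        [0.1, 0.2, 0.2, 0.1],
        [0.2, 0.8, 0.9, 0.2],
        [0.2, 0.9, 0.8, 0.2],
        [0.1, 0.2, 0.2, 0.1],
    ]
    for row in samples_per_pixel(saliency):
        print(row)
```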

    Rendering grass terrains in real-time with dynamic lighting

    No full text