Ambient Occlusion Baking via a Feed-Forward Neural Network

Abstract

We present a feed-forward neural network approach for ambient occlusion baking in real-time rendering. The approach implements a multi-layer perceptron that provides a general encoding via regression and an efficient decoding via a simple GPU fragment shader. The non-linear nature of multi-layer perceptrons makes them well suited to capturing the nonlinearities present in ambient occlusion values. A multi-layer perceptron is also randomly accessible, compact in size, and can be evaluated efficiently on the GPU. We demonstrate our neural-network-based screen-space ambient occlusion approach, including its quality, size, and run-time speed.
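To make the encode-by-regression idea concrete, the following is a minimal sketch (not the paper's actual architecture): a small multi-layer perceptron fitted by regression to map hypothetical per-sample surface features to a scalar ambient occlusion value. The feature choice, layer width, and training loop here are illustrative assumptions; the trained weights would then be uploaded as shader constants and decoded per fragment with a few matrix-vector products.

    # Hypothetical example: fit a tiny MLP (one hidden layer) to AO-like targets
    # by mean-squared-error regression. Feature layout and sizes are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic training set: surface features (position + normal) -> AO in [0, 1].
    X = rng.uniform(-1.0, 1.0, size=(4096, 6))
    y = np.clip(0.5 + 0.5 * np.sin(X @ rng.normal(size=(6, 1))), 0.0, 1.0)

    # One hidden layer keeps the encoding compact and cheap to decode on the GPU.
    H = 16
    W1 = rng.normal(scale=0.5, size=(6, H)); b1 = np.zeros(H)
    W2 = rng.normal(scale=0.5, size=(H, 1)); b2 = np.zeros(1)

    lr = 0.05
    for step in range(2000):
        h = np.tanh(X @ W1 + b1)            # hidden activations
        pred = h @ W2 + b2                  # predicted AO values
        err = pred - y
        # Backpropagate the mean-squared regression error.
        gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h * h)
        gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

    print("training MSE:", float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))

Decoding the weights W1, b1, W2, b2 in a fragment shader would amount to the same tanh-layer evaluation per pixel, which is why the representation is random-accessible and fast on the GPU.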