Research on geometry recovery and High Dynamic Range (HDR) reconstruction
from panoramic imaging has become a growing trend with the development of Extended Reality (XR).
Neural Radiance Fields (NeRF) provide a promising scene representation for both
tasks without requiring extensive prior data. However, given only sparse
Low Dynamic Range (LDR) panoramic images as input, NeRF often degrades due to
under-constrained geometry and cannot reconstruct HDR radiance from the LDR
observations. We observe that the radiance at each pixel of a panoramic image
can be modeled both as a signal that conveys scene lighting information and as
a light source that illuminates other pixels.
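For concreteness, a standard radiometric formulation of this observation (an assumption for illustration; the paper's exact model may differ) is that the irradiance E at a surface point x with normal n aggregates the incident radiance L_i over the hemisphere Omega, attenuated by the cosine foreshortening term, and for a Lambertian surface with albedo rho the outgoing radiance is the attenuated irradiance:

\[
E(\mathbf{x}) = \int_{\Omega} L_i(\mathbf{x}, \boldsymbol{\omega})\,(\mathbf{n}\cdot\boldsymbol{\omega})\,\mathrm{d}\boldsymbol{\omega},
\qquad
L_o(\mathbf{x}) = \frac{\rho(\mathbf{x})}{\pi}\,E(\mathbf{x}).
\]

Intuitively, a source that saturates in one LDR pixel still illuminates other surfaces whose observed values remain within the LDR range, so this attenuation relation constrains the source's true HDR radiance.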
Hence, we propose irradiance fields from sparse LDR panoramic images, which
increase the number of observations for faithful geometry recovery and
leverage the irradiance-radiance attenuation for HDR reconstruction. Extensive
experiments demonstrate that the irradiance fields outperform state-of-the-art
methods on both geometry recovery and HDR reconstruction, validating their
effectiveness. Furthermore, we show spatially-varying lighting estimation as a
promising byproduct. The code is
available at https://github.com/Lu-Zhan/Pano-NeRF
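As a minimal numerical sketch of the irradiance integral above (illustrative helper names, not the API of the released code), one can estimate E(x) by cosine-weighted Monte Carlo sampling, with the incident radiance queried from a trained radiance field:

```python
import numpy as np

def sample_cosine_hemisphere(normal, n, rng):
    """Draw n unit directions about `normal` with pdf cos(theta) / pi."""
    u1, u2 = rng.random(n), rng.random(n)
    r, phi = np.sqrt(u1), 2.0 * np.pi * u2
    local = np.stack([r * np.cos(phi), r * np.sin(phi), np.sqrt(1.0 - u1)], axis=-1)
    # Orthonormal basis (t, b, normal) to rotate local samples into world space.
    a = np.array([1.0, 0.0, 0.0]) if abs(normal[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    t = np.cross(normal, a)
    t /= np.linalg.norm(t)
    b = np.cross(normal, t)
    return local[:, :1] * t + local[:, 1:2] * b + local[:, 2:3] * normal

def irradiance_mc(radiance_fn, x, normal, n_samples=1024, seed=0):
    """Estimate E(x) = integral of L_i(x, w) (n.w) dw over the hemisphere.

    With cosine-weighted samples (pdf = cos(theta)/pi), the cosine term and
    the pdf cancel, leaving E ~ pi * mean(L_i). `radiance_fn(x, dirs)` is a
    hypothetical stand-in for querying incident HDR radiance from a trained
    radiance field.
    """
    rng = np.random.default_rng(seed)
    dirs = sample_cosine_hemisphere(normal, n_samples, rng)
    return np.pi * radiance_fn(x, dirs).mean(axis=0)

# Example: a bright distant source approximated by a tight directional lobe.
sun = np.array([0.0, 0.0, 1.0])
radiance_fn = lambda x, dirs: 50.0 * np.maximum(dirs @ sun, 0.0) ** 64
E = irradiance_mc(radiance_fn, x=np.zeros(3), normal=sun)
```

The cosine attenuation in this estimate is the same term that lets unclipped LDR observations of illuminated surfaces constrain the HDR values of saturated sources.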