
    Self-Calibrating, Fully Differentiable NLOS Inverse Rendering

    Existing time-resolved non-line-of-sight (NLOS) imaging methods reconstruct hidden scenes by inverting the optical paths of indirect illumination measured at visible relay surfaces. These methods are prone to reconstruction artifacts due to inversion ambiguities and capture noise, which are typically mitigated through the manual selection of filtering functions and parameters. We introduce a fully differentiable, end-to-end NLOS inverse rendering pipeline that self-calibrates the imaging parameters during the reconstruction of hidden scenes, using as input only the measured illumination, while working in both the time and frequency domains. Our pipeline extracts a geometric representation of the hidden scene from NLOS volumetric intensities and estimates the time-resolved illumination at the relay wall produced by that geometry using differentiable transient rendering. We then use gradient descent to optimize the imaging parameters by minimizing the error between our simulated time-resolved illumination and the measured illumination. Our end-to-end differentiable pipeline couples diffraction-based volumetric NLOS reconstruction with path-space light transport and a simple ray-marching technique to extract detailed, dense sets of surface points and normals of hidden scenes. We demonstrate that our method robustly and consistently reconstructs geometry and albedo, even under significant noise levels.
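    The core idea of the abstract above — simulate the time-resolved illumination with a differentiable forward model, then gradient-descend on the imaging parameters until it matches the measurement — can be illustrated with a minimal 1D toy. Here the only "imaging parameter" is a hypothetical temporal offset `t0`, and the renderer is a Gaussian pulse; the paper's actual pipeline optimizes full imaging parameters through a differentiable transient renderer, so every name and value below is an illustrative assumption.

    ```python
    import numpy as np

    # Toy self-calibration: recover an unknown temporal offset t0 of the
    # imaging system by matching a simulated transient to a "measured" one.
    t = np.linspace(0.0, 10.0, 200)
    WIDTH = 2.0  # assumed pulse width of the toy renderer

    def render(t0):
        """Simulated time-resolved intensity: a Gaussian pulse arriving at t0."""
        return np.exp(-((t - t0) ** 2) / (2.0 * WIDTH ** 2))

    measured = render(4.2)   # ground-truth offset t0* = 4.2
    t0 = 1.0                 # poor initial calibration
    lr = 0.5
    for _ in range(500):
        r = render(t0) - measured                    # residual vs. measurement
        # Analytic gradient of the renderer w.r.t. t0 (chain rule),
        # standing in for automatic differentiation of a transient renderer
        dr_dt0 = render(t0) * (t - t0) / WIDTH ** 2
        t0 -= lr * np.mean(r * dr_dt0)               # gradient step on 0.5*mean(r^2)

    print(round(t0, 2))  # should converge close to 4.2
    ```

    The same loop structure carries over when `render` is a full differentiable transient renderer and `t0` is replaced by a vector of imaging parameters.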

    Detection and Mapping of Specular Surfaces Using Multibounce Lidar Returns

    We propose methods that use specular, multibounce lidar returns to detect and map specular surfaces that might be invisible to conventional lidar systems relying on direct, single-scatter returns. We derive expressions that relate the time- and angle-of-arrival of these multibounce returns to scattering points on the specular surface, and then use these expressions to formulate techniques for retrieving specular surface geometry when the scene is scanned by a single beam or illuminated with a multi-beam flash. We also consider the special case of transparent specular surfaces, for which surface reflections can be mixed with light that scatters off objects lying behind the surface.
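    The time/angle-of-arrival relations the abstract refers to come from the forward geometry of a specular bounce, which is easy to sketch: a beam hits a planar mirror, reflects by the law of reflection, strikes a diffuse object, and the round-trip time encodes the sensor-to-mirror-to-object path length. The scene below (mirror plane, object position) is an assumed toy setup, not the paper's derivation, which inverts such relations.

    ```python
    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    def ray_plane(o, d, p0, n):
        """Intersection of ray o + t*d with the plane through p0 with normal n."""
        t = np.dot(p0 - o, n) / np.dot(d, n)
        return o + t * d

    def reflect(d, n):
        """Specular (law-of-reflection) bounce of direction d about unit normal n."""
        return d - 2.0 * np.dot(d, n) * n

    sensor = np.array([0.0, 0.0, 0.0])
    beam = np.array([0.0, 0.6, 0.8])                               # unit beam direction
    p0, n = np.array([0.0, 0.0, 5.0]), np.array([0.0, 0.0, -1.0])  # mirror: plane z = 5
    obj = np.array([0.0, 6.0, 2.0])                                # diffuse object, on the reflected ray

    hit = ray_plane(sensor, beam, p0, n)   # specular scattering point on the mirror
    out = reflect(beam, n)                 # reflected beam direction
    path = np.linalg.norm(hit - sensor) + np.linalg.norm(obj - hit)
    tof = 2.0 * path / C                   # round-trip time of the multibounce return
    print(hit, path, tof)                  # hit = (0, 3.75, 5), path = 10 m
    ```

    Inverting this model — recovering `hit` and the mirror plane from measured `tof` and arrival angles — is the retrieval problem the paper formulates for scanned and flash illumination.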

    Optical Nanoimpacts of Dielectric and Metallic Nanoparticles on Gold Surface by Reflectance Microscopy: Adsorption or Bouncing?

    Optical modeling coupled to experiments shows that a microscope operating in reflection mode allows imaging, through solutions or even a microfluidic cover, of various kinds of nanoparticles (NPs) over a (reflecting) sensing surface, here a gold (Au) surface. Optical modeling suggests that this configuration enables the interferometric imaging of single NPs, which can be characterized individually from the local change in surface reflectivity. The interferometric detection improves the optical limit of detection compared to classical configurations exploiting only the light scattered by the NPs. The method is then tested experimentally to monitor, in situ and in real time, the collision of single Brownian NPs, or optical nanoimpacts, with an Au sensing surface. First, mimicking a microfluidic biosensor platform, the capture of 300 nm FeOx maghemite NPs from a convective flow by a surface-functionalized Au surface is dynamically monitored. Then, the adsorption or bouncing of individual dielectric (100 nm polystyrene) or metallic (40 and 60 nm silver) NPs is observed directly through the solution. The influence of the electrolyte on the ability of NPs to repetitively bounce or irreversibly adsorb onto the Au surface is evidenced. Exploiting such a visualization mode of single-NP optical nanoimpacts is insightful for comprehending single-NP electrochemical studies relying on NP collisions on an electrode (electrochemical nanoimpacts).

    Recent advances in transient imaging: A computer graphics and vision perspective

    Transient imaging has recently made a huge impact in the computer graphics and computer vision fields. By capturing, reconstructing, or simulating light transport at extreme temporal resolutions, researchers have proposed novel techniques to show movies of light in motion, see around corners, detect objects in highly scattering media, or infer material properties from a distance, to name a few. The key idea is to leverage the wealth of information in the temporal domain at picosecond or nanosecond resolution, information usually lost during capture-time temporal integration. This paper presents recent advances in this field of transient imaging from a graphics and vision perspective, including capture techniques, analysis, applications, and simulation.

    Non-line-of-sight imaging using phasor-field virtual wave optics

    Non-line-of-sight imaging allows objects to be observed when partially or fully occluded from direct view, by analysing indirect diffuse reflections off a secondary relay surface. Despite many potential applications [1-9], existing methods lack practical usability because of limitations including the assumption of single scattering only, ideal diffuse reflectance, and lack of occlusions within the hidden scene. By contrast, line-of-sight imaging systems do not impose any assumptions about the imaged scene, while relying only on the mathematically simple processes of linear diffractive wave propagation. Here we show that the problem of non-line-of-sight imaging can also be formulated as one of diffractive wave propagation, by introducing a virtual wave field that we term the phasor field. Non-line-of-sight scenes can be imaged from raw time-of-flight data by applying the mathematical operators that model wave propagation in a conventional line-of-sight imaging system. Our method yields a new class of imaging algorithms that mimic the capabilities of line-of-sight cameras. To demonstrate our technique, we derive three imaging algorithms, modelled after three different line-of-sight systems. These algorithms rely on solving a wave diffraction integral, namely the Rayleigh–Sommerfeld diffraction integral. Fast solutions to Rayleigh–Sommerfeld diffraction and its approximations are readily available, benefiting our method. We demonstrate non-line-of-sight imaging of complex scenes with strong multiple scattering and ambient light, arbitrary materials, large depth range, and occlusions. Our method handles these challenging cases without explicitly inverting a light-transport model. We believe that our approach will help to unlock the potential of non-line-of-sight imaging and promote the development of relevant applications not restricted to laboratory conditions.
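    The Rayleigh–Sommerfeld propagation operator at the heart of the phasor-field formulation can be sketched as a direct discrete sum: a monochromatic phasor field sampled on the relay wall is propagated to a hidden-scene point by summing spherical-wave contributions e^(ikr)/r, and back-propagation focuses at the source. The wall grid, virtual wavelength, and source position below are all assumed toy values; real implementations use fast RSD solvers, not this brute-force sum.

    ```python
    import numpy as np

    wavelength = 0.05                 # assumed virtual wavelength, metres
    k = 2.0 * np.pi / wavelength

    # Relay wall: grid of sample points on the plane z = 0
    xs = np.linspace(-0.5, 0.5, 32)
    wall = np.array([[x, y, 0.0] for x in xs for y in xs])

    # Phasor field on the wall, as produced by a point source at `src`;
    # back-propagating it should concentrate energy near `src`.
    src = np.array([0.1, -0.05, 0.6])
    r_src = np.linalg.norm(wall - src, axis=1)
    field = np.exp(1j * k * r_src) / r_src

    def propagate(field, wall, p):
        """Back-propagate the wall field to point p (conjugate spherical kernel)."""
        r = np.linalg.norm(wall - p, axis=1)
        return np.sum(field * np.exp(-1j * k * r) / r)

    at_src = abs(propagate(field, wall, src))
    off_src = abs(propagate(field, wall, src + np.array([0.2, 0.0, 0.0])))
    print(at_src > off_src)  # True: contributions add in phase only at the source
    ```

    Evaluating such sums over a voxel grid is what makes fast Rayleigh–Sommerfeld solvers, mentioned in the abstract, important in practice.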