
    Robust Bayesian target detection algorithm for depth imaging from sparse single-photon data

    This paper presents a new Bayesian model and associated algorithm for depth and intensity profiling using full waveforms from time-correlated single-photon counting (TCSPC) measurements in the limit of very low photon counts (i.e., typically less than 20 photons per pixel). The model represents each Lidar waveform as an unknown constant background level which, in the presence of a target, is combined with a known impulse response weighted by the target intensity, and finally corrupted by Poisson noise. The joint target detection and depth imaging problem is expressed as a pixel-wise model selection and estimation problem, which is solved using Bayesian inference. Prior knowledge about the problem is embedded in a hierarchical model that describes the dependence structure between the model parameters while accounting for their constraints. In particular, Markov random fields (MRFs) are used to model the joint distribution of the background levels and of the target presence labels, which are both expected to exhibit significant spatial correlations. An adaptive Markov chain Monte Carlo algorithm including reversible-jump updates is then proposed to compute the Bayesian estimates of interest. This algorithm is equipped with a stochastic optimization adaptation mechanism that automatically adjusts the parameters of the MRFs by maximum marginal likelihood estimation. Finally, the benefits of the proposed methodology are demonstrated through a series of experiments using real data.
    Comment: arXiv admin note: text overlap with arXiv:1507.0251
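    To make the per-pixel observation model above concrete, the following is a minimal sketch of how a single TCSPC pixel histogram could be simulated under that model: a constant background plus, when a target is present, a known impulse response weighted by the target intensity, all corrupted by Poisson noise. The Gaussian pulse shape, the function name `simulate_pixel_histogram`, and every parameter value are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate_pixel_histogram(depth_bin, intensity, background, n_bins=300,
                             pulse_width=4.0, target_present=True, rng=None):
    """Simulate one TCSPC pixel histogram under the model described above:
    a constant background level plus, when a target is present, a known
    impulse response (here assumed Gaussian) weighted by the target
    intensity, all corrupted by Poisson noise."""
    rng = np.random.default_rng() if rng is None else rng
    t = np.arange(n_bins)
    # Known impulse response, centred on the target depth bin and normalised
    # to unit area so that `intensity` is the expected number of signal photons.
    h = np.exp(-0.5 * ((t - depth_bin) / pulse_width) ** 2)
    h /= h.sum()
    rate = np.full(n_bins, float(background))  # background photons per bin
    if target_present:
        rate += intensity * h
    return rng.poisson(rate)

# Example: a sparse-photon pixel collecting on the order of 20 photons.
hist = simulate_pixel_histogram(depth_bin=120, intensity=15.0, background=0.02)
print(hist.sum(), "photons detected in this pixel")
```

    With the values used here, the pixel collects on the order of twenty photons in total, i.e. the sparse-photon regime targeted by the paper.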

    Quantum-inspired computational imaging

    Computational imaging combines measurement and computational methods with the aim of forming images even when the measurement conditions are weak, few in number, or highly indirect. The recent surge in quantum-inspired imaging sensors, together with a new wave of algorithms allowing on-chip, scalable and robust data processing, has led to increased activity, with notable results in the domain of low-light flux imaging and sensing. We provide an overview of the major challenges encountered in low-illumination (e.g., ultrafast) imaging and how these problems have recently been addressed for imaging applications in extreme conditions. These methods provide examples of the future imaging solutions to be developed, for which the best results are expected to arise from an efficient codesign of the sensors and data analysis tools.

    Y.A. acknowledges support from the UK Royal Academy of Engineering under the Research Fellowship Scheme (RF201617/16/31). S.McL. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grant EP/J015180/1). V.G. acknowledges support from the U.S. Defense Advanced Research Projects Agency (DARPA) InPho program through U.S. Army Research Office award W911NF-10-1-0404, the U.S. DARPA REVEAL program through contract HR0011-16-C-0030, and the U.S. National Science Foundation through grants 1161413 and 1422034. A.H. acknowledges support from U.S. Army Research Office award W911NF-15-1-0479, U.S. Department of the Air Force grant FA8650-15-D-1845, and U.S. Department of Energy National Nuclear Security Administration grant DE-NA0002534. D.F. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grants EP/M006514/1 and EP/M01326X/1).

    3D Target Detection and Spectral Classification for Single-photon LiDAR Data

    3D single-photon LiDAR imaging has an important role in many applications. However, full deployment of this modality will require the analysis of low signal-to-noise ratio target returns and a very high volume of data. This is particularly evident when imaging through obscurants or in high ambient background light conditions. This paper proposes a multiscale approach for 3D surface detection from the photon timing histogram to permit a significant reduction in data volume. The resulting surfaces are background-free and can be used to infer depth and reflectivity information about the target. We demonstrate this by proposing a hierarchical Bayesian model for 3D reconstruction and spectral classification of multispectral single-photon LiDAR data. The reconstruction method promotes spatial correlation between point-cloud estimates and uses a coordinate gradient descent algorithm for parameter estimation. Results on simulated and real data show the benefits of the proposed target detection and reconstruction approaches when compared to state-of-the-art processing algorithms.
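    As a rough illustration of how a multiscale test on the photon timing histogram might reduce the data volume, the sketch below aggregates the histogram at several scales and keeps only bins that exceed a Poisson-motivated threshold at every scale. The scales, the median-based background estimate and the thresholding rule are assumptions chosen for illustration; they are not the detection procedure used in the paper.

```python
import numpy as np

def multiscale_surface_detect(hist, scales=(1, 2, 4, 8), sigma=3.0):
    """Flag candidate surface bins in a photon timing histogram by requiring
    that the counts exceed a Poisson-motivated threshold at every scale.
    The scales, median background estimate and threshold rule are
    illustrative assumptions, not the test used in the paper."""
    hist = np.asarray(hist)
    n = hist.size
    flags = np.ones(n, dtype=bool)
    for s in scales:
        # Aggregate counts over non-overlapping windows of width s.
        pad = (-n) % s
        coarse = np.pad(hist, (0, pad)).reshape(-1, s).sum(axis=1)
        # Robust background estimate from the coarse histogram.
        bkg = np.median(coarse)
        thr = bkg + sigma * np.sqrt(max(bkg, 1.0))
        flags &= np.repeat(coarse > thr, s)[:n]
    return np.flatnonzero(flags)  # indices of candidate surface bins
```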

    Lidar waveform based analysis of depth images constructed using sparse single-photon data

    This paper presents a new Bayesian model and algorithm for depth and intensity profiling using full waveforms from time-correlated single-photon counting (TCSPC) measurements in the limit of very low photon counts. The proposed model represents each Lidar waveform as a combination of a known impulse response, weighted by the target intensity, and an unknown constant background, corrupted by Poisson noise. Prior knowledge about the problem is embedded in a hierarchical model that describes the dependence structure between the model parameters and their constraints. In particular, a gamma Markov random field (MRF) is used to model the joint distribution of the target intensities, and a second MRF is used to model the distribution of the target depths, both of which are expected to exhibit significant spatial correlations. An adaptive Markov chain Monte Carlo algorithm is then proposed to perform Bayesian inference and compute the Bayesian estimates of interest. This algorithm is equipped with a stochastic optimization adaptation mechanism that automatically adjusts the parameters of the MRFs by maximum marginal likelihood estimation. Finally, the benefits of the proposed methodology are demonstrated through a series of experiments using real data.
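    The data-fidelity term underlying this model is the per-pixel Poisson likelihood of the histogram given a depth, an intensity and a background level. The sketch below writes out that negative log-likelihood (up to an additive constant); the gamma-MRF priors and the adaptive MCMC updates described above are deliberately omitted, and the helper name `neg_log_likelihood` is purely illustrative.

```python
import numpy as np

def neg_log_likelihood(hist, impulse_response, depth_bin, intensity, background):
    """Per-pixel Poisson negative log-likelihood (up to an additive constant)
    for the waveform model above: lambda_t = intensity * h(t - depth_bin)
    + background, with background assumed strictly positive. The gamma-MRF
    priors and MCMC updates described in the paper are not included."""
    hist = np.asarray(hist, dtype=float)
    n = hist.size
    h = np.asarray(impulse_response, dtype=float)
    # Shift the known impulse response so its peak sits at the candidate
    # depth bin.
    h = np.roll(h, depth_bin - int(np.argmax(h)))[:n]
    lam = intensity * h + background
    # -log p(hist | lam), dropping the log-factorial term (parameter-free).
    return float(np.sum(lam - hist * np.log(lam)))
```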

    Fast Surface Detection Using Single-Photon Detection Events


    Fast surface detection in single-photon Lidar waveforms


    Robust 3D Reconstruction of Dynamic Scenes From Single-Photon Lidar Using Beta-Divergences

    In this paper, we present a new algorithm for fast, online 3D reconstruction of dynamic scenes using times of arrival of photons recorded by single-photon detector arrays. One of the main challenges in 3D imaging using single-photon lidar in practical applications is the presence of strong ambient illumination, which corrupts the data and can jeopardize the detection of peaks/surfaces in the signals. This background noise complicates not only the observation model classically used for 3D reconstruction but also the estimation procedure, which requires iterative methods. In this work, we consider a new similarity measure for robust depth estimation, which allows us to use a simple observation model and a non-iterative estimation procedure while being robust to mis-specification of the background illumination model. This choice leads to a computationally attractive depth estimation procedure without significant degradation of the reconstruction performance. The new depth estimation procedure is coupled with a spatio-temporal model to capture the natural correlation between neighboring pixels and successive frames for dynamic scene analysis. The resulting online inference process is scalable and well suited for parallel implementation. The benefits of the proposed method are demonstrated through a series of experiments conducted with simulated and real single-photon lidar videos, allowing the analysis of dynamic scenes at 325 m observed under extreme ambient illumination conditions.
    Comment: 12 pages
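    To illustrate the kind of similarity measure involved, the sketch below uses the beta-divergence (which recovers the Kullback-Leibler divergence as beta tends to 1 and a scaled squared Euclidean distance at beta = 2) to score candidate depth bins and keeps the minimiser, with no iterative optimisation. The crude template scaling, the choice beta = 1.5 and the function names are assumptions for illustration; this is not the exact estimator or parameterisation used in the paper.

```python
import numpy as np

def beta_divergence(x, y, beta=1.5, eps=1e-12):
    """Beta-divergence D_beta(x || y) for beta outside {0, 1}: beta = 2 gives
    (half) the squared Euclidean distance, beta -> 1 the KL divergence."""
    x = np.asarray(x, dtype=float) + eps
    y = np.asarray(y, dtype=float) + eps
    return float(np.sum(x**beta + (beta - 1.0) * y**beta
                        - beta * x * y**(beta - 1.0)) / (beta * (beta - 1.0)))

def robust_depth_estimate(hist, impulse_response, beta=1.5):
    """Non-iterative depth estimate: slide the known impulse response over all
    candidate depth bins and keep the shift that minimises the beta-divergence
    to the observed histogram. The amplitude matching and beta = 1.5 are
    assumptions for illustration only."""
    hist = np.asarray(hist, dtype=float)
    template = np.asarray(impulse_response, dtype=float)
    n = hist.size
    scale = max(hist.sum(), 1.0)  # crude amplitude matching (assumption)
    costs = [beta_divergence(hist, scale * np.roll(template, d)[:n], beta)
             for d in range(n)]
    return int(np.argmin(costs))
```

    Because each pixel is scored independently and no iterations are required, such an estimator is straightforward to parallelise, in line with the scalability claims above.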

    Range estimation from single-photon Lidar data using a stochastic EM approach
