2,058 research outputs found

    Spectral unmixing of Multispectral Lidar signals

    In this paper, we present a Bayesian approach for spectral unmixing of multispectral Lidar (MSL) data associated with surface reflection from targeted surfaces composed of several known materials. The problem addressed is the estimation of the positions and area distribution of each material. In the Bayesian framework, appropriate prior distributions are assigned to the unknown model parameters and a Markov chain Monte Carlo method is used to sample the resulting posterior distribution. The performance of the proposed algorithm is evaluated using synthetic MSL signals, for which single and multi-layered models are derived. To evaluate the expected estimation performance associated with MSL signal analysis, a Cramér-Rao lower bound associated with the model considered is also derived and compared with the experimental data. Both the theoretical lower bound and the experimental analysis will be of primary assistance in future instrument design.
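
    As a rough illustration of the kind of observation model such an approach builds on, the sketch below (in Python, with made-up spectral signatures, a Gaussian stand-in for the instrument impulse response, and a single reflecting surface) assembles the spectral mixing, pulse position and background into a Poisson likelihood; the paper's actual priors, sampler and Cramér-Rao derivation are not reproduced here.

        import numpy as np

        # Illustrative setup: 4 wavelengths, 2 known materials, a single reflecting surface.
        rng = np.random.default_rng(0)
        T = 300                                                  # histogram bins per wavelength
        signatures = np.array([[0.3, 0.5, 0.6, 0.7],             # made-up spectral signatures,
                               [0.6, 0.4, 0.2, 0.8]])            # one row per material
        areas = np.array([0.7, 0.3])                             # area distribution (sums to 1)
        t0, sigma, background = 120.0, 4.0, 0.5                  # surface position, pulse width, background

        def expected_counts(areas, t0):
            """Mean photon counts per (wavelength, time bin): spectral mixing + shifted pulse + background."""
            t = np.arange(T)
            g = np.exp(-0.5 * ((t - t0) / sigma) ** 2)           # Gaussian stand-in for the impulse response
            return 50.0 * np.outer(areas @ signatures, g) + background

        def log_likelihood(y, areas, t0):
            """Poisson log-likelihood (up to an additive constant) of the observed MSL waveforms."""
            lam = expected_counts(areas, t0)
            return np.sum(y * np.log(lam) - lam)

        # Simulate synthetic MSL signals and evaluate the likelihood at the true parameters.
        y = rng.poisson(expected_counts(areas, t0))
        print(log_likelihood(y, areas, t0))

    In the Bayesian treatment described in the abstract, priors on the area distribution and surface position would be combined with a likelihood of this form and explored by MCMC.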

    Lidar waveform based analysis of depth images constructed using sparse single-photon data

    This paper presents a new Bayesian model and associated algorithm for depth and intensity profiling using full waveforms from time-correlated single-photon counting (TCSPC) measurements in the limit of very low photon counts. The proposed model represents each Lidar waveform as a combination of a known impulse response, weighted by the target intensity, and an unknown constant background, corrupted by Poisson noise. Prior knowledge about the problem is embedded in a hierarchical model that describes the dependence structure between the model parameters and their constraints. In particular, a gamma Markov random field (MRF) is used to model the joint distribution of the target intensity, and a second MRF is used to model the distribution of the target depth; both are expected to exhibit significant spatial correlations. An adaptive Markov chain Monte Carlo algorithm is then proposed to compute the Bayesian estimates of interest and perform Bayesian inference. This algorithm is equipped with a stochastic optimization adaptation mechanism that automatically adjusts the parameters of the MRFs by maximum marginal likelihood estimation. Finally, the benefits of the proposed methodology are demonstrated through a series of experiments using real data.
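
    A toy, single-pixel sketch of this observation model (intensity-weighted impulse response plus a constant background, corrupted by Poisson noise) with a plain random-walk Metropolis step over depth is given below. All numbers are illustrative, flat priors are assumed, and the hierarchical gamma-MRF priors and adaptive tuning of the actual algorithm are not reproduced.

        import numpy as np

        rng = np.random.default_rng(1)
        T = 200
        t = np.arange(T)
        h = np.exp(-0.5 * ((t - T // 2) / 3.0) ** 2)             # known impulse response, centred template

        def mean_waveform(depth, intensity, background):
            """Impulse response shifted to `depth`, weighted by intensity, plus a constant background."""
            shifted = np.interp(t, t + (depth - T // 2), h, left=0.0, right=0.0)
            return intensity * shifted + background

        def log_posterior(y, depth, intensity, background):
            """Poisson log-likelihood; flat priors assumed in this sketch (no gamma MRFs)."""
            lam = mean_waveform(depth, intensity, background)
            return np.sum(y * np.log(lam) - lam)

        # Simulate a sparse single-photon waveform (very low counts), then sample the depth.
        y = rng.poisson(mean_waveform(depth=80.0, intensity=1.0, background=0.02))

        depth, lp = 100.0, log_posterior(y, 100.0, 1.0, 0.02)
        for _ in range(5000):                                    # plain random-walk Metropolis over depth
            proposal = depth + rng.normal(scale=3.0)
            lp_prop = log_posterior(y, proposal, 1.0, 0.02)
            if np.log(rng.uniform()) < lp_prop - lp:
                depth, lp = proposal, lp_prop
        print("final depth sample:", round(depth, 1))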

    Robust Bayesian target detection algorithm for depth imaging from sparse single-photon data

    This paper presents a new Bayesian model and associated algorithm for depth and intensity profiling using full waveforms from time-correlated single-photon counting (TCSPC) measurements in the limit of very low photon counts (i.e., typically less than 20 photons per pixel). The model represents each Lidar waveform as an unknown constant background level which, in the presence of a target, is combined with a known impulse response weighted by the target intensity, and finally corrupted by Poisson noise. The joint target detection and depth imaging problem is expressed as a pixel-wise model selection and estimation problem which is solved using Bayesian inference. Prior knowledge about the problem is embedded in a hierarchical model that describes the dependence structure between the model parameters while accounting for their constraints. In particular, Markov random fields (MRFs) are used to model the joint distribution of the background levels and of the target presence labels, which are both expected to exhibit significant spatial correlations. An adaptive Markov chain Monte Carlo algorithm including reversible-jump updates is then proposed to compute the Bayesian estimates of interest. This algorithm is equipped with a stochastic optimization adaptation mechanism that automatically adjusts the parameters of the MRFs by maximum marginal likelihood estimation. Finally, the benefits of the proposed methodology are demonstrated through a series of experiments using real data.
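
    The pixel-wise model selection can be caricatured as choosing between a background-only model and a background-plus-target model. The sketch below uses a crude profile likelihood and a BIC-style penalty in place of the reversible-jump MCMC and spatial MRF coupling of the actual method; every quantity is illustrative.

        import numpy as np

        rng = np.random.default_rng(2)
        T = 150
        t = np.arange(T)
        h = np.exp(-0.5 * ((t - T // 2) / 3.0) ** 2)             # known impulse response, peak at T//2

        def poisson_loglik(y, lam):
            return np.sum(y * np.log(lam) - lam)

        def best_target_fit(y):
            """Crude profile likelihood over depth; intensity and background set by simple moment matching."""
            best_ll, best_d = -np.inf, None
            for d in range(T):
                shifted = np.roll(h, d - T // 2)                 # pulse centred on candidate depth d
                off_pulse = shifted < 0.01
                b = max(float(y[off_pulse].mean()), 1e-3)        # background estimated from off-pulse bins
                r = max(float((y[~off_pulse] - b).sum()) / shifted[~off_pulse].sum(), 1e-3)
                ll = poisson_loglik(y, r * shifted + b)
                if ll > best_ll:
                    best_ll, best_d = ll, d
            return best_ll, best_d

        # Simulate one pixel containing a genuine target return on top of a low background.
        true_depth = 60
        y = rng.poisson(1.5 * np.roll(h, true_depth - T // 2) + 0.05)

        ll0 = poisson_loglik(y, np.full(T, max(float(y.mean()), 1e-3)))   # background-only model
        ll1, depth = best_target_fit(y)                                   # background + target model
        # BIC-style penalty for the two extra parameters (depth, intensity) of the target model.
        print("target present:", 2 * (ll1 - ll0) > 2 * np.log(T), "estimated depth:", depth)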

    Full waveform analysis for long-range 3D imaging laser radar

    The new generation of 3D imaging systems based on laser radar (ladar) offers significant advantages in defense and security applications. In particular, it is possible to retrieve 3D shape information directly from the scene and separate a target from background or foreground clutter by extracting a narrow depth range from the field of view by range gating, either in the sensor or by postprocessing. We discuss and demonstrate the applicability of full-waveform ladar to produce multilayer 3D imagery, in which each pixel produces a complex temporal response that describes the scene structure. Such complexity, caused by multiple and distributed reflections, arises in many relevant scenarios, for example when viewing partially occluded targets, through semitransparent materials (e.g., windows), or through distributed reflective media such as foliage. We demonstrate our methodology on 3D image data acquired by a scanning time-of-flight system, developed in our own laboratories, which uses the time-correlated single-photon counting technique.
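
    Range gating, as used above, amounts to keeping only the portion of each recorded waveform that falls inside a chosen depth window. A minimal sketch, with an assumed 50 ps bin width purely for illustration:

        import numpy as np

        C = 3e8                                    # speed of light, m/s
        BIN_WIDTH_S = 50e-12                       # illustrative TCSPC bin width (50 ps)

        def range_gate(waveform, min_range_m, max_range_m):
            """Zero out all bins outside the [min_range, max_range] depth window."""
            bins = np.arange(waveform.size)
            ranges = bins * BIN_WIDTH_S * C / 2.0  # two-way time of flight -> range
            return np.where((ranges >= min_range_m) & (ranges <= max_range_m), waveform, 0)

        # Example: keep only returns between 10 m and 12 m.
        waveform = np.random.default_rng(3).poisson(0.1, size=4000)
        print(range_gate(waveform, 10.0, 12.0).sum())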

    Recovery of forest canopy parameters by inversion of multispectral LiDAR data

    We describe the use of Bayesian inference techniques, notably Markov chain Monte Carlo (MCMC) and reversible jump MCMC (RJMCMC) methods, to recover forest structural and biochemical parameters from multispectral LiDAR (Light Detection and Ranging) data. We use a variable-dimension, multi-layered model to represent a forest canopy or tree, and discuss the recovery of structure and depth profiles that relate to photochemical properties. We first demonstrate how simple vegetation indices such as the Normalized Difference Vegetation Index (NDVI), which relates to canopy biomass and light absorption, and the Photochemical Reflectance Index (PRI), which is a measure of vegetation light use efficiency, can be measured from multispectral data. We further describe and demonstrate our layered approach on single-wavelength real data, and on simulated multispectral data derived from real, rather than simulated, data sets. This evaluation shows successful recovery of a subset of parameters, as the complete recovery problem is ill-posed with the available data. We conclude that the approach has promise, and suggest future developments to address the current difficulties in parameter inversion.
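
    Both indices are simple band ratios, so they can be computed directly from reflectance estimates at the relevant wavelengths (red and near-infrared for NDVI; 531 nm and 570 nm for PRI). A minimal sketch with made-up reflectance values:

        import numpy as np

        def ndvi(nir, red):
            """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
            return (nir - red) / (nir + red)

        def pri(r531, r570):
            """Photochemical Reflectance Index: (R531 - R570) / (R531 + R570)."""
            return (r531 - r570) / (r531 + r570)

        # Illustrative multispectral LiDAR reflectance estimates for one footprint.
        reflectance = {531: 0.08, 570: 0.10, 670: 0.05, 780: 0.45}   # made-up values
        print(ndvi(reflectance[780], reflectance[670]), pri(reflectance[531], reflectance[570]))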

    3D Target Detection and Spectral Classification for Single-photon LiDAR Data

    3D single-photon LiDAR imaging plays an important role in many applications. However, full deployment of this modality will require the analysis of low signal-to-noise-ratio target returns and very high volumes of data. This is particularly evident when imaging through obscurants or in high ambient background light conditions. This paper proposes a multiscale approach for 3D surface detection from the photon timing histogram to permit a significant reduction in data volume. The resulting surfaces are background-free and can be used to infer depth and reflectivity information about the target. We demonstrate this by proposing a hierarchical Bayesian model for 3D reconstruction and spectral classification of multispectral single-photon LiDAR data. The reconstruction method promotes spatial correlation between point-cloud estimates and uses a coordinate gradient descent algorithm for parameter estimation. Results on simulated and real data show the benefits of the proposed target detection and reconstruction approaches when compared to state-of-the-art processing algorithms.
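
    The multiscale idea can be caricatured as a coarse-to-fine search over the photon timing histogram: aggregate bins, keep the strongest window, refine, and only accept a surface if the final window clearly exceeds the background level. The sketch below is a simplified single-pixel illustration; the actual multiscale method, its spatial regularization and the coordinate gradient descent reconstruction are not reproduced.

        import numpy as np

        rng = np.random.default_rng(4)
        T = 1024
        hist = rng.poisson(0.05, size=T)                 # ambient background photons
        hist[600:606] += rng.poisson(3.0, size=6)        # photons from a genuine surface

        def detect_surface(hist, levels=4, factor=4, thresh=5.0):
            """Coarse-to-fine search: aggregate bins, keep the strongest window, refine."""
            lo, hi = 0, hist.size
            for level in range(levels, 0, -1):
                width = factor ** level
                edges = np.arange(lo, hi, width)
                sums = np.array([hist[e:e + width].sum() for e in edges])
                best = edges[np.argmax(sums)]
                lo, hi = best, min(best + width, hist.size)
            window = hist[lo:hi]
            background = np.median(hist)
            # Declare a detection only if the window clearly exceeds the background level.
            if window.sum() > thresh * max(background * window.size, 1.0):
                return lo + int(np.argmax(window))       # coarse depth-bin estimate
            return None                                   # background-only pixel

        print(detect_surface(hist))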

    Quantum-inspired computational imaging

    Computational imaging combines measurement and computational methods with the aim of forming images even when the measurement conditions are weak, few in number, or highly indirect. The recent surge in quantum-inspired imaging sensors, together with a new wave of algorithms allowing on-chip, scalable and robust data processing, has led to an increase in activity with notable results in the domain of low-light flux imaging and sensing. We provide an overview of the major challenges encountered in low-illumination (e.g., ultrafast) imaging and how these problems have recently been addressed for imaging applications in extreme conditions. These methods provide examples of the future imaging solutions to be developed, for which the best results are expected to arise from an efficient codesign of the sensors and data analysis tools.
    Y.A. acknowledges support from the UK Royal Academy of Engineering under the Research Fellowship Scheme (RF201617/16/31). S.McL. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grant EP/J015180/1). V.G. acknowledges support from the U.S. Defense Advanced Research Projects Agency (DARPA) InPho program through U.S. Army Research Office award W911NF-10-1-0404, the U.S. DARPA REVEAL program through contract HR0011-16-C-0030, and the U.S. National Science Foundation through grants 1161413 and 1422034. A.H. acknowledges support from U.S. Army Research Office award W911NF-15-1-0479, U.S. Department of the Air Force grant FA8650-15-D-1845, and U.S. Department of Energy National Nuclear Security Administration grant DE-NA0002534. D.F. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grants EP/M006514/1 and EP/M01326X/1).

    Multiple return separation for a full-field ranger via continuous waveform modelling

    We present two novel Poisson noise maximum-likelihood-based methods for identifying the individual returns within mixed pixels for Amplitude Modulated Continuous Wave (AMCW) rangers. These methods use the convolutional relationship between signal returns and the recorded data to determine the number, range and intensity of the returns within a pixel. One method relies on a continuous piecewise truncated-triangle model for the beat waveform and the other on linear interpolation between translated versions of a sampled waveform. In the single-return case both methods provide an improvement in ranging precision over standard Fourier-transform-based methods and a decrease in overall error in almost every case. We find that it is possible to discriminate between two light sources within a pixel, but local minima and scattered light have a significant impact on ranging precision. Discrimination of two returns requires the ability to take samples at less than 90° phase shifts.
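
    The convolutional mixed-pixel model can be sketched as a constant offset plus one translated copy of the beat waveform per return, observed under Poisson noise, with intensity and range following from the fitted amplitude and phase. The toy example below uses an idealized triangular waveform and a brute-force grid over one return's phase; it is an illustration only, not the paper's continuous-waveform estimators.

        import numpy as np

        rng = np.random.default_rng(5)
        N = 64                                                   # samples per modulation period

        def triangle(phase_shift):
            """Idealized triangular beat waveform (unit amplitude) translated by `phase_shift` samples."""
            x = (np.arange(N) - phase_shift) % N
            return 1.0 - np.abs(x - N / 2) / (N / 2)

        def expected(amps, phases, offset):
            """Mixed-pixel model: constant offset plus one translated waveform per return."""
            return offset + sum(a * triangle(p) for a, p in zip(amps, phases))

        def poisson_ll(y, lam):
            return np.sum(y * np.log(lam) - lam)

        # Two returns in one pixel, e.g. a footprint straddling two surfaces at different ranges.
        y = rng.poisson(expected([40.0, 15.0], [10.0, 30.0], offset=5.0))

        # Brute-force ML over the second return's phase (first return held fixed, for illustration only).
        grid = np.arange(0.0, N, 0.5)
        lls = [poisson_ll(y, expected([40.0, 15.0], [10.0, p], 5.0)) for p in grid]
        print("estimated phase shift of second return:", grid[int(np.argmax(lls))])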

    Full waveform LiDAR for adverse weather conditions
