Photon-Efficient Computational 3D and Reflectivity Imaging with Single-Photon Detectors
Capturing depth and reflectivity images at low light levels from active
illumination of a scene has wide-ranging applications. Conventionally, even
with single-photon detectors, hundreds of photon detections are needed at each
pixel to mitigate Poisson noise. We develop a robust method for estimating
depth and reflectivity using on the order of 1 detected photon per pixel
averaged over the scene. Our computational imager combines physically accurate
single-photon counting statistics with exploitation of the spatial correlations
present in real-world reflectivity and 3D structure. Experiments conducted in
the presence of strong background light demonstrate that our computational
imager is able to accurately recover scene depth and reflectivity, while
traditional maximum-likelihood based imaging methods lead to estimates that are
highly noisy. Our framework increases photon efficiency 100-fold over
traditional processing and also modestly improves upon first-photon imaging
under a total acquisition time constraint in raster-scanned operation. Thus our
new imager will be useful for rapid, low-power, and noise-tolerant active
optical imaging, and its fixed dwell time will facilitate parallelization
through use of a detector array.
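To illustrate why exploiting spatial correlations matters at roughly 1 detected photon per pixel, the following toy sketch (not the authors' algorithm; the scene, trial count, and the 3x3 median filter used as a stand-in regularizer are all invented) compares the pixelwise maximum-likelihood reflectivity estimate under Poisson counting statistics with a crudely smoothed version:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy scene: constant reflectivity 0.5 on a 32x32 grid.
true_refl = np.full((32, 32), 0.5)
n_pulses = 2  # so few trials that the mean count is ~1 photon per pixel

# Photon counts are Poisson; the pixelwise ML reflectivity estimate is
# simply counts / trials, and it is dominated by shot noise.
counts = rng.poisson(true_refl * n_pulses)
ml_est = counts / n_pulses

def median3x3(img):
    """3x3 median filter (borders left unfiltered): a crude stand-in for
    exploiting spatial correlations in real-world scenes."""
    out = img.astype(float).copy()
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            out[i, j] = np.median(img[i-1:i+2, j-1:j+2])
    return out

reg_est = median3x3(ml_est)

mse_ml = float(np.mean((ml_est - true_refl) ** 2))
mse_reg = float(np.mean((reg_est - true_refl) ** 2))
print(mse_ml, mse_reg)  # the regularized estimate has much lower MSE
```

Even this naive spatial prior cuts the mean-square error several-fold at this photon level, which is the effect the paper's physically accurate framework exploits far more carefully.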
Computational multi-depth single-photon imaging
We present an imaging framework that is able to accurately reconstruct multiple depths at individual pixels from single-photon observations. Our active imaging method models the single-photon detection statistics from multiple reflectors within a pixel, and it also exploits the fact that a multi-depth profile at each pixel can be expressed as a sparse signal. We interpret the multi-depth reconstruction problem as a sparse deconvolution problem using single-photon observations, create a convex problem through discretization and relaxation, and use a modified iterative shrinkage-thresholding algorithm to efficiently solve for the optimal multi-depth solution. We experimentally demonstrate that the proposed framework is able to accurately reconstruct the depth features of an object that is behind a partially reflecting scatterer and 4 m away from the imager, with a root-mean-square error of 11 cm, using only 19 signal photon detections per pixel in the presence of moderate background light. In terms of root-mean-square error, this is a factor-of-4.2 improvement over the conventional method of Gaussian-mixture fitting for multi-depth recovery.

This material is based upon work supported in part by a Samsung Scholarship, the US National Science Foundation under Grant No. 1422034, and the MIT Lincoln Laboratory Advanced Concepts Committee. We thank Dheera Venkatraman for his assistance with the experiments.
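The sparse-deconvolution step can be illustrated with a minimal iterative shrinkage-thresholding (ISTA) sketch. This is a generic solver, not the paper's modified algorithm: additive Gaussian noise stands in for the single-photon Poisson model, and the pulse shape, bin count, and regularization weight are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# 100 discretized depth bins; the true profile has two reflectors.
n = 100
x_true = np.zeros(n)
x_true[30], x_true[70] = 1.0, 0.6

# Known impulse response (Gaussian pulse) written as a convolution matrix.
offsets = np.arange(-10, 11)
pulse = np.exp(-offsets**2 / 8.0)
A = np.zeros((n, n))
for i in range(n):
    for k, d in enumerate(offsets):
        if 0 <= i + d < n:
            A[i, i + d] = pulse[k]

# Observation: blurred two-reflector profile plus noise (Gaussian here,
# for brevity, in place of the paper's single-photon model).
y = A @ x_true + 0.05 * rng.standard_normal(n)

# ISTA: gradient step on the least-squares term, then soft-thresholding.
step = 1.0 / np.linalg.norm(A, 2) ** 2
lam = 0.1
x = np.zeros(n)
for _ in range(1000):
    z = x + step * (A.T @ (y - A @ x))
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

# Locate the two strongest recovered reflectors.
p1 = int(np.argmax(x))
masked = x.copy()
masked[max(p1 - 5, 0):p1 + 6] = 0.0
p2 = int(np.argmax(masked))
print(sorted([p1, p2]))  # should land near bins 30 and 70
```

The soft-threshold step is what enforces the sparsity prior on the per-pixel depth profile that the abstract describes.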
Revealing hidden scenes by photon-efficient occlusion-based opportunistic active imaging
The ability to see around corners, i.e., recover details of a hidden scene
from its reflections in the surrounding environment, is of considerable
interest in a wide range of applications. However, the diffuse nature of light
reflected from typical surfaces leads to mixing of spatial information in the
collected light, precluding useful scene reconstruction. Here, we employ a
computational imaging technique that opportunistically exploits the presence of
occluding objects, which obstruct probe-light propagation in the hidden scene,
to undo the mixing and greatly improve scene recovery. Importantly, our
technique obviates the need for the ultrafast time-of-flight measurements
employed by most previous approaches to hidden-scene imaging. Moreover, it does
so in a photon-efficient manner based on an accurate forward model and a
computational algorithm that, together, respect the physics of three-bounce
light propagation and single-photon detection. Using our methodology, we
demonstrate reconstruction of hidden-surface reflectivity patterns in a
meter-scale environment from non-time-resolved measurements. Ultimately, our
technique represents an instance of a rich and promising new imaging modality
with important potential implications for imaging science. (Related theory in arXiv:1711.0629.)
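A toy 1D version of the occlusion idea: without an occluder, every wall point collects a near-uniform mixture of light from the hidden scene, so the forward matrix is nearly rank one and inversion fails; an occluder gives each wall point a distinct visibility pattern, making the inverse problem well posed. The dimensions, the random visibility model, and the ridge-regularized least-squares solver below are all illustrative stand-ins, not the paper's forward model or algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)

# 1D toy: a hidden scene of 16 patches observed via 32 spots on a visible
# wall. Each wall spot integrates light from every unoccluded patch.
n_scene, n_wall = 16, 32
x_true = np.zeros(n_scene)
x_true[4:7] = 1.0  # a small bright feature in the hidden scene

# Without occlusion the mixing matrix is rank one: inversion is hopeless.
V_open = np.ones((n_wall, n_scene))
# An occluder blocks a different subset of paths for each wall spot.
V_occ = (rng.random((n_wall, n_scene)) > 0.5).astype(float)

def recover(V):
    y = V @ x_true + 0.01 * rng.standard_normal(n_wall)
    # Ridge-regularized least squares, a stand-in for the paper's solver.
    return np.linalg.solve(V.T @ V + 1e-3 * np.eye(n_scene), V.T @ y)

err_open = float(np.linalg.norm(recover(V_open) - x_true))
err_occ = float(np.linalg.norm(recover(V_occ) - x_true))
print(err_open, err_occ)  # occlusion makes the inversion well posed
```

Note that no timing information is used anywhere in this sketch, matching the abstract's claim that the technique works from non-time-resolved measurements.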
Quantum-inspired computational imaging
Computational imaging combines measurement and computational methods with the aim of forming images even when the measurement conditions are weak, few in number, or highly indirect. The recent surge in quantum-inspired imaging sensors, together with a new wave of algorithms allowing on-chip, scalable and robust data processing, has induced an increase of activity with notable results in the domain of low-light flux imaging and sensing. We provide an overview of the major challenges encountered in low-illumination (e.g., ultrafast) imaging and how these problems have recently been addressed for imaging applications in extreme conditions. These methods provide examples of the future imaging solutions to be developed, for which the best results are expected to arise from an efficient codesign of the sensors and data analysis tools.

Y.A. acknowledges support from the UK Royal Academy of Engineering under the Research Fellowship Scheme (RF201617/16/31). S.McL. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grant EP/J015180/1). V.G. acknowledges support from the U.S. Defense Advanced Research Projects Agency (DARPA) InPho program through U.S. Army Research Office award W911NF-10-1-0404, the U.S. DARPA REVEAL program through contract HR0011-16-C-0030, and the U.S. National Science Foundation through grants 1161413 and 1422034. A.H. acknowledges support from U.S. Army Research Office award W911NF-15-1-0479, U.S. Department of the Air Force grant FA8650-15-D-1845, and U.S. Department of Energy National Nuclear Security Administration grant DE-NA0002534. D.F. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grants EP/M006514/1 and EP/M01326X/1).
A Few Photons Among Many: Unmixing Signal and Noise for Photon-Efficient Active Imaging
Conventional LIDAR systems require hundreds or thousands of photon detections
to form accurate depth and reflectivity images. Recent photon-efficient
computational imaging methods are remarkably effective with only 1.0 to 3.0
detected photons per pixel, but they have not been demonstrated at
signal-to-background ratio (SBR) below 1.0 because their imaging accuracies
degrade significantly in the presence of high background noise. We introduce a
new approach to depth and reflectivity estimation that focuses on unmixing
contributions from signal and noise sources. At each pixel in an image,
short-duration range gates are adaptively determined and applied to remove
detections likely to be due to noise. For pixels with too few detections to
perform this censoring accurately, we borrow data from neighboring pixels to
improve depth estimates, where the neighborhood formation is also adaptive to
scene content. Algorithm performance is demonstrated on experimental data at
varying levels of noise. Results show improved performance of both reflectivity
and depth estimates over state-of-the-art methods, especially at low
signal-to-background ratios. In particular, accurate imaging is demonstrated
with SBR as low as 0.04. This validation of a photon-efficient, noise-tolerant
method demonstrates the viability of rapid, long-range, and low-power LIDAR
imaging.
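The censoring idea can be sketched per pixel: photon arrival times are a mix of a narrow signal pulse and background that is uniform over the repetition period, and a short gate placed around the densest histogram bin removes most background detections before the depth estimate is formed. This is a simplified, non-adaptive stand-in for the paper's method (no data borrowing from neighbors), with invented timing parameters:

```python
import numpy as np

rng = np.random.default_rng(3)

T = 100e-9      # laser repetition period (s)
t_true = 40e-9  # true signal arrival time
sigma = 0.5e-9  # pulse width

# Low-SBR pixel: 8 signal detections buried in 40 background detections.
times = np.concatenate([
    rng.normal(t_true, sigma, size=8),   # signal photons near t_true
    rng.uniform(0.0, T, size=40),        # background photons, uniform in time
])

# Naive depth estimate: mean arrival time, badly biased by background.
t_naive = times.mean()

# Range gating: place a short gate around the densest histogram bin and
# keep only the detections inside it.
hist, edges = np.histogram(times, bins=100, range=(0.0, T))
peak_bin = int(np.argmax(hist))
center = 0.5 * (edges[peak_bin] + edges[peak_bin + 1])
gated = times[np.abs(times - center) < 2e-9]
t_gated = gated.mean()

print(abs(t_naive - t_true), abs(t_gated - t_true))  # gating cuts the error
```

In the paper the gate location and width are determined adaptively per pixel, and pixels with too few detections borrow data from scene-adaptive neighborhoods; the sketch only shows why removing background detections before estimation is so effective at low SBR.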
3D Computational Ghost Imaging
Computational ghost imaging retrieves the spatial information of a scene
using a single pixel detector. By projecting a series of known random patterns
and measuring the back reflected intensity for each one, it is possible to
reconstruct a 2D image of the scene. In this work we overcome previous
limitations of computational ghost imaging and capture the 3D spatial form of
an object by using several single pixel detectors in different locations. From
each detector we derive a 2D image of the object that appears to be illuminated
from a different direction, using only a single digital projector as
illumination. Comparing the shading of the images allows the surface gradient
and hence the 3D form of the object to be reconstructed. We compare our result
to that obtained from a stereo-photogrammetric system utilizing multiple high
resolution cameras. Our low cost approach is compatible with consumer
applications and can readily be extended to non-visible wavebands.
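The shading-comparison step is essentially photometric stereo: under a Lambertian model, the intensity of a surface point is its albedo times the dot product of its normal with the (unit) illumination direction, so three or more images lit from known, non-coplanar directions determine the normal by solving a linear system. A minimal sketch for a single surface point, with invented directions and values:

```python
import numpy as np

# Three effective light directions as rows (any non-coplanar set works).
L = np.array([[0.0, 0.0, 1.0],
              [0.7, 0.0, 0.714],
              [0.0, 0.7, 0.714]])
L /= np.linalg.norm(L, axis=1, keepdims=True)

# Ground truth for one surface point.
n_true = np.array([0.3, -0.2, 0.933])
n_true /= np.linalg.norm(n_true)
albedo = 0.8

# Lambertian shading: observed intensity under each light direction.
I = albedo * (L @ n_true)

# Solve L g = I; then g = albedo * normal.
g = np.linalg.solve(L, I)
albedo_est = np.linalg.norm(g)
n_est = g / albedo_est

print(np.allclose(n_est, n_true), round(albedo_est, 3))  # → True 0.8
```

In the ghost-imaging setup the roles are reversed by reciprocity: the single projector illuminates, and each single-pixel detector's image behaves as if lit from that detector's direction, supplying the multiple shaded views this solve requires. Integrating the per-pixel gradients then yields the 3D form.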
Performance analysis of low-flux least-squares single-pixel imaging
A single-pixel camera is able to computationally form spatially resolved images using one photodetector and a spatial light modulator. The images it produces in low-light-level operation are imperfect, even when the number of measurements exceeds the number of pixels, because its photodetection measurements are corrupted by Poisson noise. Conventional performance analysis for single-pixel imaging generates estimates of mean-square error (MSE) from Monte Carlo simulations, which require long computational times. In this letter, we use random matrix theory to develop a closed-form approximation to the MSE of the widely used least-squares inversion method for Poisson noise-limited single-pixel imaging. We present numerical experiments that validate our approximation and a motivating example showing how our framework can be used to answer practical optical design questions for a single-pixel camera.

This work was supported in part by the Samsung Scholarship and in part by the US National Science Foundation under Grant 1422034.
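The conventional Monte Carlo MSE analysis that the letter's closed-form approximation replaces can be sketched as follows. The 0/1 modulation patterns, flux levels, and scene are invented, and the least-squares inversion simply applies the pseudoinverse of the pattern matrix:

```python
import numpy as np

rng = np.random.default_rng(5)

n_pix, n_meas = 16, 32                      # more patterns than pixels
x = rng.uniform(0.2, 1.0, n_pix)            # scene reflectivity
A = rng.integers(0, 2, (n_meas, n_pix)).astype(float)  # 0/1 SLM patterns
A_pinv = np.linalg.pinv(A)

def mc_mse(flux, trials=2000):
    """Monte Carlo MSE of least-squares inversion at a given photon flux."""
    rates = flux * (A @ x)                  # mean photon count per pattern
    se = 0.0
    for _ in range(trials):
        y = rng.poisson(rates)              # Poisson-corrupted measurements
        x_hat = A_pinv @ (y / flux)         # least-squares estimate
        se += float(np.mean((x_hat - x) ** 2))
    return se / trials

mse_low, mse_high = mc_mse(5.0), mc_mse(50.0)
print(mse_low, mse_high)  # MSE shrinks as the photon flux grows
```

Each operating point requires thousands of such trials, which is exactly the computational cost that a closed-form MSE approximation avoids when sweeping design parameters.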
Robust Bayesian target detection algorithm for depth imaging from sparse single-photon data
This paper presents a new Bayesian model and associated algorithm for depth
and intensity profiling using full waveforms from time-correlated single-photon
counting (TCSPC) measurements in the limit of very low photon counts (i.e.,
typically less than 20 photons per pixel). The model represents each Lidar
waveform as an unknown constant background level to which, in the presence of a target, a known impulse response weighted by the target intensity is added, with the sum finally corrupted by Poisson noise. The joint target detection
and depth imaging problem is expressed as a pixel-wise model selection and
estimation problem which is solved using Bayesian inference. Prior knowledge
about the problem is embedded in a hierarchical model that describes the
dependence structure between the model parameters while accounting for their
constraints. In particular, Markov random fields (MRFs) are used to model the
joint distribution of the background levels and of the target presence labels,
which are both expected to exhibit significant spatial correlations. An
adaptive Markov chain Monte Carlo algorithm including reversible-jump updates
is then proposed to compute the Bayesian estimates of interest. This algorithm
is equipped with a stochastic optimization adaptation mechanism that
automatically adjusts the parameters of the MRFs by maximum marginal likelihood
estimation. Finally, the benefits of the proposed methodology are demonstrated
through a series of experiments using real data.
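Stripped of the hierarchical MRF priors and the reversible-jump MCMC machinery, the pixel-wise model-selection core reduces to comparing the Poisson likelihood of each pixel's photon-count histogram under a background-only model against a background-plus-target model. The sketch below assumes a known background level and impulse response, with all values invented:

```python
import math
import numpy as np

rng = np.random.default_rng(7)

n_bins = 64
b = 0.2                                   # known background rate per bin
peak = np.exp(-0.5 * ((np.arange(n_bins) - 20) / 1.5) ** 2)
peak *= 10.0 / peak.sum()                 # impulse response, ~10 photons total

def log_lik(counts, mu):
    """Poisson log-likelihood of a photon-count histogram under rates mu."""
    return float(np.sum(counts * np.log(mu) - mu)
                 - sum(math.lgamma(int(k) + 1) for k in counts))

def detect(counts, prior_target=0.5):
    """Pixel-wise model selection: background-only vs background+target."""
    l0 = log_lik(counts, np.full(n_bins, b)) + math.log(1.0 - prior_target)
    l1 = log_lik(counts, b + peak) + math.log(prior_target)
    return l1 > l0

on_rate = np.mean([detect(rng.poisson(b + peak)) for _ in range(50)])
off_rate = np.mean([detect(rng.poisson(np.full(n_bins, b))) for _ in range(50)])
print(on_rate, off_rate)  # high detection rate on target, low false-alarm rate
```

The paper's contribution is in everything this sketch fixes by hand: the background levels and target labels are given MRF priors with spatially coupled pixels, the target depth and intensity are unknown, and all of it is inferred jointly by adaptive reversible-jump MCMC with the MRF parameters tuned by maximum marginal likelihood.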