13 research outputs found

    Results for the paper "Object Depth Profile and Reflectivity Restoration from Sparse Single-Photon Data Acquired in Underwater Environments"

    No full text
    This is a MATLAB file generating the results of the IEEE-TCI paper entitled "Object Depth Profile and Reflectivity Restoration from Sparse Single-Photon Data Acquired in Underwater Environments".

    Quantum-inspired computational imaging

    Get PDF
    Computational imaging combines measurement and computational methods with the aim of forming images even when the measurement conditions are weak, few in number, or highly indirect. The recent surge in quantum-inspired imaging sensors, together with a new wave of algorithms allowing on-chip, scalable and robust data processing, has induced an increase of activity with notable results in the domain of low-light flux imaging and sensing. We provide an overview of the major challenges encountered in low-illumination (e.g., ultrafast) imaging and how these problems have recently been addressed for imaging applications in extreme conditions. These methods provide examples of the future imaging solutions to be developed, for which the best results are expected to arise from an efficient codesign of the sensors and data analysis tools. Y.A. acknowledges support from the UK Royal Academy of Engineering under the Research Fellowship Scheme (RF201617/16/31). S.McL. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grant EP/J015180/1). V.G. acknowledges support from the U.S. Defense Advanced Research Projects Agency (DARPA) InPho program through U.S. Army Research Office award W911NF-10-1-0404, the U.S. DARPA REVEAL program through contract HR0011-16-C-0030, and the U.S. National Science Foundation through grants 1161413 and 1422034. A.H. acknowledges support from U.S. Army Research Office award W911NF-15-1-0479, U.S. Department of the Air Force grant FA8650-15-D-1845, and U.S. Department of Energy National Nuclear Security Administration grant DE-NA0002534. D.F. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grants EP/M006514/1 and EP/M01326X/1). Accepted manuscript.

    Long-range depth imaging using a single-photon detector array and non-local data fusion

    Get PDF
    The ability to measure and record high-resolution depth images at long stand-off distances is important for a wide range of applications, including connected and autonomous vehicles, defense and security, and agriculture and mining. In LIDAR (light detection and ranging) applications, single-photon-sensitive detection is an emerging approach, offering high sensitivity to light and picosecond temporal resolution, and consequently excellent surface-to-surface resolution. The use of large-format CMOS (complementary metal-oxide semiconductor) single-photon detector arrays provides high spatial resolution and allows the timing information to be acquired simultaneously across many pixels. In this work, we combine state-of-the-art single-photon detector array technology with non-local data fusion to generate high-resolution three-dimensional depth information of long-range targets. The system is based on a visible pulsed illumination source at a wavelength of 670 nm and a 240 × 320 array sensor, achieving sub-centimeter precision in all three spatial dimensions at a distance of 150 meters. The non-local data fusion combines information from an optical image with sparse sampling of the single-photon array data, providing accurate depth information in low-signature regions of the target.
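
    The fusion step described above, propagating sparse single-photon depth samples across the scene using structure from a co-registered optical image, can be illustrated with a simple cross-bilateral interpolation. The sketch below (Python/NumPy) is a generic, hypothetical stand-in for image-guided depth fusion, not the authors' non-local algorithm, and all parameters (window size, kernel widths) are assumed for illustration.

        import numpy as np

        def guided_depth_fill(sparse_depth, mask, guide, sigma_s=5.0, sigma_g=0.1, win=7):
            """Fill missing depth pixels with a cross-bilateral average of nearby
            measured samples, weighting neighbours by spatial distance and by
            similarity in a co-registered optical (guide) image.
            Illustrative stand-in only, not the paper's non-local data fusion."""
            h, w = guide.shape
            out = sparse_depth.copy()
            half = win // 2
            for i in range(h):
                for j in range(w):
                    if mask[i, j]:
                        continue                      # keep measured depth samples
                    i0, i1 = max(0, i - half), min(h, i + half + 1)
                    j0, j1 = max(0, j - half), min(w, j + half + 1)
                    m = mask[i0:i1, j0:j1]
                    if not m.any():
                        continue                      # no samples in this window
                    yy, xx = np.mgrid[i0:i1, j0:j1]
                    w_sp = np.exp(-((yy - i) ** 2 + (xx - j) ** 2) / (2 * sigma_s ** 2))
                    w_gd = np.exp(-(guide[i0:i1, j0:j1] - guide[i, j]) ** 2 / (2 * sigma_g ** 2))
                    wgt = w_sp * w_gd * m
                    out[i, j] = (wgt * sparse_depth[i0:i1, j0:j1]).sum() / wgt.sum()
            return out

        # Toy usage: 10% random depth samples on a 64 x 64 step target, guided by intensity.
        rng = np.random.default_rng(0)
        depth_true = np.where(np.add.outer(np.arange(64), np.arange(64)) > 64, 2.0, 1.0)
        guide = depth_true / 2.0 + 0.02 * rng.standard_normal((64, 64))
        mask = rng.random((64, 64)) < 0.1
        filled = guided_depth_fill(np.where(mask, depth_true, 0.0), mask, guide)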

    3D LIDAR imaging using Ge-on-Si single-photon avalanche diode detectors

    Get PDF
    We present a scanning light detection and ranging (LIDAR) system incorporating an individual Ge-on-Si single-photon avalanche diode (SPAD) detector for depth and intensity imaging in the short-wavelength infrared region. The time-correlated single-photon counting (TCSPC) technique was used to determine the return-photon time-of-flight for target depth information. In laboratory demonstrations, depth and intensity reconstructions were made of targets at short range, using advanced image processing algorithms tailored for the analysis of single-photon time-of-flight data. These laboratory measurements were used to predict the performance of the single-photon LIDAR system at longer ranges, providing estimates that sub-milliwatt average power levels would be required for kilometer-range depth measurements.
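
    The depth measurement principle mentioned here, binning photon arrival times into a TCSPC histogram and converting the peak bin's time-of-flight into range via d = c*t/2, can be sketched in a few lines. The example below is a minimal illustration with hypothetical timestamps, bin width and jitter, not the advanced processing used in the paper.

        import numpy as np

        C = 2.998e8  # speed of light, m/s

        def tof_depth_estimate(timestamps_s, bin_width_s=16e-12, refractive_index=1.0):
            """Build a TCSPC histogram from photon arrival times (seconds), take the
            most populated bin as the return-pulse time-of-flight, and convert it to
            a one-way distance d = c * t / (2 * n)."""
            n_bins = int(np.ceil(timestamps_s.max() / bin_width_s)) + 1
            hist, edges = np.histogram(timestamps_s, bins=n_bins,
                                       range=(0.0, n_bins * bin_width_s))
            peak = np.argmax(hist)                              # most populated bin
            tof = 0.5 * (edges[peak] + edges[peak + 1])         # bin centre, seconds
            return C * tof / (2.0 * refractive_index), hist

        # Toy usage: a jittered 10 ns return (about 1.5 m in air) over uniform background counts.
        rng = np.random.default_rng(1)
        signal = rng.normal(10e-9, 50e-12, size=200)            # return photons
        background = rng.uniform(0.0, 50e-9, size=500)          # dark/ambient counts
        depth, _ = tof_depth_estimate(np.concatenate([signal, background]))
        print(f"estimated depth: {depth:.3f} m")                # approximately 1.50 m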

    Beyond Binomial and Negative Binomial: Adaptation in Bernoulli Parameter Estimation

    Full text link
    Estimating the parameter of a Bernoulli process arises in many applications, including photon-efficient active imaging where each illumination period is regarded as a single Bernoulli trial. Motivated by acquisition efficiency when multiple Bernoulli processes are of interest, we formulate the allocation of trials under a constraint on the mean as an optimal resource allocation problem. An oracle-aided trial allocation demonstrates that there can be a significant advantage from varying the allocation for different processes and inspires a simple trial allocation gain quantity. Motivated by realizing this gain without an oracle, we present a trellis-based framework for representing and optimizing stopping rules. Considering the convenient case of Beta priors, three implementable stopping rules with similar performances are explored, and the simplest of these is shown to asymptotically achieve the oracle-aided trial allocation. These approaches are further extended to estimating functions of a Bernoulli parameter. In simulations inspired by realistic active imaging scenarios, we demonstrate significant mean-squared error improvements: up to 4.36 dB for the estimation of p and up to 1.80 dB for the estimation of log p. Comment: 13 pages, 16 figures.

    Beyond binomial and negative binomial: adaptation in Bernoulli parameter estimation

    Full text link
    Estimating the parameter of a Bernoulli process arises in many applications, including photon-efficient active imaging where each illumination period is regarded as a single Bernoulli trial. Motivated by acquisition efficiency when multiple Bernoulli processes (e.g., multiple pixels) are of interest, we formulate the allocation of trials under a constraint on the mean as an optimal resource allocation problem. An oracle-aided trial allocation demonstrates that there can be a significant advantage from varying the allocation for different processes and inspires the introduction of a simple trial allocation gain quantity. Motivated by achieving this gain without an oracle, we present a trellis-based framework for representing and optimizing stopping rules. Considering the convenient case of Beta priors, three implementable stopping rules with similar performances are explored, and the simplest of these is shown to asymptotically achieve the oracle-aided trial allocation. These approaches are further extended to estimating functions of a Bernoulli parameter. In simulations inspired by realistic active imaging scenarios, we demonstrate significant mean-squared error improvements: up to 4.36 dB for the estimation of p and up to 1.86 dB for the estimation of log p. https://arxiv.org/abs/1809.08801 First author draft.
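
    The role of a Beta prior in an implementable stopping rule can be illustrated with a simple variance-threshold rule: keep observing Bernoulli trials until the Beta posterior on p is concentrated enough, so that "easy" processes consume fewer trials. This sketch is an assumed, simplified rule for intuition only; it is not one of the trellis-optimized stopping rules analyzed in the paper, and the thresholds are hypothetical.

        import numpy as np

        def adaptive_bernoulli_estimate(p_true, alpha=1.0, beta=1.0,
                                        var_target=1e-4, max_trials=10_000, rng=None):
            """Observe Bernoulli(p_true) trials sequentially under a Beta(alpha, beta)
            prior and stop once the posterior variance of p drops below var_target.
            Returns the posterior-mean estimate and the number of trials used."""
            rng = rng or np.random.default_rng()
            a, b = alpha, beta
            for n in range(1, max_trials + 1):
                if rng.random() < p_true:           # one Bernoulli trial
                    a += 1
                else:
                    b += 1
                post_var = a * b / ((a + b) ** 2 * (a + b + 1))   # Var of Beta(a, b)
                if post_var < var_target:
                    break
            return a / (a + b), n

        # Toy usage: processes with different p stop after very different trial counts,
        # which is the intuition behind adaptive trial allocation across pixels.
        rng = np.random.default_rng(2)
        for p in (0.02, 0.5):
            p_hat, n_used = adaptive_bernoulli_estimate(p, rng=rng)
            print(f"p = {p:.2f}: estimate {p_hat:.3f} after {n_used} trials")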

    Single-photon detection techniques for underwater imaging

    Get PDF
    This thesis investigates the potential of a single-photon depth profiling system for imaging in highly scattering underwater environments. This scanning system measured depth using the time-of-flight and the time-correlated single-photon counting (TCSPC) technique. The system comprised a pulsed laser source and a monostatic scanning transceiver, with a silicon single-photon avalanche diode (SPAD) used for detection of the returned optical signal. Spectral transmittance measurements were performed on a number of different water samples in order to characterize the water types used in the experiments. This identified an optimum operational wavelength for each environment selected, which was in the wavelength region of 525–690 nm. Depth profile measurements were then performed in different scattering conditions, demonstrating high-resolution image reconstruction for targets placed at stand-off distances up to nine attenuation lengths, using average optical power in the sub-milliwatt range. Depth and spatial resolution were investigated in several environments, demonstrating a depth resolution in the range of 500 μm to a few millimetres depending on the attenuation level of the medium. The angular resolution of the system was approximately 60 μrad in water with different levels of attenuation, illustrating that the narrow field of view helped preserve spatial resolution in the presence of high levels of forward scattering. Bespoke algorithms were developed for image reconstruction in order to recover depth, intensity and reflectivity information, and to investigate shorter acquisition times, illustrating the practicality of the approach for rapid frame rates. In addition, advanced signal processing approaches were used to investigate the potential of multispectral single-photon depth imaging for target discrimination and recognition, in free-space and underwater environments. Finally, a LiDAR model was developed and validated using experimental data. The model was used to estimate the performance of the system under a variety of scattering conditions and system parameters.
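
    The "attenuation lengths" figure of merit used above follows from Beer-Lambert decay of the optical signal over the two-way underwater path, which makes clear why single-photon sensitivity is needed at nine attenuation lengths. The sketch below uses hypothetical attenuation coefficients purely for illustration.

        import numpy as np

        def return_fraction(range_m, atten_coeff_per_m):
            """Fraction of transmitted photons surviving the two-way water path,
            assuming simple Beer-Lambert attenuation exp(-c * path)."""
            return np.exp(-2.0 * atten_coeff_per_m * range_m)

        # Hypothetical attenuation coefficients (per metre) for two water conditions.
        for label, c in [("clearer water", 0.15), ("turbid water", 1.0)]:
            for n_al in (3, 6, 9):                  # target range in attenuation lengths
                r = n_al / c                        # metres corresponding to n_al lengths
                print(f"{label}: {n_al} attenuation lengths ({r:.1f} m) -> "
                      f"return fraction {return_fraction(r, c):.1e}")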

    Object Depth Profile and Reflectivity Restoration from Sparse Single-Photon Data Acquired in Underwater Environments

    No full text
    This paper presents two new algorithms for the joint restoration of depth and reflectivity (DR) images constructed from time-correlated single-photon counting (TCSPC) measurements. Two extreme cases are considered: (i) a reduced acquisition time that leads to very low photon counts and (ii) a highly attenuating environment (such as a turbid medium) which makes the reflectivity estimation more difficult at increasing range. Adopting a Bayesian approach, the Poisson distributed observations are combined with prior distributions about the parameters of interest to build the joint posterior distribution. More precisely, two Markov random field (MRF) priors enforcing spatial correlations are assigned to the DR images. Under some justified assumptions, the restoration problem (regularized likelihood) reduces to a convex formulation with respect to each of the parameters of interest. This problem is first solved using an adaptive Markov chain Monte Carlo (MCMC) algorithm that approximates the minimum mean square error parameter estimators. This algorithm is fully automatic since it adjusts the parameters of the MRFs by maximum marginal likelihood estimation. However, the MCMC-based algorithm exhibits a relatively long computational time. The second algorithm deals with this issue and is based on a coordinate descent algorithm. Results on single-photon depth data from laboratory-based underwater measurements demonstrate the benefit of the proposed strategy, which improves the quality of the estimated DR images.
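
    To make the regularized-likelihood formulation concrete, the sketch below estimates a reflectivity image from low photon counts by minimizing a convex Poisson negative log-likelihood plus a quadratic neighbourhood (MRF-like) smoothness penalty with projected gradient descent. It is a deliberately simplified, assumed illustration of the general idea; it is not the paper's MCMC or coordinate descent algorithm, does not treat depth, and all parameters (gain, background, lambda, step size) are hypothetical.

        import numpy as np

        def restore_reflectivity(counts, gain, bkg=0.1, lam=2.0, step=1e-2, n_iter=500):
            """Estimate reflectivity r from counts ~ Poisson(gain * r + bkg) by
            projected gradient descent on the Poisson negative log-likelihood plus a
            quadratic 4-neighbour smoothness penalty (periodic boundaries via roll)."""
            r = np.maximum(counts / gain, 1e-3)               # naive initialisation
            for _ in range(n_iter):
                rate = gain * r + bkg
                grad_data = gain * (1.0 - counts / rate)      # d/dr of the data term
                lap = (4 * r                                  # gradient of the smoothness penalty
                       - np.roll(r, 1, 0) - np.roll(r, -1, 0)
                       - np.roll(r, 1, 1) - np.roll(r, -1, 1))
                r = np.maximum(r - step * (grad_data + 2 * lam * lap), 1e-6)
            return r

        # Toy usage: a 32 x 32 reflectivity map observed with roughly 1-3 photons per pixel.
        rng = np.random.default_rng(3)
        truth = np.full((32, 32), 0.2)
        truth[8:24, 8:24] = 0.8
        counts = rng.poisson(3.0 * truth + 0.1)
        estimate = restore_reflectivity(counts, gain=3.0)
        print("RMSE, raw counts/gain :", np.sqrt(np.mean((counts / 3.0 - truth) ** 2)))
        print("RMSE, regularized     :", np.sqrt(np.mean((estimate - truth) ** 2)))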