
    Frequency-modulated continuous-wave LiDAR compressive depth-mapping

    We present an inexpensive architecture for converting a frequency-modulated continuous-wave LiDAR system into a compressive-sensing-based depth-mapping camera. Instead of raster scanning to obtain depth maps, compressive sensing is used to significantly reduce the number of measurements. Ideally, our approach requires two difference detectors, but it can operate with only one at the cost of doubling the number of measurements. Due to the large flux entering the detectors, the signal amplification from heterodyne detection, and the effects of background subtraction from compressive sensing, the system can obtain higher signal-to-noise ratios than detector-array-based schemes while scanning a scene faster than is possible through raster scanning. We also show how a single total-variation minimization and two fast least-squares minimizations, instead of a single complex nonlinear minimization, can efficiently recover high-resolution depth maps with minimal computational overhead. By efficiently storing only 2m data points from m < n measurements of an n-pixel scene, we can easily extract depths by solving only two linear equations with efficient convex-optimization methods.
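
    The "2m data points, two linear solves" idea can be sketched numerically. Everything below is a toy illustration with hypothetical parameters: depth is encoded as a phase, each random binary pattern yields one in-phase and one quadrature value (2m stored numbers), and two least-squares solves recover the images from which depth follows per pixel. The paper uses m < n with total-variation minimization; this sketch uses m = n so plain least squares suffices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64          # pixels in the flattened scene
m = 64          # measurements; the paper uses m < n with convex solvers

depth = rng.uniform(1.0, 5.0, n)      # hypothetical depth map (metres)
phase = 0.04 * depth                  # hypothetical phase encoding (rad/m), kept within (-pi, pi)

A = rng.integers(0, 2, size=(m, n)).astype(float)   # random binary patterns

# only 2m numbers are stored: one in-phase and one quadrature value per pattern
y_i = A @ np.cos(phase)
y_q = A @ np.sin(phase)

# two linear solves recover the cos- and sin-images; depth follows pixel-wise
u, *_ = np.linalg.lstsq(A, y_i, rcond=None)
v, *_ = np.linalg.lstsq(A, y_q, rcond=None)
depth_hat = np.arctan2(v, u) / 0.04
```

    With fully sampled, well-conditioned patterns the recovery is exact up to numerical error; the compressive (m < n) case swaps the least-squares step for a convex solver.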

    Compressive Wavefront Sensing with Weak Values

    We demonstrate a wavefront sensor based on the compressive-sensing single-pixel camera. Using a high-resolution spatial light modulator (SLM) as a variable waveplate, we weakly couple an optical field's transverse-position and polarization degrees of freedom. By placing random, binary patterns on the SLM, polarization serves as a meter for directly measuring random projections of the real and imaginary components of the wavefront. Compressive-sensing techniques can then recover the wavefront. We acquire high-quality, 256x256-pixel images of the wavefront from only 10,000 projections. Photon-counting detectors give sub-picowatt sensitivity.
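
    A minimal numerical sketch of the projection scheme (sizes and patterns are illustrative, not the paper's): random binary patterns project the real and imaginary parts of the wavefront separately, and the field is recovered from the two sets of projections. Here m = n so a direct linear solve suffices, whereas the paper undersamples and uses compressive recovery.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16 * 16      # small stand-in for the 256x256 wavefront
m = n            # fully sampled here; the paper uses far fewer projections

# hypothetical complex wavefront to be sensed
field = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# random binary SLM patterns; the polarization meter reads out projections
# of the real and imaginary parts separately
P = rng.integers(0, 2, size=(m, n)).astype(float)
y_re = P @ field.real
y_im = P @ field.imag

# with full sampling, recovery is a pair of linear solves; compressive
# recovery (e.g. TV minimization) replaces this when m < n
field_hat = np.linalg.solve(P, y_re) + 1j * np.linalg.solve(P, y_im)
```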

    Fast Hadamard transforms for compressive sensing of joint systems: measurement of a 3.2 million-dimensional bi-photon probability distribution

    We demonstrate how to efficiently implement extremely high-dimensional compressive imaging of a bi-photon probability distribution. Our method uses fast-Hadamard-transform Kronecker-based compressive sensing to acquire the joint-space distribution. We list, in detail, the operations necessary to enable fast-transform-based matrix-vector operations in the joint space to reconstruct a 16.8-million-dimensional image in less than 10 minutes. Within a subspace of that image exists a 3.2-million-dimensional bi-photon probability distribution. In addition, we demonstrate how the marginal distributions can aid in the accuracy of joint-space distribution reconstructions.
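
    The fast-transform trick rests on the Kronecker identity (H ⊗ H)vec(X) = vec(H X Hᵀ): a joint-space Hadamard matrix-vector product reduces to 1-D fast Walsh-Hadamard transforms along the rows and columns, avoiding the enormous Kronecker matrix. A small illustrative check (sizes and function names are ours, not the paper's):

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform (Sylvester/natural order), O(n log n)."""
    x = np.asarray(x, dtype=float).copy()
    h = 1
    while h < len(x):
        for i in range(0, len(x), 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b
            x[i + h:i + 2 * h] = a - b
        h *= 2
    return x

def joint_hadamard(X):
    """Apply H X H^T via 1-D transforms along columns, then rows."""
    return np.apply_along_axis(fwht, 1, np.apply_along_axis(fwht, 0, X))

n = 8
X = np.random.default_rng(2).standard_normal((n, n))

# explicit Sylvester-ordered Hadamard matrix for comparison
H = np.array([[1.0]])
for _ in range(3):
    H = np.kron(H, np.array([[1.0, 1.0], [1.0, -1.0]]))

direct = (np.kron(H, H) @ X.flatten()).reshape(n, n)  # brute-force joint space
fast = joint_hadamard(X)                              # fast-transform version
```

    In the paper's setting the joint space is millions-dimensional, so the brute-force Kronecker product is infeasible and only the fast version is used.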

    Soldering iron temperature is automatically reduced

    A hinged cradle-microswitch arrangement maintains a soldering iron at less than peak temperature when not in use. The microswitch introduces a voltage-reducing element into the soldering iron's power circuit when the iron is placed on the cradle. When removed from the cradle, the iron returns to operating temperature in 15 to 30 seconds.

    Compressive Direct Imaging of a Billion-Dimensional Optical Phase-Space

    Optical phase-spaces represent fields of any spatial coherence, and are typically measured through phase-retrieval methods involving a computational inversion, interference, or a resolution-limiting lenslet array. Recently, a weak-values technique demonstrated that a beam's Dirac phase-space is proportional to the measurable complex weak value, regardless of coherence. These direct measurements require scanning through all possible position-polarization couplings, limiting their dimensionality to less than 100,000. We circumvent these limitations using compressive sensing, a numerical protocol that allows us to undersample, yet efficiently measure, high-dimensional phase-spaces. We also propose an improved technique that allows us to directly measure phase-spaces with high spatial resolution and scalable frequency resolution. With this method, we are able to easily measure a 1.07-billion-dimensional phase-space. The distributions are numerically propagated to an object placed in the beam path, with excellent agreement. This protocol has broad implications in signal processing and imaging, including recovery of Fourier amplitudes in any dimension with linear algorithmic solutions and ultra-high-dimensional phase-space imaging.
    Comment: 7 pages, 5 figures. Added new larger dataset and fixed typo

    Position-Momentum Bell-Nonlocality with Entangled Photon Pairs

    Witnessing continuous-variable Bell nonlocality is a challenging endeavor, but Bell himself showed how one might demonstrate it. Though Bell nearly showed a violation using the CHSH inequality with sign-binned position-momentum statistics of entangled pairs of particles measured at different times, his demonstration is subject to approximations not realizable in a laboratory setting. Moreover, he does not give a quantitative estimate of the maximum achievable violation for the wavefunction he considers. In this article, we show how his strategy can be reimagined using the transverse positions and momenta of entangled photon pairs measured at different propagation distances, and we find that the maximum achievable violation for the state he considers is actually very small relative to the upper limit of 2√2. Although Bell's wavefunction does not produce a large violation of the CHSH inequality, other states may yet do so.
    Comment: 6 pages, 3 figures
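
    The sign-binned CHSH estimator itself is easy to sketch. The data below are classical stand-ins (jointly Gaussian "positions" with hypothetical correlations chosen per setting pair), for which E[sign X · sign Y] = (2/π) arcsin ρ keeps |S| ≤ 2; entangled photon pairs are what would be needed to push S toward 2√2.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000

def corr(a, b):
    """Sign-binned correlator E(a, b) = <sign(a) sign(b)>."""
    return np.mean(np.sign(a) * np.sign(b))

def sample(rho):
    """Correlated Gaussian 'position' pairs for one setting combination."""
    cov = [[1.0, rho], [rho, 1.0]]
    x = rng.multivariate_normal([0.0, 0.0], cov, size=N)
    return x[:, 0], x[:, 1]

# hypothetical correlations for the four setting pairs (a,b), (a,b'), (a',b), (a',b')
S = (corr(*sample(0.7)) - corr(*sample(-0.7))
     + corr(*sample(0.7)) + corr(*sample(0.7)))
```

    With these correlations S lands near 4·(2/π)·arcsin(0.7) ≈ 1.97, just under the local bound of 2, illustrating why sign binning alone on classical data cannot violate CHSH.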

    Photon counting compressive depth mapping

    We demonstrate a compressed-sensing, photon-counting lidar system based on the single-pixel camera. Our technique recovers both depth and intensity maps from a single under-sampled set of incoherent, linear projections of a scene of interest at ultra-low light levels around 0.5 picowatts. Only two-dimensional reconstructions are required to image a three-dimensional scene. We demonstrate intensity imaging and depth mapping at 256 x 256 pixel transverse resolution with acquisition times as short as 3 seconds. We also show novelty filtering, reconstructing only the difference between two instances of a scene. Finally, we acquire 32 x 32 pixel real-time video for three-dimensional object tracking at 14 frames per second.
    Comment: 16 pages, 8 figures
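
    The depth maps above come from compressive reconstructions, but the underlying ranging principle is plain photon-counting time of flight: histogram photon arrival times, take the peak bin for range and the total counts for intensity. A single-detector sketch with entirely hypothetical numbers:

```python
import numpy as np

c = 3e8  # speed of light, m/s

rng = np.random.default_rng(4)
true_depth = 7.6        # metres, hypothetical target range
n_photons = 200         # ultra-low-light photon budget

# photon arrival times: round-trip time of flight plus detector timing jitter
tof = 2 * true_depth / c
arrivals = tof + rng.normal(0.0, 100e-12, n_photons)  # 100 ps jitter

# time-bin the counts; the peak bin gives range, the total gives intensity
edges = np.arange(0.0, 100e-9, 1e-9)                  # 1 ns bins
hist, edges = np.histogram(arrivals, bins=edges)
peak_time = edges[np.argmax(hist)] + 0.5e-9           # bin centre
depth_est = c * peak_time / 2
intensity = hist.sum()
```

    The range resolution of this sketch is set by the bin width (1 ns ≈ 15 cm); the paper's compressive scheme shares the time-of-flight principle but reconstructs a full transverse map from projections instead of one pixel at a time.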