Compressive Wavefront Sensing with Weak Values
We demonstrate a wavefront sensor based on the compressive-sensing single-pixel
camera. Using a high-resolution spatial light modulator (SLM) as a
variable waveplate, we weakly couple an optical field's transverse-position and
polarization degrees of freedom. By placing random, binary patterns on the SLM,
polarization serves as a meter for directly measuring random projections of the
real and imaginary components of the wavefront. Compressive sensing techniques
can then recover the wavefront. We acquire high-quality, 256x256 pixel images
of the wavefront from only 10,000 projections. Photon-counting detectors give
sub-picowatt sensitivity.
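The reconstruction step in such single-pixel schemes can be posed as a sparse inverse problem: recover a signal from far fewer random binary projections than unknowns. The following is a minimal sketch, not the paper's code; the 1-D signal size, the ±1 normalization of the binary patterns, and the choice of an ISTA solver are illustrative assumptions:

```python
import numpy as np

def ista(A, y, lam=0.05, n_iter=500):
    """Iterative soft-thresholding for min 0.5*||A x - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x - A.T @ (A @ x - y) / L        # gradient step on the data term
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft-threshold
    return x

rng = np.random.default_rng(0)
n, m, k = 256, 100, 5                        # unknowns, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.integers(0, 2, (m, n)).astype(float) # random binary (0/1) patterns
A = (2 * A - 1) / np.sqrt(m)                 # zero-mean, column-normalized
y = A @ x_true                               # compressive measurements
x_hat = ista(A, y)
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

The point of the sketch is the measurement count: with a k-sparse signal, m on the order of k·log(n/k) random projections suffice, which is why 10,000 projections can recover a 65,536-pixel wavefront.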
Intensity-only optical compressive imaging using a multiply scattering material and a double phase retrieval approach
In this paper, the problem of compressive imaging is addressed using natural
randomization by means of a multiply scattering medium. To utilize the medium
in this way, its corresponding transmission matrix must be estimated. To
calibrate the imager, we use a digital micromirror device (DMD) as a simple,
cheap, and high-resolution binary intensity modulator. We propose a phase
retrieval algorithm that is well adapted to intensity-only measurements on the
camera and to binary input intensity patterns, and we use it both to estimate
the complex transmission matrix and to reconstruct images. We demonstrate
promising experimental results for the proposed algorithm using the MNIST
dataset of handwritten digits as example images.
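A useful point of reference for the intensity-only setting is classical error reduction (alternating projections): alternately impose the measured magnitudes and consistency with the known transmission matrix. The sketch below is a generic illustration, not the authors' double phase retrieval algorithm; the complex Gaussian matrix and the problem sizes are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 32, 256                        # unknown vector size, intensity measurements
# Hypothetical known transmission matrix (stand-in for a calibrated medium)
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2 * m)
x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
b = np.abs(A @ x_true)                # the camera records magnitudes only

A_pinv = np.linalg.pinv(A)
x = A_pinv @ (b * np.exp(2j * np.pi * rng.random(m)))   # random starting phases
resid_start = np.linalg.norm(np.abs(A @ x) - b)
for _ in range(500):
    z = A @ x
    z = b * z / np.maximum(np.abs(z), 1e-12)  # impose magnitudes, keep phase
    x = A_pinv @ z                            # project back onto the range of A
resid_end = np.linalg.norm(np.abs(A @ x) - b)
```

Each pass alternates two nearest-point projections (onto the magnitude constraint and onto the range of A), so the magnitude residual is non-increasing; the recovered x carries an unavoidable global-phase ambiguity.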
Geometric reconstruction methods for electron tomography
Electron tomography is becoming an increasingly important tool in materials
science for studying the three-dimensional morphologies and chemical
compositions of nanostructures. The image quality obtained by many current
algorithms is seriously affected by the problems of missing wedge artefacts and
nonlinear projection intensities due to diffraction effects. The former refers
to the fact that data cannot be acquired over the full tilt range;
the latter implies that for some orientations, crystalline structures can show
strong contrast changes. To overcome these problems we introduce and discuss
several algorithms from the mathematical fields of geometric and discrete
tomography. The algorithms incorporate geometric prior knowledge (mainly
convexity and homogeneity), which in principle also considerably reduces the
number of tilt angles required. Results are discussed for the reconstruction of
an InAs nanowire.
Photon counting compressive depth mapping
We demonstrate a compressed-sensing, photon-counting lidar system based on
the single-pixel camera. Our technique recovers both depth and intensity maps
from a single under-sampled set of incoherent, linear projections of a scene of
interest at ultra-low light levels around 0.5 picowatts. Only two-dimensional
reconstructions are required to image a three-dimensional scene. We demonstrate
intensity imaging and depth mapping at 256 x 256 pixel transverse resolution
with acquisition times as short as 3 seconds. We also show novelty filtering,
reconstructing only the difference between two instances of a scene. Finally,
we acquire 32 x 32 pixel real-time video for three-dimensional object tracking
at 14 frames per second.
Fast Hadamard transforms for compressive sensing of joint systems: measurement of a 3.2 million-dimensional bi-photon probability distribution
We demonstrate how to efficiently implement extremely high-dimensional
compressive imaging of a bi-photon probability distribution. Our method uses
fast-Hadamard-transform Kronecker-based compressive sensing to acquire the
joint space distribution. We list, in detail, the operations necessary to
enable fast-transform-based matrix-vector operations in the joint space to
reconstruct a 16.8 million-dimensional image in less than 10 minutes. Within a
subspace of that image exists a 3.2 million-dimensional bi-photon probability
distribution. In addition, we demonstrate how the marginal distributions can
improve the accuracy of joint-space distribution reconstructions.
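The computational core here, applying a Hadamard sensing operator to a Kronecker-structured joint space without ever forming the full matrix, can be illustrated in a few lines. This is a sketch under assumed small sizes; `fwht`, `hadamard_kron_apply`, and `hadamard` are hypothetical helper names, not the authors' API:

```python
import numpy as np

def fwht(a):
    """Unnormalized fast Walsh-Hadamard transform, O(n log n); n a power of two."""
    a = np.asarray(a, dtype=float).copy()
    h = 1
    while h < len(a):
        for i in range(0, len(a), 2 * h):
            for j in range(i, i + h):
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2
    return a

def hadamard_kron_apply(X):
    """Apply (H kron H) to vec(X) without forming it: since H is symmetric,
    (H kron H) vec(X) = vec(H X H^T), i.e. FWHT every column, then every row."""
    Y = np.apply_along_axis(fwht, 0, np.asarray(X, dtype=float))
    return np.apply_along_axis(fwht, 1, Y)

def hadamard(n):
    """Explicit Sylvester Hadamard matrix, only for checking the fast version."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H
```

For an N-element joint space, the explicit Kronecker matrix would need N^2 entries, while the fast transform costs O(N log N) per application, which is the difference that makes a 16.8-million-dimensional reconstruction feasible in minutes.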
Compressively characterizing high-dimensional entangled states with complementary, random filtering
The resources needed to conventionally characterize a quantum system are
overwhelmingly large for high-dimensional systems. This obstacle may be
overcome by abandoning traditional cornerstones of quantum measurement, such as
general quantum states, strong projective measurement, and assumption-free
characterization. Following this reasoning, we demonstrate an efficient
technique for characterizing high-dimensional, spatial entanglement with one
set of measurements. We recover sharp distributions with local, random
filtering of the same ensemble in momentum followed by position---something the
uncertainty principle forbids for projective measurements. Exploiting the
expectation that entangled signals are highly correlated, we use fewer than
5,000 measurements to characterize a 65,536-dimensional state. Finally, we use
entropic inequalities to witness entanglement without a density matrix. Our
method represents the sea change unfolding in quantum measurement, where methods
influenced by the information-theory and signal-processing communities replace
unscalable, brute-force techniques---a progression previously followed by
classical sensing.
Efficient binary tomographic reconstruction
Tomographic reconstruction of a binary image from few projections is
considered. A novel heuristic algorithm is proposed, the central element of
which is a nonlinear transformation of the probability that a pixel of the
sought image is 1-valued. The algorithm consists of backprojections combined
with iterative corrections. Application of
this algorithm to a series of artificial test cases leads to exact binary
reconstructions (i.e., recovery of the binary value of every pixel) from
the knowledge of projection data over a few directions. Images up to
pixels are reconstructed in a few seconds. A series of test cases is performed
for comparison with previous methods, showing better efficiency and reduced
computation times.
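The flavor of such a heuristic, iterative corrections toward the measured projections combined with a nonlinear map that pushes pixel probabilities toward 0 or 1, can be sketched for the simplest two-projection (row/column sum) case. Everything below, including the test image and the sharpening map, is an illustrative stand-in rather than the paper's algorithm:

```python
import numpy as np

def binary_tomo(p_rows, p_cols, n_iter=300):
    """Heuristic two-projection binary reconstruction: alternate additive
    corrections toward the measured row/column sums with a nonlinear
    sharpening of the pixel probabilities toward {0, 1}."""
    n = len(p_rows)
    x = np.full((n, n), p_rows.sum() / n**2)       # uniform initial guess
    for _ in range(n_iter):
        x = np.clip(x, 0.0, 1.0)
        x = x**2 / (x**2 + (1.0 - x)**2 + 1e-12)   # push probabilities to 0/1
        x += (p_rows - x.sum(axis=1))[:, None] / n # enforce measured row sums
        x += (p_cols - x.sum(axis=0))[None, :] / n # enforce measured column sums
    return x

# Hypothetical test image: left-justified rows have no 2x2 switching
# components, so the image is uniquely determined by its two projections.
truth = np.array([[1, 1, 1, 0],
                  [1, 1, 0, 0],
                  [1, 1, 1, 1],
                  [1, 0, 0, 0]], dtype=float)
x = binary_tomo(truth.sum(axis=1), truth.sum(axis=0))
```

Because each sweep ends with the additive corrections, the returned estimate matches both measured projections exactly; whether the sharpening also drives it to the correct binary image depends on the uniqueness of the solution, which is why real systems use data from more than two directions.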