Single-shot compressed ultrafast photography: a review
Compressed ultrafast photography (CUP) is a burgeoning single-shot computational imaging technique that provides imaging speeds as high as 10 trillion frames per second and a sequence depth of up to a few hundred frames. The technique combines compressed sensing with streak-camera technology to capture non-repeatable ultrafast transient events in a single shot. Following recent, unprecedented technical developments and extensions of this methodology, it has been widely used in ultrafast optical imaging and metrology, ultrafast electron diffraction and microscopy, and information-security protection. We review the basic principles of CUP, recent advances in data acquisition and image reconstruction, its fusion with other modalities, and its unique applications across multiple research fields.
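The CUP forward model described above (spatial encoding with a pseudo-random mask, temporal shearing by the streak camera, and integration onto one detector exposure) can be sketched as follows. This is a minimal illustrative simulation; the dimensions and the one-row-per-frame shear are assumptions, not the published system's parameters, and the compressed-sensing reconstruction step is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: T frames of an H x W transient scene.
T, H, W = 8, 16, 16
scene = rng.random((T, H, W))          # dynamic scene x(t)
mask = rng.integers(0, 2, (H, W))      # static pseudo-random binary mask

# CUP forward model: encode each frame with the mask, shear it by one
# row per frame (the streak camera's temporal deflection), then
# integrate all frames onto a single 2-D detector exposure.
detector = np.zeros((H + T - 1, W))
for t in range(T):
    detector[t:t + H, :] += mask * scene[t]

print(detector.shape)  # (23, 16): one snapshot encodes all T frames
```

Recovering the T frames from this single measurement is the inverse problem that compressed-sensing reconstruction solves, exploiting the sparsity of the scene under the known mask and shear.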
GPU accelerated real-time multi-functional spectral-domain optical coherence tomography system at 1300 nm.
We present a GPU-accelerated multi-functional spectral-domain optical coherence tomography system at 1300 nm. The system is capable of real-time processing and display of every intensity image, each comprising 512 pixels by 2048 A-lines acquired at 20 frames per second. The update rate for all four 512 pixel by 2048 A-line images displayed simultaneously (intensity, phase retardation, flow, and en face view) is approximately 10 frames per second. Additionally, we report for the first time the characterization of phase retardation and diattenuation for a sample composed of a stacked set of polarizing film and wave plate. The calculated optic-axis orientation, phase retardation, and diattenuation match the expected values well. The speed of each facet of the multi-functional OCT CPU-GPU hybrid acquisition system (intensity, phase retardation, and flow) was demonstrated separately by imaging a horseshoe-crab lateral compound eye, a non-uniformly heated chicken muscle, and a microfluidic device. A mouse brain with a thin-skull preparation was imaged in vivo, demonstrating the capability of the system for live multi-functional OCT visualization.
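The core per-frame computation that such a system accelerates on the GPU is the spectrum-to-depth transform: for each A-line, subtract the background spectrum and Fourier transform from wavenumber to depth. A minimal numpy sketch follows; the spectral sample count is an assumption, and real pipelines also resample to uniform wavenumber and compensate dispersion, which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical frame: 2048 A-lines, 1024 spectral samples each (keeping
# half of the FFT output yields 512-pixel A-lines as in the abstract).
n_alines, n_k = 2048, 1024
spectra = rng.random((n_alines, n_k))

# Core SD-OCT step per A-line: remove the DC/background spectrum, then
# Fourier transform wavenumber -> depth (k-linearization and dispersion
# compensation are omitted in this sketch).
background = spectra.mean(axis=0)
depth = np.fft.fft(spectra - background, axis=1)

# Intensity image: log magnitude of the positive-depth half.
intensity = 20 * np.log10(np.abs(depth[:, : n_k // 2]) + 1e-12)
print(intensity.shape)  # (2048, 512)
```

On a GPU the same batched FFT maps naturally onto one transform per A-line, which is why this stage parallelizes well enough for real-time display.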
Data compression techniques applied to high resolution high frame rate video technology
An investigation of video data compression applied to microgravity space experiments using High Resolution High Frame Rate Video Technology (HHVT) is presented. An extensive survey of digital video-compression methods described in the open literature was conducted; for each method, the survey presents a description along with an assessment of image degradation and video data parameters. Present and near-term technology for implementing video data compression in high-speed imaging systems is assessed, and the results are discussed and summarized. The results of a study of a baseline HHVT video system, and approaches for implementing video data compression, are presented. Case studies of three microgravity experiments are presented, with specific compression techniques and implementations recommended.
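One family of methods such a survey covers is interframe (predictive) coding, which pays off for high-frame-rate sequences where successive frames differ little. A toy sketch of the idea, with a wholly synthetic scene (not drawn from the HHVT study):

```python
import numpy as np

# Hypothetical high-frame-rate sequence: a static background with one
# small moving feature, the regime where interframe coding pays off.
frames = np.zeros((10, 64, 64), dtype=np.int16)
for t in range(10):
    frames[t, 20 + t, 20:30] = 100   # moving bright bar

# Interframe coding: store frame 0 fully, then only the nonzero pixel
# deltas of each subsequent frame.
deltas = np.diff(frames, axis=0)
changed = int(np.count_nonzero(deltas))
raw = frames[1:].size
print(f"changed pixels: {changed} / {raw}")
```

Only a tiny fraction of pixel values change between frames here, so encoding deltas instead of full frames cuts the data volume dramatically; the trade-off, as the survey's degradation assessments reflect, is sensitivity to scene motion and transmission errors.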
High-resolution remote thermography using luminescent low-dimensional tin-halide perovskites
While metal-halide perovskites have recently revolutionized research in optoelectronics through a unique combination of performance and synthetic simplicity, their low-dimensional counterparts can further expand the field with hitherto unknown and practically useful optical functionalities. In this context, we present the strong temperature dependence of the photoluminescence (PL) lifetime of low-dimensional, perovskite-like tin halides, and apply this property to thermal imaging with a high precision of 0.05 °C. The PL lifetimes are governed by the heat-assisted de-trapping of self-trapped excitons, and their values can be varied over several orders of magnitude by adjusting the temperature (up to 20 ns °C⁻¹). Typically, this sensitive range spans up to one hundred centigrade, and it is both compound-specific and shown to be compositionally and structurally tunable from -100 to 110 °C going from [C(NH2)3]2SnBr4 to Cs4SnBr6 and (C4N2H14I)4SnI6. Finally, through the innovative implementation of cost-effective hardware for fluorescence lifetime imaging (FLI), based on time-of-flight (ToF) technology, these novel thermoluminophores have been used to record thermographic videos with high spatial and thermal resolution.
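Lifetime-based thermography of this kind amounts to inverting a calibration curve: measure the PL lifetime per pixel, then map it back to temperature. The sketch below assumes an Arrhenius-type thermally activated de-trapping model with illustrative parameters (tau_r, A, Ea are not the paper's fitted values).

```python
import numpy as np

# Hypothetical calibration: thermally activated de-trapping shortens the
# PL lifetime as temperature rises (Arrhenius-type model; parameters are
# illustrative, not fitted to the tin-halide data).
kB = 8.617e-5                     # Boltzmann constant, eV/K
tau_r, A, Ea = 1e-6, 1e9, 0.30    # radiative lifetime (s), rate (1/s), eV

def lifetime(T_kelvin):
    return 1.0 / (1.0 / tau_r + A * np.exp(-Ea / (kB * T_kelvin)))

# Thermography = inverting the calibration: map a measured lifetime back
# to temperature via a lookup table over the sensitive range.
T_grid = np.linspace(173.0, 383.0, 2001)   # -100 to 110 degC, in kelvin
tau_grid = lifetime(T_grid)

def temperature(tau_measured):
    # lifetime decreases monotonically with T, so flip for np.interp
    return np.interp(tau_measured, tau_grid[::-1], T_grid[::-1])

print(round(temperature(lifetime(300.0)), 2))  # recovers ~300.0 K
```

A steeper lifetime-versus-temperature slope (the abstract's up-to-20 ns °C⁻¹) directly improves the temperature precision achievable for a given lifetime-measurement noise.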
Quantum-inspired computational imaging
Computational imaging combines measurement and computational methods with the aim of forming images even when the measurement conditions are weak, few in number, or highly indirect. The recent surge in quantum-inspired imaging sensors, together with a new wave of algorithms allowing on-chip, scalable and robust data processing, has induced an increase of activity with notable results in the domain of low-light flux imaging and sensing. We provide an overview of the major challenges encountered in low-illumination (e.g., ultrafast) imaging and how these problems have recently been addressed for imaging applications in extreme conditions. These methods provide examples of the future imaging solutions to be developed, for which the best results are expected to arise from an efficient codesign of the sensors and data analysis tools.

Y.A. acknowledges support from the UK Royal Academy of Engineering under the Research Fellowship Scheme (RF201617/16/31). S.McL. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grant EP/J015180/1). V.G. acknowledges support from the U.S. Defense Advanced Research Projects Agency (DARPA) InPho program through U.S. Army Research Office award W911NF-10-1-0404, the U.S. DARPA REVEAL program through contract HR0011-16-C-0030, and the U.S. National Science Foundation through grants 1161413 and 1422034. A.H. acknowledges support from U.S. Army Research Office award W911NF-15-1-0479, U.S. Department of the Air Force grant FA8650-15-D-1845, and U.S. Department of Energy National Nuclear Security Administration grant DE-NA0002534. D.F. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grants EP/M006514/1 and EP/M01326X/1).
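A concrete instance of the low-flux imaging problem surveyed here: a single-photon detector pixel registers at most one count per frame, so the click fraction over many frames follows a saturating Bernoulli law, and the photon flux is recovered by inverting it. The sketch below uses the standard maximum-likelihood inversion for photon-counting pixels; the flux value and frame count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Low-flux regime: each frame's detection is Bernoulli with
# p = 1 - exp(-lam), where lam is the mean photon number per frame.
lam_true = 0.4            # assumed mean photons per pixel per frame
n_frames = 5000
clicks = rng.random(n_frames) < 1 - np.exp(-lam_true)

# Maximum-likelihood flux estimate from the observed click fraction.
k = clicks.sum()
lam_hat = -np.log(1 - k / n_frames)
print(round(float(lam_hat), 2))  # close to 0.4
```

The nonlinearity of the inversion is one reason sensor/algorithm codesign matters: at higher fluxes the click fraction saturates and the variance of the estimate grows rapidly.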
Study and simulation results for video landmark acquisition and tracking technology (Vilat-2)
The results of several investigations and hardware developments supporting new technology for Earth-feature recognition and classification are described. Data-analysis techniques and procedures were developed for processing data from the Feature Identification and Location Experiment (FILE). The experiment was flown in November 1981 on the second Shuttle flight, and a second instrument, designed for aircraft flights, was flown over the United States in 1981. Ground tests were performed to provide the basis for designing a more advanced, four-spectral-band version of the FILE capable of classifying clouds and snow (and possibly ice) as distinct features, in addition to the features classified by the two-spectral-band Shuttle experiment. The Shuttle instrument classifies water, bare land, vegetation, and clouds/snow/ice (grouped).
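A two-band classifier of the kind the Shuttle instrument implements can be sketched per pixel from visible and near-infrared radiances. The thresholds, band names, and decision rules below are illustrative assumptions in the spirit of such classifiers, not the FILE flight algorithm.

```python
import numpy as np

# Hypothetical two-band rule: classify each pixel from visible (R_vis)
# and near-infrared (R_nir) radiances. Thresholds are illustrative.
def classify(R_vis, R_nir):
    ratio = R_nir / np.maximum(R_vis, 1e-6)
    labels = np.full(R_vis.shape, "bare land", dtype=object)
    labels[ratio > 1.5] = "vegetation"        # strong NIR reflectance
    labels[ratio < 0.7] = "water"             # NIR absorbed by water
    labels[R_vis > 0.6] = "clouds/snow/ice"   # high broadband brightness
    return labels

R_vis = np.array([0.05, 0.30, 0.25, 0.80])
R_nir = np.array([0.02, 0.28, 0.60, 0.85])
print(classify(R_vis, R_nir))
```

Clouds, snow, and ice all appear bright in both bands, which is why a two-band instrument must group them; separating them motivates the four-band follow-on design described above.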
Light field super resolution through controlled micro-shifts of light field sensor
Light field cameras enable new capabilities, such as post-capture refocusing and aperture control, by capturing the directional and spatial distribution of light rays in space. Micro-lens-array-based light field camera designs are often preferred for their light-transmission efficiency, cost-effectiveness, and compactness. One drawback of micro-lens-array-based light field cameras is low spatial resolution, since a single sensor is shared to capture both spatial and angular information. To address the low spatial resolution, we present a light field imaging approach in which multiple light fields are captured and fused to improve spatial resolution. For each capture, the light field sensor is shifted by a pre-determined fraction of a micro-lens size using an XY translation stage for optimal performance.
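The fusion step described above can be illustrated with the simplest case: four captures taken at half-micro-lens shifts, interleaved onto a grid with twice the sampling density in each direction. This is a naive sketch under that assumption; a real pipeline would register the captures against the measured stage shifts and typically deconvolve rather than interleave directly.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical set of four captures taken with half-micro-lens shifts
# (0 or 1 sub-pixel step along each axis) of the light field sensor.
h, w = 32, 32
captures = {(dy, dx): rng.random((h, w)) for dy in (0, 1) for dx in (0, 1)}

# Fuse by interleaving each shifted capture onto a 2x finer grid.
fused = np.zeros((2 * h, 2 * w))
for (dy, dx), img in captures.items():
    fused[dy::2, dx::2] = img

print(fused.shape)  # (64, 64): 4x the pixel count of one capture
```

Because each capture samples the same micro-lens grid at a different sub-micro-lens offset, the interleaved result restores spatial sampling that the shared spatio-angular sensor otherwise sacrifices.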
- …