
    Single-shot compressed ultrafast photography: a review

    Compressed ultrafast photography (CUP) is a burgeoning single-shot computational imaging technique that provides an imaging speed as high as 10 trillion frames per second and a sequence depth of up to a few hundred frames. This technique synergizes compressed sensing and the streak camera technique to capture nonrepeatable ultrafast transient events in a single shot. With recent unprecedented technical developments and extensions of this methodology, it has been widely used in ultrafast optical imaging and metrology, ultrafast electron diffraction and microscopy, and information security protection. We review the basic principles of CUP, its recent advances in data acquisition and image reconstruction, its fusion with other modalities, and its unique applications in multiple research fields.
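
    To make the measurement model concrete: CUP encodes each temporal frame of the scene with a static pseudo-random mask (the compressed-sensing encoding, C), shears the encoded frames in time with the streak camera (S), and integrates them onto a single 2-D exposure (T). The Python sketch below simulates this forward operator on a toy scene; the array sizes, mask, and moving-spot scene are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    def cup_forward(scene, mask):
        """Simulate the CUP measurement: encode each temporal frame with a
        static pseudo-random mask (C), shear frames by one row per time step
        (S, the streak camera), and integrate onto one 2-D exposure (T)."""
        nt, ny, nx = scene.shape
        meas = np.zeros((ny + nt - 1, nx))
        for t in range(nt):
            meas[t:t + ny, :] += mask * scene[t]
        return meas

    # Hypothetical toy scene: a bright spot sweeping across the field
    rng = np.random.default_rng(0)
    nt, ny, nx = 8, 32, 32
    scene = np.zeros((nt, ny, nx))
    for t in range(nt):
        scene[t, ny // 2, 4 * t] = 1.0
    mask = (rng.random((ny, nx)) > 0.5).astype(float)  # binary encoding mask C

    snapshot = cup_forward(scene, mask)  # one 2-D frame holding all nt frames
    ```

    Reconstruction then inverts this operator with a sparsity prior; the CUP literature commonly uses TwIST with total-variation regularization for this step.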

    Observation of laser pulse propagation in optical fibers with a SPAD camera

    Recording processes and events that occur on sub-nanosecond timescales poses a difficult challenge. Conventional ultrafast imaging techniques often rely on long data collection times, which can be due to limited device sensitivity and/or the requirement of scanning the detection system to form an image. In this work, we use a single-photon avalanche detector array camera with picosecond timing accuracy to detect photons scattered by the cladding in optical fibers. We use this method to film supercontinuum generation and track a GHz pulse train in optical fibers. We also show how the limited spatial resolution of the array can be improved with computational imaging. The single-photon sensitivity of the camera and the absence of scanning in the detection system result in short total acquisition times, as low as a few seconds depending on light levels. Our results allow us to calculate the group index of different wavelength bands within the supercontinuum generation process. This technology can be applied to a range of applications, e.g., the characterization of ultrafast processes, time-resolved fluorescence imaging, three-dimensional depth imaging, and tracking hidden objects around a corner. © The Author(s) 2017
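
    The group-index measurement mentioned above follows from timing alone: a pulse that travels an extra distance Δz along the fiber arrives Δt later, so n_g = c·Δt/Δz. A minimal sketch, with hypothetical positions and synthetic delays standing in for arrival times extracted from per-pixel SPAD histograms:

    ```python
    import numpy as np

    C = 299_792_458.0  # speed of light in vacuum, m/s

    def group_index(positions_m, arrival_times_s):
        """Fit t(z) = z / v_g to pulse arrival times measured at several
        points along the fiber and return n_g = c / v_g."""
        slope, _intercept = np.polyfit(positions_m, arrival_times_s, 1)
        return C * slope  # slope of the fit is 1 / v_g

    # Hypothetical data: pulse observed at 0.0, 0.5, and 1.0 m along the fiber
    z = np.array([0.0, 0.5, 1.0])
    t = z * 1.468 / C  # synthetic delays for n_g ~ 1.468 (silica-like fiber)
    print(f"n_g = {group_index(z, t):.3f}")
    ```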

    The EBEX Balloon-borne Experiment—Gondola, Attitude Control, and Control Software

    The E and B Experiment (EBEX) was a long-duration balloon-borne instrument designed to measure the polarization of the cosmic microwave background (CMB) radiation. EBEX was the first balloon-borne instrument to implement a kilopixel array of transition edge sensor (TES) bolometric detectors and the first CMB experiment to use the digital version of the frequency domain multiplexing system for readout of the TES array. The scan strategy relied on 40 s peak-to-peak constant-velocity azimuthal scans. We discuss the unique demands on the design and operation of the payload that resulted from these new technologies and the scan strategy. We describe the solutions implemented, including the development of a power system designed to provide a total of at least 2.3 kW, a cooling system to dissipate 590 W consumed by the detectors' readout system, software to manage and handle the data of the kilopixel array, and specialized attitude reconstruction software. We present flight performance data showing faultless management of the TES array, adequate powering and cooling of the readout electronics, and constraint of attitude reconstruction errors such that the spurious B-modes they induced were less than 10% of the CMB B-mode power spectrum with r = 0.05.

    Optical Synoptic Telescopes: New Science Frontiers

    Over the past decade, sky surveys such as the Sloan Digital Sky Survey have proven the power of large data sets for answering fundamental astrophysical questions. This observational progress, based on a synergy of advances in telescope construction, detectors, and information technology, has had a dramatic impact on nearly all fields of astronomy and on areas of fundamental physics. The next-generation instruments, and the surveys that will be made with them, will maintain this revolutionary progress. The hardware and computational technical challenges and the exciting science opportunities are attracting scientists and engineers from astronomy, optics, low-light-level detectors, high-energy physics, statistics, and computer science. The history of astronomy has taught us repeatedly that there are surprises whenever we view the sky in a new way. This will be particularly true of discoveries emerging from a new generation of sky surveys. Imaging data from large ground-based active optics telescopes with sufficient etendue can address many scientific missions simultaneously. These new investigations will rely on the statistical precision obtainable with billions of objects. For the first time, the full sky will be surveyed deep and fast, opening a new window on a universe of faint moving and distant exploding objects, as well as unraveling the mystery of dark energy. Comment: 12 pages, 7 figures
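
    Etendue, the product of effective collecting area and field of view, is the figure of merit behind surveying "deep and fast": it sets how quickly a telescope can cover sky to a given depth. A minimal sketch with illustrative, roughly LSST-like numbers that are assumptions, not values from the paper:

    ```python
    import math

    def etendue_m2deg2(aperture_diam_m, obscuration_frac, fov_deg2):
        """Survey-power metric: effective collecting area (m^2) times
        field of view (deg^2)."""
        area = math.pi * (aperture_diam_m / 2) ** 2 * (1 - obscuration_frac)
        return area * fov_deg2

    # Illustrative, roughly LSST-like inputs (not taken from the paper)
    print(f"{etendue_m2deg2(8.4, 0.38, 9.6):.0f} m^2 deg^2")
    ```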

    Imaging dynamics beneath turbid media via parallelized single-photon detection

    Noninvasive optical imaging through dynamic scattering media has numerous important biomedical applications but remains a challenging task. While standard methods aim to form images based upon optical absorption or fluorescent emission, it is also well-established that the temporal correlation of scattered coherent light diffuses through tissue much like optical intensity. Few works to date, however, have aimed to experimentally measure and process such data to demonstrate deep-tissue imaging of decorrelation dynamics. In this work, we take advantage of a single-photon avalanche diode (SPAD) array camera, with over one thousand detectors, to simultaneously detect speckle fluctuations at the single-photon level from 12 different phantom tissue surface locations delivered via a customized fiber bundle array. We then apply a deep neural network to convert the acquired single-photon measurements into video of scattering dynamics beneath rapidly decorrelating liquid tissue phantoms. We demonstrate the ability to record video of dynamic events occurring 5-8 mm beneath a decorrelating tissue phantom with mm-scale resolution and at a 2.5-10 Hz frame rate.
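
    The "speckle fluctuations" being detected are conventionally summarized by the normalized intensity autocorrelation g2(τ) of each pixel's photon-count trace; faster decay of g2 toward 1 means faster decorrelation, i.e., more motion beneath the scatterer. The paper maps such single-photon data to video with a deep neural network, so the sketch below only illustrates the underlying g2 estimate, here run on a synthetic Poisson trace (an assumption, not the authors' pipeline):

    ```python
    import numpy as np

    def g2(counts, max_lag):
        """Normalized intensity autocorrelation g2(tau) of a photon-count
        time series from one SPAD pixel."""
        counts = counts.astype(float)
        mean_sq = counts.mean() ** 2
        return np.array([
            np.mean(counts[: len(counts) - lag] * counts[lag:]) / mean_sq
            for lag in range(max_lag)
        ])

    # Synthetic white-noise Poisson trace: g2 ~ 1 + 1/mean at lag 0, ~1 after
    rng = np.random.default_rng(1)
    counts = rng.poisson(5.0, size=10_000)
    print(g2(counts, max_lag=5))
    ```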

    Instruments on large optical telescopes -- A case study

    In the distant past, telescopes were known, first and foremost, for the sizes of their apertures. Advances in technology are now enabling astronomers to build extremely powerful instruments, to the extent that instruments have achieved an importance comparable to, or even exceeding, that usually accorded to the apertures of the telescopes. However, the cost of successive generations of instruments has risen at a rate noticeably above that of inflation. Here, given the vast sums of money now being expended on optical telescopes and their instrumentation, I argue that astronomers must undertake "cost-benefit" analysis for future planning. I use the scientific output of the first two decades of the W. M. Keck Observatory as a laboratory for this purpose. I find, in the absence of upgrades, that the time to reach peak paper production for an instrument is about six years. The prime lifetime of instruments (sans upgrades), as measured by citation returns, is about a decade. Well-thought-out and timely upgrades increase and sometimes even double the useful lifetime. I investigate how well instrument builders are rewarded, finding acknowledgements ranging from almost 100% to as low as 60%. Next, given the increasing cost of operating optical telescopes, the management of existing observatories continues to seek new partnerships. This naturally raises the question: "What is the cost of a single night of telescope time?" I provide a rational basis to compute this quantity. I end the paper with some thoughts on the future of large ground-based optical telescopes, bearing in mind the explosion of synoptic precision photometric, astrometric, and imaging surveys across the electromagnetic spectrum, the increasing cost of instrumentation, and the rise of mega instruments. Comment: Revised from previous submission (typos fixed, table 6 was garbled). Submitted to PAS
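
    The "cost of a single night" question comes down to amortization arithmetic: spread the capital cost over the facility's lifetime, add annual operations, and divide by the number of usable nights. A hedged sketch of that calculation; the function and all numbers below are placeholder assumptions, not the paper's actual model:

    ```python
    def cost_per_night(capital_cost, lifetime_yr, annual_operations,
                       usable_nights_per_yr=300):
        """Back-of-the-envelope cost of one telescope night: linear
        amortization of capital plus yearly operations, per usable night."""
        annual_capital = capital_cost / lifetime_yr
        return (annual_capital + annual_operations) / usable_nights_per_yr

    # Placeholder inputs, for illustration only (not from the paper)
    print(f"${cost_per_night(150e6, 30, 20e6):,.0f} per night")
    ```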