Advanced sensors technology survey
This project assesses the state of the art in advanced or 'smart' sensor technology for NASA Life Sciences research applications, with an emphasis on sensors with potential applications on Space Station Freedom (SSF). The objectives are: (1) to conduct literature reviews on relevant advanced sensor technology; (2) to interview scientists and engineers in industry, academia, and government who are knowledgeable on this topic; (3) to provide viewpoints and opinions regarding the potential applications of this technology on the SSF; and (4) to provide summary charts of the relevant technologies and the centers where they are being developed.
On evolution of CMOS image sensors
CMOS image sensors have become the principal technology in the majority of digital cameras. Over the last decade they replaced film and charge-coupled devices, promising lower cost, lower power requirements, higher integration, and the potential for focal-plane processing. The principal factor behind their success, however, has been the ability to exploit CMOS technology shrinkage to make smaller pixels, and thereby offer more resolution without increasing cost. With the image-sensor market exploding thanks to integration with communication and computation devices, technology developers have improved CMOS processes for better optical performance. Nevertheless, the promises of focal-plane processing and on-chip integration have not been fulfilled. The market is still pushed by the desire for higher pixel counts and better image quality, yet differentiation is becoming difficult for any image sensor manufacturer. In this paper, we explore potential disruptive growth directions for CMOS image sensors and ways to achieve them.
Time-to-digital converters and histogram builders in SPAD arrays for pulsed-LiDAR
Light Detection and Ranging (LiDAR) is a 3D imaging technique widely used in applications such as augmented reality, automotive sensing, machine vision, and spacecraft navigation and landing. Pulsed LiDAR, one of the most widespread LiDAR techniques, relies on measuring the round-trip travel time of an optical pulse back-scattered from a distant target. Besides the light source and the detector, Time-to-Digital Converters (TDCs) are fundamental components of pulsed-LiDAR systems, since they measure the back-scattered photon arrival times, and their performance directly impacts LiDAR system requirements (i.e., range, precision, and measurement rate). In this work, we review recent TDC architectures suitable for integration in SPAD-based CMOS arrays, together with data-processing solutions for deriving the TOF information. Furthermore, the main TDC parameters and processing techniques are described and analyzed against pulsed-LiDAR requirements.
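The core pulsed-LiDAR computation this abstract describes — time-tagging back-scattered photons, histogramming the tags, and converting the round-trip time of the histogram peak into distance — can be sketched as follows. This is a minimal illustration with hypothetical numbers: the function name, the 100 ps bin width, and the simulated timing jitter are assumptions, not details from the paper.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def tof_to_distance(timestamps_s, bin_width_s=100e-12):
    """Histogram photon arrival times; the peak bin gives the round-trip TOF."""
    bins = np.arange(0.0, timestamps_s.max() + bin_width_s, bin_width_s)
    counts, edges = np.histogram(timestamps_s, bins=bins)
    peak_tof = edges[np.argmax(counts)] + bin_width_s / 2  # bin center
    return C * peak_tof / 2  # halve the round-trip path

# Simulated acquisition: target at 15 m (~100 ns round trip), 50 ps TDC jitter
rng = np.random.default_rng(seed=0)
true_tof = 2 * 15.0 / C
stamps = rng.normal(true_tof, 50e-12, size=10_000)
distance_m = tof_to_distance(stamps)  # within ~1.5 cm (one bin) of 15 m
```

The 100 ps bin width sets the single-bin range resolution at c x 100 ps / 2 = 1.5 cm, which is why the TDC resolution figures quoted in such papers translate directly into ranging precision.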
Fast-Gated 16 x 16 SPAD Array With 16 on-Chip 6 ps Time-to-Digital Converters for Non-Line-of-Sight Imaging
We present the design and characterization of a fully integrated array of 16 x 16 Single-Photon Avalanche Diodes (SPADs) with fast-gating capabilities and 16 on-chip 6 ps time-to-digital converters, embedded in a compact imaging module. The sensor has been developed for Non-Line-Of-Sight imaging applications, which require: i) a narrow instrument response function, for centimeter-accurate single-shot precision; ii) fast-gated SPADs, for time-filtering of directly reflected photons; and iii) high photon detection probability, for acquiring faint signals undergoing multiple scattering events. Thanks to a novel multiple differential SPAD-SPAD sensing approach, the SPAD detectors can be activated in less than 500 ps, and the full width at half maximum of the instrument response function is always below 75 ps (60 ps on average). Temporal responses are uniform throughout the gate window, showing only a few picoseconds of time dispersion when 30 ns gate pulses are applied, while the differential non-linearity is as low as 250 fs. With a peak photon detection probability of 70% at 490 nm, a fill factor of 9.6%, and up to 1.6 × 10^8 photon time-tagging measurements per second, the sensor fulfills the demand for fully integrated imaging solutions optimized for non-line-of-sight imaging, cutting exposure times while also optimizing size, weight, power, and cost, thus paving the way for further scaled architectures.
Classically entangled optical beams for high-speed kinematic sensing
Tracking the kinematics of fast-moving objects is an important diagnostic tool for science and engineering. Existing optical methods include high-speed CCD/CMOS imaging, streak cameras, lidar, serial time-encoded imaging, and sequentially timed all-optical mapping. Here, we demonstrate an entirely new approach to positional and directional sensing based on the concept of classical entanglement in vector beams of light. The measurement principle relies on the intrinsic correlations existing in such beams between transverse spatial modes and polarization. The latter can be determined from intensity measurements with only a few fast photodiodes, greatly outperforming the bandwidth of current CCD/CMOS devices. In this way, our setup enables two-dimensional real-time sensing with temporal resolution in the GHz range. We expect the concept to open up new directions in photonics-based metrology and sensing.
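The measurement principle — recovering the polarization state, and through it the transverse position, from a handful of fast photodiode readings — can be illustrated with the standard Stokes-parameter construction. This is a sketch only: the function name, the six-detector arrangement, and the cos(2φ)/sin(2φ) beam model are illustrative assumptions, not details taken from the paper.

```python
import math

def stokes_from_intensities(i_h, i_v, i_d, i_a, i_r, i_l):
    """Normalized Stokes parameters (s1, s2, s3) from six photodiode readings:
    horizontal/vertical, diagonal/anti-diagonal, right/left circular."""
    s0 = i_h + i_v  # total intensity
    return (i_h - i_v) / s0, (i_d - i_a) / s0, (i_r - i_l) / s0

# Assumed toy model for a radially polarized vector beam: light scattered at
# azimuthal position phi carries s1 = cos(2*phi), s2 = sin(2*phi), so two
# Stokes parameters recover the angular position from intensities alone.
s1, s2, _ = stokes_from_intensities(0.75, 0.25, 0.933, 0.067, 0.5, 0.5)
phi = 0.5 * math.atan2(s2, s1)  # ~pi/6 for these hypothetical readings
```

Since each Stokes parameter is a difference of two photodiode signals, the position estimate is limited only by photodiode bandwidth, which is the source of the GHz-range temporal resolution claimed in the abstract.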
Non-line-of-sight tracking of people at long range
A remote-sensing system that can determine the position of hidden objects has applications in many critical real-life scenarios, such as search-and-rescue missions and safe autonomous driving. Previous work has shown the ability to range and image objects hidden from the direct line of sight, employing advanced optical imaging technologies aimed at small objects at short range. In this work we demonstrate a long-range tracking system based on single-laser illumination and single-pixel single-photon detection. This enables us to track one or more people hidden from view at a stand-off distance of over 50 m. These results pave the way towards next-generation LiDAR systems that will reconstruct not only the direct-view scene but also the main elements hidden behind walls or corners.
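The ranging idea behind this kind of non-line-of-sight tracking — converting photon travel times into ranges from several illuminated spots on a relay wall, then intersecting those ranges to locate the hidden target — can be sketched as a toy 2D multilateration. All names, the geometry, and the brute-force solver below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def locate(wall_points, ranges, xs, ys):
    """Brute-force grid search for the point whose distances to the wall
    points best match the measured ranges (least-squares residual)."""
    best, best_err = None, float("inf")
    for x in xs:
        for y in ys:
            p = np.array([x, y])
            err = sum((np.linalg.norm(p - w) - r) ** 2
                      for w, r in zip(wall_points, ranges))
            if err < best_err:
                best, best_err = p, err
    return best

# Hypothetical setup: three laser spots along a relay wall (y = 0) and the
# one-way ranges to a hidden target at (3, 4), as would be derived from the
# measured photon round-trip times after subtracting the known wall paths.
wall = [np.array([0.0, 0.0]), np.array([2.0, 0.0]), np.array([5.0, 0.0])]
target = np.array([3.0, 4.0])
ranges = [float(np.linalg.norm(target - w)) for w in wall]
grid = np.arange(0.0, 6.01, 0.1)
estimate = locate(wall, ranges, grid, grid)  # close to (3, 4)
```

A real system replaces the grid search with a proper solver and must handle timing jitter and multiple scattering, but the geometric core — intersecting range shells anchored at wall points — is the same.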
- …