
    Hand gesture recognition based on signals cross-correlation


    Smart CMOS image sensor for 3D measurement

    3D measurement is concerned with extracting visual information from the geometry of visible surfaces and interpreting the resulting 3D coordinate data to detect or track the position of an object or reconstruct its profile, often in real time. Such systems require image sensors with high position-estimation accuracy and a high data-processing frame rate to handle large volumes of data. A standard imager cannot meet the requirements of fast image acquisition and processing, the two figures of merit for 3D measurement, so dedicated VLSI imager architectures are indispensable for these high-performance sensors. CMOS imaging technology offers the potential to integrate image-processing algorithms on the focal plane of the device, resulting in smart image sensors capable of better processing of massive image data. The objective of this thesis is to present a new smart CMOS image sensor architecture for real-time 3D measurement using sheet-beam projection methods based on active triangulation. By organizing the vision sensor as an ensemble of linear sensor arrays, all working in parallel and processing the image in slices, the complexity of the image-processing task drops from O(N²) to O(N). The design is also inherently parallel, providing the massive parallel processing at high frame rate required in 3D computation problems. This work demonstrates a prototype of the smart linear sensor incorporating full testability features for test and debug at both device and system levels. The salient features of this work are asynchronous position-to-pulse-stream conversion, multiple-image binarization, high parallelism, and a modular architecture, yielding a frame rate and sub-pixel resolution suitable for real-time 3D measurement.
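
    To make the sheet-beam triangulation idea concrete, here is a minimal Python sketch (not the thesis's VLSI design) of the O(N)-per-slice principle: each image row is processed independently to find the laser-stripe column with sub-pixel refinement, then depth is triangulated. The baseline, focal length, sheet angle, and pinhole geometry are all assumed illustrative parameters.

    ```python
    import numpy as np

    def stripe_depth_map(frame, baseline_m=0.10, focal_px=800.0,
                         sheet_angle_rad=np.deg2rad(30.0)):
        """Row-parallel sheet-beam triangulation sketch (assumed geometry).

        Each row is processed independently, mirroring the linear-array
        idea: locate the brightest column (the projected laser stripe),
        refine it to sub-pixel accuracy, then triangulate depth.
        """
        h, w = frame.shape
        c = np.clip(frame.argmax(axis=1), 1, w - 2)  # stripe peak per row
        rows = np.arange(h)
        l, m, r = frame[rows, c - 1], frame[rows, c], frame[rows, c + 1]
        denom = l - 2.0 * m + r
        # 3-point parabolic interpolation for sub-pixel stripe position
        u = c + np.where(denom != 0, 0.5 * (l - r) / denom, 0.0)
        # Assumed pinhole model: depth from stripe offset vs. sheet angle
        x = u - w / 2.0
        return baseline_m * focal_px / (x + focal_px * np.tan(sheet_angle_rad))
    ```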

    Coded access optical sensor (CAOS) imager and applications

    Starting in 2001, we proposed and extensively demonstrated, using a Digital Micromirror Device (DMD), an agile-pixel Spatial Light Modulator (SLM)-based optical imager built on single-pixel photo-detection (also called a single-pixel camera) that is suited to operation with both coherent and incoherent light across broad spectral bands. That imager operates with the agile pixels programmed in a limited-SNR, staring, time-multiplexed mode, acquiring image irradiance (i.e., intensity) data one agile pixel at a time across the SLM plane where the incident image radiation falls. Motivated by modern advances in RF wireless, optical wired communications, and electronic signal-processing technologies, and building on our prior-art SLM-based imager design, we describe, using a surprisingly simple approach, a new imager design called the Coded Access Optical Sensor (CAOS) that can alleviate some key fundamental limitations of prior imagers. The agile pixel in the CAOS imager can operate in different time-frequency coding modes such as Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA), and Time Division Multiple Access (TDMA). Data from a first CAOS camera demonstration are described, along with novel designs of CAOS-based optical instruments for various applications.
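
    As a hedged illustration of the CDMA coding mode (a sketch, not the demonstrated CAOS hardware), the Python below flicker-codes each agile pixel with a distinct Walsh-Hadamard sequence, sums the coded signals into a single point-photodetector time series, and recovers the pixel irradiances by correlation decoding; in real hardware the DMD realizes the bipolar codes optically.

    ```python
    import numpy as np
    from scipy.linalg import hadamard

    def caos_cdma_roundtrip(pixel_irradiances):
        """CDMA-mode sketch: orthogonal time codes per agile pixel,
        one summed photodetector time series, correlation decode."""
        n_pix = pixel_irradiances.size
        n = 1 << max(1, (n_pix - 1).bit_length())  # Hadamard order (power of 2)
        codes = hadamard(n)[:n_pix].astype(float)  # one +/-1 code per pixel
        detector = codes.T @ pixel_irradiances     # single-detector time signal
        return codes @ detector / n                # orthogonality recovers pixels

    print(caos_cdma_roundtrip(np.array([0.2, 0.9, 0.5, 0.1])))
    # -> [0.2 0.9 0.5 0.1]
    ```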

    Optical Camera Communications: Principles, Modulations, Potential and Challenges

    Optical wireless communications (OWC) are emerging as cost-effective and practical alternatives to congested radio-frequency wireless technologies. Within OWC, optical camera communications (OCC) have become very attractive given recent developments in cameras and the prevalence of cameras fitted in smart devices. OCC, together with visible light communications (VLC), is being considered within the framework of the IEEE 802.15.7m standardization. OCC systems based on both organic and inorganic light sources, as well as cameras, are being considered for low-rate transmission and localization in indoor and outdoor short-range applications. This paper introduces the underlying principles of OCC and gives a comprehensive overview of this emerging technology, including recent standardization activities. It also outlines key technical issues such as mobility, coverage, interference, and performance enhancement. Future research directions and open issues are also presented.
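
    To illustrate the basic receiver principle (a minimal sketch under assumed parameters, not a scheme from the IEEE 802.15.7m standard), the snippet below decodes frame-rate on-off keying: the LED holds each bit for one camera frame, and bits are recovered by thresholding the mean intensity of the LED's region of interest.

    ```python
    import numpy as np

    def decode_ook_frames(frames, roi=(slice(100, 120), slice(200, 220))):
        """Frame-rate OOK decoding sketch for optical camera communications.

        `frames` is an iterable of grayscale images; `roi` (assumed known,
        e.g., from prior LED detection) selects the transmitter region.
        One bit per frame, recovered with a midpoint threshold.
        """
        levels = np.array([f[roi].mean() for f in frames])
        threshold = 0.5 * (levels.min() + levels.max())
        return (levels > threshold).astype(int)
    ```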

    Quantum-inspired computational imaging

    Computational imaging combines measurement and computational methods with the aim of forming images even when the measurement conditions are weak, few in number, or highly indirect. The recent surge in quantum-inspired imaging sensors, together with a new wave of algorithms allowing on-chip, scalable, and robust data processing, has spurred activity with notable results in low-light-flux imaging and sensing. We provide an overview of the major challenges encountered in low-illumination (e.g., ultrafast) imaging and how these problems have recently been addressed for imaging applications in extreme conditions. These methods provide examples of the future imaging solutions to be developed, for which the best results are expected to arise from an efficient codesign of the sensors and data-analysis tools.
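
    As one concrete instance of low-flux computational imaging (a sketch assuming a single-photon time-of-flight setup, not a specific method from the overview), the code below estimates depth from a sparse photon time-of-arrival histogram by matched filtering against the instrument response, which approximates the maximum-likelihood delay under Poisson noise.

    ```python
    import numpy as np

    def depth_from_photon_histogram(hist, irf, bin_width_s=16e-12, c=2.998e8):
        """Single-photon LiDAR sketch: matched-filter depth estimation.

        `hist` is the photon-count histogram over time bins and `irf` the
        measured instrument response; both are assumed inputs. The peak of
        their cross-correlation gives the round-trip delay estimate.
        """
        corr = np.correlate(hist, irf, mode="full")
        delay_bins = int(corr.argmax()) - (len(irf) - 1)
        round_trip_s = delay_bins * bin_width_s
        return 0.5 * c * round_trip_s        # one-way distance in meters
    ```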

    Event-based Vision: A Survey

    Event cameras are bio-inspired sensors that differ from conventional frame cameras: instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes and output a stream of events that encode the time, location, and sign of the brightness changes. Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (on the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low power consumption, and high pixel bandwidth (on the order of kHz), resulting in reduced motion blur. Hence, event cameras have large potential for robotics and computer vision in scenarios that are challenging for traditional cameras, such as low-latency, high-speed, and high-dynamic-range settings. However, novel methods are required to process the unconventional output of these sensors in order to unlock their potential. This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras. We present event cameras from their working principle, the actual sensors that are available, and the tasks they have been used for, from low-level vision (feature detection and tracking, optic flow, etc.) to high-level vision (reconstruction, segmentation, recognition). We also discuss the techniques developed to process events, including learning-based techniques, as well as specialized processors for these novel sensors, such as spiking neural networks. Additionally, we highlight the challenges that remain to be tackled and the opportunities that lie ahead in the search for a more efficient, bio-inspired way for machines to perceive and interact with the world.
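
    The working principle can be stated compactly: a pixel emits an event whenever its log intensity has changed by a contrast threshold C since that pixel's last event. Below is a minimal sketch of this standard idealized model (threshold value assumed; per-frame timestamps, no noise or refractory effects), simulating an event stream from a frame sequence.

    ```python
    import numpy as np

    def events_from_frames(frames, timestamps, C=0.25):
        """Idealized event-generation model: emit (t, x, y, polarity) when
        the per-pixel log intensity moves by >= C from its last event level.
        C is an assumed contrast threshold; real sensors add noise, latency,
        and refractory periods, and give events sub-frame timestamps.
        """
        eps = 1e-6
        ref = np.log(frames[0].astype(float) + eps)  # last event log level
        events = []
        for f, t in zip(frames[1:], timestamps[1:]):
            logf = np.log(f.astype(float) + eps)
            while True:
                diff = logf - ref
                ys, xs = np.nonzero(np.abs(diff) >= C)
                if ys.size == 0:
                    break                            # all pixels within C
                pol = np.sign(diff[ys, xs]).astype(int)
                events.extend(zip([t] * ys.size, xs.tolist(),
                                  ys.tolist(), pol.tolist()))
                ref[ys, xs] += C * pol               # step reference by C
        return events
    ```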

    Single-chip CMOS tracking image sensor for a complex target


    Detector Improvements and Optimization to Advance Gravitational-wave Astronomy

    This thesis covers a range of topics relevant to current and future gravitational-wave facilities. After the last science observing run, O3, which ended in March 2020, the aLIGO and Virgo gravitational-wave detectors are undergoing upgrades to improve their sensitivity. My thesis focuses on work done at the LIGO Hanford Observatory to facilitate these upgrade activities.

    I worked to develop two novel technologies with applications to gravitational-wave detectors. First, I developed a high-bandwidth, low-noise, flexure-based piezo-deformable mirror for active mode matching. Mode-matching losses limit improvements from squeezing because they distort the ground state of the squeezed beam; for broadband sensitivity improvements from frequency-dependent squeezing, it is critical to keep mode-mismatch losses low. These piezo-deformable mirrors are being installed at the aLIGO facilities. Second, I developed and tested a high-resolution wavefront sensor that employs a time-of-flight sensor. By phase-locking the demodulation signal of the time-of-flight sensor to the incident modulated laser beam, this camera can sense higher-order mode distortions of the incident beam.

    Cosmic Explorer is a proposed next-generation gravitational-wave observatory in the United States, planned to be operational by the mid-2030s. Together with the Einstein Telescope, it will form a network of next-generation gravitational-wave detectors. I propose a science-goal-focused, tunable design for the Cosmic Explorer detectors that allows the sensitivity to be tuned at low, mid, and high frequencies. These tuning options give Cosmic Explorer the flexibility to target a diverse set of science goals with the same detector infrastructure; the technological challenges to achieving these tunable configurations are presented. I find that a 40 km Cosmic Explorer detector outperforms a 20 km detector in all key science goals other than access to post-merger physics, suggesting that Cosmic Explorer should include at least one 40 km facility. I also explore the detection prospects of core-collapse supernovae with the third-generation facilities, Cosmic Explorer and the Einstein Telescope, and find that the weak gravitational-wave signature of core-collapse supernovae limits likely sources to within our galaxy, corresponding to a low event rate of about two per century.
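
    To make the mode-matching point quantitative, here is a minimal sketch using the standard overlap-integral result for two fundamental Gaussian modes: the power coupling for waist radii w1 and w2 with an axial waist offset dz, and the corresponding mode-mismatch loss 1 - coupling. The numerical example is illustrative; 1064 nm is the detectors' laser wavelength.

    ```python
    import numpy as np

    def gaussian_mode_coupling(w1, w2, dz, wavelength=1064e-9):
        """Power coupling of two fundamental Gaussian modes (standard
        overlap-integral formula): waist radii w1, w2 in meters, axial
        waist separation dz in meters. Mode-mismatch loss = 1 - coupling.
        """
        ratio = w1 / w2 + w2 / w1
        axial = wavelength * dz / (np.pi * w1 * w2)
        return 4.0 / (ratio**2 + axial**2)

    # Example: a 2% waist-size error with matched waist positions
    print(1.0 - gaussian_mode_coupling(1.02e-3, 1.00e-3, 0.0))  # ~4e-4 loss
    ```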