
    Event-based Vision: A Survey

    Event cameras are bio-inspired sensors that differ from conventional frame cameras: instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes and output a stream of events that encode the time, location, and sign of each brightness change. Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (on the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low power consumption, and high pixel bandwidth (on the order of kHz), resulting in reduced motion blur. Hence, event cameras have large potential for robotics and computer vision in scenarios that are challenging for traditional cameras, such as low-latency, high-speed, and high-dynamic-range settings. However, novel methods are required to process the unconventional output of these sensors in order to unlock their potential. This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras. We present event cameras from their working principle, the actual sensors that are available, and the tasks that they have been used for, from low-level vision (feature detection and tracking, optic flow, etc.) to high-level vision (reconstruction, segmentation, recognition). We also discuss the techniques developed to process events, including learning-based techniques, as well as specialized processors for these novel sensors, such as spiking neural networks. Additionally, we highlight the challenges that remain to be tackled and the opportunities that lie ahead in the search for a more efficient, bio-inspired way for machines to perceive and interact with the world.
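The event-generation model the abstract describes (an event fires whenever the per-pixel log-brightness change crosses a contrast threshold) can be sketched in a few lines. This is a simplified, frame-based simulation for illustration only, not the survey's formulation: the `Event` fields, the `threshold` value, and the frame-difference approximation are all assumptions; real sensors operate asynchronously in analog.

```python
import math
from dataclasses import dataclass

@dataclass
class Event:
    t: float       # timestamp (seconds)
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 brightness increase, -1 decrease

def generate_events(frames, timestamps, threshold=0.2):
    """Naive event generation from a sequence of intensity frames.

    An event fires each time the log-intensity at a pixel moves by
    `threshold` relative to the reference stored at the last event
    (hypothetical simplified model of an event camera pixel).
    """
    eps = 1e-6  # avoid log(0)
    h, w = len(frames[0]), len(frames[0][0])
    # per-pixel reference log-intensity, initialized from the first frame
    ref = [[math.log(frames[0][y][x] + eps) for x in range(w)] for y in range(h)]
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        for y in range(h):
            for x in range(w):
                logi = math.log(frame[y][x] + eps)
                # emit one event per threshold crossing, updating the reference
                while logi - ref[y][x] >= threshold:
                    ref[y][x] += threshold
                    events.append(Event(t, x, y, +1))
                while ref[y][x] - logi >= threshold:
                    ref[y][x] -= threshold
                    events.append(Event(t, x, y, -1))
    return events
```

For example, doubling one pixel's brightness (a log change of ln 2 ≈ 0.69) with a threshold of 0.2 yields three positive events at that pixel, while unchanged pixels emit nothing.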

    Analysis of encoding degradation in spiking sensors due to spike delay variation

    Spiking sensors such as the silicon retina and cochlea encode analog signals into massively parallel, asynchronous spike-train output in which the information is carried by the precise spike timing. Variation of the spike timing introduced during spike transmission degrades the encoding quality. Using the signal-to-distortion ratio (SDR) metric with nonlinear spike-train decoding based on frame theory, two sources of delay variation, the comparison delay TDC and the queueing delay TDQ, are evaluated for two encoding mechanisms that have been used in implementations of silicon array spiking sensors: asynchronous delta modulation and self-timed reset. As specific examples, TDC is obtained from a 2T current-mode comparator, and TDQ is obtained from an M/D/1 queue for 1-D sensors like the silicon cochlea and an MX/D/1 queue for 2-D sensors like the silicon retina. Quantitative relations between the SDR and the circuit and system parameters of spiking sensors are established. The analysis method presented in this work will be useful for future specifications-guided designs of spiking sensors.
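The degradation mechanism the abstract studies can be illustrated with a toy version of one of the two encodings it names, asynchronous delta modulation: encode a 1-D signal into timed spikes, reconstruct it, and compare the SDR with and without a spike delay. This is only an illustrative sketch under strong simplifications, not the paper's analysis: the paper uses frame-theoretic nonlinear decoding and circuit-derived delay distributions, whereas here the decoder is a piecewise-constant reconstruction and the comparison delay is modeled as a constant shift of all spike times.

```python
import math

def adm_encode(signal, ts, delta=0.1):
    """Asynchronous delta modulation (toy model): emit a signed spike
    each time the signal moves `delta` away from a running reference."""
    ref = signal[0]
    spikes = []  # list of (time, sign) pairs
    for s, t in zip(signal, ts):
        while s - ref >= delta:
            ref += delta
            spikes.append((t, +1))
        while ref - s >= delta:
            ref -= delta
            spikes.append((t, -1))
    return spikes

def adm_decode(spikes, ts, init, delta=0.1):
    """Piecewise-constant reconstruction: step by +/- delta at each spike."""
    out, ref, i = [], init, 0
    for t in ts:
        while i < len(spikes) and spikes[i][0] <= t:
            ref += spikes[i][1] * delta
            i += 1
        out.append(ref)
    return out

def sdr_db(signal, recon):
    """Signal-to-distortion ratio in dB."""
    p_sig = sum(s * s for s in signal)
    p_err = sum((s - r) ** 2 for s, r in zip(signal, recon))
    return 10 * math.log10(p_sig / p_err)
```

Running this on a 1 Hz sine sampled at 1 kHz shows the effect: with undelayed spikes the reconstruction error is bounded by the step size `delta`, while shifting every spike by a fixed 20 ms (a crude stand-in for delay variation such as TDC) makes the reconstruction lag the signal and lowers the SDR.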
