A high dynamic range image sensor with linear response based on asynchronous event detection
This paper investigates the potential of an image sensor that combines event-based asynchronous outputs with conventional integration of photocurrents. Pixel voltages can be read out following a traditional approach with a source follower and analog-to-digital converter. In addition, pixels have circuitry to implement Pulse Density Modulation (PDM), sending out pulses at a frequency proportional to the photocurrent. Both readout approaches operate simultaneously, and their information is combined to render high dynamic range images. In this paper, we explain the new vision sensor concept and develop a theoretical analysis of the expected performance in a standard AMS 0.18 µm HV technology. Moreover, we provide a description of the vision sensor architecture and its main blocks.
Ministerio de Economía, Industria y Competitividad TEC2012-38921-C02-02; European Union IPT-2011-1625-430000; Office of Naval Research (USA) N00014-14-1-035
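The dual readout described in this abstract can be illustrated with a toy numerical model: the integrating (ADC) path saturates at the full-well charge, while the PDM event count keeps growing linearly with photocurrent, so combining the two extends the linear dynamic range. This is only a sketch; all function names and constants below are hypothetical, not taken from the paper.

```python
def adc_readout(photocurrent, t_frame=1.0, full_well=100.0):
    """Conventional integration: charge ramps with photocurrent, then clips."""
    return min(photocurrent * t_frame, full_well)

def pdm_event_count(photocurrent, t_frame=1.0, q_event=1.0):
    """PDM: one pulse each time the integrated charge crosses q_event,
    so the pulse count is proportional to the photocurrent."""
    return int(photocurrent * t_frame / q_event)

def hdr_estimate(photocurrent, t_frame=1.0, full_well=100.0, q_event=1.0):
    """Use the linear ADC value while unsaturated; otherwise fall back to
    the event count, which stays linear far beyond the ADC's range."""
    v = adc_readout(photocurrent, t_frame, full_well)
    if v < full_well:                     # ADC still in its linear region
        return v / t_frame
    n = pdm_event_count(photocurrent, t_frame, q_event)
    return n * q_event / t_frame          # reconstruct from the pulse count

# A dim pixel is recovered from the ADC; a 100x brighter (saturated) one
# is recovered from its PDM event count instead.
dim = hdr_estimate(10)
bright = hdr_estimate(10000)
```

In this toy model the combined estimate stays linear across the whole range, which is the core idea behind merging both readouts into a single high dynamic range image.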
Event-based Vision: A Survey
Event cameras are bio-inspired sensors that differ from conventional frame
cameras: Instead of capturing images at a fixed rate, they asynchronously
measure per-pixel brightness changes, and output a stream of events that encode
the time, location and sign of the brightness changes. Event cameras offer
attractive properties compared to traditional cameras: high temporal resolution
(in the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low
power consumption, and high pixel bandwidth (on the order of kHz) resulting in
reduced motion blur. Hence, event cameras have great potential for robotics
and computer vision in scenarios that challenge traditional cameras, such as
those demanding low latency, high speed, and high dynamic range. However, novel methods are
required to process the unconventional output of these sensors in order to
unlock their potential. This paper provides a comprehensive overview of the
emerging field of event-based vision, with a focus on the applications and the
algorithms developed to unlock the outstanding properties of event cameras. We
present event cameras from their working principle, through the sensors that
are available, to the tasks they have been applied to, from low-level vision
(feature detection and tracking, optic flow, etc.) to high-level vision
(reconstruction, segmentation, recognition). We also discuss the techniques
developed to process events, including learning-based techniques, as well as
specialized processors for these novel sensors, such as spiking neural
networks. Additionally, we highlight the challenges that remain to be tackled
and the opportunities that lie ahead in the search for a more efficient,
bio-inspired way for machines to perceive and interact with the world.
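The per-pixel brightness-change encoding summarized in this survey is commonly modeled as follows: a pixel fires an event whenever its log-intensity moves by a contrast threshold from the level stored at its last event, outputting the time, location, and sign of the change. The sketch below illustrates that model; the threshold and signal values are illustrative, not tied to any particular sensor.

```python
import math

def events_from_signal(timestamps, intensities, x, y, C=0.2):
    """Emit (t, x, y, polarity) events whenever log-intensity moves by the
    contrast threshold C from the level at the pixel's last event."""
    events = []
    ref = math.log(intensities[0])       # log level at the last event
    for t, i in zip(timestamps[1:], intensities[1:]):
        logi = math.log(i)
        while abs(logi - ref) >= C:      # may emit several events per step
            pol = 1 if logi > ref else -1
            ref += pol * C               # update the stored reference level
            events.append((t, x, y, pol))
    return events

# A pixel that brightens then dims back produces a burst of ON events
# followed by a burst of OFF events.
ts = [0.0, 1.0, 2.0]
ev = events_from_signal(ts, [1.0, 1.5, 1.0], x=3, y=7)
```

Note that the output is a sparse, asynchronous stream per pixel rather than a dense frame, which is what makes the novel processing methods discussed in the survey necessary.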
A Bio-Inspired Vision Sensor With Dual Operation and Readout Modes
This paper presents a novel event-based vision sensor with two operation modes: intensity mode and spatial contrast detection. They can be combined with two different readout approaches: pulse density modulation and time-to-first-spike. The sensor is conceived as a node of a smart camera network made up of several independent and autonomous nodes that send information to a central one. The user can toggle the operation and readout modes with two control bits. The sensor has low latency (below 1 ms under average illumination conditions), low power consumption (19 mA), and reduced data flow when detecting spatial contrast. A new approach to computing the spatial contrast, based on inter-pixel event communication and less prone to mismatch effects than diffusive networks, is proposed. The sensor was fabricated in the standard AMS4M2P 0.35 µm process. A detailed system-level description and experimental results are provided.
Office of Naval Research (USA) N00014-14-1-0355; Ministerio de Economía y Competitividad TEC2012-38921-C02-02, P12-TIC-2338, IPT-2011-1625-43000
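The time-to-first-spike readout mentioned in this abstract can be sketched as follows: each pixel integrates its photocurrent and spikes once upon crossing a charge threshold, so brighter pixels spike earlier and the array can be read out in intensity order. Function names and values below are hypothetical, not from the paper.

```python
def ttfs_times(photocurrents, q_threshold=1.0):
    """Time to first spike is inversely proportional to photocurrent:
    the pixel fires when integrated charge reaches q_threshold."""
    return {pix: q_threshold / i for pix, i in photocurrents.items()}

def readout_order(photocurrents, q_threshold=1.0):
    """Pixels are read out in spike order: brightest first."""
    t = ttfs_times(photocurrents, q_threshold)
    return sorted(t, key=t.get)

# The brightest pixel ("b") crosses threshold first, the dimmest ("c") last.
order = readout_order({"a": 5.0, "b": 20.0, "c": 1.0})
```

Because intensity ordering falls out of the spike timing itself, this style of readout conveys relative brightness without a per-pixel ADC conversion.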
On the Analysis and Detection of Flames With an Asynchronous Spiking Image Sensor
We have investigated the capabilities of a custom asynchronous spiking image sensor operating in the near-infrared band to study flame radiation emissions, monitor their transient activity, and detect their presence. Asynchronous sensors have inherent capabilities, i.e., good temporal resolution, high dynamic range, and low data redundancy. This makes them competitive against infrared (IR) cameras and CMOS frame-based NIR imagers. In this paper, we analyze, discuss, and compare the experimental data measured with our sensor against results obtained with conventional devices. A set of measurements has been taken to study the flame emission levels and their transient variations. Moreover, a flame detection algorithm, adapted to our sensor's asynchronous outputs, has been developed. Results show that asynchronous spiking sensors have excellent potential for flame analysis and monitoring.
Universidad de Cádiz PR2016-07; Ministerio de Economía y Competitividad TEC2015-66878-C3-1-R; Junta de Andalucía TIC 2012-2338; Office of Naval Research (USA) N00014141035
A micropower centroiding vision processor
CED: Color Event Camera Dataset
Event cameras are novel, bio-inspired visual sensors, whose pixels output
asynchronous and independent timestamped spikes at local intensity changes,
called 'events'. Event cameras offer advantages over conventional frame-based
cameras in terms of latency, high dynamic range (HDR) and temporal resolution.
Until recently, event cameras have been limited to outputting events in the
intensity channel; however, recent advances have resulted in the development of
color event cameras, such as the Color-DAVIS346. In this work, we present and
release the first Color Event Camera Dataset (CED), containing 50 minutes of
footage with both color frames and events. CED features a wide variety of
indoor and outdoor scenes, which we hope will help drive forward event-based
vision research. We also present an extension of the event camera simulator
ESIM that enables simulation of color events. Finally, we present an evaluation
of three state-of-the-art image reconstruction methods that can be used to
convert the Color-DAVIS346 into a continuous-time, HDR, color video camera to
visualise the event stream, and for use in downstream vision applications.
Comment: Conference on Computer Vision and Pattern Recognition Workshop