
    Optimal biasing and physical limits of DVS event noise

    Under dim lighting conditions, the output of Dynamic Vision Sensor (DVS) event cameras is strongly affected by noise. Photon and electron shot noise causes a high rate of non-informative events that reduce the signal-to-noise ratio. DVS noise performance depends not only on the scene illumination but also on the user-controllable biasing of the camera. In this paper, we explore the physical limits of DVS noise, showing that the DVS photoreceptor is limited to a theoretical minimum of 2x photon shot noise, and we discuss how biasing the DVS with a high photoreceptor bias and an adequate source-follower bias approaches optimal noise performance. We support our conclusions with pixel-level measurements of a DAVIS346 and analysis of a theoretical pixel model.
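    A minimal numeric sketch of the shot-noise floor discussed above, assuming a hypothetical photon rate and averaging window (neither value is from the paper); the 2x factor applies to noise power, so the standard-deviation floor is sqrt(2) times the photon shot noise:

```python
import numpy as np

# Illustrative only: Poisson photon arrivals set the shot-noise floor, and the
# abstract's result says the photoreceptor can at best reach 2x that noise
# power. The photon rate and averaging time below are hypothetical values.

rng = np.random.default_rng(0)

photon_rate = 1e4        # photons/s at the photodiode (dim scene, assumed)
averaging_time = 1e-3    # effective photoreceptor averaging window, s (assumed)

mean_photons = photon_rate * averaging_time
shot_noise_std = np.sqrt(mean_photons)       # Poisson: variance = mean
floor_std = np.sqrt(2.0) * shot_noise_std    # 2x noise power -> sqrt(2) in std

samples = rng.poisson(mean_photons, size=100_000)
print(f"simulated shot-noise std: {samples.std():.2f} (theory {shot_noise_std:.2f})")
print(f"theoretical photoreceptor noise floor: {floor_std:.2f} photons rms")
```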

    Shining light on the DVS pixel: A tutorial and discussion about biasing and optimization

    The operation of the Dynamic Vision Sensor (DVS) event camera is controlled by the user through adjusting different bias parameters. These biases affect the response of the camera by controlling - among other parameters - the bandwidth, sensitivity, and maximum firing rate of the pixels. Besides determining the response of the camera to input signals, biases significantly impact its noise performance. Bias optimization is a multivariate process depending on the task and the scene, to which the user's knowledge about pixel design and non-idealities can be of great importance. In this paper, we go step-by-step along the signal pathway of the DVS pixel, shining light on its low-level operation and non-idealities, comparing pixel-level measurements with array-level measurements, and discussing how biasing and illumination affect the pixel's behavior. With the results and discussion presented, we aim to help DVS users achieve more hardware-aware camera utilization and modelling. Accepted at the 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 4th International Workshop on Event-Based Vision.

    Exploiting Alternating DVS Shot Noise Event Pair Statistics to Reduce Background Activity Rates

    Dynamic Vision Sensors (DVS) record "events" corresponding to pixel-level brightness changes, resulting in a data-efficient representation of a dynamic visual scene. As DVS cameras expand into increasingly diverse applications, non-ideal behaviors in their output under extreme sensing conditions are important to consider. Under low illumination (below ≈10 lux), their output begins to be dominated by shot noise events (SNEs), which increase the data output and obscure true signal. SNE rates can be controlled to some degree by tuning circuit parameters to reduce sensitivity or temporal response bandwidth, at the cost of signal loss. Alternatively, an improved understanding of SNE statistics can be leveraged to develop novel techniques for minimizing uninformative sensor output. We first explain a fundamental observation about sequential pairing of opposite-polarity SNEs based on pixel circuit logic and validate our theory using DVS recordings and simulations. Finally, we derive a practical result from this new understanding and demonstrate two novel biasing techniques shown to reduce SNEs by 50% and 80%, respectively, while still retaining sensitivity and/or temporal resolution.
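    The pairing observation above suggests a simple denoising sketch: drop both halves of an opposite-polarity event pair occurring at the same pixel within a short window. This is an illustrative reading of the statistic, not the paper's actual biasing technique; the event layout (t_us, x, y, p) and the 10 ms pairing window are assumptions:

```python
# Sketch of a pair-based shot-noise filter. Events are (t_us, x, y, p) tuples
# with polarity p in {+1, -1}; all names and thresholds are illustrative.

def filter_sne_pairs(events, max_pair_dt_us=10_000):
    """Drop both halves of an opposite-polarity pair at the same pixel within
    max_pair_dt_us microseconds; keep everything else."""
    last = {}                    # (x, y) -> index of last kept event there
    keep = [True] * len(events)
    for i, (t, x, y, p) in enumerate(events):
        j = last.get((x, y))
        if j is not None:
            tj, _, _, pj = events[j]
            if keep[j] and p == -pj and t - tj <= max_pair_dt_us:
                keep[i] = keep[j] = False   # likely a shot-noise pair
                last.pop((x, y))
                continue
        last[(x, y)] = i
    return [e for e, k in zip(events, keep) if k]

noise = [(0, 5, 5, +1), (800, 5, 5, -1)]     # paired SNE at one pixel
signal = [(100, 9, 9, +1), (300, 9, 9, +1)]  # same-polarity signal events
print(filter_sne_pairs(sorted(noise + signal)))  # [(100, 9, 9, 1), (300, 9, 9, 1)]
```

A real implementation would achieve this at the bias level rather than in post-processing, but the sketch captures why alternating pairs are separable from signal.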

    Utility and Feasibility of a Center Surround Event Camera


    Demystifying Event-based Sensor Biasing to Optimize Signal to Noise for Space Domain Awareness

    Neuromorphic dynamic vision sensors (DVS), often called event-based sensors (EBS), are a novel class of cameras that have recently shown potential to make a significant impact in the space domain awareness (SDA) community. Their biologically-inspired design simultaneously achieves high temporal resolution, wide dynamic range, low power consumption and sparse data output, making them an ideal fit for space applications. Although initial results for SDA are promising, these sensors typically exhibit elevated noise rates in dim conditions and have thus far failed to outperform conventional cameras in terms of limiting visual magnitude and sensitivity with high telescope scan rates. A hurdle for widespread adoption is a lack of general guidance regarding optimal camera biases (settings) for SDA. Prior studies either serve as proof of concept or focus on algorithm development; however, to date, none have provided detailed guidance on biasing EBS to optimize signal-to-noise ratio (SNR) for SDA tasks. The goal of this paper is to narrow the knowledge gap between EBS pixel biasing and resulting performance to optimize their capabilities for SDA. To accomplish this, we adopt a bottom-up approach, revisiting the pixel architecture to consider physics-based performance limitations. In an EBS, each pixel responds autonomously, generating "events" in response to local brightness changes within its field of view (FOV), and outputs a sparse representation of the visual scene where each event is encoded by a pixel address (x,y), a microsecond resolution timestamp (t), and a single bit polarity value (p) indicating either an increase or decrease in brightness by a defined threshold. In most camera models, behavior is fine-tuned by adjusting roughly a half-dozen biases, including threshold levels (sensitivity), bandwidth (speed of the front-end photoreceptor), and refractory period (dead-time between events in a given pixel).
These parameters make EBS cameras adaptable for varied applications, but many degrees of freedom present a challenge for optimization. Researchers unfamiliar with the technology can be overwhelmed by the myriad of biasing options and must either rely on a prescribed set of biases or manually adjust them to achieve desired performance; the latter is not typically recommended for non-experts due to 2nd-order effects such as excessive noise rates. Manufacturer default biases are considered optimized for a broad range of applications, but recent studies have demonstrated that non-conventional bias techniques can significantly reduce background noise in dim conditions while still retaining signal, suggesting that SDA capabilities could be improved by a more sophisticated biasing strategy. By conducting a detailed study of how sensitivity, response speed, and noise rates scale with varied bias configurations, we aim to approach an optimal SNR bias configuration and demonstrate the maximal capabilities of current generation COTS EBS cameras for SDA. To systematically analyze and benchmark performance against a calibrated and repeatable stimulus, we developed a custom SDA test-bench to simulate stars/satellites as sub-pixel point source targets of variable speed and brightness. The set-up includes an integrating light box to provide a calibrated flat-field illumination source, a custom 170 mm radius anodized aluminum disk with precision-drilled holes of diameters ranging from 100 to 250 microns, and a digitally programmable motor capable of precise speed control from ~0.1 to 800 RPM. The disk is backlit by the flat-field illumination source and connected to the motor shaft, and a 7 x 10 cm region is viewed through a Fujinon 1:1.8/7-70mm CS mount lens at a distance of 50 cm. The FOV and zoom are chosen such that the dimension of the largest holes is still sub-pixel in diameter when in focus.
Even with the ability to rapidly collect measurements with this setup, the overall parameter space is still too large to fully explore without any a priori knowledge about how the sensor responds to signal and noise, and how this depends on biases. As a result, we consider fundamental pixel behaviors to devise an efficient test strategy. We first consider strategies to limit noise rates, as these can overwhelm sensor readout when the background is dark. In prior work, this was presumably accomplished by either reducing the bandwidth biases or increasing threshold biases, but these approaches inherently limit signal. Instead of this naive approach, we draw inspiration from two recent studies: the first demonstrated an optimal balance between two bandwidth-related biases accessible in some camera prototypes, and the second relies on a key observation about the statistical distribution of noise events to devise two additional biasing techniques to enhance SNR by allowing either lower thresholds or broader bandwidth settings. Using these techniques as a starting point, we examine the performance of the DAVIS346 EBS. We first report baseline performance using manufacturer default biases. To quantify performance, we measure sensitivity (dimmest point source detected) and bandwidth (fastest point source detected). Next, we tune bias settings with specific detection goals (i.e., maximum velocity and/or minimum brightness) and analyze the results. Finally, we apply newly developed low-noise bias techniques and attempt to identify general principles that can be applied universally to any EBS camera to improve performance in SDA tasks. This paper provides a baseline for understanding EBS performance characteristics and will significantly lower the entry barrier for new researchers in the field of event-based SDA.
More importantly, it adds insight for optimizing EBS behavior for SDA tasks and demonstrates the absolute performance limits of current generation cameras for detecting calibrated point source targets against a dark background. Finally, this study will enable follow-on work including the development of customized denoising, detection, and tracking algorithms that consider signal response and noise statistics as a function of the selected camera and bias configuration.
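    The event encoding described above (pixel address, microsecond timestamp, one-bit polarity) can be sketched as a small record type; the field names are illustrative, not taken from any camera SDK:

```python
from dataclasses import dataclass

# Minimal sketch of the EBS event encoding: pixel address (x, y), a
# microsecond-resolution timestamp t, and a one-bit polarity p.

@dataclass(frozen=True)
class Event:
    x: int       # pixel column
    y: int       # pixel row
    t_us: int    # microsecond-resolution timestamp
    p: bool      # True = brightness increase, False = decrease

ev = Event(x=120, y=44, t_us=1_000_123, p=True)
print(ev)  # Event(x=120, y=44, t_us=1000123, p=True)
```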

    Event-based camera refractory period characterization and initial clock drift evaluation

    Event-based camera (EBC) technology provides high-dynamic range operation and shows promise for efficient capture of spatio-temporal information, producing a sparse data stream and enabling consideration of nontraditional data processing solutions (e.g., new algorithms, neuromorphic processors, etc.). Given the fundamental difference in camera architecture, the EBC response and noise behavior differ considerably compared to standard CCD/CMOS framing sensors. These differences necessitate the development of new characterization techniques and sensor models to evaluate hardware performance and elucidate the trade-space between the two camera architectures. Laboratory characterization techniques reported previously include noise level as a function of static scene light level (background activity) and contrast responses referred to as S-curves. Here we present further progress on development of basic characterization methods and test capabilities for commercial-off-the-shelf (COTS) visible EBCs, with a focus on measurement of pixel deadtime (refractory period), including results for the 4th-generation sensor from Prophesee and Sony. Refractory period is empirically determined from analysis of the interspike intervals (ISIs), and results are visualized using log-histograms of the minimum per-pixel ISI values for a subset of pixels activated by a controlled dynamic scene. Our tests of the Prophesee gen4 EVKv2 yield refractory period estimates ranging from 6.1 ms to 6.8 μs going from the slowest (20) to fastest (100) settings of the relevant bias parameter, bias_refr. We also introduce and demonstrate the concept of pixel bandwidth measurement from data captured while viewing a static scene - based on recording data at a range of refractory period settings and then analyzing noise-event statistics.
Finally, we present initial results for estimating and correcting EBC clock drift using a GPS PPS signal to generate special timing events in the event-list data streams generated by the DAVIS346 and DVXplorer EBCs from iniVation.
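    The minimum per-pixel ISI analysis described above can be sketched as follows; the (t_us, x, y) array layout is an assumption, and a real analysis would build a log-histogram of these minima across many pixels rather than print them:

```python
import numpy as np

# Sketch: estimate the refractory period from the minimum per-pixel
# interspike interval (ISI). Array layout (t_us, x, y) is assumed.

def min_isi_per_pixel(t_us, x, y):
    """Return {(x, y): minimum ISI in microseconds} for pixels with >= 2 events."""
    order = np.lexsort((t_us, y, x))   # group by pixel, then sort by time
    t, xs, ys = t_us[order], x[order], y[order]
    same_pixel = (xs[1:] == xs[:-1]) & (ys[1:] == ys[:-1])
    isi = np.diff(t)[same_pixel]
    out = {}
    for xi, yi, d in zip(xs[1:][same_pixel], ys[1:][same_pixel], isi):
        key = (int(xi), int(yi))
        out[key] = min(out.get(key, int(d)), int(d))
    return out

# One pixel firing at t = 0, 7, 20 us: minimum ISI is 7 us
t = np.array([0, 7, 20]); x = np.array([3, 3, 3]); y = np.array([1, 1, 1])
print(min_isi_per_pixel(t, x, y))  # {(3, 1): 7}
```

The sharp lower edge of the resulting log-histogram is then read off as the refractory-period estimate for a given bias_refr setting.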

    Falcon Neuro: an event-based sensor on the International Space Station

    We report on the Falcon Neuro event-based sensor (EBS) instrument that is designed to acquire data from lightning and sprite phenomena and is currently operating on the International Space Station. The instrument consists of two independent, identical EBS cameras pointing in two fixed directions, toward the nominal forward direction of flight and toward the nominal nadir direction. The payload employs stock DAVIS 240C focal plane arrays along with custom-built control and readout electronics to remotely interface with the cameras. To predict the sensor's ability to effectively record sprites and lightning, we explore temporal response characteristics of the DAVIS 240C and use lab measurements along with reported limitations to model the expected response to a characteristic sprite illumination time-series. These simulations indicate that with appropriate camera settings the instrument will be capable of capturing these transient luminous events when they occur. Finally, we include initial results from the instrument, representing the first reported EBS recordings successfully collected aboard a space-based platform and demonstrating proof of concept that a neuromorphic camera is capable of operating in the space environment.

    Experimental methods to predict dynamic vision sensor event camera performance

    Dynamic vision sensors (DVS) represent a promising new technology, offering low power consumption, sparse output, high temporal resolution, and wide dynamic range. These features make DVS attractive for new research areas including scientific and space-based applications; however, more precise understanding of how sensor input maps to output under real-world constraints is needed. Often, metrics used to characterize DVS report baseline performance by measuring observable limits but fail to characterize the physical processes at the root of those limits. To address this limitation, we describe step-by-step procedures to measure three important performance parameters: (1) temporal contrast threshold, (2) cutoff frequency, and (3) refractory period. Each procedure draws inspiration from previous work, but links measurements sequentially to infer physical phenomena at the root of measured behavior. Results are reported over a range of brightness levels and user-defined biases. The threshold measurement technique is validated with test-pixel node voltages, and a first-order low-pass approximation of photoreceptor response is shown to predict event cutoff temporal frequency to within 9% accuracy. The proposed method generates lab-measured parameters compatible with the event camera simulator v2e, allowing more accurate generation of synthetic datasets for innovative applications.
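    The first-order low-pass approximation mentioned above has magnitude response |H(f)| = 1/sqrt(1 + (f/f_c)^2); a sketch with a placeholder cutoff frequency (not a measured DAVIS value):

```python
import math

# Sketch of a first-order low-pass photoreceptor model. The 300 Hz bandwidth
# below is a hypothetical placeholder, not a measured camera parameter.

def lowpass_gain(f_hz, f_cutoff_hz):
    """Magnitude response |H(f)| of a first-order low-pass filter."""
    return 1.0 / math.sqrt(1.0 + (f_hz / f_cutoff_hz) ** 2)

f3db = 300.0  # hypothetical photoreceptor bandwidth, Hz
for f in (30, 300, 3000):
    print(f"{f:5d} Hz -> gain {lowpass_gain(f, f3db):.3f}")
```

At the cutoff frequency the gain is 1/sqrt(2) ≈ 0.707, which is how the model predicts the stimulus frequency at which event generation rolls off.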