4,451 research outputs found

    A factorization approach to inertial affine structure from motion

    We consider the problem of reconstructing a 3-D scene from a moving camera with a high frame rate using the affine projection model. This problem is traditionally known as Affine Structure from Motion (Affine SfM), and can be solved using an elegant low-rank factorization formulation. In this paper, we assume that an accelerometer and gyro are rigidly mounted with the camera, so that synchronized linear acceleration and angular velocity measurements are available together with the image measurements. We extend the standard Affine SfM algorithm to integrate these measurements through the use of image derivatives.
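
    For orientation, the sketch below shows the classic rank-3 factorization that underlies standard Affine SfM (Tomasi–Kanade style); it does not reproduce the paper's inertial extension, and the function name and array layout are illustrative assumptions.

```python
# Minimal sketch of the low-rank factorization behind Affine SfM
# (classic Tomasi-Kanade factorization, not the paper's inertial variant).
import numpy as np

def affine_sfm_factorize(tracks):
    """tracks: (F, P, 2) array of P feature points tracked over F frames.
    Returns affine camera motion M (2F x 3) and 3-D structure S (3 x P),
    each recovered only up to an affine ambiguity."""
    F, P, _ = tracks.shape
    # Stack the image coordinates into the 2F x P measurement matrix W.
    W = np.concatenate([tracks[..., 0], tracks[..., 1]], axis=0)
    # Subtract per-row centroids to remove the translation component.
    W_centered = W - W.mean(axis=1, keepdims=True)
    # Under the affine model the centered W has rank <= 3: W_centered ~ M @ S.
    U, s, Vt = np.linalg.svd(W_centered, full_matrices=False)
    M = U[:, :3] * np.sqrt(s[:3])          # camera motion, 2F x 3
    S = np.sqrt(s[:3])[:, None] * Vt[:3]   # scene structure, 3 x P
    return M, S
```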

    Conservative Estimation of Inertial Sensor Errors Using Allan Variance Data

    To understand the error sources present in inertial sensors, both the white (time-uncorrelated) and time-correlated noise sources must be properly characterized. To understand both sources, the standard approach (IEEE standards 647-2006, 952-2020) is to compute the Allan variance of the noise and then use human-based interpretation of linear trends to estimate the separate noise sources present in a sensor. Recent work has sought to overcome the graphical nature and visual-inspection basis of this approach, leading to more accurate noise estimates. However, when using noise characterization in a filter, it is important that the noise estimates be not only accurate but also conservative, i.e., that the estimated noise parameters overbound the truth. In this paper, we propose a novel method for automatically estimating conservative noise parameters using the Allan variance. Results of using this method to characterize a low-cost MEMS IMU (Analog Devices ADIS16470) are presented, demonstrating the efficacy of the proposed approach.
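
    As background for the abstract above, the sketch below computes the overlapping Allan variance of a static IMU record, the quantity on which Allan-variance-based noise characterization rests; the paper's automatic conservative (overbounding) estimation step is not reproduced, and the function name and arguments are illustrative assumptions.

```python
# Minimal sketch of the overlapping Allan variance for a static IMU record.
import numpy as np

def overlapping_allan_variance(omega, fs, taus):
    """omega: 1-D array of sensor output (e.g. gyro rate) sampled at fs Hz.
    taus: iterable of candidate averaging times in seconds.
    Returns arrays of realized averaging times and Allan variances."""
    theta = np.cumsum(omega) / fs            # integrate rate to angle
    N = theta.size
    out_tau, out_avar = [], []
    for tau in taus:
        m = int(round(tau * fs))             # samples per cluster
        if m < 1 or 2 * m >= N:
            continue
        d = theta[2 * m:] - 2 * theta[m:N - m] + theta[:N - 2 * m]
        avar = np.sum(d ** 2) / (2 * (m / fs) ** 2 * (N - 2 * m))
        out_tau.append(m / fs)
        out_avar.append(avar)
    return np.array(out_tau), np.array(out_avar)

# On a log-log plot of sqrt(avar) vs. tau, white (angle/velocity random walk)
# noise appears as a -1/2 slope and bias instability as a flat region.
```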

    Event-based Vision: A Survey

    Event cameras are bio-inspired sensors that differ from conventional frame cameras: instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes and output a stream of events that encode the time, location, and sign of the brightness changes. Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (on the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low power consumption, and high pixel bandwidth (on the order of kHz), resulting in reduced motion blur. Hence, event cameras have a large potential for robotics and computer vision in scenarios that are challenging for traditional cameras, such as low latency, high speed, and high dynamic range. However, novel methods are required to process the unconventional output of these sensors in order to unlock their potential. This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras. We present event cameras from their working principle, the actual sensors that are available, and the tasks that they have been used for, from low-level vision (feature detection and tracking, optic flow, etc.) to high-level vision (reconstruction, segmentation, recognition). We also discuss the techniques developed to process events, including learning-based techniques, as well as specialized processors for these novel sensors, such as spiking neural networks. Additionally, we highlight the challenges that remain to be tackled and the opportunities that lie ahead in the search for a more efficient, bio-inspired way for machines to perceive and interact with the world.
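
    To make the event data model concrete, the sketch below stores events as (time, x, y, polarity) tuples and builds one simple representation, a polarity-accumulation frame over a time window; the field names, sample values, and frame size are illustrative assumptions, not any specific camera's API.

```python
# Minimal sketch of an event stream and a simple accumulated-polarity frame.
import numpy as np

# Each event encodes the timestamp, pixel location, and sign of the brightness change.
events = np.array([
    (0.000012, 120, 64, +1),
    (0.000030, 121, 64, -1),
    (0.000041, 120, 65, +1),
], dtype=[('t', 'f8'), ('x', 'i4'), ('y', 'i4'), ('p', 'i4')])

def accumulate(events, height, width, t0, t1):
    """Sum event polarities per pixel over the time window [t0, t1)."""
    frame = np.zeros((height, width), dtype=np.int32)
    sel = events[(events['t'] >= t0) & (events['t'] < t1)]
    np.add.at(frame, (sel['y'], sel['x']), sel['p'])
    return frame

frame = accumulate(events, height=128, width=128, t0=0.0, t1=0.001)
```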

    Autonomous and Resilient Management of All-Source Sensors for Navigation Assurance

    All-source navigation has become increasingly relevant over the past decade with the development of viable alternative sensor technologies. However, as the number and type of sensors informing a system increases, so does the probability of corrupting the system with sensor modeling errors, signal interference, and undetected faults. Though the latter of these has been extensively researched, the majority of existing approaches have constrained faults to biases and designed algorithms centered around the assumption of simultaneously redundant, synchronous sensors with valid measurement models, none of which are guaranteed for all-source systems. This research aims to provide all-source multi-sensor resiliency, assurance, and integrity through an autonomous sensor management framework. The proposed framework dynamically places each sensor in an all-source system into one of four modes: monitoring, validation, calibration, and remodeling. Each mode contains specific and novel real-time processes that affect how a navigation system responds to sensor measurements. The monitoring mode is driven by a novel sensor-agnostic fault detection, exclusion, and integrity monitoring method that minimizes assumptions on the fault type, the all-source sensor composition, and the number of faulty sensors. The validation mode provides a novel method for the online validation of sensors with questionable sensor models, in a fault-agnostic and sensor-agnostic manner, and without compromising the ongoing navigation solution in the process. The remaining two modes, calibration and remodeling, generalize and integrate online calibration and model identification processes to provide autonomous and dynamic estimation of candidate model functions and their parameters, which, when paired with the monitoring and validation processes, directly enable resilient, self-correcting, plug-and-play open-architecture navigation systems.
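
    For context only, the sketch below shows a generic chi-square test on a Kalman filter measurement innovation, the kind of per-sensor check a monitoring mode might run for fault detection; it is a standard textbook technique, not the framework proposed in this research, and the function name and threshold choice are illustrative assumptions.

```python
# Generic sketch of innovation-based fault screening for a single sensor
# measurement in a Kalman filter (standard chi-square innovation test).
import numpy as np
from scipy.stats import chi2

def innovation_fault_test(z, z_pred, H, P, R, alpha=1e-3):
    """z: measurement, z_pred: predicted measurement, H: measurement Jacobian,
    P: state covariance, R: measurement noise covariance.
    Returns True if the measurement is flagged as inconsistent (possible fault)."""
    nu = z - z_pred                              # innovation (residual)
    S = H @ P @ H.T + R                          # innovation covariance
    d2 = float(nu.T @ np.linalg.solve(S, nu))    # normalized innovation squared
    threshold = chi2.ppf(1.0 - alpha, df=z.size) # chi-square gate
    return d2 > threshold
```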