    Radar-only ego-motion estimation in difficult settings via graph matching

    Radar detects stable, long-range objects under variable weather and lighting conditions, making it a reliable and versatile sensor well suited for ego-motion estimation. In this work, we propose a radar-only odometry pipeline that is highly robust to radar artifacts (e.g., speckle noise and false positives) and requires only one input parameter. We demonstrate its ability to adapt across diverse settings, from urban UK to off-road Iceland, achieving a scan matching accuracy of approximately 5.20 cm and 0.0929 deg when using GPS as ground truth (compared to visual odometry's 5.77 cm and 0.1032 deg). We present algorithms for keypoint extraction and data association, framing the latter as a graph matching optimization problem, and provide an in-depth system analysis.
    Comment: 6 content pages, 1 page of references, 5 figures, 4 tables, 2019 IEEE International Conference on Robotics and Automation (ICRA).
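
    The abstract frames data association as a graph matching optimization. As a rough illustration of that idea, here is a minimal sketch assuming the common pairwise-consistency formulation with a spectral relaxation; the paper's actual objective, tolerance, and acceptance rule are not specified in the abstract, so the values below are illustrative:

        import numpy as np

        def associate_by_graph_matching(src, dst, candidates, tol=0.5):
            """Select a mutually consistent subset of putative matches.

            src, dst   -- (N, 2) keypoint arrays from consecutive scans
            candidates -- list of (i, j) index pairs (putative matches)
            Two matches are 'consistent' if they preserve the inter-point
            distance to within tol metres (a rigid-motion invariant).
            """
            if not candidates:
                return []
            m = len(candidates)
            A = np.zeros((m, m))
            for a in range(m):
                ia, ja = candidates[a]
                for b in range(a + 1, m):
                    ib, jb = candidates[b]
                    d_src = np.linalg.norm(src[ia] - src[ib])
                    d_dst = np.linalg.norm(dst[ja] - dst[jb])
                    if abs(d_src - d_dst) < tol:
                        A[a, b] = A[b, a] = 1.0
            # Spectral relaxation: the leading eigenvector of the
            # consistency graph scores each candidate's membership in
            # the densest mutually consistent subgraph.
            _, v = np.linalg.eigh(A)
            score = np.abs(v[:, -1])
            keep = score > 0.5 * score.max()  # illustrative acceptance rule
            return [candidates[k] for k in np.flatnonzero(keep)]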

    Extrinsic Calibration of 2D Millimetre-Wavelength Radar Pairs Using Ego-Velocity Estimates

    Correct radar data fusion depends on knowledge of the spatial transform between sensor pairs. Current methods for determining this transform operate by aligning identifiable features in different radar scans, or by relying on measurements from another, more accurate sensor. Feature-based alignment requires the sensors to have overlapping fields of view or necessitates the construction of an environment map. Several existing techniques require bespoke retroreflective radar targets. These requirements limit both where and how calibration can be performed. In this paper, we take a different approach: instead of attempting to track targets or features, we rely on ego-velocity estimates from each radar to perform calibration. Our method enables calibration of a subset of the transform parameters, including the yaw and the axis of translation between the radar pair, without the need for a shared field of view or for specialized targets. In general, the yaw and the axis of translation are the most important parameters for data fusion, the most likely to vary over time, and the most difficult to calibrate manually. We formulate calibration as a batch optimization problem, show that the radar-radar system is identifiable, and specify the platform excitation requirements. Through simulation studies and real-world experiments, we establish that our method is more reliable and accurate than state-of-the-art methods. Finally, we demonstrate that the full rigid body transform can be recovered if relatively coarse information about the platform rotation rate is available.
    Comment: Accepted to the 2023 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2023), Seattle, Washington, USA, June 27 - July 1, 2023.
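
    A sketch of the kinematic constraint behind ego-velocity calibration, written for the planar case in notation of our own (body velocity v_b, yaw rate \omega, mounting pose (p_i, \theta_i) of radar i); the paper's batch formulation will differ in detail. Each radar measures its own velocity in its own frame,

        v_i = R(\theta_i)^\top \left( v_b + \omega \, p_i^\perp \right), \qquad p_i^\perp = (-p_{i,y}, \; p_{i,x})^\top,

    and eliminating the unobserved body velocity v_b between the two radars gives

        R(\theta_1) \, v_1 - R(\theta_2) \, v_2 = \omega \, (p_1 - p_2)^\perp.

    The left-hand side constrains the relative yaw \theta_2 - \theta_1 whenever the platform moves, while the right-hand side fixes the direction, but not the scale, of the translation p_2 - p_1 unless \omega is known, which matches the abstract's claim that coarse rotation-rate information suffices to recover the full transform.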

    Doppler-only Single-scan 3D Vehicle Odometry

    We present a novel 3D odometry method that recovers the full motion of a vehicle from a Doppler-capable range sensor alone. It leverages the radial velocities measured from the scene, estimating the sensor's velocity from a single scan. The vehicle's 3D motion, defined by its linear and angular velocities, is calculated using its kinematic model, which provides a constraint between the velocity measured at the sensor frame and the vehicle frame. Experiments demonstrate the viability of our single-sensor method compared to mounting an additional IMU. Our method provides the translation of the sensor, which cannot be reliably determined from an IMU, as well as its rotation. Its short-term accuracy and fast operation (~5 ms) make it a suitable candidate for initializing more complex localization algorithms or mapping pipelines. Not only does it reduce the error of the mapper, but it does so at a level of accuracy comparable to an IMU, all without the need to mount and calibrate an extra sensor on the vehicle.
    Comment: This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.
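
    The single-scan core of such methods can be illustrated with the standard least-squares relation between radial velocities and sensor velocity. This is a hedged sketch: the paper goes further and applies the vehicle's kinematic model to recover the angular velocity as well, which this snippet omits, and the RANSAC parameters are illustrative:

        import numpy as np

        def ego_velocity_from_doppler(dirs, vr, iters=100, tol=0.2, seed=None):
            """Estimate sensor velocity from one Doppler scan.

            dirs -- (N, 3) unit direction vectors to the detections
            vr   -- (N,) measured radial velocities in m/s
            For a static scene, vr = -dirs @ v_sensor; a small RANSAC
            loop rejects returns from moving objects.
            """
            rng = np.random.default_rng(seed)
            best = np.zeros(len(vr), dtype=bool)
            for _ in range(iters):
                idx = rng.choice(len(vr), 3, replace=False)
                v, *_ = np.linalg.lstsq(-dirs[idx], vr[idx], rcond=None)
                inliers = np.abs(vr + dirs @ v) < tol
                if inliers.sum() > best.sum():
                    best = inliers
            v, *_ = np.linalg.lstsq(-dirs[best], vr[best], rcond=None)
            return v, best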

    Advancements in Radar Odometry

    Radar odometry estimation has emerged as a critical technique in the field of autonomous navigation, providing robust and reliable motion estimation under various environmental conditions. Despite its potential, the complex nature of radar signals and the inherent challenges associated with processing them have limited the widespread adoption of this technology. This paper addresses these challenges by proposing novel improvements to an existing method for radar odometry estimation, designed to enhance accuracy and reliability in diverse scenarios. Our pipeline consists of filtering, motion compensation, oriented surface point computation, smoothing, one-to-many radar scan registration, and pose refinement. The developed method enforces a local understanding of the scene by adding information through smoothing techniques, and by aligning consecutive scans as a refinement step after the one-to-many registration. We present an in-depth investigation of the contribution of each improvement to the localization accuracy, and we benchmark our system on the sequences of the main datasets for radar understanding, i.e., the Oxford Radar RobotCar, MulRan, and Boreas datasets. The proposed pipeline achieves superior results in all scenarios considered, including under harsh environmental constraints.
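
    To make the oriented surface point stage concrete, here is a generic local-PCA normal estimator of the kind typically used ahead of point-to-plane style registration. This is a sketch only; the abstract does not specify the authors' implementation, and the neighbourhood size is illustrative:

        import numpy as np

        def oriented_surface_points(points, k=10):
            """Attach a surface normal to each 2D radar return.

            points -- (N, 2) Cartesian detections from one scan
            The normal is the eigenvector of the local neighbourhood
            covariance with the smallest eigenvalue, i.e. the direction
            in which the neighbourhood is flattest.
            """
            normals = np.zeros_like(points, dtype=float)
            for i, p in enumerate(points):
                dist = np.linalg.norm(points - p, axis=1)
                nbrs = points[np.argsort(dist)[:k]]
                cov = np.cov((nbrs - nbrs.mean(axis=0)).T)
                _, vecs = np.linalg.eigh(cov)
                normals[i] = vecs[:, 0]  # smallest-eigenvalue direction
            return normals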

    Real-Time Pose Graph SLAM based on Radar

    This work presents a real-time pose graph based Simultaneous Localization and Mapping (SLAM) system for automotive radar. The algorithm constructs a map from radar detections using the Iterative Closest Point (ICP) method to match consecutive scans obtained from a single, front-facing radar sensor. The algorithm is evaluated on a range of real-world datasets, shows mean translational errors as low as 0.62 m, and demonstrates robustness on long tracks. Using a single radar, our proposed system achieves state-of-the-art performance when compared to other radar-based SLAM algorithms that use multiple, higher-resolution radars.
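
    The scan-to-scan front end rests on ICP. A minimal single iteration in 2D (nearest-neighbour association followed by the closed-form SVD/Kabsch alignment) looks roughly like this sketch, with the pose graph construction and loop closure omitted:

        import numpy as np

        def icp_step(src, dst):
            """One ICP iteration: associate, then align in closed form.

            src, dst -- (N, 2) and (M, 2) point sets
            Returns R (2x2) and t (2,) such that R @ src + t ~ dst.
            """
            # nearest-neighbour data association
            nn = np.argmin(((src[:, None] - dst[None]) ** 2).sum(-1), axis=1)
            p, q = src, dst[nn]
            mp, mq = p.mean(axis=0), q.mean(axis=0)
            # Kabsch: SVD of the cross-covariance of the centred sets
            U, _, Vt = np.linalg.svd((p - mp).T @ (q - mq))
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:  # guard against a reflection
                Vt[-1] *= -1
                R = Vt.T @ U.T
            t = mq - R @ mp
            return R, t

    Iterating this step until the alignment stops improving yields the scan-to-scan transform; chaining such transforms as odometry edges, together with loop-closure constraints, gives the pose graph optimized by the back end.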

    Static Background Removal in Vehicular Radar: Filtering in Azimuth-Elevation-Doppler Domain

    A significant challenge in autonomous driving systems lies in scene understanding within complex environments, particularly in dense traffic scenarios. An effective solution to this challenge involves removing the background or static objects from the scene, so as to enhance the detection of moving targets, a key component of improving overall system performance. In this paper, we present an efficient algorithm for background removal in automotive radar applications, specifically utilizing a frequency-modulated continuous wave (FMCW) radar. Our proposed algorithm follows a three-step approach, encompassing radar signal preprocessing, three-dimensional (3D) ego-motion estimation, and notch filter-based background removal in the azimuth-elevation-Doppler domain. To begin, we model the received signal of the FMCW multiple-input multiple-output (MIMO) radar and develop a signal processing framework for extracting four-dimensional (4D) point clouds. Next, we introduce a robust 3D ego-motion estimation algorithm that accurately estimates radar ego-motion speed, accounting for Doppler ambiguity, by processing the point clouds. Our algorithm then leverages the relationship between Doppler velocity, azimuth angle, elevation angle, and radar ego-motion speed to identify the spectrum belonging to background clutter. Finally, we employ notch filters to effectively suppress the background clutter. The performance of our algorithm is evaluated using both simulated data and extensive experiments with real-world data. The results demonstrate its effectiveness in efficiently removing background clutter and enhancing perception within complex environments. By offering a fast and computationally efficient solution, our approach effectively addresses the challenges posed by non-homogeneous environments and real-time processing requirements.
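
    The relationship the algorithm exploits between Doppler, azimuth, elevation, and ego-motion can be sketched at the point-cloud level as follows. Note the paper itself applies notch filters in the spectral domain; the angle convention and the notch width below are assumptions:

        import numpy as np

        def static_clutter_mask(az, el, doppler, v_ego, width=0.3):
            """Flag detections whose Doppler matches the static world.

            az, el  -- (N,) azimuth and elevation angles in radians
            doppler -- (N,) measured Doppler velocities in m/s
            v_ego   -- (3,) sensor velocity in the sensor frame
            A stationary scatterer at (az, el) returns
                v_d = -(cos el cos az, cos el sin az, sin el) . v_ego,
            so anything within `width` m/s of that value is background.
            """
            d = np.stack([np.cos(el) * np.cos(az),
                          np.cos(el) * np.sin(az),
                          np.sin(el)], axis=-1)
            return np.abs(doppler + d @ v_ego) < width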

    Motion Estimation and Compensation in Automotive MIMO SAR

    With the advent of self-driving vehicles, autonomous driving systems will have to rely on a vast number of heterogeneous sensors to perform dynamic perception of the surrounding environment. Synthetic Aperture Radar (SAR) systems increase the resolution of conventional mass-market radars by exploiting the vehicle's ego-motion, requiring very accurate knowledge of the trajectory, usually not compatible with automotive-grade navigation systems. In this regard, this paper deals with the analysis, estimation, and compensation of trajectory estimation errors in automotive SAR systems, proposing a complete residual motion estimation and compensation workflow. We start by defining the geometry of the acquisition and the basic processing steps of Multiple-Input Multiple-Output (MIMO) SAR systems. Then, we analytically derive the effects of typical motion errors in automotive SAR imaging. Based on the derived models, the procedure is detailed, outlining the guidelines for its practical implementation. We show the effectiveness of the proposed technique by means of experimental data gathered by a 77 GHz radar mounted in a forward-looking configuration.
    Comment: 14 pages.
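
    As a reminder of why trajectory accuracy is so critical here (a generic SAR result, not this paper's derivation): a line-of-sight trajectory error \delta r(\tau) enters the received signal as the phase error

        \delta\phi(\tau) = \frac{4\pi}{\lambda} \, \delta r(\tau),

    so at 77 GHz (\lambda \approx 3.9 mm) even sub-millimetre trajectory errors produce phase errors amounting to a sizeable fraction of a cycle, well beyond what automotive-grade navigation can guarantee, which is exactly the residual that the motion estimation and compensation workflow must remove.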

    4DEgo: ego-velocity estimation from high-resolution radar data

    Automotive radars allow for perception of the environment in adverse visibility and weather conditions. New high-resolution sensors have demonstrated potential for tasks beyond obstacle detection and velocity adjustment, such as mapping or target tracking. This paper proposes an end-to-end method for ego-velocity estimation based on radar scan registration. Our architecture includes a 3D convolution over all three channels of the heatmap, capturing features associated with motion, and an attention mechanism for selecting significant features for regression. To the best of our knowledge, this is the first work utilizing the full 3D radar heatmap for ego-velocity estimation. We verify the efficacy of our approach using the publicly available ColoRadar dataset and study the effect of architectural choices and distributional shifts on performance.
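
    For flavour, an illustrative PyTorch rendering of the described pattern: a 3D convolution over the radar heatmap followed by attention-weighted pooling and a regression head. The class name, input shape, layer sizes, and pooling scheme are all our guesses; the abstract does not give the paper's configuration:

        import torch
        import torch.nn as nn

        class EgoVelocityNet(nn.Module):
            """Hypothetical sketch in the spirit of the abstract,
            not the paper's model."""
            def __init__(self, ch=16):
                super().__init__()
                self.conv = nn.Sequential(
                    nn.Conv3d(1, ch, 3, stride=2, padding=1), nn.ReLU(),
                    nn.Conv3d(ch, ch, 3, stride=2, padding=1), nn.ReLU(),
                )
                self.attn = nn.Linear(ch, 1)  # scores each spatial cell
                self.head = nn.Linear(ch, 3)  # regresses (vx, vy, vz)

            def forward(self, heatmap):           # (B, 1, range, az, el)
                f = self.conv(heatmap)            # (B, C, R', A', E')
                f = f.flatten(2).transpose(1, 2)  # (B, cells, C)
                w = torch.softmax(self.attn(f), dim=1)
                return self.head((w * f).sum(dim=1))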