RadarSLAM: Radar based Large-Scale SLAM in All Weathers
Numerous Simultaneous Localization and Mapping (SLAM) algorithms have been
presented in the last decade using different sensor modalities. However, robust
SLAM in extreme weather conditions is still an open research problem. In this
paper, RadarSLAM, a full radar-based graph SLAM system, is proposed for
reliable localization and mapping in large-scale environments. It is composed
of pose tracking, local mapping, loop closure detection and pose graph
optimization, enhanced by novel feature matching and probabilistic point cloud
generation on radar images. Extensive experiments are conducted on a public
radar dataset and several self-collected radar sequences, demonstrating the
state-of-the-art reliability and localization accuracy in various adverse
weather conditions, such as dark night, dense fog and heavy snowfall.
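The pose graph optimization step named above can be sketched in miniature. The toy below reduces it to 1D poses and distributes the drift revealed by a loop closure evenly along the trajectory; all numbers are made up, and a real back end such as RadarSLAM's optimizes full SE(2)/SE(3) poses nonlinearly rather than using this simple correction.

```python
def optimize_pose_graph_1d(odometry, loop_closure):
    """Distribute loop-closure drift over a chain of 1D poses.

    odometry[i]  -- measured displacement from pose i to pose i+1
    loop_closure -- independently measured displacement from pose 0
                    to the last pose (e.g. from matching radar images
                    of a revisited place)
    Returns corrected poses with pose 0 fixed at the origin. This is
    an even drift distribution for illustration, not the full
    nonlinear pose graph optimization a real SLAM back end performs.
    """
    poses = [0.0]
    for d in odometry:                # dead reckoning: integrate odometry
        poses.append(poses[-1] + d)
    drift = loop_closure - poses[-1]  # disagreement revealed by the loop
    per_edge = drift / len(odometry)
    return [p + i * per_edge for i, p in enumerate(poses)]

# Odometry claims four 1.0 m steps, but the loop closure measures the
# true total as 3.6 m: the -0.4 m drift is spread along the chain.
corrected = optimize_pose_graph_1d([1.0, 1.0, 1.0, 1.0], 3.6)
print(corrected)
```

The corrected trajectory ends exactly at the loop-closure measurement while each odometry edge absorbs an equal share of the error.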
RADIATE: A Radar Dataset for Automotive Perception in Bad Weather
Datasets for autonomous cars are essential for the development and
benchmarking of perception systems. However, most existing datasets are
captured with camera and LiDAR sensors in good weather conditions. In this
paper, we present the RAdar Dataset In Adverse weaThEr (RADIATE), aiming to
facilitate research on object detection, tracking and scene understanding using
radar sensing for safe autonomous driving. RADIATE includes 3 hours of
annotated radar images with more than 200K labelled road actors in total, on
average about 4.6 instances per radar image. It covers 8 different categories
of actors in a variety of weather conditions (e.g., sun, night, rain, fog and
snow) and driving scenarios (e.g., parked, urban, motorway and suburban),
representing different levels of challenge. To the best of our knowledge, this
is the first public radar dataset which provides high-resolution radar images
on public roads with a large number of road actors labelled. The data collected
in adverse weather, e.g., fog and snowfall, is unique. Some baseline results of
radar based object detection and recognition are given to show that the use of
radar data is promising for automotive applications in bad weather, where
vision and LiDAR can fail. RADIATE also has stereo images, 32-channel LiDAR and
GPS data, directed at other applications such as sensor fusion, localisation
and mapping. The public dataset can be accessed at
http://pro.hw.ac.uk/radiate/.
Comment: Accepted at IEEE International Conference on Robotics and Automation 2021 (ICRA 2021).
LiDAR Snowfall Simulation for Robust 3D Object Detection
3D object detection is a central task for applications such as autonomous
driving, in which the system needs to localize and classify surrounding traffic
agents, even in the presence of adverse weather. In this paper, we address the
problem of LiDAR-based 3D object detection under snowfall. Due to the
difficulty of collecting and annotating training data in this setting, we
propose a physically based method to simulate the effect of snowfall on real
clear-weather LiDAR point clouds. Our method samples snow particles in 2D space
for each LiDAR line and uses the induced geometry to modify the measurement for
each LiDAR beam accordingly. Moreover, as snowfall often causes wetness on the
ground, we also simulate ground wetness on LiDAR point clouds. We use our
simulation to generate partially synthetic snowy LiDAR data and leverage these
data for training 3D object detection models that are robust to snowfall. We
conduct an extensive evaluation using several state-of-the-art 3D object
detection methods and show that our simulation consistently yields significant
performance gains on the real snowy STF dataset compared to clear-weather
baselines and competing simulation approaches, while not sacrificing
performance in clear weather. Our code is available at
www.github.com/SysCV/LiDAR_snow_sim.
Comment: Oral at CVPR 2022.
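The per-beam idea described above, sampling snow particles along each beam and letting a close particle replace the clear-weather return, can be sketched as follows. This is a toy model in the spirit of the paper, not its calibrated physical simulation: the particle cross-section, the 1/d^2 intensity model, and the snow rate are all illustrative assumptions.

```python
import math
import random

def sample_poisson(lam, rng):
    """Poisson sample via Knuth's method (avoids a numpy dependency)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_snow_on_beam(range_m, snow_rate, rng,
                          particle_cross_section=0.003, max_range=100.0):
    """Toy per-beam snowfall model: sample snow particles along the
    beam and let a sufficiently close, sufficiently reflective particle
    replace the clear-weather return.

    range_m   -- clear-weather range of this beam (meters)
    snow_rate -- expected intersecting particles per meter (made up)
    Returns (new_range_m, is_snow_return).
    """
    span = min(range_m, max_range)
    n_particles = sample_poisson(snow_rate * span, rng)
    hit = None
    for _ in range(n_particles):
        d = rng.uniform(0.0, span)
        # Received power falls off roughly with 1/d^2, so a nearby
        # particle can outshine a distant hard target (toy intensities).
        particle_intensity = particle_cross_section / max(d, 0.5) ** 2
        target_intensity = 1.0 / range_m ** 2
        if d < range_m and particle_intensity > target_intensity:
            if hit is None or d < hit:
                hit = d  # the closest dominating particle is returned
    return (hit, True) if hit is not None else (range_m, False)

# With no snow the clear-weather measurement passes through unchanged.
rng = random.Random(0)
print(simulate_snow_on_beam(50.0, 0.0, rng))  # (50.0, False)
```

Applied beam by beam over a clear-weather point cloud, this kind of model yields the partially synthetic snowy scans used as training data.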
Survey on LiDAR Perception in Adverse Weather Conditions
Autonomous vehicles rely on a variety of sensors to gather information about
their surroundings. The vehicle's behavior is planned based on the environment
perception, making its reliability crucial for safety reasons. The active LiDAR
sensor is able to create an accurate 3D representation of a scene, making it a
valuable addition to the environment perception of autonomous vehicles. Due to
light scattering and occlusion, the LiDAR's performance changes under adverse
weather conditions like fog, snow or rain. This limitation recently fostered a
large body of research on approaches to alleviate the decrease in perception
performance. In this survey, we gathered, analyzed, and discussed different
aspects of dealing with adverse weather conditions in LiDAR-based environment
perception. We address topics such as the availability of appropriate data, raw
point cloud processing and denoising, robust perception algorithms and sensor
fusion to mitigate adverse weather induced shortcomings. We furthermore
identify the most pressing gaps in the current literature and pinpoint
promising research directions.
Comment: published at IEEE IV 2023.
Energy-based Detection of Adverse Weather Effects in LiDAR Data
Autonomous vehicles rely on LiDAR sensors to perceive the environment.
Adverse weather conditions like rain, snow, and fog negatively affect these
sensors, reducing their reliability by introducing unwanted noise in the
measurements. In this work, we tackle this problem by proposing a novel
approach for detecting adverse weather effects in LiDAR data. We reformulate
this problem as an outlier detection task and use an energy-based framework to
detect outliers in point clouds. More specifically, our method learns to
associate low energy scores with inlier points and high energy scores with
outliers, allowing for robust detection of adverse weather effects. In extensive
experiments, we show that our method performs better in adverse weather
detection and has higher robustness to unseen weather effects than previous
state-of-the-art methods. Furthermore, we show how our method can be used to
perform simultaneous outlier detection and semantic segmentation. Finally, to
help expand the research field of LiDAR perception in adverse weather, we
release the SemanticSpray dataset, which contains labeled vehicle spray data in
highway-like scenarios. The dataset is available at
http://dx.doi.org/10.18725/OPARU-48815.
Comment: Accepted for publication in IEEE Robotics and Automation Letters (RA-L).
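The low-energy-inlier / high-energy-outlier scheme described above can be illustrated with the standard energy score derived from classifier logits, E(x) = -logsumexp(f(x)). In the paper the scores come from a trained network on point clouds; the logits and threshold below are synthetic stand-ins.

```python
import math

def energy_score(logits):
    """Energy of a point given its per-class logits: E = -logsumexp(logits).
    Confident inliers get low energy; diffuse outliers get high energy."""
    m = max(logits)
    return -(m + math.log(sum(math.exp(l - m) for l in logits)))

def detect_weather_noise(per_point_logits, threshold):
    """Flag points whose energy exceeds a threshold as adverse-weather
    noise (rain, snow, fog, spray). Logits and threshold are synthetic
    here; a real system learns them from labeled data."""
    return [energy_score(logits) > threshold for logits in per_point_logits]

# A confidently classified 'vehicle' point vs. a diffuse snow point.
inlier = [8.0, 0.5, 0.2]   # strong evidence for one class -> low energy
outlier = [0.1, 0.0, 0.2]  # weak evidence everywhere -> high energy
print(detect_weather_noise([inlier, outlier], threshold=-2.0))  # [False, True]
```

Because the same logits also carry class information, thresholding the energy and taking the argmax over classes gives the simultaneous outlier detection and semantic segmentation the abstract mentions.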