Satellite Navigation for the Age of Autonomy
Global Navigation Satellite Systems (GNSS) brought navigation to the masses.
Coupled with smartphones, the blue dot in the palm of our hands has forever
changed the way we interact with the world. Looking forward, cyber-physical
systems such as self-driving cars and aerial mobility are pushing the limits of
what localization technologies including GNSS can provide. This autonomous
revolution requires a solution that supports safety-critical operation,
centimeter positioning, and cyber-security for millions of users. To meet these
demands, we propose a navigation service from Low Earth Orbiting (LEO)
satellites which deliver precision in-part through faster motion, higher power
signals for added robustness to interference, constellation autonomous
integrity monitoring for integrity, and encryption / authentication for
resistance to spoofing attacks. This paradigm is enabled by the 'New Space'
movement, where highly capable satellites and components are now built on
assembly lines and launch costs have decreased by more than tenfold. Such a
ubiquitous positioning service enables a consistent and secure standard where
trustworthy information can be validated and shared, extending the electronic
horizon from sensor line of sight to an entire city. This enables the
situational awareness needed for true safe operation to support autonomy at
scale.
Comment: 11 pages, 8 figures, 2020 IEEE/ION Position, Location and Navigation Symposium (PLANS)
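The claim that LEO satellites deliver precision "in-part through faster motion" can be made concrete with a back-of-the-envelope calculation. The sketch below compares circular-orbit speeds at a typical LEO altitude and at the GPS MEO altitude using the vis-viva relation; the specific altitudes are illustrative assumptions, not values from the paper.

```python
import math

# Circular-orbit speed from vis-viva: v = sqrt(GM / r). Faster satellite
# motion means faster geometry change, which is one mechanism by which LEO
# signals can shorten precise-positioning convergence times.
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6371e3         # mean Earth radius, m

def circular_orbit_speed(altitude_m: float) -> float:
    """Orbital speed (m/s) for a circular orbit at the given altitude."""
    return math.sqrt(GM / (R_EARTH + altitude_m))

v_leo = circular_orbit_speed(550e3)    # assumed LEO altitude (~550 km)
v_meo = circular_orbit_speed(20200e3)  # GPS MEO altitude (~20,200 km)
print(f"LEO speed ~{v_leo/1e3:.2f} km/s, GNSS MEO speed ~{v_meo/1e3:.2f} km/s")
```

A LEO satellite at 550 km moves at roughly 7.6 km/s versus about 3.9 km/s for a GPS satellite, and being ~25x closer, it sweeps across a user's sky far more quickly, changing the observation geometry in minutes rather than hours.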
FieldSAFE: Dataset for Obstacle Detection in Agriculture
In this paper, we present a novel multi-modal dataset for obstacle detection
in agriculture. The dataset comprises approximately 2 hours of raw sensor data
from a tractor-mounted sensor system in a grass mowing scenario in Denmark,
October 2016. Sensing modalities include stereo camera, thermal camera, web
camera, 360-degree camera, lidar, and radar, while precise localization is
available from fused IMU and GNSS. Both static and moving obstacles are present
including humans, mannequin dolls, rocks, barrels, buildings, vehicles, and
vegetation. All obstacles have ground truth object labels and geographic
coordinates.
Comment: Submitted to special issue of MDPI Sensors: Sensors in Agriculture
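The abstract notes that localization comes from fused IMU and GNSS. As an illustration of that idea only (the dataset uses its own fusion pipeline, not this code), a minimal 1D complementary filter dead-reckons with high-rate IMU acceleration and corrects drift with slower GNSS fixes:

```python
# Minimal 1D complementary-filter sketch of IMU/GNSS fusion (illustrative
# only). The IMU integrates acceleration at high rate; occasional GNSS
# position fixes are blended in to bound the dead-reckoning drift.
def fuse_step(pos, vel, accel, dt, gnss_pos=None, alpha=0.2):
    """One fusion step: dead-reckon with the IMU, then blend in GNSS if present."""
    vel = vel + accel * dt          # integrate IMU acceleration
    pos = pos + vel * dt            # dead-reckoned position
    if gnss_pos is not None:        # GNSS fix available this step
        pos = (1 - alpha) * pos + alpha * gnss_pos
    return pos, vel

pos, vel = 0.0, 1.0                 # start at origin, moving at 1 m/s
for k in range(100):                # 100 Hz IMU, 10 Hz GNSS
    gnss = (k + 1) * 0.01 if k % 10 == 9 else None
    pos, vel = fuse_step(pos, vel, accel=0.0, dt=0.01, gnss_pos=gnss)
print(f"fused position after 1 s: {pos:.3f} m")
```

Real pipelines typically use an extended Kalman filter over full 3D pose, but the blending structure, fast proprioceptive prediction corrected by slow absolute fixes, is the same.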
People tracking by cooperative fusion of RADAR and camera sensors
Accurate 3D tracking of objects from a monocular camera poses challenges due to the loss of depth during projection. Although ranging by RADAR has proven effective in highway environments, people tracking remains beyond the capability of single-sensor systems. In this paper, we propose a cooperative RADAR-camera fusion method for people tracking on the ground plane. Using average person height, a joint detection likelihood is calculated by back-projecting detections from the camera onto the RADAR Range-Azimuth data. Peaks in the joint likelihood, representing candidate targets, are fed into a Particle Filter tracker. Depending on the association outcome, particles are updated using the associated detections (Tracking by Detection), or by sampling the raw likelihood itself (Tracking Before Detection). Utilizing the raw likelihood data has the advantage that lost targets are continuously tracked even if the camera or RADAR signal is below the detection threshold. We show that in single-target, uncluttered environments, the proposed method clearly outperforms camera-only tracking. Experiments in a real-world urban environment also confirm that the cooperative fusion tracker produces significantly better estimates, even in difficult and ambiguous situations.
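The core fusion step described above, back-projecting a camera detection onto the Range-Azimuth plane via average person height and multiplying likelihoods, can be sketched as follows. Grid sizes, the pinhole range formula, and all parameter values are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

# Sketch of cooperative fusion: a camera person detection induces a Gaussian
# likelihood over the RADAR Range-Azimuth grid; multiplying it with the RADAR
# map concentrates probability where both sensors agree.
AVG_HEIGHT_M = 1.7         # assumed average person height
FOCAL_PX = 800.0           # assumed camera focal length in pixels

def camera_likelihood(bbox_height_px, bbox_azimuth_rad, ranges, azimuths,
                      sigma_r=1.0, sigma_az=0.05):
    """Gaussian likelihood over the Range-Azimuth grid from one camera box.
    Range is inferred from apparent height: r = f * H / h_px (pinhole model)."""
    r_est = FOCAL_PX * AVG_HEIGHT_M / bbox_height_px
    rr, aa = np.meshgrid(ranges, azimuths, indexing="ij")
    return np.exp(-0.5 * (((rr - r_est) / sigma_r) ** 2 +
                          ((aa - bbox_azimuth_rad) / sigma_az) ** 2))

ranges = np.linspace(1, 50, 200)            # 1..50 m range bins
azimuths = np.linspace(-0.5, 0.5, 100)      # ~+/-29 deg azimuth bins
rng = np.random.default_rng(0)
radar_map = rng.random((200, 100)) * 0.1    # stand-in for a RADAR power map
radar_map[80, 55] = 1.0                     # one strong return (the person)

cam = camera_likelihood(bbox_height_px=68.0, bbox_azimuth_rad=azimuths[55],
                        ranges=ranges, azimuths=azimuths)
joint = radar_map * cam                     # cooperative (product) fusion
r_idx, a_idx = np.unravel_index(np.argmax(joint), joint.shape)
print(f"peak at range {ranges[r_idx]:.1f} m, azimuth {azimuths[a_idx]:.3f} rad")
```

In the paper's pipeline these joint-likelihood peaks would then seed the Particle Filter, with particles updated either from associated detections or by sampling this raw likelihood directly.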
On the Importance of Quantifying Visibility for Autonomous Vehicles under Extreme Precipitation
In the context of autonomous driving, vehicles are inherently bound to
encounter more extreme weather during which public safety must be ensured. As
the climate changes rapidly, heavy snowstorms are expected to become more
frequent and a major threat to safe navigation. While there is much
literature aiming to improve navigation resiliency to winter conditions, there
is a lack of standard metrics to quantify the loss of visibility of lidar
sensors related to precipitation. This chapter proposes a novel metric to
quantify the lidar visibility loss in real time, relying on the notion of
visibility from the meteorology research field. We evaluate this metric on the
Canadian Adverse Driving Conditions (CADC) dataset, correlate it with the
performance of a state-of-the-art lidar-based localization algorithm, and
evaluate the benefit of filtering point clouds before the localization process.
We show that the Iterative Closest Point (ICP) algorithm is surprisingly robust
against snowfalls, but abrupt events, such as snow gusts, can greatly hinder
its accuracy. We discuss such events and demonstrate the need for better
datasets focusing on these extreme events to quantify their effect.
Comment: Submitted to Intelligent Vehicles and Transportation Volume 3 - De Gruyter
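The chapter's metric builds on the meteorological notion of visibility, whose exact definition is not given in the abstract. As a hedged illustration of the general idea only, one could estimate an extinction coefficient from the fraction of dropped lidar returns and convert it to a visibility distance via the Koschmieder relation V = 3.912 / sigma (5% contrast threshold):

```python
import math

# Hedged sketch of a visibility-style lidar metric, NOT the chapter's actual
# definition: treat the return fraction at a reference range as a two-way
# transmittance, invert Beer-Lambert for an extinction coefficient sigma,
# then apply Koschmieder: V = 3.912 / sigma.
def visibility_from_returns(n_fired, n_returned, ref_range_m):
    """Meteorological-visibility proxy (m) from lidar return statistics."""
    frac = max(n_returned / n_fired, 1e-6)        # two-way transmittance proxy
    sigma = -math.log(frac) / (2.0 * ref_range_m) # extinction, per metre
    return float("inf") if sigma <= 0 else 3.912 / sigma

clear = visibility_from_returns(100_000, 99_000, ref_range_m=50.0)
snowy = visibility_from_returns(100_000, 60_000, ref_range_m=50.0)
print(f"clear: ~{clear:.0f} m, snow gust: ~{snowy:.0f} m")
```

A metric of this shape drops sharply during a snow gust while staying near-infinite in clear air, which matches the abstract's observation that ICP degrades under abrupt events rather than steady snowfall.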