This work proposes to use passive acoustic perception as an additional
sensing modality for intelligent vehicles. We demonstrate that approaching
vehicles behind blind corners can be detected by sound before they come into
line-of-sight. We equipped a research vehicle with a roof-mounted
microphone array, and show on data collected with this sensor setup that wall
reflections provide information on the presence and direction of occluded
approaching vehicles. A novel method is presented to classify if and from what
direction a vehicle is approaching before it is visible, using as input
Direction-of-Arrival features that can be efficiently computed from the
streaming microphone array data. Since the local geometry around the
ego-vehicle affects the perceived patterns, we systematically study several
environment types, and investigate generalization across these environments.
With a static ego-vehicle, an accuracy of 0.92 is achieved on the hidden
vehicle classification task. Compared to a state-of-the-art visual detector,
Faster R-CNN, our pipeline achieves the same accuracy more than one second
ahead, providing crucial reaction time for the situations we study. While the
ego-vehicle is driving, we show that acoustic detection remains feasible,
achieving an accuracy of 0.84 within a single environment type. We further
study failure cases across environments to identify future research directions.

Comment: Accepted to IEEE Robotics & Automation Letters (2021), DOI:
10.1109/LRA.2021.3062254. Code, Data & Video:
https://github.com/tudelft-iv/occluded_vehicle_acoustic_detectio
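The abstract refers to Direction-of-Arrival features computed from streaming microphone array data. As a minimal illustrative sketch of the underlying idea only (not the paper's actual feature pipeline), the snippet below estimates a source bearing from the inter-microphone time delay using the standard GCC-PHAT method; the two-microphone geometry, spacing `d`, and sample rate are illustrative assumptions:

```python
import numpy as np

def gcc_phat(sig, ref, fs):
    """Estimate the time delay of `sig` relative to `ref` via GCC-PHAT."""
    n = sig.size + ref.size
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-15           # PHAT weighting: keep phase, drop magnitude
    cc = np.fft.irfft(R, n=n)
    max_shift = n // 2
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / fs                # delay in seconds (positive: sig lags ref)

# Illustrative setup: two mics d = 0.5 m apart, broadband source at 30 degrees.
fs, d, c = 16000, 0.5, 343.0
theta_true = np.deg2rad(30.0)
tau_true = d * np.sin(theta_true) / c            # far-field delay model
rng = np.random.default_rng(0)
s = rng.standard_normal(fs)                      # 1 s of broadband noise
delay = int(round(tau_true * fs))                # ~12 samples at 16 kHz
x1 = s
x2 = np.concatenate((np.zeros(delay), s[:-delay]))  # delayed copy at mic 2
tau_est = gcc_phat(x2, x1, fs)
theta_est = np.degrees(np.arcsin(np.clip(tau_est * c / d, -1.0, 1.0)))
```

A real array would repeat this pairwise delay estimation (or an equivalent steered-response power scan) across all microphone pairs to obtain the per-direction energy features the classifier consumes.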