Connected and cooperative driving requires precisely calibrated roadside
infrastructure to provide a reliable perception system. To meet this
requirement in an automated manner, we present a robust method for
geo-referenced extrinsic camera calibration. Our method requires a
calibration vehicle equipped with a combined GNSS/RTK receiver and an inertial
measurement unit (IMU) for self-localization. To remove any requirements on
the calibration target's appearance and on the local traffic conditions, we
propose a novel approach based on hypothesis filtering. Our method requires no
human interaction and operates solely on the information recorded by both the
infrastructure and the vehicle. Furthermore, we do not limit road access for
other road users during calibration. We demonstrate the feasibility and
accuracy of our approach by evaluating it on synthetic datasets as well as on
a real-world connected intersection, and by deploying the calibration on
real infrastructure. Our source code is publicly available.
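
The abstract does not spell out the algorithm, but a minimal sketch of the underlying idea, pairing geo-referenced vehicle positions with their image detections and filtering randomly generated camera-pose hypotheses by reprojection error, could look as follows. All names and parameters (e.g., calibrate_extrinsics, n_hypotheses, inlier_px) are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of geo-referenced extrinsic calibration via hypothesis
# filtering (not the authors' implementation). Assumes time-synchronized
# GNSS/RTK+IMU vehicle positions (N x 3, local geo-referenced frame) and the
# corresponding 2D detections of the vehicle in the camera image (N x 2).
import numpy as np
import cv2


def calibrate_extrinsics(vehicle_xyz, detections_uv, K, dist,
                         n_hypotheses=500, sample_size=6, inlier_px=3.0):
    """Estimate the world-to-camera pose from 3D vehicle positions and their
    2D image detections by generating and filtering pose hypotheses."""
    vehicle_xyz = np.asarray(vehicle_xyz, dtype=np.float64)
    detections_uv = np.asarray(detections_uv, dtype=np.float64)
    n = len(vehicle_xyz)
    best_inliers, best_pose = 0, None
    rng = np.random.default_rng(0)

    for _ in range(n_hypotheses):
        # Generate a hypothesis from a small random subset of correspondences.
        idx = rng.choice(n, size=sample_size, replace=False)
        ok, rvec, tvec = cv2.solvePnP(vehicle_xyz[idx], detections_uv[idx],
                                      K, dist, flags=cv2.SOLVEPNP_EPNP)
        if not ok:
            continue
        # Filter: score the hypothesis by reprojection error on all points.
        proj, _ = cv2.projectPoints(vehicle_xyz, rvec, tvec, K, dist)
        err = np.linalg.norm(proj.reshape(-1, 2) - detections_uv, axis=1)
        inliers = int(np.sum(err < inlier_px))
        if inliers > best_inliers:
            best_inliers, best_pose = inliers, (rvec, tvec)

    return best_pose, best_inliers
```

In practice, the winning hypothesis would typically be refined on all inliers (for example with cv2.solvePnPRefineLM) and transformed from the local frame into geo-referenced coordinates; this sketch only illustrates the generate-and-filter principle.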