Set-Membership Extrinsic Calibration of a 3D LiDAR and a Camera


To fuse information from a 3D Light Detection and Ranging (LiDAR) sensor and a camera, the extrinsic transformation between the sensor coordinate systems needs to be known. Therefore, an extrinsic calibration must be performed, which is usually based on features extracted from sensor data. Naturally, sensor errors can affect the feature extraction process and thus distort the calibration result. Unlike previous works, which do not consider the uncertainties of the sensors, we propose a set-membership approach that takes all sensor errors into account. Since the actual error distribution of off-the-shelf sensors is often unknown, we assume that only bounds (or intervals) enclosing the sensor errors are known, and accordingly introduce novel error models for both sensors. Next, we introduce interval-based approaches to extract corresponding features from images and point clouds. Due to the unknown but bounded sensor errors, we cannot determine the features exactly, but instead compute intervals guaranteed to enclose them. Subsequently, these feature intervals enable us to formulate a Constraint Satisfaction Problem (CSP). Finally, the CSP is solved to find a set of solutions that is guaranteed to contain the true solution and simultaneously reflects the accuracy of the calibration. Experiments using simulated and real data validate our approach and show its advantages over existing methods.
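The paper's pipeline is not reproduced here, but the core mechanism behind solving a CSP over intervals can be illustrated with a minimal forward-backward contractor. The example below is a hypothetical sketch (the function names and the toy constraint z = x + y are illustrations, not the paper's calibration constraints): each interval is narrowed as much as the constraint allows, without ever excluding a value that could be the true one.

```python
# Minimal sketch of interval constraint propagation, the idea behind
# solving a CSP with unknown-but-bounded errors. All names are
# hypothetical illustrations, not taken from the paper.

def intersect(a, b):
    """Intersection of two intervals [lo, hi]; None if empty."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None

def contract_sum(x, y, z):
    """Contract x, y, z under the constraint z = x + y.

    Forward step:  z must lie in x + y.
    Backward steps: x must lie in z - y, and y in z - x.
    Assumes the CSP is consistent (no empty intersections).
    Values outside the returned intervals are guaranteed to be
    infeasible; the true solution is never discarded.
    """
    z = intersect(z, (x[0] + y[0], x[1] + y[1]))
    x = intersect(x, (z[0] - y[1], z[1] - y[0]))
    y = intersect(y, (z[0] - x[1], z[1] - x[0]))
    return x, y, z

# Example: a measured feature interval z = [4, 6] constrains the
# unknowns x = [0, 10] and y = [1, 2].
x, y, z = contract_sum((0, 10), (1, 2), (4, 6))
# x shrinks to (2, 5): only these values are consistent with z = x + y.
```

In the calibration setting, the intervals play the role of the feature enclosures, and the constraints link the LiDAR and camera observations through the unknown extrinsic transformation; repeatedly applying such contractors (possibly combined with bisection) yields the guaranteed solution set described in the abstract.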

Oct 25, 2020
Click on the PDF button above to view the slides of this talk.
Raphael Voges
Doctor in Robotics

My research interests include error modeling, sensor fusion, SLAM, state estimation and AI in the context of autonomous driving.