RGB-Laser Odometry Under Interval Uncertainty for Guaranteed Localization

Abstract

To navigate, robots must localize themselves in their environment. Since GPS may not be available, robots need to estimate their egomotion incrementally using different sensors such as cameras, laser scanners and/or inertial measurement units (IMUs). In the past, we developed a probabilistic approach to estimate the robot's odometry using a monocular camera and a depth sensor while simultaneously estimating the sensors' clock offsets. Nevertheless, our approach, like most others, computes only a point-valued position for the robot while neglecting the uncertainty of the pose estimate. However, due to the imperfections of every sensor, this point position may be erroneous. Further difficulties are the nonlinearity of the problem, which can lead to additional deviations from the true position, and outliers, which can distort the result because these approaches tend to compute a solution that satisfies all observations. To overcome these issues, we propose an interval-based approach. Our method fuses camera, laser scanner and IMU information while taking the sensors' uncertainties into account to compute intervals that are guaranteed to contain the robot's 3D pose. Similar work was done by Kenmogne et al., but they compute the robot's position relative to known landmarks, which we assume are not available. Bethencourt and Jaulin solve a similar problem to ours, but apply it to 3D reconstruction instead of localization.
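To give a flavor of the interval idea, here is a minimal one-dimensional sketch in Python. It is purely illustrative and not the implementation presented in the talk: each measurement is modeled as an interval guaranteed to contain the true value, interval arithmetic propagates these bounds through dead reckoning, and intersection with an absolute bounded-error observation contracts the pose enclosure. The `Interval` class and all numeric values are hypothetical.

```python
# Illustrative sketch only: bounded-error localization with interval arithmetic.

class Interval:
    """A closed interval [lo, hi] assumed to contain the true value."""

    def __init__(self, lo, hi):
        assert lo <= hi, "lower bound must not exceed upper bound"
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Interval addition: [a, b] + [c, d] = [a + c, b + d].
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def intersect(self, other):
        # Intersection contracts the enclosure; an empty result would
        # indicate inconsistent (e.g. outlier-contaminated) measurements.
        lo, hi = max(self.lo, other.lo), min(self.hi, other.hi)
        if lo > hi:
            raise ValueError("empty intersection: measurements inconsistent")
        return Interval(lo, hi)

    def __repr__(self):
        return f"[{self.lo:.2f}, {self.hi:.2f}]"


# Dead reckoning along one axis: the pose interval grows with each step.
x = Interval(0.0, 0.0)                 # known start position
for step in [Interval(0.9, 1.1)] * 3:  # odometry with bounded error
    x = x + step                       # after the loop: [2.70, 3.30]

# An absolute measurement with bounded error (hypothetically from a laser
# scan match) contracts the enclosure instead of averaging it away.
z = Interval(2.95, 3.15)
x = x.intersect(z)
print(x)                               # [2.95, 3.15]
```

Note the contrast with a probabilistic filter: instead of fusing measurements into a single point estimate, the intersection yields a set that still contains the true position as long as the error bounds hold, which is what makes the localization "guaranteed".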

Date
Jul 26, 2018
Location
University of Rostock, Rostock, Germany
Raphael Voges
Doctor in Robotics

My research interests include error modeling, sensor fusion, SLAM, state estimation and AI in the context of autonomous driving.
