The autonomous driving of robots requires precise and reliable positioning with low-cost sensors. In this paper, we propose a tightly coupled sensor fusion of multiple complementary sensors, including GNSS-RTK, INS, odometry, a Local Positioning System (LPS) and Visual Positioning. The focus of this paper is on the integration of LPS and vision, since the coupling of GNSS-RTK, INS and odometry is already state of the art. We include the positions of the LPS anchors as well as the bearing vectors and distances from the robot's camera to the patch features in the state vector of our Kalman filter, and show the achievable positioning accuracies.
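The state augmentation described above can be illustrated with a minimal sketch. This is not the authors' implementation: it shows only one EKF measurement update for a single range observation between the robot and one LPS anchor, with the anchor position kept in the state vector so that it is refined alongside the robot position. The state layout, noise values, and geometry are illustrative assumptions.

```python
import numpy as np

def range_update(x, P, z, R):
    """One EKF update for a robot-to-anchor range measurement.

    x : (6,) state = [robot position (3), anchor position (3)]
    P : (6,6) state covariance
    z : measured range (scalar), R : measurement variance (scalar)
    """
    p, a = x[:3], x[3:]
    d = p - a
    r = np.linalg.norm(d)        # predicted range h(x) = ||p - a||
    u = d / r                    # unit vector from anchor to robot
    H = np.hstack([u, -u])       # Jacobian of h w.r.t. [p, a], shape (6,)
    S = H @ P @ H + R            # scalar innovation variance
    K = P @ H / S                # Kalman gain, shape (6,)
    x_new = x + K * (z - r)      # correct robot AND anchor estimates
    P_new = (np.eye(6) - np.outer(K, H)) @ P
    return x_new, P_new

# Illustrative use: robot truly at (1, 0, 0), anchor at the origin,
# so the true range is 1.0; the prior robot estimate is biased.
x0 = np.array([1.2, 0.0, 0.0, 0.0, 0.0, 0.0])
P0 = 0.1 * np.eye(6)
x1, P1 = range_update(x0, P0, z=1.0, R=0.01)
```

The same pattern extends to the bearing-vector observations from the camera: each adds a measurement function and Jacobian over the augmented state, and all corrections are applied jointly in the filter.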