This work proposes a fusion of inertial measurement units (IMUs) with a visual tracking system on an embedded device. Both the sensor-to-sensor calibration and the pose estimation are achieved through an unscented Kalman filter (UKF). Two approaches to UKF-based pose estimation are presented: the first uses the pose estimated by the visual SLAM system as the measurement input of the UKF; the second modifies the motion model of the visual tracking system. Our results show that IMUs increase tracking accuracy even when the visual SLAM system is left untouched, while requiring little computational power. Furthermore, an accelerometer-based estimation of the map scale is presented and discussed.
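The first (loosely coupled) approach amounts to a standard UKF predict/update cycle in which the SLAM pose plays the role of the measurement. The following is a minimal 1-D toy sketch of that idea, not the implementation described here: the two-component state (position, velocity), the constant-velocity motion model standing in for IMU propagation, and all noise parameters are illustrative assumptions.

```python
import numpy as np

def ukf_step(x, P, z, dt, q_acc, r_slam, alpha=1.0, beta=2.0, kappa=0.0):
    """One UKF predict/update cycle for a loosely coupled fusion sketch.

    State x = [position, velocity] (1-D toy). The SLAM pose enters only
    as the scalar position measurement z; q_acc and r_slam are assumed
    process/measurement noise parameters."""
    n = x.size
    lam = alpha**2 * (n + kappa) - n
    # Sigma points around the current estimate (columns of the matrix
    # square root of (n + lam) * P, via Cholesky).
    L = np.linalg.cholesky((n + lam) * P)
    X = np.vstack([x, x + L.T, x - L.T])                  # (2n+1, n)
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = Wm[0] + (1.0 - alpha**2 + beta)

    # Predict: constant-velocity motion model (stand-in for IMU propagation).
    Xp = np.column_stack([X[:, 0] + dt * X[:, 1], X[:, 1]])
    x_pred = Wm @ Xp
    d = Xp - x_pred
    Q = q_acc * np.array([[dt**3 / 3, dt**2 / 2],
                          [dt**2 / 2, dt]])               # white-accel noise
    P_pred = d.T @ (Wc[:, None] * d) + Q

    # Update: measurement model h(x) = position, i.e. the SLAM pose.
    Z = Xp[:, 0]
    z_pred = Wm @ Z
    dz = Z - z_pred
    S = dz @ (Wc * dz) + r_slam                            # innovation variance
    Pxz = d.T @ (Wc * dz)                                  # cross-covariance
    K = Pxz / S                                            # Kalman gain
    x_new = x_pred + K * (z - z_pred)
    P_new = P_pred - np.outer(K, K) * S
    return x_new, P_new
```

Feeding noisy SLAM positions of a constant-velocity trajectory into this loop recovers both position and velocity, which illustrates why the fused estimate can outperform the raw visual pose alone.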