Metric scale and 6DoF pose estimation using a color camera and distance sensors

2021-2-26
Ölmez, Burhan
Monocular color cameras have been widely used for decades in 6DoF pose estimation and sparse 3D point cloud creation with SfM, VO, and V-SLAM algorithms. In this thesis, a novel algorithm is presented to estimate the metric scale of a monocular visual odometry algorithm using a distance sensor. The method uses a state-of-the-art visual odometry algorithm, Semi-Direct Visual Odometry (SVO) [1], to obtain a sparse 3D point cloud, and then matches these points with measurements from the distance sensor to estimate the metric scale. Moreover, the scale parameter is modeled as a Gaussian random variable and updated with each calculated scale using a Kalman filter for a more stable result. Additionally, multiple distance sensors are used to estimate the scale more accurately; it is observed that the scale accuracy improves significantly when multiple sensors are available. As another novel approach, estimation of the roll and pitch angles of the camera platform is considered. This is achieved with respect to the ground plane using three distance sensors placed in a specific geometry and their corresponding 3D point cloud matches. This angle information does not drift over time thanks to the direct metric measurements from the distance sensors. Finally, with four special distance sensors that can leave marks on the environment, direct 6DoF pose estimation with respect to the marked pattern is obtained. A novel heuristic pattern and a pattern recognition algorithm are proposed. Several simulations are performed on a MAV equipped with a camera and distance sensors in an advanced SITL environment, and the performance of the proposed approaches is shown to exceed that of previous works in different scenarios.
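The scale-fusion step described above can be sketched as a scalar Kalman filter: each distance-sensor reading paired with the corresponding (unscaled) SVO depth yields a raw scale measurement, which updates a Gaussian estimate of the scale. This is an illustrative sketch only; the class name, initial values, and noise variances are assumptions, not taken from the thesis.

```python
# Illustrative scalar Kalman filter for metric-scale estimation.
# A raw scale measurement is z = (metric distance) / (unscaled SVO depth);
# the Gaussian scale estimate (mean, variance) is updated with each z.
# All parameter values here are assumed for demonstration.

class ScaleKalmanFilter:
    def __init__(self, scale0=1.0, var0=1.0, meas_var=0.05):
        self.scale = scale0       # current scale estimate (Gaussian mean)
        self.var = var0           # variance of the estimate
        self.meas_var = meas_var  # assumed measurement-noise variance

    def update(self, sensor_distance_m, svo_depth):
        """Fuse one raw scale measurement from a matched sensor/map pair."""
        z = sensor_distance_m / svo_depth           # raw scale measurement
        k = self.var / (self.var + self.meas_var)   # Kalman gain
        self.scale += k * (z - self.scale)          # posterior mean
        self.var *= (1.0 - k)                       # posterior variance
        return self.scale
```

With multiple distance sensors, each matched sensor/point pair simply contributes one more `update` call, which shrinks the posterior variance and stabilizes the scale, consistent with the observation that accuracy improves with more sensors.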
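The drift-free roll/pitch idea can likewise be sketched with simple geometry: three downward-pointing distance sensors at known body-frame positions give three metric contact points on the ground; the normal of the plane through those points yields roll and pitch relative to the ground. Sensor placement, ray direction, and sign conventions below are illustrative assumptions, not the thesis's exact formulation.

```python
import numpy as np

# Illustrative roll/pitch recovery from three downward-pointing distance
# sensors. Assumes sensors measure range along the body -z axis from known
# (x, y) mounting positions; axis and sign conventions are assumptions.

def roll_pitch_from_ground(sensor_xy, distances):
    """sensor_xy: (3, 2) sensor mount positions in the body frame;
    distances: three metric ranges measured along body -z to the ground.
    Returns (roll, pitch) of the body relative to the ground plane [rad]."""
    pts = np.column_stack([sensor_xy, -np.asarray(distances, float)])
    # Normal of the plane through the three ground contact points
    n = np.cross(pts[1] - pts[0], pts[2] - pts[0])
    n = n / np.linalg.norm(n)
    if n[2] < 0:                      # orient the normal "up" in the body frame
        n = -n
    roll = np.arctan2(n[1], n[2])                     # tilt about body x
    pitch = -np.arctan2(n[0], np.hypot(n[1], n[2]))   # tilt about body y
    return roll, pitch
```

Because each angle comes directly from metric range measurements rather than integrated increments, the estimate carries no accumulated drift, which is the property the abstract highlights.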
Citation Formats
B. Ölmez, “Metric scale and 6DoF pose estimation using a color camera and distance sensors,” M.S. - Master of Science, Middle East Technical University, 2021.