Robust Multispectral Visual-Inertial Navigation With Visual Odometry Failure Recovery
Date: 2022-07-01
Authors: Beauvisage, Axel; Ahıska, Kenan; Aouf, Nabil
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Despite the large amount of information multispectral imaging offers, multispectral visual odometry remains overlooked due to the dissimilarity between modalities. To tackle the challenging feature matching between multispectral stereo images and to overcome the lack of robust multispectral visual localisation solutions, a novel approach is proposed in this paper. It consists of tracking features in each modality simultaneously, in a monocular manner, and then estimating motion in a windowed bundle adjustment framework, using the geometry of the stereo setup to recover the missing scale. The estimation is robustified by selecting adequate keyframes based on feature parallax and by maximising the mutual information between all the features reprojected in the stereo pair. Furthermore, the proposed multispectral visual odometry solution is integrated into an error-state Kalman filter framework to deal with challenging environments where image quality is reduced. Two measurement models, using absolute and relative camera poses, are presented. The superiority of relative poses is then shown by providing a failure recovery algorithm which relies on inertial data when visual data are not available. The algorithm was tested on an innovative series of visible-thermal multispectral datasets acquired from a car under real driving conditions. An overall error of around 2% of the travelled distance was achieved on these datasets.
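The fusion scheme described above can be illustrated with a deliberately simplified sketch: a 1-D Kalman filter that propagates the state with inertial (acceleration) data and corrects it with relative-displacement measurements, as visual odometry would provide between consecutive frames. When the visual measurement is unavailable, only the inertial prediction is used, mimicking the failure-recovery behaviour. This is not the paper's actual filter (which is an error-state EKF over full 6-DoF poses with the measurement models described in the abstract); all function names and parameters here are hypothetical.

```python
import numpy as np

def predict(x, P, a, dt, q):
    """Propagate [position, velocity] with a measured acceleration a (inertial step)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    u = np.array([0.5 * a * dt**2, a * dt])
    x = F @ x + u
    P = F @ P @ F.T + q * np.eye(2)
    return x, P

def update_relative(x, P, dx_meas, prev_pos, r):
    """Fuse a relative displacement measurement (VO step): z = current pos - prev pos."""
    H = np.array([[1.0, 0.0]])     # the measurement observes position only
    z = dx_meas + prev_pos         # convert relative measurement to an absolute one
    y = z - H @ x                  # innovation
    S = H @ P @ H.T + r            # innovation covariance
    K = P @ H.T / S                # Kalman gain
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate constant acceleration; VO "fails" on every 10th step, in which
# case the filter falls back to inertial prediction alone.
x, P, dt, a = np.array([0.0, 0.0]), np.eye(2), 0.1, 1.0
prev_pos = 0.0
for k in range(50):
    x, P = predict(x, P, a, dt, q=1e-4)
    if k % 10 != 9:                # VO measurement available
        true_prev = 0.5 * a * (k * dt) ** 2
        true_now = 0.5 * a * ((k + 1) * dt) ** 2
        x, P = update_relative(x, P, true_now - true_prev, prev_pos, r=1e-3)
    prev_pos = x[0]
print(round(float(x[0]), 2))       # estimated position after 5 s of motion
```

Because the measurements are relative, a dropped frame only suspends the correction; the filter keeps propagating on inertial data and the next relative measurement re-anchors the estimate, which is the property the paper exploits for failure recovery.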
URI: https://hdl.handle.net/11511/117644
Journal: IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS
DOI: https://doi.org/10.1109/tits.2021.3090675
Collections: Department of Electrical and Electronics Engineering, Article
Citation Formats
A. Beauvisage, K. Ahıska, and N. Aouf, “Robust Multispectral Visual-Inertial Navigation With Visual Odometry Failure Recovery,” IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, vol. 23, no. 7, pp. 9089–9101, 2022, Accessed: 00, 2025. [Online]. Available: https://hdl.handle.net/11511/117644.