#Visual Inertial Odometry #IMM Filter #Robotics #Navigation #RealSense T265 #Jetson Xavier NX
Robust Navigation System Based on IMM Filtering Using Multiple Tracking Cameras
Introduction
In the evolving landscape of autonomous robotics, robust navigation systems are essential for reliable operation in complex environments. This project explores a novel approach to visual-inertial navigation using multiple tracking cameras, addressing the limitations of single-camera systems in texture-poor or poorly illuminated scenes.
Methodology
The proposed system integrates multiple Intel RealSense T265 tracking cameras facing different directions. It employs an Interacting Multiple Model (IMM) filtering framework to dynamically weight and fuse the odometry estimates from each camera according to their current accuracy. An external sensor provides auxiliary measurements to the IMM filter, enhancing robustness against individual camera failures.
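The weighting-and-fusion idea can be sketched as a minimal one-dimensional IMM cycle with two modes, each trusting a different camera's position estimate. The motion model, noise variances, and transition matrix below are illustrative assumptions for the sketch, not the parameters used in the paper:

```python
import numpy as np

# Hypothetical parameters: 1-D random-walk motion model, two cameras.
F, Q = 1.0, 0.01               # state transition and process noise
R = [0.05, 0.05]               # per-camera measurement noise variances
PI = np.array([[0.95, 0.05],   # mode transition probabilities:
               [0.05, 0.95]])  # PI[i, j] = P(mode j | mode i)

def imm_step(x, P, mu, z):
    """One IMM cycle: mixing, mode-matched KF updates, probability update.

    x, P : per-mode state estimates and variances (shape (2,))
    mu   : current mode probabilities (shape (2,))
    z    : position measurements from camera 0 and camera 1 (shape (2,))
    """
    # 1) Mixing: predicted mode probabilities and mixed initial conditions
    c = PI.T @ mu                          # c[j] = sum_i PI[i, j] * mu[i]
    w = (PI * mu[:, None]) / c[None, :]    # w[i, j] = P(mode i | mode j next)
    x0 = w.T @ x
    P0 = np.array([np.sum(w[:, j] * (P + (x - x0[j]) ** 2)) for j in range(2)])
    # 2) Mode-matched Kalman filters: mode j trusts camera j's measurement
    xs, Ps, L = np.zeros(2), np.zeros(2), np.zeros(2)
    for j in range(2):
        xp, Pp = F * x0[j], F * P0[j] * F + Q   # predict
        S = Pp + R[j]                            # innovation covariance
        K = Pp / S                               # Kalman gain
        xs[j] = xp + K * (z[j] - xp)             # update
        Ps[j] = (1.0 - K) * Pp
        # Gaussian likelihood of camera j's measurement under mode j
        L[j] = np.exp(-0.5 * (z[j] - xp) ** 2 / S) / np.sqrt(2 * np.pi * S)
    # 3) Mode probability update and combined estimate
    mu_new = L * c
    mu_new /= mu_new.sum()
    x_hat = mu_new @ xs
    return xs, Ps, mu_new, x_hat
```

Feeding the filter an erratic signal from one camera (mimicking a covered lens) while the other stays consistent drives the mode probability toward the healthy camera, which is the behavior the fusion framework relies on.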
Results
Real-world experiments were conducted using a handheld device equipped with two T265 cameras and an Nvidia Jetson Xavier NX. The system demonstrated reliable performance in both normal and failure-induced scenarios. When one camera was covered, the IMM filter shifted its mode probabilities toward the unobstructed camera, maintaining accurate navigation.
Conclusion
This project presents a robust, real-time navigation system that effectively integrates multiple tracking cameras using IMM filtering. It enhances the reliability of visual-inertial navigation in challenging environments and demonstrates superior performance compared to single-camera setups. Future work will focus on extending this approach to open-source VINS algorithms for broader applicability.
Publication
Citation
@inproceedings{kuruppu2024robust,
  title={Robust Navigation Based on an Interacting Multiple-Model Filtering Framework Using Multiple Tracking Cameras},
  author={Kuruppu Arachchige, Sasanka and Lee, Kyuman},
  booktitle={AIAA SCITECH 2024 Forum},
  pages={1175},
  year={2024}
}