We have recently released the source code of our visual odometry algorithm, which supports stereo and RGB-D camera systems and is compatible with ROS (Robot Operating System). The source code is available at: (github link).
Our journal publication was recently accepted and is available in Sensors. This is the paper's abstract:
Lightweight Visual Odometry for Autonomous Mobile Robots
Vision-based motion estimation is an effective means of mobile robot localization and is often used in conjunction with other sensors for navigation and path planning. This paper presents a low-overhead, real-time ego-motion estimation (visual odometry) system based on either a stereo or an RGB-D sensor. The algorithm outperforms typical frame-to-frame approaches in accuracy by maintaining a limited local map, while requiring significantly less memory and computational power than the global maps common in full visual SLAM methods. The algorithm is evaluated on common publicly available datasets that span different use cases, and its performance is compared to that of comparable open-source systems in terms of accuracy, frame rate, and memory requirements. This paper accompanies the release of the source code as a modular software package for the robotics community, compatible with the Robot Operating System (ROS).
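To make the local-map idea from the abstract concrete, here is a minimal sketch of how a bounded local map can sit between frame-to-frame matching and a full SLAM global map. This is a hypothetical illustration only, not the released implementation; the class, method names, and the 1-D pose are invented for clarity.

```python
from collections import deque


class LocalMapOdometry:
    """Illustrative sketch (not the authors' code): visual odometry
    that registers each frame against a small, bounded local map."""

    def __init__(self, window=5):
        # A fixed-size window of recent keyframe landmarks: memory
        # stays bounded, unlike a global map in full visual SLAM.
        self.local_map = deque(maxlen=window)
        self.pose = 0.0  # 1-D pose stand-in for a full SE(3) pose

    def process_frame(self, motion_estimate, landmarks):
        # Aligning against several recent keyframes (the local map)
        # rather than only the previous frame reduces drift, while
        # the deque's maxlen keeps the cost constant per frame.
        self.pose += motion_estimate
        self.local_map.append(landmarks)
        return self.pose


# Hypothetical usage: feed ten unit motions through a 3-frame window.
vo = LocalMapOdometry(window=3)
for i in range(10):
    pose = vo.process_frame(1.0, {"frame": i})
print(pose, len(vo.local_map))
```

The key design point sketched here is that the map size (and hence memory and matching cost) is a constant chosen up front, independent of trajectory length.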
This video highlights the algorithm in action: