MBA-VO: Motion Blur Aware Visual Odometry
This paper introduces MBA-VO, a novel visual odometry pipeline designed to address the significant challenge posed by motion blur in low-light conditions. Motion blur, often exacerbated by longer exposure times required in dim environments, can degrade the performance of traditional visual odometry systems. MBA-VO tackles this by explicitly modeling and estimating the camera's local trajectory within the exposure time, enabling active compensation for motion blur.
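To make the underlying idea concrete, the standard way to describe a motion-blurred frame is as the time average of the virtual sharp images the camera would have seen along its trajectory during the exposure; this is a generic formulation consistent with the summary above, not a reproduction of the paper's exact parameterization:

```latex
% Blurred image as the time-average of virtual sharp images over the exposure
B(\mathbf{x}) \;=\; \frac{1}{\tau}\int_{0}^{\tau} I_{t}(\mathbf{x})\,\mathrm{d}t
\;\approx\; \frac{1}{n}\sum_{i=1}^{n} I_{t_i}(\mathbf{x})
```

Here \(\tau\) is the exposure time, \(I_{t}\) is the virtual sharp image at the camera pose at time \(t\), and the discrete sum is the practical approximation obtained by sampling a few poses within the exposure. Estimating the intra-exposure trajectory therefore amounts to explaining the blur rather than treating it as noise.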
Key Contributions:
- Novel Hybrid Visual Odometry Pipeline: MBA-VO employs a hybrid approach that combines direct methods with explicit motion blur modeling.
- Motion Blur Compensation: The pipeline actively compensates for motion blur by estimating the camera's trajectory during the exposure period.
- New Benchmarking Dataset: A novel dataset specifically designed for evaluating motion blur-aware visual odometry has been developed.
Problem Addressed:
Motion blur is a critical issue for visual odometry, particularly in scenarios with low light where longer exposure times are necessary. This leads to blurred images, even with moderate camera movement, significantly impacting the accuracy and robustness of odometry estimation.
Methodology:
MBA-VO's core innovation lies in its ability to model the image formation process, including the camera's motion during exposure. By estimating this local trajectory, the system can effectively mitigate the negative effects of motion blur. This direct approach allows for a more accurate representation of the scene and camera motion.
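As a minimal sketch of this idea, the code below synthesizes a blurred frame by averaging virtual sharp views rendered along a linearly interpolated intra-exposure trajectory, and scores a trajectory hypothesis with a direct photometric residual against the observed blurred frame. All names here (render_sharp_view, xi_start, xi_end) are illustrative assumptions; the authors' pipeline parameterizes the trajectory and the optimization in its own way.

```python
import numpy as np

def interpolate_pose(xi_start, xi_end, alpha):
    """Linearly interpolate two camera poses given as 6-D se(3)-style vectors.
    (A faithful implementation would interpolate on the SE(3) manifold.)"""
    return (1.0 - alpha) * np.asarray(xi_start) + alpha * np.asarray(xi_end)

def synthesize_blurred(render_sharp_view, xi_start, xi_end, n_samples=8):
    """Approximate the blurred image as the average of virtual sharp views
    sampled uniformly over the exposure interval."""
    frames = []
    for i in range(n_samples):
        alpha = i / (n_samples - 1)
        xi = interpolate_pose(xi_start, xi_end, alpha)
        frames.append(render_sharp_view(xi))  # sharp image at this sampled pose
    return np.mean(frames, axis=0)

def photometric_cost(observed_blurred, render_sharp_view, xi_start, xi_end):
    """Direct photometric residual between the observed blurred frame and the
    frame synthesized under the current trajectory hypothesis; a blur-aware
    tracker would minimize this over (xi_start, xi_end)."""
    synthesized = synthesize_blurred(render_sharp_view, xi_start, xi_end)
    return float(np.sum((observed_blurred - synthesized) ** 2))
```

In practice the renderer would warp a reference keyframe using estimated depth, and the optimization would use robust weighting and analytic Jacobians; the sketch only conveys why estimating the camera's motion within the exposure lets the system compensate for blur instead of being degraded by it.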
Experimental Results:
Experiments demonstrate that MBA-VO achieves improved robustness compared to methods that do not account for motion blur. Crucially, it maintains accuracy comparable to systems operating with images free from motion blur. The developed benchmarking dataset further validates these findings, providing a standardized way to assess performance in challenging visual conditions.
Research Areas and Labs:
This research falls under the umbrella of Computer Vision and is associated with the Spatial AI Lab – Zurich. The work contributes to advancements in robotics, autonomous systems, and computer vision techniques for challenging environments.
Authors and Affiliations:
The paper is authored by Peidong Liu, Xingxing Zuo, Viktor Larsson, and Marc Pollefeys. Peidong Liu and Viktor Larsson are affiliated with ETH Zurich, and Marc Pollefeys holds a joint appointment at ETH Zurich and Microsoft.
Publication Details:
- Conference: International Conference on Computer Vision (ICCV) 2021
- Date: October 2021
Original article available at: https://www.microsoft.com/en-us/research/publication/mba-vo-motion-blur-aware-visual-odometry/