Article

Learned Camera Gain and Exposure Control for Improved Visual Feature Detection and Matching

Journal

IEEE Robotics and Automation Letters
Volume 6, Issue 2, Pages 2028-2035

Publisher

IEEE (Institute of Electrical and Electronics Engineers), Inc.
DOI: 10.1109/LRA.2021.3058909

Keywords

Deep learning for visual perception; vision-based navigation; visual learning

Funding

  1. Canada Research Chairs program

Abstract

Successful visual navigation depends upon capturing images that contain sufficient useful information. In this letter, we explore a data-driven approach to account for environmental lighting changes, improving the quality of images for use in visual odometry (VO) or visual simultaneous localization and mapping (SLAM). We train a deep convolutional neural network model to predictively adjust camera gain and exposure time parameters such that consecutive images contain a maximal number of matchable features. The training process is fully self-supervised: our training signal is derived from an underlying VO or SLAM pipeline and, as a result, the model is optimized to perform well with that specific pipeline. We demonstrate through extensive real-world experiments that our network can anticipate and compensate for dramatic lighting changes (e.g., transitions into and out of road tunnels), maintaining a substantially higher number of inlier feature matches than competing camera parameter control algorithms.
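As a rough illustration of the self-supervised labeling idea described in the abstract (this is not the authors' implementation; the function names and the gradient-based match proxy below are hypothetical stand-ins for a real feature detector and RANSAC inlier count), one could score candidate (gain, exposure) settings by how many matchable features they yield between consecutive frames, and use the best-scoring setting as the training target for the network:

```python
import numpy as np

def count_matchable_features(img_a, img_b, threshold=30):
    """Hypothetical proxy for the inlier feature-match count.

    Counts pixel locations whose local gradient magnitude exceeds a
    threshold in BOTH frames. A real VO/SLAM pipeline would instead run
    feature detection (e.g. ORB) plus RANSAC-verified matching.
    """
    def grad_mag(img):
        gx = np.abs(np.diff(img.astype(float), axis=1))[:-1, :]
        gy = np.abs(np.diff(img.astype(float), axis=0))[:, :-1]
        return gx + gy  # shape (H-1, W-1)

    strong_a = grad_mag(img_a) > threshold
    strong_b = grad_mag(img_b) > threshold
    return int(np.sum(strong_a & strong_b))

def best_setting(frame_pairs_by_setting):
    """Pick the (gain, exposure) candidate maximizing the match proxy.

    frame_pairs_by_setting: dict mapping (gain, exposure_time) tuples to
    a pair of consecutive images captured under that setting. The argmax
    serves as the self-supervised label for training the predictor.
    """
    return max(
        frame_pairs_by_setting,
        key=lambda s: count_matchable_features(*frame_pairs_by_setting[s]),
    )
```

For example, a saturated frame (every pixel at 255, as after an over-exposed tunnel exit) has no gradients and thus scores zero, so `best_setting` would prefer a well-exposed, textured capture; the network is then trained to predict that preferred setting ahead of time rather than reacting after the fact.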

