Article

iSimLoc: Visual Global Localization for Previously Unseen Environments With Simulated Images

Journal

IEEE TRANSACTIONS ON ROBOTICS
Volume 39, Issue 3, Pages 1893-1909

Publisher

IEEE (Institute of Electrical and Electronics Engineers), Inc.
DOI: 10.1109/TRO.2023.3238201

Keywords

Visualization; Feature extraction; Location awareness; Lighting; Cameras; Global Positioning System; Urban areas; Aerial visual terrain navigation; GPS-denied localization; hierarchical global relocalization; sim-to-real

Summary

This article introduces iSimLoc, a learning-based global relocalization approach that matches visual data despite significant appearance and viewpoint differences. A place recognition network matches query images to reference images from a different stylistic domain and viewpoint, and a hierarchical global relocalization module enables fast and accurate pose estimation. iSimLoc achieves high successful retrieval rates, substantially outperforming the next best method, at 1.5 s inference time.

Abstract

The camera is an attractive device for beyond-visual-line-of-sight drone operation because it offers low size, weight, power, and cost. However, state-of-the-art visual localization algorithms struggle to match visual data whose appearance differs significantly due to changes in illumination or viewpoint. This article presents iSimLoc, a learning-based global relocalization approach that is robust to appearance and viewpoint differences. The features learned by iSimLoc's place recognition network can be used to match query images to reference images of a different stylistic domain and viewpoint. In addition, our hierarchical global relocalization module searches in a coarse-to-fine manner, allowing iSimLoc to perform fast and accurate pose estimation. We evaluate our method on a dataset with appearance variations and a dataset demonstrating large-scale matching during a long flight over complex terrain. iSimLoc achieves 88.7% and 83.8% successful retrieval rates on the two datasets, with 1.5 s inference time, compared to 45.8% and 39.7% for the next best method. These results demonstrate robust localization across a range of environments and conditions.
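The coarse-to-fine search mentioned in the abstract can be illustrated with a small, generic retrieval sketch. The code below is not the authors' iSimLoc implementation and makes no claim about their network or data structures; it only shows, under assumed inputs (hypothetical names such as coarse_centers and fine_db), how a query descriptor might first be matched against coarse region descriptors and then refined within the top-ranked regions, with the best fine match mapping back to a stored reference pose.

```python
# Minimal, generic sketch of coarse-to-fine descriptor retrieval.
# NOT the authors' implementation; all names and shapes are illustrative.
import numpy as np


def cosine_similarity(query, db):
    """Cosine similarity between a query vector (D,) and a descriptor matrix (N, D)."""
    q = query / (np.linalg.norm(query) + 1e-12)
    d = db / (np.linalg.norm(db, axis=1, keepdims=True) + 1e-12)
    return d @ q  # shape (N,)


def coarse_to_fine_retrieval(query_desc, coarse_centers, fine_db, top_k=3):
    """Return ((region, index within region), score) of the best fine match.

    query_desc    : (D,)   descriptor of the query image
    coarse_centers: (C, D) one representative descriptor per coarse region
    fine_db       : list of (N_c, D) arrays, fine descriptors per coarse region
    top_k         : number of coarse regions to expand in the fine stage
    """
    # Coarse stage: rank regions by similarity to the query.
    coarse_scores = cosine_similarity(query_desc, coarse_centers)
    best_regions = np.argsort(coarse_scores)[::-1][:top_k]

    # Fine stage: search only inside the selected regions.
    best_match, best_score = None, -np.inf
    for r in best_regions:
        scores = cosine_similarity(query_desc, fine_db[r])
        i = int(np.argmax(scores))
        if scores[i] > best_score:
            best_match, best_score = (int(r), i), float(scores[i])
    return best_match, best_score


# Toy usage: 4 coarse regions, 8-D descriptors, 10 fine references per region.
rng = np.random.default_rng(0)
coarse_centers = rng.normal(size=(4, 8))
fine_db = [rng.normal(size=(10, 8)) for _ in range(4)]
query = rng.normal(size=8)
print(coarse_to_fine_retrieval(query, coarse_centers, fine_db))
```

In a localization setting, the returned (region, index) pair would be looked up in a table of reference poses; restricting the fine search to a few coarse regions is what keeps the overall query time low.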
