4.6 Article

OverlapTransformer: An Efficient and Yaw-Angle-Invariant Transformer Network for LiDAR-Based Place Recognition

Journal

IEEE ROBOTICS AND AUTOMATION LETTERS
Volume 7, Issue 3, Pages 6958-6965

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/LRA.2022.3178797

Keywords

SLAM; deep learning methods; data sets for robot learning

Funding

  1. HAOMO.AI Technology Co. Ltd.
  2. Chinese Scholarship Committee

Abstract

This article addresses the problem of place recognition based on 3D LiDAR scans and proposes a novel lightweight neural network approach. The authors achieve fast execution by using the range image representation of LiDAR sensors and design a yaw-angle-invariant network architecture to improve place recognition performance. Experimental results demonstrate that the method performs well across different environments.
Place recognition is an important capability for autonomously navigating vehicles operating in complex environments and under changing conditions. It is a key component for tasks such as loop closing in SLAM or global localization. In this letter, we address the problem of place recognition based on 3D LiDAR scans recorded by an autonomous vehicle. We propose a novel lightweight neural network exploiting the range image representation of LiDAR sensors to achieve fast execution with less than 2 ms per frame. We design a yaw-angle-invariant architecture exploiting a transformer network, which boosts the place recognition performance of our method. We evaluate our approach on the KITTI and Ford Campus datasets. The experimental results show that our method can effectively detect loop closures compared to the state-of-the-art methods and generalizes well across different environments. To evaluate long-term place recognition performance, we provide a novel dataset containing LiDAR sequences recorded by a mobile robot in repetitive places at different times.
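The abstract highlights two ingredients: the range image representation of LiDAR scans and a yaw-angle-invariant architecture. The sketch below is a minimal illustration of why the range image suits yaw invariance: it projects a point cloud into a range image via spherical projection and checks that a descriptor which ignores column order is unchanged when the sensor is rotated about its vertical axis. All parameters (64x900 image, a roughly -25 to +3 degree vertical field of view, the toy max-over-columns descriptor) are assumptions for illustration only and are not taken from the paper.

```python
import numpy as np

def to_range_image(points, h=64, w=900, fov_up_deg=3.0, fov_down_deg=-25.0):
    """Spherical projection of an (N, 3) point cloud into an (h, w) range image."""
    fov_up = np.radians(fov_up_deg)
    fov_down = np.radians(fov_down_deg)
    fov = fov_up - fov_down

    r = np.linalg.norm(points, axis=1)                       # range per point
    yaw = np.arctan2(points[:, 1], points[:, 0])             # azimuth in [-pi, pi]
    pitch = np.arcsin(np.clip(points[:, 2] / np.maximum(r, 1e-8), -1.0, 1.0))

    # Azimuth maps to the image column, elevation to the image row
    # (rows are clipped here because the toy cloud exceeds the assumed FOV).
    u = np.floor((1.0 - (yaw / np.pi + 1.0) / 2.0) * w).astype(np.int64) % w
    v = np.clip(np.floor((1.0 - (pitch - fov_down) / fov) * h), 0, h - 1).astype(np.int64)

    img = np.zeros((h, w), dtype=np.float64)
    # Keep the largest range per pixel (toy choice; real pipelines
    # typically keep the closest return instead).
    np.maximum.at(img, (v, u), r)
    return img

# A yaw rotation of the sensor only shifts the range image circularly along its
# columns, so any descriptor that ignores column order is unaffected by yaw.
rng = np.random.default_rng(0)
points = rng.normal(size=(10000, 3)) * 10.0                  # toy point cloud

angle = np.radians(30.0)
R = np.array([[np.cos(angle), -np.sin(angle), 0.0],
              [np.sin(angle),  np.cos(angle), 0.0],
              [0.0,            0.0,           1.0]])          # rotation about z (yaw)

desc = to_range_image(points).max(axis=1)                     # toy descriptor: max over columns
desc_rot = to_range_image(points @ R.T).max(axis=1)
print(np.allclose(desc, desc_rot))                            # True: unchanged under yaw rotation
```

The paper's yaw-angle-invariant architecture exploits the same geometric fact with learned features and a transformer rather than a hand-crafted pooling, but the underlying argument is identical: a yaw rotation only reorders range-image columns.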
