Article

Fusing mmWave Radar With Camera for 3-D Detection in Autonomous Driving

Journal

IEEE INTERNET OF THINGS JOURNAL
Volume 9, Issue 20, Pages 20408-20421

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/JIOT.2022.3175375

Keywords

Radar; Three-dimensional displays; Radar imaging; Feature extraction; Radar detection; Cameras; Laser radar; 3-D detection; autonomous driving (AD); camera; millimeter-wave (mmWave) radar; sensors fusion

Funding

  1. National Key Research and Development Program [2020YFA0711302]
  2. Beijing Municipal Natural Science Foundation [L192031]
  3. National Natural Science Foundation of China [U21B2014]

Abstract

This paper focuses on fusing millimeter-wave radar data with monocular images at the feature level to enhance 3-D detection capability.
Three-dimensional detection is essential for autonomous driving and intelligent transportation systems, as it enables vehicles to detect and track surrounding objects. Autonomous vehicles are usually equipped with multiple sensing modalities to achieve robust and precise detection. This work focuses on fusing millimeter-wave radar data with monocular images, as radar can compensate for the camera's lack of explicit depth information. We propose a novel approach that fuses radar data and images at the feature level for 3-D detection. Radar points are first merged into a raw feature map, using data set statistics, by a novel transformation method. With this transformation, radar features can be extracted by convolutional neural networks and fused with image features. Object properties, including location, dimension, and rotation, are regressed from the fused features. In this article, the proposed fusion strategy is implemented with a keypoint-based 3-D detection framework and evaluated on the challenging nuScenes data set. Experimental results suggest that fusing radar data improves 3-D detection performance on this public benchmark.
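
The pipeline described in the abstract (rasterizing radar points into a raw image-plane feature map, extracting radar features with a CNN, concatenating them with image features, and regressing location, dimension, and rotation) can be illustrated with a minimal PyTorch-style sketch. Everything below is a hypothetical illustration, not the authors' implementation: the channel layout (depth, radial velocity, radar cross-section), the pinhole projection, the network sizes, and names such as radar_to_feature_map and RadarCameraFusionHead are assumptions made here for clarity.

```python
# Hypothetical sketch (assumed names, channels, and sizes), not the authors' released code.
import torch
import torch.nn as nn


def radar_to_feature_map(points, calib, height, width):
    """Rasterize radar points into a raw image-plane feature map.

    points: (N, 5) tensor of (x, y, z, radial_velocity, rcs) in camera coordinates,
            assumed to lie in front of the camera.
    calib:  (3, 4) camera projection matrix.
    Returns a (3, height, width) map with depth, velocity, and RCS channels.
    """
    fmap = torch.zeros(3, height, width)
    ones = torch.ones(points.shape[0], 1)
    uvw = (calib @ torch.cat([points[:, :3], ones], dim=1).T).T  # homogeneous projection
    u = (uvw[:, 0] / uvw[:, 2]).long().clamp(0, width - 1)
    v = (uvw[:, 1] / uvw[:, 2]).long().clamp(0, height - 1)
    fmap[0, v, u] = uvw[:, 2]        # depth channel
    fmap[1, v, u] = points[:, 3]     # radial-velocity channel
    fmap[2, v, u] = points[:, 4]     # radar cross-section channel
    return fmap


class RadarCameraFusionHead(nn.Module):
    """Extract radar features with a small CNN, concatenate them with image
    backbone features, and regress 3-D box properties per feature-map cell."""

    def __init__(self, img_channels=64, radar_channels=3, feat=64):
        super().__init__()
        self.radar_net = nn.Sequential(
            nn.Conv2d(radar_channels, feat, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat, feat, 3, padding=1), nn.ReLU(inplace=True),
        )
        fused = img_channels + feat
        self.loc = nn.Conv2d(fused, 3, 1)   # 3-D location offsets
        self.dim = nn.Conv2d(fused, 3, 1)   # width, length, height
        self.rot = nn.Conv2d(fused, 2, 1)   # sin/cos of yaw

    def forward(self, img_feat, radar_map):
        # radar_map is assumed to be resized to the image-feature resolution.
        radar_feat = self.radar_net(radar_map)
        fused = torch.cat([img_feat, radar_feat], dim=1)
        return self.loc(fused), self.dim(fused), self.rot(fused)
```

In the paper, data set statistics reportedly inform the radar transformation and a keypoint-based detector consumes the fused features; in this sketch, plain concatenation and 1x1 regression heads stand in for those details.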

