Article

Ground-Aware Monocular 3D Object Detection for Autonomous Driving

Journal

IEEE Robotics and Automation Letters
Volume 6, Issue 2, Pages 919-926

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/LRA.2021.3052442

Keywords

Three-dimensional displays; Cameras; Object detection; Two dimensional displays; Feature extraction; Convolution; Neural networks; Automation technologies for smart cities; deep learning for visual perception; object detection; segmentation and categorization

Abstract

This research presents a neural network module for 3D object detection that effectively exploits ground-plane cues and application-specific priors. The proposed networks achieve state-of-the-art performance on the KITTI benchmarks for 3D object detection and depth prediction.
Estimating the 3D position and orientation of objects in the environment with a single RGB camera is a critical and challenging task for low-cost urban autonomous driving and mobile robots. Most existing algorithms are based on geometric constraints from 2D-3D correspondences, an approach that stems from generic 6D object pose estimation. We first identify how the ground plane provides additional cues for depth reasoning in 3D detection in driving scenes. Based on this observation, we improve the processing of 3D anchors and introduce a novel neural network module that fully exploits such application-specific priors within a deep learning framework. Finally, we introduce an efficient neural network for 3D object detection that embeds the proposed module. We further verify the power of the proposed module with a neural network designed for monocular depth prediction. The two proposed networks achieve state-of-the-art performance on the KITTI 3D object detection and depth prediction benchmarks, respectively.
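The ground-plane cue referred to in the abstract follows from basic pinhole-camera geometry: for a camera mounted at a known height above a flat road, the image row at which an object touches the ground already constrains its depth. The sketch below illustrates only that geometric relation; it is not the authors' network module, and the function name and the KITTI-like intrinsic values are illustrative assumptions.

```python
import numpy as np

def ground_plane_depth(v, fy, cy, cam_height):
    """
    Depth prior for pixels assumed to lie on a flat ground plane.

    For a camera at height `cam_height` above the road, looking roughly
    horizontally, a ground point projected to image row v satisfies
        (v - cy) / fy = cam_height / z,
    so z = fy * cam_height / (v - cy). Rows at or above the horizon
    (v <= cy) carry no ground-depth information and are masked with NaN.
    """
    v = np.asarray(v, dtype=np.float64)
    below_horizon = v > cy
    z = np.full_like(v, np.nan)
    z[below_horizon] = fy * cam_height / (v[below_horizon] - cy)
    return z

if __name__ == "__main__":
    # Illustrative KITTI-like intrinsics and camera mounting height.
    fy, cy, cam_height = 721.5, 172.8, 1.65

    # Depth prior at the bottom edge of a few hypothetical 2D anchors:
    # a box whose bottom edge touches the road is pinned in depth by
    # its image row alone, before any appearance feature is used.
    anchor_bottom_rows = np.array([370.0, 300.0, 220.0, 180.0])
    print(ground_plane_depth(anchor_bottom_rows, fy, cy, cam_height))
    # -> roughly [6.0, 9.4, 25.2, 165.3] metres
```

As the example suggests, the prior is sharp for nearby objects but degrades quickly toward the horizon (z grows without bound as v approaches cy), which is presumably why such geometric cues are combined with learned image features rather than used on their own.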
