3.8 Proceedings Paper

VIPS: Real-Time Perception Fusion for Infrastructure-Assisted Autonomous Driving

Publisher

Association for Computing Machinery (ACM)
DOI: 10.1145/3495243.3560539

Keywords

Perception Fusion; Vehicle Mobility; Infrastructure-Assisted Autonomous Driving; Vehicle-Infrastructure Information Fusion

Funding

  1. Hong Kong Innovation and Technology Commission [PiH/124/22]
  2. Centre for Perceptual and Interactive Intelligence (CPII) [EW01610, RP4-3]


This paper introduces VIPS, an infrastructure-assisted autonomous driving system that achieves real-time perception fusion to improve the safety of driving vehicles. The system matches graph structures and tracked motion trajectories efficiently to maintain spatial and temporal consistency, and experimental results demonstrate its advantages in perception range and accuracy.
Infrastructure-assisted autonomous driving is an emerging paradigm that is expected to significantly improve the driving safety of autonomous vehicles. The key enabling technology for this vision is fusing LiDAR results from the roadside infrastructure and the vehicle to improve the vehicle's perception in real time. In this work, we propose VIPS, a novel lightweight system that can achieve decimeter-level and real-time (up to 100 ms) perception fusion between driving vehicles and roadside infrastructure. The key idea of VIPS is to exploit highly efficient matching of graph structures that encode objects' lean representations as well as their relationships, such as locations, semantics, sizes, and spatial distribution. Moreover, by leveraging the tracked motion trajectories, VIPS can maintain the spatial and temporal consistency of the scene, which effectively mitigates the impact of asynchronous data frames and unpredictable communication/compute delays. We implement VIPS end-to-end on a campus smart lamppost testbed. To evaluate the performance of VIPS under diverse situations, we also collect two new multi-view point cloud datasets using the smart lamppost testbed and an autonomous driving simulator, respectively. Experimental results show that VIPS can extend the vehicle's perception range by 140% within 58 ms on average, and deliver a 4x improvement in perception fusion accuracy and a 47x saving in data transmission over existing approaches. A video demo of VIPS based on the lamppost dataset is available at https://youtu.be/zW4oi_EWOu0.
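
To make the abstract's key idea concrete, the sketch below illustrates one way lightweight object graphs from the vehicle and a roadside unit (RSU) could be matched and aligned, plus a simple delay-compensation step. It is a minimal illustration under stated assumptions, not the authors' VIPS implementation: the object fields, the greedy distance-profile matching heuristic, the tolerance values, and the constant-velocity extrapolation are all illustrative stand-ins.

```python
# Illustrative sketch of graph-based perception fusion between a vehicle and
# a roadside unit (RSU). NOT the VIPS implementation; fields, thresholds, and
# the matching heuristic are assumptions made for clarity.
import numpy as np
from dataclasses import dataclass


@dataclass
class Obj:
    xy: np.ndarray   # 2-D centroid in the sensor's local frame
    cls: str         # semantic class, e.g. "car", "pedestrian"
    size: float      # footprint length in metres


def descriptor(objs, i):
    """Node descriptor: sorted distances from object i to all other objects.
    Pairwise distances are invariant to the unknown rigid transform between
    the two viewpoints, so descriptors can be compared directly."""
    d = [np.linalg.norm(objs[i].xy - o.xy) for j, o in enumerate(objs) if j != i]
    return np.sort(np.array(d))


def match(vehicle_objs, rsu_objs, tol=0.5):
    """Greedy node matching: same class, similar size, similar distance profile."""
    pairs = []
    for i, v in enumerate(vehicle_objs):
        dv = descriptor(vehicle_objs, i)
        best, best_cost = None, np.inf
        for j, r in enumerate(rsu_objs):
            if r.cls != v.cls or abs(r.size - v.size) > tol:
                continue
            dr = descriptor(rsu_objs, j)
            k = min(len(dv), len(dr))
            cost = np.mean(np.abs(dv[:k] - dr[:k])) if k else np.inf
            if cost < best_cost:
                best, best_cost = j, cost
        if best is not None and best_cost < tol:
            pairs.append((i, best))
    return pairs


def rigid_transform(pairs, vehicle_objs, rsu_objs):
    """Least-squares 2-D rotation + translation (Kabsch) from matched centroids,
    used to bring RSU detections into the vehicle frame."""
    A = np.array([vehicle_objs[i].xy for i, _ in pairs])
    B = np.array([rsu_objs[j].xy for _, j in pairs])
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (B - cb).T @ (A - ca)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = ca - R @ cb
    return R, t


def extrapolate(obj_xy, velocity, delay_s):
    """Constant-velocity roll-forward to compensate for the delay between the
    RSU capture time and the vehicle's current frame; an illustrative stand-in
    for trajectory-based temporal alignment."""
    return obj_xy + velocity * delay_s
```

In this sketch, matching operates only on lean per-object attributes and pairwise geometry rather than raw point clouds, which is also why a graph-style fusion scheme can keep the transmitted data volume small relative to exchanging full LiDAR frames.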
