Proceedings Paper

Joint Object Detection and Multi-Object Tracking with Graph Neural Networks

Publisher

IEEE
DOI: 10.1109/ICRA48506.2021.9561110


Abstract

Object detection and data association are critical components in multi-object tracking (MOT) systems. Although the two components depend on each other, prior works often design the detection and data association modules separately and train them with separate objectives. As a result, gradients cannot be back-propagated through the entire MOT system, which leads to sub-optimal performance. To address this issue, recent works simultaneously optimize the detection and data association modules under a joint MOT framework, which has shown improved performance in both modules. In this work, we propose a new joint MOT approach based on Graph Neural Networks (GNNs). The key idea is that GNNs can model relations between variable-sized objects in both the spatial and temporal domains, which is essential for learning discriminative features for detection and data association. Through extensive experiments on the MOT15/16/17/20 datasets, we demonstrate the effectiveness of our GNN-based joint MOT approach and show state-of-the-art performance for both detection and MOT tasks.
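The abstract describes the approach only at a high level. As a rough illustration of the core idea (message passing over a graph whose nodes are a variable number of tracklets and detections across two frames), here is a minimal PyTorch sketch. The class name JointMOTGNNLayer, the MLP shapes, and the mean aggregation are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class JointMOTGNNLayer(nn.Module):
    """Hypothetical message-passing layer over a bipartite graph whose
    nodes are tracklet features (frame t-1) and detection features
    (frame t). Nodes are updated from their cross-frame neighbors,
    mirroring the abstract's idea of modeling spatial and temporal
    relations between a variable number of objects."""

    def __init__(self, dim: int):
        super().__init__()
        self.edge_mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU())
        self.node_mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU())

    def forward(self, tracks: torch.Tensor, dets: torch.Tensor):
        # tracks: (M, dim) tracklet nodes; dets: (N, dim) detection nodes
        M, N = tracks.size(0), dets.size(0)
        # Edge feature for every tracklet-detection pair: (M, N, 2*dim)
        pair = torch.cat([tracks.unsqueeze(1).expand(M, N, -1),
                          dets.unsqueeze(0).expand(M, N, -1)], dim=-1)
        edges = self.edge_mlp(pair)                        # (M, N, dim)
        # Aggregate messages from the other frame (mean over neighbors)
        tracks = self.node_mlp(torch.cat([tracks, edges.mean(dim=1)], dim=-1))
        dets = self.node_mlp(torch.cat([dets, edges.mean(dim=0)], dim=-1))
        # Pairwise affinity scores used for data association
        affinity = torch.einsum('md,nd->mn', tracks, dets)
        return tracks, dets, affinity

layer = JointMOTGNNLayer(dim=64)
tracks = torch.randn(5, 64)   # 5 tracklets carried over from frame t-1
dets = torch.randn(7, 64)     # 7 detections in frame t
tracks, dets, affinity = layer(tracks, dets)
print(affinity.shape)         # torch.Size([5, 7])
```

In a joint framework of the kind the abstract describes, the refined detection features would also feed the detection head, so detection and data association share gradients end to end.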

