Proceedings Paper

I-MuPPET: Interactive Multi-Pigeon Pose Estimation and Tracking

Venue

Pattern Recognition: DAGM GCPR 2022
Volume 13485, Pages 513-528

Publisher

SPRINGER INTERNATIONAL PUBLISHING AG
DOI: 10.1007/978-3-031-16788-1_31

Keywords

Pose estimation; Multi-object tracking; Animals; Applications

Funding

  1. Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy [EXC 2117 -422037984]
  2. Office of Naval Research [N00014-64019-1-2556]
  3. European Union [860949]

Abstract

While most tracking data encompasses humans, annotated tracking data for animals is scarce, especially for multiple objects. To overcome this obstacle, we present I-MuPPET, a system to estimate and track 2D keypoints of multiple pigeons at interactive speed. We train a Keypoint R-CNN on single pigeons in a fully supervised manner and use that network to infer keypoints and bounding boxes of multiple pigeons. A state-of-the-art tracker then tracks the individual pigeons across video sequences. We evaluate I-MuPPET quantitatively on single-pigeon motion capture data and achieve accuracy comparable to state-of-the-art 2D animal pose estimation methods in terms of Root Mean Square Error (RMSE). Additionally, we test I-MuPPET on video sequences with up to four pigeons and obtain stable and accurate pose estimates and tracks at up to 17 fps. To establish a baseline for future research, we perform a detailed quantitative tracking evaluation, which yields encouraging results.
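The pipeline described above, per-frame detection with Keypoint R-CNN followed by association of detections to existing tracks, can be illustrated with a minimal greedy IoU matcher. This is a simplified sketch for illustration only: the abstract does not name the specific tracker used, and all function names here are hypothetical, not taken from the I-MuPPET code.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def associate(tracks, detections, thresh=0.3):
    """Greedily match previous-frame track boxes to current detections.

    tracks:     {track_id: box} from the previous frame
    detections: list of boxes predicted by the detector in this frame
    Returns {track_id: detection_index}; in a full tracker, unmatched
    detections would start new tracks and unmatched tracks would age out.
    """
    # Score every (track, detection) pair, best matches first.
    pairs = sorted(
        ((iou(box, det), tid, j)
         for tid, box in tracks.items()
         for j, det in enumerate(detections)),
        reverse=True,
    )
    matches, used_tracks, used_dets = {}, set(), set()
    for score, tid, j in pairs:
        if score < thresh:
            break  # remaining pairs overlap too little to match
        if tid in used_tracks or j in used_dets:
            continue  # each track and detection is matched at most once
        matches[tid] = j
        used_tracks.add(tid)
        used_dets.add(j)
    return matches

# Two pigeons swap order in the detection list between frames;
# IoU association still assigns each track its own detection.
tracks = {0: (0, 0, 10, 10), 1: (20, 20, 30, 30)}
detections = [(21, 21, 31, 31), (1, 1, 11, 11)]
print(associate(tracks, detections))  # → {1: 0, 0: 1}
```

Production trackers add motion prediction and appearance cues on top of such association, which is what makes tracking stable for visually similar animals like pigeons.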

