4.6 Article

Embodied Active Domain Adaptation for Semantic Segmentation via Informative Path Planning

Journal

IEEE Robotics and Automation Letters
Volume 7, Issue 4, Pages 8691-8698

Publisher

IEEE - Institute of Electrical and Electronics Engineers Inc.
DOI: 10.1109/LRA.2022.3188901

Keywords

Perception-action coupling; integrated planning and learning; object detection; segmentation and categorization

Funding

  1. Microsoft Swiss Joint Research Center
  2. HILTI Group

This study presents an embodied agent with an adaptive semantic segmentation network that can autonomously adapt to new indoor environments. By collecting images of the new environment and utilizing self-supervised domain adaptation, the agent can quickly and safely gather relevant data. Experiments demonstrate that our method achieves faster adaptation and higher performance compared to an exploration objective.
This work presents an embodied agent that can adapt its semantic segmentation network to new indoor environments in a fully autonomous way. Because semantic segmentation networks fail to generalize well to unseen environments, the agent collects images of the new environment which are then used for self-supervised domain adaptation. We formulate this as an informative path planning problem, and present a novel information gain that leverages uncertainty extracted from the semantic model to safely collect relevant data. As domain adaptation progresses, these uncertainties change over time and the rapid learning feedback of our system drives the agent to collect different data. Experiments show that our method adapts to new environments faster and with higher final performance compared to an exploration objective, and can successfully be deployed to real-world environments on physical robots.
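The abstract only sketches how the information gain is obtained from the semantic model's uncertainty. The snippet below is a minimal, hypothetical illustration of that idea (not the authors' implementation), assuming a PyTorch segmentation network that outputs per-pixel class logits; the names uncertainty_map and view_information_gain are placeholders.

```python
import torch
import torch.nn.functional as F

def uncertainty_map(logits: torch.Tensor) -> torch.Tensor:
    # logits: (C, H, W) raw class scores from the segmentation head.
    # Per-pixel predictive entropy; higher values mean less confident predictions.
    log_probs = F.log_softmax(logits, dim=0)   # (C, H, W)
    probs = log_probs.exp()
    return -(probs * log_probs).sum(dim=0)     # (H, W)

def view_information_gain(logits: torch.Tensor) -> float:
    # Scalar score for one candidate viewpoint: total uncertainty in the view.
    # A path planner can then favour paths whose views maximise the summed gain.
    return uncertainty_map(logits).sum().item()
```

Under this kind of scoring, regions where the adapted network has become confident contribute little gain, so the score shifts as self-supervised adaptation progresses and steers the agent toward still-informative data, consistent with the feedback loop described in the abstract.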

