Proceedings Paper

Robotic Interestingness via Human-Informed Few-Shot Object Detection

Publisher

IEEE
DOI: 10.1109/IROS47612.2022.9981461

Keywords

-

Funding

  1. ONR [N0014-19-1-2266]
  2. ARL DCIST CRA award [W911NF17-2-0181]

Abstract

This research proposes a human-interactive framework for detecting human-informed interesting objects through few-shot online learning. An unsupervised learning algorithm is applied on the unmanned vehicle to recognize interesting scenes, and annotations provided by a human operator are used to learn and detect similar objects.
Interestingness recognition is crucial for decision making in autonomous exploration with mobile robots. Previous work proposed an unsupervised online learning approach that adapts to new environments and detects interesting scenes quickly, but it cannot adapt to human-informed interesting objects. To solve this problem, we introduce a human-interactive framework, AirInteraction, that detects human-informed objects via few-shot online learning. To reduce communication bandwidth, we first apply an online unsupervised learning algorithm on the unmanned vehicle for interestingness recognition, and then send only the potentially interesting scenes to a base station for human inspection. The human operator draws bounding box annotations for particular interesting objects, which are sent back to the robot to detect similar objects via few-shot learning. Using only a few human-labeled examples, the robot can learn novel interesting object categories during the mission and detect interesting scenes that contain those objects. We evaluate our method on various interesting scene recognition datasets. To the best of our knowledge, it is the first human-informed few-shot object detection framework for autonomous exploration.
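The interaction loop described above can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: the class name, thresholds, and the use of cosine similarity over precomputed feature embeddings are all assumptions standing in for the onboard interestingness model and the few-shot detector.

```python
import math

def cosine(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class FewShotInterestDetector:
    """Toy sketch of the human-in-the-loop pipeline: the robot forwards
    only high-interestingness frames to the base station; human-annotated
    objects become support embeddings; later frames are flagged when they
    match any support example. Names and thresholds are hypothetical."""

    def __init__(self, interest_threshold=0.5, match_threshold=0.8):
        self.interest_threshold = interest_threshold
        self.match_threshold = match_threshold
        self.support = []  # embeddings of human-annotated objects

    def should_send_to_base(self, interest_score):
        # Bandwidth reduction: transmit only potentially interesting scenes.
        return interest_score >= self.interest_threshold

    def add_annotation(self, embedding):
        # Human draws a bounding box; its embedding is a support example.
        self.support.append(embedding)

    def detect(self, embedding):
        # Few-shot detection: match against the nearest support example.
        return any(cosine(embedding, s) >= self.match_threshold
                   for s in self.support)

detector = FewShotInterestDetector()
if detector.should_send_to_base(0.9):          # scene looks interesting
    detector.add_annotation([1.0, 0.0, 0.0])   # operator labels an object
print(detector.detect([0.9, 0.1, 0.0]))        # similar object -> True
print(detector.detect([0.0, 1.0, 0.0]))        # unrelated object -> False
```

The key property the sketch preserves is that the expensive, human-facing step runs only on frames that pass the cheap onboard filter, which is how the framework keeps the robot-to-base-station bandwidth low.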

