Proceedings Paper

Dense Relation Distillation with Context-aware Aggregation for Few-Shot Object Detection

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/CVPR46437.2021.01005

Keywords

-

Funding

  1. National Key R&D Program of China [2017YFB1002804]


In this work, a Dense Relation Distillation with Context-aware Aggregation network (DCNet) is proposed to address the few-shot object detection problem by fully exploiting support features and capturing fine-grained features. The model achieves state-of-the-art results on the PASCAL VOC and MS COCO datasets, demonstrating the effectiveness of the proposed approach.
Conventional deep learning based methods for object detection require a large number of bounding box annotations for training, and such high-quality annotated data is expensive to obtain. Few-shot object detection, which learns to adapt to novel classes with only a few annotated examples, is very challenging, since the fine-grained features of a novel object can easily be overlooked when only a few examples are available. In this work, aiming to fully exploit the features of annotated novel objects and capture fine-grained features of the query object, we propose Dense Relation Distillation with Context-aware Aggregation (DCNet) to tackle the few-shot detection problem. Built on a meta-learning based framework, the Dense Relation Distillation module aims to fully exploit support features: support features and the query feature are densely matched, covering all spatial locations in a feed-forward fashion. This abundant use of guidance information endows the model with the capability to handle common challenges such as appearance changes and occlusions. Moreover, to better capture scale-aware features, the Context-aware Aggregation module adaptively harnesses features from different scales for a more comprehensive feature representation. Extensive experiments show that our proposed approach achieves state-of-the-art results on the PASCAL VOC and MS COCO datasets.
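The dense matching described above can be sketched as a cross-attention readout: every spatial location of the query feature map attends over all spatial locations of all support feature maps, and aggregates support information by softmax-weighted affinity. The sketch below is a minimal NumPy illustration under stated assumptions — the function name `dense_relation_distill` and the use of raw features as keys/values are illustrative; the actual DCNet applies learned key/value embeddings and integrates the result into a detection backbone.

```python
import numpy as np

def dense_relation_distill(query, supports):
    """Illustrative dense query-support matching (not the official DCNet code).

    query:    (C, Hq, Wq) feature map of the query image
    supports: list of (C, Hs, Ws) support feature maps (sizes may differ)
    Returns an aggregated feature map of shape (C, Hq, Wq).
    """
    C, Hq, Wq = query.shape
    q = query.reshape(C, -1)                                 # (C, Nq)
    # Stack every spatial location of every support map into one memory.
    kv = np.concatenate([s.reshape(C, -1) for s in supports], axis=1)  # (C, Ns)
    # Affinity between each query location and each support location.
    affinity = q.T @ kv / np.sqrt(C)                         # (Nq, Ns)
    # Softmax over support locations (numerically stable form).
    weights = np.exp(affinity - affinity.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    # Read out support features weighted by affinity.
    readout = kv @ weights.T                                 # (C, Nq)
    return readout.reshape(C, Hq, Wq)
```

Because the affinity is computed over all spatial locations rather than a single pooled support vector, a partially occluded or deformed novel object can still match the visible support regions, which is the intuition behind the module's robustness to appearance changes and occlusions.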

