Article

Neuron Linear Transformation: Modeling the Domain Shift for Crowd Counting

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNNLS.2021.3051371

Keywords

Task analysis; Neurons; Feature extraction; Data models; Training; Supervised learning; Fuses; Crowd counting; domain adaptation (DA); few-shot learning; neuron linear transformation (NLT)

Funding

  1. National Natural Science Foundation of China [U1864204, 61773316, 61632018, 61825603]

Abstract

This paper proposes a neuron linear transformation (NLT) method to address the domain shift in cross-domain crowd counting, achieving top performance compared with other domain adaptation methods on six real-world datasets. NLT exploits a domain factor and bias weights to learn the domain shift, and ablation studies demonstrate its robustness and effectiveness.
Cross-domain crowd counting (CDCC) is a hot topic due to its importance in public safety. The purpose of CDCC is to alleviate the domain shift between the source and target domains. Recent typical methods attempt to extract domain-invariant features via image translation and adversarial learning. For this specific task, we find that the domain shift is reflected in differences between model parameters. To describe the domain gap directly at the parameter level, we propose a neuron linear transformation (NLT) method that exploits a domain factor and bias weights to learn the domain shift. Specifically, for each neuron of a source model, NLT uses a few labeled target data to learn the domain shift parameters; the target neuron is then generated via a linear transformation. Extensive experiments and analysis on six real-world datasets validate that NLT achieves top performance compared with other domain adaptation methods. An ablation study also shows that NLT is robust and more effective than supervised training and fine-tuning. Code is available at https://github.com/taohan10200/NLT.
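The core idea described in the abstract — generating each target-domain neuron as a linear transformation (a learned scaling factor plus a bias) of the corresponding frozen source-domain neuron — can be illustrated with a minimal PyTorch sketch. Note that this is only an assumption-laden illustration: the class name NLTConv2d, the per-output-channel granularity of the factor and bias, and the choice to leave the source bias untouched are not taken from the authors' implementation (see the linked repository for the official code).

```python
import torch
import torch.nn as nn

class NLTConv2d(nn.Module):
    """Illustrative sketch of a neuron linear transformation layer.

    Wraps a frozen source-domain convolution; the target-domain weights are
    produced as  W_t = gamma * W_s + beta, where gamma and beta are the only
    parameters trained on the few labeled target images. Names and
    granularity are assumptions, not the paper's official code.
    """

    def __init__(self, source_conv: nn.Conv2d):
        super().__init__()
        # Source parameters are kept fixed; only the shift parameters learn.
        self.register_buffer("w_src", source_conv.weight.data.clone())
        self.register_buffer(
            "b_src",
            source_conv.bias.data.clone() if source_conv.bias is not None else None,
        )
        out_ch = source_conv.out_channels
        # One domain factor and one bias per output neuron (channel) -- an assumption.
        self.gamma = nn.Parameter(torch.ones(out_ch, 1, 1, 1))
        self.beta = nn.Parameter(torch.zeros(out_ch, 1, 1, 1))
        self.stride = source_conv.stride
        self.padding = source_conv.padding
        self.dilation = source_conv.dilation
        self.groups = source_conv.groups

    def forward(self, x):
        # Target weights via a per-neuron linear transformation of source weights.
        w_tgt = self.gamma * self.w_src + self.beta
        b_tgt = self.b_src  # the bias could be transformed the same way
        return nn.functional.conv2d(
            x, w_tgt, b_tgt, self.stride, self.padding, self.dilation, self.groups
        )


if __name__ == "__main__":
    src = nn.Conv2d(3, 16, kernel_size=3, padding=1)  # stand-in for a pretrained source layer
    nlt = NLTConv2d(src)                               # adapted target layer
    y = nlt(torch.randn(1, 3, 64, 64))
    print(y.shape)  # torch.Size([1, 16, 64, 64])
```

In such a setup, training on the target domain optimizes only gamma and beta for each wrapped neuron, which keeps the number of adapted parameters small and is consistent with the abstract's point that only a few labeled target samples are needed.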
