Article

Domain Generalization by Joint-Product Distribution Alignment

Journal

PATTERN RECOGNITION
Volume 134, Issue -, Pages -

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2022.109086

Keywords

Distribution alignment; Distribution divergence; Domain generalization; Feature transformation

Abstract

In this work, we address the problem of domain generalization for classification, where the goal is to learn a classification model on a set of source domains and generalize it to an unseen target domain. The source and target domains differ, which weakens the generalization ability of the learned model. To tackle the domain difference, we propose to align a joint distribution and a product distribution using a neural transformation, and we minimize the Relative Chi-Square (RCS) divergence between the two distributions to learn that transformation. In this manner, we conveniently achieve the alignment of multiple domains in the neural transformation space. Specifically, we show that the RCS divergence can be explicitly estimated as the maximal value of a quadratic function, which allows us to perform joint-product distribution alignment by minimizing the divergence estimate. We demonstrate the effectiveness of our solution through comparisons with state-of-the-art methods on several image classification datasets. (c) 2022 Elsevier Ltd. All rights reserved.
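The key computational idea in the abstract is that the divergence between the joint distribution P(z, d) (features paired with their true domain) and the product distribution P(z)P(d) (features paired with a random domain) can be estimated as the maximum of a quadratic objective, which has a closed form for a critic that is linear in a fixed feature map. The toy sketch below illustrates that idea only; it is not the paper's implementation. The interaction feature map `pair_features`, the linear critic, the mixture weight `alpha`, and the ridge term `lam` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def pair_features(z, d_onehot):
    """Feature map phi(z, d): interaction terms between the representation z
    and the one-hot domain label d. These interactions are exactly what
    differs between the joint P(z, d) and the product P(z)P(d)."""
    return np.einsum("ni,nj->nij", z, d_onehot).reshape(len(z), -1)

def rcs_estimate(phi_joint, phi_prod, alpha=0.5, lam=1e-3):
    """Closed-form maximum of the quadratic variational objective
    max_theta 2*E_P[h] - E_M[h^2], with h(x) = theta @ phi(x) and M the
    alpha-mixture of joint (P) and product (Q) samples. Solving the
    quadratic gives theta* = S^{-1} mu and the maximal value mu @ theta*."""
    mu = phi_joint.mean(axis=0)                               # E_P[phi]
    second = (alpha * phi_joint.T @ phi_joint / len(phi_joint)
              + (1 - alpha) * phi_prod.T @ phi_prod / len(phi_prod))
    second += lam * np.eye(second.shape[0])                   # ridge for stability
    theta = np.linalg.solve(second, mu)
    return float(mu @ theta)                                  # maximal value, >= 0

# Two source domains: features that depend on the domain (misaligned)
# versus features independent of the domain (aligned).
n = 2000
d = rng.integers(0, 2, size=n)
d_onehot = np.eye(2)[d]
z_dep = rng.normal(loc=2.0 * d[:, None], scale=1.0, size=(n, 3))  # depends on d
z_ind = rng.normal(size=(n, 3))                                   # independent of d

est = {}
for name, z in [("domain-dependent", z_dep), ("domain-independent", z_ind)]:
    phi_joint = pair_features(z, d_onehot)                         # true pairing
    phi_prod = pair_features(z, d_onehot[rng.permutation(n)])      # broken pairing
    est[name] = rcs_estimate(phi_joint, phi_prod)
print(est)
```

Under this sketch, domain-dependent features yield a clearly positive estimate, while domain-independent features yield an estimate near zero; a learned transformation would then be trained to drive the estimate toward zero, making the representation independent of the domain.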

