Article

Domain Generalization by Joint-Product Distribution Alignment

Journal

PATTERN RECOGNITION
Volume 134, Article 109086

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2022.109086

Keywords

Distribution alignment; Distribution divergence; Domain generalization; Feature transformation

Abstract

In this work, we address the problem of domain generalization for classification, where the goal is to learn a classification model on a set of source domains and generalize it to a target domain. The source and target domains differ, which weakens the generalization ability of the learned model. To tackle this domain difference, we propose to align a joint distribution and a product distribution with a neural transformation, and to learn that transformation by minimizing the Relative Chi-Square (RCS) divergence between the two distributions. In this manner, we conveniently achieve the alignment of multiple domains in the neural transformation space. Specifically, we show that the RCS divergence can be explicitly estimated as the maximal value of a quadratic function, which allows us to perform joint-product distribution alignment by minimizing this divergence estimate. We demonstrate the effectiveness of our solution through comparisons with state-of-the-art methods on several image classification datasets. (c) 2022 Elsevier Ltd. All rights reserved.
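As an illustration of the idea described in the abstract, the following Python/PyTorch sketch shows how a chi-square-type divergence between samples from a "joint" and a "product" distribution can be computed in closed form as the maximal value of a quadratic objective, and then minimized as a differentiable alignment loss on the output of a neural transformation. Here z_joint and z_product stand for samples assumed to come from the joint and product distributions (for example, transformed features paired with their true versus shuffled domain indicators); the random-Fourier-feature basis, the mixture weight alpha, the ridge term lam, and the names chi_square_estimate and encoder are illustrative assumptions, not the paper's exact formulation.

    # Illustrative sketch only (not the paper's exact method): estimate a
    # chi-square-type divergence between samples of a joint distribution and
    # a product distribution as the closed-form maximum of a quadratic
    # objective, and use it as an alignment loss for a neural transformation.
    import math
    import torch

    def chi_square_estimate(z_joint, z_product, num_features=128, alpha=0.5, lam=1e-3):
        """Maximum of the quadratic variational objective
            2*E_joint[f] - E_mix[f^2] - 1,  with f(z) = theta^T phi(z),
        where E_mix = alpha*E_joint + (1-alpha)*E_product; its maximal value
        estimates a relative chi-square divergence."""
        d = z_joint.shape[1]
        # Fixed random Fourier features as the basis phi(z) (an assumed choice).
        w = torch.randn(d, num_features, device=z_joint.device)
        b = 2 * math.pi * torch.rand(num_features, device=z_joint.device)
        phi = lambda z: torch.cos(z @ w + b) * (2.0 / num_features) ** 0.5

        phi_j, phi_p = phi(z_joint), phi(z_product)
        h = phi_j.mean(dim=0)  # E_joint[phi]
        # Mixture second-moment matrix with a small ridge for numerical stability.
        H = (alpha * phi_j.T @ phi_j / len(phi_j)
             + (1 - alpha) * phi_p.T @ phi_p / len(phi_p)
             + lam * torch.eye(num_features, device=z_joint.device))
        theta = torch.linalg.solve(H, h)  # argmax of the quadratic objective
        return 2.0 * h @ theta - theta @ H @ theta - 1.0  # maximal value = divergence estimate

    # Hypothetical usage: `encoder` is the neural transformation; minimizing the
    # estimate with respect to the encoder pushes the joint and product
    # distributions of its outputs toward each other.
    # loss = chi_square_estimate(encoder(x_joint), encoder(x_product))
    # loss.backward()

Because the inner maximization has a closed-form solution, the divergence estimate itself is a differentiable function of the transformed samples, so the alignment can be trained end to end with ordinary gradient descent; this mirrors the abstract's point that the RCS divergence is estimated as the maximal value of a quadratic function.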
