Article

Tackling unsupervised multi-source domain adaptation with optimism and consistency

Journal

EXPERT SYSTEMS WITH APPLICATIONS
Volume 194

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2021.116486

Keywords

Multi-source domain adaptation; Transfer learning; Adversarial learning; Consistency regularization

Funding

  1. National Funds through the Portuguese funding agency, FCT - Fundação para a Ciência e a Tecnologia, Portugal [SFRH/BD/129600/2017]


Abstract
It has been known for a while that the problem of multi-source domain adaptation can be regarded as a single-source domain adaptation task where the source domain corresponds to a mixture of the original source domains. Nonetheless, how to adjust the mixture distribution weights remains an open question. Moreover, most existing work on this topic focuses only on minimizing the error on the source domains and achieving domain-invariant representations, which is insufficient to ensure low error on the target domain. In this work, we present a novel framework that addresses both problems and beats the current state of the art by using a mildly optimistic objective function and consistency regularization on the target samples.
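The abstract does not spell out the method's details, so the paper's actual objective is not reproduced here. As a rough, minimal sketch of the generic consistency-regularization idea it invokes (penalize the model when its prediction on a perturbed target sample drifts from its prediction on the original sample), with the toy linear classifier, perturbation scale, and all function names being our own assumptions:

```python
import numpy as np

def softmax(z):
    # Numerically stable row-wise softmax.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def consistency_loss(logits_clean, logits_perturbed):
    """Mean squared difference between the two class-probability vectors.

    A small value means the classifier is locally stable around the
    (unlabeled) target samples, which is the property consistency
    regularization encourages.
    """
    p = softmax(logits_clean)
    q = softmax(logits_perturbed)
    return float(np.mean((p - q) ** 2))

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))                   # unlabeled target batch
W = rng.normal(size=(4, 3))                   # toy linear classifier
x_aug = x + 0.05 * rng.normal(size=x.shape)   # small input perturbation

loss = consistency_loss(x @ W, x_aug @ W)
```

In training, a term like `loss` would be added to the supervised source-domain objective and minimized jointly; the actual perturbation (augmentation, dropout, etc.) and weighting are design choices not specified in the abstract.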


