Article

Domain Invariant and Agnostic Adaptation

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 227, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2021.107192

Keywords

Domain adaptation; KL divergence; Distribution matching; Riemannian manifold

Funding

  1. National Natural Science Foundation of China [62002212]
  2. Shantou University, China [NTF20007]


Domain adaptation focuses on matching distributions and learning a general feature representation; the proposed DIAA solution matches and aligns distributions through a feature transformation and compares the resulting disparities under the KL divergence.
Domain adaptation addresses the prediction problem in which the source and target data are sampled from different but related probability distributions. The key problem lies in properly matching the distributions and learning a general feature representation for training the prediction model. In this article, we introduce a Domain Invariant and Agnostic Adaptation (DIAA) solution, which matches the source and target joint distributions, and simultaneously aligns the joint distribution of features and domain labels to the product of its marginals. In particular, DIAA matches and aligns the distributions via a feature transformation, and compares the two kinds of distribution disparities uniformly under the Kullback-Leibler (KL) divergence. To approximate the two corresponding KL divergences from observed samples, we derive a linear-regression-like technique that fits linear models to different ratio functions under the quadratic loss. With the estimated KL divergences, learning the DIAA feature transformation is formulated as solving a Grassmannian minimization problem. Experiments on text and image classification tasks of varied nature demonstrate the success of our approach. (C) 2021 Elsevier B.V. All rights reserved.
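The abstract's ratio-fitting step can be illustrated with a least-squares density-ratio estimator in the spirit of uLSIF: fit a linear-in-parameters model of the ratio p/q under the quadratic loss, then plug the fitted ratio into a sample-based KL estimate. The sketch below is a minimal NumPy illustration of this general technique, not the authors' DIAA implementation; the Gaussian basis functions, bandwidth `sigma`, and regularizer `lam` are illustrative assumptions.

```python
import numpy as np

def ls_ratio(xs_p, xs_q, sigma=1.0, lam=1e-3, n_centers=50):
    """Fit a linear model r(x) = theta^T phi(x) for the ratio p(x)/q(x)
    by least squares (quadratic loss), with Gaussian basis functions."""
    centers = xs_q[:min(n_centers, len(xs_q))]

    def phi(x):
        # Pairwise squared distances to the basis centers -> (n, b) design matrix.
        d = x[:, None, :] - centers[None, :, :]
        return np.exp(-np.sum(d ** 2, axis=2) / (2.0 * sigma ** 2))

    Phi_p = phi(xs_p)                              # evaluated under p
    Phi_q = phi(xs_q)                              # evaluated under q
    H = Phi_q.T @ Phi_q / len(xs_q)                # quadratic term E_q[phi phi^T]
    h = Phi_p.mean(axis=0)                         # linear term E_p[phi]
    theta = np.linalg.solve(H + lam * np.eye(len(centers)), h)
    return lambda x: np.maximum(phi(x) @ theta, 1e-12)  # clip for the log

def kl_estimate(xs_p, xs_q, **kw):
    """Plug-in KL(p || q) estimate: mean of log r_hat over samples from p."""
    r_hat = ls_ratio(xs_p, xs_q, **kw)
    return float(np.mean(np.log(r_hat(xs_p))))

rng = np.random.default_rng(0)
p = rng.normal(0.0, 1.0, size=(500, 1))
q = rng.normal(0.5, 1.0, size=(500, 1))
# Ground truth for these unit-variance Gaussians is 0.5^2 / 2 = 0.125;
# the plug-in estimate is noisy but should land in that neighborhood.
print(kl_estimate(p, q))
```

In DIAA the analogous estimates are computed for two divergences (source-target joint matching and feature/domain-label independence) and fed into a Grassmannian optimization over the feature transformation, which this sketch does not attempt.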
