Article

Unsupervised domain adaptation for regression using dictionary learning

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 267

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2023.110439

Keywords

Domain adaptation; Transfer learning; Regression; Deep learning; Dictionary learning; Sparse coding


This paper studies unsupervised domain adaptation for regression tasks and proposes a new approach based on dictionary learning. Experimental results show that the proposed method outperforms most state-of-the-art methods on several benchmark datasets, especially when transferring knowledge from synthetic to real domains.
Unsupervised domain adaptation aims to generalize the knowledge learned on a labeled source domain to an unlabeled target domain. Most existing unsupervised approaches are feature-based methods that seek to find domain-invariant features. Despite their wide application, these approaches have proved to have limitations, especially in regression tasks. In this paper, we study the problem of unsupervised domain adaptation for regression tasks. We highlight the obstacles faced in regression, compared to classification, in terms of sensitivity to the scattering of data in the feature space. We address this issue and propose a new unsupervised domain adaptation approach based on dictionary learning. We seek to learn a dictionary on the source data and follow an optimal direction trajectory to minimize the residual of the reconstruction of the target data with the same dictionary. For stable training of a neural network, we provide a robust implementation of a projected gradient descent dictionary learning framework, which yields a backpropagation-friendly, end-to-end method. Experimental results show that the proposed method significantly outperforms most state-of-the-art methods on several well-known benchmark datasets, especially when transferring knowledge from synthetic to real domains.
(c) 2023 Elsevier B.V. All rights reserved.
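
To make the idea sketched in the abstract concrete, the following is a minimal, hypothetical PyTorch sketch, not the authors' implementation: a dictionary is fitted on source features by projected gradient descent (atoms projected onto the unit ball), sparse codes are obtained with ISTA-style iterations, and the reconstruction residual of the target features under the source dictionary serves as the unsupervised adaptation loss. All function and parameter names (fit_dictionary, sparse_codes, n_atoms, lam) are illustrative assumptions.

# Hypothetical sketch of the dictionary-based adaptation loss described above.
# Feature matrices are laid out with one sample per column (shape d x n).
import torch

def project_atoms(D):
    # Project each dictionary atom onto the unit l2 ball, the usual constraint
    # that keeps the sparse-coding problem well scaled.
    norms = D.norm(dim=0, keepdim=True).clamp(min=1.0)
    return D / norms

def sparse_codes(X, D, n_steps=50, lam=0.1):
    # ISTA-style proximal gradient iterations for
    #   min_A 0.5 * ||X - D A||_F^2 + lam * ||A||_1
    L = torch.linalg.matrix_norm(D, ord=2) ** 2 + 1e-8   # Lipschitz constant of the gradient
    A = torch.zeros(D.shape[1], X.shape[1], device=X.device, dtype=X.dtype)
    for _ in range(n_steps):
        Z = A - D.t() @ (D @ A - X) / L                              # gradient step
        A = torch.sign(Z) * torch.clamp(Z.abs() - lam / L, min=0.0)  # soft-thresholding
    return A

def fit_dictionary(X_src, n_atoms=128, n_iters=200, lr=1e-2, lam=0.1):
    # Projected gradient descent on the dictionary, learned on *source*
    # features only: alternate sparse coding with a gradient step on D,
    # then project the atoms back onto the unit ball.
    D = project_atoms(torch.randn(X_src.shape[0], n_atoms,
                                  device=X_src.device, dtype=X_src.dtype))
    for _ in range(n_iters):
        A = sparse_codes(X_src, D, lam=lam)
        grad_D = (D @ A - X_src) @ A.t() / X_src.shape[1]
        D = project_atoms(D - lr * grad_D)
    return D

def target_reconstruction_loss(X_tgt, D, lam=0.1):
    # Residual of reconstructing target features with the source dictionary.
    # Minimizing it through a shared feature extractor pulls the target
    # representation toward the span of the source dictionary.
    A = sparse_codes(X_tgt, D, lam=lam)
    return ((X_tgt - D @ A) ** 2).mean()

Because the sparse-coding loop is written in plain torch operations, the residual can be backpropagated through a feature extractor, which is one way to read the "backpropagation-friendly, end-to-end" claim in the abstract.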

