Article

DT-LET: Deep transfer learning by exploring where to transfer

Journal

NEUROCOMPUTING
Volume 390, Pages 99-107

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2020.01.042

Keywords

Transfer learning; Where to transfer; Deep learning

Funding

  1. National Natural Science Foundation of China [U1864204, 61773316]
  2. Project of Special Zone for National Defense Science and Technology Innovation
  3. NSERC

Previous deep-network-based transfer learning methods assume that knowledge should be transferred between the same hidden layers of the source and target domains. This assumption does not always hold, especially when the data from the two domains are heterogeneous, e.g., of different resolutions. In such cases, the most suitable numbers of layers for the source domain data and the target domain data would differ, and high-level knowledge from the source domain would be transferred to the wrong layer of the target domain. Based on this observation, the question of where to transfer, raised in this paper, may constitute a novel research area. We propose a new mathematical model, DT-LET, to solve this heterogeneous transfer learning problem. To select the best matching of layers for transferring knowledge, we define a specific loss function that estimates the correspondence between high-level features of data in the source and target domains. To verify the proposed cross-layer model, experiments are conducted on two cross-domain recognition/classification tasks; the superior results demonstrate the necessity of searching for layer correspondence. (C) 2020 Elsevier B.V. All rights reserved.
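The core idea of "where to transfer" — scoring candidate source/target layer pairs by how well their high-level features correspond, then transferring through the best pair — can be illustrated with a minimal sketch. The abstract does not specify DT-LET's actual loss, so this example substitutes a simple linear-kernel Maximum Mean Discrepancy as the correspondence measure; the function names and shapes are hypothetical, not the authors' implementation.

```python
import numpy as np

def mmd_linear(Xs, Xt):
    """Linear-kernel Maximum Mean Discrepancy between two feature matrices
    (rows are samples, columns are feature dimensions): the squared distance
    between the two empirical feature means."""
    return float(np.sum((Xs.mean(axis=0) - Xt.mean(axis=0)) ** 2))

def best_layer_pair(source_feats, target_feats):
    """Search all (source layer, target layer) combinations and return the
    pair whose feature distributions are closest, i.e. with the lowest MMD.
    Each argument is a list of per-layer activation matrices."""
    best_pair, best_score = None, np.inf
    for i, Fs in enumerate(source_feats):
        for j, Ft in enumerate(target_feats):
            if Fs.shape[1] != Ft.shape[1]:
                continue  # only compare layers with equal feature width
            score = mmd_linear(Fs, Ft)
            if score < best_score:
                best_pair, best_score = (i, j), score
    return best_pair, best_score

# Toy demo: source layer 1 matches the target layer far better than layer 0.
rng = np.random.default_rng(0)
source = [rng.normal(0.0, 1.0, (50, 8)), rng.normal(5.0, 1.0, (50, 8))]
target = [rng.normal(5.0, 1.0, (50, 8))]
pair, score = best_layer_pair(source, target)
```

Here the search selects source layer 1 to pair with target layer 0, since their activation distributions coincide. DT-LET's layer-correspondence search plays this role over the hidden layers of two deep networks, with its own loss in place of the MMD stand-in.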
