Article

Domain Invariant Transfer Kernel Learning

Journal

IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING
Volume 27, Issue 6, Pages 1519-1532

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TKDE.2014.2373376

Keywords

Transfer learning; kernel learning; Nyström method; text mining; image classification; video recognition

Funding

  1. National HeGaoJi Key Project [2010ZX01042-002-002-01]
  2. National Science Fund for Distinguished Young Scholars [F020204]
  3. Tsinghua National Laboratory Fund for Big Data Science and Technology
  4. US National Science Foundation (NSF) [OISE-1129076, CNS-1115234, DBI-0960443]
  5. US Department of Army [W911NF-12-1-0066]
  6. Office of the Director, Office of International Science & Engineering [1129076] (Funding Source: National Science Foundation)

Abstract

Domain transfer learning generalizes a learning model across training data and testing data with different distributions. A general principle to tackle this problem is reducing the distribution difference between training data and testing data such that the generalization error can be bounded. Current methods typically model the sample distributions in input feature space, which depends on nonlinear feature mapping to embody the distribution discrepancy. However, this nonlinear feature space may not be optimal for kernel-based learning machines. To this end, we propose a transfer kernel learning (TKL) approach to learn a domain-invariant kernel by directly matching source and target distributions in the reproducing kernel Hilbert space (RKHS). Specifically, we design a family of spectral kernels by extrapolating the target eigensystem to the source samples with Mercer's theorem. The spectral kernel minimizing the approximation error to the ground truth kernel is selected to construct domain-invariant kernel machines. Comprehensive experimental evidence on a large number of text categorization, image classification, and video event recognition datasets verifies the effectiveness and efficiency of the proposed TKL approach over several state-of-the-art methods.
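The spectral-kernel construction summarized in the abstract can be illustrated with a short Nyström-style sketch: eigendecompose the target kernel, extend its eigenvectors to the source samples via the Nyström (Mercer) extension, and rebuild a kernel over both domains from the shared eigensystem. The NumPy sketch below is only a minimal illustration under assumed choices (an RBF base kernel; the names rbf_kernel, extrapolated_kernel, gamma, and n_eig are illustrative, not from the paper); it omits TKL's learned eigenvalue relaxation and is not the authors' implementation.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Standard RBF (Gaussian) kernel between the rows of A and the rows of B.
    sq = np.sum(A ** 2, axis=1)[:, None] + np.sum(B ** 2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def extrapolated_kernel(Xs, Xt, gamma=1.0, n_eig=20):
    """Nystrom-style sketch: extend the target kernel's eigensystem to the
    source samples and rebuild a kernel over source + target from it."""
    K_tt = rbf_kernel(Xt, Xt, gamma)                 # target-target kernel
    K_st = rbf_kernel(Xs, Xt, gamma)                 # source-target cross kernel

    # Truncated Mercer expansion: top n_eig eigenpairs of the target kernel.
    eigvals, eigvecs = np.linalg.eigh(K_tt)
    top = np.argsort(eigvals)[::-1][:n_eig]
    lam, phi_t = eigvals[top], eigvecs[:, top]       # lam: (n_eig,), phi_t: (n_t, n_eig)

    # Nystrom extension: evaluate the target eigenvectors at the source points.
    phi_s = K_st @ phi_t / lam[None, :]              # (n_s, n_eig)

    # Reassemble a kernel on all samples from the shared target eigensystem.
    # TKL would additionally reweight lam with learned relaxation coefficients
    # so that the result approximates the ground-truth kernel; the plain target
    # eigenvalues are kept here as a placeholder.
    phi = np.vstack([phi_s, phi_t])                  # (n_s + n_t, n_eig)
    return phi @ np.diag(lam) @ phi.T

# Toy usage with random source/target samples drawn from different distributions.
rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, size=(50, 10))             # source samples
Xt = rng.normal(0.5, 1.2, size=(60, 10))             # target samples
K = extrapolated_kernel(Xs, Xt, gamma=0.1, n_eig=15)
print(K.shape)                                        # (110, 110)

The resulting kernel matrix is defined by the target domain's eigensystem over both source and target samples, which is the sense in which the learned kernel is domain invariant; the paper's contribution lies in choosing the eigenvalue reweighting that the placeholder above leaves out.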
