Article

A Kernel Classification Framework for Metric Learning

Journal

IEEE Transactions on Neural Networks and Learning Systems

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TNNLS.2014.2361142

Keywords

Kernel method; metric learning; nearest neighbor (NN); polynomial kernel; support vector machine (SVM)

Funding

  1. National Natural Science Foundation of China [61271093, 61001037]
  2. Hong Kong Scholar Program
  3. Research Grants Council, Hong Kong [PolyU 5313/12E]
  4. Ministry of Education for New Century Excellent Talents [NCET-12-0150]

Abstract

Learning a distance metric from given training samples plays a crucial role in many machine learning tasks, and various models and optimization algorithms have been proposed in the past decade. In this paper, we generalize several state-of-the-art metric learning methods, such as large margin nearest neighbor (LMNN) and information-theoretic metric learning (ITML), into a kernel classification framework. First, doublets and triplets are constructed from the training samples, and a family of degree-2 polynomial kernel functions is proposed for pairs of doublets or triplets. Then, a kernel classification framework is established that generalizes many popular metric learning methods, such as LMNN and ITML. The proposed framework can also suggest new metric learning methods, which, interestingly, can be efficiently implemented using standard support vector machine (SVM) solvers. Two novel metric learning methods, namely doublet-SVM and triplet-SVM, are then developed under the proposed framework. Experimental results show that doublet-SVM and triplet-SVM achieve classification accuracies competitive with state-of-the-art metric learning methods while requiring significantly less training time.
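
As a rough illustration of the pipeline the abstract describes, the following Python sketch (using NumPy and scikit-learn) forms doublets from training samples, evaluates a degree-2 polynomial kernel between doublets, trains a standard SVM solver on the precomputed kernel, and reads a Mahalanobis-style matrix off the dual solution. The pairing heuristic, the doublet label convention, the toy data, and the final PSD projection are illustrative assumptions, not the paper's exact doublet-SVM formulation.

    # Minimal sketch of a doublet-SVM-style metric learner; the pairing
    # strategy, label convention (+1 for same-class doublets, -1 otherwise),
    # toy data, and PSD projection are assumptions for illustration.
    import numpy as np
    from sklearn.svm import SVC

    def build_doublets(X, y, n_pairs=3, seed=0):
        """Form doublets (x_i, x_j); return difference vectors d = x_i - x_j
        and doublet labels."""
        rng = np.random.default_rng(seed)
        diffs, labels = [], []
        for i in range(len(X)):
            same = np.flatnonzero(y == y[i]); same = same[same != i]
            diff = np.flatnonzero(y != y[i])
            if len(same):
                for j in rng.choice(same, size=min(n_pairs, len(same)), replace=False):
                    diffs.append(X[i] - X[j]); labels.append(+1)  # similar doublet
            if len(diff):
                for j in rng.choice(diff, size=min(n_pairs, len(diff)), replace=False):
                    diffs.append(X[i] - X[j]); labels.append(-1)  # dissimilar doublet
        return np.asarray(diffs), np.asarray(labels)

    def doublet_kernel(D1, D2):
        """Degree-2 polynomial kernel between doublets:
        K(z, z') = ((x_i - x_j)^T (x_k - x_l))^2."""
        return (D1 @ D2.T) ** 2

    # Toy data: 3 classes, 20 samples each, 5 features.
    X = np.random.randn(60, 5)
    y = np.repeat([0, 1, 2], 20)

    # Train a standard SVM solver on the precomputed doublet kernel.
    D, h = build_doublets(X, y)
    svm = SVC(kernel="precomputed", C=1.0).fit(doublet_kernel(D, D), h)

    # Because the kernel is quadratic in the difference vectors, the decision
    # function takes the form f(d) = d^T M d + b, with
    # M = sum_i (alpha_i * h_i) d_i d_i^T over the support doublets
    # (the signs are already folded into dual_coef_).
    Dsv = D[svm.support_]
    M = (Dsv * svm.dual_coef_.ravel()[:, None]).T @ Dsv

    # Optionally project M onto the PSD cone so it defines a valid metric.
    w, V = np.linalg.eigh(M)
    M_psd = (V * np.clip(w, 0, None)) @ V.T

With such an M_psd in hand, distances of the form (x_a - x_b)^T M_psd (x_a - x_b) could be plugged into a nearest neighbor classifier, which is the usual way metric learning methods of this kind are evaluated.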

Reviews

Primary Rating: 4.7 (not enough ratings)
Secondary Ratings (Novelty, Significance, Scientific rigor): not yet rated
