Journal
IEEE TRANSACTIONS ON IMAGE PROCESSING
Volume 26, Issue 10, Pages 4937-4950
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIP.2017.2725578
Keywords
Metric learning; support vector machine; kernel method; Lagrange duality; alternating minimization
Funding
- National Key R&D Program of China [2017YFC0113000, 2016YFB1001004]
- NSFC [61671182]
- Guangdong Natural Science Foundation [2015A030313129]
- Hong Kong Research Grants Council General Research Fund [PolyU 152212/14E]
Abstract
Distance metric learning aims to learn, from the given training data, a valid distance metric under which the similarity between data samples can be evaluated more effectively for classification. Metric learning is often formulated as a convex or nonconvex optimization problem, yet most existing methods rely on customized optimizers and become inefficient on large-scale problems. In this paper, we formulate metric learning as a kernel classification problem with a positive semi-definite constraint and solve it by iterated training of support vector machines (SVMs). The new formulation is easy to implement and can be trained efficiently with off-the-shelf SVM solvers. Two novel metric learning models, namely positive semi-definite constrained metric learning (PCML) and nonnegative-coefficient constrained metric learning (NCML), are developed. Both PCML and NCML guarantee the global optimality of their solutions. Experiments on general classification, face verification, and person re-identification show that, compared with state-of-the-art approaches, our methods achieve comparable classification accuracy while being efficient in training.
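The alternating scheme described in the abstract — optimize a Mahalanobis matrix against pairwise constraints, then enforce the positive semi-definite constraint — can be illustrated with a minimal sketch. This is not the paper's PCML/NCML algorithm (no SVM solver is invoked here); the pairwise hinge loss, learning rate, and toy data below are illustrative assumptions, but the eigenvalue-clipping step is the standard projection onto the PSD cone that such constrained methods rely on:

```python
import numpy as np

def psd_project(M):
    # Standard projection onto the PSD cone: symmetrize, then
    # clip negative eigenvalues to zero.
    M = (M + M.T) / 2.0
    w, V = np.linalg.eigh(M)
    return (V * np.clip(w, 0.0, None)) @ V.T

def learn_metric(X, pairs, y, n_iter=100, lr=0.01):
    # Hypothetical sketch: subgradient descent on a pairwise hinge
    # loss over Mahalanobis distances, alternated with PSD
    # projection. y[k] = +1 for a similar pair, -1 for dissimilar.
    d = X.shape[1]
    M = np.eye(d)
    for _ in range(n_iter):
        g = np.zeros((d, d))
        for (i, j), yk in zip(pairs, y):
            z = X[i] - X[j]
            dist = z @ M @ z  # squared Mahalanobis distance
            if yk * (1.0 - dist) < 1.0:  # hinge margin violated
                g += yk * np.outer(z, z)
        M = psd_project(M - lr * g)
    return M

# Toy demo: feature 0 separates the pairs, feature 1 is noise.
X = np.array([[0., 0.], [0., 1.], [5., 0.], [5., 1.]])
pairs = [(0, 1), (2, 3), (0, 2), (1, 3)]
y = np.array([1, 1, -1, -1])
M = learn_metric(X, pairs, y)
```

On this toy data the learned M shrinks the noisy dimension while keeping the discriminative one, and the projection step keeps M a valid (PSD) metric throughout.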