Article

Multi-Scale Metric Learning for Few-Shot Learning

Journal

Publisher

Institute of Electrical and Electronics Engineers (IEEE)
DOI: 10.1109/TCSVT.2020.2995754

Keywords

Measurement; Feature extraction; Task analysis; Training; Semantics; Neural networks; Learning systems; Few-shot learning; multi-scale feature maps; metric learning

Funding

  1. National Natural Science Foundation of China [61901376]
  2. Fundamental Research Funds for the Central Universities [G2019KY05301]
  3. Peak Experience Plan in Northwestern Polytechnical University


This paper proposes a novel few-shot learning method, multi-scale metric learning (MSML), which tackles few-shot classification by extracting multi-scale features and learning multi-scale relations between samples. The method introduces a feature pyramid structure and a multi-scale relation generation network, and optimizes the deep network with an intra-class and inter-class relation loss, achieving superior performance on miniImageNet and tieredImageNet.
Few-shot learning in image classification aims to learn a model that can identify unseen classes from only a few training samples per class. The scarcity of training samples and the novelty of the classification tasks render many traditional classification models inapplicable. In this paper, a novel few-shot learning method named multi-scale metric learning (MSML) is proposed to extract multi-scale features and learn the multi-scale relations between samples for few-shot classification. In the proposed method, a feature pyramid structure is introduced for multi-scale feature embedding, which combines high-level, strongly semantic features with low-level but visually rich features. A multi-scale relation generation network (MRGN) is then developed for hierarchical metric learning, in which high-level features correspond to deeper metric learning and low-level features to lighter metric learning. Moreover, a novel loss function named intra-class and inter-class relation loss (IIRL) is proposed to optimize the deep network: it strengthens the correlation between homogeneous groups of samples and weakens the correlation between heterogeneous groups. Experimental results on miniImageNet and tieredImageNet demonstrate that the proposed method achieves superior performance on few-shot learning problems.
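The pipeline the abstract describes (multi-scale feature extraction, per-scale relation scoring, and an intra-/inter-class objective) can be sketched in simplified form. The sketch below is not the authors' implementation: it substitutes fixed average pooling for the learned feature pyramid, cosine similarity for the learned MRGN metric, and a softmax cross-entropy over relation scores for the exact IIRL formulation. All function names and the synthetic data are illustrative.

```python
import numpy as np

def multi_scale_features(feat_map, scales=(1, 2, 4)):
    """Average-pool a C x H x W feature map into an s x s grid per scale,
    then flatten -- a crude stand-in for the paper's feature pyramid,
    which combines high-level semantic and low-level visual features."""
    C, H, W = feat_map.shape
    feats = []
    for s in scales:
        h, w = (H // s) * s, (W // s) * s
        grid = feat_map[:, :h, :w].reshape(C, s, h // s, s, w // s).mean(axis=(2, 4))
        feats.append(grid.reshape(-1))  # one descriptor per scale
    return feats

def relation(query_feats, support_feats):
    """Mean cosine similarity across scales. This is a fixed metric; the
    paper's MRGN instead learns a deeper metric for high-level scales and
    a lighter one for low-level scales."""
    sims = []
    for q, s in zip(query_feats, support_feats):
        sims.append(q @ s / (np.linalg.norm(q) * np.linalg.norm(s) + 1e-8))
    return float(np.mean(sims))

def iirl_loss(query_feats, class_protos, true_class, temp=0.1):
    """Softmax cross-entropy over relation scores: raises the intra-class
    relation and suppresses inter-class relations, in the spirit of the
    paper's IIRL objective (not its exact formulation)."""
    classes = sorted(class_protos)
    logits = np.array([relation(query_feats, class_protos[c])
                       for c in classes]) / temp
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return -np.log(probs[classes.index(true_class)])

# demo: three synthetic classes, query drawn near class 1
rng = np.random.default_rng(0)
proto_maps = {c: rng.normal(size=(8, 8, 8)) for c in range(3)}
query_map = proto_maps[1] + 0.1 * rng.normal(size=(8, 8, 8))
protos = {c: multi_scale_features(m) for c, m in proto_maps.items()}
query = multi_scale_features(query_map)
loss = iirl_loss(query, protos, true_class=1)
```

Because the query is a lightly perturbed copy of the class-1 prototype, its relation score to class 1 dominates and the loss for the true class comes out much lower than for either wrong class; training on such a loss pushes homogeneous pairs together and heterogeneous pairs apart, which is the stated role of IIRL.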

Authors


Reviews

Primary Rating

4.7