Article

Quadruplet Network With One-Shot Learning for Fast Visual Object Tracking

Journal

IEEE TRANSACTIONS ON IMAGE PROCESSING
Volume 28, Issue 7, Pages 3516-3527

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TIP.2019.2898567

Keywords

Quadruplet deep network; visual object tracking; Siamese deep network

Funding

  1. Beijing Natural Science Foundation [4182056]
  2. Key Research and Development Program of Zhejiang Province [2018C03055]
  3. National Natural Science Foundation of China [61732015]
  4. Australian Research Council [DP150104645]
  5. Fok Ying-Tong Education Foundation for Young Teachers
  6. Specialized Fund for Joint Building Program of the Beijing Municipal Education Commission

Abstract

In the same vein as discriminative one-shot learning, Siamese networks can recognize an object from a single exemplar of the same class. However, because they train only on pairs of instances, they do not exploit the underlying structure of the data or the relationships among the many available samples. In this paper, we propose a new quadruplet deep network that examines the potential connections among training instances in order to learn a more powerful representation. We design a shared network with four branches that receives a multi-tuple of instances as input and is trained with a novel loss function combining a pair loss and a triplet loss. According to the similarity metric, we select from each multi-tuple the most similar and the most dissimilar instances as the positive and negative inputs of the triplet loss, and we show that this selection scheme improves training performance. Furthermore, we introduce a new weight layer that automatically selects suitable combination weights, which avoids the conflict between the triplet and pair losses that can otherwise degrade performance. We evaluate our quadruplet framework with model-free tracking-by-detection of objects from a single initial exemplar on several visual object tracking benchmarks. Extensive experimental analysis demonstrates that our tracker achieves superior performance at a real-time processing speed of 78 frames/s. Our source code is available.
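The combined loss described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the squared-Euclidean distance, the margin value, and the fixed combination weights `w_pair`/`w_triplet` are assumptions (the paper instead learns the combination through a dedicated weight layer), and `select_triplet` only mimics the idea of picking the most similar and most dissimilar instances from a multi-tuple.

```python
import numpy as np

def pair_loss(anchor, exemplar):
    # Pair term: squared Euclidean distance between a matching pair
    # of embeddings (illustrative choice of pair loss).
    return float(np.sum((anchor - exemplar) ** 2))

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Standard triplet margin loss: hinge on d(a, p) - d(a, n) + margin.
    d_ap = np.sum((anchor - positive) ** 2)
    d_an = np.sum((anchor - negative) ** 2)
    return float(max(d_ap - d_an + margin, 0.0))

def select_triplet(anchor, candidates):
    # From a multi-tuple of candidate embeddings, pick the most similar
    # (smallest distance) as the positive and the most dissimilar
    # (largest distance) as the negative input of the triplet loss.
    dists = [np.sum((anchor - c) ** 2) for c in candidates]
    return candidates[int(np.argmin(dists))], candidates[int(np.argmax(dists))]

def quadruplet_loss(anchor, exemplar, candidates, w_pair=0.5, w_triplet=0.5):
    # Weighted combination of the pair and triplet terms; the fixed
    # weights here stand in for the paper's learned weight layer.
    pos, neg = select_triplet(anchor, candidates)
    return w_pair * pair_loss(anchor, exemplar) \
        + w_triplet * triplet_loss(anchor, pos, neg)
```

In training, the four branches of the shared network would each embed one element of the multi-tuple, and these embeddings would then feed the loss above.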

