Article

Local Semantic Siamese Networks for Fast Tracking

Journal

IEEE TRANSACTIONS ON IMAGE PROCESSING
Volume 29, Pages 3351-3364

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TIP.2019.2959256

Keywords

Visual object tracking; Siamese deep network; local feature representation

Funding

  1. Beijing Natural Science Foundation [4182056]
  2. Joint Building Program of Beijing Municipal Education Commission


Learning a powerful feature representation is critical for constructing a robust Siamese tracker. However, most existing Siamese trackers learn global appearance features of the entire object, which often leads to drift under partial occlusion or non-rigid appearance deformation. In this paper, we propose a new Local Semantic Siamese (LSSiam) network that extracts more robust features to alleviate these drift problems, since local semantic features carry more fine-grained, part-level information. We learn the semantic features during offline training by adding a classification branch to the classical Siamese framework. To further enhance the feature representation, we design a generalized focal logistic loss to mine hard negative samples. During online tracking, we remove the classification branch and propose an efficient template updating strategy to avoid excessive computational load. As a result, the proposed tracker runs at a high speed of 100 frames per second (FPS), far beyond the real-time requirement. Extensive experiments on popular benchmarks demonstrate that the proposed LSSiam tracker achieves state-of-the-art performance at high speed. Our source code is available at https://github.com/shenjianbing/LSSiam.
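
The abstract mentions a focal-style logistic loss for mining hard negative samples, but does not reproduce its formulation. The sketch below is a minimal, hypothetical PyTorch illustration of how a focal-weighted logistic loss over a Siamese response map could be written; it is not the paper's exact loss. The function name focal_logistic_loss, the modulation exponent gamma, and the 17x17 response-map size are illustrative assumptions.

import torch

def focal_logistic_loss(scores, labels, gamma=2.0):
    """Hypothetical focal-weighted logistic loss for a Siamese response map.

    scores: raw similarity scores v (any shape); labels: y in {+1, -1}.
    The plain logistic loss log(1 + exp(-y*v)) is down-weighted for easy
    examples by the factor (1 - p)**gamma, where p = sigmoid(y*v) is the
    probability assigned to the correct label, so hard negatives dominate
    the gradient.
    """
    margin = labels * scores                      # y * v
    base = torch.nn.functional.softplus(-margin)  # log(1 + exp(-y*v))
    p_correct = torch.sigmoid(margin)             # confidence in the correct label
    weight = (1.0 - p_correct) ** gamma           # focal modulation (assumed form)
    return (weight * base).mean()

# Toy usage: a 17x17 response map with a single positive location at the center.
if __name__ == "__main__":
    response = torch.randn(1, 17, 17, requires_grad=True)
    labels = -torch.ones_like(response)
    labels[0, 8, 8] = 1.0                         # center is the positive sample
    loss = focal_logistic_loss(response, labels)
    loss.backward()
    print(float(loss))

The focal modulation (1 - p)**gamma suppresses the contribution of well-classified background locations, so the loss and its gradient are dominated by hard negatives; this is the general idea behind the hard-negative mining described in the abstract.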
