Article

Exploring spatial and channel contribution for object based image retrieval

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 186

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2019.104955

Keywords

Object retrieval; Spatial and channel contribution; Aggregate; Global representation vector

Funding

  1. NSFC, China [61732008, 61772407, 1531141]
  2. National Key R&D Program of China [2017YFF0107700]
  3. World-Class Universities (Disciplines), China
  4. Characteristic Development Guidance Funds for the Central Universities, China [PY3A022]

With the rapid development of deep learning, researchers have gradually shifted their focus from hand-crafted features to deep features in the field of content-based image retrieval (CBIR). A great deal of attention has been paid to aggregating the features extracted from the convolutional layers of a deep convolutional neural network (CNN) into a global representation vector for CBIR. In this paper, we propose a simple but effective method, called Strong-Response-Stack-Contribution (SRSC), to generate a global representation vector for object retrieval. For object retrieval, when using a CNN to extract features, what we want is to extract features in the region of interest (ROI). We therefore explore spatial and channel contributions to help the descriptor focus on the ROI and make the global image representation more representative. SRSC first generates the spatial contribution according to the intensity of the channel responses. It then generates the channel contribution by combining sparsity information with element-value information. Finally, the global representation vector is produced from the spatial and channel contributions and used to perform image retrieval. Experiments on the Oxford and Paris buildings datasets show the effectiveness of the proposed approach. © 2019 Published by Elsevier B.V.
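The pipeline described in the abstract — derive a spatial weight map from channel response intensity, derive channel weights from sparsity and element-value information, then aggregate into an L2-normalized global vector — can be sketched as follows. This is a minimal NumPy illustration of that style of weighted aggregation; the particular weighting formulas below (summed responses as spatial weights, a log-sparsity term times the mean activation as channel weights) are assumptions for illustration, not the paper's exact SRSC definitions.

```python
import numpy as np

def srsc_like_descriptor(feats, eps=1e-8):
    """Aggregate a conv feature map of shape (C, H, W) into a global
    representation vector using spatial and channel weights.
    The weighting formulas are illustrative stand-ins, not the
    authors' exact SRSC definitions."""
    C, H, W = feats.shape

    # Spatial contribution: locations where the channels respond
    # strongly overall receive larger weights.
    spatial = feats.sum(axis=0)                   # (H, W)
    spatial = spatial / (spatial.sum() + eps)     # normalize to a weight map

    # Channel contribution: combine sparsity information (channels that
    # fire in few locations are treated as more discriminative) with
    # element-value information (mean response magnitude).
    nonzero_frac = (feats > 0).mean(axis=(1, 2))  # (C,)
    sparsity_w = np.log(C * (1.0 - nonzero_frac) + 1.0 + eps)
    value_w = feats.mean(axis=(1, 2))             # (C,)
    channel = sparsity_w * value_w
    channel = channel / (np.abs(channel).sum() + eps)

    # Weighted sum-pooling, then L2-normalize the global vector.
    v = (feats * spatial[None]).sum(axis=(1, 2)) * channel
    return v / (np.linalg.norm(v) + eps)

# Usage: a random ReLU-like feature map standing in for a conv layer output.
rng = np.random.default_rng(0)
feats = np.maximum(rng.standard_normal((512, 7, 7)), 0.0)
vec = srsc_like_descriptor(feats)
print(vec.shape)
```

Retrieval then reduces to comparing such vectors by cosine similarity (a dot product, since they are L2-normalized) between the query image and the database images.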

