Journal
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
Volume 40, Issue 12, Pages 3034-3044
Publisher
IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2018.2789887
Keywords
Binary codes; unsupervised deep hashing; image retrieval
Funding
- National Natural Science Foundation of China [61502081, 61572108, 61632007]
- Fundamental Research Funds for the Central Universities [ZYGX2015kyqd017]
- Australian Research Council [FT130101530]
Abstract
Recent vision and learning studies show that learning compact hash codes can facilitate massive-data processing with significantly reduced storage and computation. In particular, learning deep hash functions has greatly improved retrieval performance, typically under semantic supervision. In contrast, current unsupervised deep hashing algorithms can hardly achieve satisfactory performance, owing either to relaxed optimization or to the absence of a similarity-sensitive objective. In this work, we propose a simple yet effective unsupervised hashing framework, named Similarity-Adaptive Deep Hashing (SADH), which alternates over three training modules: deep hash model training, similarity graph updating, and binary code optimization. The key difference from the widely used two-step hashing method is that the output representations of the learned deep model are used to update the similarity graph matrix, which in turn improves the subsequent code optimization. In addition, to produce high-quality binary codes, we devise an effective discrete optimization algorithm that directly handles the binary constraints with a general hashing loss. Extensive experiments validate the efficacy of SADH, which consistently outperforms the state-of-the-art methods by large margins.
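The alternation described in the abstract can be sketched as a small numpy loop. This is only an illustrative toy, not the authors' implementation: the "deep hash model" is replaced by a least-squares linear projection, the similarity graph by a cosine-similarity matrix, and the discrete code update by a simple sign-of-neighborhood step. All function names and choices below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_hash_model(X, B):
    # Stand-in for "deep hash model training": fit a linear map W so that
    # X @ W approximates the current binary codes B (a proxy for the CNN).
    W, *_ = np.linalg.lstsq(X, B, rcond=None)
    return X @ W  # continuous embeddings from the learned model

def update_similarity_graph(E):
    # "Similarity graph updating": rebuild the graph from the model's
    # output representations (cosine similarity here; the paper's
    # construction may differ).
    E = E / (np.linalg.norm(E, axis=1, keepdims=True) + 1e-12)
    return E @ E.T

def optimize_binary_codes(S, B):
    # "Binary code optimization": a crude discrete step that sets each
    # code bit to the sign of the graph-weighted average of all codes.
    # The paper's algorithm handles the binary constraints directly;
    # this is only a placeholder with the same {-1, +1} output domain.
    B_new = np.sign(S @ B)
    B_new[B_new == 0] = 1
    return B_new

def sadh_sketch(X, n_bits=16, n_iters=3):
    # Alternate over the three modules, as the abstract describes.
    n = X.shape[0]
    B = np.sign(rng.standard_normal((n, n_bits)))
    B[B == 0] = 1
    for _ in range(n_iters):
        E = train_hash_model(X, B)        # module 1
        S = update_similarity_graph(E)    # module 2
        B = optimize_binary_codes(S, B)   # module 3
    return B

X = rng.standard_normal((50, 32))  # toy features
B = sadh_sketch(X)
print(B.shape)  # (50, 16), entries in {-1, +1}
```

The point of the loop is the feedback the abstract highlights: the graph S is rebuilt from the learned representations each round, so the code-optimization step in the next round sees an updated similarity structure rather than a fixed one.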