Article

Unsupervised Deep Hashing with Similarity-Adaptive and Discrete Optimization

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2018.2789887

Keywords

Binary codes; unsupervised deep hashing; image retrieval

Funding

  1. National Natural Science Foundation of China [61502081, 61572108, 61632007]
  2. Fundamental Research Funds for the Central Universities [ZYGX2015kyqd017]
  3. Australian Research Council [FT130101530]

Abstract

Recent vision and learning studies show that learning compact hash codes can facilitate massive data processing with significantly reduced storage and computation. In particular, learning deep hash functions has greatly improved retrieval performance, typically under semantic supervision. In contrast, current unsupervised deep hashing algorithms can hardly achieve satisfactory performance due to either relaxed optimization or the absence of a similarity-sensitive objective. In this work, we propose a simple yet effective unsupervised hashing framework, named Similarity-Adaptive Deep Hashing (SADH), which alternately proceeds over three training modules: deep hash model training, similarity graph updating, and binary code optimization. The key difference from the widely used two-step hashing method is that the output representations of the learned deep model help update the similarity graph matrix, which is then used to improve the subsequent code optimization. In addition, for producing high-quality binary codes, we devise an effective discrete optimization algorithm which can directly handle the binary constraints with a general hashing loss. Extensive experiments validate the efficacy of SADH, which consistently outperforms state-of-the-art methods by large margins.
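The alternating scheme described in the abstract can be illustrated with a toy sketch. The code below is a hypothetical, simplified illustration only: it replaces the paper's deep network with a linear projection, the learned similarity graph with a cosine-similarity matrix, and the paper's discrete optimization with simple sign thresholding. All function and variable names here are assumptions, not the authors' implementation.

```python
import numpy as np

def sadh_sketch(X, n_bits=16, n_iters=3, seed=0):
    """Toy sketch of an SADH-style alternating loop (NOT the paper's method):
    1) fit a hash "model" (here: a linear projection refitted to the codes),
    2) update a similarity graph from the model's output representations,
    3) update binary codes guided by the refreshed graph."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.standard_normal((d, n_bits))          # stand-in for the deep hash model
    B = np.sign(X @ W)                            # initial codes in {-1, +1}
    B[B == 0] = 1
    for _ in range(n_iters):
        H = X @ W                                 # step 1: model output representations
        Hn = H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-12)
        S = Hn @ Hn.T                             # step 2: similarity graph update
        B = np.sign(S @ B)                        # step 3: graph-guided code update
        B[B == 0] = 1
        # refit the "model" to the new codes (least squares stands in for training)
        W, _, _, _ = np.linalg.lstsq(X, B, rcond=None)
    return B

X = np.random.default_rng(1).standard_normal((20, 8))
codes = sadh_sketch(X)
print(codes.shape)  # (20, 16)
```

The point of the sketch is only the control flow: each pass feeds the model's outputs back into the similarity graph, which in turn shapes the next round of code optimization, the loop structure that distinguishes SADH from fixed two-step hashing.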
