Article

Compact Hash Code Learning With Binary Deep Neural Network

Journal

IEEE TRANSACTIONS ON MULTIMEDIA
Volume 22, Issue 4, Pages 992-1004

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TMM.2019.2935680

Keywords

Binary constraint optimization; image search; learning to hash

Funding

  1. National Research Foundation Singapore under its AI Singapore Programme Award [AISG-100E-2018-005]
  2. ST Electronics
  3. National Research Foundation (NRF), Prime Minister's Office, Singapore under Corporate Laboratory at University Scheme (Programme Title: STEE Infosec - SUTD Corporate Laboratory)

Abstract

Learning compact binary codes for image retrieval using deep neural networks has recently attracted increasing attention. However, training deep hashing networks is challenging due to the binary constraints on the hash codes. In this paper, we propose deep network models and learning algorithms for learning binary hash codes from image representations in both unsupervised and supervised settings. The novelty of our network design is that we constrain one hidden layer to directly output the binary codes. This design overcomes a challenging problem in some previous works: optimizing objective functions that are non-smooth because of binarization. In addition, we propose to incorporate independence and balance properties, in direct and strict forms, into the learning schemes. We also include a similarity-preserving property in our objective functions. The resulting optimizations involving these binary, independence, and balance constraints are difficult to solve. To tackle this difficulty, we propose to learn the networks with alternating optimization and careful relaxation. Furthermore, by leveraging the powerful capacity of convolutional neural networks, we propose an end-to-end architecture that jointly learns to extract visual features and produce binary hash codes. Experimental results on the benchmark datasets show that the proposed methods compare favorably with or outperform the state of the art.
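The alternating-optimization idea in the abstract (fix the binary codes to update the projection, then fix the projection to re-binarize, while encouraging balanced and independent bits) can be illustrated with a toy numpy sketch. This is not the authors' model; the data, dimensions, and least-squares update here are illustrative assumptions standing in for the paper's network layers and relaxed objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for image descriptors: n samples of dimension d,
# mapped to k-bit binary codes (all sizes are hypothetical).
n, d, k = 200, 16, 8
X = rng.standard_normal((d, n))
X -= X.mean(axis=1, keepdims=True)   # zero-center the features

W = rng.standard_normal((k, d))      # linear "hash layer" weights

for _ in range(20):
    # Step 1: fix W; the layer directly outputs binary codes via sign.
    B = np.sign(W @ X)
    B[B == 0] = 1.0
    # Step 2: fix B; refit W by least squares so that W @ X ≈ B
    # (a smooth relaxation of the binary-constrained objective).
    W = B @ np.linalg.pinv(X)

# Diagnostics for the two properties mentioned in the abstract:
# balance  -> each bit should be roughly half +1 / half -1,
# independence -> different bits should be roughly uncorrelated.
balance = np.abs(B.mean(axis=1)).mean()
corr = (B @ B.T) / n
off_diag = corr - np.diag(np.diag(corr))
independence = np.abs(off_diag).mean()
```

In the paper these constraints are enforced in direct, strict form inside the learning objective; the sketch only measures them after the fact, which is the main simplification.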

