Journal
IEEE TRANSACTIONS ON IMAGE PROCESSING
Volume 24, Issue 3, Pages 967-979
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIP.2015.2389624
Keywords
Large scale image retrieval; scalar quantization; binary SIFT; visual matching; feature filtering
Funding
- 973 Program [2015CB351803]
- NSFC [61325009, 61390514, 61472378, 61472116]
- Fundamental Research Funds for the Central Universities [WK2100060014, WK2100060011]
- University of Science and Technology of China [KY2100000048]
- Intel Collaborative Research Institute on Mobile Networking and Computing Project
- Program for New Century Excellent Talents in University [NCET-13-0764]
- Texas State University Research Enhancement Program
- Army Research Office (ARO) [W911NF-12-1-0057]
- National Science Foundation [CRI 1305302]
- Faculty Research Awards through NEC Laboratories of America
- National Natural Science Foundation of China (NSFC) [61429201]
- National Science Foundation, Directorate for Computer & Information Science & Engineering, Division of Computer and Network Systems [1305302]
Abstract
The Bag-of-Words (BoW) model based on the Scale-Invariant Feature Transform (SIFT) has been widely used in large-scale image retrieval. Feature quantization, typically performed by vector quantization, plays a crucial role in the BoW model: it maps high-dimensional SIFT features to visual words so that they fit the inverted-file structure required for scalable retrieval. Traditional feature quantization approaches suffer from several issues, including the need for visual-codebook training, limited reliability, and inefficient updates. To avoid these problems, this paper proposes a novel feature quantization scheme that efficiently quantizes each SIFT descriptor into a descriptive and discriminative bit vector, called binary SIFT (BSIFT). The quantizer is independent of any image collection. In addition, by taking the first 32 bits of each BSIFT as a code word, the generated BSIFT naturally adapts to the classic inverted-file structure for image indexing. Moreover, quantization error is reduced by feature filtering, code-word expansion, and query-sensitive mask shielding. Requiring no explicit codebook for quantization, the approach can be readily applied to image search in resource-limited scenarios. We evaluate the proposed algorithm for large-scale image search on two public image data sets; experimental results demonstrate the index efficiency and retrieval accuracy of our approach.
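The core idea described in the abstract, converting a SIFT descriptor to a bit vector by scalar quantization and using its first 32 bits as an inverted-file code word, can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the thresholding rule here (comparing each dimension to the descriptor's own median) is an assumption, and the real scheme additionally applies feature filtering, code-word expansion, and mask shielding.

```python
import numpy as np

def binary_sift(descriptor):
    """Scalar-quantize a 128-d SIFT descriptor into a 128-bit vector.

    Hypothetical sketch: each dimension is thresholded against the
    descriptor's own median; the paper's actual thresholding rule
    may differ.
    """
    d = np.asarray(descriptor, dtype=np.float64)
    return (d > np.median(d)).astype(np.uint8)

def code_word(bits):
    """Pack the first 32 bits into an integer inverted-file key."""
    word = 0
    for b in bits[:32]:
        word = (word << 1) | int(b)
    return word

# Example: two descriptors map to different inverted-file entries.
b1 = binary_sift(list(range(128)))           # ascending values
b2 = binary_sift(list(range(127, -1, -1)))   # descending values
key1, key2 = code_word(b1), code_word(b2)
```

Because the bit vector is derived from the descriptor alone, no codebook has to be trained or shipped, which is what makes the scheme attractive for resource-limited deployments.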