Proceedings Paper

Low-rank Bilinear Pooling for Fine-Grained Classification

Publisher

IEEE
DOI: 10.1109/CVPR.2017.743

Keywords

-

Funding

  1. NSF [IIS-1618806, IIS-1253538, DBI-1262547]
  2. Direct For Biological Sciences [1262547] Funding Source: National Science Foundation
  3. Direct For Computer & Info Scie & Enginr [1253538] Funding Source: National Science Foundation
  4. Div Of Biological Infrastructure [1262547] Funding Source: National Science Foundation
  5. Div Of Information & Intelligent Systems [1253538] Funding Source: National Science Foundation

Abstract

Pooling second-order local feature statistics to form a high-dimensional bilinear feature has been shown to achieve state-of-the-art performance on a variety of fine-grained classification tasks. To address the computational demands of high feature dimensionality, we propose to represent the covariance features as a matrix and apply a low-rank bilinear classifier. The resulting classifier can be evaluated without explicitly computing the bilinear feature map, which yields a large reduction in compute time and decreases the effective number of parameters to be learned. To further compress the model, we propose a classifier co-decomposition that factorizes the collection of bilinear classifiers into a common factor and compact per-class terms. The co-decomposition idea can be deployed through two convolutional layers and trained in an end-to-end architecture. We suggest a simple yet effective initialization that avoids explicitly first training and factorizing the larger bilinear classifiers. Through extensive experiments, we show that our model achieves state-of-the-art performance on several public datasets for fine-grained classification trained with only category labels. Importantly, our final model is an order of magnitude smaller than the recently proposed compact bilinear model [8], and three orders smaller than the standard bilinear CNN model [19].
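The computational point in the abstract is that a low-rank bilinear classifier can be scored without ever forming the d x d covariance feature, since tr(W^T sum_l x_l x_l^T) with W = U V^T reduces to sum_l <U^T x_l, V^T x_l>, a per-location inner product between two low-dimensional projections. The sketch below illustrates that identity in PyTorch under stated assumptions; it is not the authors' released code, and the specific factorization (one projection shared across classes plus compact per-class projections, implemented as two 1x1 convolutions), the class name LowRankBilinearClassifier, and the hyper-parameters (feature dimension 512, rank 8, 200 classes) are illustrative assumptions rather than settings taken from the paper.

```python
# Minimal sketch (not the authors' implementation) of evaluating a low-rank
# bilinear classifier without materializing the d x d bilinear feature.
import torch
import torch.nn as nn


class LowRankBilinearClassifier(nn.Module):
    """Scores CNN feature maps with low-rank bilinear classifiers.

    The shared projection plays the role of the common factor in the
    classifier co-decomposition described in the abstract; the per-class
    projections are the compact per-class terms. All sizes are illustrative.
    """

    def __init__(self, feat_dim: int = 512, rank: int = 8, num_classes: int = 200):
        super().__init__()
        # Common factor shared by all classes: d -> r, as a 1x1 convolution.
        self.shared = nn.Conv2d(feat_dim, rank, kernel_size=1, bias=False)
        # Compact per-class factors: d -> r for each class, stacked in one conv.
        self.per_class = nn.Conv2d(feat_dim, rank * num_classes, kernel_size=1, bias=False)
        self.rank = rank
        self.num_classes = num_classes

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, d, H, W) local features from a CNN backbone.
        b, _, h, w = feats.shape
        u = self.shared(feats)                                    # (B, r, H, W)
        v = self.per_class(feats).view(b, self.num_classes, self.rank, h, w)
        # Per-location inner product <U^T x_l, V_c^T x_l>, summed over rank and
        # spatial dims, equals tr(W_c^T sum_l x_l x_l^T) with W_c = U V_c^T,
        # i.e. the bilinear score, without forming the d x d covariance.
        scores = (u.unsqueeze(1) * v).sum(dim=(2, 3, 4))          # (B, num_classes)
        return scores


if __name__ == "__main__":
    feats = torch.randn(2, 512, 28, 28)   # dummy convolutional feature map
    clf = LowRankBilinearClassifier()
    print(clf(feats).shape)               # torch.Size([2, 200])
```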
