Article

CABNet: Category Attention Block for Imbalanced Diabetic Retinopathy Grading

Journal

IEEE TRANSACTIONS ON MEDICAL IMAGING
Volume 40, Issue 1, Pages 143-153

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TMI.2020.3023463

Keywords

Lesions; Task analysis; Feature extraction; Diabetes; Machine learning; Image segmentation; Training; Diabetic retinopathy grading; attention mechanism; category attention block (CAB); global attention block (GAB)

Funding

  1. National Natural Science Foundation [61872200]
  2. Natural Science Foundation of Tianjin [19JCZDJC31600, 18YFYZCG00060]
  3. Open Project Fund of the State Key Laboratory of Computer Architecture, Institute of Computing Technology, Chinese Academy of Sciences [CARCH201905]

Abstract

This work shows that attention modules can effectively address two challenges in diabetic retinopathy (DR) grading: imbalanced data distributions and the difficulty of identifying small lesions. By introducing a Category Attention Block and a Global Attention Block, the study achieves significant improvements in DR grading performance.
Diabetic Retinopathy (DR) grading is challenging due to the presence of intra-class variations, small lesions and imbalanced data distributions. The key to solving fine-grained DR grading is to find more discriminative features corresponding to subtle visual differences, such as microaneurysms, hemorrhages and soft exudates. However, small lesions are quite difficult to identify using traditional convolutional neural networks (CNNs), and an imbalanced DR data distribution will cause the model to pay too much attention to DR grades with more samples, greatly affecting the final grading performance. In this article, we focus on developing an attention module to address these issues. Specifically, for imbalanced DR data distributions, we propose a novel Category Attention Block (CAB), which explores more discriminative region-wise features for each DR grade and treats each category equally. In order to capture more detailed small lesion information, we also propose the Global Attention Block (GAB), which can exploit detailed and class-agnostic global attention feature maps for fundus images. By aggregating these attention blocks with a backbone network, we construct CABNet for DR grading. The attention blocks can be applied to a wide range of backbone networks and trained efficiently in an end-to-end manner. Comprehensive experiments are conducted on three publicly available datasets, showing that CABNet produces significant performance improvements for existing state-of-the-art deep architectures with few additional parameters and achieves state-of-the-art results for DR grading. Code and models will be available at https://github.com/he2016012996/CABnet.
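
To make the architecture description above more concrete, here is a minimal PyTorch sketch of how a category-wise attention block and a class-agnostic global attention block could be attached to backbone features. It is an illustration based only on the abstract: the block internals, the `k` maps-per-category parameter, and all layer choices are assumptions, not the authors' released implementation (see the GitHub repository linked above for the official code).

```python
# Hedged sketch of CAB/GAB-style attention blocks; NOT the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CategoryAttentionBlock(nn.Module):
    """Assumed design: k intermediate feature maps per DR grade, averaged
    per category and across categories so every grade contributes equally
    to the resulting spatial attention map."""

    def __init__(self, in_channels: int, num_classes: int, k: int = 5):
        super().__init__()
        self.num_classes, self.k = num_classes, k
        # 1x1 conv expands backbone features into k maps per category.
        self.conv = nn.Conv2d(in_channels, num_classes * k, kernel_size=1)
        self.bn = nn.BatchNorm2d(num_classes * k)

    def forward(self, x):                          # x: (B, C_in, H, W)
        f = F.relu(self.bn(self.conv(x)))          # (B, num_classes*k, H, W)
        b, _, h, w = f.shape
        f = f.view(b, self.num_classes, self.k, h, w)
        per_class = f.mean(dim=2)                  # (B, num_classes, H, W)
        # Equal weight for every category when forming the attention map.
        attn = torch.sigmoid(per_class.mean(dim=1, keepdim=True))
        return x * attn                            # re-weight backbone features


class GlobalAttentionBlock(nn.Module):
    """Assumed design: class-agnostic channel and spatial attention."""

    def __init__(self, in_channels: int, reduction: int = 16):
        super().__init__()
        self.channel_fc = nn.Sequential(
            nn.Linear(in_channels, in_channels // reduction),
            nn.ReLU(),
            nn.Linear(in_channels // reduction, in_channels),
            nn.Sigmoid(),
        )
        self.spatial_conv = nn.Conv2d(in_channels, 1, kernel_size=1)

    def forward(self, x):                          # x: (B, C_in, H, W)
        b, c, _, _ = x.shape
        channel_attn = self.channel_fc(x.mean(dim=(2, 3))).view(b, c, 1, 1)
        spatial_attn = torch.sigmoid(self.spatial_conv(x))
        return x * channel_attn * spatial_attn


if __name__ == "__main__":
    # Smoke test on a dummy backbone feature map (e.g. 5 DR grades).
    feats = torch.randn(2, 512, 16, 16)
    feats = GlobalAttentionBlock(512)(feats)
    feats = CategoryAttentionBlock(512, num_classes=5)(feats)
    print(feats.shape)  # torch.Size([2, 512, 16, 16])
```

In a full model, blocks like these would typically be applied to the last convolutional feature map of a backbone such as a ResNet, followed by global average pooling and a linear classifier; the abstract notes that the attention blocks can be attached to a wide range of backbones and trained end-to-end.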


