Article

ATFE-Net: Axial Transformer and Feature Enhancement-based CNN for ultrasound breast mass segmentation

Journal

COMPUTERS IN BIOLOGY AND MEDICINE
Volume 153

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.compbiomed.2022.106533

Keywords

Breast mass segmentation; Convolutional neural network; Axial transformer; Self-attention; Feature enhancement


This paper proposes an axial Transformer and feature enhancement-based CNN (ATFE-Net) for ultrasound breast mass segmentation. The ATFE-Net uses an axial Transformer module and a Transformer-based feature enhancement module to capture long-range dependencies and enrich feature representation. Experiments show that ATFE-Net outperforms several state-of-the-art methods on breast ultrasound datasets.

Breast mass is one of the main clinical symptoms of breast cancer. Many CNN-based methods for breast mass segmentation have been proposed recently, but they struggle to capture long-range dependencies, leading to poor segmentation of large breast masses. In this paper, we propose an axial Transformer and feature enhancement-based CNN (ATFE-Net) for ultrasound breast mass segmentation. Specifically, an axial Transformer (Axial-Trans) module and a Transformer-based feature enhancement (Trans-FE) module are proposed to capture long-range dependencies. The Axial-Trans module computes self-attention only along the width and height directions of the input feature maps, which reduces the complexity of self-attention from O(n²) to O(n). In addition, the Trans-FE module enhances feature representation by capturing dependencies between different feature layers, since deeper feature layers carry richer semantic information while shallower feature layers carry more detailed information. Experimental results show that ATFE-Net outperforms several state-of-the-art methods on two publicly available breast ultrasound datasets, achieving Dice coefficients of 82.46% on BUSI and 86.78% on UDIAT.
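The core idea behind the Axial-Trans module, attending along each spatial axis separately rather than over all H×W positions jointly, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: identity projections stand in for the learned query/key/value weights, and positional encodings, multiple heads, and batching are omitted. Full 2-D self-attention over an H×W feature map costs O((HW)²) score computations; attending along the height axis and then the width axis costs O(HW·(H+W)), linear in each axis length.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def axial_attention(x):
    """Self-attention along the height axis, then the width axis.

    x: feature map of shape (H, W, C). Identity Q/K/V projections are an
    assumption for illustration; the paper uses learned projections.
    """
    H, W, C = x.shape
    # Height axis: within each column, compare all H positions -> (W, H, H).
    scores_h = np.einsum('hwc,gwc->whg', x, x) / np.sqrt(C)
    out = np.einsum('whg,gwc->hwc', softmax(scores_h), x)
    # Width axis: within each row, compare all W positions -> (H, W, W).
    scores_w = np.einsum('hwc,hvc->hwv', out, out) / np.sqrt(C)
    out = np.einsum('hwv,hvc->hwc', softmax(scores_w), out)
    return out

y = axial_attention(np.random.rand(8, 8, 4))
print(y.shape)  # (8, 8, 4)
```

Because each position still aggregates information from its entire row and its entire column, stacking two axial passes propagates context across the whole feature map while keeping the attention matrices small.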
