Article

HAU-Net: Hybrid CNN-transformer for breast ultrasound image segmentation

Journal

BIOMEDICAL SIGNAL PROCESSING AND CONTROL
Volume 87

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.bspc.2023.105427

Keywords

Convolutional Neural Network; Transformer; Breast lesions segmentation; Ultrasound image


In this paper, the authors propose a hybrid CNN-transformer framework called HAU-Net for breast lesion segmentation. By combining the long-range dependency modeling of transformers with the local detail representation of CNNs, the framework achieves better performance on challenging breast ultrasound images.
Breast cancer is a significant health concern and remains one of the leading causes of mortality in women worldwide. Convolutional Neural Networks (CNNs) have been shown to be effective in ultrasound breast image segmentation. Yet, because they lack long-range dependency modeling, the segmentation performance of CNNs is limited when facing challenges typical of ultrasound breast lesions, such as similar intensity distributions, irregularly shaped objects, and blurred boundaries. To overcome these issues, several studies have combined transformers and CNNs, compensating for the shortcomings of CNNs with the ability of transformers to exploit long-range dependencies. Most of these studies, however, limited themselves to rigidly plugging transformer blocks into the CNN, lacking consistency in the feature extraction process and therefore performing poorly on challenging medical images. In this paper, we propose HAU-Net (hierarchical attention-guided U-Net), a hybrid CNN-transformer framework that benefits from both the long-range dependency of transformers and the local detail representation of CNNs. To incorporate global context information, we introduce an L-G transformer block nested into the skip connections of the U-shaped architecture. In addition, to further improve segmentation performance, we add a cross attention block (CAB) module on the decoder side to allow different layers to interact. Extensive experimental results on three public datasets indicate that the proposed HAU-Net achieves better performance than other state-of-the-art methods for breast lesion segmentation, with Dice coefficients of 83.11% on BUSI, 88.73% on UDIAT, and 89.48% on BLUI, respectively.
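The core idea of placing attention in the skip connections can be illustrated with a minimal NumPy sketch. This is not the authors' implementation (the actual L-G transformer block and CAB module are more elaborate); it only shows, under illustrative assumptions about shapes and with identity query/key/value projections, how a CNN skip-connection feature map can be flattened into tokens and refined by single-head self-attention so that every spatial position mixes in global context before reaching the decoder:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, d):
    # tokens: (N, d) flattened feature-map positions.
    # Single head with identity projections, for illustration only.
    scores = tokens @ tokens.T / np.sqrt(d)    # (N, N) pairwise similarities
    weights = softmax(scores, axis=-1)         # each row sums to 1
    return weights @ tokens                    # each token aggregates global context

def refine_skip(feature_map):
    # feature_map: (H, W, C) CNN skip-connection features (hypothetical shape).
    h, w, c = feature_map.shape
    tokens = feature_map.reshape(h * w, c)     # flatten the spatial grid to tokens
    refined = self_attention(tokens, c)
    return refined.reshape(h, w, c)            # back to a map for the decoder

feat = np.random.rand(8, 8, 16)                # toy skip-connection feature map
out = refine_skip(feat)
assert out.shape == feat.shape                 # shape preserved for concatenation
```

The point of the sketch is that, unlike a convolution, every output position here is a weighted mixture of all spatial positions, which is the long-range dependency the abstract argues plain CNNs lack.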
