Article

Tissue Segmentation in Nasopharyngeal CT Images Using Two-Stage Learning

Journal

CMC-COMPUTERS MATERIALS & CONTINUA
Volume 65, Issue 2, Pages 1771-1780

Publisher

TECH SCIENCE PRESS
DOI: 10.32604/cmc.2020.010069

Keywords

Tissue segmentation; deep learning; two-stage network; convolutional neural network

Funding

  1. National Natural Science Foundation of China [61602066]
  2. Scientific Research Foundation of CUIT [KYTZ201608]
  3. Major Project of the Education Department in Sichuan [17ZA0063, 2017JQ0030]
  4. Sichuan International Science and Technology Cooperation and Exchange Research Program [2016HH0018]

Abstract

Tissue segmentation is a fundamental and important task in nasopharyngeal image analysis. However, accurately and quickly segmenting the various tissues in the nasopharynx region is challenging, because the gray-value differences between tissues in nasopharyngeal images are small and the tissue structure is complex. In this paper, we propose a novel tissue segmentation approach based on a two-stage learning framework and U-Net. In the proposed methodology, the network consists of two segmentation modules: the first performs rough segmentation and the second performs accurate segmentation. Considering training time and the limits of computing resources, the second module has a simpler structure and fewer network layers. In addition, our segmentation modules are based on U-Net and incorporate a skip structure, which makes full use of the original features of the data and avoids feature loss. We evaluated the proposed method on the nasopharyngeal dataset provided by West China Hospital of Sichuan University. The experimental results show that the proposed method outperforms many standard segmentation architectures as well as a recently proposed nasopharyngeal tissue segmentation method, and that it generalizes easily across different tissue types in various organs.
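The abstract describes the architecture only at a high level, so the following is a minimal PyTorch sketch of the two-stage cascade idea it outlines. The layer depths, channel widths, class count, and the choice to feed the coarse probability map into the second module are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a two-stage U-Net cascade (assumed details, see note above).
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with ReLU, as in a standard U-Net block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class MiniUNet(nn.Module):
    """A small U-Net with skip connections; `depth` sets the number of
    down/up-sampling levels, so the second stage can be made shallower."""
    def __init__(self, in_ch, out_ch, base=16, depth=3):
        super().__init__()
        self.downs, chans, ch = nn.ModuleList(), [], in_ch
        for d in range(depth):
            self.downs.append(conv_block(ch, base * 2**d))
            ch = base * 2**d
            chans.append(ch)
        self.pool = nn.MaxPool2d(2)
        self.bottom = conv_block(ch, ch * 2)
        self.ups, self.up_blocks = nn.ModuleList(), nn.ModuleList()
        ch *= 2
        for skip_ch in reversed(chans):
            self.ups.append(nn.ConvTranspose2d(ch, skip_ch, 2, stride=2))
            self.up_blocks.append(conv_block(skip_ch * 2, skip_ch))
            ch = skip_ch
        self.head = nn.Conv2d(ch, out_ch, 1)

    def forward(self, x):
        skips = []
        for down in self.downs:
            x = down(x)
            skips.append(x)          # skip structure: keep original features
            x = self.pool(x)
        x = self.bottom(x)
        for up, block, skip in zip(self.ups, self.up_blocks, reversed(skips)):
            x = up(x)
            x = block(torch.cat([x, skip], dim=1))  # fuse encoder features
        return self.head(x)

class TwoStageSegmenter(nn.Module):
    """Stage 1 produces a rough mask; stage 2 (shallower, hence cheaper)
    refines it from the CT slice concatenated with the coarse prediction."""
    def __init__(self, n_classes):
        super().__init__()
        self.coarse = MiniUNet(1, n_classes, base=32, depth=4)
        self.fine = MiniUNet(1 + n_classes, n_classes, base=16, depth=2)

    def forward(self, x):
        coarse_logits = self.coarse(x)
        coarse_prob = torch.softmax(coarse_logits, dim=1)
        fine_logits = self.fine(torch.cat([x, coarse_prob], dim=1))
        return coarse_logits, fine_logits

# Usage: one single-channel 128x128 CT slice, 4 hypothetical tissue classes.
model = TwoStageSegmenter(n_classes=4)
coarse, fine = model(torch.randn(1, 1, 128, 128))
print(coarse.shape, fine.shape)  # torch.Size([1, 4, 128, 128]) twice
```

Making the second stage shallower here (depth=2 versus depth=4) mirrors the abstract's point that the refinement module trades capacity for shorter training time and lower resource use.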
