Journal
COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE
Volume 212, Issue -, Pages -
Publisher
ELSEVIER IRELAND LTD
DOI: 10.1016/j.cmpb.2021.106480
Keywords
Convolutional neural networks; Image segmentation; Computerized tomography; Feature fusion
Category
Funding
- Key Projects of Shanghai Science and Technology Commission
- Shanghai Changzheng Hospital [18411952800, 0232-E2-6202-19-022]
- National Natural Science Foundation of China [61802251, 82072228]
- National Key R&D Program of China [2020YFC2008700]
A two-dimensional deep learning segmentation network based on multi-pinacoidal plane fusion is proposed for medical volume data, achieving satisfactory results on different backbone networks. The approach captures more of the volume's information and remains compatible with various backbones while extracting global information across the different input planes.
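The paper's exact fusion scheme is not detailed in this abstract, but the core idea of feeding a 2D network with multiple planes through a 3D volume can be sketched as follows. This is a minimal illustration, assuming "pinacoidal planes" refers to the orthogonal axial/coronal/sagittal planes through a voxel; the function name and indexing convention (z, y, x) are illustrative, not from the paper.

```python
import numpy as np

def orthogonal_slices(volume, index):
    """Extract the three orthogonal planes through a voxel (z, y, x)
    of a CT volume, so a 2D network can still see 3D context."""
    z, y, x = index
    axial    = volume[z, :, :]   # xy plane at depth z
    coronal  = volume[:, y, :]   # xz plane at row y
    sagittal = volume[:, :, x]   # yz plane at column x
    return axial, coronal, sagittal

# Toy volume: 4 slices of 5x6 pixels
vol = np.arange(4 * 5 * 6).reshape(4, 5, 6)
a, c, s = orthogonal_slices(vol, (2, 3, 1))
print(a.shape, c.shape, s.shape)  # (5, 6) (4, 6) (4, 5)
```

Each 2D plane can then be passed through the same backbone and the resulting feature maps fused, which keeps computation close to that of a single 2D network.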
Background and Objective: High-dimensional data generally carries more accurate information for medical imaging; e.g., computerized tomography (CT) data can depict the three-dimensional structure of organs more precisely. However, high-dimensional data demands enormous computation and memory in deep convolutional networks, while dimensionality reduction usually leads to performance degradation. Methods: In this paper, a two-dimensional deep learning segmentation network based on multi-pinacoidal plane fusion is proposed for medical volume data, covering more information while keeping computation under control. The approach offers good compatibility, using the proposed model to extract global information across the different input layers. Results: The approach works with different backbone networks. Using it, DeepUnet achieves a Dice coefficient (Dice) of 0.883 and a Positive Predictive Value (PPV) of 0.982, showing satisfactory improvement. Various backbones benefit from the method. Conclusions: Comparing different backbones shows that the proposed network with multi-pinacoidal plane fusion achieves better results both quantitatively and qualitatively. (c) 2021 Elsevier B.V. All rights reserved.
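For reference, the two reported metrics have standard definitions on binary segmentation masks: Dice = 2|P ∩ T| / (|P| + |T|) and PPV = TP / (TP + FP). A minimal sketch (the function names and the epsilon guard against empty masks are my own, not from the paper):

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice = 2|P ∩ T| / (|P| + |T|) over binary masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return 2.0 * intersection / (pred.sum() + target.sum() + eps)

def positive_predictive_value(pred, target, eps=1e-7):
    """PPV = TP / (TP + FP): fraction of predicted-positive voxels that are correct."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    tp = np.logical_and(pred, target).sum()
    return tp / (pred.sum() + eps)

# Toy 2x2 masks: prediction covers two pixels, target covers one of them
pred = np.array([[1, 1], [0, 0]])
target = np.array([[1, 0], [0, 0]])
print(round(dice_coefficient(pred, target), 3))        # 0.667
print(round(positive_predictive_value(pred, target), 3))  # 0.5
```

A high PPV (0.982 here) means few false-positive voxels, while Dice (0.883) balances false positives against false negatives.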