Article

Lumen segmentation using a Mask R-CNN in carotid arteries with stenotic atherosclerotic plaque

Journal

ULTRASONICS
Volume 137

Publisher

ELSEVIER
DOI: 10.1016/j.ultras.2023.107193

Keywords

Ultrasound; Deep learning; Mask R-CNN


In patients at high risk for ischemic stroke, clinical carotid ultrasound is often used to grade stenosis, determine plaque burden, and assess stroke risk. Analysis currently requires a trained sonographer to manually identify vessel and plaque regions, which is time- and labor-intensive. We present a method for automatically determining bounding boxes and lumen segmentations using a Mask R-CNN network trained on sonographer-assisted ground-truth carotid lumen segmentations. Automatic lumen segmentation also lays the groundwork for developing methods for accurate plaque segmentation, and for wall thickness measurements in cases with no plaque. Different training schemes are used to identify the Mask R-CNN model with the highest accuracy. Using a single-channel B-mode training input, our model produces a mean bounding box intersection over union (IoU) of 0.81 and a mean lumen segmentation IoU of 0.75. However, prediction errors occur when the jugular vein is the most prominently visualized vessel in the B-mode image, because our dataset contains few B-mode images in which both the jugular vein and the carotid artery appear with the vein dominantly visualized. Additional training datasets are anticipated to mitigate this issue.
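For reference, the IoU metric reported above compares predicted and ground-truth regions by the ratio of their intersection to their union. A minimal NumPy sketch of both variants used in the paper (bounding-box IoU and segmentation-mask IoU) is shown below; this is an illustration of the metric, not the authors' evaluation code, and the function names are ours:

```python
import numpy as np

def box_iou(a, b):
    """IoU of two axis-aligned boxes, each given as (x1, y1, x2, y2)."""
    # Intersection rectangle (empty if the boxes do not overlap).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def mask_iou(pred, gt):
    """IoU of two boolean segmentation masks of equal shape."""
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / union if union else 0.0
```

A mean IoU over a test set (as reported in the abstract) is then simply the average of these per-image scores.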

